Assignment 03
Critical Points
Due: Oct. 29, 2017 11:59:59 PM
Graded: Nov. 05, 2017
Percentage of Grade: 11%
Assignment Description: Finalized
- Datasets
- Part 1: Code Structure
- Part 2: Required Code Functionality
- Part 3: Experimentation
- Submission and Grading
In this assignment, we’ll recreate a TTK filter and plugin to compute the critical points of a PL scalar field defined on the vertices of a simplicial complex. Given a continuous scalar field, critical points are defined as locations where the gradient of the field is zero. Given a PL scalar field, critical points always occur at vertices and can be computed by a local check of the values on neighboring vertices (without needing to compute the gradient). Figure IV.9 of Edelsbrunner and Harer illustrates the situation for 2D simplicial complexes.
Recall that critical points have various types: minima, maxima, and saddles. For 2D datasets, I expect that you will produce three types: index 0 (minima), index 1 (saddles), and index 2 (maxima). For 3D datasets, I expect that you will produce four types: index 0 (minima), index 1 (saddles), index 2 (saddles), and index 3 (maxima).
As you will see, TTK makes most of this operation fairly easy to do. In addition, you have a working example in ttkScalarFieldCriticalPoints and ttk::ScalarFieldCriticalPoints. Our implementation will differ from it in a variety of ways, and part of the assignment will involve experimentation to compare the two.
Datasets
For this assignment, I’ve included a variety of scalar data, available at data03. Not all of this data is simplicial (e.g., I’ve included a couple of *.vti files), but this demonstrates the capability of TTK to handle a variety of data modalities, converting all of them to the ttk::Triangulation data structure. In addition, I recommend that you also test with some of the simple geometry data from the previous assignments and, using ParaView, construct scalar fields on these meshes (e.g., using the Elevation filter).
Part 1: Code Structure
Using the createTTKmodule.sh script, create a new project called CriticalPoints. This will create the usual files in baseCode, vtkWrappers, paraview, and standalone. You will eventually need to edit all of these.
In this assignment, your vtkWrapper will mostly use the defaults. Your input should be any vtkDataSet, so you do not have to override FillInputPortInformation. Your output will be a vtkUnstructuredGrid, and you should adjust FillOutputPortInformation to force this. Note that ttkScalarFieldCriticalPoints is similar in structure here. Even though your output type will be a vtkUnstructuredGrid, you will only be setting its point set; there is no need to create cells. Instead, this object will simply wrap a vtkPointSet that has the position of each critical point as well as a scalar field of type vtkIntArray that stores the index of each critical point.
As a result, you should be able to leave the defaults in paraview/CriticalPoints.xml, as they also expect there to be a scalar field on the input vertices.
Additionally, your code should take three boolean parameters that specify whether or not to output critical points of type minimum, maximum, or saddle, and it should only add points for the critical point type(s) selected. This can be handy; in particular, if the user chooses not to output saddles, you can likely speed up your code significantly. One way to expose these toggles in the vtkWrapper is sketched below.
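The member names below (ComputeMinima, ComputeSaddles, ComputeMaxima) and the class name ttkCriticalPoints are only illustrative, not something createTTKmodule.sh generates for you; this is just a fragment showing the standard VTK set/get macro pattern, and the matching properties still need entries in paraview/CriticalPoints.xml.

    // Fragment of vtkWrappers/ttkCriticalPoints.h (illustrative names only).
    // Each macro pair generates SetX()/GetX() so ParaView can drive the flag.
    public:
      vtkSetMacro(ComputeMinima, bool);
      vtkGetMacro(ComputeMinima, bool);

      vtkSetMacro(ComputeSaddles, bool);
      vtkGetMacro(ComputeSaddles, bool);

      vtkSetMacro(ComputeMaxima, bool);
      vtkGetMacro(ComputeMaxima, bool);

    protected:
      bool ComputeMinima{true}, ComputeSaddles{true}, ComputeMaxima{true};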
At the lower level, I recommend that your baseCode class take an input that is a void *, which will eventually be cast into an array of dataType *. You will need to set up the triangulation, as I will describe below, and this will encapsulate all of the connectivity information for the vertices along with whatever you preprocess. Instead of producing an output scalar field, my implementation passed a vector<vector<int> > crit_pts. This produced a list of indices of critical points for each type (e.g., crit_pts[0] was my list of critical points of index 0). You’ll note that ttk::ScalarFieldCriticalPoints does this slightly differently (a vector<pair<int, char> >, which is a list of critical points stored as (id, type) pairs). You’re welcome to experiment with whichever is more efficient for you.
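To make the above concrete, here is a minimal sketch of what such a baseCode interface could look like, assuming illustrative names (CriticalPoints, setInputDataPointer, setOutputCriticalPoints); the skeleton that createTTKmodule.sh actually generates may differ, so treat this as a sketch rather than the required interface.

    // baseCode/CriticalPoints.h -- illustrative sketch only.
    #include <vector>

    namespace ttk {
      class CriticalPoints {
      public:
        // Raw pointer to the input scalar field; cast to (dataType *) in execute().
        void setInputDataPointer(void *data) { inputData_ = data; }

        // Output: one list of vertex ids per critical point index
        // (e.g., (*criticalPoints_)[0] holds the minima).
        void setOutputCriticalPoints(std::vector<std::vector<int>> *critPts) {
          criticalPoints_ = critPts;
        }

        // The triangulation would be set and preprocessed here as well,
        // mirroring the provided ttk::ScalarFieldCriticalPoints example.

        // Flags select which critical point types are extracted; the body
        // loops over the vertices and classifies each one (see Part 2).
        template <class dataType>
        int execute(bool computeMinima, bool computeSaddles, bool computeMaxima);

      protected:
        void *inputData_{nullptr};
        std::vector<std::vector<int>> *criticalPoints_{nullptr};
      };
    }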
Part 2: Required Code Functionality
Our implementation of critical point extraction will only use the data values stored on the vertices. Note that ttk::ScalarFieldCriticalPoints is more sophisticated in that it uses a technique called Simulation of Simplicity to break ties and construct a Morse function. Instead, we will implement the basic critical point extraction, which suffers from issues when the function is not Morse. Part of the experimental section will be devoted to exploring the differences.
As in the last assignment, the execute() function in your baseCode class should take parameters for the flags that indicate which critical points will be returned. These flags can also be used to optimize what you do inside execute().
To extract critical points, you should investigate the vertices of the upper and lower link of each vertex in the input simplicial complex. Extracting extrema is straightforward: if the lower link is empty, then the vertex is a minimum; if the upper link is empty, then the vertex is a maximum.
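As a point of reference, a minimal sketch of that test is below. It assumes the ttk::Triangulation vertex-neighbor calls used by the provided example (preprocessVertexNeighbors() beforehand, then getVertexNeighborNumber() and getVertexNeighbor()); these names can vary between TTK versions, so verify them against your headers. It also breaks ties by vertex id, a simple stand-in for the Simulation of Simplicity used by the real filter.

    // Illustrative sketch: classify extrema of vertex v by checking whether
    // its lower or upper link is empty. Assumes preprocessVertexNeighbors()
    // (or the equivalent in your TTK version) has already been called.
    #include <Triangulation.h> // ttk::Triangulation; header name may vary

    template <class dataType>
    void classifyExtremum(const int v, const dataType *scalars,
                          ttk::Triangulation &triangulation,
                          bool &isMinimum, bool &isMaximum) {

      int lowerCount = 0, upperCount = 0;
      const int nNeighbors = triangulation.getVertexNeighborNumber(v);

      for(int i = 0; i < nNeighbors; i++) {
        int n = -1;
        triangulation.getVertexNeighbor(v, i, n);
        // Ties broken by vertex id (stand-in for Simulation of Simplicity).
        if(scalars[n] < scalars[v] || (scalars[n] == scalars[v] && n < v))
          lowerCount++;
        else
          upperCount++;
      }

      isMinimum = (lowerCount == 0); // empty lower link
      isMaximum = (upperCount == 0); // empty upper link
    }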
Extracting saddles and differentiating them from regular points is trickier, and depends on the dimension of the input. For two dimensions, a regular vertex has one connected component in the lower link and one connected component in the upper link. A (typical) saddle has exactly two components in the upper and lower link. Atypical saddles (such as monkey saddles) might have more than this. For three dimensions, regular vertices are the same but typical saddles have two types: (1) index 1 saddles have 2 components in the lower link and 1 in the upper link (a merge) while (2) index 2 saddles have 1 component in the lower link and 2 in the upper link (a split).
Thus, to determine these, you will need not just the size of the upper and lower link, but their connectivity. ttk::ScalarFieldCriticalPoints employs a Union-Find data structure for this, but there are a variety of other ways to extract it (for example, a breadth-first search with a queue). How you implement this is your choice.
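Here is a sketch of the breadth-first-search option, assuming the link’s adjacency is handed in as a map from vertex id to neighboring link vertices (one way to build that map from the triangulation is sketched after the next paragraph):

    #include <map>
    #include <queue>
    #include <vector>

    // Count connected components of a small graph given as an adjacency list
    // keyed by vertex id (e.g., the lower or upper link of one vertex).
    // Every link vertex must appear as a key, even if it has no neighbors.
    // A Union-Find over the same edges would work equally well.
    int countComponents(const std::map<int, std::vector<int>> &adjacency) {
      std::map<int, bool> visited;
      int components = 0;

      for(const auto &entry : adjacency) {
        if(visited[entry.first])
          continue;
        components++;

        std::queue<int> bfsQueue;
        bfsQueue.push(entry.first);
        visited[entry.first] = true;

        while(!bfsQueue.empty()) {
          const int v = bfsQueue.front();
          bfsQueue.pop();
          for(const int n : adjacency.at(v)) {
            if(!visited[n]) {
              visited[n] = true;
              bfsQueue.push(n);
            }
          }
        }
      }
      return components;
    }

With this in hand, the rules above become simple counts: a regular vertex has one component in each of the lower and upper link, a 2D saddle has two in each, and the two 3D saddle types correspond to a (2, 1) or (1, 2) split between lower and upper link.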
To extract connectivity in the link, you will also need to enumerate not just the vertices in the link, but their adjacencies. There are a variety of ways to do this: ttk::ScalarFieldCriticalPoints uses the cells (triangles in 2D, tetrahedra in 3D) of the (closed) star of the vertex, but you can also access the cells (edges in 2D, triangles in 3D) of the link directly. Feel free to experiment with these to figure out which method of enumeration you prefer, both from the point of view of coding and from the point of view of efficiency. Be sure to call the appropriate preprocessing functions. Please document a high-level version of the algorithm in your report.
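To illustrate the star-based option, here is a sketch that builds the lower-link adjacency of a vertex from its star cells; two lower vertices are adjacent in the lower link exactly when they share a cell with v. The Triangulation calls (preprocessVertexStars(), getVertexStarNumber(), getVertexStar(), getCellVertexNumber(), getCellVertex()) are the names I expect from the provided example, but double-check them against your TTK version.

    // Illustrative sketch: adjacency of the lower link of vertex v, built from
    // the cells of its star. Assumes preprocessVertexStars() (or equivalent)
    // has already been called on the triangulation.
    #include <Triangulation.h> // ttk::Triangulation; header name may vary
    #include <map>
    #include <vector>

    template <class dataType>
    std::map<int, std::vector<int>> lowerLinkAdjacency(
      const int v, const dataType *scalars, ttk::Triangulation &triangulation) {

      // Same tie-breaking rule as the extremum test above.
      auto isLower = [&](const int n) {
        return scalars[n] < scalars[v] || (scalars[n] == scalars[v] && n < v);
      };

      std::map<int, std::vector<int>> adjacency;
      const int nStar = triangulation.getVertexStarNumber(v);

      for(int i = 0; i < nStar; i++) {
        int cellId = -1;
        triangulation.getVertexStar(v, i, cellId);

        // Gather the lower vertices of this star cell (excluding v itself).
        std::vector<int> lowerVerts;
        const int nCellVerts = triangulation.getCellVertexNumber(cellId);
        for(int j = 0; j < nCellVerts; j++) {
          int u = -1;
          triangulation.getCellVertex(cellId, j, u);
          if(u != v && isLower(u))
            lowerVerts.push_back(u);
        }

        // Any two lower vertices sharing a star cell are connected in the link.
        for(const int a : lowerVerts) {
          adjacency[a]; // make sure isolated link vertices appear as keys
          for(const int b : lowerVerts)
            if(b != a)
              adjacency[a].push_back(b);
        }
      }
      return adjacency; // duplicate entries are harmless for the BFS above
    }

The upper-link adjacency is symmetric (swap the comparison), and passing either map to countComponents() above gives the counts used in the classification rules.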
Note that all of the above assumes that you are not working with critical points that live on the boundary of the input data. On the boundary, the topology of the link changes, and thus the simple rules above no longer hold. For this assignment, you do not need to implement any special cases to handle the boundary.
Part 3: Experimentation
Using the datasets provided, we can explore the critical points for a variety of shapes using ParaView. To begin, let’s use torus.vtp. This PolyData does not have a scalar field defined on it, so you’ll first have to define one using the Elevation filter. How many critical points, and of what type, does it have? After computing the critical points, you should be able to use the ParaView filter TTKSphereFromPoint to visualize them, but you may have to adjust the radius input parameter and select the appropriate field to color by.
Next, try changing the Elevation filter to align with how the torus would rest on a table by placing the guide vector through the hole of the torus (you can manually type in values in ParaView for this, and it should align with the y-axis). What types of critical points, and how many, do you produce? Compare your filter to TTKScalarFieldCriticalPoints.
You should be able to make similar observations for dragon.vtu and the other shapes from data01 using an Elevation filter (although shape3.vtp has a scalar field predefined). For dragon.vtu, report how many critical points your filter found (at whatever elevation direction you specified) and verify that this matches the PL Morse inequalities.
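As a reminder for that check, the weak PL Morse inequalities and the Euler-Poincaré relation for a closed d-manifold (boundaries change the formula) read as follows, with c_i the number of index-i critical points and beta_i the Betti numbers:

    % Weak Morse inequalities and Euler-Poincare relation (closed d-manifold)
    c_i \ge \beta_i \quad (0 \le i \le d),
    \qquad
    \sum_{i=0}^{d} (-1)^i c_i \;=\; \sum_{i=0}^{d} (-1)^i \beta_i \;=\; \chi .

For a closed orientable surface of genus g, chi = 2 - 2g; on the torus, for instance, chi = 0, so c_0 - c_1 + c_2 must vanish for any elevation direction you pick.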
Next, let’s try some 3D data that already has scalar data. Your filter should be able to process hydrogen.vti and fuel.vti in only a few seconds. Compare the number, type, and location of the critical points that you find vs. TTKScalarFieldCriticalPoints. I do not expect these numbers to be the same, but as a baseline you might want to debug with foo.vti, which is a PL Morse function; there you should be able to produce identical numbers of critical points. Do these critical points correspond to any interesting features in these volumes? You may want to explore the volumes using volume rendering, slicing, and/or contouring to compare. You might also want to try the tooth.vti dataset, which is about the same size as foo.vti but less well behaved.
Finally, load BuiltInExample1.vti, a test dataset from TTK. This is a vector field, so it cannot be processed directly. Using the Calculator filter, compute the magnitude of this vector field to produce a scalar field. Consider the critical points of the magnitude field; do they correspond to any interesting features in the data?
Document the answers to these questions, as well as the previously asked ones, in your report. Feel free to include screenshots if they are helpful. Also consider experimenting with the other datasets, as well as other scalar fields you might be able to produce on them.
Submission and Grading
Submit this entire release folder as a subdirectory to your git repository on bitbucket.
Your grade for this assignment will be based on the following:
- 60% Correct implementation of the filter and ParaView plugin.
- 40% Completing the report, being sure to document any implementation details as well as the experimental results.
This percentage will be scaled to the total value of this assignment for your final grade (11%). Extra credit may be assigned for going beyond the specifications of the assignment. One possible direction you might consider is modifying your filter to handle various types of saddles and boundary cases. Producing an implementation that is faster than TTKScalarFieldCriticalPoints (and being able to explain why) will also result in extra credit.