Description
Context
The ultimate goal is to make good hex-meshes. One of the metrics for good hex-meshes is per-cell distortion, like the Scaled Jacobian. However, to generate a good hex-mesh of a given 3D model with an iterative algorithm, it is too costly to go all the way to the hex-mesh inside the feedback loop.
Polycube-based hex-meshing proposes to iterate over polycubes (orthogonal polyhedra), a good polycube being a proxy for a good hex-mesh. Some of the metrics for good polycubes are the angle and area distortions of the mapping.
The pipeline can be further simplified by iterating on polycube labelings (the association of each surface triangle to ±X, ±Y or ±Z) instead of handling a (volumetric) polycube, a good labeling being a proxy for a good polycube. A labeling must be valid (i.e. be the representation of a polycube boundary), and a good labeling is compact, has high geometric fidelity (though not always) and all-monotone boundaries.
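For concreteness, a labeling can be stored as one flag per surface triangle (a minimal sketch, not the actual data structure of the code base):

```cpp
#include <vector>
#include <cstdint>

// One of the six signed axes a surface triangle can be labeled with.
enum Label : std::uint8_t { PlusX, MinusX, PlusY, MinusY, PlusZ, MinusZ };

// A labeling: one label per surface triangle, indexed by triangle id.
using Labeling = std::vector<Label>;
```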
Currently in the code base we can evaluate labelings and hex-meshes (with the Scaled Jacobian). Contrary to some papers on polycube-based hex-meshing, we cannot provide values on polycube distortion. Evocube[^1] leverages distortion metrics, on the one hand for results analysis (Tables 1 & 3 + supplemental material), on the other hand as score components (section 2.1, workability).
Starting point
Like Evocube[^1], we have two input triangle meshes with the same number of vertices and triangles and the same connectivity (i.e. the same triangles), but different vertex coordinates.
Evocube section 2.1:

> The per-triangle distortion is measured using the singular values $\sigma_1$ and $\sigma_2$ of the Jacobian of the mapping from the initial triangle to its equivalent in [the polycube domain].

On this point, the authors refer to Tarini et al. 2004[^2].

On the score side (section 2.1), the per-triangle workability is

$$\sigma_1 + \sigma_2 + \frac{1}{\sigma_1 \sigma_2} + \frac{\sigma_1}{\sigma_2} + \frac{\sigma_2}{\sigma_1} - 4$$
Questions
Where does this -4 come from?

- $\sigma_1 + \sigma_2 + \frac{1}{\sigma_1 \sigma_2}$ looks like the area distortion ($\sigma_1 \sigma_2 + \frac{1}{\sigma_1 \sigma_2}$) we will see below. Is there a typo?
- $\frac{\sigma_1}{\sigma_2} + \frac{\sigma_2}{\sigma_1}$ is the angle distortion we will see below.
- Lastly, $-4$: the ideal value of each distortion being $1$, why not subtract 2 so that the cost is null when the distortion is ideal? (See the quick check below.)
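A quick numeric check (mine, not in the paper): at the ideal singular values $\sigma_1 = \sigma_2 = 1$, the workability as printed gives $(1 + 1 + 1) + (1 + 1) - 4 = 1$, whereas with the suspected typo corrected (first group replaced by $\sigma_1 \sigma_2 + \frac{1}{\sigma_1 \sigma_2}$) it gives $(1 + 1) + (1 + 1) - 4 = 0$. So the $-4$ zeroes the cost only under the corrected reading.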
As for the results analysis (section 4), Evocube uses the area distortion and the angle distortion, as well as the stretch and the isometric distortion in the measurement code; all four are detailed below.
In the source code of Evocube, we can see how the computation of these metrics is implemented.
On the score side, the function `include/evaluator.h > evaluate()` has indeed 4 components:

- the (in)validity score `invalid_score`
- the compactness score `compact_score`
- the (geometric) fidelity score `fidelity_score`
- the workability `fast_poly_score`, obtained via `src/quick_label_ev.cpp > QuickLabelEv::evaluate()`, which is the average distortion plus a cost for inverted triangles
Let's take the average distortion calculation step-by-step:
- A polycube is obtained with `src/quick_label_ev.cpp > QuickLabelEv::LDLTDeformation()`, which looks like a transcoding of `fastbndpolycube` and/or `polycube_withHexEx`.
- The normals `N_def` of the polycube triangles are computed.
- Per-triangle distortions are computed via `src/distortion.cpp > computeDisto()` (see the sketch after this list), which:
  - computes the Jacobian `jacobians[f_id]`
  - computes the singular values `s1` and `s2` ($\sigma_1$ and $\sigma_2$) using `Eigen::JacobiSVD`
  - computes the area distortion `s1 * s2 + 1.0 / (s1 * s2) - 2.0` (this goes along with a typo in the article, see above)
  - computes the angle distortion `s1 / s2 + s2 / s1 - 2.0`
  - writes the sum of the two distortions as the per-triangle distortion
- Each per-triangle distortion is squared.
- The distortion integral over the surface is computed via `src/distortion.cpp > integrateDistortion()`, which:
  - caps distortions with a threshold at $10^8$, giving the `distop` vector
  - computes `(A.array() * distop.array()).sum() / A.sum()`. `A` being the vector of triangle areas of the input mesh, we get the weighted sum of distortions, normalized by the total area:

  $$\frac{\displaystyle\sum_{f} A_f \, d_f}{\displaystyle\sum_{f} A_f}$$

- (For the final value, the number of inverted triangles is taken into account, i.e. polycube triangles for which the dot product of the normal with the label's direction is negative.)
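To make these steps concrete, here is a minimal, self-contained sketch of the distortion computation and integration (my reconstruction, not the Evocube source; the 2×2 Jacobians are assumed already computed):

```cpp
#include <Eigen/Dense>
#include <algorithm>
#include <iostream>
#include <vector>

// Per-triangle distortion from the singular values of a 2x2 Jacobian,
// squared, capped at 1e8, then area-weighted and normalized,
// in the style of computeDisto() + integrateDistortion().
double averageDistortion(const std::vector<Eigen::Matrix2d>& jacobians,
                         const Eigen::VectorXd& A) { // A: per-triangle areas
    Eigen::VectorXd distop(static_cast<Eigen::Index>(jacobians.size()));
    for (std::size_t f = 0; f < jacobians.size(); ++f) {
        Eigen::JacobiSVD<Eigen::Matrix2d> svd(jacobians[f]);
        const double s1 = svd.singularValues()(0); // sigma_1 >= sigma_2 > 0
        const double s2 = svd.singularValues()(1);
        const double area_disto  = s1 * s2 + 1.0 / (s1 * s2) - 2.0;
        const double angle_disto = s1 / s2 + s2 / s1 - 2.0;
        const double d = area_disto + angle_disto;  // per-triangle distortion
        distop(static_cast<Eigen::Index>(f)) = std::min(d * d, 1e8); // squared, capped
    }
    return (A.array() * distop.array()).sum() / A.sum(); // area-weighted average
}

int main() {
    std::vector<Eigen::Matrix2d> J(1);
    J[0] << 2.0, 0.0,
            0.0, 0.5;  // anisotropic map: s1 = 2, s2 = 0.5
    Eigen::VectorXd A(1);
    A << 1.0;
    std::cout << averageDistortion(J, A) << "\n"; // 2.25^2 = 5.0625
}
```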
Question
Why is 2 subtracted from the area and angle distortions? It is consistent with the -4 seen above, but not with the equations we will see below.
As for the results analysis side, Evocube has a measurement app (`app/measurement.cpp`) which reads a triangle mesh and a polycube, and writes the distortion metric values in the logs (JSON files).
The first operations are:

- The computation of per-triangle areas for the input mesh (`A1`) and the polycube (`A2`)
- The computation of the total areas for the input mesh (`A_m`) and the polycube (`A_d`)
- The polycube is scaled to have the same total surface area as the input mesh: `V2 = V2 * std::sqrt(A_m / A_d)`. I suppose there is a square root because scaling coordinates (dimension 1) by $s$ scales areas (dimension 2) by $s^2$.
- Areas are recomputed
- Jacobians are computed (see the sketch after this list)
- Singular values are computed using `Eigen::JacobiSVD`
- The stretch (another distortion metric) is computed via `src/distortion.cpp > computeStretch()`
- The area distortion is computed via `src/distortion.cpp > computeAreaDisto()`
- The angle distortion is computed via `src/distortion.cpp > computeAngleDisto()`
- The isometric distortion (another distortion metric) is computed via `src/distortion.cpp > computeIsometricDisto()`
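The Jacobian computation itself is not detailed above. A standard way to obtain it (a hypothetical helper, not the actual `app/measurement.cpp` code) is to express each triangle and its polycube image in orthonormal 2D bases of their supporting planes, then take the SVD of the 2×2 matrix mapping one onto the other:

```cpp
#include <Eigen/Dense>
#include <utility>

// Edge vectors of triangle (p0, p1, p2), expressed in an orthonormal
// 2D basis of the triangle's supporting plane.
static Eigen::Matrix2d localEdges(const Eigen::Vector3d& p0,
                                  const Eigen::Vector3d& p1,
                                  const Eigen::Vector3d& p2) {
    const Eigen::Vector3d e1 = p1 - p0, e2 = p2 - p0;
    const Eigen::Vector3d x = e1.normalized();
    const Eigen::Vector3d n = e1.cross(e2).normalized();
    const Eigen::Vector3d y = n.cross(x);
    Eigen::Matrix2d E;
    E << e1.dot(x), e2.dot(x),
         e1.dot(y), e2.dot(y);
    return E;
}

// Singular values (s1 >= s2) of the Jacobian of the affine map sending
// triangle (a0, a1, a2) of the input mesh to triangle (b0, b1, b2) of the polycube.
std::pair<double, double> jacobianSingularValues(
        const Eigen::Vector3d& a0, const Eigen::Vector3d& a1, const Eigen::Vector3d& a2,
        const Eigen::Vector3d& b0, const Eigen::Vector3d& b1, const Eigen::Vector3d& b2) {
    const Eigen::Matrix2d J = localEdges(b0, b1, b2) * localEdges(a0, a1, a2).inverse();
    Eigen::JacobiSVD<Eigen::Matrix2d> svd(J); // J maps input edges to polycube edges
    return { svd.singularValues()(0), svd.singularValues()(1) };
}
```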
Let's take these distortion computations one by one.
```cpp
// Spherical Parametrization and Remeshing
// Praun & Hoppe
double computeStretch(const Eigen::VectorXd& A, double A_m, double A_d,
                      std::vector<std::pair<double, double>> per_tri_singular_values){
    double sum = 0;
    for (int f_id = 0; f_id < per_tri_singular_values.size(); f_id++){
        double s1 = per_tri_singular_values[f_id].first;
        double s2 = per_tri_singular_values[f_id].second;
        sum += A(f_id) * (s1 * s1 + s2 * s2) / 2.0;
    }
    sum /= A.sum();
    return (A_m / A_d) * (1.0 / std::pow(sum, 2));
}
```
So we have:

$$\mathit{stretch} = \frac{A_m}{A_d} \cdot \left( \frac{\displaystyle\sum_{f} A_f \, \frac{\sigma_{1,f}^2 + \sigma_{2,f}^2}{2}}{\displaystyle\sum_{f} A_f} \right)^{\!-2}$$
```cpp
// PolyCube-Maps
// Tarini & Hormann & Cignoni & Montani
double computeAreaDisto(const Eigen::VectorXd& A, std::vector<std::pair<double, double>> per_tri_singular_values){
    double sum = 0;
    for (int f_id = 0; f_id < per_tri_singular_values.size(); f_id++){
        double s1 = per_tri_singular_values[f_id].first;
        double s2 = per_tri_singular_values[f_id].second;
        sum += A(f_id) * 0.5 * (s1 * s2 + 1.0 / (s1 * s2));
    }
    return sum / A.sum();
}
```
This gives:

$$\mathit{area\ distortion} = \frac{\displaystyle\sum_{f} A_f \cdot \frac{1}{2}\left(\sigma_{1,f}\,\sigma_{2,f} + \frac{1}{\sigma_{1,f}\,\sigma_{2,f}}\right)}{\displaystyle\sum_{f} A_f}$$
```cpp
double computeAngleDisto(const Eigen::VectorXd& A, std::vector<std::pair<double, double>> per_tri_singular_values){
    double sum = 0;
    for (int f_id = 0; f_id < per_tri_singular_values.size(); f_id++){
        double s1 = per_tri_singular_values[f_id].first;
        double s2 = per_tri_singular_values[f_id].second;
        sum += A(f_id) * 0.5 * (s1 / s2 + s2 / s1);
    }
    return sum / A.sum();
}
```
This gives:

$$\mathit{angle\ distortion} = \frac{\displaystyle\sum_{f} A_f \cdot \frac{1}{2}\left(\frac{\sigma_{1,f}}{\sigma_{2,f}} + \frac{\sigma_{2,f}}{\sigma_{1,f}}\right)}{\displaystyle\sum_{f} A_f}$$
Lastly:
```cpp
// Computing Surface PolyCube-Maps by Constrained Voxelization
// Yang, Fu, Liu
double computeIsometricDisto(const Eigen::VectorXd& A, std::vector<std::pair<double, double>> per_tri_singular_values){
    double sum = 0;
    for (int f_id = 0; f_id < per_tri_singular_values.size(); f_id++){
        double s1 = per_tri_singular_values[f_id].first;
        double s2 = per_tri_singular_values[f_id].second;
        sum += std::max(s1, 1.0 / s2) * A(f_id);
    }
    return sum / A.sum();
}
```
This results in:

$$\mathit{isometric\ distortion} = \frac{\displaystyle\sum_{f} A_f \cdot \max\left(\sigma_{1,f},\ \frac{1}{\sigma_{2,f}}\right)}{\displaystyle\sum_{f} A_f}$$
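As a sanity check (a hypothetical snippet reusing the functions above), an identity mapping, i.e. $\sigma_1 = \sigma_2 = 1$ on every triangle, gives 1.0 for all these metrics, which is consistent with 1 being the ideal value:

```cpp
// Identity mapping: both singular values are 1 on every triangle.
Eigen::VectorXd A(2);
A << 0.5, 0.5;
std::vector<std::pair<double, double>> sv = {{1.0, 1.0}, {1.0, 1.0}};
double area  = computeAreaDisto(A, sv);         // == 1.0
double angle = computeAngleDisto(A, sv);        // == 1.0
double iso   = computeIsometricDisto(A, sv);    // == 1.0
double str   = computeStretch(A, 1.0, 1.0, sv); // == 1.0 when A_m == A_d
```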
Transcoding
In validity-first-polycube-labeling, I simply copied the Evocube code, cf. `src/geometry_distortion.cpp`, but using the STL & Geogram instead of libigl & Eigen.

Result: our executable `polycube_distortion` gives the same values as `measurement` from Evocube.
However, I don't find the same values as the Evocube output data. When Evocube was executed on the whole input dataset, `measurement`, via `init_from_folder`, created a `logs.json` file for each 3D model. Then `supplemental_generator` browsed all the sub-folders and generated a PDF report. By executing our `polycube_distortion` on the Evocube output meshes, I don't obtain the same values as the ones in `logs.json`, and the latter differ from the values in the PDF (!). Was the distortion calculation modified in the meantime? The code history seems to say no.
Dive into the referenced articles

Let's start with the stretch. In the code, Evocube refers to Praun and Hoppe 2003[^3]. Indeed, in section 3.3, "Review of planar-domain stretch metrics", we find a local definition at a point $p$ of the surface:

$$L^2(p) = \sqrt{\frac{\Gamma(p)^2 + \gamma(p)^2}{2}}$$

The stretch over the whole surface is then obtained by integration:

$$L^2 = \sqrt{\frac{\displaystyle\int_S L^2(p)^2 \, dA}{\displaystyle\int_S dA}}$$

with $\Gamma$ and $\gamma$ the largest and smallest singular values of the Jacobian of the parametrization at $p$. In the case of a triangle mesh, the parametrization is linear on each triangle and the stretch becomes a sum over the triangles:

$$L^2(M) = \sqrt{\frac{\displaystyle\sum_{T_i \in M} L^2(T_i)^2 \, A(T_i)}{\displaystyle\sum_{T_i \in M} A(T_i)}}$$
Question
How do you go from the integral to the sum?
Question
Why does Evocube (cf. eq.3 of this issue) use an inverse function? And why is the sum squared, whereas here there is an encompassing square root?
Praun and Hoppe 2003[^3] refer to Sander et al. 2001[^4] about the stretch. This article talks about it in section 3, "Texture stretch metric". We find eq.7 of this issue (the calculation of the per-triangle stretch $L^2(T_i)$), with $\Gamma$ and $\gamma$ given in closed form from the partial derivatives of the affine map of each triangle.
Question
That doesn't look like what Evocube does, does it?
The normalization, so that 1.0 is the minimum, is doable in two ways (see the derivation below):

- by the scaling, upstream, of the texture domain, so that its area is the same as the area of the 3D surface
- by the multiplication of the stretch by the factor $\sqrt{\frac{ \displaystyle\sum_{T_i \in M}{A(T_i)} }{ \displaystyle\sum_{T_i \in M}{A'(T_i)} }}$
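The square root is the same one as in the `V2` scaling seen earlier (a quick derivation, mine): scaling coordinates by $s$ scales every triangle area by $s^2$, so

$$s = \sqrt{\frac{\displaystyle\sum_{T_i \in M} A(T_i)}{\displaystyle\sum_{T_i \in M} A'(T_i)}} \quad\Longrightarrow\quad \sum_{T_i \in M} A'(s\,T_i) = s^2 \sum_{T_i \in M} A'(T_i) = \sum_{T_i \in M} A(T_i)$$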
Question
Evocube does both, doesn't it?
After the stretch, let's look at the area and angle distortions. In the source code, Evocube refers to Tarini et al. 2004[^2]. In the Table 1 legend, we can read that they are "measured by integrating and normalizing the values" $\sigma_1 \sigma_2 + \frac{1}{\sigma_1 \sigma_2}$ and $\frac{\sigma_1}{\sigma_2} + \frac{\sigma_2}{\sigma_1}$ over the surface.
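Integrating and normalizing over the surface $S$ presumably means (my reading, which matches `computeAreaDisto()` and `computeAngleDisto()` up to their $0.5$ factor):

$$\frac{\displaystyle\int_S \left(\sigma_1 \sigma_2 + \frac{1}{\sigma_1 \sigma_2}\right) dA}{\displaystyle\int_S dA} \qquad\text{and}\qquad \frac{\displaystyle\int_S \left(\frac{\sigma_1}{\sigma_2} + \frac{\sigma_2}{\sigma_1}\right) dA}{\displaystyle\int_S dA}$$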
In Degener et al. 2003[^5], we have a similar definition in section 2.5. I suppose eq.10 is equivalent to it. The next section (2.6) talks about the discretization, and what these expressions become on a triangle mesh.
In Floater & Hormann 2004[^6], we find a similar formulation of the area distortion (section 8).
I didn't find a similar formulation of the angle distortion.
Lastly, the isometric distortion, for which the Evocube source code refers to Yang et al. 2019[^7]. This article defines this distortion on a triangle $f$ as $\max\left\{\sigma_1(f),\ \frac{1}{\sigma_2(f)}\right\}$, with $\sigma_1(f) \geq \sigma_2(f)$ the singular values of the Jacobian of the mapping restricted to $f$.
Question
What do the braces mean?
Yang et al. 2019[^7] refer (in section 4) to Fu et al. 2015[^8], who present the isometric distortion as penalizing both the conformal distortion (angle?) and the area distortion.
Question
What is
They refer to Degener et al. 2003[^5] as well, mentioned above for another distortion. In section 2.6 we can find an expression, which looks like eq.14?

But neither eq.13, nor eq.14, nor eq.15 looks like the Evocube code.
Conclusion
- The workability of Evocube is, although not clearly transcribed in the article, the sum of the angle and area distortions, from which 4 is subtracted (not explained).
- The stretch calculation in the Evocube code seems very different from the referenced articles.
- The isometric distortion in the Evocube code does not look like the equations of the referenced articles.
- The normalization of the metrics does not seem homogeneous.
- I cannot reproduce the values that Evocube recorded.
Footnotes
[^1]: Dumery, Protais, Mestrallet, Bourcier, Ledoux, "Evocube: a Genetic Labeling Framework for Polycube-Maps", Computer Graphics Forum, 2022, https://onlinelibrary.wiley.com/doi/10.1111/cgf.14649
[^2]: Tarini, Hormann, Cignoni, Montani, "PolyCube-Maps", Proceedings of SIGGRAPH, 2004, https://vcg.isti.cnr.it/polycubemaps/resources/sigg04.pdf
[^3]: Praun, Hoppe, "Spherical Parametrization and Remeshing", ACM Transactions on Graphics, 2003, https://hhoppe.com/sphereparam.pdf
[^4]: Sander, Snyder, Gortler, Hoppe, "Texture Mapping Progressive Meshes", SIGGRAPH, 2001, https://hhoppe.com/tmpm.pdf
[^5]: Degener, Meseth, Klein, "An adaptable surface parametrization method", International Meshing Roundtable, 2003, https://cg.cs.uni-bonn.de/backend/v1/files/publications/degener-2003-adaptable.pdf
[^6]: Floater, Hormann, "Surface parametrization: a tutorial and survey", Advances in Multiresolution for Geometric Modelling, Mathematics and Visualization, 2004, https://graphics.stanford.edu/courses/cs468-05-fall/Papers/param-survey.pdf
[^7]: Yang, Fu, Liu, "Computing Surface PolyCube-Maps by Constrained Voxelization", Computer Graphics Forum, 2019, https://onlinelibrary.wiley.com/doi/full/10.1111/cgf.13838
[^8]: Fu, Liu, Guo, "Computing locally injective mappings by advanced MIPS", ACM Transactions on Graphics, Volume 34, Issue 4, 2015, https://dl.acm.org/doi/10.1145/2766938