In this post, I'll give an overview of the HACD parameters and explain their meaning and how they should be set. The text will be improved over time; my main concern is to have things written down for reference...
1. Parameters overview
| Parameter | Description | Default |
| --- | --- | --- |
| NTargetTrianglesDecimatedMesh | Target number of triangles in the decimated mesh. The decimation stage was added mainly to decrease the computation costs for dense meshes (refer to Section 2.1). | 1000 |
| NVerticesPerCH | Maximum number of vertices in the generated convex-hulls. | 100 |
| ScaleFactor | Normalization factor used to ensure that the other parameters (e.g. concavity) are expressed w.r.t. a fixed size. Refer to Section 2.3 for details. | 1000 |
| SmallClusterThreshold | Threshold on the clusters' area (expressed as a percentage of the entire mesh area) under which a cluster is considered small and is forced to be merged with other clusters at the price of a higher concavity. | 0.25 |
| AddFacesPoints | If enabled, an additional ray located at the center of each triangle and pointing toward its normal is considered when computing the concavity of a non-flat cluster. This parameter was added to handle coarse meshes (i.e. with a low number of vertices). | ON |
| AddExtraDistPoints | If enabled, additional rays are considered to handle bowl-like shapes. | ON |
| NClusters | Minimum number of convex-hulls to be generated. | 1 |
| Concavity | Maximum allowed concavity. | 100 |
| ConnectDist | If the distance between two triangles, each belonging to a different connected component (CC), is lower than the ConnectDist threshold, an additional edge connecting them is added to the dual graph. This parameter was added to handle meshes with multiple CCs. | 30 |
| VolumeWeight | Weight controlling the contribution of the volume-related cost to the global edge-collapse cost (refer to XXX for details). | 0.0 (not used) |
| CompacityWeight | Weight controlling the contribution of the shape-factor-related cost to the global edge-collapse cost (refer to XXX for details). | 0.0001 |
| FlatRegionThreshold | Threshold, expressed as a percentage of ScaleFactor, under which a cluster is considered flat. | 1 |
| ComputationWeight | Weight controlling the contribution of the computation-related cost to the global edge-collapse cost (refer to XXX for details). | 0.01 |
2. Detailed description
- NTargetTrianglesDecimatedMesh
- NVerticesPerCH
In order to choose the best vertices to keep in the final CH, the ICHull::Process(unsigned long nPointsCH) function implements a slightly different version of the Incremental Convex Hull algorithm (cf. the demo code). Here, at each step, the point with the highest volume increment is chosen, until all points are processed or the CH has exactly NVerticesPerCH points.
The code looks like this:
while (!vertices.GetData().m_tag && addedPoints < nPointsCH) // not processed
{
    if (!FindMaxVolumePoint())
    {
        break;
    }
    vertices.GetData().m_tag = true;
    if (ProcessPoint())
    {
        addedPoints++;
        CleanUp(addedPoints);
        vertices.Next();
    }
}
// delete remaining points
while (!vertices.GetData().m_tag)
{
    vertices.Delete();
}
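In other words, with the default NVerticesPerCH = 100, the selection loop stops either once 100 points have been added (addedPoints < nPointsCH) or when FindMaxVolumePoint() fails (presumably because no remaining point can extend the hull); the second loop then discards all the points that were never tagged, so they do not appear in the final CH.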
- ScaleFactor
A normalization process is applied to the input mesh in order to ensure that the other parameters (e.g. concavity) are expressed w.r.t. a fixed size. This process is inverted before producing the final CHs.
The HACD::Compute() function performs the following main steps:
bool HACD::Compute(bool fullCH, bool exportDistPoints)
{
    if (m_targetNTrianglesDecimatedMesh > 0)
    {
        DecimateMesh(m_targetNTrianglesDecimatedMesh);
    }
    NormalizeData();
    CreateGraph();
    InitializeDualGraph();
    InitializePriorityQueue();
    Simplify();
    DenormalizeData();
    CreateFinalCH();
    return true;
}
The HACD::NormalizeData() function centers the mesh and scales it so that its coordinates lie in the interval [-m_scale, m_scale] x [-m_scale, m_scale] x [-m_scale, m_scale]. The code proceeds as follows:
void HACD::NormalizeData()
{
    const Real invDiag = static_cast<Real>(2.0 * m_scale / m_diag);
    for (size_t v = 0; v < m_nPoints; v++)
    {
        m_points[v] = (m_points[v] - m_barycenter) * invDiag;
    }
}
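As a quick sanity check of the arithmetic (assuming m_diag holds the length of the mesh's bounding-box diagonal): with the default ScaleFactor of 1000 and a bounding-box diagonal of 2.5 units, invDiag = 2 * 1000 / 2.5 = 800, so a vertex located 1.25 units away from the barycenter is mapped 1000 units (= m_scale) away from the origin.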
The HACD::DenormalizeData() function inverts the normalization performed by HACD::NormalizeData():
void HACD::DenormalizeData()
{
    const Real diag = static_cast<Real>(m_diag / (2.0 * m_scale));
    for (size_t v = 0; v < m_nPoints; v++)
    {
        m_points[v] = m_points[v] * diag + m_barycenter;
    }
}
- SmallClusterThreshold
In the HACD::Simplify() function, condition2 forces small clusters to be merged (m_area is the area of the entire mesh):
void HACD::Simplify()
{
    double areaThreshold = m_area * m_smallClusterThreshold / 100.0;
    while ( !m_pqueue.empty() )
    {
        currentEdge = m_pqueue.top();
        m_pqueue.pop();
        v1 = m_graph.m_edges[currentEdge.m_name].m_v1;
        v2 = m_graph.m_edges[currentEdge.m_name].m_v2;
        bool condition1 = (m_graph.m_edges[currentEdge.m_name].m_concavity < m_concavity) &&
                          (globalConcavity < m_concavity) &&
                          (m_graph.GetNVertices() > m_nMinClusters) &&
                          (m_graph.GetNEdges() > 0);
        bool condition2 = (m_graph.m_vertices[v1].m_surf < areaThreshold ||
                           m_graph.m_vertices[v2].m_surf < areaThreshold);
        if (condition1 || condition2)
        {
            m_graph.EdgeCollapse(v1, v2);
            long idEdge;
            for (size_t itE = 0; itE < m_graph.m_vertices[v1].m_edges.Size(); ++itE)
            {
                idEdge = m_graph.m_vertices[v1].m_edges[itE];
                ComputeEdgeCost(idEdge);
            }
        }
    }
}
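With the default SmallClusterThreshold of 0.25, areaThreshold = m_area * 0.25 / 100.0, i.e. 0.25% of the total mesh area: condition2 keeps collapsing any edge that touches a cluster smaller than that, even when condition1 would reject the collapse because the concavity budget is already exceeded.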
- AddFacesPoints and AddExtraDistPoints
The parameter AddFacesPoints was introduced to improve the precision of the concavity computation for meshes with a low number of vertices. The idea is to add additional rays, each located at the center of a triangle and pointing in the same direction as its normal.
The parameter AddExtraDistPoints was added to handle bowl-like shapes. As illustrated below, if only the rays located on the current cluster are considered when computing its concavity, you may end up with one big cluster corresponding to the external surface of the bowl (which is convex) and a lot of small clusters located on the internal part (which is concave).
Bad decomposition for bowl-like shapes if AddExtraDistPoints is not enabled
In order to avoid such a bad decomposition, the idea is to introduce new rays that constrain the propagation of the cluster corresponding to the external surface of the bowl by taking into account rays located on the concave part. More precisely, during the initialization stage, an additional ray (the yellow ray in the figure below) is associated with each triangle (colored in red).
The additional ray, denoted R (the yellow arrow), is defined as follows. Let N be the normal (the blue arrow) to the current triangle T (colored in red) and X be the ray starting at the center of T with direction (-N) (the dotted green arrow). The starting point of R, denoted P0 (the yellow point), is defined as the nearest intersection point of X with the mesh. Moreover, the normal of the surface at P0 should point in the same direction as X. R has the direction of the normal of the surface at P0.
Additional ray (yellow) associated with the red triangle when AddExtraDistPoints is activated
The code looks like this:
void HACD::InitializeDualGraph()
{
    for (unsigned long f = 0; f < m_nTriangles; f++)
    {
        i = m_triangles[f].X();
        j = m_triangles[f].Y();
        k = m_triangles[f].Z();
        m_graph.m_vertices[f].m_distPoints.PushBack(DPoint(i, 0, false, false));
        m_graph.m_vertices[f].m_distPoints.PushBack(DPoint(j, 0, false, false));
        m_graph.m_vertices[f].m_distPoints.PushBack(DPoint(k, 0, false, false));
        u = m_points[j] - m_points[i];
        v = m_points[k] - m_points[i];
        w = m_points[k] - m_points[j];
        normal = u ^ v;
        m_normals[i] += normal;
        m_normals[j] += normal;
        m_normals[k] += normal;
        m_graph.m_vertices[f].m_surf = normal.GetNorm();
        m_area += m_graph.m_vertices[f].m_surf;
        normal.Normalize();
        if (m_addFacesPoints)
        {
            m_faceNormals[f] = normal;
            m_facePoints[f] = (m_points[i] + m_points[j] + m_points[k]) / 3.0;
        }
        if (m_addExtraDistPoints)
        {
            Vec3<Real> seedPoint((m_points[i] + m_points[j] + m_points[k]) / 3.0);
            Vec3<Real> hitPoint;
            Vec3<Real> hitNormal;
            normal = -normal;
            if (rm.Raycast(seedPoint, normal, hitTriangle, dist, hitPoint, hitNormal))
            {
                faceIndex = hitTriangle;
            }
            if (faceIndex < m_nTriangles)
            {
                m_extraDistPoints[f] = hitPoint;
                m_extraDistNormals[f] = hitNormal;
                m_graph.m_vertices[f].m_distPoints.PushBack(DPoint(m_nPoints + f, 0, false, true));
            }
        }
    }
    for (size_t v = 0; v < m_nPoints; v++)
    {
        m_normals[v].Normalize();
    }
}
- Concavity
- ConnectDist
The code looks like this:
void HACD::CreateGraph()
{
    …
    if (m_ccConnectDist >= 0.0)
    {
        m_graph.ExtractCCs();
        if (m_graph.m_nCCs > 1)
        {
            std::vector< std::set<long> > cc2V;
            cc2V.resize(m_graph.m_nCCs);
            long cc;
            for (size_t t = 0; t < m_nTriangles; ++t)
            {
                cc = m_graph.m_vertices[t].m_cc;
                cc2V[cc].insert(m_triangles[t].X());
                cc2V[cc].insert(m_triangles[t].Y());
                cc2V[cc].insert(m_triangles[t].Z());
            }
            for (size_t cc1 = 0; cc1 < m_graph.m_nCCs; ++cc1)
            {
                for (size_t cc2 = cc1 + 1; cc2 < m_graph.m_nCCs; ++cc2)
                {
                    std::set<long>::const_iterator itV1(cc2V[cc1].begin()), itVEnd1(cc2V[cc1].end());
                    for (; itV1 != itVEnd1; ++itV1)
                    {
                        double distC1C2 = std::numeric_limits<double>::max();
                        double dist;
                        t1 = -1;
                        t2 = -1;
                        std::set<long>::const_iterator itV2(cc2V[cc2].begin()), itVEnd2(cc2V[cc2].end());
                        for (; itV2 != itVEnd2; ++itV2)
                        {
                            dist = (m_points[*itV1] - m_points[*itV2]).GetNorm();
                            if (dist < distC1C2)
                            {
                                distC1C2 = dist;
                                t1 = *vertexToTriangles[*itV1].begin();
                                std::set<long>::const_iterator it2(vertexToTriangles[*itV2].begin()),
                                                               it2End(vertexToTriangles[*itV2].end());
                                t2 = -1;
                                for (; it2 != it2End; ++it2)
                                {
                                    if (*it2 != t1)
                                    {
                                        t2 = *it2;
                                        break;
                                    }
                                }
                            }
                        }
                        if (distC1C2 <= m_ccConnectDist && t1 >= 0 && t2 >= 0)
                        {
                            m_graph.AddEdge(t1, t2);
                        }
                    }
                }
            }
        }
    }
}
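Note that, as shown in the HACD::Compute() listing above, CreateGraph() is called after NormalizeData(), so the ConnectDist threshold is compared against distances in the normalized coordinate frame: with the defaults (ConnectDist = 30, ScaleFactor = 1000), two connected components get an extra dual-graph edge when their closest vertices are less than 30 normalized units apart, i.e. 3% of ScaleFactor.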
- FlatRegionThreshold
In practice, the final concavity is computed as a weighted sum of the flat-region concavity and the 3D concavity. The weight is a function of the flatness of the cluster.
The code is as follows:
void HACD::ComputeEdgeCost(size_t e)
{
    …
    double surfCH = ch->ComputeArea() / 2.0;
    double volumeCH = ch->ComputeVolume();
    double vol2Surf = volumeCH / surfCH;
    double concavity_flat = sqrt(fabs(surfCH - surf));
    double weightFlat = std::max(0.0, 1.0 - pow(- vol2Surf * 100.0 / (m_scale * m_flatRegionThreshold), 2.0));
    concavity_flat *= weightFlat;
    if (!ch->IsFlat())
    {
        concavity = Concavity(*ch, distPoints);
    }
    concavity += concavity_flat;
    …
}
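With the defaults (ScaleFactor = 1000, FlatRegionThreshold = 1), the weight above reduces to weightFlat = max(0, 1 - (vol2Surf / 10)^2): the flatter the cluster (i.e. the smaller its volume-to-surface ratio), the closer the weight is to 1 and the more the flat-region concavity contributes; the weight drops to 0 once vol2Surf reaches 10, i.e. 1% of ScaleFactor.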