Scott Mitchell - Academia.edu

Papers by Scott Mitchell

Curve Reconstruction With Only Half The Samples

Computing Geometry as a Mathematician in an Engineering Laboratory

OSTI OAI (U.S. Department of Energy Office of Scientific and Technical Information), Sep 1, 2021

Incremental Interval Assignment by Integer Linear Algebra

Zenodo (CERN European Organization for Nuclear Research), Oct 9, 2021

Interval Assignment (IA) is the problem of selecting the number of mesh edges (intervals) for each curve for conforming quad and hex meshing. The vector of intervals x is fundamentally integer-valued, yet many approaches perform floating-point optimization and convert a floating-point solution into an integer solution. We avoid such steps: we start integer, stay integer. Incremental Interval Assignment (IIA) uses integer linear algebra (Hermite normal form) to find an initial solution to the matrix equation Ax = b satisfying the meshing constraints. Solving for reduced row echelon form provides integer vectors spanning the nullspace of A. We add vectors from the nullspace to improve the initial solution. Compared to floating-point optimization approaches, IIA is faster and always produces an integer solution. The potential drawback is that there is no theoretical guarantee that the solution is optimal, but in practice we achieve solutions close to the user goals. The software is freely available.
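
To make the linear-algebra flow concrete, here is a minimal Python/SymPy sketch of the "start integer, stay integer" idea on a toy constraint (two pairs of opposite curves on a mapped surface must receive equal intervals). The matrix, goals, and step search are illustrative assumptions, not the paper's code; the released IIA software works with Hermite normal form and RREF directly.

from sympy import Matrix, lcm

# Toy mapped-surface constraint (an assumption for illustration):
# opposite curve pairs need equal intervals, so A x = b with b = 0.
A = Matrix([[1, 0, -1, 0],
            [0, 1, 0, -1]])
b = Matrix([0, 0])

# An initial integer solution satisfying the meshing constraints.
x = Matrix([4, 7, 4, 7])
assert A * x == b

# Integer vectors spanning the nullspace of A: scale SymPy's rational
# basis vectors by the LCM of their entries' denominators.
def integer_nullspace(A):
    return [v * lcm([e.q for e in v]) for v in A.nullspace()]

# Improve the solution toward user goals by adding nullspace vectors;
# every step keeps A x = b exactly and keeps x integer.
goals = Matrix([5, 7, 5, 7])
for n in integer_nullspace(A):
    step = min(range(-3, 4),
               key=lambda t: sum(abs(e) for e in x + t * n - goals))
    x = x + step * n
assert A * x == b
print(x.T)  # an integer interval assignment matching the goals: [5 7 5 7]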

k-d darts: Sampling by k-Dimensional Flat Searches

OSTI OAI (U.S. Department of Energy Office of Scientific and Technical Information), Feb 1, 2012

We formalize the notion of sampling a function using k-d darts. A k-d dart is a set of independent, mutually orthogonal, k-dimensional subspaces called k-d flats. Each dart has d choose k flats, aligned with the coordinate axes for efficiency. We show that k-d darts are useful for exploring a function's properties, such as estimating its integral or finding an exemplar above a threshold. We describe a recipe for converting an algorithm from point sampling to k-d dart sampling, assuming the function can be evaluated along a k-d flat. We demonstrate that k-d darts are more efficient than point-wise samples in high dimensions, depending on the characteristics of the sampling domain: e.g., when the subregion of interest has small volume and evaluating the function along a flat is not too expensive. We present three concrete applications using line darts (1-d darts): relaxed maximal Poisson-disk sampling, high-quality rasterization of depth-of-field blur, and estimation of the probability of failure from a response surface for uncertainty quantification. In these applications, line darts achieve the same output fidelity as point darts in less time. We also demonstrate the accuracy of higher-dimensional darts for a volume-estimation problem. For Poisson-disk sampling, we use significantly less memory, enabling the generation of larger point clouds in higher dimensions.
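
As a toy illustration of line darts (1-d darts), the Python sketch below estimates the area of a disk in the unit square: each dart evaluates the function along d choose k = 2 axis-aligned lines, and the exact chord length along each line stands in for many point evaluations. The disk and estimator are my own assumptions for illustration, not the paper's applications.

import math, random

CX, CY, R = 0.5, 0.5, 0.3          # a disk inside the unit square

def chord_length(offset, center):
    # Exact length of a line's intersection with the disk, given the
    # line's perpendicular distance from the disk center.
    d = abs(offset - center)
    return 2.0 * math.sqrt(R * R - d * d) if d < R else 0.0

def line_dart_estimate(n):
    # Each dart is d-choose-k = 2 mutually orthogonal axis-aligned
    # lines; the function is "evaluated along the flat" in closed form.
    total = 0.0
    for _ in range(n):
        px, py = random.random(), random.random()
        total += 0.5 * (chord_length(px, CX) + chord_length(py, CY))
    return total / n               # mean chord length equals the area here

def point_estimate(n):
    hits = sum((random.random() - CX) ** 2 + (random.random() - CY) ** 2 < R * R
               for _ in range(n))
    return hits / n

print("true area :", math.pi * R * R)
print("line darts:", line_dart_estimate(10_000))
print("points    :", point_estimate(10_000))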

Incremental Interval Assignment (IIA) for Scalable Mesh Preparation

Statistical Inference for Porous Materials using Persistent Homology

We propose a porous-materials analysis pipeline using persistent homology. We first compute persistent homology of binarized 3D images of sampled material subvolumes. For each image we compute sets of homology intervals, which are represented as summary graphics called persistence diagrams. We convert persistence diagrams into image vectors in order to analyze the similarity of the homology of the material images using mature tools for image analysis. Each image is treated as a vector, and we compute its principal components to extract features. We fit a statistical model using the loadings of principal components to estimate material porosity, permeability, anisotropy, and tortuosity. We also propose an adaptive version of the structural similarity index (SSIM), a similarity metric for images, as a measure to determine the statistical representative elementary volumes (sREV) for persistent homology. Thus we provide a capability for making statistical inferences about the fluid flow and transport properties of porous materials based on their geometry and connectivity.
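
A schematic numpy sketch of the vectorization-plus-PCA step follows: persistence diagrams are rasterized into fixed-length image vectors, which are then centered and projected onto leading principal components. The grid size, Gaussian kernel, and random placeholder "diagrams" are assumptions; the paper's actual binarization, filtration, and regression details differ.

import numpy as np

def persistence_image(diagram, grid=20, sigma=0.05):
    # Rasterize (birth, death) pairs as Gaussian bumps in
    # (birth, persistence) coordinates: one fixed-length vector per diagram.
    xs = np.linspace(0.0, 1.0, grid)
    gx, gy = np.meshgrid(xs, xs)
    img = np.zeros((grid, grid))
    for b, d in diagram:
        pers = d - b
        img += pers * np.exp(-((gx - b) ** 2 + (gy - pers) ** 2) / (2 * sigma ** 2))
    return img.ravel()

# Placeholder "diagrams" standing in for real material subvolumes.
rng = np.random.default_rng(0)
diagrams = [np.sort(rng.uniform(0, 1, (30, 2)), axis=1) for _ in range(8)]

X = np.stack([persistence_image(d) for d in diagrams])
X -= X.mean(axis=0)                 # center before PCA
_, _, Vt = np.linalg.svd(X, full_matrices=False)
features = X @ Vt[:2].T             # projections onto two leading components
print(features.shape)               # (8, 2): inputs to a porosity/permeability model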

Root Cause Analysis of Networked Computer Alerts

Technometrics, Jun 1, 2010

Geometric Comparison of Popular Mixture-Model Distances

Computer Aided Geometric Design, Dec 1, 2010

Statistical Latent Dirichlet Analysis produces mixture-model data that are geometrically equivalent to points lying on a regular simplex in moderate to high dimensions. Numerous other statistical models and techniques also produce data in this geometric category, even though the meaning of the axes and coordinate values differs significantly. A distance function is used to further analyze these points, for example to cluster them. Several different distance functions are popular amongst statisticians; the choice of distance function is usually driven by the historical preference of the application domain, information-theoretic considerations, or the desirability of the clustering results. Relatively little consideration is usually given to how distance functions geometrically transform data, or to the distances' algebraic properties. Here we take a look at these issues, in the hope of providing complementary insight and inspiring further geometric thought. Several popular distances, χ², Jensen-Shannon divergence, and the square of the Hellinger distance, are shown to be nearly equivalent: in terms of functional forms after transformations, factorizations, and series expansions, and in terms of the shape and proximity of constant-value contours. This is somewhat surprising given that their original functional forms look quite different. Cosine similarity is the square of the Euclidean distance, and a similar geometric relationship is shown with Hellinger and another cosine. We suggest a geodesic variation of Hellinger. The square-root projection that arises in Hellinger distance is briefly compared to standard normalization for Euclidean distance. We include detailed derivations of some ratio and difference bounds for illustrative purposes. We provide some constructions that nearly achieve the worst-case ratios, relevant for contours.
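
The near-equivalence is easy to observe numerically. The sketch below (my own illustration, with one common convention chosen for each distance) compares symmetric χ², squared Hellinger, and Jensen-Shannon for two nearby points on the simplex; to second order they agree after the constant rescalings shown.

import numpy as np

def chi2_sym(p, q):                 # symmetric chi-squared distance
    return np.sum((p - q) ** 2 / (p + q))

def hellinger_sq(p, q):             # squared Hellinger distance
    return 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

def jensen_shannon(p, q):           # Jensen-Shannon divergence (nats)
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

rng = np.random.default_rng(1)
p = rng.dirichlet(np.ones(10))
q = 0.99 * p + 0.01 * rng.dirichlet(np.ones(10))   # a nearby simplex point

# To second order in (p - q), all three values nearly coincide:
print(chi2_sym(p, q), 4 * hellinger_sq(p, q), 4 * jensen_shannon(p, q))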

Approximating the maxmin-angle covering triangulation

Computational Geometry: Theory and Applications, 1997

Given a planar straight-line graph or polygon with holes, we seek a covering triangulation whose minimum angle is as large as possible. A covering triangulation is a Steiner triangulation with the following restriction: no Steiner vertices may be added on an input edge. We give an explicit upper bound on the largest possible minimum angle in any covering triangulation of a given input. This upper bound depends on local geometric features of the input. We then show that our covering triangulation has minimum angle at least a constant factor times this upper bound. This is the first known algorithm for generating a covering triangulation of an arbitrary input with a provable bound on triangle shape. Covering triangulations can be used to triangulate intersecting regions independently, and so solve several subproblems of mesh generation.

Mesh generation with provable quality bounds

Incremental Interval Assignment by Integer Linear Algebra with Improvements

Computer Aided Design, May 1, 2023

LDRD 149045 final report distinguishing documents

This LDRD final report describes work performed in 2010. Prof. Afra Zomorodian provided insight. This was a small late-start LDRD. Several other ongoing efforts were leveraged, including the Networks Grand Challenge LDRD and the Computational Topology CSRF project, and some of the leveraged work is described here. We proposed a sentence-mining technique that exploited both the distribution and the order of parts-of-speech (POS) in sentences in English-language documents. The ultimate goal was to be able to discover "call-to-action" framing documents hidden within a corpus of mostly expository documents, even if the documents were all on the same topic and used the same vocabulary. Using POS was novel. We also took a novel approach to analyzing POS. We used the hypothesis that English follows a dynamical system and the POS are trajectories from one state to another. We analyzed the sequences of POS using support vector machines and the cycles of POS using computational homology. We discovered that the POS were a very weak signal and did not support our hypothesis well. Our original goal appeared to be unobtainable with our original approach. We turned our attention to studying an aspect of a more traditional approach to distinguishing documents. Latent Dirichlet Allocation (LDA) turns documents into bags of words and then into mixture-model points. A distance function is used to cluster groups of points to discover relatedness between documents. We performed a geometric and algebraic analysis of the most popular distance functions and made some significant and surprising discoveries, described in a separate technical report [9].
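
As a hypothetical illustration of the "POS as trajectories of a dynamical system" hypothesis, the sketch below represents each document by its part-of-speech transition frequencies, the kind of feature vector one could feed to a support vector machine. The miniature tag set and documents are made up; the project used real tagged English text.

import numpy as np

TAGS = ["NOUN", "VERB", "ADJ", "ADV", "DET"]   # a made-up miniature tag set
IDX = {t: i for i, t in enumerate(TAGS)}

def transition_features(pos_seq):
    # Normalized POS transition counts, flattened: the "trajectory"
    # of a document through part-of-speech states.
    M = np.zeros((len(TAGS), len(TAGS)))
    for a, b in zip(pos_seq, pos_seq[1:]):
        M[IDX[a], IDX[b]] += 1
    return (M / max(M.sum(), 1)).ravel()

doc1 = ["DET", "NOUN", "VERB", "DET", "ADJ", "NOUN"]
doc2 = ["VERB", "DET", "NOUN", "ADV", "VERB", "NOUN"]
X = np.stack([transition_features(d) for d in (doc1, doc2)])
print(X.shape)   # (2, 25): one feature vector per document, SVM-ready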

LDRD 102610 final report new processes for innovative microsystems engineering with predictive simulation

This LDRD final report describes work that Stephen W. Thomas performed in 2006. The initial problem was to develop a modeling, simulation, and optimization strategy for the design of a high-speed microsystem switch. The challenge was to model the right phenomena at the right level of fidelity, and to capture the right design parameters. This effort focused on the design context, in contrast to other Sandia efforts' focus on high-fidelity assessment. This report contains the initial proposal and the annual progress report. It also describes exploratory work on micromachining using femtosecond lasers. Steve's time developing a proposal and collaboration on this topic was partly funded by this LDRD.

Refining a triangulation of a planar straight-line graph to eliminate large angles

Triangulations without large angles have a number of applications in numerical analysis and computer graphics. In particular, the convergence of a finite element calculation depends on the largest angle of the triangulation. Also, the running time of a finite element calculation depends on the triangulation size, so a triangulation with few Steiner points (added vertices) is also important. Bern, Dobkin, and Eppstein [1991] pose as an open problem the existence of an algorithm to triangulate a planar straight-line graph (PSLG) without large angles using a polynomial number of Steiner points. We solve this problem by showing that any PSLG with v vertices can be triangulated with no angle larger than 7π/8 by adding O(v² log v) Steiner points in O(v² log² v) time. We first triangulate the PSLG with an arbitrary constrained triangulation and then refine that triangulation by adding additional vertices and edges. Some PSLGs require Ω(v²) Steiner points in any triangulation achieving any largest-angle bound less than π. Hence the number of Steiner points added by our algorithm is within a log v factor of worst-case optimal. We note that our refinement algorithm works on arbitrary triangulations: given any triangulation, we show how to refine it so that no angle is larger than 7π/8. Our construction adds O(nm + np log m) vertices and runs in time O((nm + np log m) log(m + p)), where n is the number of edges, m is one plus the number of obtuse angles, and p is one plus the number of holes and interior vertices in the original triangulation. A previously considered problem is refining a constrained triangulation of a simple polygon, where p = 1. For this problem we add O(v²) Steiner points, which is within a constant factor of worst-case optimal. The algorithms we present are very practical: for most inputs the number of Steiner points and running time would be considerably smaller than in the worst case.
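
For intuition, the small Python utility below flags triangles whose largest angle exceeds the 7π/8 bound the refinement guarantees. It is my own sketch of the kind of check that identifies triangles needing refinement, not the paper's algorithm.

import math

BOUND = 7 * math.pi / 8   # the largest-angle guarantee from the paper

def angles(tri):
    # Interior angles of a triangle given as three (x, y) vertices.
    out = []
    for i in range(3):
        a, b, c = tri[i], tri[(i + 1) % 3], tri[(i + 2) % 3]
        u = (b[0] - a[0], b[1] - a[1])
        v = (c[0] - a[0], c[1] - a[1])
        cos_a = (u[0] * v[0] + u[1] * v[1]) / (math.hypot(*u) * math.hypot(*v))
        out.append(math.acos(max(-1.0, min(1.0, cos_a))))
    return out

tri = [(0.0, 0.0), (10.0, 0.0), (5.0, 0.4)]   # a nearly degenerate triangle
print(max(angles(tri)) > BOUND)               # True: refinement would split it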

The All-Hex Geode-Template for Conforming a Diced Tetrahedral Mesh to any Diced Hexahedral Mesh

Engineering With Computers, Sep 13, 1999

Linear-size nonobtuse triangulation of polygons

We give an algorithm for triangulating n-vertex polygonal regions (with holes) so that no angle in the final triangulation measures more than π/2. The number of triangles in the triangulation is only O(n), improving a previous bound of O(n²), and the worst-case running time is O(n log² n). The basic technique used in the algorithm, recursive subdivision by disks, is new and may have wider application in mesh generation. We also report on an implementation of our algorithm. Throughout the application areas of such triangulations, it is generally true that large angles (that is, angles close to π) are undesirable; Babuška and Aziz [2] justi…

Reliable Whisker Weaving via Curve Contraction

Engineering With Computers, Sep 13, 1999

Whisker Weaving is an advancing-front algorithm for all-hexahedral mesh generation. It uses global information derived from grouping the mesh dual into surfaces, the spatial twist continuum (STC), to construct the connectivity of the mesh, then positions the nodes afterwards. Currently we are able to reliably generate hexahedral meshes for complicated geometries and surface meshes. However, the surface mesh must be modified locally. Also, in large, highly unstructured meshes, there are usually isolated regions where hex quality is poor. Reliability has been achieved by using new, provable curve-contraction algorithms to sequence the advancing-front process. We have also demonstrated that sheet moving can remove certain types of invalid connectivity.

A characterization of the quadrilateral meshes of a surface which admit a compatible hexahedral mesh of the enclosed volume

Lecture Notes in Computer Science, 1996


Quality Mesh Generation in Higher Dimensions

SIAM Journal on Computing, 2000

Quality mesh generation in three dimensions

We show how to triangulate a three-dimensional polyhedral region with holes. Our triangulation is optimal in the following two senses. First, our triangulation achieves the best possible aspect ratio up to a constant. Second, for any other triangulation of the same region into m tetrahedra with bounded aspect ratio, our triangulation has size n = O(m). Such a triangulation is desired as an initial mesh for a finite element mesh refinement algorithm. Previous three-dimensional triangulation schemes either worked only on a restricted class of inputs, did not guarantee well-shaped tetrahedra, or were not able to bound the output size. We build on some of the ideas presented in previous work by Bern, Eppstein, and Gilbert, who have shown how to triangulate a two-dimensional polyhedral region with holes, with similar quality and optimality bounds.
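
Aspect ratio can be made concrete with a small sketch. The Python code below computes one common tetrahedron aspect-ratio measure, circumradius over inradius (the paper's precise measure may differ); a regular tetrahedron attains the optimal value, 3.

import numpy as np

def tet_aspect_ratio(p0, p1, p2, p3):
    P = np.array([p0, p1, p2, p3], dtype=float)
    V = abs(np.linalg.det(P[1:] - P[0])) / 6.0          # volume
    faces = [(1, 2, 3), (0, 2, 3), (0, 1, 3), (0, 1, 2)]
    areas = [0.5 * np.linalg.norm(np.cross(P[j] - P[i], P[k] - P[i]))
             for i, j, k in faces]
    r_in = 3.0 * V / sum(areas)                         # inradius
    # Circumcenter c solves |c - p_i|^2 = |c - p_0|^2 for i = 1..3.
    M = 2.0 * (P[1:] - P[0])
    rhs = np.sum(P[1:] ** 2, axis=1) - np.sum(P[0] ** 2)
    c = np.linalg.solve(M, rhs)
    return np.linalg.norm(c - P[0]) / r_in              # circumradius / inradius

# A regular tetrahedron achieves the optimal ratio, 3.
print(tet_aspect_ratio((1, 1, 1), (1, -1, -1), (-1, 1, -1), (-1, -1, 1)))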
