Guoning Chen - Profile on Academia.edu
Papers by Guoning Chen
Research Square (Research Square), Dec 27, 2023
In May 2020, 2 months after the COVID-19 pandemic caused institutions across the world to move all their courses online, we conducted a survey to evaluate the impact of this transition on a group of computer science students. That first survey highlighted mostly negative effects, with students struggling to perform many class-related activities. About a year later, after a full year of remote teaching, we wanted to see if and how the students' sentiment had changed. To assess students' perceptions of remote teaching, we conducted a new survey composed of 41 questions spanning multiple-choice, Likert-scale, and open-ended formats. Additionally, we interviewed instructors of computer science courses to learn about their experience and how they adapted to the new teaching modality. In total, 137 students and 10 instructors shared feedback regarding their positive and negative experiences in the new learning format. Our results show that the students' experience improved significantly, to the point that many of them expressed interest in continuing to learn online, at least partially. Conversely, the instructors have concerns that this may not produce the best learning outcomes for the students. Our study also shows that some populations may still be at a disadvantage in this learning format. The results and considerations included in this report may benefit the conversation on how to conduct computer science higher education in a post-pandemic world.
IEEE Transactions on Visualization and Computer Graphics, Aug 1, 2015
Vector field simplification aims to reduce the complexity of the flow by removing features in order of their relevance and importance, to reveal prominent behavior and obtain a compact representation for interpretation. Most existing simplification techniques based on the topological skeleton successively remove pairs of critical points connected by separatrices, using distance- or area-based relevance measures. These methods rely on the stable extraction of the topological skeleton, which can be difficult due to instability in numerical integration, especially when processing highly rotational flows. In this paper, we propose a novel simplification scheme derived from the recently introduced topological notion of robustness, which enables the pruning of sets of critical points according to a quantitative measure of their stability, that is, the minimum amount of vector field perturbation required to remove them. This leads to a hierarchical simplification scheme that encodes flow magnitude in its perturbation metric. Our novel simplification algorithm is based on degree theory and has minimal boundary restrictions. Finally, we provide an implementation under the piecewise-linear setting and apply it to both synthetic and real-world datasets. We show local and complete hierarchical simplifications for steady as well as unsteady vector fields.
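As an illustration of the degree-theoretic building block such schemes rest on, the sketch below computes the Poincaré index (topological degree) of a cell in a 2D piecewise-linear field by accumulating the winding of the vector along the cell boundary. It is a minimal stand-in, not the paper's implementation; the boundary sampling shown is an assumption.

```python
# Minimal sketch: Poincare index of a cell from field samples taken in
# order along a closed loop around it. A nonzero index signals critical
# points inside the cell.
import numpy as np

def poincare_index(boundary_vectors):
    """boundary_vectors: (n, 2) array of field samples along a closed loop."""
    angles = np.arctan2(boundary_vectors[:, 1], boundary_vectors[:, 0])
    # Angle differences between consecutive samples, wrapped to [-pi, pi).
    d = np.diff(np.append(angles, angles[0]))
    d = (d + np.pi) % (2.0 * np.pi) - np.pi
    return int(round(d.sum() / (2.0 * np.pi)))

# Example: v = (x, y) sampled around the unit circle -> a source, index +1.
t = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
print(poincare_index(np.column_stack([np.cos(t), np.sin(t)])))  # 1
```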
IS&T International Symposium on Electronic Imaging Science and Technology, Feb 14, 2016
Morse decomposition has been shown to be a reliable way to compute and represent vector field topology. Its computation first converts the original vector field into a directed graph representation, so that flow recurrent dynamics (i.e., Morse sets) can be identified as certain strongly connected components of the graph. In this paper, we present a framework that enables the user to efficiently compute Morse decompositions of 3D piecewise-linear vector fields defined on regular grids. Specifically, we extend the 2D adaptive edge sampling technique to 3D to compute the outer approximation of the image of any 3D cell for the construction of the directed graph. To achieve a finer decomposition, a hierarchical refinement framework is applied to procedurally increase the integration steps and subdivide the underlying grids that contain certain Morse sets. To improve computational performance, we implement our Morse decomposition framework using CUDA. We have applied our framework to a number of analytic and real-world 3D steady vector fields to demonstrate its utility.
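To make the graph stage concrete, here is a minimal sketch (not the paper's code): each grid cell becomes a node with edges to the cells its flow image touches, and Morse sets are recovered as the non-trivial strongly connected components. The `flow_image` callback, which the paper computes via adaptive edge sampling, is assumed to be supplied by the caller.

```python
# Minimal sketch of Morse-set extraction from a cell-to-cell flow graph.
import networkx as nx

def morse_sets(cells, flow_image):
    """cells: hashable cell ids; flow_image(c) -> cells reachable from c."""
    g = nx.DiGraph()
    g.add_nodes_from(cells)
    for c in cells:
        for d in flow_image(c):
            g.add_edge(c, d)
    result = []
    for scc in nx.strongly_connected_components(g):
        scc = list(scc)
        # Keep SCCs containing recurrence: more than one cell, or a self-loop.
        if len(scc) > 1 or g.has_edge(scc[0], scc[0]):
            result.append(scc)
    return result
```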
IEEE Transactions on Visualization and Computer Graphics, Oct 1, 2012
Design of time-varying vector fields, i.e., vector fields that can change over time, has a wide variety of important applications in computer graphics. Existing vector field design techniques do not address time-varying vector fields. In this paper, we present a framework for the design of time-varying vector fields for both planar domains and manifold surfaces. Our system supports the creation and modification of various time-varying vector fields with desired spatial and temporal characteristics through several design metaphors, including streamlines, pathlines, singularity paths, and bifurcations. These design metaphors are integrated into an element-based design to generate the time-varying vector fields via a sequence of basis field summations or spatially constrained optimizations at the sampled times. Key-frame design and field deformation are also introduced to support other user design scenarios. Accordingly, a spatio-temporal constrained optimization and a time-varying transformation are employed to generate the desired fields for these two design scenarios, respectively. We apply the time-varying vector fields generated using our design system to a number of important computer graphics applications that require controllable dynamic effects, such as evolving surface appearance, dynamic scene design, steerable crowd movement, and painterly animation. Many of these are difficult or impossible to achieve via prior simulation-based methods. In these applications, the time-varying vector fields are applied as either orientation fields or advection fields to control the instantaneous appearance or evolving trajectories of the dynamic effects.
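A hedged sketch of element-based design via basis field summation: each design element contributes a radially decaying vector, and the field at time t blends element parameters between key frames. The Gaussian falloff, the planar setting, and the one-to-one pairing of elements across key frames are illustrative assumptions, not the paper's exact model.

```python
# Minimal sketch: a time-varying planar field as a sum of key-framed
# radial basis elements.
import numpy as np

def basis_field(p, center, direction, sigma=0.5):
    w = np.exp(-np.sum((p - center) ** 2) / (2.0 * sigma ** 2))
    return w * direction

def design_field(p, t, keyframes):
    """keyframes: sorted list of (time, [(center, direction), ...])."""
    times = [k[0] for k in keyframes]
    i = max(0, np.searchsorted(times, t) - 1)
    j = min(i + 1, len(keyframes) - 1)
    a = 0.0 if times[j] == times[i] else (t - times[i]) / (times[j] - times[i])
    v = np.zeros(2)
    # Blend each element's center and direction between the two key frames.
    for (c0, d0), (c1, d1) in zip(keyframes[i][1], keyframes[j][1]):
        v += basis_field(np.asarray(p, float),
                         (1 - a) * np.asarray(c0) + a * np.asarray(c1),
                         (1 - a) * np.asarray(d0) + a * np.asarray(d1))
    return v
```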
Journal of Fluid Mechanics, Feb 2, 2016
The wall shear stress (WSS) vector field provides a signature for near-wall convective transport and can be scaled to obtain a first-order approximation of the near-wall fluid velocity. The near-wall flow field governs mass transfer problems in convection-dominated open flows with high Schmidt number, in which case a flux at the wall will lead to a thin concentration boundary layer. Such near-wall transport is of particular interest in cardiovascular flows, where hemodynamics can initiate and progress biological events at the vessel wall. In this study, we consider mass transfer processes in pulsatile blood flow of abdominal aortic aneurysms resulting from complex WSS patterns. Specifically, the Lagrangian surface transport of a species released at the vessel wall was advected in forward and backward time based on the near-wall velocity field. Exposure time and residence time measures were defined to quantify the accumulation of trajectories, as well as the time required to escape the near-wall domain. The effects of diffusion and normal velocity were investigated. The trajectories induced by the WSS vector field were observed to form attracting and repelling coherent structures that delineated species distribution inside the boundary layer, consistent with exposure and residence time measures. The results indicate that Lagrangian wall shear stress structures can provide a template for near-wall transport.
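The near-wall velocity approximation underlying this transport can be sketched directly: scale the WSS vector by wall distance over viscosity to get a first-order velocity, then integrate; with t1 < t0 the same loop yields backward-time trajectories. The `wss_at` interpolation function and the simple Euler step are assumed placeholders, not the study's solver.

```python
# Minimal sketch: advect a wall-bound tracer with u ~ (tau_w / mu) * dn.
import numpy as np

def advect_on_wall(p0, wss_at, mu, dn, t0, t1, n_steps=200):
    """wss_at(p, t) -> WSS vector at surface point p; dn: wall distance."""
    dt = (t1 - t0) / n_steps            # negative dt gives backward time
    p = np.asarray(p0, float)
    traj = [p.copy()]
    for k in range(n_steps):
        u = wss_at(p, t0 + k * dt) * (dn / mu)  # first-order near-wall velocity
        p = p + dt * u
        traj.append(p.copy())
    return np.array(traj)
```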
Bulletin of the American Physical Society, Nov 23, 2015
Wall shear stress manifolds and near wall flow topology in aneurysms
Biomechanics and Modeling in Mechanobiology, Nov 17, 2016
IEEE Transactions on Visualization and Computer Graphics, Apr 1, 2019
Analysis, visualization, and design of vector fields on surfaces have a wide variety of major applications in both scientific visualization and computer graphics. On the one hand, analysis and visualization of vector fields provide critical insights into the flow data produced by simulations or experiments of various engineering processes. On the other hand, many graphics applications require vector fields as input to drive certain graphical processes. This thesis addresses vector field analysis and design for both visualization and graphics applications. Topological analysis of vector fields provides qualitative (or structural) information about the underlying dynamics of the given vector data, which helps domain experts identify critical features and behaviors efficiently. In this thesis, I introduce a more complete vector field topology called the Entity Connection Graph (ECG).
Computer Graphics Forum, 2021
We introduce the curve complexity heuristic (CCH), a KD-tree construction strategy for 3D curves, which enables interactive exploration of neighborhoods in dense and large line datasets. It can be applied to searches of k-nearest curves (KNC) as well as radius-nearest curves (RNC). The CCH KD-tree construction consists of two steps: (i) 3D curve decomposition that takes into account curve complexity and (ii) KD-tree construction, which involves a novel splitting and early termination strategy. The obtained KD-tree allows us to improve the speed of existing neighborhood search approaches by at least an order of magnitude (i.e., 28× for KNC and 12× for RNC with 98% accuracy) by considering local curve complexity. We validate this performance with a quantitative evaluation of the quality of search results and computation time. We also demonstrate the usefulness of our approach for supporting various applications such as interactive line queries, line opacity optimization, and line abstraction.
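A hedged sketch of the two-step idea: split each polyline where its accumulated turning angle exceeds a budget (a stand-in for the paper's complexity criterion), index segment midpoints with a stock KD-tree, and answer k-nearest-curve queries at the segment level. The real CCH uses a custom splitting and early-termination strategy; scipy's cKDTree and the over-fetch factor below are used only for brevity.

```python
# Minimal sketch: complexity-driven curve decomposition + KD-tree KNC query.
import numpy as np
from scipy.spatial import cKDTree

def split_by_complexity(curve, max_turn=np.pi / 2):
    """Split a polyline whenever accumulated turning angle exceeds max_turn."""
    segs, start, turn = [], 0, 0.0
    d = np.diff(curve, axis=0)
    for i in range(1, len(d)):
        c = np.dot(d[i-1], d[i]) / (np.linalg.norm(d[i-1]) * np.linalg.norm(d[i]) + 1e-12)
        turn += np.arccos(np.clip(c, -1.0, 1.0))
        if turn > max_turn:
            segs.append(curve[start:i+1]); start, turn = i, 0.0
    segs.append(curve[start:])
    return segs

def build_index(curves):
    pts, owner = [], []
    for ci, c in enumerate(curves):
        for s in split_by_complexity(np.asarray(c, float)):
            pts.append(s.mean(axis=0)); owner.append(ci)
    return cKDTree(np.array(pts)), np.array(owner)

def k_nearest_curves(tree, owner, query, k):
    _, idx = tree.query(query, k=min(4 * k, len(owner)))  # over-fetch segments
    seen, result = set(), []
    for i in np.atleast_1d(idx):
        if owner[i] not in seen:
            seen.add(owner[i]); result.append(int(owner[i]))
        if len(result) == k:
            break
    return result
```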
ACM Transactions on Graphics, Jul 27, 2015
Figure 1: (a) An input hex-mesh [Li et al. 2012]: The image on the left shows its base-complex that partitions the hexahedral mesh into different large components, illustrated with different colors on the right. Due to the misalignments between singularities, many (typically small) components arise. For instance, a strip of small components near the sharp feature is highlighted. (b) Our alignment algorithm reduces the complexity of the base-complex but leads to a hex-mesh with a large distortion. (c) Both the singularity placement and the element quality of the resulting hex-mesh are improved by our structure-aware optimization algorithm.
Computer Graphics Forum, Apr 11, 2017
Hexahedral (or hex-) meshes are preferred in a number of scientific and engineering simulations and analyses due to their desirable numerical properties. Recent state-of-the-art techniques can generate high-quality hex-meshes. However, they typically produce hex-meshes with uniform element sizes and thus may fail to preserve small-scale features on the boundary surface. In this work, we present a new framework that enables users to generate hex-meshes with varying element sizes, so that small features are filled with smaller and denser elements while the transition from smaller elements to larger ones remains smooth, in contrast to octree-based approaches. This is achieved by first detecting regions of interest (ROIs) around small-scale features. These ROIs are then magnified using as-rigid-as-possible (ARAP) deformation with either an automatically determined or a user-specified scale factor. A hex-mesh is then generated from the deformed mesh using existing approaches that produce hex-meshes with uniform-sized elements. This initial hex-mesh is then mapped back to the original volume before magnification to adjust the element sizes in the ROIs. We have applied this framework to a variety of man-made and natural models to demonstrate its effectiveness.
IEEE Transactions on Visualization and Computer Graphics, Jul 1, 2016
In this paper, we introduce a volumetric partitioning strategy based on a generalized sweeping framework to seamlessly partition the volume of an input triangle mesh into a collection of deformed cuboids. This is achieved by a user-designed volumetric harmonic function that guides the decomposition of the input volume into a sequence of 2-manifold level sets. A skeletal structure whose corners correspond to corner vertices of a 2D parameterization is extracted for each level set. Corners are placed so that the skeletal structure aligns with features of the input object. Then, a skeletal surface is constructed by matching the skeletal structures of adjacent level sets. The surface sheets of this skeletal surface partition the input volume into the deformed cuboids. The collection of cuboids does not exhibit T-junctions, significantly simplifying the hexahedral mesh generation process, and in particular, it simplifies fitting trivariate B-splines to the deformed cuboids. Intersections of the surface sheets of the skeletal surface correspond to the singular edges of the generated hex-meshes. We apply our technique to a variety of 3D objects and demonstrate the benefit of the structure decomposition in data fitting.
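The guiding scalar field can be sketched as a discrete harmonic function: solve the graph Laplacian over the mesh vertices with Dirichlet constraints (0 on one boundary set, 1 on the other), then take level sets of the solution as the sweeping slices. The uniform combinatorial Laplacian weights below are an assumption for brevity; the paper's function is user-designed.

```python
# Minimal sketch: discrete harmonic function on mesh vertices via a
# constrained sparse Laplacian solve.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def harmonic_field(n_verts, edges, src, snk):
    """edges: iterable of (i, j) vertex pairs; src/snk: constrained vertices."""
    rows, cols, vals = [], [], []
    for i, j in edges:
        rows += [i, j, i, j]; cols += [j, i, i, j]
        vals += [-1.0, -1.0, 1.0, 1.0]          # off-diagonals and degrees
    L = sp.csr_matrix((vals, (rows, cols)), shape=(n_verts, n_verts))
    fixed = np.array(sorted(set(src) | set(snk)))
    free = np.setdiff1d(np.arange(n_verts), fixed)
    u = np.zeros(n_verts)
    u[list(snk)] = 1.0                          # Dirichlet values: src=0, snk=1
    # Solve L_ff u_f = -L_fc u_c for the interior values.
    u[free] = spla.spsolve(L[free][:, free], -L[free][:, fixed] @ u[fixed])
    return u
```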
ACM Transactions on Graphics, Nov 20, 2017
Computer Graphics Forum, 2017
Hexahedral (hex-) meshes are important for solving partial differential equations (PDEs) in scientific computing and mechanical engineering applications. Many methods have been proposed that aim to generate hex-meshes with high scaled Jacobians. While it is well established that a hex-mesh should be inversion-free (i.e., have a positive Jacobian at every corner of every hexahedron), it has not been well studied whether the scaled Jacobian is the most effective indicator of the quality of simulations performed on inversion-free hex-meshes, given the dozens of existing quality metrics for hex-meshes. Due to the challenge of precisely defining the relations among metrics, studying the correlations among different quality metrics, and their correlations with the stability and accuracy of the simulations, is a first and effective approach to addressing this question. In this work, we propose a correlation analysis framework to systematically study these correlations. Specifically, given a large hex-mesh dataset, we classify the existing quality metrics into groups based on their correlations, which characterizes their similarity in measuring the quality of hex-elements. In addition, we rank the individual metrics based on their correlations with the accuracy and stability metrics for simulations that solve a number of elliptic PDE problems. Our preliminary experiments suggest that metrics that assess the conditioning of the elements are more correlated with the quality of solving elliptic PDEs than the others. Furthermore, an inversion-free hex-mesh with higher average quality (measured by any quality metric) usually leads to a more accurate and stable computation of elliptic PDEs. To support our correlation study and address the lack of a publicly available large hex-mesh dataset with sufficiently varying quality metric values, we also propose a two-level perturbation strategy to generate the desired dataset from a small number of meshes, excluding the influences of element numbers, vertex connectivity, and volume sizes on our study.
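Two ingredients of such a study can be illustrated directly: the scaled Jacobian at a hex corner (the determinant of the three unit edge vectors leaving the corner) and a rank correlation between two metric columns over a dataset. This is a minimal sketch under those standard definitions, not the paper's framework.

```python
# Minimal sketch: per-corner scaled Jacobian and Spearman metric correlation.
import numpy as np
from scipy.stats import spearmanr

def scaled_jacobian_at_corner(corner, e0, e1, e2):
    """corner, e0..e2: 3D points (the corner and its three edge neighbors)."""
    E = np.column_stack([np.asarray(e, float) - np.asarray(corner, float)
                         for e in (e0, e1, e2)])
    norms = np.linalg.norm(E, axis=0)
    return float(np.linalg.det(E / norms))   # in [-1, 1]; <= 0 means inverted

def metric_correlation(metric_a, metric_b):
    rho, p = spearmanr(metric_a, metric_b)   # rank correlation across elements
    return rho, p

# A perfect unit-cube corner gives scaled Jacobian 1.0:
print(scaled_jacobian_at_corner([0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]))
```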
Combinatorial Vector Field Topology in Three Dimensions
Springer eBooks, Nov 14, 2011
Asymmetric tensor visualization with glyph and hyperstreamline placement on 2D manifolds
Asymmetric tensor fields present new challenges for visualization techniques such as hyperstreamline placement and glyph packing. This is because the physical behaviors of the tensors are fundamentally different inside real domains, where eigenvalues are real, and complex domains, where eigenvalues are complex. We present a hybrid visualization approach in which hyperstreamlines are used to illustrate the tensors in the real domains while glyphs are employed for the complex domains. This enables an effective visualization of the flow patterns everywhere and also provides a more intuitive illustration of elliptical flow patterns in the complex domains. The choice of representation for the different types of domains is motivated by the physical interpretation of asymmetric tensors in the context of fluid mechanics, i.e., when the tensor field is the velocity gradient tensor. In addition, we encode the tensor magnitude in the size of the glyphs and the density of hyperstreamlines. We demonstrate the effectiveness of our visualization techniques with real-world engine simulation data.
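For a 2×2 tensor, the domain classification that drives the rendering choice reduces to the sign of the discriminant of its characteristic polynomial, as in this minimal sketch (the string labels are illustrative):

```python
# Minimal sketch: classify a 2x2 asymmetric tensor's domain by eigenvalue type.
import numpy as np

def domain_type(T):
    tr, det = np.trace(T), np.linalg.det(T)
    disc = tr * tr - 4.0 * det          # discriminant of x^2 - tr*x + det
    return "real (hyperstreamlines)" if disc >= 0.0 else "complex (glyphs)"

print(domain_type(np.array([[2.0, 1.0], [0.0, 1.0]])))   # real eigenvalues
print(domain_type(np.array([[0.0, -1.0], [1.0, 0.0]])))  # complex (rotation)
```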
Visualization of flow on boundary surfaces from computational fluid dynamics (CFD) is challenging due to the complex, adaptive-resolution nature of the meshes used in the modeling and simulation process. Part one of this paper presents a fast and simple glyph placement algorithm for investigating and visualizing flow data based on unstructured, adaptive-resolution boundary meshes from CFD. The algorithm has several advantages: (1) glyphs are automatically placed at evenly-spaced intervals; (2) the user can interactively control the spatial resolution of the glyph placement and their precise location; and (3) the algorithm is fast and supports multi-resolution visualization of the flow at surfaces. The implementation supports multiple representations of the flow, some optimized for speed, others for accuracy. Furthermore, the approach doesn't rely on any pre-processing of the data or parameterization of the surface, and it handles large meshes efficiently. The result is a tool that provides engineers with a fast and intuitive overview of their CFD simulation results. In part two, we introduce an automatic streamline seeding algorithm for vector fields defined on surfaces in 3D space. The algorithm generates evenly-spaced streamlines quickly, simply, and efficiently for any general surface-based vector field. It is general because it handles large, complex, unstructured, adaptive-resolution grids with holes and discontinuities, does not require a parameterization, and can generate both sparse and dense representations of the flow. It is efficient because streamlines are only integrated for visible portions of the surface. It is simple because the image-based approach removes the need to perform streamline tracing on a triangular mesh, a process which is complicated at best. And it is fast because it makes effective, balanced use of both the CPU and the GPU. The key to the algorithm's speed, simplicity, and efficiency is its image-based seeding strategy. We demonstrate our algorithm on complex, real-world simulation datasets from computational fluid dynamics and compare it with object-space streamline visualizations.
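For a flavor of the spacing logic behind evenly-spaced seeding, here is a minimal 2D CPU sketch in the spirit of Jobard and Lefer: an occupancy grid rejects streamlines that come within d_sep of existing ones, and accepted lines spawn side seeds at distance d_sep. The paper's contribution is an image-based, GPU-assisted variant on surfaces; this grid version only illustrates the idea, and the Euler integrator and cell size are assumptions.

```python
# Minimal sketch: evenly-spaced 2D streamline seeding with an occupancy grid.
import numpy as np

def trace(seed, v, occ, cell, n_max=500, h=0.01):
    line, p = [np.asarray(seed, float)], np.asarray(seed, float)
    for _ in range(n_max):
        u = v(p)
        if np.linalg.norm(u) < 1e-9:
            break
        p = p + h * u / np.linalg.norm(u)          # unit-speed Euler step
        if tuple((p // cell).astype(int)) in occ:  # too close to other lines
            break
        line.append(p.copy())
    for q in line:                                  # mark this line's cells
        occ.add(tuple((q // cell).astype(int)))
    return np.array(line)

def evenly_spaced_streamlines(v, first_seed, d_sep):
    occ, queue, lines, cell = set(), [np.asarray(first_seed, float)], [], d_sep
    while queue:
        s = queue.pop(0)
        if tuple((s // cell).astype(int)) in occ:
            continue
        line = trace(s, v, occ, cell)
        if len(line) > 2:
            lines.append(line)
            for p, q in zip(line[:-1], line[1:]):   # side seeds at d_sep
                t = (q - p) / (np.linalg.norm(q - p) + 1e-12)
                n = np.array([-t[1], t[0]])
                queue += [p + d_sep * n, p - d_sep * n]
    return lines
```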
Electronic Imaging, 2021
Traffic signals are part of our critical infrastructure, and protecting their integrity is a serious concern. Security flaws in traffic signal systems have been documented, and effective detection of the exploitation of these flaws remains a challenge. In this paper, we present a visual analytics approach to look for anomalies in traffic signal data (i.e., abnormal traffic light patterns) that may indicate a compromise of the system. To our knowledge, this is the first time a visual analytics approach has been applied to the processing and exploration of traffic signal data. The system supports level-of-detail exploration with various visualization techniques. Data cleaning and a number of preprocessing techniques for the extraction of summary information (e.g., traffic signal cycles) are also performed before visualization and data exploration. Our system successfully reveals errors in the input data that would be difficult to capture with simple plots alone. In addition, our system captures some abnormal signal patterns that may indicate intrusions into the system. In summary, this work offers a new and effective way to study attacks or intrusions on traffic signal control systems via the visual analysis of traffic light signal patterns.
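As an example of the kind of summary extraction such a pipeline performs before visualization, the sketch below derives cycle durations from green-phase onset timestamps and flags outliers with a robust (median/MAD) z-score. The input format and threshold are illustrative assumptions, not the paper's preprocessing.

```python
# Minimal sketch: cycle extraction and robust outlier flagging for one signal.
import numpy as np

def cycle_anomalies(green_onsets, z_thresh=3.5):
    """green_onsets: sorted timestamps (seconds) of green-phase starts."""
    cycles = np.diff(np.asarray(green_onsets, float))  # cycle durations
    med = np.median(cycles)
    mad = np.median(np.abs(cycles - med)) + 1e-9
    z = 0.6745 * (cycles - med) / mad                  # robust z-score
    return [(i, float(cycles[i]), float(z[i]))
            for i in range(len(cycles)) if abs(z[i]) > z_thresh]

onsets = [0, 90, 180, 271, 360, 900, 990]  # one abnormally long cycle
print(cycle_anomalies(onsets))             # flags the 540 s cycle
```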