Kumaradevan Punithakumar | University of Alberta

Papers by Kumaradevan Punithakumar

Research paper thumbnail of Detection of Left Ventricular Motion Abnormality Via Information Measures and Bayesian Filtering

IEEE Transactions on Information Technology in Biomedicine, 2000

We present an original information theoretic measure of heart motion based on Shannon's differential entropy (SDE), which allows heart wall motion abnormality detection. Based on functional images, which are subject to noise and segmentation inaccuracies, heart wall motion analysis is acknowledged as a difficult problem, and as such, incorporation of prior knowledge is crucial for improving accuracy. Given incomplete, noisy data and a dynamic model, the Kalman filter, a well-known recursive Bayesian filter, is devised in this study to estimate the left ventricular (LV) cavity points. However, due to the similarity between the statistical information of normal and abnormal heart motions, detecting and classifying abnormality is a challenging problem, which we investigate with a global measure based on the SDE. We further derive two other possible information theoretic abnormality detection criteria, one based on Rényi entropy and the other on Fisher information. The proposed methods analyze wall motion quantitatively by constructing distributions of the normalized radial distance estimates of the LV cavity. Using 269 × 20 segmented LV cavities of short-axis MRI obtained from 30 subjects, the experimental analysis demonstrates that the proposed SDE criterion can lead to a significant improvement over other features that are prevalent in the literature related to the LV cavity, namely, mean radial displacement and mean radial velocity.
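As a minimal sketch of the entropy criterion described above, assuming a simple histogram-based density estimate, the Shannon differential entropy of a sample distribution can be computed as follows. The radial-distance samples here are synthetic placeholders, not the paper's MRI data:

```python
import numpy as np

def shannon_differential_entropy(samples, bins=32):
    """Histogram estimate of h(X) = -integral p(x) log p(x) dx."""
    p, edges = np.histogram(samples, bins=bins, density=True)
    dx = np.diff(edges)
    nz = p > 0  # skip empty bins to avoid log(0)
    return float(-np.sum(p[nz] * np.log(p[nz]) * dx[nz]))

rng = np.random.default_rng(0)
# Synthetic radial-distance proxies: a normally contracting wall spreads its
# normalized radial distances over the cycle; a hypokinetic wall does not.
h_normal = shannon_differential_entropy(rng.normal(1.0, 0.20, 5000))
h_abnormal = shannon_differential_entropy(rng.normal(1.0, 0.05, 5000))
```

A wider spread of radial distances yields a higher entropy, so h_normal exceeds h_abnormal; that gap is the discriminative signal an SDE-style criterion exploits.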

Research paper thumbnail of Left ventricle segmentation via graph cut distribution matching

Medical Image Computing and Computer-Assisted Intervention (MICCAI), 2009

We present a discrete kernel density matching energy for segmenting the left ventricle cavity in cardiac magnetic resonance sequences. The energy and its graph cut optimization, based on an original first-order approximation of the Bhattacharyya measure, have not been proposed previously, and yield competitive results in nearly real time. The algorithm seeks a region within each frame by optimizing two priors, one geometric (distance-based) and the other photometric, each measuring a distribution similarity between the region and a model learned from the first frame. Based on global rather than pixelwise information, the proposed algorithm does not require complex training and optimization with respect to geometric transformations. Unlike related active contour methods, it does not compute iterative updates of computationally expensive kernel densities. Furthermore, the proposed first-order analysis can be used for other intractable energies and, therefore, can lead to segmentati...
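A quick sketch of the similarity measure underlying this matching energy, the Bhattacharyya coefficient between two normalized distributions (the toy histograms are purely illustrative):

```python
import numpy as np

def bhattacharyya(p, q):
    """B(p, q) = sum_i sqrt(p_i * q_i): 1 for identical distributions, 0 for disjoint support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p /= p.sum()  # normalize to probability distributions
    q /= q.sum()
    return float(np.sum(np.sqrt(p * q)))

b_identical = bhattacharyya([1, 2, 3], [2, 4, 6])  # same shape after normalization
b_disjoint = bhattacharyya([1, 0, 0], [0, 0, 1])   # no overlapping support
```

Segmentation energies of this family seek the region whose feature histogram drives this coefficient toward 1 against the model learned on the first frame.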

Research paper thumbnail of Left ventricle segmentation in MRI via convex relaxed distribution matching

Medical Image Analysis, 2013

A fundamental step in the diagnosis of cardiovascular diseases, automatic left ventricle (LV) segmentation in cardiac magnetic resonance images (MRIs) is still acknowledged to be a difficult problem. Most of the existing algorithms require either extensive training or intensive user inputs. This study investigates fast detection of the LV endo- and epicardium surfaces in cardiac MRI via convex relaxation and distribution matching. The algorithm requires a single subject for training and a very simple user input, which amounts to a single point (mouse click) per target region (cavity or myocardium). It seeks cavity and myocardium regions within each 3D phase by optimizing two functionals, each containing two distribution-matching constraints: (1) a distance-based shape prior and (2) an intensity prior. Based on a global measure of similarity between distributions, the shape prior is intrinsically invariant with respect to translation and rotation. We further introduce a scale variable from which we derive a fixed-point equation (FPE), thereby achieving scale invariance with only a few fast computations. The proposed algorithm relaxes the need for costly pose estimation (or registration) procedures and large training sets, and can tolerate shape deformations, unlike template (or atlas) based priors. Our formulation leads to a challenging problem, which is not directly amenable to convex-optimization techniques. For each functional, we split the problem into a sequence of sub-problems, each of which can be solved exactly and globally via a convex relaxation and the augmented Lagrangian method. Unlike related graph-cut approaches, the proposed convex-relaxation solution can be parallelized to substantially reduce the computational time for 3D domains (or higher), extends directly to high dimensions, and does not have the grid-bias problem.
Our parallelized implementation on a graphics processing unit (GPU) demonstrates that the proposed algorithm requires about 3.87 s for a typical cardiac MRI volume, a speed-up of about five times compared to a standard implementation. We report a performance evaluation over 400 volumes acquired from 20 subjects, which shows that the obtained 3D surfaces correlate with independent manual delineations. We further demonstrate experimentally that (1) the performance of the algorithm is not significantly affected by the choice of the training subject and (2) the shape description we use does not change significantly from one subject to another. These results support the fact that a single subject is sufficient for training the proposed algorithm.

Research paper thumbnail of Regional Assessment of Cardiac Left Ventricular Myocardial Function via MRI Statistical Features

IEEE Transactions on Medical Imaging, 2000

Automating the detection and localization of segmental (regional) left ventricle (LV) abnormalities in magnetic resonance imaging (MRI) has recently sparked an impressive research effort, with promising performances and a breadth of techniques. However, despite such an effort, the problem is still acknowledged to be challenging, with much room for improvement in regard to accuracy. Furthermore, most of the existing techniques are labor intensive, requiring delineations of the endo- and/or epicardial boundaries in all frames of a cardiac sequence. The purpose of this study is to investigate a real-time machine-learning approach which uses some image features that can be easily computed, but that nevertheless correlate well with the segmental cardiac function. Starting from a minimum user input in only one frame in a subject dataset, we build for all the regional segments and all subsequent frames a set of statistical MRI features based on a measure of similarity between distributions. We demonstrate that, over a cardiac cycle, the statistical features are related to the proportion of blood within each segment. Therefore, they can characterize segmental contraction without the need for delineating the LV boundaries in all the frames. We first seek the optimal direction along which the proposed image features are most descriptive via a linear discriminant analysis. Then, using the results as inputs to a linear support vector machine classifier, we obtain an abnormality assessment of each of the standard cardiac segments in real time. We report a comprehensive experimental evaluation of the proposed algorithm over 928 cardiac segments obtained from 58 subjects. Compared against ground-truth evaluations by experienced radiologists, the proposed algorithm performed competitively, with an overall classification accuracy of 86.09% and a kappa measure of 0.73.
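The project-then-classify pipeline described above can be sketched with Fisher's linear discriminant for the projection and a simple midpoint threshold standing in for the paper's linear SVM. All feature values below are synthetic placeholders, not the paper's MRI statistics:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two synthetic 2-D feature clusters standing in for normal/abnormal segments.
normal = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
abnormal = rng.normal([2.0, 1.0], 0.5, size=(100, 2))

def fisher_direction(x0, x1):
    """LDA direction w = Sw^{-1} (m1 - m0): the most class-discriminative projection."""
    m0, m1 = x0.mean(axis=0), x1.mean(axis=0)
    sw = (np.cov(x0, rowvar=False) * (len(x0) - 1)
          + np.cov(x1, rowvar=False) * (len(x1) - 1))  # within-class scatter
    return np.linalg.solve(sw, m1 - m0)

w = fisher_direction(normal, abnormal)
# Decision threshold halfway between the projected class means (SVM stand-in).
thresh = 0.5 * ((normal @ w).mean() + (abnormal @ w).mean())
accuracy = 0.5 * (((abnormal @ w) > thresh).mean() + ((normal @ w) <= thresh).mean())
```

On well-separated synthetic clusters the one-dimensional projection preserves nearly all the class information, which is the point of running LDA before the final classifier.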

Research paper thumbnail of Detecting left ventricular impaired relaxation using MR imaging

IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI), 2014

This study investigates automatic assessment of the left ventricular (LV) diastolic function using cine cardiac magnetic resonance (MR) imaging. Most of the existing LV assessment algorithms have addressed the systolic function, which essentially pertains to the analysis of regional wall motion abnormalities or the estimation of the ejection fraction. However, several recent clinical studies suggested that evaluating the diastolic function is essential. The diastolic function plays an important role in assessing cardiovascular abnormalities, particularly in the case of heart failure with preserved ejection fraction. The assessment of LV relaxation and stiffness abnormalities can be achieved with cardiac MR imaging. Unlike transthoracic echocardiography, MR is not limited by an acoustic window, and allows exhaustive myocardial imaging with excellent spatial resolution. We propose an algorithm that evaluates LV relaxation from short-axis cine MR images. The method is based on three main steps: (1) non-rigid registration, which yields a sequence of points over time, given a user-provided contour on the first frame; (2) computation of the LV filling rate and volume over the cardiac cycle; and (3) automatic detection of the maxima of the E and A waves. We report comprehensive experimental evaluations over MR data sets acquired from 53 subjects, including comparisons with independent reports for the same subjects from echocardiography. The proposed algorithm yielded a kappa measure of 0.66, a substantial agreement with the echocardiography results.
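Steps (2) and (3) above can be sketched on a synthetic volume curve: differentiate the volume to get the filling rate, then pick its two local maxima as the E and A waves. The two-bump diastolic model below is illustrative, not patient data:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 400)  # one cardiac cycle, normalized time
# Synthetic diastolic filling rate: a large early (E) and a smaller atrial (A) bump.
true_rate = (300 * np.exp(-((t - 0.50) / 0.05) ** 2)
             + 150 * np.exp(-((t - 0.85) / 0.05) ** 2))
volume = 60.0 + np.cumsum(true_rate) * (t[1] - t[0])  # integrate to an LV volume curve

rate = np.gradient(volume, t)  # step (2): filling rate dV/dt
# Step (3): local maxima of the filling rate above a small threshold.
peaks = [i for i in range(1, len(rate) - 1)
         if rate[i - 1] < rate[i] > rate[i + 1] and rate[i] > 50]
e_idx, a_idx = peaks[0], peaks[1]
e_over_a = rate[e_idx] / rate[a_idx]  # E/A ratio, a standard diastolic index
```

An E/A ratio near 1 or below is one of the classical echocardiographic markers of impaired relaxation, which is why recovering these two maxima from the MR volume curve is the core of the method.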

Research paper thumbnail of Assessment of regional myocardial function via statistical features in MR images

Medical Image Computing and Computer-Assisted Intervention (MICCAI), 2011

Early and accurate detection of left ventricle (LV) regional wall motion abnormalities significantly helps in the diagnosis and follow-up of cardiovascular diseases. We present a regional myocardial abnormality detection framework based on image statistics. The proposed framework requires minimal user interaction, only to specify an initial delineation and anatomical landmarks on the first frame. Then, approximations of the regional myocardial segments in subsequent frames are systematically obtained by superimposing the initial delineation on the rest of the frames. The proposed method exploits the Bhattacharyya coefficient to measure the similarity between the image distribution within each segment approximation and the distribution of the corresponding user-provided segment. Linear Discriminant Analysis (LDA) is applied to find the optimal direction along which the projected features are the most descriptive. Then a linear Support Vector Machine (SVM) classifier is employed for each o...

Research paper thumbnail of A convex max-flow segmentation of LV using subject-specific distributions on cardiac MRI

Information Processing in Medical Imaging (IPMI), 2011

This work studies the convex relaxation approach to left ventricle (LV) segmentation, which gives rise to a challenging multi-region separation with a geometrical constraint. For each region, we consider a global Bhattacharyya metric prior to evaluate gray-scale and radial distance distribution matching. In this regard, the studied problem amounts to finding three regions that most closely match their respective input distribution models. It was previously addressed by curve evolution, which leads to sub-optimal and computationally intensive algorithms, or by graph cuts, which result in heavy metrication errors (grid bias). The proposed convex relaxation approach solves the LV segmentation through a sequence of convex sub-problems. Each sub-problem leads to a novel bound of the Bhattacharyya measure and yields a convex formulation, which paves the way to an efficient and reliable solver. In this respect, we propose a novel flow configuration that accounts for la...

Research paper thumbnail of Regional heart motion abnormality detection via multiview fusion

Medical Image Computing and Computer-Assisted Intervention (MICCAI), 2012

This study investigates regional heart motion abnormality detection via multiview fusion in cine cardiac MR images. In contrast to previous methods which rely only on short-axis image sequences, the proposed approach exploits the information from several other long-axis image sequences, namely, 2-chamber, 3-chamber and 4-chamber MR images. Our analysis follows the standard issued by the American Heart Association to identify 17 standardized left ventricular segments. The proposed method first computes an initial sequence of corresponding myocardial points using a nonrigid image registration algorithm within each sequence. Then, these points are mapped to 3D space and tracked using an unscented Kalman smoother (UKS). We propose a maximum-likelihood-based track-to-track fusion approach to combine UKS tracks from multiple image views. Finally, we use the Shannon differential entropy of distributions of potential classifiers obtained from multiview fusion estimates, and a naive Bayes classi...
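The track-to-track fusion step can be sketched for its simplest case: maximum-likelihood fusion of two independent Gaussian track estimates in information form. The numbers below are illustrative; the paper fuses smoothed tracks from multiple MR views:

```python
import numpy as np

def fuse_tracks(x1, P1, x2, P2):
    """ML fusion of two independent Gaussian estimates, combined in information form."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(I1 + I2)        # fused covariance: never larger than either input
    x = P @ (I1 @ x1 + I2 @ x2)       # precision-weighted mean
    return x, P

# View 1 is confident in coordinate 0, view 2 in coordinate 1.
x1, P1 = np.array([1.0, 0.0]), np.diag([1.0, 4.0])
x2, P2 = np.array([2.0, 1.0]), np.diag([4.0, 1.0])
xf, Pf = fuse_tracks(x1, P1, x2, P2)
```

Each fused coordinate is pulled toward the view that measures it more confidently, and the fused variance (0.8 here) is smaller than either input's, which is the payoff of combining views.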

Research paper thumbnail of Heart motion abnormality detection via an information measure and Bayesian filtering

Medical Image Computing and Computer-Assisted Intervention (MICCAI), 2009

This study investigates heart wall motion abnormality detection with an information theoretic measure of heart motion based on Shannon's differential entropy (SDE) and recursive Bayesian filtering. Heart wall motion is generally analyzed using functional images, which are subject to noise and segmentation inaccuracies, and incorporation of prior knowledge is crucial in improving the accuracy. The Kalman filter, a well-known recursive Bayesian filter, is used in this study to estimate the left ventricular (LV) cavity points given incomplete and noisy data, and given a dynamic model. However, due to similarities between the statistical information of normal and abnormal heart motions, detecting and classifying abnormality is a challenging problem, which we propose to investigate with a global measure based on the SDE. We further derive two other possible information theoretic abnormality detection criteria, one based on Rényi entropy and the other on Fisher information. The ...
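A minimal 1-D sketch of the Kalman filtering step: a constant-velocity model tracking a noisy point trajectory. All matrices and noise levels below are assumed for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity dynamic model
H = np.array([[1.0, 0.0]])             # only position is measured
Q = 0.01 * np.eye(2)                   # process noise covariance
R = np.array([[0.5]])                  # measurement noise covariance

true_pos = 0.3 * np.arange(1, 51)      # target drifting at 0.3 per step
meas = true_pos + rng.normal(0.0, np.sqrt(0.5), size=50)

x, P = np.zeros(2), np.eye(2)
est = []
for z in meas:
    x, P = F @ x, F @ P @ F.T + Q                   # predict with the dynamic model
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)             # update with the measurement
    P = (np.eye(2) - K @ H) @ P
    est.append(x[0])

rmse_raw = np.sqrt(np.mean((meas - true_pos) ** 2))
rmse_kf = np.sqrt(np.mean((np.array(est) - true_pos) ** 2))
```

When the dynamic model matches the motion, the filtered trajectory is markedly less noisy than the raw measurements, which is why the papers above feed filtered (rather than raw) cavity points into the entropy-based classifiers.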

Research paper thumbnail of Accuracy of magnetic resonance imaging–cone beam computed tomography rigid registration of the head: an in-vitro study

Oral Surgery, Oral Medicine, Oral Pathology and Oral Radiology, 2015

To evaluate the performance of a cross-modality image registration procedure between magnetic resonance imaging (MRI) and cone beam computed tomography (CBCT). In vitro diagnostic MRI and CBCT images of 5 cadaver swine heads were obtained prospectively. Five radiopaque fiducial markers were attached to each cadaver skull by using resin screws. Automatic MRI-CBCT rigid registrations were performed. The specimens were then scanned using a 3-dimensional (3-D) laser scanner. The 3-D coordinate points for the centroids of the attached fiducial markers from the laser scan were identified and considered ground truth. The distances between marker centroids were measured with MRI, CBCT, and MRI-CBCT. Accuracy was calculated by using repeated measures analysis of variance and mean difference values. The registration method was repeated 10 times for each specimen in MRI to measure the average error. There was no significant difference (P > .05) in the mean distances of the markers between all images and the ground truth. The mean differences in distances between MRI, CBCT, and MRI-CBCT and the ground truth were 0.2 ± 1.1 mm, 0.3 ± 1.0 mm, and 0.2 ± 1.2 mm, respectively. The detected method error ranged between 0.06 mm and 0.1 mm. The cross-modality image registration algorithm is accurate for head MRI-CBCT registration.
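The rigid-registration step can be sketched with the standard least-squares (Kabsch/SVD) alignment of two fiducial point sets. The marker coordinates below are synthetic stand-ins for the centroids measured on MRI and CBCT, and the residual is the fiducial registration error:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~ src @ R.T + t (Kabsch/SVD)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    Hm = (src - cs).T @ (dst - cd)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(Hm)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

rng = np.random.default_rng(3)
markers = rng.uniform(0.0, 100.0, size=(5, 3))     # five fiducial centroids (mm)
c, s = np.cos(0.3), np.sin(0.3)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
moved = markers @ R_true.T + np.array([10.0, -5.0, 2.0])

R, t = rigid_register(markers, moved)
# Fiducial registration error: RMS residual after applying the recovered transform.
fre = np.sqrt(np.mean(np.sum((markers @ R.T + t - moved) ** 2, axis=1)))
```

With noiseless synthetic markers the recovered transform is exact; on real MRI/CBCT data the residual reflects marker localization error, as in the sub-millimeter figures reported above.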

Research paper thumbnail of Distribution Matching with the Bhattacharyya Similarity: A Bound Optimization Framework

We present efficient graph cut algorithms for three problems: (1) finding a region in an image such that the histogram (or distribution) of an image feature within the region most closely matches a given model; (2) cosegmentation of image pairs; and (3) interactive image segmentation with a user-provided bounding box. Each algorithm seeks the optimum of a global cost function based on the Bhattacharyya measure, a convenient alternative to other matching measures such as the Kullback-Leibler divergence. Our functionals are not directly amenable to graph cut optimization as they contain non-linear functions of fractional terms, which make the ensuing optimization problems challenging. We first derive a family of parametric bounds of the Bhattacharyya measure by introducing an auxiliary labeling. Then, we show that these bounds are auxiliary functions of the Bhattacharyya measure, a result which allows us to solve each problem efficiently via graph cuts. We show that the proposed optimization procedures converge within very few graph cut iterations. Comprehensive experiments, including quantitative and comparative evaluations over two databases, demonstrate the advantages of the proposed algorithms over related works in regard to optimality, computational load, accuracy and flexibility.

Research paper thumbnail of Nonlinear filtering algorithms for multitarget tracking

Tracking multiple targets with uncertain target dynamics is a difficult problem, especially with nonlinear state and/or measurement equations. Random finite set theory provides a rigorous foundation to multitarget tracking problems. It provides a framework to represent the full multitarget posterior in contrast to other conventional approaches. However, the computational complexity of performing multitarget recursion grows exponentially with the number of...

Research paper thumbnail of A distributed implementation of a sequential Monte Carlo probability hypothesis density filter for sensor networks

Signal Processing, Sensor Fusion, and Target Recognition XV, 2006

This paper presents a Sequential Monte Carlo (SMC) Probability Hypothesis Density (PHD) algorithm for decentralized state estimation from multiple platforms. The proposed algorithm addresses the problem of communicating and fusing track information from a team of multiple sensing platforms detecting and tracking multiple targets in the surveillance region. Each sensing platform makes multiple, noisy measurements of an underlying, time-varying state...

Research paper thumbnail of Adaptive Sensor Management for Target Tracking with Sensor Deployment and Motion Uncertainties

Research paper thumbnail of Improved multi-target tracking using probability hypothesis density smoothing

Signal and Data Processing of Small Targets 2007, 2007

Optimal Bayesian multi-target tracking is computationally demanding. The probability hypothesis density (PHD) filter, which is a first-moment approximation of the optimal one, is a computationally tractable alternative. By evaluating the PHD, one can extract the number of targets as well as their individual states. The recent sequential Monte Carlo (SMC) implementation of the PHD filter paves the way to applying the PHD filter to nonlinear non-Gaussian problems. However, the particle implementation of the PHD filter appears more dependent on current measurements, especially in the case of low observable target problems (i.e., estimates are sensitive to missed detections and false alarms). In this paper, a PHD smoothing algorithm is proposed to improve the capability of the PHD-based tracking system. By performing smoothing, which gives delayed estimates, we obtain not only better estimates for the target states but also a better estimate of the number of targets. Simulations are performed on the proposed method with a multi-target scenario. Simulation results confirm the improved performance of the proposed algorithm.
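The key PHD property used above, that integrating the intensity gives the expected number of targets, can be sketched with a toy particle approximation. The two well-separated 1-D targets and the split point are illustrative choices, not a full SMC-PHD recursion:

```python
import numpy as np

rng = np.random.default_rng(4)
# Toy PHD (intensity) over a 1-D state space: one Gaussian mode per target.
particles = np.concatenate([rng.normal(-3.0, 1.0, 500),
                            rng.normal(4.0, 0.5, 500)])
weights = np.full(particles.size, 2.0 / particles.size)  # total mass = 2 targets

n_hat = int(round(weights.sum()))  # expected target count = integral of the PHD
# Crude state extraction: split the particles at the valley between the modes.
states = sorted([particles[particles < 0.5].mean(),
                 particles[particles >= 0.5].mean()])
```

In a real SMC-PHD filter the weights come from the prediction/update recursion and the modes are found by clustering, but the count-by-integration property is exactly this.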

Research paper thumbnail of Spline filter for nonlinear/non-Gaussian Bayesian tracking

Signal and Data Processing of Small Targets 2007, 2007

This paper presents a method for the realization of nonlinear/non-Gaussian Bayesian filtering based on spline interpolation. Sequential Monte Carlo (SMC) approaches are widely used in nonlinear/non-Gaussian Bayesian filtering, in which the densities are approximated by taking a discrete set of points in the state space. In contrast to the SMC methods, the proposed approach uses spline polynomial interpolation to approximate the...
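The idea can be sketched with a numpy-only natural cubic spline: approximate a continuous density from a coarse grid of values, then evaluate it anywhere. The Gaussian target density is illustrative, and this is generic spline interpolation, not the paper's specific filter recursion:

```python
import numpy as np

def natural_cubic_spline(xk, yk, x):
    """Evaluate the natural cubic spline through knots (xk, yk) at points x."""
    n, h = len(xk), np.diff(xk)
    A, b = np.zeros((n, n)), np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0                 # natural BCs: S'' = 0 at the ends
    for i in range(1, n - 1):                 # tridiagonal system for knot curvatures
        A[i, i - 1 : i + 2] = h[i - 1], 2 * (h[i - 1] + h[i]), h[i]
        b[i] = 6 * ((yk[i + 1] - yk[i]) / h[i] - (yk[i] - yk[i - 1]) / h[i - 1])
    m = np.linalg.solve(A, b)                 # second derivatives at the knots
    i = np.clip(np.searchsorted(xk, x) - 1, 0, n - 2)
    d = x - xk[i]
    return (yk[i] + d * ((yk[i + 1] - yk[i]) / h[i] - h[i] * (2 * m[i] + m[i + 1]) / 6)
            + d ** 2 * m[i] / 2 + d ** 3 * (m[i + 1] - m[i]) / (6 * h[i]))

pdf = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)
xk = np.linspace(-4.0, 4.0, 17)               # coarse grid of density values
dense = np.linspace(-4.0, 4.0, 401)
max_err = np.max(np.abs(natural_cubic_spline(xk, pdf(xk), dense) - pdf(dense)))
```

Unlike a particle set, the spline gives a smooth, differentiable density everywhere in the state space, at the cost of solving a small tridiagonal system per update.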

Research paper thumbnail of Regional Heart Motion Abnormality Detection via Information Measures and Unscented Kalman Filtering

Lecture Notes in Computer Science, 2010

This study investigates regional heart motion abnormality detection using various classifier features with Shannon's differential entropy (SDE). Rather than relying on elementary measurements or a fixed set of moments, the SDE measures global distribution information and, as such, has more discriminative power in classifying distributions. Based on functional images, which are subject to noise and segmentation inaccuracies, heart wall motion analysis is acknowledged as a difficult problem and, therefore, incorporation of prior knowledge is desirable to enhance the accuracy. Given noisy data and a nonlinear dynamic model to describe the myocardial motion, an unscented Kalman filter, a recursive nonlinear Bayesian filter, is devised in this study to estimate the LV cavity points. Subsequently, a naive Bayes classifier is constructed from the SDEs of different features in order to automatically detect abnormal functional regions of the myocardium. Using 90×20 segmented LV cavities of short-axis magnetic resonance images obtained from 30 subjects, the experimental analysis carried out over 480 myocardial segments demonstrates that the proposed method performs significantly better than other recent methods, and can lead to a promising diagnostic support tool to assist clinicians.
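The final classification step can be sketched with a one-feature Gaussian naive Bayes classifier. The "SDE feature" values are synthetic, and the assumption that abnormal segments yield lower entropy is made here only for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic per-segment entropy features (hypokinetic segments assumed lower).
feats = {"normal": rng.normal(2.0, 0.3, 200), "abnormal": rng.normal(1.0, 0.3, 200)}
params = {c: (f.mean(), f.std()) for c, f in feats.items()}  # fit class Gaussians

def log_gauss(x, mu, sd):
    return -0.5 * np.log(2 * np.pi * sd ** 2) - (x - mu) ** 2 / (2 * sd ** 2)

def classify(x):
    """Naive Bayes with equal priors: pick the class maximizing the likelihood."""
    return max(params, key=lambda c: log_gauss(x, *params[c]))

acc = np.mean([classify(x) == "normal" for x in feats["normal"]])
```

With several features, naive Bayes simply sums the per-feature log-likelihoods under the conditional-independence assumption, which keeps training cheap even with few labeled segments.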

Research paper thumbnail of Integrated bias removal in passive radar systems

Signal and Data Processing of Small Targets 2008, 2008

A passive coherent location (PCL) system exploits the ambient FM radio or television signals from powerful local transmitters, which makes it ideal for covert tracking. In a passive radar system, also known as a PCL system, a variety of measurements can be used to estimate target states, such as direction of arrival (DOA), time difference of arrival (TDOA) or Doppler shift.

Research paper thumbnail of A sequential Monte Carlo probability hypothesis density algorithm for multitarget track-before-detect

Signal and Data Processing of Small Targets 2005, 2005

In this paper, we present a recursive track-before-detect (TBD) algorithm based on the Probability Hypothesis Density (PHD) filter for multitarget tracking. TBD algorithms are better suited than standard target tracking methods for tracking dim targets in heavy clutter and noise. Classical target tracking, where the measurements are pre-processed at each time step before being passed to the tracking filter, results in information loss, which is very damaging if the target signal-to-noise ratio is low. However, in TBD the tracking filter operates directly on the raw measurements at the expense of added computational burden. The development of a recursive TBD algorithm reduces the computational burden over conventional TBD methods such as the Hough transform and dynamic programming. TBD is a hard nonlinear non-Gaussian problem even for single-target scenarios. Recent advances in Sequential Monte Carlo (SMC) based nonlinear filtering make multitarget TBD feasible. However, the current implementations use a modeling setup to accommodate the varying number of targets, where a multiple-model SMC-based TBD approach is used to solve the problem conditioned on the model, i.e., the number of targets. The PHD filter, which propagates only the first-order statistical moment (or the PHD) of the full target posterior, has been shown to be a computationally efficient solution to multitarget tracking problems with a varying number of targets. We propose a PHD filter based TBD so that no assumption needs to be made on the number of targets. Simulation results are presented to show the effectiveness of the proposed filter in tracking multiple weak targets.

Research paper thumbnail of Spline filter for multidimensional nonlinear/non-Gaussian Bayesian tracking

Signal and Data Processing of Small Targets 2008, 2008

This paper presents a novel continuous approximation approach to nonlinear/non-Gaussian Bayesian tracking. A good representation of the probability density and likelihood functions is essential for the effectiveness of nonlinear filtering algorithms since these functions ...

Research paper thumbnail of Detection of Left Ventricular Motion Abnormality Via Information Measures and Bayesian Filtering

IEEE Transactions on Information Technology in Biomedicine, 2000

We present an original information theoretic measure of heart motion based on the Shannon's diffe... more We present an original information theoretic measure of heart motion based on the Shannon's differential entropy (SDE), which allows heart wall motion abnormality detection. Based on functional images, which are subject to noise and segmentation inaccuracies, heart wall motion analysis is acknowledged as a difficult problem, and as such, incorporation of prior knowledge is crucial for improving accuracy. Given incomplete, noisy data and a dynamic model, the Kalman filter, a well-known recursive Bayesian filter, is devised in this study to the estimation of the left ventricular (LV) cavity points. However, due to similarity between the statistical information of normal and abnormal heart motions, detecting and classifying abnormality is a challenging problem, which we investigate with a global measure based on the SDE. We further derive two other possible information theoretic abnormality detection criteria, one is based on Rényi entropy and the other on Fisher information. The proposed methods analyze wall motion quantitatively by constructing distributions of the normalized radial distance estimates of the LV cavity. Using 269 × 20 segmented LV cavities of short-axis MRI obtained from 30 subjects, the experimental analysis demonstrates that the proposed SDE criterion can lead to a significant improvement over other features that are prevalent in the literature related to the LV cavity, namely, mean radial displacement and mean radial velocity.

Research paper thumbnail of Left ventricle segmentation via graph cut distribution matching

Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention, 2009

We present a discrete kernel density matching energy for segmenting the left ventricle cavity in ... more We present a discrete kernel density matching energy for segmenting the left ventricle cavity in cardiac magnetic resonance sequences. The energy and its graph cut optimization based on an original first-order approximation of the Bhattacharyya measure have not been proposed previously, and yield competitive results in nearly real-time. The algorithm seeks a region within each frame by optimization of two priors, one geometric (distance-based) and the other photometric, each measuring a distribution similarity between the region and a model learned from the first frame. Based on global rather than pixelwise information, the proposed algorithm does not require complex training and optimization with respect to geometric transformations. Unlike related active contour methods, it does not compute iterative updates of computationally expensive kernel densities. Furthermore, the proposed first-order analysis can be used for other intractable energies and, therefore, can lead to segmentati...

Research paper thumbnail of Left ventricle segmentation in MRI via convex relaxed distribution matching

Medical Image Analysis, 2013

A fundamental step in the diagnosis of cardiovascular diseases, automatic left ventricle (LV) seg... more A fundamental step in the diagnosis of cardiovascular diseases, automatic left ventricle (LV) segmentation in cardiac magnetic resonance images (MRIs) is still acknowledged to be a difficult problem. Most of the existing algorithms require either extensive training or intensive user inputs. This study investigates fast detection of the left ventricle (LV) endo-and epicardium surfaces in cardiac MRI via convex relaxation and distribution matching. The algorithm requires a single subject for training and a very simple user input, which amounts to a single point (mouse click) per target region (cavity or myocardium). It seeks cavity and myocardium regions within each 3D phase by optimizing two functionals, each containing two distribution-matching constraints: (1) a distance-based shape prior and (2) an intensity prior. Based on a global measure of similarity between distributions, the shape prior is intrinsically invariant with respect to translation and rotation. We further introduce a scale variable from which we derive a fixed-point equation (FPE), thereby achieving scale-invariance with only few fast computations. The proposed algorithm relaxes the need for costly pose estimation (or registration) procedures and large training sets, and can tolerate shape deformations, unlike template (or atlas) based priors. Our formulation leads to a challenging problem, which is not directly amenable to convex-optimization techniques. For each functional, we split the problem into a sequence of sub-problems, each of which can be solved exactly and globally via a convex relaxation and the augmented Lagrangian method. Unlike related graph-cut approaches, the proposed convex-relaxation solution can be parallelized to reduce substantially the computational time for 3D domains (or higher), extends directly to high dimensions, and does not have the grid-bias problem. 
Our parallelized implementation on a graphics processing unit (GPU) demonstrates that the proposed algorithm requires about 3.87 s for a typical cardiac MRI volume, a speed-up of about five times compared to a standard implementation. We report a performance evaluation over 400 volumes acquired from 20 subjects, which shows that the obtained 3D surfaces correlate with independent manual delineations. We further demonstrate experimentally that (1) the performance of the algorithm is not significantly affected by the choice of the training subject and (2) the shape description we use does not change significantly from one subject to another. These results support the fact that a single subject is sufficient for training the proposed algorithm.

Research paper thumbnail of Regional Assessment of Cardiac Left Ventricular Myocardial Function via MRI Statistical Features

IEEE Transactions on Medical Imaging, 2000

Automating the detection and localization of segmental (regional) left ventricle (LV) abnormalities in magnetic resonance imaging (MRI) has recently sparked an impressive research effort, with promising performances and a breadth of techniques. However, despite such an effort, the problem is still acknowledged to be challenging, with much room for improvements in regard to accuracy. Furthermore, most of the existing techniques are labor intensive, requiring delineations of the endo- and/or epicardial boundaries in all frames of a cardiac sequence. The purpose of this study is to investigate a real-time machine-learning approach which uses some image features that can be easily computed, but that nevertheless correlate well with the segmental cardiac function. Starting from a minimal user input in only one frame in a subject dataset, we build for all the regional segments and all subsequent frames a set of statistical MRI features based on a measure of similarity between distributions. We demonstrate that, over a cardiac cycle, the statistical features are related to the proportion of blood within each segment. Therefore, they can characterize segmental contraction without the need for delineating the LV boundaries in all the frames. We first seek the optimal direction along which the proposed image features are most descriptive via a linear discriminant analysis. Then, using the results as inputs to a linear support vector machine classifier, we obtain an abnormality assessment of each of the standard cardiac segments in real-time. We report a comprehensive experimental evaluation of the proposed algorithm over 928 cardiac segments obtained from 58 subjects. Compared against ground-truth evaluations by experienced radiologists, the proposed algorithm performed competitively, with an overall classification accuracy of 86.09% and a kappa measure of 0.73.
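The linear-discriminant projection step described in the abstract can be illustrated with a small, self-contained sketch. This is not the paper's implementation: the two-dimensional feature vectors, the class samples, and all helper names below are hypothetical, and the sketch computes only the classical Fisher discriminant direction w = S_w^{-1}(m_1 - m_0) onto which features would be projected before a linear classifier is applied.

```python
def mean(vs):
    n = len(vs)
    return [sum(v[i] for v in vs) / n for i in range(len(vs[0]))]

def scatter(vs, m):
    # 2x2 within-class scatter contribution of one class
    s = [[0.0, 0.0], [0.0, 0.0]]
    for v in vs:
        d = [v[0] - m[0], v[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def lda_direction(class0, class1):
    # Fisher LDA direction w = Sw^-1 (m1 - m0) for two classes of 2-D features
    m0, m1 = mean(class0), mean(class1)
    s0, s1 = scatter(class0, m0), scatter(class1, m1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    return [inv[0][0] * dm[0] + inv[0][1] * dm[1],
            inv[1][0] * dm[0] + inv[1][1] * dm[1]]

# hypothetical per-segment feature pairs (e.g., two similarity scores)
normal = [(0.9, 0.8), (1.0, 0.9), (0.8, 1.0)]
abnormal = [(0.3, 0.2), (0.2, 0.4), (0.4, 0.3)]
w = lda_direction(normal, abnormal)
proj = lambda v: w[0] * v[0] + w[1] * v[1]   # 1-D projected feature
```

On these toy samples the projected classes separate cleanly, which is the property a downstream linear SVM would exploit.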

Research paper thumbnail of Detecting left ventricular impaired relaxation using MR imaging

IEEE-EMBS International Conference on Biomedical and Health Informatics (BHI), 2014

This study investigates automatic assessment of the left ventricular (LV) diastolic function using cine cardiac magnetic resonance (MR) imaging. Most of the existing LV assessment algorithms addressed the systolic function, which essentially pertains to the analysis of regional wall motion abnormalities or the estimation of the ejection fraction. However, several recent clinical studies suggested that evaluating the diastolic function is essential. The diastolic function plays an important role in assessing cardiovascular abnormalities, particularly in the case of heart failure with preserved ejection fraction. The assessments of LV relaxation and stiffness abnormalities can be achieved with cardiac MR imaging. Unlike transthoracic echocardiography, MR is not limited by an acoustic window, and allows exhaustive myocardial imaging with excellent spatial resolution. We propose an algorithm that evaluates the LV relaxation from short-axis cine MR images. The method is based on three main steps: (1) non-rigid registration, which yields a sequence of points over time, given a user-provided contour on the first frame; (2) computations of the LV filling rate and volume over the cardiac cycle; and (3) automatic detection of the maxima of the E and A waves. We report comprehensive experimental evaluations over MR data sets acquired from 53 subjects, including comparisons with independent reports for the same subjects from echocardiography. The proposed algorithm yielded a kappa measure of 0.66, indicating substantial agreement with the echocardiography results.
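Step (3) above, detecting the E- and A-wave maxima on the filling-rate curve, can be sketched as a simple peak search over the time derivative of the LV volume. The volume samples and the peak-picking heuristic below are illustrative assumptions, not the paper's actual detection procedure:

```python
def filling_rate(volumes, dt=1.0):
    # forward difference of LV volume; positive values correspond to filling
    return [(v2 - v1) / dt for v1, v2 in zip(volumes, volumes[1:])]

def local_maxima(signal):
    # indices of local maxima among positive (filling) samples
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > 0 and signal[i] >= signal[i - 1] and signal[i] > signal[i + 1]]

def e_a_peaks(volumes):
    rate = filling_rate(volumes)
    peaks = local_maxima(rate)
    if len(peaks) < 2:
        return None
    # E wave: early filling peak; A wave: late (atrial contraction) peak
    e_idx, a_idx = peaks[0], peaks[-1]
    return rate[e_idx], rate[a_idx]

# synthetic LV volume curve (ml): systolic ejection, then E and A filling waves
vols = [120, 100, 80, 60, 50, 48, 60, 80, 95, 100, 102, 110, 118, 120]
e, a = e_a_peaks(vols)
```

Here the early peak exceeds the late one (E > A), the pattern expected for normal relaxation; a reversed ratio is a common marker of impaired relaxation.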

Research paper thumbnail of Assessment of regional myocardial function via statistical features in MR images

Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention, 2011

Early and accurate detection of left ventricle (LV) regional wall motion abnormalities significantly helps in the diagnosis and follow-up of cardiovascular diseases. We present a regional myocardial abnormality detection framework based on image statistics. The proposed framework requires minimal user interaction, only to specify an initial delineation and anatomical landmarks on the first frame. Then, approximations of regional myocardial segments in subsequent frames are systematically obtained by superimposing the initial delineation on the rest of the frames. The proposed method exploits the Bhattacharyya coefficient to measure the similarity between the image distribution within each segment approximation and the distribution of the corresponding user-provided segment. Linear discriminant analysis (LDA) is applied to find the optimal direction along which the projected features are the most descriptive. Then a linear support vector machine (SVM) classifier is employed for each o...

Research paper thumbnail of A convex max-flow segmentation of LV using subject-specific distributions on cardiac MRI

Information processing in medical imaging : proceedings of the ... conference, 2011

This work studies the convex relaxation approach to left ventricle (LV) segmentation, which gives rise to a challenging multi-region separation with a geometrical constraint. For each region, we consider the global Bhattacharyya metric prior to evaluate a gray-scale and a radial distance distribution matching. In this regard, the studied problem amounts to finding three regions that most closely match their respective input distribution models. It was previously addressed by curve evolution, which leads to sub-optimal and computationally intensive algorithms, or by graph cuts, which result in heavy metrication errors (grid bias). The proposed convex relaxation approach solves the LV segmentation through a sequence of convex sub-problems. Each sub-problem leads to a novel bound of the Bhattacharyya measure and yields a convex formulation which paves the way to build an efficient and reliable solver. In this respect, we propose a novel flow configuration that accounts for la...

Research paper thumbnail of Regional heart motion abnormality detection via multiview fusion

Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention, 2012

This study investigates regional heart motion abnormality detection via multiview fusion in cine cardiac MR images. In contrast to previous methods which rely only on short-axis image sequences, the proposed approach exploits the information from several other long-axis image sequences, namely, 2-chamber, 3-chamber and 4-chamber MR images. Our analysis follows the standard issued by the American Heart Association to identify 17 standardized left ventricular segments. The proposed method first computes an initial sequence of corresponding myocardial points using a nonrigid image registration algorithm within each sequence. Then, these points are mapped to 3D space and tracked using an unscented Kalman filter (UKS). We propose a maximum likelihood based track-to-track fusion approach to combine UKS tracks from multiple image views. Finally, we use Shannon's differential entropy of distributions of potential classifiers obtained from multiview fusion estimates, and a naive Bayes classi...
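The track-to-track fusion step can be illustrated in its simplest form: maximum-likelihood fusion of two independent Gaussian track estimates of a scalar state, with fused covariance P = (P1^-1 + P2^-1)^-1. This is a generic textbook sketch that assumes negligible cross-covariance between the tracks, not the paper's multiview formulation:

```python
def fuse(x1, p1, x2, p2):
    # ML fusion of two independent Gaussian estimates (scalar state).
    # Assumption: the cross-covariance between the two tracks is negligible.
    p = 1.0 / (1.0 / p1 + 1.0 / p2)          # fused variance
    x = p * (x1 / p1 + x2 / p2)              # precision-weighted mean
    return x, p

# hypothetical estimates of the same myocardial point from two views
x, p = fuse(10.0, 4.0, 12.0, 1.0)
```

The fused mean lands closer to the more confident (lower-variance) track, and the fused variance is smaller than either input variance.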

Research paper thumbnail of Heart motion abnormality detection via an information measure and Bayesian filtering

Medical image computing and computer-assisted intervention : MICCAI ... International Conference on Medical Image Computing and Computer-Assisted Intervention, 2009

This study investigates heart wall motion abnormality detection with an information theoretic measure of heart motion based on Shannon's differential entropy (SDE) and recursive Bayesian filtering. Heart wall motion is generally analyzed using functional images, which are subject to noise and segmentation inaccuracies, and incorporation of prior knowledge is crucial in improving the accuracy. The Kalman filter, a well-known recursive Bayesian filter, is used in this study to estimate the left ventricular (LV) cavity points given incomplete and noisy data, and given a dynamic model. However, due to similarities between the statistical information of normal and abnormal heart motions, detecting and classifying abnormality is a challenging problem, which we propose to investigate with a global measure based on the SDE. We further derive two other possible information theoretic abnormality detection criteria, one based on Rényi entropy and the other on Fisher information. The ...
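The recursive Bayesian estimation the abstract refers to can be sketched in its simplest scalar form: a Kalman filter with a random-walk dynamic model smoothing noisy samples of a radial distance. The noise levels and measurements below are illustrative assumptions, far simpler than the paper's LV cavity model:

```python
def kalman_1d(measurements, q=0.01, r=0.25, x0=0.0, p0=1.0):
    # scalar Kalman filter with a random-walk dynamic model:
    #   x_k = x_{k-1} + w_k (process noise q),  z_k = x_k + v_k (noise r)
    x, p, estimates = x0, p0, []
    for z in measurements:
        p = p + q                 # predict: variance grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with measurement z
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# noisy radial-distance samples around a true value of 1.0 (illustrative)
zs = [1.1, 0.9, 1.05, 0.95, 1.0, 1.02, 0.98]
est = kalman_1d(zs, x0=1.0)
```

The filtered sequence stays near the underlying value while attenuating the measurement noise.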

Research paper thumbnail of Accuracy of magnetic resonance imaging–cone beam computed tomography rigid registration of the head: an in-vitro study

Oral Surgery, Oral Medicine, Oral Pathology and Oral Radiology, 2015

To evaluate the performance of a cross-modality image registration procedure between magnetic resonance imaging (MRI) and cone beam computed tomography (CBCT). In vitro diagnostic MRI and CBCT images of 5 cadaver swine heads were obtained prospectively. Five radiopaque fiducial markers were attached to each cadaver skull by using resin screws. Automatic MRI-CBCT rigid registrations were performed. The specimens were then scanned using a 3-dimensional (3-D) laser scanner. The 3-D coordinate points for the centroids of the attached fiducial markers from the laser scan were identified and considered ground truth. The distances between marker centroids were measured with MRI, CBCT, and MRI-CBCT. Accuracy was calculated by using repeated measures analysis of variance and mean difference values. The registration method was repeated 10 times for each specimen in MRI to measure the average error. There was no significant difference (P > .05) in mean distances of the markers between all images and the ground truth. The mean differences in distance between MRI, CBCT, and MRI-CBCT and the ground truth were 0.2 ± 1.1 mm, 0.3 ± 1.0 mm, and 0.2 ± 1.2 mm, respectively. The detected method error ranged between 0.06 mm and 0.1 mm. The cross-modality image registration algorithm is accurate for head MRI-CBCT registration.
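The accuracy metric used in the study, pairwise distances between marker centroids compared against the laser-scan ground truth, can be reproduced in a few lines. The coordinates below are made-up illustrative values (in mm), not the study's data:

```python
import math

def centroid_distances(points):
    # all pairwise distances between 3-D fiducial centroids
    d = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d.append(math.dist(points[i], points[j]))
    return d

def mean_difference(measured, truth):
    # signed mean difference between corresponding inter-marker distances
    return sum(m - t for m, t in zip(measured, truth)) / len(truth)

# hypothetical centroids: laser-scan ground truth vs. MRI-derived positions
truth = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (0.0, 10.0, 0.0), (0.0, 0.0, 10.0)]
mri = [(0.1, 0.0, 0.0), (10.0, 0.1, 0.0), (0.0, 10.1, 0.0), (0.0, 0.0, 9.9)]
err = mean_difference(centroid_distances(mri), centroid_distances(truth))
```

With four markers there are six inter-marker distances, and the mean difference stays at the sub-millimetre level reported in the study.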

Research paper thumbnail of Distribution Matching with the Bhattacharyya Similarity: A Bound Optimization Framework

We present efficient graph cut algorithms for three problems: (1) finding a region in an image, so that the histogram (or distribution) of an image feature within the region most closely matches a given model; (2) cosegmentation of image pairs; and (3) interactive image segmentation with a user-provided bounding box. Each algorithm seeks the optimum of a global cost function based on the Bhattacharyya measure, a convenient alternative to other matching measures such as the Kullback-Leibler divergence. Our functionals are not directly amenable to graph cut optimization as they contain non-linear functions of fractional terms, which make the ensuing optimization problems challenging. We first derive a family of parametric bounds of the Bhattacharyya measure by introducing an auxiliary labeling. Then, we show that these bounds are auxiliary functions of the Bhattacharyya measure, a result which allows us to solve each problem efficiently via graph cuts. We show that the proposed optimization procedures converge within very few graph cut iterations. Comprehensive and various experiments, including quantitative and comparative evaluations over two databases, demonstrate the advantages of the proposed algorithms over related works in regard to optimality, computational load, accuracy and flexibility.
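The Bhattacharyya measure at the heart of these functionals is simple to compute for discrete histograms: BC(p, q) = Σ_i √(p_i q_i), which equals 1 for identical normalized distributions and decreases as they diverge. A minimal sketch (the example distributions are arbitrary):

```python
import math

def bhattacharyya(p, q):
    # Bhattacharyya coefficient between two discrete distributions,
    # each assumed normalized to sum to 1
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

p = [0.2, 0.5, 0.3]
q = [0.3, 0.4, 0.3]
bc_pq = bhattacharyya(p, q)   # similarity between the two models
bc_pp = bhattacharyya(p, p)   # self-similarity, equals 1
```

In distribution-matching segmentation, one seeks the region whose feature histogram maximizes this coefficient against a model histogram.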

Research paper thumbnail of Nonlinear filtering algorithms for multitarget tracking

Tracking multiple targets with uncertain target dynamics is a difficult problem, especially with nonlinear state and/or measurement equations. Random finite set theory provides a rigorous foundation for multitarget tracking problems. It provides a framework to represent the full multitarget posterior in contrast to other conventional approaches. However, the computational complexity of performing the multitarget recursion grows exponentially with the number of ...

Research paper thumbnail of A distributed implementation of a sequential Monte Carlo probability hypothesis density filter for sensor networks

Signal Processing, Sensor Fusion, and Target Recognition XV, 2006

This paper presents a Sequential Monte Carlo (SMC) Probability Hypothesis Density (PHD) algorithm for decentralized state estimation from multiple platforms. The proposed algorithm addresses the problem of communicating and fusing track information from a team of multiple sensing platforms detecting and tracking multiple targets in the surveillance region. Each sensing platform makes multiple, noisy measurements of an underlying, time-varying state ...

Research paper thumbnail of Adaptive Sensor Management for Target Tracking with Sensor Deployment and Motion Uncertainties

Research paper thumbnail of Improved multi-target tracking using probability hypothesis density smoothing

Signal and Data Processing of Small Targets 2007, 2007

The optimal Bayesian multi-target tracking is computationally demanding. The probability hypothesis density (PHD) filter, which is a first-moment approximation of the optimal one, is a computationally tractable alternative. By evaluating the PHD, one can extract the number of targets as well as their individual states. Recent sequential Monte Carlo (SMC) implementations of the PHD filter pave the way to apply the PHD filter to nonlinear non-Gaussian problems. The particle implementation of the PHD filter is, however, more dependent on current measurements, especially in the case of low-observable target problems (i.e., estimates are sensitive to missed detections and false alarms). In this paper, a PHD smoothing algorithm is proposed to improve the capability of the PHD-based tracking system. By performing smoothing, which gives delayed estimates, we obtain not only better estimates for the target states but also a better estimate for the number of targets. Simulations are performed with a multi-target scenario. Simulation results confirm the improved performance of the proposed algorithm.

Research paper thumbnail of Spline filter for nonlinear/non-Gaussian Bayesian tracking

Signal and Data Processing of Small Targets 2007, 2007

This paper presents a method for the realization of nonlinear/non-Gaussian Bayesian filtering based on spline interpolation. Sequential Monte Carlo (SMC) approaches are widely used in nonlinear/non-Gaussian Bayesian filtering, in which the densities are approximated by taking a discrete set of points in the state space. In contrast to the SMC methods, the proposed approach uses spline polynomial interpolation to approximate the ...
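The continuous-approximation idea, representing a density by an interpolating spline rather than a discrete particle set, can be sketched with a generic natural cubic spline on equally spaced knots. This is the standard textbook construction (a tridiagonal Thomas solve for the knot second derivatives), not the filter's actual implementation; the knot values stand in for samples of an unnormalized density:

```python
def natural_cubic_spline(xs, ys):
    # natural cubic spline on equally spaced knots; returns an evaluator
    n = len(xs) - 1
    h = xs[1] - xs[0]
    # right-hand side for interior second derivatives; natural ends: M0 = Mn = 0
    d = [6.0 * (ys[i - 1] - 2.0 * ys[i] + ys[i + 1]) / (h * h)
         for i in range(1, n)]
    # Thomas algorithm for the tridiagonal system with diagonals (1, 4, 1)
    m = len(d)
    c = [0.0] * m
    x = [0.0] * m
    if m:
        c[0] = 1.0 / 4.0
        x[0] = d[0] / 4.0
        for i in range(1, m):
            denom = 4.0 - c[i - 1]
            c[i] = 1.0 / denom
            x[i] = (d[i] - x[i - 1]) / denom
        for i in range(m - 2, -1, -1):
            x[i] -= c[i] * x[i + 1]
    M = [0.0] + x + [0.0]   # second derivatives at all knots

    def evaluate(t):
        i = min(max(int((t - xs[0]) / h), 0), n - 1)
        u = t - xs[i]
        return (ys[i]
                + u * ((ys[i + 1] - ys[i]) / h - h * (2.0 * M[i] + M[i + 1]) / 6.0)
                + M[i] * u * u / 2.0
                + (M[i + 1] - M[i]) * u ** 3 / (6.0 * h))
    return evaluate

# density-like knot values on a regular grid (illustrative)
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.3, 0.4, 0.15, 0.05]
spl = natural_cubic_spline(xs, ys)
```

The evaluator reproduces the knot values exactly and interpolates smoothly in between, which is what allows a filter to evaluate the approximated density at arbitrary state-space points.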

Research paper thumbnail of Regional Heart Motion Abnormality Detection via Information Measures and Unscented Kalman Filtering

Lecture Notes in Computer Science, 2010

This study investigates regional heart motion abnormality detection using various classifier features with Shannon's differential entropy (SDE). Rather than relying on elementary measurements or a fixed set of moments, the SDE measures global distribution information and, as such, has more discriminative power in classifying distributions. Based on functional images, which are subject to noise and segmentation inaccuracies, heart wall motion analysis is acknowledged as a difficult problem and, therefore, incorporation of prior knowledge is desirable to enhance the accuracy. Given noisy data and a nonlinear dynamic model to describe the myocardial motion, the unscented Kalman filter, a recursive nonlinear Bayesian filter, is devised in this study to estimate LV cavity points. Subsequently, a naive Bayes classifier algorithm is constructed from the SDEs of different features in order to automatically detect abnormal functional regions of the myocardium. Using 90×20 segmented LV cavities of short-axis magnetic resonance images obtained from 30 subjects, the experimental analysis carried out over 480 myocardial segments demonstrates that the proposed method performs significantly better than other recent methods, and can lead to a promising diagnostic support tool to assist clinicians.
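The Shannon differential entropy that drives the classification can be estimated from samples with a simple histogram plug-in estimator, h ≈ -Σ p_i ln(p_i / Δ) with bin width Δ. The samples below are synthetic stand-ins for the paper's feature distributions, and the bin count is an arbitrary choice:

```python
import math

def differential_entropy(samples, bins=16):
    # histogram plug-in estimate of Shannon's differential entropy (in nats)
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for s in samples:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    n = float(len(samples))
    return -sum((c / n) * math.log((c / n) / width) for c in counts if c)

# a widely spread sample set vs. a concentrated one (synthetic)
spread = [i / 100.0 for i in range(100)]    # roughly uniform on [0, 1)
narrow = [i / 1000.0 for i in range(100)]   # concentrated on [0, 0.1)
h_spread = differential_entropy(spread)
h_narrow = differential_entropy(narrow)
```

The concentrated distribution yields a markedly lower (here negative) entropy, which is the kind of global distribution difference the SDE-based features exploit.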

Research paper thumbnail of Integrated bias removal in passive radar systems

Signal and Data Processing of Small Targets 2008, 2008

A passive coherent location (PCL) system exploits the ambient FM radio or television signals from powerful local transmitters, which makes it ideal for covert tracking. In a passive radar system, also known as a PCL system, a variety of measurements can be used to estimate target states, such as direction of arrival (DOA), time difference of arrival (TDOA) or Doppler shift.

Research paper thumbnail of A sequential Monte Carlo probability hypothesis density algorithm for multitarget track-before-detect

Signal and Data Processing of Small Targets 2005, 2005

In this paper, we present a recursive track-before-detect (TBD) algorithm based on the probability hypothesis density (PHD) filter for multitarget tracking. TBD algorithms are better suited than standard target tracking methods for tracking dim targets in heavy clutter and noise. Classical target tracking, where the measurements are pre-processed at each time step before passing them to the tracking filter, results in information loss, which is very damaging if the target signal-to-noise ratio is low. However, in TBD the tracking filter operates directly on the raw measurements at the expense of added computational burden. The development of a recursive TBD algorithm reduces the computational burden over conventional TBD methods, namely, Hough transform, dynamic programming, etc. TBD is a hard nonlinear non-Gaussian problem even for single-target scenarios. Recent advances in Sequential Monte Carlo (SMC) based nonlinear filtering make multitarget TBD feasible. However, current implementations use a modeling setup to accommodate the varying number of targets, where a multiple-model SMC based TBD approach is used to solve the problem conditioned on the model, i.e., the number of targets. The PHD filter, which propagates only the first-order statistical moment (or the PHD) of the full target posterior, has been shown to be a computationally efficient solution to multitarget tracking problems with a varying number of targets. We propose a PHD filter based TBD so that no assumption needs to be made on the number of targets. Simulation results are presented to show the effectiveness of the proposed filter in tracking multiple weak targets.
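The core PHD recursion these papers build on can be sketched with a minimal particle implementation in one dimension. Everything below, the motion and measurement models, the clutter level, and the two-target scene, is an illustrative assumption; the sketch performs one predict/update cycle and reads off the expected number of targets as the sum of particle weights:

```python
import math, random

random.seed(0)
P_S, P_D = 0.99, 0.9      # survival and detection probabilities (assumed)
CLUTTER = 0.1             # clutter intensity, assumed uniform over the scene
SIGMA = 1.0               # measurement noise standard deviation

def gauss(z, x):
    # measurement likelihood g(z | x) for a direct noisy position measurement
    return math.exp(-0.5 * ((z - x) / SIGMA) ** 2) / (SIGMA * math.sqrt(2.0 * math.pi))

def phd_predict(particles):
    # survival weighting plus random-walk motion noise
    return [(x + random.gauss(0.0, 0.1), P_S * w) for x, w in particles]

def phd_update(particles, measurements):
    # PHD update: missed-detection term plus one normalized term per measurement
    denom = {z: CLUTTER + sum(P_D * gauss(z, x) * w for x, w in particles)
             for z in measurements}
    return [(x, w * (1.0 - P_D + sum(P_D * gauss(z, x) / denom[z]
                                     for z in measurements)))
            for x, w in particles]

# intensity initialized around two hypothetical targets at 0 and 10,
# 100 particles per target carrying one unit of weight per target in total
particles = [(random.gauss(t, 0.5), 1.0 / 100) for t in (0.0, 10.0) for _ in range(100)]
measurements = [0.1, 9.8]
particles = phd_update(phd_predict(particles), measurements)
n_hat = sum(w for _, w in particles)   # expected number of targets
```

After one cycle with one detection near each target, the weight mass stays close to two, the PHD's estimate of the target count; a full filter would add a birth intensity and resampling.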

Research paper thumbnail of Spline filter for multidimensional nonlinear/non-Gaussian Bayesian tracking

Signal and Data Processing of Small Targets 2008, 2008

This paper presents a novel continuous approximation approach to nonlinear/non-Gaussian Bayesian tracking. A good representation of the probability density and likelihood functions is essential for the effectiveness of nonlinear filtering algorithms since these functions ...