Elena Casiraghi | Università degli Studi di Milano - State University of Milan (Italy)
Papers by Elena Casiraghi
Purpose: Patients with familial adenomatous polyposis (FAP) undergoing total colectomy with ileum-rectum anastomosis (IRA) may develop cancer in the rectal stump. The purpose of this study was to evaluate the survival rate of patients with FAP after they develop cancer in the rectal stump. Methodology: The database of the Hereditary Digestive Tumor Registry at Fondazione IRCCS Istituto Tumori of Milan was reviewed. Patients who underwent total colectomy/IRA between 1935 and 2014 were included in the study, and those who developed cancer in the rectal stump were identified. Their survival rate was assessed using the Kaplan-Meier method. Results: Of 697 patients who underwent total colectomy with IRA, 49 (7%) developed a cancer in the rectal stump. For these 49 patients, the median (range) age at diagnosis was 42 years (21-67); the APC mutation was pathogenetic in 43 patients (88%), and in 12 patients (24%) the mutation was located between codons 1061 and 1309. The median (range) interval from total colectomy/IRA to the development of cancer in the rectal stump was 157 months (12-486). The cancer stage was A/B in 38 patients (77.5%) and C/D in 11 patients (22.5%). With a median (range) follow-up of 88.3 months (12-368) after the development of cancer in the rectal stump, the 10-year survival rate was 72%. Conclusion: In the present series, cancer in the rectal stump is a long-term risk; this may support a conservative approach at the first surgery in FAP patients.
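The survival analysis described above relies on the Kaplan-Meier estimator, which multiplies the conditional survival probability at each observed event time. A minimal sketch of the estimator (the toy follow-up data here are illustrative, not the registry data from the study):

```python
from collections import Counter

def kaplan_meier(durations, events):
    """Kaplan-Meier survival estimates.
    durations: follow-up times; events: 1 = event observed, 0 = censored.
    Returns a list of (time, survival probability) at each event time."""
    deaths = Counter(t for t, e in zip(durations, events) if e)
    at_risk = len(durations)
    surv, curve = 1.0, []
    for t in sorted(set(durations)):
        d = deaths.get(t, 0)
        if d:
            surv *= 1 - d / at_risk       # S(t) = prod(1 - d_i / n_i)
            curve.append((t, surv))
        # subjects failing or censored at t leave the risk set afterwards
        at_risk -= sum(1 for u in durations if u == t)
    return curve

# Toy example: five subjects, one censored at time 6.
print(kaplan_meier([2, 4, 5, 6, 6], [1, 1, 1, 0, 1]))
```

In practice a library such as `lifelines` would also provide confidence intervals; the hand-rolled version above only shows the point estimate.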
Lecture Notes in Computer Science, 2015
In the past two decades the estimation of the intrinsic dimensionality of a dataset has gained considerable importance, since it provides relevant information for several real-life applications. Unfortunately, although a great deal of research effort has been devoted to the development of effective intrinsic dimensionality estimators, the problem is still open. For this reason, in this paper we propose a novel robust intrinsic dimensionality estimator that exploits the information conveyed by the normalized nearest neighbor distances, through a technique based on rank-order statistics that limits the common underestimation issues related to the edge effect. Experiments performed on both synthetic and real datasets highlight the robustness and the effectiveness of the proposed algorithm when compared to state-of-the-art methodologies.
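To illustrate the family of nearest-neighbor-distance estimators the abstract refers to, here is a sketch of the classic Levina-Bickel maximum-likelihood estimator (a well-known baseline, not the rank-order method proposed in the paper):

```python
import math
import random

def mle_intrinsic_dim(points, k=10):
    """Levina-Bickel MLE intrinsic dimension estimate.
    Averages, over all points, (k-1) / sum_j log(T_k / T_j),
    where T_j is the distance to the j-th nearest neighbor."""
    estimates = []
    for i, p in enumerate(points):
        dists = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        t = dists[:k]                                   # k nearest-neighbor distances
        s = sum(math.log(t[k - 1] / t[j]) for j in range(k - 1))
        estimates.append((k - 1) / s)
    return sum(estimates) / len(estimates)

# Points on a 1-D line embedded in 3-D: the estimate should be close to 1.
random.seed(0)
line = [(x, 2 * x, -x) for x in (random.random() for _ in range(200))]
print(round(mle_intrinsic_dim(line), 2))
```

Like other neighbor-distance estimators, this one tends to underestimate for high intrinsic dimensions because of the edge effect, which is exactly the issue the paper's rank-order technique targets.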
European Journal of Radiology, 2015
Lecture Notes in Computer Science, 2013
Mathematical Problems in Engineering, 2015
When dealing with datasets comprising high-dimensional points, it is usually advantageous to discover some data structure. A fundamental piece of information needed to this aim is the minimum number of parameters required to describe the data while minimizing the information loss. This number, usually called the intrinsic dimension, can be interpreted as the dimension of the manifold from which the input data are supposed to be drawn. Due to its usefulness in many theoretical and practical problems, in the last decades the concept of intrinsic dimension has gained considerable attention in the scientific community, motivating the large number of intrinsic dimensionality estimators proposed in the literature. However, the problem is still open, since most techniques cannot efficiently deal with datasets drawn from manifolds of high intrinsic dimension and nonlinearly embedded in higher dimensional spaces. This paper surveys some of the most interesting, widely used, and advanced state-of-the-a...
Image Analysis and Processing - ICIAP 2017
Computers
Missing data imputation has been a hot topic in the past decade, and many state-of-the-art works have been presented proposing novel, interesting solutions that have been applied in a variety of fields. In the same period, the successful results achieved by deep learning techniques have opened the way to their application to difficult problems where human skill cannot provide a reliable solution. Not surprisingly, some deep learners, mainly exploiting encoder-decoder architectures, have also been designed and applied to the task of missing data imputation. However, most of the proposed imputation techniques have not been designed to tackle "complex data", that is, high-dimensional data belonging to datasets with huge cardinality and describing complex problems. Specifically, they often need critical parameters to be manually set, or exploit complex architectures and/or training phases that make their computational load impracticable. In this paper, after clustering the s...
Scientific Reports
Methods for phenotype and outcome prediction are largely based on inductive supervised models that use selected biomarkers to make predictions, without explicitly considering the functional relationships between individuals. We introduce a novel network-based approach named Patient-Net (P-Net), in which biomolecular profiles of patients are modeled in a graph-structured space that represents gene expression relationships between patients. A kernel-based semi-supervised transductive algorithm is then applied to the graph to explore its overall topology and to predict the phenotype/clinical outcome of patients. Experimental tests involving several publicly available datasets of patients afflicted with pancreatic, breast, colon and colorectal cancer show that our proposed method is competitive with state-of-the-art supervised and semi-supervised predictive systems. Importantly, P-Net also provides interpretable models that can be easily visualized to gain clues about the re...
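The transductive idea behind P-Net, propagating the few known labels over a patient-similarity graph, can be sketched with plain iterative label propagation (a simplified stand-in; the paper uses a kernel-based algorithm, and the adjacency matrix below is a made-up toy graph):

```python
def label_propagation(adj, labels, alpha=0.8, iters=100):
    """Transductive label propagation on a patient-similarity graph.
    adj: symmetric weight matrix; labels: +1/-1 for labeled patients, 0 unknown.
    Each iteration mixes the known labels with the degree-normalized
    scores of graph neighbors."""
    n = len(adj)
    deg = [sum(row) or 1.0 for row in adj]
    f = list(labels)
    for _ in range(iters):
        f = [
            (1 - alpha) * labels[i]
            + alpha * sum(adj[i][j] * f[j] / deg[j] for j in range(n))
            for i in range(n)
        ]
    return f

# Two clusters of three patients, weakly bridged; one label known per cluster.
adj = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],   # node 2 bridges the two clusters
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
scores = label_propagation(adj, [1, 0, 0, 0, 0, -1])
print([round(s, 2) for s in scores])
```

Unlabeled patients inherit the sign of their cluster's labeled patient, which is the behavior that lets a transductive method exploit the overall topology of the graph rather than only per-patient biomarkers.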
Universal Access in the Information Society
Pattern Recognition Letters
BMC Bioinformatics
Background: The protein ki67 (pki67) is a marker of tumor aggressiveness, and its expression has been proven to be useful in the prognostic and predictive evaluation of several types of tumors. To numerically quantify the presence of pki67 in cancerous tissue areas, pathologists generally analyze histochemical images to count the number of tumor nuclei marked for pki67. This allows estimating the ki67-index, that is, the percentage of tumor nuclei positive for pki67 over all the tumor nuclei. Given the high image resolution and dimensions, its estimation by expert clinicians is particularly laborious and time consuming. Though automatic cell counting techniques have been presented so far, the problem is still open. Results: In this paper we present a novel automatic approach for the estimation of the ki67-index. The method starts by exploiting the STRESS algorithm to produce a color-enhanced image where all pixels belonging to nuclei are easily identified by thresholding, and then separa...
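Once nuclei pixels have been isolated by thresholding, the ki67-index reduces to counting connected regions in the positive and negative masks and taking their ratio. A toy sketch of that counting step (flood-fill component labeling on hand-made binary masks; the paper's full pipeline, including the STRESS enhancement and nuclei separation, is considerably more involved):

```python
def count_nuclei(mask):
    """Count connected pixel regions (4-connectivity) in a binary mask,
    a toy stand-in for the nucleus detection step."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not seen[y][x]:
                count += 1
                stack = [(y, x)]          # flood-fill this component
                while stack:
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and mask[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return count

def ki67_index(positive_mask, negative_mask):
    """ki67-index = positive tumor nuclei / all tumor nuclei."""
    pos = count_nuclei(positive_mask)
    total = pos + count_nuclei(negative_mask)
    return pos / total if total else 0.0

pos = [[1, 1, 0, 0],
       [0, 0, 0, 1]]          # 2 positive nuclei
neg = [[0, 0, 1, 0],
       [1, 0, 1, 0]]          # 2 negative nuclei
print(ki67_index(pos, neg))   # 2 / (2 + 2)
```

On real slides, touching nuclei must first be separated (e.g. by watershed-style splitting) before a component count like this is meaningful.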
ELCVIA Electronic Letters on Computer Vision and Image Analysis
Journal of Clinical Investigation
IEEE Transactions on Medical Imaging, Dec 1, 2006
Camera calibration is a central topic in computer vision, since it is the first and fundamental step for image rectification, 3D modelling and reconstruction. Good results can be obtained using very well known camera calibration algorithms like the ones presented by Zhang or Tsai; both of them need an accurate initialization procedure that requires determining the corner positions of a calibration pattern (e.g. a chessboard) with very high precision. In this paper we propose an efficient algorithm which determines the chessboard corners with subpixel precision; moreover, it does not make any assumption on the scale and orientation of the chessboard, and works under very different illumination conditions. The method first localizes the chessboard in the image, then it determines the size of its squared elements, and finally it looks for the corners by means of a simple statistical model. The results presented show the accuracy and the robustness of the method.
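Subpixel corner localization is commonly done by refining an integer peak of a corner-response image with a local fit. A minimal sketch using a 1-D parabolic fit along each axis (a generic refinement step, not the statistical model the paper proposes; `resp` is a made-up response map):

```python
def subpixel_peak(resp, y, x):
    """Refine an integer peak location (y, x) in the corner-response
    image `resp` to subpixel precision by fitting a parabola through
    the three samples along each axis; returns fractional (y, x)."""
    def offset(left, centre, right):
        # vertex of the parabola through (-1, left), (0, centre), (1, right)
        denom = left - 2 * centre + right
        return 0.0 if denom == 0 else 0.5 * (left - right) / denom
    dy = offset(resp[y - 1][x], resp[y][x], resp[y + 1][x])
    dx = offset(resp[y][x - 1], resp[y][x], resp[y][x + 1])
    return y + dy, x + dx

resp = [
    [0.0, 0.0, 0.0],
    [0.2, 1.0, 0.6],   # true peak lies slightly to the right of the centre pixel
    [0.0, 0.0, 0.0],
]
print(subpixel_peak(resp, 1, 1))
```

OpenCV's `cornerSubPix` achieves the same goal with an iterative gradient-based criterion; the parabola fit above is just the simplest illustration of going below pixel resolution.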