Ester Zumpano - Academia.edu
Papers by Ester Zumpano
IEEE/WIC/ACM International Conference on Web Intelligence - Companion Volume
Computer vision systems are increasingly used for the early detection of skin cancers. Recognizing the first sign of melanoma is very important: if melanoma is found and treated at its primary stage, the chances of long-term survival are excellent, whereas as it progresses its treatment becomes increasingly harder and its outcome worsens. The various proposals of computer vision systems share some fundamental phases: image acquisition, pre-processing, segmentation, feature extraction and, finally, classification. Feature extraction aims at extracting features from the lesion image in order to characterize the melanoma and feed the classifier. Recent research has provided many different feature extraction algorithms for melanoma diagnosis from dermoscopy images, from the simplest to the most sophisticated. Features are typically extracted using digital image processing methods (e.g., segmentation, edge detection, and color and structure processing), and an open discussion about the meaning of these features and the objective ways of measuring them is ongoing. This paper is a contribution to the feature extraction phase: it describes the features most frequently used in the elaboration of computer vision systems and reviews recent works on feature extraction and classification.
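As a minimal illustration of the feature-extraction phase described above (the feature definitions below are simplified sketches of my own, not the paper's algorithms), two classic lesion descriptors can be computed directly on a toy binary mask and intensity grid: shape asymmetry and color variance.

```python
# Illustrative sketch: two simple lesion features of the kind used in
# the feature-extraction phase -- asymmetry and color variance --
# computed on a toy binary mask / intensity grid (lists of lists).

def asymmetry(mask):
    """Fraction of lesion pixels not mirrored across the vertical axis."""
    rows, cols = len(mask), len(mask[0])
    mismatched = sum(
        1
        for r in range(rows)
        for c in range(cols)
        if mask[r][c] != mask[r][cols - 1 - c]
    )
    lesion = sum(v for row in mask for v in row)
    return mismatched / lesion if lesion else 0.0

def color_variance(intensities, mask):
    """Variance of pixel intensities inside the lesion mask."""
    values = [
        intensities[r][c]
        for r in range(len(mask))
        for c in range(len(mask[0]))
        if mask[r][c]
    ]
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

# A perfectly symmetric 3x3 lesion has asymmetry 0.
sym_mask = [[0, 1, 0], [1, 1, 1], [0, 1, 0]]
print(asymmetry(sym_mask))  # 0.0
```

In a real system these scalars, together with many others, would form the feature vector fed to the classifier.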
Expert Systems with Applications
Active integrity constraints (AICs) are a useful formalism to express integrity constraints and policies to restore consistency in databases violating them. However, AICs do not allow users to express different kinds of constraints commonly arising in practice, such as foreign keys. In this paper, we propose existential active integrity constraints (EAICs), a powerful extension of AICs that allows us to express a wide range of constraints used in databases and ontological systems. We investigate different properties of EAICs. Specifically, we show that there exists a "representative" set of founded updates, called universal, which suffices for query answering. As such a set might contain an infinite number of founded updates, each of infinite size, we study syntactic restrictions ensuring finiteness, as well as the existence of a single universal founded update.
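To make the underlying idea concrete, here is a hedged sketch (table and attribute names are hypothetical, and this is the general active-constraint intuition, not the paper's EAIC formalism): a constraint pairs a violation test with an update action that restores consistency, e.g. a foreign key from orders to customers repaired by deleting orphan orders.

```python
# Hypothetical sketch of the active-integrity-constraint idea: a
# violation test plus an update action that repairs the database.

def fk_violations(orders, customers):
    """Orders whose customer_id references no existing customer."""
    ids = {c["id"] for c in customers}
    return [o for o in orders if o["customer_id"] not in ids]

def repair_by_deletion(orders, customers):
    """Apply the constraint's update action: remove violating tuples."""
    bad = fk_violations(orders, customers)
    return [o for o in orders if o not in bad]

customers = [{"id": 1}, {"id": 2}]
orders = [{"oid": 10, "customer_id": 1}, {"oid": 11, "customer_id": 99}]
repaired = repair_by_deletion(orders, customers)
print(repaired)  # [{'oid': 10, 'customer_id': 1}]
```

An AIC additionally lets the designer choose the repair action declaratively (e.g. delete the orphan versus insert the missing customer) instead of hard-coding one policy.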
Cultural tourism is one of the main activities for the enhancement of archaeological heritage. Information and Communications Technology (ICT) is largely used in the domain of cultural heritage, to provide the visitor with specific and complete information about tangible culture (such as buildings, monuments and artifacts). This work is a contribution in this direction: it describes a mobile application for the enhancement of points of interest (POIs) in the archaeological park of Castiglione di Paludi. For each of these POIs, a 3D model shows its original structure.
Educational games appear as a new tool for learning cultural content in an engaging way. Applications with augmented reality in virtual environments are able to support the experience of cultural heritage, overcoming the "tangibility" barriers of museums, exhibitions, books and audio-visual content. We intend to provide a unified view of the game genres used to profitably manage the contents of cultural heritage, highlighting the training objectives of games in this sector and analyzing the complex relationships between genre, context of use, technological solutions and learning effectiveness, as well as identifying the most desirable features for the design phase.
Cultural heritage is a collection of works, monuments, buildings, and traditions belonging to the territory that generated it: this represents the wealth of a country from both a cultural and an economic point of view. Preserving and understanding cultural heritage is of primary importance to discover and analyse findings of historical interest worldwide. Information and Communications Technology (ICT) is largely used for the enhancement of archaeological heritage, to provide the visitor with specific and complete information about tangible culture. This work is a contribution in this direction: it aims at disseminating information about a territory, Calabria, famous for being one of the regions belonging to Magna Graecia and rich in archaeological heritage. Specifically, it presents a project proposal that uses ICT technologies and augmented reality in the domain of cultural heritage, to promote new keys of reading and knowledge of the territory, thus providing the visitor with...
2020 19th IEEE International Conference on Machine Learning and Applications (ICMLA), 2020
Malignant melanoma is responsible for the highest number of deaths related to skin lesions. The similarities of melanoma with other skin lesions such as dysplastic nevi, however, constitute a pitfall for early diagnosis. The research community is committed to proposing software solutions that favor the computerized analysis of lesions for melanoma detection. Existing methods have typically focused on the dichotomous distinction of melanoma from benign lesions. Currently, there is debate about Dysplastic Nevi Syndrome (DNS), or rather about the number of moles present on the human body as a potential melanoma risk factor. Distinguishing dysplastic nevi from common ones is a challenging yet mostly unexplored classification problem. The classification phase is particularly delicate: over time, a series of automatic learning algorithms have been proposed to better face this issue. In this paper, we discuss the emerging role of Multiple Instance Learning approaches for discriminating melanoma from dysplastic nevi and outline the even more complex challenge of classifying dysplastic nevi versus common ones.
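The Multiple Instance Learning setting mentioned above can be sketched in a few lines (this is the standard max-pooling MIL rule, given purely for illustration; the scores and threshold are hypothetical, not the paper's method): a "bag" of instances is classified positive if its best-scoring instance crosses a threshold.

```python
# Minimal sketch of Multiple Instance Learning classification: a bag of
# instance scores is positive iff its maximum score exceeds a threshold.

def bag_score(instance_scores):
    """Max-pooling over instance-level scores."""
    return max(instance_scores)

def classify_bag(instance_scores, threshold=0.5):
    """1 = positive bag, 0 = negative bag."""
    return 1 if bag_score(instance_scores) >= threshold else 0

# A patient could be modeled as a bag of lesion images; one suspicious
# lesion (score 0.9) makes the whole bag positive.
print(classify_bag([0.1, 0.2, 0.9]))  # 1
print(classify_bag([0.1, 0.2, 0.3]))  # 0
```

This fits dermatology naturally: only bag-level labels (patient diagnoses) are needed for training, even when instance-level lesion labels are unavailable.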
2018 9th International Conference on Information, Intelligence, Systems and Applications (IISA), 2018
Following the general trend, the amount of digital information stored in electronic health records (EHRs) has exploded in the last decade. EHRs are no longer used, as in the past, only to store basic patient information and support administrative tasks; they may include a range of data, including the patient's medical history, laboratory test results, demographics, medication and allergies, immunization status, radiology images, and vital signs. At present, the problem has shifted from collecting massive amounts of data to understanding it, i.e., using EHRs to turn data into knowledge, conclusions and actions. EHRs were not designed to forecast disease risk or disease progression or to determine the right treatment, but when combined with artificial intelligence (AI) algorithms these tasks become possible. The need for tools allowing the construction of predictive models capturing disease progression is a priority. In the recent past, EHRs were analyzed using traditional machine learning techniques, whereas recent progress in the field of deep learning has led to the application of deep learning techniques to EHRs. This paper reports a brief overview of some recently developed deep learning tools for EHRs.
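A recurring preprocessing step behind both the traditional and the deep learning approaches mentioned above is encoding a patient's visit history into a numeric representation. Here is a hedged sketch (the diagnosis codes and vocabulary are hypothetical examples, not from the paper) of the simplest such encoding, a count vector over a code vocabulary:

```python
# Illustrative sketch: encode a patient's sequence of visits (each a
# list of diagnosis codes) as a fixed-length count vector.

from collections import Counter

def encode_history(visits, vocabulary):
    """Return one count per vocabulary code, aggregated over all visits."""
    counts = Counter(code for visit in visits for code in visit)
    return [counts[code] for code in vocabulary]

vocabulary = ["E11", "I10", "J45"]            # diabetes, hypertension, asthma
patient = [["E11", "I10"], ["E11"], ["J45"]]  # three visits
print(encode_history(patient, vocabulary))    # [2, 1, 1]
```

Deep models for EHRs typically replace this bag-of-codes vector with learned embeddings and sequence models that preserve visit order, which is exactly what flat counts discard.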
Mathematical Problems in Engineering, 2019
Lecture Notes in Computer Science, 2015
This paper stems from the work in [10], in which the declarative semantics of a P2P system is defined in terms of minimal weak models. Under this semantics, each peer uses its mapping rules to import minimal sets of mapping atoms allowing it to satisfy its local integrity constraints. This behavior proves useful in real-world P2P systems, in which peers often use the available import mechanisms to extract knowledge from the rest of the system only if this knowledge is strictly needed to repair an inconsistent local database. Thus, an inconsistent peer, in the interaction with different peers, imports just the information allowing it to restore consistency, that is, minimal sets of atoms enriching its knowledge so as to resolve inconsistency anomalies. The paper extends previous work by proposing a rewriting technique that allows modeling a P2P system, \({\mathcal{{PS}}}\), as a unique logic program whose minimal models correspond to the minimal weak models of \({\mathcal{{PS}}}\).
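The "import only what is needed to repair" intuition can be illustrated with a brute-force sketch (the relation names and the exhaustive search are mine, purely for illustration; the paper's semantics is declarative, not this procedure): among the atoms a neighbor offers, find the minimal subsets whose import makes the local constraint satisfied.

```python
# Illustrative sketch: smallest subsets of importable atoms that
# restore consistency of a local database, found by exhaustive search.

from itertools import combinations

def minimal_repairs(local_db, importable, is_consistent):
    """Return all minimum-size subsets of importable atoms such that
    importing them makes the database consistent."""
    for size in range(len(importable) + 1):
        found = [
            set(combo)
            for combo in combinations(importable, size)
            if is_consistent(local_db | set(combo))
        ]
        if found:
            return found
    return []

# Hypothetical constraint: part "p" must have at least one supplier.
local_db = {("part", "p")}
importable = [("supplies", "s1", "p"), ("supplies", "s2", "p")]
ok = lambda db: any(a[0] == "supplies" for a in db)
print(minimal_repairs(local_db, importable, ok))
```

Each returned singleton corresponds to one minimal weak model: importing either supplier atom repairs the peer, but importing both would not be minimal.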
Sistemi Evoluti per Basi di Dati, 2004
Proceedings. International Database Engineering and Applications Symposium, 2004. IDEAS '04.
... Index pages are generated by learning from usage patterns, while frequent path ... above, clustering is an important direction adopted by most web personalization approaches. ... Several clustering algorithms have been proposed [10], and in particular efficient partitional methods ...
Encyclopedia of Database Technologies and Applications
Proceedings of the 18th International Database Engineering & Applications Symposium on - IDEAS '14, 2014
This paper proposes a logic framework for modeling the interaction among incomplete and inconsistent deductive databases in a P2P environment. Each peer joining a P2P system provides or imports data from its neighbors by using a set of mapping rules, i.e., a set of semantic correspondences to a set of peers belonging to the same environment. By using mapping rules, as soon as it enters the system, a peer can participate and access all data available in its neighborhood, and through its neighborhood it becomes accessible to all the other peers in the system. Two different types of mapping rules are defined: a first type allowing the import of maximal sets of atoms and a second type allowing the import of minimal sets of atoms from source peers to target peers. In the proposed setting, each peer can be thought of as a resource used either to enrich (integrate) the knowledge or to fix (repair) the knowledge. The declarative semantics of a P2P system is defined in terms of preferred weak models. An equivalent and alternative characterization of preferred weak model semantics, in terms of prioritized logic programs, is also introduced. The paper also presents preliminary results about the complexity of P2P logic queries.
Lecture Notes in Computer Science
Current database systems are often large and complex, and the case in which a user or an application has full access to the entire database is rare. It is more likely that access is granted via windows on the entire system, called views. A view, usually virtual, is defined by giving a query on the whole database and at...
Proceedings of the 2007 ACM symposium on Applied computing - SAC '07, 2007
This work addresses the issue of prioritized reasoning in the context of logic programming. The case of preference conditions involving atoms is considered, and a refinement of the comparison method of the Answer Set Optimization semantics [4] is presented. The paper introduces the concept of choice as a set of preference rules describing common choice options in different contexts. Thus, ...
Proceedings of the Fourteenth International Database Engineering & Applications Symposium on - IDEAS '10, 2010
This paper presents a technique that builds a layer of virtual sensors over a sensor network. The virtual sensors are able to infer and provide data for physical sensors that do not work. The key assumption of our approach is that the physical quantities sensed by the sensors are related. The relations among sensors are unknown, but during a learning phase the layer of virtual sensors infers an approximation of them by means of fuzzy rules. The inferred fuzzy rules capture these relations in a simple way even when the ...
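The virtual-sensor idea can be sketched with a drastically simplified stand-in for the fuzzy-rule machinery (a plain least-squares fit, used here purely for illustration; the class and readings are hypothetical): during a learning phase we observe a sensor together with a related neighbor, then use the fitted relation to answer for the sensor once it fails.

```python
# Illustrative sketch: a virtual sensor learns a linear relation to a
# neighboring sensor, then infers readings when the real sensor fails.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

class VirtualSensor:
    def __init__(self, neighbor_readings, own_readings):
        self.slope, self.intercept = fit_linear(neighbor_readings, own_readings)

    def infer(self, neighbor_value):
        """Estimate the failed sensor's value from its neighbor's reading."""
        return self.slope * neighbor_value + self.intercept

# Learning phase: this sensor always read 2x its neighbor plus 1.
vs = VirtualSensor([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
print(vs.infer(4.0))  # 9.0
```

Fuzzy rules serve the same role as the fitted line here, but can approximate nonlinear, piecewise relations among several neighboring sensors at once.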
Lecture Notes in Computer Science, 2012
A peer-to-peer system easily provides a way to aggregate information distributed in the network. However, while collecting data it is quite natural for a source peer to associate different degrees of reliability with the portions of data provided by its neighbor peers. This paper investigates the data exchange problem among distributed independent sources and concentrates on the task of using dynamic preferences to drive the integration process in the case of conflicting information. Previous works in the literature are rigid in the sense that preferences between conflicting sets of atoms that a peer can import depend only on the priorities associated with the source peers at design time. These approaches do not allow modeling concepts such as "import tuples from the peer having the highest upload speed if they conflict" or "among conflicting values import the most recent ones". In this paper, we assume the existence of a special peer, called the authority peer. It contains information about the peers in the network, is accessible from each peer of the system, and is used to enhance the preference mechanism. The framework proposed here ensures dynamism by allowing the selection among different scenarios based on the properties of the data provided by the peers: this is done by "dynamically" establishing priorities among mapping rules.
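The dynamic-preference idea can be sketched as follows (the peer metadata and attribute names are hypothetical, not from the paper): conflicting tuples offered by different peers are resolved with a priority computed at query time from the authority peer's information, rather than fixed at design time.

```python
# Illustrative sketch: resolve a conflict among tuples from different
# peers using a priority function evaluated dynamically against
# per-peer metadata held by the authority peer.

def resolve(conflicting, peer_info, priority):
    """Pick the tuple whose source peer maximizes the dynamic priority."""
    return max(conflicting, key=lambda t: priority(t, peer_info[t["peer"]]))

# Metadata the authority peer holds about each peer (hypothetical).
peer_info = {"p1": {"upload_speed": 10}, "p2": {"upload_speed": 50}}
conflict = [
    {"peer": "p1", "value": "old"},
    {"peer": "p2", "value": "new"},
]
# Dynamic rule: among conflicting values, prefer the fastest peer.
winner = resolve(conflict, peer_info, lambda t, info: info["upload_speed"])
print(winner["value"])  # new
```

Swapping in a different priority function (e.g. one keyed on a timestamp) switches the policy to "prefer the most recent value" without touching the mapping rules themselves.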
2006 10th International Database Engineering and Applications Symposium (IDEAS'06), 2006
The paper proposes a logic framework for modeling the interaction among deductive databases and computing consistent answers to logic queries in a P2P environment. As usual, data are exchanged among peers by using logical rules, called mapping rules. The novelty of our approach is that only data not violating integrity constraints are exchanged. The (declarative) semantics of a P2P system ...
Lecture Notes in Computer Science, 2003
This paper presents a logic language, called NP Datalog, suitable for expressing NP search and optimization problems. The 'search' language extends stratified Datalog with constraints and allows disjunction to nondeterministically define partitions of relations. It is well known that NP search problems can be formulated as unstratified Datalog queries under nondeterministic stable model semantics, so that each stable ...
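To ground the kind of NP search problem such a language targets, here is a sketch in plain Python (the brute-force enumeration stands in for the disjunctive rule that nondeterministically partitions a relation; it is an illustration, not an NP Datalog program): graph 2-coloring is a partition of the vertex relation into two sets such that no edge is monochromatic.

```python
# Illustrative sketch: 2-coloring as a nondeterministic partition of the
# vertex relation, found by brute-force enumeration of all assignments.

from itertools import product

def two_color(vertices, edges):
    """Return a vertex->color dict with no monochromatic edge, or None."""
    for assignment in product([0, 1], repeat=len(vertices)):
        color = dict(zip(vertices, assignment))
        if all(color[u] != color[v] for u, v in edges):
            return color
    return None

# A 4-cycle is 2-colorable; a triangle is not.
print(two_color(["a", "b", "c", "d"],
                [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")]))
print(two_color(["a", "b", "c"],
                [("a", "b"), ("b", "c"), ("c", "a")]))  # None
```

In NP Datalog the same search would be written declaratively: a disjunctive rule guesses the partition, and a constraint rules out monochromatic edges.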