Dan Tamir - Profile on Academia.edu

Papers by Dan Tamir

Optimizing B-Spline Surface Reconstruction for Sharp Feature Preservation

2020 10th Annual Computing and Communication Workshop and Conference (CCWC)

Methods of surface reconstruction from 3D point clouds have received much attention in recent years due to their vast array of applications and the increasing supply of accurate 3D data. Providing smoothness, local modification, and robustness to noise, B-spline surface fitting is one of the most popular such methods. However, a problem encountered when using B-spline surface reconstruction is the representation of sharp features: corners and edges tend to be smoothed out. We propose an approach to sharp feature preservation which relies on curvature analysis of the B-spline surface. B-spline patches that have high curvature and are surrounded by patches with low curvature are identified as those representing sharp features. The location of sharp features is then determined through interpolation from low-curvature patches surrounding the identified patches. Finally, these features are preserved through repeated addition of points to the point cloud. We evaluate our sharp feature preservation algorithm at varying levels of noise, demonstrating its high accuracy at low noise and moderate robustness as the noise increases.
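
The core of the method is a patch-level curvature test. The Python sketch below illustrates one way such a test could look: a patch is flagged as a sharp-feature candidate when its curvature exceeds a high threshold while all of its 4-connected neighbours fall below a low threshold. The thresholds, the neighbourhood, and the use of a single per-patch curvature summary are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

def flag_sharp_patches(curvature, high_thresh=1.0, low_thresh=0.2):
    """Flag B-spline patches whose curvature is high while all of their
    4-connected neighbours have low curvature (illustrative thresholds).

    curvature -- 2D array holding a per-patch curvature summary
    Returns a boolean mask of the same shape marking sharp-feature candidates.
    """
    rows, cols = curvature.shape
    flags = np.zeros((rows, cols), dtype=bool)
    for i in range(rows):
        for j in range(cols):
            if curvature[i, j] < high_thresh:
                continue  # not a high-curvature patch
            neighbours = []
            if i > 0:        neighbours.append(curvature[i - 1, j])
            if i < rows - 1: neighbours.append(curvature[i + 1, j])
            if j > 0:        neighbours.append(curvature[i, j - 1])
            if j < cols - 1: neighbours.append(curvature[i, j + 1])
            if neighbours and max(neighbours) < low_thresh:
                flags[i, j] = True  # high-curvature patch in a low-curvature neighbourhood
    return flags
```

The flagged patches would then feed the interpolation and point-insertion steps described in the abstract.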

Computing with Words in Maritime Piracy and Attack Detection Systems

In this paper, we propose to apply recent advances in deep learning to design and train algorithms to localize, identify, and track small maritime objects under varying conditions (e.g., a snowstorm, high glare, night), and in computing-with-words to identify threatening activities where lack of training data precludes the use of deep learning. The recent rise of maritime piracy and attacks on transportation ships has cost the global economy several billion dollars. To counter the threat, researchers have proposed agent-driven modeling to capture the dynamics of the maritime transportation system, and to score the potential of a range of piracy countermeasures. Combining information from onboard sensors and cameras with intelligence from external sources for early piracy threat detection has shown promising results but lacks real-time updates for situational context. Such systems can benefit from early warnings, such as “a boat is approaching the ship and accelerating,” “a boat is c...

Computing with Words - A Framework for Human-Computer Interaction

In this paper we explore the possibility of using computation with words (CWW) systems and CWW-based human-computer interface (HCI) and interaction to enable efficient computation and HCI. The application selected to demonstrate the problems and potential solutions is in the context of autonomous driving. The specific problem addressed is that of a machine instructed by human word commands to execute the task of parking two manned or unmanned cars in a two-car garage using CWW. We divide the interaction process into two steps: (1) feasibility verification and (2) execution. In order to fulfill the task, we begin with verifications of feasibility in terms of assessing whether the garage is unoccupied, checking general ballpark dimensions, inspecting irregular shapes, and classifying the cars that need to be parked in terms of size, types of vehicles, ranges of acceptable tolerances needed depending on whether the cars are manned or not, and means of collision avoidance. The execution of the autonomous dri...

Power Aware Work Stealing in Homogeneous Multi-Core Systems

Excessive power consumption affects the reliability of cores, requires expensive cooling mechanisms, reduces battery lifetime, and causes extensive damage to the device. Hence, managing the power consumption and performance of cores is an important aspect of chip design. This research aims to achieve efficient multicore power monitoring and control via operating-system-based power-aware task scheduling. The main objectives of power-aware scheduling are: 1) lowering the cores' power consumption levels, 2) maintaining the system within an allowable power envelope, and 3) balancing the power consumption across cores; all without significant impact on time performance. In previous research we explored power-aware task scheduling at the single-core level, referred to as intra-core scheduling. This paper reports on a power-aware form of inter-core scheduling referred to as work stealing. Work stealing is a special case of task migration, where a "starving" c...
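
As a rough illustration of how power awareness can enter the victim-selection step of work stealing, the Python sketch below prefers stealing from the busiest core with the highest measured power, so that migrating a task also evens out per-core power. The function name, the power-envelope check, and the queue-length heuristic are illustrative assumptions, not the policy studied in the paper.

```python
def choose_victim(core_power, queue_len, thief, power_cap):
    """Pick a core for a starving core to steal work from.

    core_power -- per-core power readings in watts (illustrative units)
    queue_len  -- per-core ready-queue lengths
    thief      -- index of the starving core requesting work
    power_cap  -- allowable power envelope for the whole chip
    """
    # Only cores with surplus work are candidate victims.
    candidates = [c for c in range(len(core_power))
                  if c != thief and queue_len[c] > 1]
    if not candidates or sum(core_power) > power_cap:
        return None  # nothing to steal, or already at the envelope
    # Prefer the hottest busy core so migration also balances power.
    return max(candidates, key=lambda c: core_power[c])

# Example: core 2 is busiest and hottest, so starving core 0 steals from it.
victim = choose_victim(core_power=[3.1, 2.4, 5.0, 2.2],
                       queue_len=[0, 1, 6, 3], thief=0, power_cap=20.0)
```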

Logic connectives of complex fuzzy sets

Romanian Journal of Information Science and Technology, 2018

Fuzzy set theory has been applied to various problems in numerous fields. In particular, the concepts of t-norms and t-conorms play a significant role in shaping the theory and its applications. The notion of complex fuzzy sets extends fuzzy set theory and provides several advantages over the classical theory, especially in terms of the capability to concisely, efficiently, and accurately represent complex relations between fuzzy set components. Areas where complex fuzzy sets have been successfully applied include time series analysis and multi-criteria decision making. The notions of complex t-norms and t-conorms have not been fully developed so far. In this paper, we present the complex fuzzy set forms of t-norms and t-conorms and detail their properties. Additionally, we provide two numerical examples of applying the complex t-norm and t-conorm to multi-criteria decision making in the context of medicine-related problems using medical datasets.
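
For orientation, complex fuzzy membership grades are typically written in polar form, and one illustrative way to pair a standard t-norm/t-conorm on the moduli with a simple rule on the phases is shown below. This is a sketch consistent with the general complex fuzzy set literature, not necessarily the specific operators developed in the paper.

$$\mu_A(x) = r_A(x)\,e^{i\theta_A(x)}, \qquad \mu_B(x) = r_B(x)\,e^{i\theta_B(x)}$$

$$\mu_{A \cap B}(x) = \big[r_A(x)\, r_B(x)\big]\, e^{\,i \min(\theta_A(x),\, \theta_B(x))}$$

$$\mu_{A \cup B}(x) = \big[r_A(x) + r_B(x) - r_A(x)\, r_B(x)\big]\, e^{\,i \max(\theta_A(x),\, \theta_B(x))}$$

Here the algebraic product and algebraic sum act as the t-norm/t-conorm pair on the amplitude part, while min and max are just one possible phase aggregation.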

Complex Number Representation of Intuitionistic Fuzzy Sets

Complex numbers can capture compound features and convey multifaceted information, thereby providing means for solving complicated problems. In this paper, we combine the degree of membership and the degree of non-membership of members of intuitionistic fuzzy sets via complex numbers to characterize these fuzzy sets. This approach enables extending several concepts such as classical fuzzy sets, Pythagorean fuzzy sets, and complex fuzzy sets. We discuss complex-number-based set-theoretic operations such as union, intersection, and complement. We define the No-Man-Zone (NMZ) set and establish the relation of the NMZ characterization with complex numbers. Further, we introduce arithmetic complex-number-based operations on intuitionistic fuzzy sets. We show that the square of the absolute value of an intuitionistic fuzzy set yields a Pythagorean fuzzy set. The polar form of an intuitionistic fuzzy set reduces to a complex fuzzy set.
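
A natural reading of the encoding described in the abstract is sketched below: the membership and non-membership degrees become the real and imaginary parts of a single complex number, whose squared modulus satisfies a Pythagorean-type constraint and whose polar form connects to complex fuzzy sets. The notation is an assumption chosen for illustration, not necessarily the paper's exact formulation.

$$z_A(x) = \mu_A(x) + i\,\nu_A(x), \qquad \mu_A(x) + \nu_A(x) \le 1$$

$$|z_A(x)|^2 = \mu_A(x)^2 + \nu_A(x)^2 \le 1 \quad\text{(Pythagorean-type constraint)}$$

$$z_A(x) = r_A(x)\, e^{i \theta_A(x)}, \qquad r_A(x) = \sqrt{\mu_A(x)^2 + \nu_A(x)^2}, \quad \theta_A(x) = \arctan\!\frac{\nu_A(x)}{\mu_A(x)}$$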

Representing complex intuitionistic fuzzy set by quaternion numbers and applications to decision making

Applied Soft Computing, 2020

Intuitionistic fuzzy sets are useful for modeling uncertain data in realistic problems. In this paper, we generalize and expand the utility of complex intuitionistic fuzzy sets using the space of quaternion numbers. The proposed representation can capture composite features and convey multi-dimensional fuzzy information via the functions of real membership, imaginary membership, real non-membership, and imaginary non-membership. We analyze the order relations and logic operations of complex intuitionistic fuzzy set theory and introduce new operations based on quaternion numbers. We also present two quaternion distance measures in algebraic and polar forms and analyze their properties. We apply the quaternion representations and measures to decision-making models. The proposed model is experimentally validated in medical diagnosis, an emerging application for reasoning about patients' symptoms and disease attributes.
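
As an illustration of the quaternion encoding and an algebraic-form distance of the kind the abstract mentions, one could write the following; the symbols and the averaged Euclidean distance are assumptions chosen for clarity, not necessarily the paper's exact measures.

$$q_A(x) = \mu_A^{R}(x) + \mu_A^{I}(x)\,\mathbf{i} + \nu_A^{R}(x)\,\mathbf{j} + \nu_A^{I}(x)\,\mathbf{k}$$

$$d(A, B) = \left(\frac{1}{n} \sum_{t=1}^{n} \big\| q_A(x_t) - q_B(x_t) \big\|^2 \right)^{1/2}, \qquad \|q\| = \sqrt{(\mu^{R})^2 + (\mu^{I})^2 + (\nu^{R})^2 + (\nu^{I})^2}$$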

High Accuracy Method for Discovering Quantitative Association Rules in Datatables and Databases

FUZZY ECONOMIC REVIEW, 2009

Fuzzy Logic and Data Mining in Disaster Mitigation

NATO Science for Peace and Security Series C: Environmental Security, 2014

Disaster mitigation and management is one of the most challenging examples of decision making under uncertain, missing, and sketchy information. Even in the extreme cases where the nature of the disaster is known, preparedness plans are in place, and analysis, evaluation, and simulations of the disaster management procedures have been performed, the amount and magnitude of “surprises” that accompany the real disaster pose enormous demands. In the more severe cases, where the entire disaster is an unpredicted event, the disaster management and response system might quickly run into a chaotic state. Hence, the key to improving disaster preparedness and mitigation capabilities is employing sound techniques for data collection, information processing, and decision making under uncertainty. Fuzzy logic based techniques are some of the most promising approaches for disaster mitigation. The advantage of the fuzzy-based approach is that it enables keeping account of events with perceived low possibility of occurrence via low fuzzy membership/truth values and updating these values as information is accumulated or changed. Several fuzzy logic based algorithms can be deployed in the data collection, accumulation, and retention stage, in the information processing phase, and in the decision making process. In this chapter a comprehensive assessment of fuzzy techniques for disaster mitigation is presented. The use of fuzzy logic as a possible tool for disaster management is investigated and the strengths and weaknesses of several fuzzy techniques are evaluated. In addition to classical fuzzy techniques, the use of incremental fuzzy clustering in the context of complex and high-order fuzzy logic systems is evaluated.
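
To make the "low membership value, updated as information arrives" idea concrete, here is a minimal Python sketch of retaining low-possibility events and nudging their truth values as new evidence accumulates; the blending rule, the weight, and the event names are illustrative assumptions, not a method from the chapter.

```python
def update_possibility(current, evidence, weight=0.3):
    """Blend an event's current fuzzy possibility value with new evidence.

    current  -- current membership/truth value in [0, 1]
    evidence -- possibility suggested by newly arrived information, in [0, 1]
    weight   -- how strongly new information moves the estimate (illustrative)
    """
    value = (1.0 - weight) * current + weight * evidence
    return min(1.0, max(0.0, value))  # keep the value a valid membership grade

# Low-possibility events stay on the books instead of being discarded,
# and their values drift upward only as supporting reports accumulate.
events = {"levee breach": 0.05, "bridge closure": 0.10}
events["levee breach"] = update_possibility(events["levee breach"], 0.6)
```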

Musing: Interactive Didactics for Art Museums and Galleries via Image Processing and Augmented Reality. Providing Contextual Content for Artworks via Consumer-Level Mobile Devices

Textual didactics used in museums and galleries provide access to historical, socio-political, technical, and biographic information about the artworks and artists. These types of didactics are considered cost-effective. However, they do not enable the use of audio, video, and Web interfaces that allow for multiple forms of usage by museum visitors. We have developed a smartphone application, called Musing, for interaction of museum visitors with informational content and enhancement of their museum experience. Musing is an augmented reality (AR) application that enables the visitor to capture an artwork with a smartphone camera. Using image processing, the application recognizes the artwork and places graphical user interface objects in the form of Points of Interest (POIs) onto the image of the artwork displayed on-screen. These POIs provide the visitor with additional didactic information in the form of text overlays, audio, video, and/or Web sites. The Musing applica...

A Module-based Approach to Adopting the 2013 ACM Curricular Recommendations on Parallel Computing

Proceedings of the 46th ACM Technical Symposium on Computer Science Education - SIGCSE '15, 2015

The widespread deployment of multicore systems over the last decade has brought about major changes in the software and hardware landscape. The resulting importance of parallel computing is reflected in the 2013 Curriculum Guidelines developed by the joint ACM/IEEE task force. The document recommends increased coverage of parallel computing and describes a new Knowledge Area on this topic. These recommendations have already been adopted by several universities in the form of new parallel-programming courses. Implementing the recommendations in a complete curriculum, however, poses many challenges, including deciding on existing material to be removed, complying with administrative and ABET requirements, and maintaining caps on graduation credit hours. This paper describes an alternative approach for adopting the 2013 curricular recommendations on parallel computing. Specifically, we use a module-based approach that introduces parallel computing concepts and reiterates them through a series of short, self-contained modules taught across several lower-division courses. Most of these concepts are then combined into a new senior-level capstone course on parallel programming. Each module covers parallelism aspects in the context of a conventional computer science topic, thus enabling us to include parallel computing without a major overhaul of the curriculum. Evaluations conducted during the first year show encouraging results for this early-and-often approach in terms of learning outcomes, student interest, and confidence gains.

On Fuzziness in Complex Fuzzy Systems

Studies in Fuzziness and Soft Computing, 2013

Abraham Kandel, one of the pioneers of fuzzy logic research, has made numerous theoretical and practical contributions to the field of fuzzy systems. Mark Last’s main contributions to the field of fuzzy logic include the introduction of fuzzy-based automatic perception and info-fuzzy networks. Dan Tamir has been active in the formalization of axiomatic fuzzy logic and in applications of fuzzy logic to pattern recognition. One of the recent research threads pursued by Kandel, Last, and Tamir is complex fuzzy logic. This chapter provides a brief review of the authors’ contributions to the field of fuzzy logic as well as a survey of the current state of the theory of complex fuzzy sets, complex fuzzy classes, and complex fuzzy logic.

Detecting Software Usability Deficiencies Through Pinpoint Analysis

The effort-based model of usability is used for evaluating user interfaces (UI), developing usable software, and pinpointing software usability defects. In this context, the term pinpoint analysis refers to identifying and locating software usability deficiencies and correlating these deficiencies with the UI software code. For example, when users are in a state of confusion and not sure how to proceed using the software, they often tend to gaze around the screen trying to find the best way to complete a task. This behavior is referred to as excessive effort. In this paper, the underlying theory of effort-based usability evaluation along with pattern recognition techniques are used to produce an innovative framework for identifying usability deficiencies in software. Pattern recognition techniques and methods are applied to data gathered throughout user interaction with software in an attempt to identify excessive effort segments via automatic classification of segments of video files containing eye-tracking results. The video files are automatically divided into segments using event-based segmentation, where a segment is the time between two consecutive keyboard/mouse clicks. Subsequently, data reduction programs are run on the segments to generate feature vectors. Several different classification procedures are applied to the features in order to automatically classify each segment as an excessive or non-excessive effort segment. This allows developers to focus on the excessive effort segments and further analyze usability deficiencies in these segments. To verify the results of the pattern recognition procedures, the video is manually classified into excessive and non-excessive segments, and the results of automatic and manual classification are compared. The paper details the theory of effort-based pinpoint analysis and reports on experiments performed to evaluate the utility of this theory. Experimental results show more than a 40% reduction in usability testing time.
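
The event-based segmentation step is straightforward to sketch. The Python fragment below cuts a recording into segments bounded by consecutive keyboard/mouse events and computes a few simple per-segment features; the specific features (duration, gaze path length, sample count) are illustrative stand-ins for the paper's feature vectors, not its actual data reduction.

```python
def segment_by_events(event_times, gaze_samples):
    """Split an eye-tracking recording into segments bounded by consecutive
    keyboard/mouse events and compute simple per-segment features.

    event_times  -- sorted list of click/keypress timestamps (seconds)
    gaze_samples -- list of (t, x, y) gaze points, sorted by time
    Returns a list of feature dicts, one per segment.
    """
    segments = []
    for start, end in zip(event_times, event_times[1:]):
        pts = [(t, x, y) for (t, x, y) in gaze_samples if start <= t < end]
        # Total gaze path length inside the segment (a crude "effort" proxy).
        path = sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                   for (_, x1, y1), (_, x2, y2) in zip(pts, pts[1:]))
        segments.append({"duration": end - start,
                         "gaze_path": path,
                         "samples": len(pts)})
    return segments
```

Each feature dict would then be fed to a classifier that labels the segment as excessive or non-excessive effort.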

Dynamic Incremental Fuzzy C-Means Clustering

Researchers have observed that multistage clustering can accelerate convergence and improve clustering quality. Two-stage and two-phase fuzzy C-means (FCM) algorithms have been reported. In this paper, we demonstrate that the FCM clustering algorithm can be improved by the use of static and dynamic single-pass incremental FCM procedures.
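
A minimal sketch of the single-pass incremental idea, assuming a standard FCM update and chunked data: each chunk is clustered starting from the centroids produced by the previous chunk, so the full data set is never reclustered. The chunking scheme and parameter choices below are illustrative, not the paper's exact procedures.

```python
import numpy as np

def fcm(X, centers, m=2.0, iters=20):
    """Run a few standard fuzzy C-means iterations from the given centers."""
    for _ in range(iters):
        # Distances from every point to every center (n x c), guarded against zero.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Membership update: u[n, c] = 1 / sum_c' (d[n, c] / d[n, c'])^(2/(m-1))
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
        um = u ** m
        # Center update: weighted mean of the points.
        centers = (um.T @ X) / um.sum(axis=0)[:, None]
    return centers, u

def incremental_fcm(chunks, k, m=2.0):
    """Single-pass incremental FCM: cluster each data chunk seeded with the
    centroids of the previous chunk instead of reclustering all data."""
    rng = np.random.default_rng(0)
    centers = None
    for X in chunks:
        if centers is None:
            centers = X[rng.choice(len(X), k, replace=False)]
        centers, _ = fcm(X, centers, m)
    return centers
```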

Time Space Tradeoffs in GA Based Feature Selection for Workload Characterization

Lecture Notes in Computer Science, 2010

Published in N. García-Pedrajas et al. (Eds.): IEA/AIE 2010, Part II, LNAI 6097, pp. 643–652, Springer-Verlag Berlin Heidelberg, 2010. Authors: Dan E. Tamir, Clara Novoa, and Daniel Lowell.

Compressive scanning of an object signature

Natural Computing, 2014

In this paper we explore the utility of compressive sensing for object signature generation in the optical domain. In the data acquisition stage we use laser scanning to obtain a small (sub-Nyquist) number of points of an object's boundary. This is used to construct the signature, thereby enabling object identification, reconstruction, and image data compression. We refer to this framework as compressive scanning of objects' signatures. The main contributions of the paper are the following: (1) we use this framework to replace parts of the digital processing with optical processing and present one possible implementation, (2) the use of compressive scanning reduces the amount of laser data obtained while maintaining high reconstruction accuracy, and (3) we show that using compressive sensing can lead to a reduction in the amount of stored data without significantly affecting the utility of this data for image recognition and image compression.
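
The underlying measurement-and-recovery model can be summarized as follows; the sparsifying basis and the l1 recovery program shown are conventional compressive sensing choices used here for illustration, not necessarily the exact formulation in the paper.

$$y = \Phi x, \qquad \Phi \in \mathbb{R}^{m \times n}, \; m \ll n$$

$$\hat{s} = \arg\min_{s} \|s\|_1 \quad \text{subject to} \quad \|\Phi \Psi s - y\|_2 \le \epsilon, \qquad \hat{x} = \Psi \hat{s}$$

Here x is the densely sampled boundary signature (for example, a centroid-distance profile along the contour), Φ the sub-Nyquist laser sampling operator, Ψ a sparsifying basis such as Fourier or DCT, and y the few acquired measurements.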

Object Signature Acquisition through Compressive Scanning

Lecture Notes in Computer Science, 2013

In this paper we explore the utility of compressive sensing for object signature generation in the optical domain. We use laser scanning in the data acquisition stage to obtain a small (sub-Nyquist) number of points of an object’s boundary. This can be used to construct the signature, thereby enabling object identification, reconstruction, and image data compression. We refer to this framework as compressive scanning of objects’ signatures. The main contributions of the paper are the following: 1) we use this framework to replace parts of the digital processing with optical processing, 2) the use of compressive scanning reduces the amount of laser data obtained while maintaining high reconstruction accuracy, and 3) we show that using compressive sensing can lead to a reduction in the amount of stored data without significantly affecting the utility of this data for image recognition and image compression.

Improved Energy Efficiency for Multithreaded Kernels through Model-Based Autotuning

2012 IEEE Green Technologies Conference, 2012

In the last few years, the emergence of multicore architectures has revolutionized the landscape of high-performance computing. The multicore shift has not only increased the per-node performance potential of computer systems but has also made great strides in curbing power and heat dissipation. As we look to the future, however, the gains in performance and energy consumption are not going

Preserving Hamming Distance in Arithmetic and Logical Operations

Journal of Electronic Testing, 2013

This paper presents a new method for fault-tolerant computing where, for a given error rate r, the Hamming distance between correct inputs and faulty inputs, as well as the Hamming distance between correct results and faulty results, is preserved throughout processing; thereby enabling correction of up to r transient faults per computation cycle. The new method is compared and contrasted with current protection methods and its cost/performance is analyzed.
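
As a small illustration of the property the method targets, the Python sketch below shows only the Hamming metric itself, not the paper's encoding or arithmetic operations.

```python
def hamming_distance(a, b):
    """Number of bit positions in which two machine words differ."""
    return bin(a ^ b).count("1")

# If an operation preserves Hamming distance, r bit flips on an input can
# change the result in at most r positions, so a code able to correct r
# errors on the inputs can still correct the result.
correct, faulty = 0b1011_0010, 0b1011_0110   # one transient bit flip
assert hamming_distance(correct, faulty) == 1
```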

Logic programming and the execution model of Prolog

Information Sciences - Applications, 1995

This paper introduces the subject of logic programming, describes the execution model of Prolog, and surveys Prolog development tools. In addition, the paper explains how Prolog integrates with artificial intelligence applications and software engineering principles. Finally, it shows how the execution model of Prolog can be optimized and parallelized efficiently.
