Mohsen Parisay | Technische Universität Darmstadt
Papers by Mohsen Parisay
One of the main challenges of gaze-based interactions is the ability to distinguish normal eye function from a deliberate interaction with the computer system, commonly referred to as the 'Midas touch'. In this paper, we propose EyeTAP (Eye tracking point-and-select by Targeted Acoustic Pulse), a hands-free interaction method for point-and-select tasks. We evaluated the prototype in two separate user studies, each containing two experiments with 33 participants, and found that EyeTAP is robust even in the presence of ambient noise in the audio input signal with a tolerance of up to 70 dB, results in faster movement and task completion times, and has a lower cognitive workload than voice recognition. In addition, EyeTAP has a lower error rate than the dwell-time method in a ribbon-shaped experiment. These characteristics make it applicable for users whose physical movements are restricted or not possible due to a disability. Furthermore, EyeTAP has no specific requiremen...
One of the main challenges of gaze-based interactions is the ability to distinguish normal eye function from a deliberate interaction with the computer system, commonly referred to as the ‘Midas touch’. In this paper, we propose EyeTAP (Eye tracking point-and-select by Targeted Acoustic Pulse), a contact-free multimodal interaction method for point-and-select tasks. We evaluated the prototype in four user studies with 33 participants and found that EyeTAP is applicable in the presence of ambient noise, results in faster movement and task completion times, and has a lower cognitive workload than voice recognition. In addition, although EyeTAP did not generally outperform the dwell-time method, it did have a lower error rate than dwell-time in one of our experiments. Our study shows that EyeTAP would be useful f...
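The abstract does not detail the selection mechanism, but the idea of triggering a selection at the gaze point when a short acoustic pulse rises above the ambient noise floor can be sketched roughly as below. This is a minimal illustration, not the authors' implementation; the `sounddevice` and `pyautogui` libraries, the threshold factor, and the assumption that the cursor already follows the eye tracker are all assumptions for the sketch.

```python
# Sketch (not the EyeTAP implementation): click the gazed-at target when a
# short acoustic pulse exceeds an adaptive estimate of the ambient noise floor.
import numpy as np
import sounddevice as sd
import pyautogui

SAMPLE_RATE = 16000     # Hz
BLOCK = 512             # samples per analysis block (~32 ms)
PULSE_FACTOR = 6.0      # assumed factor by which a pulse must exceed the noise floor

noise_floor = 1e-4      # running estimate of ambient RMS level

def on_audio(indata, frames, time_info, status):
    global noise_floor
    rms = float(np.sqrt(np.mean(indata[:, 0] ** 2)))
    if rms > PULSE_FACTOR * noise_floor:
        # Acoustic pulse detected: select whatever the gaze-driven cursor is on.
        pyautogui.click()
    else:
        # Slowly adapt the noise floor so steady ambient noise does not trigger clicks.
        noise_floor = 0.95 * noise_floor + 0.05 * rms

with sd.InputStream(channels=1, samplerate=SAMPLE_RATE,
                    blocksize=BLOCK, callback=on_audio):
    sd.sleep(60_000)  # listen for one minute
```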
Fitts' law is a prediction model for measuring the difficulty level of target selection with pointing devices. However, emerging devices and interaction techniques require more flexible parameters to adapt the original Fitts' law to new circumstances and case scenarios. We propose the Index of Difficulty for Eye tracking Applications (IDEA), which integrates Fitts' law with users' feedback from the NASA TLX to measure the difficulty of target selection. The COVID-19 pandemic has shown the necessity of contact-free interactions on public and shared devices; thus, in this work, we aim to propose a model for evaluating contact-free interaction techniques that can accurately measure the difficulty of eye tracking applications and can be adapted to children, users with disabilities, and elderly users without requiring the acquisition of physiological sensory data. We tested the IDEA model using data from a three-part user study with 33 participants that compared two eye tracking select...
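For context, the sketch below shows the standard Shannon form of Fitts' index of difficulty together with a purely hypothetical combination with a normalized NASA TLX score. The actual IDEA formula is defined in the paper and is not reproduced here; the `weight` parameter and the blending are illustrative assumptions only.

```python
# Fitts' index of difficulty (Shannon formulation), plus a hypothetical
# illustration of weighting it by a NASA TLX workload score in [0, 100].
import math

def fitts_id(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def idea_like_difficulty(distance: float, width: float,
                         tlx_score: float, weight: float = 0.5) -> float:
    """Hypothetical blend of Fitts' ID with a normalized NASA TLX score."""
    return fitts_id(distance, width) * (1 + weight * tlx_score / 100)

# Example: a 40 px wide target located 400 px away, rated at TLX workload 55.
print(fitts_id(400, 40))                  # ~3.46 bits
print(idea_like_difficulty(400, 40, 55))  # illustrative adjusted difficulty
```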
2020 13th International Conference on Human System Interaction (HSI)
Public authorities normally consider statistical data about indicators in their decision making. Such data allows objective observation of how indicators develop over time. In the case of a significant deviation from the normal indicator level, however, it is difficult to understand the reasons behind the emerging problem. In this article we present an approach that enhances information gathering by providing an improved overview of the aspects on which such an indicator depends, drawing on governmental data sources that provide other types of data beyond statistics. Moreover, our approach integrates a system that generates explanations for Open Government Data, in particular for specific indicators, based on Linked Open Data, and presents them graphically to enable a quick overview. This allows decision-makers to obtain hints about unexpected causes of concrete problems that may influence an indicator.
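The abstract does not specify the data sources or queries used, but the kind of Linked Open Data lookup described (finding resources related to an indicator) can be illustrated roughly as follows. The endpoint, the `dbr:Unemployment` resource, and the predicate are placeholders for illustration, not the authors' actual system.

```python
# Illustrative only: query a public Linked Open Data endpoint for resources
# related to a placeholder indicator and print their labels.
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://dbpedia.org/sparql")  # placeholder endpoint
endpoint.setReturnFormat(JSON)
endpoint.setQuery("""
    SELECT ?related ?label WHERE {
      dbr:Unemployment dbo:wikiPageWikiLink ?related .
      ?related rdfs:label ?label .
      FILTER (lang(?label) = "en")
    } LIMIT 10
""")

for row in endpoint.query().convert()["results"]["bindings"]:
    print(row["related"]["value"], "-", row["label"]["value"])
```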