Yulia Sandamirskaya | University of Zurich, Switzerland
Papers by Yulia Sandamirskaya
We present a neuromorphic adaptation of a spiking neural network model of the locust Lobula Giant Movement Detector (LGMD), which detects objects increasing in size in the field of vision (looming) and can be used to facilitate obstacle avoidance in robotic applications. Our model is constrained by the parameters of a mixed-signal analog-digital neuromorphic device developed by our group and is driven by the output of a neuromorphic vision sensor. We demonstrate the performance of the model and show how it may be used for obstacle avoidance on an unmanned aerial vehicle (UAV).
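To illustrate the kind of computation an LGMD-style looming detector performs, the following minimal Python sketch integrates event activity from a vision sensor and subtracts a delayed, spatially blurred copy of the same input as feed-forward inhibition, so that the model unit responds preferentially to rapidly expanding edges. This is an illustrative approximation under assumed inputs (frame-binned event counts) and placeholder parameters, not the spiking implementation described in the paper.

import numpy as np
from scipy.ndimage import uniform_filter

class LoomingDetector:
    """Minimal LGMD-like unit: excitation from local change events,
    delayed and spatially spread inhibition, leaky integration, threshold."""
    def __init__(self, shape=(128, 128), tau=0.9, w_inh=1.5, threshold=30.0):
        self.prev_events = np.zeros(shape)   # one-frame-delayed input for inhibition
        self.v = 0.0                         # membrane potential
        self.tau = tau                       # leak factor per frame (placeholder)
        self.w_inh = w_inh                   # inhibition weight (placeholder)
        self.threshold = threshold           # spike threshold (placeholder)

    def step(self, events):
        """events: 2D array of per-pixel event counts in the current frame."""
        excitation = events.sum()
        # inhibition: blurred, delayed copy of the input suppresses slow motion
        inhibition = uniform_filter(self.prev_events, size=5).sum()
        self.v = self.tau * self.v + excitation - self.w_inh * inhibition
        self.prev_events = events
        return self.v > self.threshold       # "spike" -> trigger avoidance maneuver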
This paper presents a hardware implementation of a Dynamic Neural Field on the Vision chip to enable robust, fast, and efficient salience-based object tracking.
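For readers unfamiliar with the formalism, a Dynamic Neural Field of the kind referred to here typically follows the Amari field dynamics (stated as standard background, not taken from the paper itself):

\tau \dot{u}(x,t) = -u(x,t) + h + S(x,t) + \int w(x - x')\, f\big(u(x',t)\big)\, dx',

where u(x,t) is the field activation over the image coordinate x, h < 0 is the resting level, S(x,t) is the salience input, w is an interaction kernel with local excitation and broader inhibition, and f is a sigmoidal output nonlinearity, e.g. f(u) = 1/(1 + e^{-\beta u}). A self-sustained peak of supra-threshold activation over the most salient location then provides the tracking estimate.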
Neuromorphic electronic systems exhibit advantageous characteristics in terms of low energy consumption and low response latency, which can be useful in robotic applications that require compact and low-power embedded computing resources. However, these neuromorphic circuits still face significant limitations that make their usage challenging: these include low precision, variability of components, sensitivity to noise and temperature drifts, as well as the currently limited number of neurons and synapses that can be emulated on a single chip. In this paper, we show how it is possible to achieve functional robot control strategies using a mixed-signal analog/digital neuromorphic processor interfaced to a mobile robotic platform equipped with an event-based dynamic vision sensor. We provide a proof-of-concept implementation of obstacle avoidance and target acquisition using biologically plausible spiking neural networks directly emulated by the neuromorphic hardware. To our knowledge, this is the first demonstration of a working spike-based neuromorphic robotic controller in this type of hardware, which illustrates the feasibility, as well as the limitations, of this approach.
Neuromorphic hardware emulates the dynamics of biological neural networks in electronic circuits, offering an alternative to the von Neumann computing architecture that is low-power, inherently parallel, and event-driven. This hardware makes it possible to implement neural-network-based robotic controllers in an energy-efficient way with low latency, but it requires solving the problem of device variability that is characteristic of analog electronic circuits. In this work, we interfaced a mixed-signal analog-digital neuromorphic processor, ROLLS, to a neuromorphic dynamic vision sensor (DVS) mounted on a robotic vehicle and developed an autonomous neuromorphic agent that performs neurally inspired obstacle avoidance and target acquisition. We developed a neural network architecture that can cope with device variability and verified its robustness in different environmental situations, e.g., moving obstacles, a moving target, clutter, and poor lighting conditions. We demonstrate how this network, combined with the properties of the DVS, allows the robot to avoid obstacles using simple biologically inspired dynamics. We also show how a Dynamic Neural Field for target acquisition can be implemented in spiking neuromorphic hardware. This work demonstrates a working implementation of obstacle avoidance and target acquisition in mixed-signal analog/digital neuromorphic hardware.
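The obstacle-avoidance component of such an architecture can be caricatured in a few lines: event activity in the left and right halves of the DVS field of view pushes the robot away from the more active (closer) side, while a selected target direction attracts the heading. The Python sketch below is an illustrative, non-spiking approximation of this behavior, not the ROLLS implementation from the paper; the sensor resolution, gains, and event format are assumptions.

import numpy as np

DVS_WIDTH = 128                    # assumed sensor resolution
K_AVOID, K_TARGET = 0.004, 1.2     # placeholder gains

def steering_command(event_x, target_angle=None):
    """event_x: array of x-coordinates of DVS events in the current time slice.
    target_angle: bearing of the target relative to the heading (radians,
    positive = target to the left), or None if no target is selected.
    Returns an angular velocity command (positive = turn left)."""
    left = np.sum(event_x < DVS_WIDTH // 2)
    right = np.sum(event_x >= DVS_WIDTH // 2)
    # repulsion: turn away from the side with more event activity (nearby obstacle)
    omega = K_AVOID * (right - left)
    # attraction: steer toward the target direction, if one is selected
    if target_angle is not None:
        omega += K_TARGET * np.sin(target_angle)
    return omega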
Event-based sensors and neural processing architectures represent a promising technology for implementing low-power and low-latency robotic control systems. However, implementing robust and reliable control architectures with neuromorphic devices is challenging due to the limited precision and variable nature of their underlying computing elements. In this paper, we demonstrate robust obstacle-avoidance and target-acquisition behaviors in a compact mobile platform controlled by a neuromorphic sensory-processing system and validate its performance in a number of robotic experiments.
Biologically Inspired Cognitive Architectures, 2015
Paladyn, Journal of Behavioral Robotics, 2015
Paladyn, Journal of Behavioral Robotics, 2015
Springer Series in Bio-/Neuroinformatics, 2015
Robots are expected to operate autonomously in unconstrained, real-world environments. Therefore, they cannot rely on access to models of all objects in their environment in order to parameterize object-directed actions. The robot must instead estimate the shape of objects in such environments based on its perception. How to estimate an object's shape based on distal sensors, such as color or depth cameras, has been studied extensively. Using haptic sensors for this purpose, however, has not been considered in comparable depth. Humans, by contrast, are able to improve their object-manipulation capabilities by using tactile stimuli acquired during active haptic exploration of an object. In this paper we introduce a neural-dynamic model that makes it possible to build an object-shape representation based on haptic exploration. Acquiring this representation during object manipulation requires the robot to autonomously detect and correct errors in the localization of tactile features with respect to the object. We have implemented an architecture for haptic exploration of an object's shape on a physical robotic hand in a simple exemplary scenario, in which the geometric models of two different n-gons are learned from tactile data while the robotic hand rotates them.
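To make the idea of building a shape representation from tactile contacts concrete, the sketch below accumulates fingertip contact points in an object-centered frame by undoing the current object rotation, and then forms a crude n-gon estimate from the accumulated points. This is an illustrative reconstruction under assumed data (known contact positions and rotation angle), not the neural-dynamic architecture described in the paper.

import numpy as np

class HapticShapeEstimator:
    """Accumulates tactile contact points in an object-centered frame
    while the object is rotated in the hand."""
    def __init__(self):
        self.points = []                     # contact points in the object frame

    def add_contact(self, contact_xy, rotation_angle):
        """contact_xy: 2D contact location in the hand frame (assumed known
        from the fingertip pose); rotation_angle: current object rotation."""
        c, s = np.cos(-rotation_angle), np.sin(-rotation_angle)
        R = np.array([[c, -s], [s, c]])      # undo the object's rotation
        self.points.append(R @ np.asarray(contact_xy))

    def estimated_vertices(self, n_vertices):
        """Crude n-gon estimate: bin contacts by polar angle and average
        each bin (a stand-in for the learned geometric model)."""
        pts = np.array(self.points)
        angles = np.arctan2(pts[:, 1], pts[:, 0])
        edges = np.linspace(-np.pi, np.pi, n_vertices + 1)
        bins = np.digitize(angles, edges) - 1
        return np.array([pts[bins == k].mean(axis=0)
                         for k in range(n_vertices) if np.any(bins == k)])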
4th International Conference on Development and Learning and on Epigenetic Robotics, 2014
Robots are expected to operate autonomously in unconstrained, real-world environments. Therefore, they cannot rely on access to models of all objects in their environment in order to parameterize object-directed actions. The robot must instead estimate the shape of objects in such environments based on its perception. How to estimate an object's shape based on distal sensors, such as color or depth cameras, has been studied extensively. Using haptic sensors for this purpose, however, has not been considered in comparable depth. Humans, by contrast, are able to improve their object-manipulation capabilities by using tactile stimuli acquired during active haptic exploration of an object. In this paper we introduce a neural-dynamic model that makes it possible to build an object-shape representation based on haptic exploration. Acquiring this representation during object manipulation requires the robot to autonomously detect and correct errors in the localization of tactile features with respect to the object. We have implemented an architecture for haptic exploration of an object's shape on a physical robotic hand in a simple exemplary scenario, in which the geometric models of two different n-gons are learned from tactile data while the robotic hand rotates them.
2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2014