Khadija Ashraf - Academia.edu
Papers by Khadija Ashraf
2022 IEEE International Conference on Advanced Networks and Telecommunications Systems (ANTS)
ACM Transactions on Sensor Networks
Sensing and communication technology has been used successfully in various event monitoring applications over the last two decades, especially in places where long-term manual monitoring is infeasible. However, the applicability of this technology has been mostly limited to terrestrial environments. Underwater wireless sensor networks (UWSNs), on the other hand, open a new space for the remote monitoring of underwater species and fauna, as well as communication with underwater vehicles, submarines, and similar platforms. However, as opposed to terrestrial radio communication, the underwater environment brings new challenges for reliable communication due to the high conductivity of the aqueous medium, which leads to major signal absorption. In this paper, we provide a detailed technical overview of different underwater communication technologies, namely acoustic, magnetic, and visible light, along with their potentials and challenges in submarine environments. Detailed comparisons among these technologies have ...
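As a hedged illustration of why the underwater channel is so frequency-limited (not taken from the paper), the sketch below evaluates Thorp's empirical absorption formula for underwater acoustic signals; the frequencies chosen are examples only.

```python
# Illustrative sketch (not from the paper): Thorp's empirical formula for
# acoustic absorption in seawater, often used to reason about why underwater
# links are limited to low carrier frequencies.

def thorp_absorption_db_per_km(f_khz: float) -> float:
    """Absorption coefficient in dB/km for frequency f in kHz (Thorp's formula)."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1.0 + f2)
            + 44.0 * f2 / (4100.0 + f2)
            + 2.75e-4 * f2
            + 0.003)

if __name__ == "__main__":
    # Example frequencies only: absorption grows rapidly with frequency,
    # which is one reason underwater acoustic links favor the kHz range.
    for f in (1, 10, 50, 100):
        print(f"{f:>4} kHz -> {thorp_absorption_db_per_km(f):7.2f} dB/km")
```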
We present a mobility characterization study through extensive experiments in real-world driving scenarios. We characterize motion using a constantly illuminated transmitter on a lead vehicle and a stereo-camera setup on a following vehicle. The observations from our experiments reveal key insights on the degree of relative motion of a vehicle along its spatial axes in terms of camera image parameters. Unlike prior works, we validated all our measurements against ground-truth values collected in real time. The validated results present this work as a stepping stone to addressing mobility in vehicular screen-camera communication.
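As an illustrative sketch (not the authors' code), the following shows how a stereo-camera setup can relate image parameters to relative longitudinal distance via the standard disparity-to-depth relation; the focal length and baseline values are placeholders.

```python
# Illustrative sketch (not the study's implementation): estimating the
# longitudinal distance to a lead vehicle's transmitter from stereo disparity.
# The camera parameters below are placeholder values, not from the study.

FOCAL_LENGTH_PX = 1200.0   # focal length in pixels (assumed)
BASELINE_M = 0.30          # distance between the two cameras in meters (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair."""
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive for a valid depth estimate.")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# Example: the transmitter LED appears 24 px apart between left and right images.
print(f"Estimated distance: {depth_from_disparity(24.0):.2f} m")
```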
This dataset was used in our work "See-through a Vehicle: Augmenting Road Safety Information using Visual Perception and Camera Communication in Vehicles," published in the IEEE Transactions on Vehicular Technology (TVT). In this work, we present the design, implementation, and evaluation of non-line-of-sight (NLOS) perception to achieve a virtual see-through functionality for road vehicles. In this system, a safety event, such as a pedestrian crossing, traffic light status, or vehicle merge, that is occluded to a driver by another vehicle on the driving lane is perceived and communicated by the occluding vehicle. Each vehicle is equipped with a camera, placed on the dashboard, that perceives the scene in the driver's view. This scene is analyzed and mapped into warning codes specific to a safety event, and communicated as short packets using visible light communication. The camera in the following vehicle captures this information and generates a recommendation for safety action to the driver by comparing the warning from the packet with its own scene perception.
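To make the "warning codes as short packets" idea concrete, here is a hedged sketch of how an occluded safety event might be encoded as a short on-off-keyed VLC frame; the event codes, preamble, and frame layout are illustrative assumptions, not the dataset's actual format.

```python
# Illustrative sketch only: the event codes, preamble, and frame layout below
# are assumptions for demonstration, not the actual packet format of the work.

# Hypothetical mapping of occluded safety events to 4-bit warning codes.
WARNING_CODES = {
    "traffic_light_red": 0x1,
    "traffic_light_green": 0x2,
    "traffic_light_yellow": 0x3,
    "vehicle_merge_left": 0x4,
    "vehicle_merge_right": 0x5,
    "pedestrian_crossing": 0x6,
}

PREAMBLE = [1, 0, 1, 0]  # assumed sync pattern for the camera receiver

def encode_packet(event: str) -> list[int]:
    """Build a short OOK bit frame: preamble + 4-bit code + even parity bit."""
    code = WARNING_CODES[event]
    bits = [(code >> i) & 1 for i in (3, 2, 1, 0)]
    parity = sum(bits) % 2
    return PREAMBLE + bits + [parity]

# Example: the LED is modulated with this bit sequence, one bit per slot.
print(encode_packet("pedestrian_crossing"))  # [1, 0, 1, 0, 0, 1, 1, 0, 0]
```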
Proceedings of the Workshop on Internet of Lights, 2021
This paper positions the idea of enhancing the sensing capabilities of unmanned aerial vehicles (UAVs, or drones) using LED-camera communication, camera perception, and adaptive sampling algorithms. In this paper, considering soil moisture measurement over a specific geographic area as the application, we propose a system that consists of drone-assisted mobile ground robots (MGRs) that perform collaborative sensing and communicate with the drone using visible light. The key idea is to combine aerial images and MGR-sensed moisture values to generate a dynamically adaptable moisture map. Camera communication eliminates the need to establish continuous radio communication in remote locations and offers visual association that enables the drone to localize the MGRs. This system pushes the state of the art in sensing by accurately sensing a region with high spatio-temporal resolution while optimizing the time and energy required to create a moisture map of a geographical area. As the sensing resolut...
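As a hedged illustration of the "dynamically adaptable moisture map" idea (not the paper's algorithm), the sketch below interpolates sparse MGR moisture samples onto a grid with inverse-distance weighting; the grid size and sample points are invented for demonstration.

```python
# Illustrative sketch, not the paper's algorithm: build a coarse soil-moisture
# map from sparse MGR samples using inverse-distance weighting (IDW).
# Grid size and sample locations below are invented for demonstration.

import numpy as np

def idw_moisture_map(samples, grid_shape=(20, 20), power=2.0):
    """samples: list of (row, col, moisture) tuples reported by the MGRs."""
    rows, cols = np.indices(grid_shape)
    grid = np.zeros(grid_shape)
    weights = np.zeros(grid_shape)
    for r, c, value in samples:
        dist = np.hypot(rows - r, cols - c)
        w = 1.0 / np.maximum(dist, 1e-6) ** power
        grid += w * value
        weights += w
    return grid / weights

# Example: three MGR readings (row, col, volumetric moisture fraction).
mgr_samples = [(2, 3, 0.18), (10, 15, 0.32), (17, 5, 0.25)]
moisture_map = idw_moisture_map(mgr_samples)
print(moisture_map.shape, round(float(moisture_map[10, 15]), 3))
```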
IEEE Vehicular Networking Conference, 2019
In this work, we explore the use of multi-camera setups through stereo-vision-enabled cameras to perceive the immediate and non-line-of-sight environments of a vehicle. Our proposed system uses visible light communication between a transmitter and a receiver vehicle to convey information about the scene in front of the transmitter vehicle. Such a system virtually enables a "see-through-the-vehicle-in-front" functionality by combining scene perception with vehicle-to-vehicle communication. We propose to demonstrate this functionality using a device-to-device communication setup with a stereo camera that performs perception mapping and visible light communication reception from an LED transmitter.
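The following is a hedged sketch of the receiver side (an assumption, not the demo's code): locating the LED transmitter's region of interest in each camera frame and decoding on-off keying by thresholding its mean intensity across frames; the ROI handling and threshold are simplified placeholders.

```python
# Illustrative sketch only: a simplified camera-side VLC receiver that decodes
# on-off keying (OOK) from the LED transmitter's region of interest (ROI).
# The ROI coordinates and threshold are assumptions, not values from the demo.

import numpy as np

def decode_ook(frames, roi, threshold=128.0):
    """frames: iterable of grayscale images (2D arrays), one per bit slot.
    roi: (top, bottom, left, right) pixel bounds of the tracked LED."""
    top, bottom, left, right = roi
    bits = []
    for frame in frames:
        mean_intensity = frame[top:bottom, left:right].mean()
        bits.append(1 if mean_intensity > threshold else 0)
    return bits

# Example with synthetic frames: a bright ROI encodes 1, a dark ROI encodes 0.
bright = np.full((480, 640), 200, dtype=np.uint8)
dark = np.full((480, 640), 30, dtype=np.uint8)
print(decode_ook([bright, dark, bright], roi=(100, 140, 300, 340)))  # [1, 0, 1]
```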
The increasing use of light-emitting diodes (LEDs) and light receptors such as photodiodes and cameras in vehicles motivates the use of visible light communication (VLC) for inter-vehicular networking. However, the mobility of the vehicles presents a fundamental impediment to high throughput and link sustenance in vehicular VLC. While prior work has explored vehicular VLC system design, there is no clear understanding of the amount of motion of vehicles in real-world vehicular VLC use-case scenarios. To address this knowledge gap, in this paper we present a mobility characterization study through extensive experiments in real-world driving scenarios. We characterize motion using a constantly illuminated transmitter on a lead vehicle and a multi-camera setup on a following vehicle. The observations from our experiments reveal key insights on the degree of relative motion of a vehicle along its spatial axes and different vehicular motion behaviors. The motion characterization fr...
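As a hedged illustration of characterizing relative motion from camera image parameters (not the study's actual pipeline), the sketch below converts the transmitter's per-frame pixel displacement into an approximate angular deviation; the focal length and centroid values are placeholders.

```python
# Illustrative sketch, not the study's pipeline: quantify relative motion of
# the lead vehicle's transmitter from its pixel displacement between
# consecutive frames. The focal length and centroid values are placeholders.

import math

FOCAL_LENGTH_PX = 1200.0  # assumed camera focal length in pixels

def angular_deviation_deg(centroid_prev, centroid_curr):
    """Approximate angular change (degrees) of the transmitter's bearing
    from its image-plane displacement between two frames."""
    dx = centroid_curr[0] - centroid_prev[0]
    dy = centroid_curr[1] - centroid_prev[1]
    displacement_px = math.hypot(dx, dy)
    return math.degrees(math.atan(displacement_px / FOCAL_LENGTH_PX))

# Example: the LED centroid moved 15 px laterally between frames.
print(f"{angular_deviation_deg((320, 240), (335, 240)):.3f} degrees")
```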
IEEE Transactions on Vehicular Technology
In this paper, we present the design, implementation, and evaluation of non-line-of-sight (NLOS) perception to achieve a virtual see-through functionality for road vehicles. In this system, a safety event, such as a pedestrian crossing, traffic light status, or vehicle merge, that is occluded to a driver by another vehicle on the driving lane is perceived and communicated by the occluding vehicle. Each vehicle is equipped with a camera, placed on the dashboard, that perceives the scene in the driver's view. This scene is analyzed and mapped into warning codes specific to a safety event, and communicated as short packets using visible light communication. The camera in the following vehicle captures this information and generates a recommendation for safety action to the driver by comparing the warning from the packet with its own scene perception. Through experimental evaluations of a proof-of-concept implementation, we show that our system is able to achieve up to 90% accuracy in identifying nine occluded safety events, which correspond to traffic light statuses (red, green, yellow), other vehicles' lane-change behaviors (merge/leave lane, left/right), and pedestrian detection.
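To make the recommendation step concrete, here is a hedged sketch (an assumption, not the paper's implementation) of how the following vehicle might fuse the received warning code with its own perception before alerting the driver; the event names and decision rules are illustrative.

```python
# Illustrative sketch only: fusing the VLC-received warning with the following
# vehicle's own perception to decide on a driver recommendation. The event
# names and decision rules are assumptions, not the paper's implementation.

def recommend_action(received_event: str, locally_perceived: set[str]) -> str:
    """Return a driver recommendation based on the occluded event reported
    over VLC and what the following vehicle can already see itself."""
    if received_event in locally_perceived:
        # The event is already visible to this driver; no extra warning needed.
        return "no additional warning"
    if received_event in {"pedestrian_crossing", "traffic_light_red"}:
        return f"brake: occluded event '{received_event}' ahead"
    if received_event.startswith("vehicle_merge"):
        return f"caution: occluded '{received_event}' ahead"
    return f"advisory: occluded event '{received_event}' reported"

# Example: the lead vehicle reports an occluded pedestrian crossing.
print(recommend_action("pedestrian_crossing", {"traffic_light_green"}))
```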
IEEE Transactions on Vehicular Technology
2021 8th International Conference on Smart Computing and Communications (ICSCC)