Raskar, MIT Media Lab: 500 Hz Tracking with Inexpensive IR Photosensors (Prakash and Second Skin)

Prakash: Lighting-Aware Motion Capture Using Photosensing Markers and Multiplexed Illumination

R Raskar, H Nii, B de Decker, Y Hashimoto, J Summet, D Moore, Y Zhao, J Westhues, P Dietz, M Inami, S Nayar, J Barnwell, M Noland, P Bekaert, V Branzoi, E Bruns

ACM SIGGRAPH 2007 Paper

Imperceptible Photosensing Marker Tags for Location + Orientation + Incident Illumination + Reflectance + Unique ID Capture in Ambient Light, at 500 Hz, for inexpensive (<$1K) on-set motion capture.

For Business or Licensing Contact Information, please see below.

Affiliations:
MERL: Ramesh Raskar, Jonathan Westhues, Paul Dietz, John Barnwell
University of Tokyo: Hideaki Nii
Universiteit Hasselt, Belgium: Bert de Decker, Philippe Bekaert
University of Electro-Communications, Tokyo: Yuki Hashimoto, Masahiko Inami
Georgia Institute of Technology: Jay Summet
Syracuse University: Dylan Moore
Brown University: Yong Zhao
Columbia University: Shree K. Nayar, Vlad Branzoi
University of North Carolina at Chapel Hill: Michael Noland
Bauhaus University, Weimar, Germany: Erich Bruns

Media: EETimes [Local], CGWorld, VFXWorld, DigitalArts, IEEE Signal Processing Magazine on recent motion-capture methods, Wikipedia on Motion Capture

Second Skin: Augmenting Human Performance and Improving Motor Learning
Dennis Miaw and Ramesh Raskar, MIT Media Lab
Applications: improving training for performance, sports, dance, tai chi, etc.; physical therapy for mentally and physically challenged individuals; assisting elderly people who might have difficulty with daily activities.
Project website: Second Skin, Camera Culture Group
Media: Technology Review, BitStream, DVICE, Slashdot, Fast Company


Abstract

In this paper, we present a high-speed optical motion capture method that can measure three-dimensional motion, orientation, and incident illumination at tagged points in a scene. We use tracking tags that can be imperceptibly embedded in attire or other objects and that work in natural lighting conditions. Our system can support an unlimited number of tags in a scene, and each tag has a unique ID, thus eliminating marker-reacquisition issues. Our tags also provide incident-illumination data, which can be used to match the lighting of the scene at the time of capture when inserting synthetic elements. This makes the technique ideal for on-set motion capture or the real-time broadcasting of virtual sets.

Unlike previous methods that employ high speed cameras or scanning lasers, we capture the scene appearance using the simplest possible optical devices – a light-emitting diode (LED) with a passive binary mask used as the transmitter and a photosensor used as the receiver. We strategically place a set of optical transmitters to spatio-temporally encode the volume of interest. Photosensors attached to scene points demultiplex the coded optical signals from multiple transmitters, allowing us to compute not only their location and orientation but also their incident illumination and the reflectance of the surfaces to which they are attached. We use our untethered tag system to demonstrate methods of adding special effects to captured videos that cannot be accomplished using pure vision techniques that rely on camera images.
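As a concrete illustration of the space-labeling idea above, the sketch below (a minimal Python sketch, not the authors' implementation) assumes the transmitter cycles through Gray-coded binary masks: the photosensor thresholds one reading per mask frame into a bit, and the decoded bit string indexes the angular stripe the sensor sits in. The function names, threshold rule, and parameters are illustrative assumptions; the real system demultiplexes modulated IR signals from several transmitters in hardware at 500 Hz.

```python
# Minimal sketch (illustrative, not the authors' implementation) of decoding a
# Gray-coded space-labeling sequence from raw photosensor readings.
# Assumed setup: one reading per projected binary mask, plus an ambient-light
# reference slot used as the on/off threshold.

from typing import List


def threshold_samples(samples: List[float], ambient: float) -> List[int]:
    """Turn one photosensor reading per mask frame into a bit (lit / unlit)."""
    return [1 if s > 1.5 * ambient else 0 for s in samples]


def gray_to_index(bits: List[int]) -> int:
    """Decode a Gray-code bit string (MSB first) into a stripe index."""
    value = bits[0]
    index = value
    for b in bits[1:]:
        value ^= b                # binary bit = previous binary bit XOR Gray bit
        index = (index << 1) | value
    return index


def stripe_to_angle(index: int, num_bits: int, fov_deg: float) -> float:
    """Map a stripe index to an angle within the transmitter's field of view."""
    num_stripes = 1 << num_bits
    return (index + 0.5) / num_stripes * fov_deg - fov_deg / 2.0


# Hypothetical example: 8 mask frames -> 256 stripes across a 40-degree field.
readings = [0.9, 0.1, 0.8, 0.7, 0.1, 0.9, 0.1, 0.8]   # made-up sensor samples
bits = threshold_samples(readings, ambient=0.2)
stripe = gray_to_index(bits)
print(f"stripe {stripe} -> {stripe_to_angle(stripe, 8, 40.0):.2f} deg off-axis")
```

Combining the stripe angles recovered from several strategically placed transmitters yields a tag's 3D location, and comparing signal strengths across a tag's photosensors gives its orientation, as described in the abstract.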



SIGGRAPH ETech 2006 (Instant Replay)
LumiNetra: 500 Hz Tracking with Inexpensive IR Photosensors


This page shows results obtained with our high-speed space-labeling projectors and links to additional information about the project.



Camera Culture Group, MIT Media Lab, Cambridge, MA, USA