Project Description

He leads an Expedition to develop the science and technology of Behavioral Imaging, the measurement and analysis of social and communicative behavior using multi-modal sensing, with applications to developmental disorders such as autism. He also serves as the Deputy Director of the NIH Center of Excellence on Mobile Sensor Data-to-Knowledge (MD2K).

Learning objectives

Following the presentation, attendees will be able to:

  • Describe the cues for video analysis that are available in the first person setting and outline their use in predicting visual attention and recognizing actions
  • Compare and contrast methods for skimming collections of first person video based on selectively dropping frames and detecting specific events of interest

  • Describe the use of first person vision to measure social behavior in the context of a behavioral therapy program for children with autism
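
The second objective, skimming first person video by selectively dropping frames, can be illustrated with a minimal sketch in the spirit of the hyperlapse work by Joshi et al. (2015): frames are assigned a cost (here synthetic; in practice derived from alignment and blur metrics), and a dynamic program picks a subset that approximates a target speed-up while favoring low-cost frames. This is an illustrative toy, not the authors' implementation; the function name and penalty term are assumptions.

```python
def select_frames(costs, target_skip=8, window=2):
    """Choose a subset of frame indices approximating a target_skip-x
    speed-up. From each chosen frame, the next jump is target_skip
    +/- window frames; the DP minimizes the sum of frame costs plus a
    penalty for deviating from the target speed-up."""
    n = len(costs)
    best = [float("inf")] * n   # best total cost of a path ending at frame i
    prev = [-1] * n             # back-pointers for path recovery
    best[0] = costs[0]
    for i in range(n):
        if best[i] == float("inf"):
            continue            # frame i is unreachable
        for skip in range(target_skip - window, target_skip + window + 1):
            j = i + skip
            if j >= n:
                continue
            c = best[i] + costs[j] + abs(skip - target_skip)
            if c < best[j]:
                best[j] = c
                prev[j] = i
    # pick the cheapest reachable end frame near the end of the video
    end = min(range(n - target_skip, n), key=lambda k: best[k])
    path = []
    while end != -1:
        path.append(end)
        end = prev[end]
    return path[::-1]

if __name__ == "__main__":
    import random
    random.seed(0)
    costs = [random.random() for _ in range(100)]  # synthetic per-frame costs
    print(select_frames(costs))
```

Compared with uniformly dropping every frame but the eighth, the DP can shift each jump by up to `window` frames to avoid blurry or poorly aligned frames, at the price of small local deviations from the nominal speed-up.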

Suggested Readings

  • Yin Li, Alireza Fathi, and James M. Rehg. Learning to Predict Gaze in Egocentric Video. In Proc. International Conference on Computer Vision (ICCV), 2013.
  • Zhefan Ye, Yin Li, Yun Liu, Chanel Bridges, Agata Rozga, and James M. Rehg. Detecting Bids for Eye Contact using a Wearable Camera. In Proc. IEEE International Conference on Automatic Face and Gesture Recognition (FG), 2015.
  • Neel Joshi, Wolf Kienzle, Mike Toelle, Matt Uyttendaele, and Michael F. Cohen. Real-Time Hyperlapse Creation via Optimal Frame Selection. In Proc. ACM SIGGRAPH, 2015.

Project Details