DartNets Lab's LiSense Won Best Video Award in MobiCom'15

The LiSense system developed in the DartNets lab turns everyday lighting into sensors that sense and respond to what we do. Check out the video on YouTube:

Using solely the light around us, a new light-sensing system developed by the DartNets Lab tracks whole-body human postures unobtrusively and in real time. The LiSense system reconstructs a 3D skeletal model on a computer screen based on the shadows cast by a person moving within a lighted room — no cameras or on-body devices are required.

The light-sensing testbed built by the team consists of five off-the-shelf LED lights on the ceiling, hundreds of light sensors on the floor, 29 microcontrollers, and a server. It generates 60 posture inferences per second in real time, with an average angular error of 10-11 degrees.
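To give a flavor of the data flow described above, here is a minimal illustrative sketch: floor light-sensor readings are thresholded into a binary shadow map, from which posture features are extracted. This is not LiSense's actual reconstruction algorithm; the grid size, threshold, and the toy "inference" (centroid and extent of the shadow, rather than a full 3D skeleton) are all assumptions for illustration only.

```python
import numpy as np

SENSOR_GRID = (20, 20)        # assumed layout of the floor light-sensor array
SHADOW_THRESHOLD = 0.5        # assumed normalized light level marking shadow

def shadow_map(readings: np.ndarray) -> np.ndarray:
    """Threshold normalized light readings into a binary shadow map."""
    return (readings < SHADOW_THRESHOLD).astype(np.uint8)

def infer_posture(shadows: np.ndarray) -> dict:
    """Toy stand-in for skeletal reconstruction: report the shadow's
    centroid and extent as crude posture features."""
    ys, xs = np.nonzero(shadows)
    if len(xs) == 0:
        return {"present": False}
    return {
        "present": True,
        "centroid": (float(xs.mean()), float(ys.mean())),
        "extent": (int(xs.ptp()) + 1, int(ys.ptp()) + 1),
    }

# Simulated frame: bright floor with a dark (shadowed) rectangle.
frame = np.ones(SENSOR_GRID)
frame[5:15, 8:12] = 0.1       # region occluded by the person's shadow
posture = infer_posture(shadow_map(frame))
print(posture["centroid"], posture["extent"])  # (9.5, 9.5) (4, 10)
```

In the real system, each inference step like this would run at 60 Hz, with the shadow map assembled from readings forwarded by the microcontrollers to the server.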

The LiSense system could be especially applicable in health and behavioral monitoring. If the light around us continuously monitors how we move and gesture over time, it might help detect early symptoms of diseases or foster a healthy lifestyle.

Lead student Tianxing Li presented the work at the ACM Conference on Mobile Computing and Networking (MobiCom), a top conference in mobile computing, last week in Paris. The work won the conference's Best Video Award.

Please check out more details on LiSense at: http://dartnets.cs.dartmouth.edu/lisense

The DartNets lab is co-directed by Professor Andrew Campbell and Professor Xia Zhou. To learn about other research projects in the lab, please visit: http://dartnets.cs.dartmouth.edu/