Smartphone Sensing

Prof. Andrew Campbell


How do we make smartphones even smarter?

That is the central question driving the Smartphone Sensing Group at Dartmouth.

Smartphones are open and programmable, and they ship with a growing set of powerful embedded sensors, such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera. These sensors are enabling new sensing applications across a wide variety of domains, including social networks, mobile health, gaming, entertainment, education, and transportation.
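As an illustration of how an application can tap these sensors (this is not code from the group's own systems), the sketch below shows an Android activity sampling the accelerometer through the standard SensorManager API; the class name and sampling rate are arbitrary choices for the example.

    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    public class AccelSamplerActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private Sensor accelerometer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Register while in the foreground; unregister in onPause() to save battery.
            sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        protected void onPause() {
            super.onPause();
            sensorManager.unregisterListener(this);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            float x = event.values[0], y = event.values[1], z = event.values[2];
            // Feed (x, y, z) into a feature extractor or activity classifier here.
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }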

Application delivery channels such as the App Store and Android Market have transformed plain old cell phones into app phones, capable of downloading any of a myriad of applications in an instant.

The Smartphone Sensing Group is turning the everyday smartphone into a cognitive phone by pushing intelligence to the phone and the computing cloud to make inferences about people's behavior, surroundings, and life patterns.

We are developing new software technology for smartphones to sense, learn, visualize, and share information about ourselves, friends, communities, the way we live, and the world we live in.
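To give a flavor of the kind of on-phone inference this involves, the toy sketch below labels a window of accelerometer samples as stationary, walking, or running based on the variance of the signal magnitude. The thresholds and the variance heuristic are illustrative assumptions only, not the classifiers used in the group's systems.

    import java.util.List;

    /** Toy example: infer a coarse activity label from a window of
     *  accelerometer samples, each given as a float[]{x, y, z}. */
    public class ActivityInference {

        public static String infer(List<float[]> samples) {
            // Magnitude of each (x, y, z) sample.
            double[] mags = new double[samples.size()];
            for (int i = 0; i < samples.size(); i++) {
                float[] s = samples.get(i);
                mags[i] = Math.sqrt(s[0] * s[0] + s[1] * s[1] + s[2] * s[2]);
            }
            // Mean and variance of the magnitude over the window.
            double mean = 0;
            for (double m : mags) mean += m;
            mean /= mags.length;
            double var = 0;
            for (double m : mags) var += (m - mean) * (m - mean);
            var /= mags.length;
            // Hypothetical thresholds; a real system would use learned models.
            if (var < 0.5) return "stationary";
            if (var < 8.0) return "walking";
            return "running";
        }
    }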

Some of the sensing algorithms, systems, and applications we have developed in collaboration with Tanzeem Choudhury (Cornell University) and others include CenceMe, SoundSense, NeuroPhone, Jigsaw, Darwin Phones, NextPlace, EyePhone, BeWell, Community-Guided Learning, and Community Similarity Networks.