A joint research team from the University of Cambridge and Dartmouth College has developed a system that uses infrared light tags to monitor face-to-face interactions. The technique could lead to a more precise understanding of how individuals interact in social settings and could make communications coaching more effective.
The system, named Protractor by the Cambridge-Dartmouth team, uses invisible light to record how people employ body language by measuring body angles and distances between individuals.
Prior studies have shown that body language influences many aspects of everyday life, including job interviews, doctor-patient conversations and team projects. Each setting involves a specific set of interaction cues, such as eye contact and hand gestures, for which accurate monitoring of distance and relative orientation is crucial.
Body language is already commonly studied through video sessions, audio recordings and paper questionnaires. Compared to the new, light-based system, these approaches can require intrusive cameras, necessitate complex infrastructure support, and impose high burdens on users.
Protractor is a lightweight, wearable tag that resembles an access badge and is worn on a lanyard or clip. The device measures non-verbal behavior with fine granularity using near-infrared light sensed by photodiodes, at a wavelength commonly used in television remote controls.
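The press release does not spell out the underlying signal model, but as a rough sketch of how such a tag could turn light-intensity readings into distance and orientation estimates, the Python snippet below assumes a first-order Lambertian line-of-sight channel, a standard model for infrared links rather than a confirmed detail of Protractor's design. Every function name and parameter value here is illustrative.

```python
import numpy as np

def received_power(distance_m, tx_angle_rad, rx_angle_rad,
                   tx_power_w=0.05, rx_area_m2=1e-6, order=1):
    # First-order Lambertian line-of-sight model (assumed, not from
    # the press release): received power falls with the square of
    # distance and the cosines of the emission and incidence angles.
    gain = (order + 1) / (2 * np.pi)
    return (tx_power_w * gain * rx_area_m2
            * np.cos(tx_angle_rad) ** order
            * np.cos(rx_angle_rad)
            / distance_m ** 2)

def estimate_distance(power_w, tx_angle_rad, rx_angle_rad, **model):
    # Invert the model for distance once both angles are known:
    # P(d) = P(1 m) / d^2, so d = sqrt(P(1 m) / P(d)).
    power_at_1m = received_power(1.0, tx_angle_rad, rx_angle_rad, **model)
    return np.sqrt(power_at_1m / power_w)

def incidence_angle(p_left, p_right, tilt_rad):
    # Two photodiodes tilted +/- tilt_rad off the tag's forward axis
    # see powers proportional to cos(phi - tilt) and cos(phi + tilt);
    # distance and transmit power cancel in the ratio, leaving phi.
    ratio = p_left / p_right
    return np.arctan((ratio - 1) / ((ratio + 1) * np.tan(tilt_rad)))

# Two tags 1.2 m apart, each turned 15 degrees off the line of
# sight, with photodiodes tilted +/- 30 degrees:
tilt, phi = np.radians(30), np.radians(15)
p_left = received_power(1.2, phi, phi - tilt)
p_right = received_power(1.2, phi, phi + tilt)
print(np.degrees(incidence_angle(p_left, p_right, tilt)))  # ~15.0
print(estimate_distance(p_left, phi, phi - tilt))          # ~1.2
```

The ratio trick in incidence_angle shows why relative orientation can be recovered without knowing transmit power or distance in advance; pairing tilted photodiodes is a common approach in light-based angle sensing, assumed here purely for illustration.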
To study the technique's effectiveness, the team used the Protractor tags to track non-verbal behaviors during a problem-solving group task known as "The Marshmallow Challenge." In this task, teams of four were given 18 minutes to build a structure that could support a marshmallow using tape, string and a handful of spaghetti.
In the study of 64 participants, Protractor estimated interaction distance with a mean error of 1 to 2 inches, and measured relative body orientation to within 6 degrees 95 percent of the time. The system also allowed researchers to identify an individual's task role within the challenge with close to 85 percent accuracy and to identify stages in the building process with over 93 percent accuracy.
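Those accuracy figures are conventional summary statistics; as a minimal illustration of how they can be computed from paired estimates and ground-truth measurements (the numbers below are hypothetical, not data from the study):

```python
import numpy as np

def error_summary(estimates, ground_truth):
    # Mean absolute error and 95th-percentile absolute error, the two
    # statistics quoted for distance and orientation accuracy above.
    err = np.abs(np.asarray(estimates) - np.asarray(ground_truth))
    return err.mean(), np.percentile(err, 95)

# Hypothetical orientation readings (degrees) against ground truth:
mean_deg, p95_deg = error_summary([42.1, 88.3, 12.9], [40.0, 92.0, 10.0])
print(f"mean error {mean_deg:.1f} deg, 95th percentile {p95_deg:.1f} deg")
```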
The research was published in Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies and will be presented at UbiComp '18.
Check out the initial press release at AAAS.
The full study can be found here.
More research from the DartNets Lab can be found here.