Robots and AI to support research on harmful blue-green algae

A team of scientists from research centers stretching from Maine to South Carolina will combine big data, artificial intelligence and robotics with new and time-tested techniques for lake sampling to understand where, when, and how cyanobacterial blooms form in lakes across the East Coast in a four-year project supported by the NSF ($3 million, renewable to $5.9 million).

The research team brings together experts in freshwater ecology, computer science, engineering and geospatial science from Bates College, Colby College, Dartmouth, the University of New Hampshire, the University of Rhode Island and the University of South Carolina.

Andrew Campbell Presented with SIGMOBILE Test of Time Award!

Andrew Campbell and his PhD students received the prestigious ACM SIGMOBILE Test-of-Time Award for their 2008 paper:

Miluzzo E, Lane ND, Fodor K, Peterson R, Lu H, Musolesi M, Eisenman SB, Zheng X, Campbell AT. "Sensing meets mobile social networks: the design, implementation and evaluation of the CenceMe application." ACM SenSys, 2008.

The award citation states:

CenceMe was the first paper to demonstrate how smartphones can be used to derive rich behavioral insights continuously from onboard sensors. Since its publication, the work has inspired a huge body of research and commercial endeavors that has continued to increase the breadth and depth of personal sensing. Some of the activity inference methods that are now common in smartphone operating systems can be traced back to the original CenceMe system.

This is a seminal paper that spearheaded the field of smartphone sensing. Today, activity recognition is integrated into the operating system of every Android and iPhone.

New research lets artists create more realistic and controllable CGI

A new theory based on the physics of cloud formation and neutron scattering could help animators create more lifelike movies, according to a Dartmouth-led study. Software developed using the technique focuses on how light interacts with microscopic particles to develop computer-generated images.
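The key departure from classic media rendering is the transmittance model: in uncorrelated media, light falls off exponentially with depth (Beer-Lambert), while correlated particle arrangements like those in clouds produce heavier-tailed, non-exponential falloff. A minimal illustrative sketch, using a hypothetical power-law form (the actual functional forms and parameters come from the published theory, not this snippet):

```python
import math

def transmittance_exponential(sigma_t, d):
    """Classic Beer-Lambert falloff for uncorrelated particles."""
    return math.exp(-sigma_t * d)

def transmittance_power_law(sigma_t, d, gamma=2.0):
    """Illustrative non-exponential falloff of the kind used to model
    correlated media; gamma is a hypothetical shape parameter, not a
    value from the paper."""
    return (1.0 + sigma_t * d / gamma) ** (-gamma)

# Correlated media let more light penetrate deeply than Beer-Lambert predicts.
for d in (0.5, 1.0, 4.0):
    print(d, transmittance_exponential(1.0, d), transmittance_power_law(1.0, d))
```

At depth 4 mean free paths the power-law curve transmits several times more light than the exponential one, which is the qualitative effect that makes rendered clouds look softer and more lifelike.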

The work was conducted by Dartmouth's Visual Computing Laboratory in collaboration with researchers from Pixar, Disney Research, ETH Zurich and Cornell University. A research paper detailing the advancement was just published in the journal Transactions on Graphics and will be presented at SIGGRAPH Asia by Dartmouth CS PhD student Benedikt Bitterli tomorrow, December 6 in Tokyo, Japan.

Prof. Campbell Wins The ACM Sensys 2018 Test-of-Time Award

Andrew Campbell and his PhD students received the prestigious ACM SenSys 2018 Test-of-Time Award (10-year award) for their paper: “Sensing meets mobile social networks: the design, implementation and evaluation of the CenceMe application”.

In 2008, when the App Store first opened, Professor Campbell and his team released the CenceMe app, which implemented a machine learning algorithm directly on the iPhone for the first time to automatically detect the user’s behavior (e.g., sitting, walking, running, socializing). The app pushed this user context to Facebook, and for the first time human behavior passively inferred from sensors embedded in smartphones was visible to friends in real time.
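Activity inference of this kind typically works by computing statistics over short windows of accelerometer data and mapping them to activity labels. A toy sketch in that spirit, with made-up thresholds rather than CenceMe's actual trained classifier:

```python
import statistics

def classify_activity(window):
    """Toy activity classifier over one window of accelerometer
    magnitudes (in g). The standard-deviation thresholds are
    illustrative assumptions, not CenceMe's trained model."""
    sd = statistics.pstdev(window)
    if sd < 0.05:
        return "sitting"       # nearly constant signal
    elif sd < 0.4:
        return "walking"       # moderate rhythmic variation
    return "running"           # large swings

print(classify_activity([1.0, 1.01, 0.99, 1.0]))  # low variance
print(classify_activity([0.8, 1.3, 0.7, 1.2]))    # moderate variance
print(classify_activity([0.2, 2.1, 0.1, 2.4]))    # high variance
```

Running a model like this on the phone itself, rather than shipping raw sensor data to a server, was part of what made the original system notable.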

The award citation states: “At the dawn of the smartphone era, this paper had the foresight to realize that smartphones are human companions and their sensors, collectively, can be used to derive novel social behavior insights. It also pioneered applying machine learning across local devices and servers”.

Today, activity recognition is integrated into the operating system of every Android and iPhone.

Eye-Tracking Glasses Giving New Vision for Augmented Reality

High power consumption and cost have kept eye trackers out of current augmented reality systems. By using near-infrared lights and photodiodes, the DartNets Lab has created an energy-efficient, wearable system that tracks rapid eye movements and allows hands-free input of system commands.
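The general principle behind photodiode-based eye tracking is that the dark pupil absorbs near-infrared light, so the pattern of reflected intensity across a few cheap sensors shifts as the eye moves. A minimal sketch under assumed geometry (a ring of four photodiodes and a weighted-centroid mapping, neither of which is claimed to match the published DartNets design):

```python
import math

# Hypothetical photodiode placement around the eye, in degrees.
SENSOR_ANGLES_DEG = [0, 90, 180, 270]

def estimate_gaze(readings):
    """Weighted centroid over reflected-light readings: the pupil absorbs
    near-infrared, so a lower reading pulls the gaze estimate toward
    that sensor's direction."""
    total = sum(readings)
    x = sum((1 - r / total) * math.cos(math.radians(a))
            for r, a in zip(readings, SENSOR_ANGLES_DEG))
    y = sum((1 - r / total) * math.sin(math.radians(a))
            for r, a in zip(readings, SENSOR_ANGLES_DEG))
    return x, y

print(estimate_gaze([1.0, 1.0, 1.0, 1.0]))  # balanced readings: centered gaze
print(estimate_gaze([0.5, 1.0, 1.0, 1.0]))  # dimmer sensor at 0 deg: gaze pulled right
```

Because photodiodes draw microwatts rather than the milliwatts a camera needs, this style of sensing is what makes an always-on wearable tracker plausible.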

The glasses, which can also help monitor human health, were introduced at MobiCom 2018, taking place from October 29 to November 2 in New Delhi, India.

Graduate Students Create Computer-Chip Security Fix

Dartmouth computer science graduate students are applying their research techniques to fundamental security flaws recently found in nearly every computer chip manufactured in the last 20 years—flaws that they say could prove catastrophic if exploited by malicious hackers.

The researchers are coming to grips with a design flaw that ultimately falls into the province of the chip manufacturers—such industry giants as Intel and AMD. Until new designs are implemented, an interim solution devised at Dartmouth can fill the breach.

The research team includes two PhD students, Prashant Anantharaman and Ira Ray Jenkins, and Rebecca Shapiro, Guarini ’18, who received her PhD this past spring. Professor of Computer Science Sean Smith and Research Associate Professor Sergey Bratus advised the team.

Read the whole article over at Dartmouth News.

A Smartwatch That Works With One Hand

Smartwatches can be handy—perhaps too “handy,” given that they require both hands to operate. That can be a problem if your other hand is full, nonfunctioning or missing.

Researchers have tried a variety of approaches to help smartwatch users who lack a free hand, such as putting acoustic sensors on the watchband to capture inputs from finger-tapping. These efforts have concentrated mostly on enabling discrete commands, such as moving down a list of songs one at a time. Voice commands also can work for such functions, although the noise of speech isn’t always welcome.

Scientists at Dartmouth College and the University of Manitoba have been working on another approach: enabling continuous input—such as drawing letters and shapes or panning across a map—of the kind more typical of using a mouse or stylus. They hope to avoid relying on a lot of arm-tilting for these motions, since that tends to take the screen out of view.

DartNets Lab's DarkLight Won Best Video Award in MobiCom'16

With the rise of wearables such as smartwatches and fitness trackers that rely on smart sensors, and the continued popularity of smartphones, smart devices are taking our country by storm. Wireless data for such devices is typically beamed over Wi-Fi or Bluetooth. Visible light communication (VLC) has emerged as a new option, albeit with practical limitations: the signal is easily blocked, and transmission cannot be sustained when the light is off. Through a new Dartmouth project called “DarkLight,” researchers have developed and demonstrated for the first time how visible light can be used to transmit data even when the light appears dark or off. DarkLight provides a new communication primitive similar to infrared communication, but it exploits the LED lights already around us rather than requiring additional infrared emitters.
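The intuition behind signaling with a light that looks off is to emit pulses so short, at so low a duty cycle, that the eye perceives darkness, and to encode bits in the timing of those pulses. A toy pulse-position-modulation sketch, with made-up frame and pulse timings rather than the parameters from the DarkLight paper:

```python
PULSE_US = 0.5        # hypothetical pulse width (microseconds)
SLOTS_PER_FRAME = 4   # 4 pulse positions -> 2 bits per frame
SLOT_SPACING_US = 100 # assumed slot spacing

def ppm_encode(bits):
    """Map each pair of bits to the slot index holding the pulse."""
    assert len(bits) % 2 == 0
    return [bits[i] * 2 + bits[i + 1] for i in range(0, len(bits), 2)]

def ppm_decode(slots):
    """Recover the bit pairs from observed pulse positions."""
    bits = []
    for s in slots:
        bits += [s // 2, s % 2]
    return bits

frame_us = SLOTS_PER_FRAME * SLOT_SPACING_US
duty_cycle = PULSE_US / frame_us  # fraction of time the LED is actually on
print(ppm_encode([1, 0, 0, 1]))  # pulse positions for the bit stream
print(duty_cycle)                # tiny: the light appears dark to the eye
```

With these illustrative numbers the LED is lit for barely a tenth of a percent of the time, which is why the channel can carry data while the room still looks dark.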

The study, “The DarkLight Rises: Visible Light Communication in the Dark,” was presented and demonstrated at “MobiCom 2016: The 22nd Annual International Conference on Mobile Computing and Networking” by Zhao Tian, the lead Ph.D. student for the project.
