
Virginia Tech researcher receives grant to keep cloud connectivity robust

November 23, 2016

Ali R. Butt, professor of computer science, has received funding from the National Science Foundation to study how to keep the internet humming as gadgets take up more bandwidth.

The world is increasingly connected not only through smartphones and email, but through new and hungry gadgets, such as webcams, sensors, and monitors, that demand an ever-larger slice of the bandwidth pie.

Ali R. Butt, professor in the Department of Computer Science in Virginia Tech’s College of Engineering, was recently awarded a $516,000 grant to examine how to keep those gadgets from depleting the cloud computing bandwidth that the internet currently depends on.

“We are now just beginning to experience living super-interconnected lives,” said Butt. “Imagine five or 10 years from now when we will live in smart houses that use all kinds of sensors to monitor your safety, adjust the cooling or heating, and many other little devices and things that are only beginning to be used. These things require valuable computing abilities and information on the cloud to work properly and be useful."

Butt, who also holds a courtesy appointment in electrical and computer engineering, is the principal investigator on the collaborative research project. He is partnering with Muhammad Shahzad, assistant professor of computer science from North Carolina State University, to design new techniques for massive data management and processing in the cloud, as well as study the actual nodes computers use to transfer information. The project is funded by the National Science Foundation.

Butt will also enlist graduate students from the department to augment the research team: Jamal Khan from Islamabad, Pakistan; Arnab Pau from Habra, India; and Luna Xu from Shanghai, China.

"What we want to do with this grant is to figure out a way to, instead of deluging the cloud with every bit of information from millions and millions of devices, divert only the necessary information to the cloud through an intermediary device that will put less strain on the main network,” said Butt.

This type of tiered processing in the network has been dubbed “fog computing” by some in the industry, such as router giant Cisco Systems, because it places new computing ability between the devices and the actual cloud computing data centers.
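The idea behind this tiered approach can be illustrated with a minimal sketch. The function and threshold below are hypothetical, not part of Butt's project: a "fog node" sitting between local sensors and the cloud forwards only out-of-range readings upstream, keeping a compact summary of everything else.

```python
# Minimal sketch of fog-style tiered processing (hypothetical names and threshold).
# A fog node sits between sensors and the cloud: it aggregates raw readings
# locally and forwards only the readings the cloud actually needs to see.

def fog_filter(readings, threshold=30.0):
    """Forward only readings above the threshold; summarize the rest locally.

    readings: list of (sensor_id, value) tuples from local devices.
    Returns (forwarded, summary): `forwarded` holds the anomalous readings
    destined for the cloud; `summary` is a compact digest kept at the edge.
    """
    forwarded = [(sid, v) for sid, v in readings if v > threshold]
    values = [v for _, v in readings]
    summary = {
        "count": len(values),
        "mean": sum(values) / len(values) if values else 0.0,
    }
    return forwarded, summary

# Example: four temperature readings; only the out-of-range one reaches the cloud.
readings = [("t1", 21.5), ("t2", 22.0), ("t3", 35.2), ("t4", 21.8)]
forwarded, summary = fog_filter(readings)
```

In this sketch, one reading out of four crosses the network instead of all four, which is the bandwidth saving the tiered design is after; a real deployment would use richer aggregation and batching policies.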

Currently, gadgets that utilize sensors to interact with other gadgets process over 10 billion cloud transactions per day on a network dubbed the Internet of Things (IoT). The concept is relatively new and is just beginning to be defined in the field of computer science.

Butt will mimic bandwidth loads from gadgets using small networks of inexpensive Raspberry Pis, tiny computers costing about $35 each that were also used in the award-winning interactive art project SeeMore, produced collaboratively with the Institute for Creativity, Arts, and Technology.

The project aims to undertake the fundamental task of collecting fine-grained monitoring and usage statistics on the robustness of the rapidly growing IoT. Fine-grained monitoring requires that IoT service providers acquire, transport, store, and process an exponentially increasing amount of data.

The bandwidth necessary to run a George Jetson-style house on an IoT network will be increasingly taxed, according to Butt. So it may not be the canine companionship of Astro that ends up being man's best friend in an increasingly connected world.

Reliability may come to be defined as the fast processing and regular amount of bandwidth available to a population that depends on gadgets to do everything from brewing the morning coffee to monitoring critical health issues.

Written by Amy Loeffler
