Data Sensing Lab at Google I/O 2013: Google Cloud Platform meets the Internet of Things
By Michael Manoochehri, Developer Programs Engineer, Google Cloud Platform
Cross-posted with the Google Cloud Platform Blog
After last year's Google I/O conference, the Google Cloud Platform Developer Relations team
started to think about how attendees experienced the event. We wanted to help attendees gain
more insight about the conference space and the environment itself. Which developer Sandboxes
were the busiest? Which were the loudest locations, and which were the best places to take a
quick nap? We think about data problems all the time, and this looked like an interesting big
data challenge that we could try to solve. So this year, we decided to try to answer our
questions with a project that's a bit different, kind of futuristic, and maybe a little
crazy.
Since we love open source hardware hacking as much as we love to share open source code, we decided to team up with the O'Reilly Data Sensing Lab to deploy hundreds of Arduino-based environmental sensors at Google I/O 2013.
Using software built with the Google Cloud Platform, we'll be collecting and visualizing ambient data about the conference, such as temperature, humidity, and air quality, in real time! Altogether, the sensor network will provide over 4,000 continuous data streams over a ZigBee mesh network managed by Device Cloud by Etherios. In addition, our motes will be able to detect fluctuations in noise level, and some will be attached to footstep counters to help us understand collective movement around the conference floor.
Of course, since a key goal of Google I/O is to promote innovation in the open, the project's
Cloud Platform code, the Arduino hardware designs, and even the data collected, will be open
source and available online after the conference.
Google Cloud Platform, which provides the software backend for this project, has a variety of features for building applications that collect and process data from a large number of client devices, without having to spend time managing hardware or infrastructure. Google App Engine Datastore, along with Cloud Endpoints, provides a scalable front-end API for collecting data from devices.
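To make that concrete, here is a minimal sketch of what such an ingestion API could look like, written with the App Engine Python Endpoints library and the ndb Datastore API. The API name, message fields, and entity schema below are illustrative assumptions for this post, not the project's actual code (which will be published after the conference).

    # A minimal sketch of a Cloud Endpoints ingestion API backed by the
    # App Engine Datastore. Names and fields here are illustrative assumptions.
    import endpoints
    from protorpc import messages, message_types, remote
    from google.appengine.ext import ndb


    class SensorReading(ndb.Model):
        """Datastore entity for a single sensor measurement."""
        sensor_id = ndb.StringProperty(required=True)
        metric = ndb.StringProperty(required=True)   # e.g. "temperature"
        value = ndb.FloatProperty(required=True)
        timestamp = ndb.DateTimeProperty(auto_now_add=True)


    class ReadingMessage(messages.Message):
        """Request payload sent by a sensor gateway."""
        sensor_id = messages.StringField(1, required=True)
        metric = messages.StringField(2, required=True)
        value = messages.FloatField(3, required=True)


    @endpoints.api(name='sensordata', version='v1',
                   description='Hypothetical sensor data collection API')
    class SensorDataApi(remote.Service):

        @endpoints.method(ReadingMessage, message_types.VoidMessage,
                          path='readings', http_method='POST',
                          name='readings.insert')
        def insert_reading(self, request):
            # Persist each reading as a Datastore entity.
            SensorReading(sensor_id=request.sensor_id,
                          metric=request.metric,
                          value=request.value).put()
            return message_types.VoidMessage()


    # WSGI application served by App Engine.
    app = endpoints.api_server([SensorDataApi])
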
Google Compute Engine is used to process and analyze data with software tools you may already be familiar with, such as R and Hadoop. Google BigQuery provides fast aggregate analysis of terabyte-scale datasets. Finally, App Engine's web application framework surfaces interactive visualizations to users.
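To give a flavor of the aggregate analysis BigQuery enables, here is a hedged sketch that queries a hypothetical table of sensor readings using the BigQuery API client library for Python. The project ID, dataset, table, and column names are assumptions made purely for illustration.

    # A sketch of running an aggregate query over sensor data with BigQuery.
    # The project ID, dataset, table, and column names are hypothetical.
    from googleapiclient.discovery import build
    from oauth2client.client import GoogleCredentials

    credentials = GoogleCredentials.get_application_default()
    bigquery = build('bigquery', 'v2', credentials=credentials)

    QUERY = """
    SELECT metric, AVG(value) AS avg_value, COUNT(*) AS samples
    FROM [io_sensors.readings]
    WHERE metric IN ('temperature', 'humidity', 'noise')
    GROUP BY metric
    """

    response = bigquery.jobs().query(
        projectId='my-sensing-project',   # hypothetical project ID
        body={'query': QUERY}).execute()

    # Each row is returned as a list of cells under the 'f' key.
    for row in response.get('rows', []):
        metric, avg_value, samples = [cell['v'] for cell in row['f']]
        print('%s: avg=%s over %s samples' % (metric, avg_value, samples))
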
Networked sensor technology is in the early stages of revolutionizing business logistics, city
planning, and consumer products. We are looking forward to sharing the Data Sensing Lab with
Google I/O attendees, because we want to show how using open hardware together with the Google
Cloud Platform can make this technology accessible to anyone.
With the help of the Google Maps DevRel team, we'll be displaying visualizations of
interesting trends on several screens around the conference. Members of the Data Sensing Lab
will be on hand in the Google I/O Cloud Sandbox to show off prototypes and talk to attendees
about open hardware development. Lead software developer Amy Unruh and Kim Cameron from the Cloud Platform Developer Relations team will talk about how we built the software involved in this project in a talk called "Behind the Data Sensing Lab". In case you aren't able to attend Google I/O 2013, this session will be available online after the conference. Learn more about the Google Cloud Platform on our site, and to dive into building applications, check out our developer documentation.
Michael Manoochehri is a Developer Programs Engineer supporting the Google Cloud Platform. He is passionate about making cloud computing and data analysis universally accessible and useful.
Posted by Scott Knaster, Editor