There’s an experimental store in the basement of Hewlett Packard Labs rigged together with spare tablet computers, empty racks, and foam core board. A small team of researchers built it to explore ways to give brick-and-mortar retailers some of the same capabilities as their Internet counterparts.
Online retailers know what products people are looking at, and when they don’t buy items in their virtual shopping carts. They use behavioral data like this to craft display ads and promotions aimed at getting people to complete their purchases.
Traditional stores don’t have this level of insight. They’re dependent on a sharp-eyed clerk to notice someone pacing an aisle or studying a product.
On the shelves of the Labs store, devices the size of an overstuffed wallet track people as they pause in front of a shelf and pick up items. The devices work together to interpret the meaning of customer behaviors—providing the equivalent of an online retailer’s digital clickstream.
Eventually, these devices will be able to connect to one another on an ad hoc basis to share information on a much larger scale. Parts on an oil rig, for example, will collaborate to interpret signs that might indicate an equipment malfunction; home security systems will warn neighbors about a prowler moving up the block; entire transportation grids will reprogram themselves to resolve traffic jams, redirecting travelers on the fly.
By processing information themselves, communicating directly with other devices, and sharing only the required data, these “distributed mesh computers” can act fast and protect privacy. Each node on a network will be able to determine “have I seen this before, has anybody else seen this, and what should I do about it?” says Amip Shah, head of Distributed Mesh Computing, or DMC, research at Hewlett Packard Labs. “I don’t need my things to be smarter. I need them to have common sense.”
It’s a vision that extends beyond today's Internet of Things, where common equipment such as electric meters, pollution monitors, and security cameras are outfitted with sensors, then connected to the Internet. HPE and others are developing platforms that manage and optimize the data these devices generate.
For now, IoT devices collect and send data to the cloud, where much of it languishes in storage. Only a small percentage of it is processed, analyzed, and turned into useful insights.
There are several limitations with this “device to data center” model, Shah says. First, data must be transmitted across networks, a bandwidth-intensive process that can be slow and expensive. Second, in order for these devices to effectively take action when needed, the data they collect must be analyzed in a timely manner, if not in real time.
Both of these limitations are being stressed by the explosion of data generated by IoT devices, which research firm IDC expects to grow from 88 exabytes in 2013 to 4,400 exabytes—4.4 trillion gigabytes—in 2020. That latter figure is equivalent to all the data collected and copied in the world in 2013.
“At some point, we're going to have so much data in the world that just the process of moving that data to a cloud is going to be cost-prohibitive,” Shah says.
Yanyan Zhuang, a research assistant professor of computer science and engineering at New York University, says that to reduce a bandwidth crunch, network operators can filter and compress data at the point of collection, or just send on analytic conclusions. This approach "can dramatically reduce the amount of data to be sent” while alleviating some privacy concerns, she says.
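The edge filtering Zhuang describes can be sketched in a few lines. This is a hypothetical illustration, not code from the Labs project: the `EdgeFilter` class, its threshold, and its summary fields are all assumptions chosen to show the idea of keeping raw samples on the device and transmitting only an analytic conclusion.

```python
# Hypothetical sketch of edge-side filtering: raw sensor samples stay on
# the device, and only a compact summary is sent upstream. All names here
# (EdgeFilter, ingest, summarize) are illustrative, not a real API.
from statistics import mean

class EdgeFilter:
    def __init__(self, threshold):
        self.threshold = threshold   # readings below this are treated as noise
        self.buffer = []             # raw samples never leave the device

    def ingest(self, reading):
        """Keep only readings that clear the noise threshold."""
        if reading >= self.threshold:
            self.buffer.append(reading)

    def summarize(self):
        """Return only the analytic conclusion to transmit upstream."""
        if not self.buffer:
            return None
        summary = {
            "count": len(self.buffer),
            "mean": mean(self.buffer),
            "max": max(self.buffer),
        }
        self.buffer.clear()          # raw data is discarded locally, not sent
        return summary

node = EdgeFilter(threshold=0.5)
for sample in [0.1, 0.7, 0.9, 0.2, 0.6]:
    node.ingest(sample)
print(node.summarize())
```

A real deployment would compress or batch these summaries as well, but even this crude sketch shows why the upstream traffic shrinks: five raw samples become one three-field record.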
The case for common-sense analytics
A Distributed Mesh Computing device
The first test
To detect people walking down the aisles and measure their distance from products, the researchers installed on the shelves a series of low-cost, single-board Raspberry Pi computers. They outfitted them with sensors, including Bluetooth beacons and accelerometers, and a variety of cameras: some that could tilt and rotate, and others that remained stationary at eye level or in corners of the store.
By coupling image capture with data processing and analysis, the team was able to track customer interactions without transferring images away from the computer. The camera’s full-resolution images stayed local, and devices shared only a stripped-down version—the computer equivalent of the “tall guy in the blue shirt.”
Working together, the devices tracked customers as they walked through the store, like an online retailer tracking an anonymous website visitor as he browses the product catalog. Collectively, the computers could then also tell when customers were confused or needed help.
“It became a living lab,” says April Mitchell, a Hewlett Packard Labs research director, whose team built the store.
The common-sense network
There are two main challenges to moving DMC out of the concept store and into the real world. First, devices small enough to be widely deployed don’t have the storage capacity to collect the amounts of data necessary for accurate analytics. Second, they’re not fast enough.
On a small whiteboard beside a desk cluttered with tiny test computers at the back of Hewlett Packard Labs, electrical engineer Geoff Lyon has scribbled a potential solution.
His drawing—of a long rectangle of computer memory surrounded by tiny microprocessors—is simple. Yet it reverses today’s decades-old computer architecture model that revolves around the CPU, or central processing unit, and sidelines memory to a supporting role.
In Lyon’s model, memory takes center stage. A small portion of the long block of memory is DRAM, or dynamic random access memory, the fastest type of memory available today. The remainder is flash storage. The envisioned computer transmits in real time the data its sensors collect to the DRAM, which sends it to several small processors for immediate analysis.
An adjacent, much longer block of flash storage stores older data and is surrounded by its own army of specialized processors. As the chips surrounding the DRAM process the real-time data, these chips analyze the historical data.
“We have raw data which we collect, which always stays on the node,” Lyon says. “We have processed information, which is anything that we're willing to share with the other nodes, and we have events that are essentially just broadcast.”
If one of the devices senses a lot of motion, for instance, it could broadcast that data to determine whether other devices are experiencing the same motion, as might happen in an earthquake, or if it has detected an isolated event. If just a few devices report similar data, they could share processed information with one another to better understand what happened.
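The three data tiers Lyon describes can be illustrated with a toy node. This is a hedged sketch under assumed names: `MeshNode`, its motion threshold, and the event format are all hypothetical, chosen only to show raw data staying local, processed information being shared on request, and events being broadcast to every peer.

```python
# Hypothetical sketch of Lyon's three data tiers: raw readings stay on
# the node, processed information is shared only on request, and events
# are broadcast to all peers. Names and thresholds are illustrative.
class MeshNode:
    MOTION_EVENT_THRESHOLD = 5.0  # assumed units; illustrative only

    def __init__(self, name, peers=None):
        self.name = name
        self.peers = peers or []  # other MeshNode instances in the mesh
        self.raw = []             # tier 1: never leaves this node
        self.inbox = []           # events received from peers

    def sense(self, motion):
        self.raw.append(motion)   # raw reading is stored locally only
        if motion > self.MOTION_EVENT_THRESHOLD:
            # tier 3: an event is broadcast, not the underlying reading
            self.broadcast({"type": "motion", "source": self.name})

    def broadcast(self, event):
        for peer in self.peers:
            peer.inbox.append(event)

    def processed_info(self):
        """Tier 2: a derived summary a peer may request; raw stays private."""
        peak = max(self.raw) if self.raw else None
        return {"source": self.name, "peak": peak}

a, b = MeshNode("shelf-a"), MeshNode("shelf-b")
a.peers, b.peers = [b], [a]
a.sense(7.2)                 # strong motion triggers a broadcast to b
print(b.inbox)               # b learns an event occurred, not the raw value
print(a.processed_info())    # b could request this summary to corroborate
```

The earthquake scenario follows directly: many nodes broadcasting the same event type suggests a shared cause, while a lone broadcast prompts peers to exchange processed summaries before deciding anything happened at all.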
Future iterations of this memory-driven architecture, Lyon says, will also incorporate emerging memory technologies, to improve storage capacity, energy efficiency, and compute capabilities.
Lyon and his team have spent the past year building two test DMC arrays of 44 nodes each in the Labs basement, adjacent to the experimental retail store. He calls these arrays The Bakery because Raspberry Pis power all the nodes in these test beds.
For now, these computers are modest: they have 1 gigabyte of DRAM and 32 gigabytes of flash storage, and they don’t use Lyon’s proposed memory-driven architecture. But they’re outfitted with customized circuitry and an impressive range of sensors measuring temperature, humidity, air pressure, light absorbance, motion, and acceleration.
Next, the team plans to build another array of memory-driven computers, implementing the architecture Lyon believes will make DMC a viable commercial opportunity.
The experimental Labs store.
An attainable future
Shah, the head of DMC research, glows when he compares Mitchell’s scrappy retail store with Lyon’s more recent design of memory-driven arrays. To him, the two projects make clear the potential of DMC to transform entire systems, such as transportation and patient monitoring.
Shah says cars could come equipped with the devices Lyon is developing. They’ll detect problems or dangerous conditions. When a car goes over a patch of black ice, for example, it will perform analytics to conclude what happened, then act on the information in a way that doesn’t violate the driver’s privacy.
The car might send a signal asking if other cars plan to travel down the road where the ice is, Shah says, and then warn those that are. Using the same technique, they could find out whether some odd behavior in the car is common to a certain make and model and, if so, alert the manufacturer—all while protecting the driver’s privacy.
“That validation of the concept showed this could be done,” he says.
Meanwhile, the computers in The Bakery are blinking away.