Part Two: The smart city is the data economy made manifest

By Curt Hopkins

In part one of this series, "Data is the new currency," we talked about the shift in technology and business strategy from making things to knowing things. In this article, we will investigate how companies will have to alter what they do to thrive in this new economy.

As the economy continues to change and as data assumes an even more important place in it, businesses will need to structure themselves in such a way as to locate, collect, and refine data. Companies need to move from a belief that data is a cost – in electricity, space, employee hours, capital outlay, and latency – to an understanding that data is a strategic asset.

Well and good, you might say, but how am I to counter those data-related drawbacks? By employing edge to cloud computing.

How an edge to cloud system works

“The goal of all organizations is to become more ‘sentient,’ to be able to sense and respond to external influences faster than their competition,” says Ian Brooks, HPE’s European head of innovation.

This is where artificial intelligence enters the picture, says Brooks. One of the ways companies can get smarter is by using AI modeling to determine how effective changes will be without spending money on retooling a product line.

AI modeling consists of two parts: training, in which the model learns the weights of its inputs (their relative importance), and inference, in which the trained model is deployed to respond to stimuli in the real world.
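
As a minimal sketch of those two phases (an illustration only, not HPE’s tooling; the sensor features, weights, and target values below are hypothetical), the following fits weights to historical readings and then runs inference on a fresh one:

```python
import numpy as np

# --- Training: estimate the weights (relative importance) of each input ---
# Hypothetical historical data: rows are past observations, columns are
# sensor inputs (say pressure, vibration, torque); y is the outcome to predict.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                   # 500 past readings, 3 inputs
true_w = np.array([0.8, -0.2, 1.5])             # unknown in a real deployment
y = X @ true_w + rng.normal(scale=0.1, size=500)

# A least-squares fit recovers the weights from the data.
weights, *_ = np.linalg.lstsq(X, y, rcond=None)
print("learned weights:", weights.round(2))

# --- Inference: apply the trained model to a new reading from the edge ---
new_reading = np.array([1.1, 0.3, -0.7])        # hypothetical live sensor values
print("predicted outcome:", round(float(new_reading @ weights), 2))
```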

We acquire the information to develop an accurate model from the edge: a myriad of sensors, which vary with the use case. For weather, they could be buoys, sondes, satellites, and land measurement stations; in manufacturing, we measure pressures, stresses, vibrations, and torque.

Traditionally, these sensor readings were collected and back-hauled over an appropriate carrier network, such as 5G, satellite, WLAN, or LoRa, to the data center. Increasingly, however, due to compliance issues, cost, or the latency introduced, more and more of the computation is being done in situ, using edge compute and inference models.
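
That shift can be sketched roughly as follows (the function names, weights, and threshold are hypothetical, not any particular product’s API): instead of back-hauling every raw reading, the edge node scores each one against a model trained in the data center and transmits only the results that matter.

```python
# Hypothetical edge node: run inference locally and send only the
# interesting results upstream, instead of back-hauling raw readings.
ANOMALY_THRESHOLD = 0.9   # assumed threshold; tuned per use case

def local_inference(reading, weights):
    """Apply a pre-trained model whose weights were pushed down from the cloud."""
    return sum(r * w for r, w in zip(reading, weights))

def handle_reading(reading, weights, send_upstream):
    score = local_inference(reading, weights)
    if score > ANOMALY_THRESHOLD:
        # Only anomalies (a small fraction of the traffic) cross the network.
        send_upstream({"reading": reading, "score": score})
    return score

weights = [0.8, -0.2, 1.5]                       # produced by training, as above
handle_reading([0.1, 0.2, 0.1], weights, print)  # below threshold: stays local
handle_reading([1.0, 0.0, 0.5], weights, print)  # above threshold: sent upstream
```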

Datasets for training are huge and need to be analyzed using the top-down approach of developing a hypothesis and testing it, a bottom-up approach that uses neural networks to surface patterns from the data, or a combination of the two. The faster the data can be moved into memory and processed, says Brooks, the faster we get results.

Memory-Driven Computing puts the focus on analyzing the data in memory, rather than repeatedly loading chunks from disk, but we still need to move the data into memory, and that’s where the Gen-Z protocol comes in. Developed by an industry-wide consortium, Gen-Z is an open, high-performance, low-latency systems interconnect that links the core components of a computer directly to the data. The eventual deployment of Gen-Z-based solutions, alongside workload-specific compute engines and non-volatile memory, will allow us to run ever more realistic simulations of the real world and accelerate progress on some of society’s greatest computational challenges.
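
Gen-Z and Memory-Driven Computing live in hardware, far below application code, but the pattern they serve can be suggested in a few lines (the file name and data volume here are made up for illustration): pay the cost of moving the data into memory once, then run every analysis pass against the resident copy instead of re-reading chunks from disk.

```python
import numpy as np, os, tempfile, time

# Small-scale illustration of the principle only; Gen-Z itself is a hardware
# interconnect, not a software API.
path = os.path.join(tempfile.gettempdir(), "sensor_history.bin")
np.random.default_rng(0).normal(size=10_000_000).astype(np.float32).tofile(path)

def analyze(a):
    return a.mean(), np.percentile(a, 99), float(np.abs(np.diff(a)).max())

# Disk-bound pattern: re-read the dataset for every analysis pass.
t0 = time.perf_counter()
for _ in range(3):
    analyze(np.fromfile(path, dtype=np.float32))
reread_time = time.perf_counter() - t0

# Memory-driven pattern: load once, keep the data resident, analyze repeatedly.
t0 = time.perf_counter()
resident = np.fromfile(path, dtype=np.float32)
for _ in range(3):
    analyze(resident)
resident_time = time.perf_counter() - t0

print(f"re-read each pass: {reread_time:.2f}s   load once: {resident_time:.2f}s")
```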

The smart city

Edge to cloud is a strategy in which data is gathered, processed, and shared at every point. It is beginning to replace the old approach, in which data is sent to a central processing location and instructions are sent back out. The new model allows an almost instant reaction to changes in the data environment, enabling quicker decision-making, much of which can happen without bottlenecking the process where human action intersects it.

In Japan, Panasonic has joined with public sector partners to create Fujisawa Sustainable Smart Town, a $560 million, 1,000-household development in Kanagawa Prefecture, southwest of Tokyo. Finished in 2018, it is arguably the most complete smart city yet built, with functional waste, traffic, and utility systems. It is a dense urban environment with three days of stored emergency power and other resources.

The smart city is an extreme use case for edge to cloud. But the technology is also seen in industries and practices ranging from logistics to mining to traffic control.

It is arguable whether the smart city will take over from the traditional city. For one thing, legacy technology is difficult to mesh with new tech even in something as finite as a single company’s billing system. In a situation where a bad connection between old tech and a traffic control system could mean fatal accidents, smart cities may thrive primarily within new, purpose-built cities.

But regardless, smart city technology has already begun to thread through extant cities, sometimes to extraordinary effect. Given how well it expresses the edge to cloud paradigm, it may be worth examining how that technology is used in the imperfect world we currently occupy.

An ebullient landscape

Carlo Ratti, founding partner of CRA-Carlo Ratti Associati and director of the MIT Senseable City Lab, calls smart city tech “an ebullient landscape that has been developing all over the world.” At its simplest, it is “the manifestation of a broad technological trend: The Internet is entering the spaces we live in and is becoming the Internet of Things (IoT), allowing us to create myriad sensing-and-actuating loops in cities that were not possible before. Applications can be manifold, from waste management, to mobility, to energy, to public health, to civic participation.”

According to the global research consultancy Frost & Sullivan, smart city technology will be a $1.5 trillion market by 2020. According to Eyal Feder, CEO of the Israeli smart city data company ZenCity, you can see the build-up to that figure happening everywhere you look.

“I don’t think I’ve ever met a city that isn’t actively pursuing becoming a smart city,” he says. At its simplest, a smart city is just a city that uses edge to cloud to manage itself better: to become more efficient, more environmentally responsible, and more responsive to the needs of its citizens.

“The core currency of the smart city is data and data collection is becoming a commodity,” says Feder. “Sensors are becoming more affordable so more ubiquitous. The interesting challenge we face is how do we make sense of that data?”

But how this will express itself is as exciting as it is varied. Take your trash, for example.

Smart waste management is a process of attaching sensors to individual trash bins, gathering data about fullness, timing, type, and location, then integrating it with traffic management and human resources. The goal is to make waste management more environmentally responsive, efficient, and headache-free. In five to 10 years, says Feder, most cities will have such systems in place.
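
A toy version of that data flow might look like the sketch below (the schema, thresholds, and coordinates are hypothetical, not any city’s actual system): each bin reports its state, and only the bins worth visiting make it onto the day’s collection route.

```python
from dataclasses import dataclass

# Hypothetical bin report; a real deployment would define its own schema.
@dataclass
class BinReport:
    bin_id: str
    location: tuple        # (latitude, longitude)
    fill_level: float      # 0.0 empty .. 1.0 full
    waste_type: str        # e.g. "household", "recycling"
    reported_at: str       # ISO 8601 timestamp

def plan_route(reports, threshold=0.75):
    """Keep only bins worth a visit, fullest first, for the routing system."""
    due = [r for r in reports if r.fill_level >= threshold]
    return sorted(due, key=lambda r: r.fill_level, reverse=True)

reports = [
    BinReport("bin-017", (35.34, 139.49), 0.92, "household", "2019-06-01T06:00Z"),
    BinReport("bin-018", (35.35, 139.48), 0.40, "recycling", "2019-06-01T06:00Z"),
    BinReport("bin-019", (35.33, 139.47), 0.81, "household", "2019-06-01T06:05Z"),
]
for stop in plan_route(reports):
    print(stop.bin_id, f"{stop.fill_level:.0%} full")
```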

But in the end, the smart city is about people, and the technology we use to create it has to serve us.

How will decision-makers change the businesses that will change the future?

Edge to cloud is a change of posture, a philosophical approach that has tangible implications. If data is going to run the world, we have to run the data. We need to retain, or regain, control. Only in so doing will we have a sense of agency over the flood of data that could otherwise wash us, and our businesses, away. It is also a business strategy whose proper use will enable us to gain, or maintain, the edge in profit that will allow our companies to grow.

But key to all of this is change, and as much as business leaders think of themselves as risk takers, most have built their careers and companies by taking only the most judicious chances. This may have to change.

How can business and technology leaders embrace change? Andrew Wheeler, deputy director of Hewlett Packard Labs and a college football fan, has a pet metaphor. The playbook.

When you deal with a successful executive, you have to deal with that person’s playbook as well. That playbook, they might say, has gotten them all their promotions and has put their company in a powerful leadership position. Therefore, that playbook is valuable. It should be consulted at all times. Ignoring the playbook is a kind of suicide, they say. So what does Wheeler advise them to do? Throw it out the window.

This is not a function of some sort of iconoclasm on Wheeler’s part. Rather, it’s common sense. If all around you the game has changed, the surest way to decline is to continue to play last year’s game.

“A large oil company is a perfect example of a business that has been built on an old model,” says Wheeler, and a good analogy for the business of data. Raw crude is like raw data; there is value in transport and processing at this level for a few, but more opportunities exist at the refining level. “For crude oil that translates into over 6,000 products; for data that translates into actionable information and insight.”

There is still room for innovation in the old style, according to Wheeler – the cloud will keep growing, and processors, memory, and storage will keep getting better for a while – but it is time to turn our attention to information.

“How do we derive real value?” he asks. “That has to become the focus – analytics, processing in real time, delivering. There can’t be any limit to the creativity and as an enterprise company we should play a big role in that shift. We have to look beyond the run-rate of servicing the old economy. We should be the ones who look to disrupt.”

It is easy to live quarter by quarter, says Wheeler, but it is imperative for any company that wishes to thrive in the future to put a strategy in place that tangibly invests in that future.

“Companies need to develop an innovation culture from the top, supported by a board with a healthy number of dollars invested in forward-looking actions,” he says. “Some can still go forward with acquisition for a while, but the writing is on the wall. That play has run its course, just like the once-pervasive Veer or Single-Wing offenses. If you don’t have a forward-mindedness, you’re going to be in trouble, and sooner or later you’ll be replaced. They’ll say of you, you stuck by your old playbook instead of making that playbook new.”

The man, the machine, and the city

Data is already having, and will continue to have, a profound effect on how businesses deal with technology if they want to remain vital. Where once you developed things or made things, you must now know things. There is no longer a straight line from data to compute and back again.

To succeed, you will have to “acquire much more compute power, in order to train larger models on larger data sets,” says Natalia Vassilieva, head of AI at Hewlett Packard Labs.

The 1,000-layer neural net already trained by Microsoft Research has about 20 million parameters. The largest nets ever trained have billions of parameters. But the human brain has quadrillions of synapses (the analog of parameters in an artificial neural net).

“If we wish to train a model somehow comparable in size with our brain – with a quadrillion-parameter neural net – then we hit a computational wall today,” says Vassilieva. To do that in a matter of hours with today’s algorithms and hardware, we would need a system capable of delivering 10²⁶ FLOPS, well beyond the 10¹⁸ FLOPS of an exascale system. “Existing systems will not be able to provide that,” says Vassilieva. “To get there we need new algorithms, new accelerators, and new architectures.”
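
The scale of that wall is easy to check with back-of-the-envelope arithmetic; the two compute figures come from Vassilieva, and the parameter counts are the orders of magnitude cited above.

```python
# Back-of-the-envelope comparison; all figures are order-of-magnitude.
largest_net_params = 1e9    # "billions of parameters" in today's largest nets
brain_synapses     = 1e15   # "quadrillions of synapses" in the human brain
required_rate      = 1e26   # FLOPS to train a quadrillion-parameter net in hours
exascale_rate      = 1e18   # FLOPS delivered by an exascale system

print(f"model-size gap:    {brain_synapses / largest_net_params:.0e}x")  # ~1e6
print(f"compute shortfall: {required_rate / exascale_rate:.0e}x")        # ~1e8, i.e. 100 million exascale machines
```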

Everyone, from individual researchers to company leaders to government organizations, will need to seek out and experiment with new solutions. That is what HPE has been focused on, wrapping its edge to cloud strategy around Memory-Driven Computing.

Memory-Driven Computing is an architecture that gives every processor in a system direct access to a nearly limitless shared pool of memory and also allows new and AI-specific computational accelerators to access that memory much more quickly and efficiently, getting us closer to really agile AI.
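
The architecture itself is silicon and fabric rather than software, but the basic idea (many compute engines working against one pool of data without shuffling copies around) can be loosely suggested, at a vastly smaller scale, with standard-library shared memory:

```python
from multiprocessing import Pool, shared_memory
import numpy as np

# A loose, small-scale analogy only: several worker processes compute over one
# shared block of memory rather than each receiving its own copy of the data.
def partial_sum(args):
    name, length, start, stop = args
    shm = shared_memory.SharedMemory(name=name)
    data = np.ndarray((length,), dtype=np.float64, buffer=shm.buf)
    result = float(data[start:stop].sum())       # compute against the shared pool
    shm.close()
    return result

if __name__ == "__main__":
    values = np.arange(1_000_000, dtype=np.float64)
    shm = shared_memory.SharedMemory(create=True, size=values.nbytes)
    pool_view = np.ndarray(values.shape, dtype=values.dtype, buffer=shm.buf)
    pool_view[:] = values                        # place the data once

    chunks = [(shm.name, len(values), i, i + 250_000)
              for i in range(0, 1_000_000, 250_000)]
    with Pool(4) as workers:
        total = sum(workers.map(partial_sum, chunks))
    print(total == float(values.sum()))          # True: every worker saw one pool

    shm.close()
    shm.unlink()
```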

This architecture is helping us get to a world where we can move beyond all the boundaries, obstacles, and speed limits that currently contain us, from the end of Moore’s Law to real-time quadrillion-parameter neural networks.

So, it is clear that our new relationship with data is changing – and will continue to change – how we program and develop. A universal architecture like Memory-Driven Computing will be necessary to take advantage of the opportunities that lie on the other side of the frontier of Moore’s Law. This relationship will also change how we orient ourselves in our industrial ecosystems. But will this new economy of data really change our lived world? Will it exert a tangible change on the way we experience the cities that are our primary environment?

“From an architectural point of view, I do not think that the city of tomorrow will look dramatically different from the city of today much in the same way that the Roman ‘urbs’ is not all that different than the city as we know it today,” says Ratti. “We will always need horizontal floors for living, vertical walls in order to separate spaces, and exterior enclosures to protect us from the outside. The key elements of architecture will still be there, and our models of urban planning will be quite similar to what we know today. What will change dramatically will be our way to live the city, at the convergence of the digital and physical world.”