The Cognitive Future of the Planet
A combination of cognitive computing and the Internet of Things is changing the world for the better. The future is cognitive, believes Jonathan Batty, who shared his futuristic views at a public lecture at the CIPR technological forum in Kazan and then adapted them especially for BRICS Business Magazine.
This June, in California at the Code conference, IBM CEO Ginni Rometty boldly stated that cognitive computing technologies are not only transforming the company, but within the next five years, every decision in the world will be made based on insight from cognitive computing.
One may argue that these kinds of advanced capabilities are restricted to first-world societies; however, with the proliferation of mobile devices and cloud computing, cognitive services are increasingly being distributed to places we never thought possible. Of course, IBM is not the only company operating in this space – there is strong competition between the world’s biggest IT companies to develop cognitive computing capabilities – or artificial intelligence (AI), as many of them refer to it.
However, IBM can confidently state that Watson is the world’s first commercially available cognitive computing platform. Watson came to prominence in 2011 when it defeated the best human contestants on the US quiz show Jeopardy!. Since then, we have moved Watson from gaming to real-life challenges such as health care and financial services – industries known for generating vast amounts of data and where the consequences of error are high. And Watson is no longer the domain of research: it has become a strategic global business unit for IBM, powered by a number of application programming interfaces (APIs), including machine learning, natural language processing, and weather forecasting insight from the Weather Company, which IBM acquired earlier this year.
But what is cognitive computing, and how does it differ from artificial intelligence (AI)? Cognitive computing is a new generation of computing systems that process and understand data in ways similar to the human brain. They differ from programmable systems in their ability to reason, form hypotheses, and learn from the data they process. Artificial intelligence has connotations of fully autonomous systems that make important decisions completely unaided; it is the stuff of fiction, where robots steal our jobs and take over the world. At IBM, we see a future based on a much deeper human-machine collaboration – a future of humans and machines working together, where low-value, repetitive, and dangerous work is handled by semi-autonomous systems, allowing humans to focus on higher-value tasks aided and informed by decision support systems.
The Instrumented World
While cognitive computing is the brain of these new systems, the Internet of Things (IoT) is the eyes and ears. IoT is all about equipping physical systems and infrastructures with billions of devices and sensors that gather real-time data about the situation around them. Using the cloud to connect those devices to each other and to humans, we gain much greater situational awareness and an increased ability to monitor and control ‘things’ remotely. For certain simple tasks, the ‘things’ can even control themselves.
IDC has predicted that there will be 29 billion connected devices and sensors by 2020, embedded in cars, trains, planes, roads, buildings, and wearable devices. A good way to understand the rapid instrumentation of the world is to look at the rise of smartphones. If you go back to 2010, the original Galaxy S smartphone had just six built-in sensors. By 2014, the Galaxy S5 had 16 sensors. Today, the most advanced smartphones have over 20 built-in sensors. Add to this the billions of sensors already embedded into buildings, cities, machines, and industrial equipment, and you can understand why the Internet of Things is set to become the biggest source of data on the planet within the next two years.
Currently, one exabyte of data is created on the Internet each day – the equivalent of about 250 million DVDs. A yottabyte (10^24 bytes) describes the size of the digital universe today, while a brontobyte (10^27 bytes) describes what the digital universe will look like with the Internet of Things.
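The scale of these units is easy to check with a few lines of arithmetic. The sketch below assumes roughly 4 GB per DVD – the rounding behind the familiar “250 million DVDs” comparison:

```python
# Back-of-the-envelope data-scale arithmetic (illustrative only).
EB = 10**18        # one exabyte, in bytes
YB = 10**24        # one yottabyte, in bytes
BB = 10**27        # one brontobyte, in bytes
DVD = 4 * 10**9    # ~4 GB per disc (assumed rounding)

dvds_per_exabyte = EB // DVD
print(f"1 EB is roughly {dvds_per_exabyte:,} DVDs")       # 250,000,000
print(f"1 brontobyte = {BB // YB:,} yottabytes")          # 1,000
```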
So at its core, the Internet of Things is a data challenge. Currently about 90% of the data generated by the Internet of Things is unused. In some industries, it can be up to 99% simply because the volume of data is so big and the legacy computing systems are not capable of managing or understanding it.
Watson is helping to realize the potential of the IoT by cracking this data challenge. It uses natural language processing and machine learning to reveal insights from large amounts of unstructured data. And because of Watson’s machine learning, cognitive systems constantly self-configure, adjusting predictive models to ensure that the more we use them, the better they get.
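The “the more we use them, the better they get” idea can be sketched with a toy online learner – far simpler than Watson’s actual pipeline, with made-up sensor data – whose prediction error shrinks as each new reading streams in and nudges the model’s weights:

```python
# Toy online least-mean-squares predictor (illustrative only; not
# Watson's actual learning machinery). Each new observation adjusts
# the weights, so prediction error falls as more data flows through.

def make_predictor(n_features, lr=0.1):
    w = [0.0] * n_features  # model weights, updated in place

    def predict(x):
        return sum(wi * xi for wi, xi in zip(w, x))

    def update(x, y):
        err = predict(x) - y
        for i in range(n_features):
            w[i] -= lr * err * x[i]  # gradient step on squared error
        return abs(err)

    return predict, update

# Simulated stream of (sensor reading, observed outcome) pairs.
predict, update = make_predictor(2)
stream = [([1.0, t / 100], 2.0 + 0.5 * (t / 100)) for t in range(200)]
errors = [update(x, y) for x, y in stream]
print(f"early error {errors[0]:.2f}, late error {errors[-1]:.4f}")
```

The point of the sketch is the trend, not the numbers: the late errors are far smaller than the early ones because every reading refines the model.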
This new digitized and intelligent world is already taking shape, as illustrated by a number of real-world examples:
GREENER HORIZONS FOR BEIJING
Today, almost everyone is aware of China’s air pollution problems – nowhere more so than in Beijing, which sits at the heart of a highly industrial region and the center of the global steel industry. Beijing suffers from some of the highest levels of harmful fine particulate matter (PM2.5). With growing public awareness of the problem, the city government is taking sweeping measures to reduce it. For the past 10 years, it has been relocating industry outside the city, switching to cleaner fuels, and installing filtering systems on industrial equipment.
The gains have already been significant with the major pollutants PM10, SO2, and NO2 decreasing by 38%, 82%, and 23%, respectively, in Beijing since 1998. But over the past five years, with much of the industrial transformation complete, the Chinese government has started to look at other measures – more targeted ones – to meet its future targets for reducing air pollution. An important first step was the instrumentation of the city; today, China’s capital is probably the most instrumented in the world when it comes to environmental monitoring. There are literally thousands of sensors across Beijing gathering pollution data in real-time. There are also thousands of cameras collecting information about traffic in the city. Add to that terabytes of social media data generated by citizens frustrated with the air quality, and you have a huge potential vault of insight on pollution in Beijing.
The challenge is integrating and making sense of all that data – and actually using it to improve the situation for Beijing’s 11 million citizens. IBM’s research team in Beijing has developed a system dubbed Green Horizons, which brings together and assimilates data from all of these sources. Crucially, in order to move from simple monitoring to predicting what is going to happen next, they added another key dimension: weather data. Air pollution is carried by weather systems, which have the biggest impact on pollution levels in cities. The Green Horizons system uses machine learning to understand the data and figure out where the pollution is coming from and where it will likely go next. It can accurately predict pollution up to three days ahead at a one-kilometer-by-one-kilometer resolution while also producing air quality trend forecasts for the coming 10 days. It forms the basis of an air quality decision support system, installed at the Beijing Environmental Protection Bureau, which is among the most advanced of its kind in the world.
Using this system, the government is not only able to alert citizens to developing pollution threats, but also to work out the best course of action for reducing them. Planners can use the system’s scenario modeling capabilities to ask questions like, ‘What if we reduced industrial production in the northeast part of the city for two days?’ or, ‘What would be the gains from restricting city traffic by 50%?’
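In its simplest form, that kind of what-if comparison amounts to modeling predicted pollution as a combination of source contributions and applying fractional cuts to each. The sketch below uses entirely hypothetical source names and numbers, not the Bureau’s real model:

```python
# Minimal what-if scenario sketch (hypothetical sources and values;
# not the actual Green Horizons model). Predicted PM2.5 is treated
# as a sum of per-source contributions, each of which can be cut.

SOURCES = {"industry_ne": 40.0, "traffic": 30.0,
           "heating": 20.0, "regional": 25.0}  # ug/m3, assumed

def predicted_pm25(reductions=None):
    """Sum source contributions, applying fractional reductions."""
    reductions = reductions or {}
    return sum(level * (1 - reductions.get(name, 0.0))
               for name, level in SOURCES.items())

baseline = predicted_pm25()
cut_industry = predicted_pm25({"industry_ne": 0.5})  # halve NE industry
cut_traffic = predicted_pm25({"traffic": 0.5})       # halve traffic
print(f"baseline {baseline:.0f}, industry cut {cut_industry:.0f}, "
      f"traffic cut {cut_traffic:.0f}")
```

Comparing the scenario outputs tells the planner which intervention buys the bigger reduction for its economic cost.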
Back in 2008, the Beijing government literally had to close all industrial production around the city for a month in order to clean the air prior to the Olympic Games. Even though the reduction in pollution was significant, ensuring blue skies for the games, the economic cost was enormous. Today, using cognitive IoT technology, the government can employ a much more targeted strategy.
NAIROBI: REMOVING THE BUMPS
Nairobi, the capital of Kenya, is today one of the fastest-growing cities in the world. And, like all fast-growing cities, it has problems with the volume of traffic outgrowing the urban infrastructure.
The Nairobi city government turned to IBM with two challenges: to optimize the movement of city-service vehicles, which were getting stuck in traffic, and to understand the causes of the jams.
However, Nairobi is not Beijing. It didn’t have the money to invest in instrumenting the entire city with cameras and sensors, so scientists from IBM’s Africa research labs used a novel approach: they took basic smartphones and mounted them on the dashboards of the city’s waste collection vehicles. GPS data helped to track the location and movement of the fleet of trucks, while readings from the smartphones’ built-in accelerometers, gyroscopes, and magnetometers helped IBM’s researchers to pinpoint and map the potholes, speed bumps, and other obstructions causing the city’s jams.
Over time, as those lorries were going around Nairobi, not only were they collecting rubbish, but they were also gathering data about the city that was analyzed by cognitive computing technologies and turned into valuable insight. With this data, the researchers created a living map showing Nairobi authorities what was going on in the city and where they should focus their road repair and maintenance work. All that was delivered through the use of phones, sensors, and cognitive computing.
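The core of the dashboard-phone approach can be sketched in a few lines: flag a road defect wherever the vertical accelerometer spikes away from gravity, and tag the spike with the GPS fix recorded at that moment. The threshold and the drive data below are hypothetical:

```python
# Sketch of pothole mapping from dashboard-phone data (hypothetical
# threshold and readings; not IBM Research's actual pipeline).

GRAVITY = 9.81         # m/s^2
SPIKE_THRESHOLD = 6.0  # deviation from gravity that counts as a jolt

def find_road_defects(samples):
    """samples: (lat, lon, vertical_accel_m_s2) tuples -> jolt locations."""
    return [(lat, lon) for lat, lon, accel in samples
            if abs(accel - GRAVITY) > SPIKE_THRESHOLD]

# A simulated drive: smooth road, then one hard jolt.
drive = [(-1.2921, 36.8219, 9.8),
         (-1.2925, 36.8231, 10.1),
         (-1.2930, 36.8244, 21.5),   # hard jolt: likely pothole
         (-1.2934, 36.8257, 9.7)]
print(find_road_defects(drive))  # prints the jolt's coordinates
```

Aggregating such flags across a whole fleet, day after day, is what turns routine rubbish rounds into the living road-condition map described above.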
KONE: LIFTING INNOVATION IN SMART BUILDINGS
KONE, one of the leading producers of elevators, escalators, and moving walkways, literally moves a billion people daily in buildings and cities around the world. A lot of KONE’s equipment is already instrumented with sensors, and vast amounts of data are already being collected about the equipment and the components that go into it.
KONE is using IBM’s Watson IoT Cloud Platform to help analyze vast amounts of data from these sensors, helping to identify and predict issues, minimize downtime, and personalize the experience for users. Using the system, instead of having to call in a service engineer or stick to fixed maintenance schedules, KONE is able to predict and respond to selected technical issues in real-time, with the ability to run tests remotely and issue commands over the cloud. With real-time insight into the operation of its equipment, KONE also plans to drive further innovation and develop new solutions and services to improve the flow of people in some of the world’s busiest cities and buildings.
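The shift from fixed schedules to predictive maintenance can be illustrated with a simple rolling-average monitor: raise a service flag when a sensor reading drifts past a limit, rather than waiting for the next scheduled visit. The thresholds and vibration figures below are invented for illustration, not KONE’s or Watson’s actual parameters:

```python
# Sketch of predictive maintenance (hypothetical thresholds and data;
# not KONE's real system). Flag service when the rolling mean of a
# vibration sensor exceeds a limit, instead of waiting for a schedule.

from collections import deque

def monitor(readings, window=5, limit=0.8):
    """Yield indices where the rolling mean vibration exceeds the limit."""
    recent = deque(maxlen=window)  # keeps only the last `window` readings
    for i, value in enumerate(readings):
        recent.append(value)
        if len(recent) == window and sum(recent) / window > limit:
            yield i

# Door-motor vibration trending upward as a bearing wears out.
vibration = [0.3, 0.4, 0.3, 0.5, 0.6, 0.7, 0.9, 1.0, 1.1, 1.2]
alerts = list(monitor(vibration))
print(alerts)  # indices where a service flag is raised
```

Here the flag is raised while the equipment still works, which is precisely the window in which a remote test or a targeted engineer visit is cheapest.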
For the Good of All
These examples help to illustrate the potential transformative nature of cognitive IoT technologies. As sensors and devices become more affordable and easy to implement, they are rapidly becoming more pervasive across the world’s physical systems and infrastructures. At the same time, cloud computing is enabling remote access to powerful analytics and machine learning capabilities provided by cognitive computing systems like Watson. Combined, these technologies have the potential to transform the world with gains in productivity not seen since the first industrial revolution. In fact, many predict an impact so huge that they are referring to this period as the fourth industrial revolution – or Industry 4.0.
The key is for companies, governments, and universities to come together to ensure these new systems are integrated, secure, safe, and designed for the benefit of everyone. That way, history will undoubtedly look back on this period as a truly golden age for computing.