The new world of robotics and self-driving cars will create a mountain of data. But that's the haystack, not the needle.
It's easy to get lost amid the robots. Self-driving golf carts roll around the Dallas Convention Center. Winged drones hang overhead, missiles at the ready. Torpedo-shaped bots bob in water tanks. Blimps display ads in Blade Runner fashion.
At first, Intel CEO Brian Krzanich's keynote speech at the AUVSI Xponential autonomous vehicle conference seems to fit this hardware obsession. He not only takes the stage riding a robot—a modified Segway built with the company Ninebot—but also invokes a swarm of glowing drones to put on a light show. But Krzanich isn't here just to talk about the machines. There's an invisible data tsunami approaching as these machines get smarter and more ubiquitous.
Self-driving cars are a prime example, Krzanich says. "In autonomous vehicles, the data produced increases exponentially. They are data centers on wheels." A slide titled "The Coming Flood of Data" appears over his head as he does a scripted bit of scratch math: one million autonomous cars will produce as much data as half the population on earth. And the numbers only grow when you consider other uses for emerging robots. Industrial bots in factories and hospitals will generate terabytes. Flying drone inspections of bridges, pipelines, and airplane hangars produce information that will need to be evaluated immediately and also archived for any future forensic work, should an accident occur.
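The keynote's scratch math is easy to check. As a rough sanity sketch, the per-car and per-person figures below are assumptions drawn from Intel's own public estimates (about 4 TB per car per day, about 1.5 GB per connected person per day), not numbers stated in this article:

```python
# Rough sanity check of the "data centers on wheels" arithmetic.
# The per-car and per-person figures are assumed estimates, not
# measurements: ~4 TB/day per autonomous car, ~1.5 GB/day per person.
CAR_GB_PER_DAY = 4_000       # one autonomous car, in gigabytes
PERSON_GB_PER_DAY = 1.5      # one average connected person
WORLD_POPULATION = 7.5e9     # approximate 2017 figure

people_per_car = CAR_GB_PER_DAY / PERSON_GB_PER_DAY
fleet_equivalent = 1_000_000 * people_per_car  # one million autonomous cars

print(f"One car produces the daily data of ~{people_per_car:,.0f} people")
print(f"A million cars: ~{fleet_equivalent / WORLD_POPULATION:.0%} of earth's population")
```

Under these assumptions a single car stands in for a few thousand people, and a million-car fleet lands in the billions-of-people range, the same order of magnitude as the keynote's claim.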
To Intel, this approaching wave of data is an opportunity. Krzanich invokes a familiar but still piercing line, telling the crowd "data is the new oil." But, like oil, crude data needs to be refined and delivered, or else it's useless. "Flying a drone and capturing images is not where the true value is found," he says. "You have to process it."
Robots are both consumers and creators of data. Take your workaday military drone. Its primary job is to capture images of the ground below. It also produces data about itself, like the health of its systems and its altitude, and its onboard radar sweeps the surrounding airspace. The unmanned aircraft needs to accept data to operate, too, such as the GPS signals that keep it on course and the commands from human operators. It is, you might say, a flying data collection center.
That's just a simple drone with a video camera. Nowadays, sensors are getting smaller and less energy-intensive, so engineers want to add more of them. How about some more cameras to expand the drone's field of vision, or infrared and low-light vision? Now you're talking about a drone that's much more capable—and one that has a data problem. It's collecting plenty of info it wants to share, multiple images in multiple formats, but transmitting all of that to the ground takes up a ton of bandwidth and power. (Think of bandwidth as a straw that can only handle so much data before clogging.)
The drone needs to be smart enough not only to compress all this info, but also, ideally, to filter it so it transmits only the most relevant information. What about the cloud, you say? While relying on servers somewhere else to process this information is a solid idea, the data still has to go from the robot to a server somewhere, which opens you up to problems of speed, security, and bandwidth.
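The filtering idea above can be sketched in a few lines. This is a toy illustration, not any real drone's software: `detect_interest` is a hypothetical stand-in for an onboard recognition model, and the threshold is arbitrary. The point is only that scoring frames at the edge and sending the few that matter beats streaming everything through the bandwidth straw:

```python
import random

def detect_interest(frame):
    """Hypothetical stand-in for an onboard model scoring a frame 0..1."""
    return random.random()

def frames_to_transmit(frames, threshold=0.8):
    # Filter at the edge: transmit only the frames the onboard model
    # flags as relevant, instead of streaming every frame to the ground.
    return [f for f in frames if detect_interest(f) >= threshold]

random.seed(42)  # deterministic scores for this demonstration
captured = [f"frame_{i}" for i in range(1000)]
sent = frames_to_transmit(captured)
print(f"Captured {len(captured)} frames, transmitting {len(sent)}")
```

With a 0.8 threshold, roughly four-fifths of the captured frames never leave the aircraft, which is exactly the bandwidth and power saving the paragraph describes.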
Since a military drone is our example, imagine image recognition software that matches the shapes of anti-aircraft weapons below, immediately alerts ground controllers with an image, and tracks the target. Now let's go further and have the drone automatically patch itself into a signals intelligence database to check radio emissions, captured by an AWACS flight, coming from the shape's location. The drone also asks for and gets earlier images of the area, checking the two for differences. In microseconds, the analyst on the ground gets a fully automated rundown on the shape, seeing images of recent track marks and evidence of radio communications on military frequencies coming from the vehicle. The identification is all but confirmed even before the image reaches human eyes.
This is all a long-winded way of saying that there's a difference between data and information. Data is good, but it's the haystack. Finding the needle just when you need it, that's the trick.
Imagine a scenario in which all kinds of sensors work together to create a visual timeline. John Riehl, executive vice president of VideoBank, walks me through one. His example: the deadly Dallas sniper attack of 2016.
First, he plots all the publicly available YouTube footage from a specific area (a technique called geospatial fencing) at a specific time. Those videos are overlaid on a satellite map. Another button-click filters the data to show a series of blue dots, representing the path of a police officer's body camera. (This is a mockup; the police would be the clients here and have access to that footage.) If it turns out that an officer's body camera captured an image that could be useful for the investigation—like if he crossed paths with the shooter before the attack—that information would be almost immediately available. Audio from police radios and press broadcasts can also be incorporated to find out, quickly, what happened.
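At its core, geospatial fencing is just filtering media by a bounding box and a time window. A minimal sketch, assuming nothing about VideoBank's actual software, with hypothetical clip records and illustrative (not historical) coordinates and timestamps:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    lat: float        # degrees latitude
    lon: float        # degrees longitude
    timestamp: int    # unix seconds

def geofence(clips, lat_min, lat_max, lon_min, lon_max, t_start, t_end):
    """Keep only clips inside the bounding box and the time window."""
    return [c for c in clips
            if lat_min <= c.lat <= lat_max
            and lon_min <= c.lon <= lon_max
            and t_start <= c.timestamp <= t_end]

# Two illustrative clips: one near downtown Dallas, one in New York.
clips = [Clip(32.78, -96.80, 1_467_945_000),
         Clip(40.71, -74.00, 1_467_945_000)]
in_fence = geofence(clips, 32.7, 32.9, -96.9, -96.7,
                    1_467_900_000, 1_468_000_000)
print(f"{len(in_fence)} of {len(clips)} clips fall inside the fence")
```

A real system would pull these coordinates and timestamps from video metadata and index them for fast lookup, but the selection logic is this simple comparison.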
Riehl says things get more complex when applied to military sensor fusion, which can include intercepted radio communications, laser radar images, and drone footage. But give VideoBank a stream of unorganized data, and its creators promise organized information.
Booths like Riehl's, selling data management tools, are not the most popular kiosks here. But they exude a quiet sort of confidence, since without them, all the drones in the world are useless. Riehl's New Jersey company sells systems that manage video, audio, still images, documents, and other digital content. Which sounds pretty dull until you look at the client list, which includes the U.S. Navy, the World Wildlife Fund, and NASCAR.
As for the future, Riehl seems optimistic that the trends are moving in the right direction for big data to be used every day. "It takes three things: bandwidth, processing and storage," he says. "There's no limit to any of these."
People at Intel and other software specialists at this robot conference glow with hope about what might be possible at the meeting point of big data and robots. But there's something else under the surface. The public may not be ready for what's coming. Their reaction to big data, harvested from and about them, can be uneasy or downright hostile. Krzanich stumbles into this unwittingly during his otherwise benign, reassuring speech.
"Even your refrigerator will have eyes, looking out," he says, a nod to smart homes that invokes feelings of intrusive data collection. Who wants your appliances spying on you, knowing what you eat and how many people are in your kitchen? The data from the fridge can be fed to the supermarket to automatically deliver milk (or rum) when it runs out. That supermarket could then cut a deal to share the data with producers and sell it to advertisers. The more hands passing it around, the more of a chance someone loses control. Data thieves could use the information to tailor phishing schemes ("Loyal customer, you've won a month of free Kraft Cheese Slices, click here to receive your prize!") or even create a diabolical database of empty houses for some uncommonly well-organized burglars.
And it all started with the refrigerator inside a connected home.
For Intel's big data revolution to become reality, people will have to get comfortable with the fact that every bot is a data collection and dissemination node. This is especially true of those aimed at consumers, the robot butlers and appliances that populate a networked home. "Emotion plays an important role when it comes to consumers' willingness to adopt connected technology," said Sabrina Horn, a managing partner at Finn Partners, who this week released a survey into consumer attitudes over these developments. The survey found robot butlers drew a more favorable reaction than machine chauffeurs. You might credit this to those harmless household ambassadors, Roomba and Scooba. "Three out of 10 respondents want robots to handle household activities such as folding laundry or vacuuming," the report reads, "making chores the second most desirable smart home feature next to home security."
The survey's sample size may not be massive (1,000 U.S. respondents), but the trends are still interesting. When asked about concerns over connected tech, 26 percent of those under 45 cited privacy or security. Among those over 45, only 16 percent had the same concerns. It seems younger people see a threat that older people haven't had to deal with.
For Intel, getting people to accept how big data could help their lives is a priority. Their drone light shows, for example, double as a way to put a gentler face on flying robots. Their robot-butler Segways are built to serve.
In the same way, researchers are hoping that big data will lose its dark, threatening shades and reveal a patina of helpfulness. That will come, says Josh Walden, senior vice president of Intel's New Technology Group, when people see the benefits and don't stress over the big data needed to deliver the goods. Then they will push for better ways that can only happen with advanced artificial intelligence and a society teeming with trustworthy robots. Or so Intel hopes.
"Data in itself, who cares?" he says. "What people want, that is the answer."
This article was published in the May 11, 2017 edition of “Popular Mechanics”.