Robots: Lots of features, not much security
Robots are supposed to do good things for us, not bad things to us.
But there is plenty of evidence that robots, like the billions of other connected devices that make up the Internet of Things (IoT), are arriving loaded with features but without much of a security blanket.
More evidence came in a report on home, business and industrial robots released last month by security research firm IOActive, which found that “most” of them lacked what experts generally call “basic security hygiene.”
The flaws included the predictable list: insecure communication channels, critical information sent in cleartext or with weak encryption, no requirement for user names or passwords for some services, weak authentication in others, and a lack of sufficient authorization to protect critical functions such as software installation or updates.
All of which would allow “anyone to remotely and easily hack the robots, … install software in these robots without permission and gain full control over them.”
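The authentication gap is the crux: a robot that accepts unauthenticated commands will execute whatever arrives. As a purely illustrative sketch – not taken from the report or any vendor’s firmware – even a minimal HMAC check on a command channel, keyed with a per-device secret, lets a receiver reject forged messages:

```python
import hashlib
import hmac

# Hypothetical per-device secret, provisioned at manufacture time.
# Names and framing here are illustrative, not from any real robot stack.
SECRET = b"per-device-provisioned-key"

def sign_command(command, key=SECRET):
    """Append an HMAC-SHA256 tag so the receiver can authenticate the sender."""
    tag = hmac.new(key, command, hashlib.sha256).hexdigest().encode()
    return command + b"|" + tag

def verify_command(message, key=SECRET):
    """Return the command if the tag checks out, else None (reject forgeries)."""
    command, _, tag = message.rpartition(b"|")
    expected = hmac.new(key, command, hashlib.sha256).hexdigest().encode()
    return command if hmac.compare_digest(tag, expected) else None
```

A command signed with the wrong key fails verification, so a remote attacker without the secret cannot inject commands; the report’s point is that many of the robots examined perform no such check at all.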
Beyond that were privacy problems – mobile applications sending private information to remote servers without user consent, including “mobile network information, device information and current GPS location. This information could be used for surveillance and tracking purposes,” the report said.
And, as is the case with many IoT “smart” devices, they aren’t smart enough to allow their owners to close some of the security holes.
“We found robots with insecure features that couldn’t be easily disabled or protected, as well as features with default passwords that were either difficult to change or could not be changed at all,” wrote the report’s authors, CTO Cesar Cerrudo and Senior Security Consultant Lucas Apa.
The damage from hacked robots could range from spying to injury to death. Cerrudo and Apa cited statistics from the US Department of Labor, which maintains a list of “robot-related incidents, including several that have resulted in death.”
While all of those incidents were considered accidents, “similar incidents could be caused by a robot controlled remotely by hackers,” they wrote.
None of which has, apparently, slowed the appetite for robots of both consumers and businesses.
It is still a relatively young industry. Reports by the International Federation of Robotics (IFR) put annual sales in the hundreds of thousands to millions of units, not billions. But the annual growth percentages are impressive – in the 25 percent range.
IFR’s 2016-19 forecast for sales of personal and domestic service robots is 42 million. They are used for things like vacuum and floor cleaning, lawn mowing, entertainment and leisure, and assistance for the elderly and disabled.
It also reports that by 2019, more than 1.4 million new industrial robots will be installed in factories around the world, bringing the total to 2.6 million.
That increase in robotic automation in the workplace had former Florida governor and Republican presidential candidate Jeb Bush saying just this past week that people should be “marching in the street,” demanding reform of an “antiquated” education system that isn’t helping students compete for jobs against increasingly sophisticated robots.
A few catastrophic incidents brought on by hackers getting through lax security in robotic systems could change that, of course. There are no reports, yet, of hackers causing injury or death. But, as Cerrudo and Apa point out, the “attack surface” is very broad. They reported finding vulnerabilities in:
- Microphones and cameras
- Network connectivity
- External services interaction
- Remote control applications
- Modular extensibility
- Safety features
- Main software (firmware)
- Known operating systems
- Network advertisement
- Connection ports
This, they concluded, was in some measure due to most robots using open-source frameworks and libraries. One of the most popular, the Robot Operating System (ROS), “suffers from many known cybersecurity problems, such as cleartext communication, authentication issues and weak authorization schemes.”
While sharing is fine for development and programming, it only works if the software is secure. “Unfortunately this isn’t the case here,” the report said.
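The cleartext problem is easy to see in miniature. The sketch below – illustrative only, using a plain stdlib socket pair rather than ROS itself – shows that anyone who can observe the bytes of an unencrypted control link recovers the exact command:

```python
import socket

# A connected pair of sockets stands in for an unencrypted robot control
# link; the receiving end doubles as an eavesdropper's view of the wire.
controller, wire = socket.socketpair()
controller.sendall(b"set_speed 1.5")
intercepted = wire.recv(1024)

# With no encryption, the command crosses the wire verbatim.
assert intercepted == b"set_speed 1.5"

controller.close()
wire.close()
```

TLS or link-level encryption would make the intercepted bytes opaque to an observer; the report’s observation is that ROS and several of the robots examined ship without either.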
Indeed, Cerrudo said in an interview that he and Apa didn’t even have to purchase the robots they tested from about a half-dozen manufacturers. “We got access to the different components – mobile applications, firmware, operating systems, software, etc. They were available on the internet to download,” he said, adding that they provide all the functionality for the physical part of the robots.
What will it take to build better security into robots? Experts generally agree that it won’t happen without some major new incentives, since the market currently rewards getting a product loaded with attractive features to market as quickly as possible.
As Andrew Ostashen, cofounder and principal security engineer at Vulsec, noted, revising features or hardware “could push out the product delivery date, which could cost millions or even billions in missed revenue.”
Apa agreed that manufacturers “typically prioritize marketing and logistics rather than security.”
He said this is typical in a relatively new industry where “the extra effort to make their product mainstream absorbs part of the resources that could have been used for hardening their initial prototypes.”
That means it will likely take class-action lawsuits brought by victims of robot hacks, or government intervention, which security guru Bruce Schneier has been promoting for some time.
Schneier, CTO of IBM Resilient, who has testified before Congress urging government regulation of the IoT, wrote in a recent blog post on internet development that “We’re building a world-size robot, and we don’t even realize it.”
That, he wrote, is bringing cyber threats to a new level. “Give the internet hands and feet, and it will have the ability to punch and kick,” he wrote.
Danny Lieberman, CTO of Software Associates, agreed, saying a regulatory agency like the Food and Drug Administration (FDA) should oversee the IoT “in a multiple-tier system of non-regulated, low-risk, mid-risk and invasive/high-risk devices.”
But he said for such a model to work, it would have to be much different from the typical government bureaucracy.
“It would have to use an entirely different organizational construct, built from day one on online submission and distributed approval with a very small bureaucracy. Otherwise, it won’t happen,” he said.
Ostashen called for government “standards that have consequences.”
If manufacturers fail to meet standards or to use best practices, “then the manufacturer must pay a fine,” he said, adding that fines should also be levied for breaches caused by poor security.
Apa said he thought government involvement would “bring attention from consumers and producers to the problems.”
But he said it would only work if “security regulations are implemented effectively, from the right people and in a realistic way.”
He said he and Cerrudo notified the six robot vendors surveyed in the report and just four – SoftBank Robotics, UBTECH Robotics, Universal Robots and Rethink Robotics – responded and were sent a report with the research details. Just one, SoftBank Robotics, said it would fix the problems, but offered “no details on when and how they are going to do it and what issues they were going to fix,” he said.
“Universal Robots said our findings were interesting and that they should do something about it without giving any more details,” he added.
CSO tried to contact all six vendors and got a response only from Rethink Robotics – a prepared statement that had been released to other media outlets earlier. It said two of the items noted were intentional design features that would apply only to the “research and education version” of its robots. The other items “were already known to us and addressed in Rethink’s software release in February,” the statement said, although it offered the disclaimer that “we also expect that the robot is connected to a secure corporate network.”
This story, “Robots: Lots of features, not much security” was originally published by CSO.