The future is now. Dozens of self-propelled, wheeled robots are already on patrol in places like the Golden 1 Center arena in Sacramento, a residential development near Tampa, Fla., and venues in Boston, Atlanta and Dallas.
They are cheaper than humans, require no health insurance, never clamor for a raise and work 24 hours a day.
A Mountain View, Calif., startup, Knightscope, contracts out four types of indoor and outdoor robotic sentinels. So far, it has put 47 in service in 10 states.
“We’re about to see a rising of this type of technology,” said Stacy Dean Stephens, a co-founder of Knightscope. “It’s very reasonable to believe that by the end of next year, we’d have a couple of hundred of these out.”
The Knightscope robots are both friendly, with calming blue lights, and imposing in size.
“They get attention,” Stephens said. “There’s a reason they’re 5 1/2 feet tall. There’s a reason they are three feet wide, weigh over 400 pounds, because you want it to be very conspicuous.”
The advent of self-propelled autonomous robots, though, has stepped on a few toes, literally and figuratively. The robots blend three of the world’s top innovations: self-driving technology, artificial intelligence and robotics, areas of frequent breakthroughs. But this is a space where society is still sorting out the rules.
Knightscope’s security robots are intended largely for use on private property, which gives them greater latitude. That is not so for the delivery robots that operate on sidewalks in several big U.S. cities, where they can mix uneasily with pedestrians.
Elected officials in San Francisco last month pondered whether delivery robots should be banned from city sidewalks, pushing them onto bicycle lanes and away from the elderly.
“There are a lot of issues of how to stop it from hurting people, accidentally running over their toes, pushing over children and dogs, that kind of thing,” said A. Michael Froomkin, a University of Miami Law School professor who specializes in policy issues surrounding robots.
Among practical issues, Froomkin said, is how to contact robot owners in case of mishap.
“If you have a robot with no distinguishing marks, who are you going to call? It’s a very good question and it’s already happened in real life,” Froomkin said.
He referred to Edward Hasbrouck, a consumer advocate and author who blogged in August about a “disturbing encounter” with a delivery robot on a train platform in Redwood City, Calif., that rolled near the edge of the platform, inches from a speeding train.
“I yelled at the robot, hoping that a human operator might be monitoring it, but the only response from the robot was a repeated recorded message, ‘Let me go! I’m working! I’m going to be late!’—as if the platform was a right-of-way, and humans were expected to yield to robots,” Hasbrouck wrote.
Hasbrouck appeared before the Redwood City Council on Aug. 28 to complain that there was “no visible marking on the robot” and no way to contact its human operator.
Knightscope, which was founded in 2013, has carved out a different robot niche than delivery: fighting and deterring crime. Its workforce isn’t dulled by the monotony of routine patrols and can detect anomalies that might elude a human sentinel.
Around large parking lots, its K5 robots can use their license plate-reading capability to “track dwell time of cars, say, if a car’s been there for three days,” Stephens said.
A smaller version, the K3, is a little over 4 feet tall and is intended for indoor use in places like shopping malls, warehouses and sports arenas. It has microphones and speakers to allow conversations between people near the robot and the security operations center. It can also air recorded public service messages.
A third version is a stationary robot intended to be placed at points of high traffic. It has sensors that can detect radiation and certain kinds of weapons. A fourth model is a rugged multi-terrain vehicle that could patrol solar and wind farms and power utility installations.
If someone obstructs a Knightscope robot, it emits an escalating series of alarms. The robots are unarmed and cannot detain criminal suspects.
“Where we draw a very, very thick red line is the weaponization of the machines, even less than lethal,” said Stephens, a former Dallas police officer.
In some parts of the world, those qualms don’t exist. China has deployed a robot at one of its busiest airports, in Shenzhen across the border from Hong Kong, that can scoot along at 11 mph and use a Taser on its targets.
Knightscope provides its robots to clients on a long-term subscription model for about $7 an hour, per robot. A human monitor works with the robots. A company that replaces a crew of three security guards with two robots and a human monitor will realize a 50 percent savings, it says.
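The company’s savings claim can be sanity-checked with quick arithmetic. The article gives the robot rate ($7 an hour) and the staffing swap (three guards for two robots plus a monitor), but not what a guard costs; treating the monitor’s hourly cost as equal to a guard’s is an assumption, and solving the 50 percent claim for the guard’s cost is this writer’s back-of-envelope exercise, not a company figure:

```python
# Back-of-envelope check of the stated economics: two robots at $7/hour
# plus one human monitor replace three guards at a claimed 50% savings.
ROBOT_RATE = 7.0  # dollars per hour, per robot (stated in the article)

# Assume the monitor costs the same hourly rate g as a guard (an
# assumption; the article does not say). Then:
#   old cost = 3 * g            (three guards)
#   new cost = 2 * ROBOT_RATE + g
# A 50% savings means new cost = 0.5 * old cost:
#   2 * ROBOT_RATE + g = 1.5 * g  ->  g = 4 * ROBOT_RATE
implied_guard_cost = 4 * ROBOT_RATE
print(f"Implied fully loaded guard cost: ${implied_guard_cost:.2f}/hour")
# prints: Implied fully loaded guard cost: $28.00/hour

# Confirm the savings fraction at that wage:
old_cost = 3 * implied_guard_cost               # 84.0
new_cost = 2 * ROBOT_RATE + implied_guard_cost  # 42.0
print(f"Savings: {1 - new_cost / old_cost:.0%}")
# prints: Savings: 50%
```

In other words, the 50 percent figure holds only if a guard’s fully loaded cost (wages plus benefits and overhead) is around $28 an hour; at lower guard wages, the savings would be smaller.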
Experts in security and robotics said it may be only a matter of time before police departments employ robots in new functions.
“To the extent that traffic stops are often volatile situations, I can certainly see the appeal of having a robot that might check out what’s going on,” said Elizabeth E. Joh, an expert on policing and technology at the University of California, Davis, School of Law.
The Pentagon maintains a robust program in robot development. Self-propelled ground robots are widely used for such things as bomb disposal, reconnaissance and surveillance. Military contractors seek to expand sales to law enforcement agencies.
But accidents indicate that the time may not yet be ripe for a broader rollout of civilian security robots.
Knightscope suffered an embarrassment when one of its robots fell into a fountain at a Washington, D.C., office building last summer. An employee posted a photo of the sodden machine on Twitter, saying it “drowned itself,” and joked about “suicidal robots.”
Taking the incident in good humor, the company replied with an image of K5 in swimming shorts and a yellow floaty. It said, “I heard humans can take a dip in the water in this heat, but robots cannot. I am sorry.”
As security robots evolve, they will become cheaper and more ubiquitous, and regulators may intervene to define how the data they collect can be used and stored.
“These things move. They are new. You’re not used to them. And people don’t know how they’re being used and what kind of guardrails or limitations are on them and on the data these devices are collecting,” said Nuala O’Connor, president of the Center for Democracy & Technology, a Washington think tank focused on privacy and information governance.
O’Connor said she is concerned about security robots that appear harmless.
“It’s the cute ones you always have to be worried about, right?” she said.
Joh, the UC Davis law professor, said security robots will find their way to consumers.
“People feel comfortable having an Alexa, a Google Home or a Roomba in their home. This is all part of a continuum. At some point, security robots will be affordable, and I think individuals will want them, too,” Joh said.
“And that, I think, is really the point at which people will want to know, well, what are the rules here? What kind of capabilities should these robots have?”
©2017 McClatchy Washington Bureau Distributed by Tribune Content Agency, LLC.