
Macro machines

The phrase “mission creep” entered the popular discourse in the early to mid-1990s. It popped up in a number of big papers in articles describing the Somali Civil War. Here’s the New York Times in October ’93:

The trick is to do this without inviting what a senior official called “mission creep” — the expansion of the role to include, for example, raiding neighborhoods controlled by General Aidid and searching for weapons.

Like countless military and sports terms before it, we now understand it in a broader context. It’s one of those phrases that perfectly encapsulates a commonly understood experience — projects whose size, scope and focus shift so gradually you hardly even notice. I bring this up in the context of an op-ed the Electronic Frontier Foundation published last year.

“Mission creep is very real,” analyst Matthew Guariglia wrote in July 2021. “Time and time again, technologies given to police to use only in the most extreme circumstances make their way onto streets during protests or to respond to petty crime. For example, cell site simulators (often called “Stingrays”) were developed for use in foreign battlefields, brought home in the name of fighting “terrorism,” then used by law enforcement to catch immigrants and a man who stole $57 worth of food. Likewise, police have targeted BLM protesters with face surveillance and Amazon Ring doorbell cameras.”

Last week I wrote a story titled “It’s time to talk about killer robots.” On reflection, I wonder whether “We missed the boat on killer robots” might have been the better title. Certainly I stand by the “They already walk among us” subtitle. In the piece, I talk about a precedent set by military drones decades ago. I also note the minor firestorm that arose after Boston Dynamics showed videos of Spot being used in Massachusetts State Police hostage drills. The ACLU told us at the time:

We urgently need more transparency from government agencies, who should be upfront with the public about their plans to test and deploy new technologies. We also need statewide regulations to protect civil liberties, civil rights, and racial justice in the age of artificial intelligence.

Police advance on demonstrators who are protesting the killing of George Floyd on May 30, 2020, in Minneapolis, Minnesota. Former Minneapolis police officer Derek Chauvin was arrested for Floyd’s death and is accused of kneeling on Floyd’s neck as Floyd pleaded with him that he could not breathe. Floyd was pronounced dead a short while later. Chauvin and three other officers, who were involved in the arrest, were fired from the police department after a video of the arrest was circulated. Image Credits: Scott Olson/Getty Images

San Francisco Board of Supervisors president Shamann Walton made a comment at a meeting Tuesday night that echoes the ACLU’s sentiment. “We continuously are being asked to do things in the name of increasing weaponry and opportunities for negative interaction between the police department and people of color,” he said. “This is just one of those things.”

Walton was referring to the Board's 8-3 vote to approve the use of robots for lethal force. The initial proposal prompted last week's writeup. The vote itself stems from Assembly Bill 481, which California governor Gavin Newsom signed into law in September of last year. The bill was designed to bring more transparency to police use of military equipment — much of which flows from the 1033 Program, created by the 1997 National Defense Authorization Act, which allows for the military's "transfer of excess personal property to support law enforcement activities," originally for the sake of drug enforcement. The subject jumped back into public consciousness recently, courtesy of various large-scale protests over the past couple of years.

The relevant part of Assembly Bill 481 requires a written inventory of the military equipment utilized by law enforcement. The SFPD's list includes the Lenco BearCat armored vehicle, flash-bang grenades and 15 submachine guns, among other items. There are also 17 robots listed — 12 of them fully functional. None, it should be noted, was designed for killing. Quite the opposite, in fact. They're mostly bomb detection and disposal robots — the kind police departments have deployed for years.

The proposal sent to the SF Board this week notes:

The robots listed in this section shall not be utilized outside of training and simulations, criminal apprehensions, critical incidents, exigent circumstances, executing a warrant or during suspicious device assessment. Robots will only be used as a deadly force option when risk of loss of life to members of the public or officers is imminent and outweighs any other force option available to SFPD.

That last little bit is the sticking point. It’s effectively an authorization for the use of robots to kill. I will say, I believe it falls under the standard definition of “justified” deadly force, which allows officers to shoot to kill in cases of self-defense or where others are facing death or serious bodily harm.

Earlier language stating that "Robots shall not be used as a Use of Force against any person" was reportedly struck from the proposal by the SFPD.

Following the vote, department spokesperson Allison Maxie echoed the language of the initial proposal, stating, "Robots equipped in this manner would only be used in extreme circumstances to save or prevent further loss of innocent lives." She then outlined a specific use case: equipping one of the existing robots with an explosive to kill a suspect — the polar opposite of the machines' intended purpose of detecting and disabling explosives.

Yes, the precedent exists. In 2016, Dallas police used a bomb disposal robot to intentionally kill a suspect — believed to be the first time that had happened in the U.S. Police Chief David Brown told the press, "We saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was."

So, mission creep. “Robots will only be used as a deadly force option when risk of loss of life to members of the public or officers is imminent and outweighs any other force option available to SFPD.” There are a lot of ethical questions we need to be asking right now, but let’s start with this: Are bomb disposal robots armed with bombs an end point or the beginning of something even more troubling?

In recent years, we've seen guns mounted on robot dogs designed for the battlefield. We've also seen plenty of battlefield products deployed domestically. For now, this particular leap is purely hypothetical — but it's important to have the conversation before it becomes something more than that. "Mission creep" is a fitting phrase in this context, given its own military origins.

Maxie, it should be noted, rejected the idea of robots armed with guns, saying it isn't in the SFPD's plans. I'm in no position to say whether San Francisco will arm robots with guns at some point. I will only say that as the years pass, these scenarios feel less and less like speculative fiction.

A handful of prominent robotics firms — including Boston Dynamics, Agility, ANYbotics, Clearpath Robotics and Open Robotics — recently signed a letter condemning the weaponization of “general purpose” robots. It reads, in part:

We believe that adding weapons to robots that are remotely or autonomously operated, widely available to the public, and capable of navigating to previously inaccessible locations where people live and work, raises new risks of harm and serious ethical issues. Weaponized applications of these newly-capable robots will also harm public trust in the technology in ways that damage the tremendous benefits they will bring to society.

I'm quite aware that not everyone feels the same way I do about weaponized robots, and I anticipate some pushback here. I've had numerous conversations with people working in robotics who don't draw the same ethical distinctions. The argument that sending robots into harm's way reduces the risk to police officers and soldiers has merit. But I'm also inclined to believe that remote operations can have the effect of dehumanizing suspects. Another important question: Does taking the human out of the situation lower the burden of pulling the trigger?

The answer is complicated because human brains are complicated. Some very interesting studies on the emotional impact of remote killing are being published. Quoting here from “Emotional Reactions to Killing in Remotely Piloted Aircraft Crewmembers [sic] During and Following Weapon Strikes” published in the March 2018 issue of “Military Behavioral Health”:

The majority (76%) of [Predator/Reaper] crewmembers reported experiencing both positive and negative emotions in response to weapon-strike missions, suggesting that engagement in remote combat operations is emotionally complex.

The paper concludes that the picture of drone missions carrying all the emotional heft of a video game is misleading. And what of the potential biases highlighted by Walton and others above? Not to mention the visceral fear response that weaponized robots roaming the streets would provoke.

I will say, I was surprised this passed in San Francisco. It seems like a strange fit for a city that has so long been held up as a bastion of progressive values. Across the Bay last month, a similar proposal was scrapped due to public backlash. What is less surprising, however, is that some board members were clearly concerned about the optics of being painted as anti-cop.

“I think there’s larger questions raised when progressives and progressive policies start looking to the public like they are anti-police,” board member Rafael Mandelman noted during the meeting. “I think that is bad for progressives. I think it’s bad for this Board of Supervisors. I think it’s bad for Democrats nationally.”

I suppose there's an extent to which a politician's job is choosing the least bad among various bad optics. For San Francisco Board members, that meant accepting the proposal and, perhaps, hoping potential mission creep doesn't eventually put the city on the business end of an autonomous rifle.

I can’t imagine this is going to be the last time we discuss this subject in these pages. Stay tuned.

Image Credits: Locus Robotics

Raise news has slowed down a bit. I suspect this is a combination of (1) the dead space between Thanksgiving and Christmas here in the U.S., and (2) the broader slowdown in VC as economic headwinds persist.

I anticipate asking anyone with funding news during this period the same basic questions, including "Why now?" and, more broadly, "How tough is this macroenvironment?" Here's what Locus Robotics CEO Rick Faulk had to say on the subject:

In today’s environment, investors are focused on high-quality companies that have both strong growth/market leadership and business unit economics. Therefore, it is important to have a track record and forecast that supports both. Late-stage private companies are competing with beaten down public companies for investment dollars.

Companies are focused on improving operational efficiency and Locus assists with exactly that…therefore, there is strong excitement around being a differentiated solution in a very large end market. The raise will enable Locus to continue to extend its leadership in the market.

Certainly external forces weren't enough to slow Locus' already massive fundraising. A new $117 million Series F led by Goldman Sachs, G2 Venture Partners and Stack brings the company's total north of $400 million, while placing its valuation "close to" $2 billion. In the crowded fulfillment category, the Massachusetts company has made a name for itself with flexible, brownfield robotic systems that adapt easily to existing warehouse environments.


Image Credits: Bionaut Labs

And there was a $43.2 million Series B for Bionaut Labs. Based in Los Angeles, the firm is working to commercialize research into magnetically controlled microrobots that can deliver drugs to the midbrain — a more direct approach than standard systemic delivery (intravenous, oral, etc.).

The company’s co-founders, Michael Shpigelmacher and Aviad Maizels, were both involved in PrimeSense, which developed the 3D imaging technology behind Microsoft’s Kinect. Apple acquired the firm in 2013 and has since used its tech as the basis for Face ID.

“There has been a dearth of innovation around treatments for conditions that cause tremendous suffering, in large part because past failures have discouraged even the best of researchers,” Shpigelmacher says about this latest round. “Bionaut Labs remains committed to finding new ways to treat these devastating diseases, which are long overdue for a breakthrough.”

This round will go toward development of a treatment for malignant glioma brain tumors and Dandy-Walker syndrome, as well as future R&D. Bionaut has set a 2023 timeline for preclinical studies, with human trials potentially following a year later.

Bay Area agtech firm Verdant Robotics, meanwhile, recently closed a Series A. Following an $11.5 million 2019 round (initially reported as a Series A, which the company now refers to as its seed), Verdant has raised $46.5 million to date. The company's aim is to be more than simply a robotic weeder, adding fertilization and pest control to its offering, along with the kind of data collection and analysis that's really the secret sauce for these sorts of systems.

Image Credits: Owlchemy Labs

All right, that's it for this week — and for the next few. Your boy is finally getting some of his vacation days in before the end of the year and boldly exploring what it might feel like to not be burned out all the time. I'll let you know how it goes. In the meantime, I've spoken to bigwig industry folks about their thoughts on 2022, 2023 and beyond. We'll be publishing those in my stead over the next few weeks.

See you in just under a month when I go into full CES panic mode.

Image Credits: Bryce Durbin/TechCrunch

Give yourself the gift of Actuator this holiday season.

