When robots fight, they don't last long. That's okay, because they're educating and entertaining us as they tear each other apart.

Visiting the 11th annual RoboGames in San Mateo, Calif., on Friday, I was reminded how far we are from true artificial intelligence.
The exhibition hall was filled with robots that had very little in the way of brains. And that's probably for the best, since the humans attending the show were there to watch robots fight and die. Intelligent robots might have refused to serve as mechanical gladiators, negotiated for better pay, or rebelled against their frail yet demanding masters.
Most of the machines at the RoboGames were not robots. A robot is "a machine capable of carrying out a complex series of actions automatically, especially one programmable by a computer," according to Google. Automatic operation is also central to Dictionary.com's definition of the term.
Wikipedia offers a broader definition: "A robot is a mechanical or virtual artificial agent, usually an electro-mechanical machine that is guided by a computer program or electronic circuitry." Within these parameters, software programs and cars qualify as robots.
A robot should be autonomous, but most of the machines at the RoboGames could not act on their own: they needed a human operator with a remote control. Given their cutting blades and flame throwers, they'd be more accurately described as dangerous remote-controlled vehicles. The few I saw with some degree of autonomy still had to be brought in, set up, calibrated, charged, and managed.
Though the competition is billed as the RoboGames, it's really the Competitive Engineering Games. This is human combat by proxy, rather than a battle between robots directed by algorithms. It's as much a measure of the skill of the person handling the remote control as anything else.
That's not to dismiss the creativity and strategic thinking behind these machines, or how well they work as entertainment. It's great fun to watch people compete through their remote controlled creations and to see how engineering decisions shape the outcome of the conflict.
For example, wedge-shaped combatants are common because they can get underneath opponents and flip them while making it difficult for opponents to do the same. But spinning machines can generate huge amounts of force, allowing them to damage opponents severely if they can hit an exposed surface at a suitable angle.
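Spinners are feared for good reason: the energy stored in the rotating mass is released all at once on contact. The numbers below are made up for illustration rather than measured from any machine at the event, but a modest 2 kg disk, 20 cm in radius, spun to 3,000 RPM is already holding roughly 2,000 joules, on the order of a rifle round's muzzle energy.

```python
import math

# Hypothetical spinner, not a measurement of any machine at the event:
# a 2 kg solid disk, 20 cm in radius, spun up to 3,000 RPM.
mass_kg = 2.0
radius_m = 0.20
rpm = 3000

moment_of_inertia = 0.5 * mass_kg * radius_m ** 2       # solid disk: I = (1/2) m r^2
omega_rad_s = rpm * 2 * math.pi / 60                    # angular velocity in rad/s
energy_j = 0.5 * moment_of_inertia * omega_rad_s ** 2   # rotational energy: E = (1/2) I omega^2

print(f"Stored energy: {energy_j:,.0f} J")              # about 1,970 J for these numbers
```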
A spinning disk-shaped robot named The Blender demonstrated the advantages of its design by striking a wedge-shaped bot on the side and sending it flying, disabling it in the process. Another variant on this design involved a spinning wedge with a hammer affixed to it. You really would not want to get hit in the ankle by this thing.
I would love to see robots fight each other without human intervention, provided their tactical choices and limitations were apparent. However, sophisticated, adaptable autonomous action is extraordinarily difficult to program. Robots capable of assessing their opponents and devising an appropriate strategy are likely to be too complicated and expensive for hobbyists and weekend warriors. Even the military prefers human-directed machines.
Those who had the pleasure of playing Muse Software's RobotWar in the early 1980s, or any of the variants on this concept, may share my affinity for contests of code. But competition between software programs tends to be too inscrutable to be interesting to most observers. Hence the RoboGames are contests between people, with fire and violence to hold our attention. Engineering, it turns out, can create interesting entertainment.
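For readers who never saw RobotWar, the premise was that each player wrote a program before the match and the programs then fought without any human input. Here is a minimal, hypothetical sketch in Python of what such a contest of code involves. Everything in it, the Bot class, the arena dimensions, the two strategies, and the damage rules, is invented for illustration; it isn't RobotWar's actual language or rule set.

```python
import math
import random

ARENA = 100          # hypothetical square arena, 100 x 100 units
HIT_RANGE = 5        # distance at which a "weapon" strike lands
HIT_DAMAGE = 10      # damage dealt per strike

class Bot:
    """Toy stand-in for a RobotWar-style combatant: a position, health, and a strategy."""
    def __init__(self, name, x, y, strategy):
        self.name, self.x, self.y = name, x, y
        self.health = 100
        self.strategy = strategy

def charge(me, foe):
    """Aggressive strategy: always move straight toward the opponent."""
    dx, dy = foe.x - me.x, foe.y - me.y
    dist = math.hypot(dx, dy) or 1.0
    return dx / dist, dy / dist

def circle(me, foe):
    """Evasive strategy: keep moving roughly perpendicular to the opponent."""
    dx, dy = foe.x - me.x, foe.y - me.y
    dist = math.hypot(dx, dy) or 1.0
    return -dy / dist + random.uniform(-0.2, 0.2), dx / dist + random.uniform(-0.2, 0.2)

def step(bot, foe):
    """Apply the bot's strategy, keep it inside the arena, and resolve a possible hit."""
    mx, my = bot.strategy(bot, foe)
    bot.x = min(ARENA, max(0, bot.x + mx))
    bot.y = min(ARENA, max(0, bot.y + my))
    if math.hypot(foe.x - bot.x, foe.y - bot.y) <= HIT_RANGE:
        foe.health -= HIT_DAMAGE

def fight(a, b, max_turns=1000):
    """Run the match until one bot is disabled or the turn limit is reached."""
    for turn in range(max_turns):
        step(a, b)
        step(b, a)
        if a.health <= 0 or b.health <= 0:
            break
    winner = a if a.health > b.health else b
    print(f"{winner.name} wins after {turn + 1} turns "
          f"({a.name}: {a.health} hp, {b.name}: {b.health} hp)")

if __name__ == "__main__":
    fight(Bot("Rammer", 10, 10, charge), Bot("Dancer", 90, 90, circle))
```

Even in this toy form, the spectator problem is obvious: the whole fight resolves to a single line of console output, with none of the fire and flying metal that keeps a RoboGames crowd on its feet.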
Machines can do some remarkable things, thanks to advances in artificial intelligence and machine learning. But these feats remain limited to specific tasks or domains. Robots aren't up to making complicated moral choices or sorting through ambiguities. They don't have the benefit of a lifetime of experience and interaction to guide them. How do you encode whether or not an armed sentry robot should fire on a target that may or may not be a child who may or may not have a gun that may or may not be plastic? How do you program an autonomous car to steer when all the available paths end in a collision with a person? And how do you justify that code under cross-examination? These are hard choices, and no one wants to be held responsible for making them. This may help explain why we've kept people in the decision-making loop.
Tech luminaries such as Bill Gates and Elon Musk may believe there's reason to worry about AI run amok, but AI is hard. It's nowhere near being able to operate machines without human intervention for anything beyond specific, narrow tasks. And after seeing the carnage at the RoboGames, I'd be more concerned about machines directed by people.
In the pages that follow, you'll see the reasons for my concern, as well as how robot fights are becoming a spectator sport.
Thomas Claburn has been writing about business and technology since 1996, for publications such as New Architect, PC Computing, InformationWeek, Salon, Wired, and Ziff Davis Smart Business. Before that, he worked in film and television, having earned a not particularly useful ...

