Ford's Autonomous Car: Under The Hood

Ford is building an autonomous car and the technology for the test vehicle is more familiar than you might think.

Curtis Franklin Jr., Senior Editor at Dark Reading

January 18, 2016

5 Min Read
Ford's autonomous vehicle test bed drew a crowd at CES 2016.


Ford has developed an autonomous car to the point that it is testing it in the snow. For some of us, the question isn't whether the company is testing its car in extreme weather, but how it built it in the first place. On the show floor at CES 2016, InformationWeek had a chance to talk with a Ford software engineer to get a look inside the autonomous test vehicle on display.

One thing you should know is that this was a tough interview to get. Not because Ford PR or the engineer was difficult to work with -- quite the contrary. The problems came because people kept walking up to ask questions during the interview. We were interrupted at least half a dozen times by people who wanted to know more about the project -- or who had ideas to offer for making the car better. Whether they intend to buy or not, people are curious about autonomous vehicles and are eager to talk with engineers who can fill them in on details.

Wayne Williams, the Ford software engineer we spoke with, said that the complexity of an autonomous vehicle starts with the need for the system to create an image of the world around it from a constantly changing stream of data. This image is built from sensor data that informs the virtual driver's ability to detect, identify, and react to everything around it.

In Ford's autonomous vehicle test platform, the data from an array of sensors flows through a network to a processing unit. While they probably bear little resemblance to the system that would be in place in a production vehicle, the systems are built around technology that will seem familiar to enterprise IT professionals.

[ Read Obama Proposes $4 Billion Budget For Self-Driving Cars. ]

To begin with, Williams said that the network itself is standard TCP/IP Ethernet. We talked about other automotive networks and the vulnerability that was demonstrated at Black Hat 2015. He said that security is now top-of-mind for anyone working in automotive electronics, and that the network supporting the virtual driver in Ford's autonomous vehicle is both separate from all other networks on the vehicle and from any wireless communications. Williams said that even engineers who want to update the system have to do so through a physical connection with the vehicle.


The sensors that make up the bulk of the nodes on the network are a collection of video, ultra-sound, and other input devices that combine to create a complete multidimensional image of the world around the vehicle. Williams said that the sensors are both critical to the success of the project and one of the key technology challenges for engineers. He indicated that stitching the pieces of the world-defining image together is a problem that is well-defined and solvable. The issue is getting data for the image that is of sufficient quality and resolution to allow the electronic image to be complete.

The virtual driver itself lives in a Linux cluster that sits in the trunk of the test vehicle. Williams said that the cluster is five nodes running Ubuntu Linux. Multiple nodes are required to handle all the sensor input and process it quickly enough to make driving decisions. Asked why there are five nodes in the cluster, Williams was succinct. "That's all that would fit in the trunk," he said.


As for the software running on the cluster, that's created using standard technology as well. Williams said that C has been used for programming the various applications that make up the virtual driver. When I asked why a more modern or specialized language wasn't used, Williams smiled. "They weren't part of the specification," he explained.

During the conversation, one CES attendee walked up and began asking why particular bits of functionality and code weren't lifted from driver assist programs to advance the autonomous vehicle program. Williams was quick to explain that assisting a human driver and creating a virtual driver are two distinct problems that share far less, electronically or conceptually, than it might seem at the outset. He said that the two programs are distinct, with separate management and development teams.


When asked about any sort of timeline to market, Williams demurred. He is an engineer working on solving a problem. Decisions about turning a solution into a product involve many factors that have nothing to do with engineering.

When asked about the most significant problems that must be solved before the autonomous car can be considered ready to be a product, he was quick to answer. "The remaining issues are societal. Sensors are low-hanging and gainable fruit. Algorithms are going to evolve to be road-ready over the next four to five years," he said. "Government, industry, and consumer groups must be in a discussion over who takes responsibility if things go wrong. For them, the question is 'What's the best, worst answer I can find?'" It's the sort of question that no number of sensors can inform.


About the Author(s)

Curtis Franklin Jr.

Senior Editor at Dark Reading

Curtis Franklin Jr. is Senior Editor at Dark Reading. In this role he focuses on product and technology coverage for the publication. In addition he works on audio and video programming for Dark Reading and contributes to activities at Interop ITX, Black Hat, INsecurity, and other conferences.

Previously he was editor of Light Reading's Security Now and executive editor, technology, at InformationWeek where he was also executive producer of InformationWeek's online radio and podcast episodes.

Curtis has been writing about technologies and products in computing and networking since the early 1980s. He has contributed to a number of technology-industry publications including Enterprise Efficiency, ChannelWeb, Network Computing, InfoWorld, PCWorld, Dark Reading, and ITWorld.com on subjects ranging from mobile enterprise computing to enterprise security and wireless networking.

Curtis is the author of thousands of articles, the co-author of five books, and has been a frequent speaker at computer and networking industry conferences across North America and Europe. His most popular book, The Absolute Beginner's Guide to Podcasting, with co-author George Colombo, was published by Que Books. His most recent book, Cloud Computing: Technologies and Strategies of the Ubiquitous Data Center, with co-author Brian Chee, was released in April 2010. His next book, Securing the Cloud: Security Strategies for the Ubiquitous Data Center, with co-author Brian Chee, is scheduled for release in the Fall of 2018.

When he's not writing, Curtis is a painter, photographer, cook, and multi-instrumentalist musician. He is active in amateur radio (KG4GWA), scuba diving, stand-up paddleboarding, and is a certified Florida Master Naturalist.

