Extremely Mobile Devices
Silicon Valley Smart Car: Some cars already require more code to run than a commercial jet, and they will increasingly use that brainpower to take control of braking, steering and acceleration. By 2030, one engineer predicts, we’ll be summoning driverless cars by cellphone to come pick us up at the airport. Nick Kaloterakis
“You can grip the wheel very loosely,” the BMW engineer told me as I settled into the driver’s seat of the BMW Track Trainer. “Very loosely, to get a feel for how it is turning. But do not touch the pedals.” I detected in his tone an “unless” on the way. “Unless I yell stop! In which case you should grip the wheel tightly and stomp on the brakes.” He smiled. “Shall we go?”
With that, I released the brake and sat back as our unassuming 3 Series sedan accelerated of its own volition down a short straightaway, whipped ably into a right-hander, and then moved wide to set itself up for a fast curve to the left. I was, as instructed, holding on ever-so-slightly, but that felt weirder than just watching the wheel turn on its own, as if I were sitting in the lap of a ghost driver—which is pretty much what I was doing.
The BMW Track Trainer is a robot car: a fully autonomous automobile capable of racing the Mazda Raceway Laguna Seca in California’s Monterey County (or any other track it’s been programmed to run) at the limit of traction, mere seconds off the time a professional would run in the same model. BMW uses it to train drivers by showing them how the perfect racing line feels from the driver’s seat and by providing real-time feedback, with corrections, once they decide to take over the controls themselves. But the car is also a showcase for BMW’s Driver Assistance System, a series of radar and GPS sensors that work in concert with computer-operated steering, brake and power systems to achieve what BMW describes as “highly autonomous driving.”
Robo-Coach: BMW uses its Track Trainer, a self-driving sedan, to teach racers how to make optimal turns and engineers how to make optimal drive systems. Courtesy BMW
BMW chose Laguna Seca because it is a difficult track, which makes barreling into turns at 100 mph all the more impressive, and because it’s a short drive from the company’s research lab in Silicon Valley, where engineers are busy reinventing the automobile for the information age. Since 1978, when microprocessors were first installed in the trip odometer of a Cadillac Seville, the number of chips in the average automobile has grown such that cars now contain anywhere from 50 to 200 processors and a mile of wiring. The increasing prevalence of hybrid and electric cars is accelerating that trend; the plug-in electric Chevrolet Volt, for example, requires 10 million lines of code, two million more than it takes to run a Boeing 787. So carmakers are coming to Silicon Valley, where code is king.
Mercedes-Benz opened a technology center here in 1995, BMW in 1998, Volkswagen in 1998, Toyota in 2001, General Motors in 2007, and Renault-Nissan in the past year—all in large part to tap the skills of the designers, developers and engineers who have so ably sustained Google, Apple and Facebook. Include homegrown start-ups Tesla Motors, Mission Motors and the autonomous-car division at Google itself, and the result is a sort of Detroit West, where California engineers continue to devise new ways to make powerful, affordable, easy-to-use computers—but now they also devise new ways to make them move very, very fast.
Exactly how I felt about all this is something I was chewing on when the Track Trainer crested the hill that leads into Laguna Seca’s infamous “corkscrew.” I had to trust that this robot racecar would remember how to negotiate one of the trickiest and most dangerous corners in the world, a hard left followed immediately by a hard right on a stretch of track that drops five and a half stories in 450 feet. Cresting the hill, the car managed not to panic and brake too soon, as humans tend to do. In fact, as we plunged into the turn, I thought for one terrifying moment that the car wasn’t going to brake at all—until it did, with perfect timing. As we safely exited, I realized I’d just hitched a brief ride into the future.
Silicon Valley is a surprisingly big place. Getting around requires a lot of driving, which on California’s well-maintained roads is pleasant enough even without robot assistance. And as I drove my rental car from lab to lab, interesting relationships began to reveal themselves.
The engineers at the Volkswagen Electronics Research Laboratory (ERL), for instance, work in a white midrise office building just across a narrow marshy river from the headquarters of Oracle, the company best known for its database-management software. Managing data seemed to be about as different from what automakers do as any pursuit could be. But when ERL’s deputy director, an electrical engineer named Chuhee Lee, met me at the lab, he made it clear that this was not at all the case.
In a second-floor conference room, Lee launched a PowerPoint presentation that he had used many times to justify his lab’s existence to managers back in Wolfsburg. Combining data, it turned out, is the essence of new car design. Car engineers had long thought of the various data devices they installed—navigation systems, smartphone adapters, lane-detecting cameras—as independent gadgets with narrowly tailored functions. Now they’re beginning to link these devices to one another, to connect the data from a car’s many sensors and processors. And like the engineers at Oracle, they’ve found great value in these connections.
The Climber: “Shelley,” an autonomous Audi TTS Roadster, used differential GPS and gyroscopic data to navigate the 12.4-mile, 156-turn Pikes Peak road course. Courtesy Audi
Most obviously, ERL engineers have used those connections to build a series of prize-winning robot cars not unlike the BMW Track Trainer. In 2005, a Touareg modified by ERL in conjunction with the Stanford Artificial Intelligence Laboratory won the DARPA Grand Challenge, a Pentagon-sponsored desert race for driverless vehicles. In 2007, ERL’s robot Passat took second in DARPA’s Urban Challenge, an obstacle-course competition.
And last fall, a lab-modified Audi TTS self-navigated the entire 12.4-mile Pikes Peak course in Colorado in just 27 minutes, reaching speeds of up to 45 mph. I asked Lee how the kind of smarts on display in all these cars would first reach regular drivers. He played a short video for me that explained the lab’s work on what its engineers call the Affective Intelligent Driving Assistant.
The product of a joint venture with two Massachusetts Institute of Technology labs, AIDA feeds inputs from multiple sensors to a central artificial intelligence that “observes” your habits and behaviors and tailors your car’s performance to them. AIDA can learn your favorite routes and stops, remember and remind you of important events, and over time anticipate other desires; it might know, for instance, which day you like to go to the grocery store because that’s when the wild Alaskan salmon arrives.
In other words, improvements in artificial intelligence will turn the car into a personal assistant. In time, we could even leave the driving to the assistant, because the sensors and software being developed for such applications will add up, the technology will evolve, and a difference in degree will become a difference in kind. “The idea is to change the relationship between human and machine,” Lee says. By 2030, cars could be smart enough that we’ll summon them to pick us up at the airport.
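The article doesn’t describe AIDA’s internals, but the habit-learning it sketches—noticing which day you visit the grocery store—can be illustrated with a toy model. Everything below (the `HabitModel` class, its method names, the sample trips) is hypothetical, a minimal sketch of frequency-based destination prediction, not AIDA’s actual design:

```python
from collections import Counter, defaultdict

class HabitModel:
    """Toy sketch of AIDA-style habit learning: tally where the car
    goes on each weekday, then suggest the most frequent destination."""

    def __init__(self):
        # weekday -> Counter of destination visit counts
        self.visits = defaultdict(Counter)

    def record_trip(self, weekday, destination):
        self.visits[weekday][destination] += 1

    def suggest(self, weekday):
        counts = self.visits[weekday]
        if not counts:
            return None  # no history yet for this day
        return counts.most_common(1)[0][0]

model = HabitModel()
for _ in range(3):
    model.record_trip("Thursday", "grocery store")
model.record_trip("Thursday", "gym")
print(model.suggest("Thursday"))  # prints "grocery store"
```

A real assistant would weigh recency, time of day and context, but the core idea—mining routine from trip logs—is the same.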
Phone Home: Even as they become sleeker (like this Nissan iV concept), cars are getting smarter. Nissan
Engineers have already overcome most of the physical challenges. Computer processors regularly take control of the braking, steering and acceleration in many current high-end production models—such as when a stability-control system prevents drivers from spinning out on a wet road—and these same high-end cars are also increasingly encrusted with sensors (cameras, radar, LIDAR, infrared, ultrasonic) that gather data to feed those processors.
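The stability-control example above can be made concrete. A greatly simplified sketch, not any manufacturer’s implementation: estimate the yaw rate the driver is asking for from steering angle and speed (a kinematic bicycle model), compare it with what the yaw sensor actually measures, and name a single-wheel braking action if they diverge. All constants and thresholds here are illustrative, not production values:

```python
import math

def stability_intervention(steering_angle_rad, speed_mps, measured_yaw_rate,
                           wheelbase_m=2.8, threshold=0.1):
    """Simplified electronic-stability-control decision (illustrative only)."""
    # Yaw rate the driver is requesting, per a kinematic bicycle model
    intended = speed_mps * math.tan(steering_angle_rad) / wheelbase_m
    error = measured_yaw_rate - intended
    if abs(error) < threshold:
        return "no intervention"
    if abs(measured_yaw_rate) < abs(intended):
        # Car is turning less than asked: understeer -> brake inner rear wheel
        return "brake inner rear"
    # Car is rotating more than asked: oversteer -> brake outer front wheel
    return "brake outer front"

# Driving straight, no rotation: nothing to do
print(stability_intervention(0.0, 30.0, 0.0))  # prints "no intervention"
```

Production systems run this loop many times per second and also modulate engine torque, but the compare-and-correct structure is the essence of how processors "take control of the braking."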
The car will eventually know where it is and where it is going, and perhaps even how it will get there. Within a few years, differential GPS, which uses fixed ground stations to correct inaccuracies in satellite signals, will allow a car to reliably determine its location to within a few inches. Put these together, and pretty soon you have a Track Trainer that requires no engineer riding shotgun. It will be parked in your garage.
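The differential-GPS trick described above is simple arithmetic at heart: a ground station at a precisely surveyed location measures the momentary GPS error (shared atmospheric and clock effects), and nearby receivers subtract that same error from their own fix. A minimal sketch, using flat (x, y) meters rather than real geodetic coordinates:

```python
def dgps_correct(rover_fix, station_fix, station_truth):
    """Apply a ground station's measured GPS error to a rover's raw fix.
    Coordinates are simplified to flat (x, y) metres for illustration."""
    # Error the station sees right now, shared by nearby receivers
    err_x = station_fix[0] - station_truth[0]
    err_y = station_fix[1] - station_truth[1]
    # Subtract the same error from the rover's reading
    return (rover_fix[0] - err_x, rover_fix[1] - err_y)

# Station knows it sits at (0, 0) but its GPS reads (1.5, -0.8),
# so today's shared error is (1.5, -0.8); correct the car's fix with it.
corrected = dgps_correct((101.5, 49.2), (1.5, -0.8), (0.0, 0.0))
print(corrected)  # corrected fix, close to (100.0, 50.0)
```

Real differential GPS corrects individual satellite pseudoranges rather than final positions, but the subtract-the-shared-error principle is the same.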
Cars are not especially good at learning right now, but engineers are working on that too. Rob Passaro has worked at BMW’s Group Technology Office in Silicon Valley since it opened in 1998, when the auto industry’s idea of an IT revolution was a car that could play MP3s. When I met him in the “office’s” spotless garage, though, he quickly explained that his primary mission was to “open the car as a platform for applications.” Cars are the most thoroughly computerized machines most of us will ever buy, he said, but unlike phones or laptops, they are nearly impossible to upgrade—you pay your money and then drive the thing unchanged until it’s scrapped. But connect a car to the Internet, and the possibilities become more interesting.
Passaro plopped a white iPhone into a cradle in the center console of a 5 Series sedan to demonstrate BMW Apps, a system available on all BMWs produced after March 2011 that connects the car to a website from which the driver can download BMW-specific iPhone apps. For now, BMW offers only customized versions of already-popular apps from companies like Pandora and Facebook. The interesting thing about these apps is not that they exist, however, but where they exist. They show up on the dashboard display, not on the iPhone, and installing them means customizing software that car companies have traditionally treated as an unalterable, untouchable secret. Car companies are skittish about the possibility, but it seems inevitable that someone will eventually invent apps that work their way much further into the car’s vital functions—all the way, perhaps, into the fuel-injection or lane-detection systems.
Cars won’t just talk to the Internet. They will also gather information from their immediate surroundings. After Passaro finished his demo, he handed me off to another engineer, Darren Liccardo, who walked me out of the garage and into a wide, mostly empty parking lot surrounded by giant hedges.
A prototype 5 Series awaited. Its trunk was packed with off-the-shelf computer hardware running a popular open-source framework called ROS, for Robot Operating System, which is used in everything from housecleaning robots to self-piloting helicopters. In this case, it would help the car handle a basic traffic problem—negotiating a stoplight. After a drive around the Technology Office, Liccardo pulled back into the parking lot, stopped the car, drew a keyboard out from under his seat, and typed a few commands. A video-camera image of a traffic signal mounted at the back end of the parking lot appeared on the console screen. “This is what we call smart cars meet smart traffic lights,” he said.
The traffic signal had been modified to communicate with our car over a wireless Internet connection. Liccardo pointed to the console screen. The light was red, but the screen displayed a countdown clock ticking off the seconds until it would turn green. He stepped on the gas, steered the car toward the red traffic light, and, confident that his vehicle-to-infrastructure communication system would let him know exactly when the light would change, accelerated. The light turned green, and we blew through it without slowing down.
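The go/no-go arithmetic behind that demo is straightforward: if the light’s broadcast countdown will expire before the car arrives, the car can hold its speed and roll through on green. A sketch of that decision, with an invented safety margin (the real system’s logic and parameters aren’t described in the article):

```python
def can_cruise_through(distance_m, speed_mps, seconds_to_green, margin_s=0.5):
    """Decide whether to hold speed toward a red light that has broadcast
    its countdown. Returns True if we'd arrive after it turns green,
    with a small illustrative safety margin."""
    if speed_mps <= 0:
        return False
    time_to_light = distance_m / speed_mps
    return time_to_light >= seconds_to_green + margin_s

# 200 m from the light at 15 m/s: arrival in ~13.3 s, green in 10 s
print(can_cruise_through(200, 15, 10))  # prints True: keep cruising
# Only 50 m out: we'd reach the light in ~3.3 s, while it's still red
print(can_cruise_through(50, 15, 10))   # prints False: brake instead
```

The payoff of vehicle-to-infrastructure signaling is exactly this: with the light’s timing known in advance, the car can avoid unnecessary stops rather than react to the lamp’s color.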