Wednesday, September 25, 2013

How Nissan Will Roll Out Self-Driving Cars: Fricking Lasers

As reported by ReadWrite: It was an improbably futuristic scene: A man standing on a sunbaked tarmac in Irvine, Calif., next to a Nissan Leaf electric car, pushed a button on the hatchback’s key fob. The Leaf, unassisted by human intervention or preprogrammed maps, crawled at about five miles per hour through rows of parked vehicles, detected an SUV pulling out of a space, paused, and allowed the SUV to pull away. Then it moved past the now-vacated parking spot, slowed into position, glided back into the space, and powered down.

A moment later, the man pushed the button again, and the Leaf fetched itself, reversing its previous steps, and returned to the man’s side.

This isn't science fiction. I watched this all myself, dumbfounded, just a little over a week ago.

Was this self-parking demonstration a bit of razzle-dazzle that will never make it into vehicles on dealer lots? Maybe not.

To witness this scene, I drove 45 miles in a 2014 Infiniti Q50 sedan from LAX to the decommissioned El Toro Marine Corps Air Station. (That's where Nissan held its month-long Nissan 360 technology showcase.) The Q50 was equipped with the luxury car’s $3,200 tech package, which pushes the nicely appointed vehicle’s price over $50,000.

The relevant features of the teched-up Q50 are Intelligent Cruise Control and Active Lane Control. The technology allowed me to travel at highway speeds along short, straight stretches of the 405 and the 5, with my foot off the pedals and my hands at my sides.

Take that, Google! The search giant is investing an unknown amount in self-driving cars, and its prototypes have already logged hundreds of thousands of miles. Google promises to offer the technology to consumers by 2018, but the Q50 is on sale today.

Proto-Automation

The Q50’s camera located in front of the rearview mirror, along with its image-processing system, can read lines and dashes on the roadway.  When the vehicle gets close to the white paint separating lanes, the car gently nudges the steering wheel in the direction of safety.  But here’s a problem that I experienced: When the car approached the white line to the left, it overcorrected, sending me across the lane to the right-side boundary, where the camera and computer nudged me back again across the lane to the left line.  With my hands off the steering wheel, the Q50 became a careening, 3,500-pound ping-pong ball. 
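
That ping-pong behavior suggests a correction that only engages near the painted lines and then pushes back with a fixed amount of authority. Below is a minimal C++ sketch contrasting such a threshold rule with a proportional correction toward lane center, in a deliberately crude first-order model; the gains, thresholds, and function names are my own illustration, not Infiniti's actual control code.

```cpp
// Illustrative only -- not Infiniti's control code. A threshold "nudge away from
// the line" rule is compared with a proportional correction toward lane center,
// using a crude first-order model of lateral motion.
#include <cstdio>

// lateralOffset: metres right of lane center.
// Both controllers return a lateral correction rate in metres per second.

// Threshold style: do nothing until the car nears a painted line, then apply a
// fixed push back. The car ends up riding near a boundary (or, with too strong
// a push, lurching toward the other line) rather than staying centered.
double nudgeController(double lateralOffset) {
    const double trigger = 1.2;   // assumed offset that triggers a nudge, metres
    const double push    = 0.5;   // assumed fixed correction, m/s
    if (lateralOffset >  trigger) return -push;
    if (lateralOffset < -trigger) return  push;
    return 0.0;
}

// Proportional style: the correction scales with the offset, so the car settles
// near lane center instead of bouncing between the white lines.
double proportionalController(double lateralOffset) {
    const double gain = 0.8;      // assumed 1/s; larger means firmer centering
    return -gain * lateralOffset;
}

int main() {
    const double dt    = 0.1;     // seconds per step
    const double drift = 0.2;     // constant rightward drift (crosswind, road crown), m/s
    double offset = 1.0;          // start 1 m right of lane center

    for (int step = 0; step < 100; ++step) {
        // Swap in nudgeController(offset) to see the boundary-hugging behavior.
        offset += (drift + proportionalController(offset)) * dt;
    }
    std::printf("offset after 10 s: %.2f m\n", offset);
    return 0;
}
```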

In fairness, the visual guidance technology in the Q50 is not meant to fully automate driving.  It’s intended to play an assist role, which according to Infiniti—Nissan’s upscale division—reduces driver fatigue and otherwise enhances the vehicle’s luxury feel. It worked as intended.

Similarly, the Q50’s Forward Assist technology was effective.  Set the cruise control to, say, 65 miles per hour, and lift your foot off the accelerator.  That’s plain ol’ cruise control, right?  But thanks to a radar system behind the front bumper, the car can detect the speed of cars ahead in the same lane, and automatically slow down the Q50 to match their pace—all the way down to a complete stop, only to resume acceleration when the car ahead gets going. This is an increasingly common automotive feature, usually called adaptive cruise control.  A related safety feature rapidly and automatically applies brakes when the vehicle in front comes to a screeching halt.
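
In code, that behavior reduces to a simple rule each control cycle: hold the set speed when the lane ahead is clear, otherwise follow the slower of the set speed and the lead car's speed, all the way down to zero. The C++ sketch below assumes a radar report of gap and lead-vehicle speed; the types, names, and thresholds are illustrative, not Nissan's implementation.

```cpp
// A minimal sketch of adaptive-cruise-control speed selection. The RadarTrack
// type, the 20 m minimum gap, and the 2 m/s back-off are illustrative assumptions.
#include <algorithm>
#include <cstdio>
#include <optional>

struct RadarTrack {
    double gapMeters;      // distance to the lead vehicle in this lane
    double leadSpeedMps;   // lead vehicle's speed, metres per second
};

// Returns the speed (m/s) the car should hold this control cycle.
double targetSpeed(double setSpeedMps,                     // driver-selected cruise speed
                   const std::optional<RadarTrack>& lead)  // empty when the lane ahead is clear
{
    if (!lead) {
        return setSpeedMps;                       // plain ol' cruise control
    }
    const double minGapMeters = 20.0;
    if (lead->gapMeters < minGapMeters) {
        // Too close: drop below the lead's speed to reopen the gap,
        // all the way down to a complete stop if the lead has stopped.
        return std::max(0.0, lead->leadSpeedMps - 2.0);
    }
    // Otherwise follow the slower of the set speed and the lead's speed; when the
    // lead speeds up or pulls away, this naturally resumes the set speed.
    return std::min(setSpeedMps, lead->leadSpeedMps);
}

int main() {
    // Cruise set to 65 mph (~29 m/s) with a car 30 m ahead doing 20 m/s: match its pace.
    std::printf("%.1f m/s\n", targetSpeed(29.0, RadarTrack{30.0, 20.0}));
    // Lane clear: hold the set speed.
    std::printf("%.1f m/s\n", targetSpeed(29.0, std::nullopt));
    return 0;
}
```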

Driving Back to the Future

These early manifestations of autonomous driving technologies already seem unremarkable.  But what’s surprising is that the fully automated Leaf on display in Irvine uses the same exact camera, image-processing technology, and radar found in the Q50.

“To find objects that are approaching from far away very fast, radar is the best technology,” explained Tetsuya Iijima, general manager of intelligent transportation systems engineering at Nissan. “But unlike the driver-assisting features on the Q50, fully automated technology can’t make any excuses to the customer.”

So Iijima and his team of engineers employ more serious automagical mojo: six laser scanners that surround the car.  And not just the fixed broad-beam or one-dimensional lasers already used in auto-safety systems from Continental and other suppliers. These are three-dimensional ones that scan left, right, up, and down, to make a full spatial rendering of all road objects on the fly.  Three radars are still used, one in front and two in back, as well as five cameras that can read speed-limit signs (to modulate speed according to the highway rules) and the color of traffic signals (to know when to stop and go at an intersection).

Add 12 sonars, and you now have a Leaf electric car that can travel autonomously and safely on highways—and do that cool robotic-parking trick as well.  Iijima demonstrated those two feats in two separate vehicles—each equipped with precisely the same hardware, but programmed for either highway travel or automated parking.  Nissan executives said that these automated features will go on sale in 2020—and will become available a few years later in a wide range of models.
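
To make that inventory concrete, here is a hedged C++ sketch of one perception snapshot built from the suite described above: six 3-D laser scanners, three radars, five cameras, and twelve sonars. The sensor counts come from the article; the types and field names are my own illustration of the architecture, not Nissan's software.

```cpp
// Illustrative data layout only -- the sensor counts come from the article,
// but the types and field names are assumptions, not Nissan's code.
#include <array>
#include <vector>

struct Point3D { double x, y, z; };            // one return from a 3-D laser scan

struct LaserScanner {                          // six, fixed in corner panels and rear doors
    std::vector<Point3D> sweep;                // left/right/up/down spatial rendering
};

struct RadarReading {                          // three: one front, two rear
    double rangeMeters;
    double closingSpeedMps;                    // radar's strength: fast, distant objects
};

struct CameraFrame {                           // five: lane lines, speed-limit signs,
    int    postedLimitKph;                     // and traffic-signal color
    bool   signalIsGreen;
};

struct SonarPing { double rangeMeters; };      // twelve: short-range, e.g. parking clearances

// One snapshot of everything the autonomous Leaf "sees" at an instant.
struct PerceptionSnapshot {
    std::array<LaserScanner, 6>  lasers;
    std::array<RadarReading, 3>  radars;
    std::array<CameraFrame, 5>   cameras;
    std::array<SonarPing, 12>    sonars;
};

int main() {
    PerceptionSnapshot snapshot{};             // real data would come from the hardware
    (void)snapshot;
    return 0;
}
```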

The Secret Sauce: Fricking Laser Scanners

Several carmakers already offer features similar to the ones available in the Infiniti Q50, and are making claims about fully automated driving coming in the not-too-distant future—although most do not give timetables.

The reason Nissan thinks it can set a date is that it has committed to laser technology.

“We believe that we are leading this technology,” said Iijima. “Other companies still have not decided to use a laser scanner. We have come to the conclusion that laser scanners are required. The image is a regular three-dimensional picture. Each point has depth information.”

The Google car relies on a relatively large roof-mounted LIDAR system, using 64 lasers in a spinning 360-degree turret to create a high-resolution map accurate to about 11 centimeters, according to Popular Science. The autonomous Leaf instead embeds six fixed laser scanners around the car—in the corner body panels and rear-passenger doors—each providing resolution to 1 centimeter, according to Nissan.

Iijima declined to identify the companies that Nissan is considering to supply the three-dimensional laser hardware or what it might cost. Nissan is developing its own software that filters all the various inputs, and integrates the data into steering-wheel position, acceleration levels, and braking. It’s Big Data on wheels. The intricate integration of hardware and software will take an alliance of companies, according to Iijima.
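
In other words, the pipeline ends in three outputs: where to put the steering wheel, how hard to accelerate, and how hard to brake. Here is a hedged C++ sketch of that final step, with a deliberately simplified world model and rules of my own invention rather than Nissan's software.

```cpp
// Sketch of the fusion-to-actuation step: a fused view of the world in, steering,
// acceleration, and braking out. The fields, gains, and thresholds are illustrative.
struct FusedWorldModel {
    double laneCenterOffsetM;   // from cameras: metres right of lane center
    double nearestObstacleM;    // from lasers/radar/sonar: closest object ahead
    double postedLimitMps;      // from sign-reading cameras
};

struct ActuationCommand {
    double steeringWheelDeg;    // positive = turn left
    double accelerationMps2;    // positive = speed up
    double brakePressure;       // 0.0 (off) to 1.0 (full)
};

ActuationCommand plan(const FusedWorldModel& world, double currentSpeedMps) {
    ActuationCommand cmd{};
    cmd.steeringWheelDeg = -4.0 * world.laneCenterOffsetM;   // steer back toward lane center
    if (world.nearestObstacleM < 10.0) {
        cmd.brakePressure = 1.0;                             // something close ahead: brake hard
    } else if (currentSpeedMps < world.postedLimitMps) {
        cmd.accelerationMps2 = 0.5;                          // ease up toward the posted limit
    }
    return cmd;
}

int main() {
    FusedWorldModel world{0.3, 80.0, 29.0};   // slightly right of center, road clear, ~65 mph limit
    ActuationCommand cmd = plan(world, 25.0);
    return cmd.brakePressure > 0.0 ? 1 : 0;   // expect 0: no braking needed here
}
```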

But Nissan has ruled out one type of technology, at least for the next few years—intelligent GPS-based geographical mapping, in the vein of Google Maps or Nokia’s Here. The information gathered from those mapping services is not detailed enough, according to Iijima. Also forget vehicle-to-vehicle or vehicle-to-infrastructure communications, which will take decades to penetrate enough cars and roadways to become useful.

The cool self-parking car, unlike similar systems unveiled by Audi and Volvo, does not require GPS or any sensors or transmitters embedded in the pavement.  Instead, Iijima believes, vehicle automation should rely on on-board sensors.  (Nonetheless, Nissan is working on a parallel development effort using precise maps that will enable cars to run autonomously in more challenging city environments.)  For now, Nissan is only talking about tackling the simpler challenges of highway driving and automated parking.

The Beginning Of The End Of Driving

Iijima outlined some limitations of the system: a maximum speed of 80 miles per hour and difficulty in extreme weather conditions, like a snowstorm.  He said his work is now focused on increasing processing power, reducing cost, and shrinking the hardware that currently occupies the entire hatch space—down to about the size of a shoebox that could fit into the engine compartment.

The software, which Nissan developed in-house with unnamed partners, is not unusual.

“It’s C++,” Iijima said with a chuckle.  And ironically, the most important required infrastructure is … white paint.  “The white line defines the road,” he said.  “It’s minimal infrastructure.”

What’s at stake with this program?  Big stuff. The promise of zero fatalities.  The ability for elderly and disabled people to gain mobility.  More efficient use of fuel and roadways.  And nothing less than a complete transformation of the relationship between car and driver.

“When the driver is no longer necessary, there is no need for cars to be owned by individuals,” he said. He envisions a world of shared autonomous mobility robots roaming global roadways by 2030.  Yet, there’s no single finish line set to be crossed in the distant future, but rather a slow and steady supplanting of human drivers by onboard computers, cameras, radar, sonar and lasers.
