Putting brains in a car, but taking the driver out
Brannon with Denny Hamlin’s car at Bristol Motor Speedway
Among the most exciting developments in technology, few hold as much promise for everyday life as automation. From self-driving cars to self-guided vacuums, as smart technology gets smarter, obstacles are met and overcome one by one with cutting-edge innovations.
For William Brannon ’18, a robotics engineer with California-based Coast Autonomous (CA), the daily challenge is finding ways to make existing car technology not only smarter, but self-reliant, too.
The company has already made a name for itself by adapting self-driving technology to ordinary vehicles, a breakthrough that could drive the future of autonomous engineering.
It’s an opportunity he was looking for long before he ever left Auburn.
“At Auburn, I knew I liked math, and mechanical engineering is one of the broader fields of engineering, so I figured I’d decide what I want to do later in life—for now I can just kind of go with this one-size-fits-all major,” said Brannon from his home in Silicon Valley, Calif. “Fortunately, there was one professor who mentioned autonomous cars to me, and all of a sudden I was like, ‘Man, that’s what I want to work on.’”
Brannon’s main focus at Coast Autonomous is designing the software algorithms that determine a car’s actions once it is switched into self-driving mode.
While a different team at CA designs and builds the gadgetry that turns the steering wheel, shifts gears and operates the brake and gas pedals, Brannon is focused on building what is essentially the “brain” of the vehicle once it’s without a human driver.
CA’s software does everything from planning a vehicle’s route to operating its internal controls. An important factor for Brannon and the rest of the team is accounting for the terrain and neighborhoods their vehicles are designed to travel through.
Since joining in 2020, he’s been working on a controller that, though still running in simulation, could be out on the test track soon. Still, he admits, there are a million things that could go wrong.
“With autonomous vehicles, you have a lot of moving parts on both the hardware side and the software side, so getting everything to fit in correctly and getting things to be timed correctly, that ends up presenting a lot of challenges.”
In the past, Coast Autonomous has earned recognition by “automatizing” pre-built vehicles, outfitting them with the technology to drive themselves without rebuilding them. Whereas car companies like Tesla offer built-in automation from their assembly line, Coast Autonomous is updating the cars of today with driverless technology.
“We fit all the appropriate sensors on the vehicle, and then put whatever motors are needed to turn the steering wheel, hit the gas and retrofit a car to be autonomous.”
The retrofitting model has been a boon for business, with companies rethinking their vehicle operations now that they can accomplish the same tasks without drivers.
An autonomous delivery vehicle employed at a railport in Kinney County, Texas, was able to make its supply route without requiring a driver, freeing up workers to focus on more specialized tasks.
A self-driving shuttle created by Coast Autonomous for New York City’s Times Square
Already, Coast Autonomous has run a self-driving shuttle in New York City’s Times Square, and it has plans to set up similar shuttles in other urban areas to “give the cities back to the people, rather than be so car-focused.” The designs and intended uses may pose some challenges from a hardware perspective, Brannon said, but on the software side, the work removes barriers rather than erecting them.
Through technological advances in “machine learning”—teaching a machine to recognize and process new data on its own—new avenues are opening for future work.
In particular, a subfield of machine learning, dubbed “deep learning,” is seeking to take that automated process even further, Brannon said.
“If you imagine a car driving through a city, it’s messy—there are a ton of moving parts around it. Suppose you could envision where all of these moving parts will be five seconds into the future, so you can plan accordingly. That’s something that you could use deep learning for.”
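As a rough illustration of the prediction problem Brannon describes—a hypothetical sketch, not Coast Autonomous’ actual software—the simplest baseline projects every tracked object forward five seconds at its current velocity. Deep-learning predictors aim to do better than this by learning how pedestrians and drivers actually move. All names and numbers below are invented for illustration.

```python
# Hypothetical sketch: project each tracked object's position five seconds
# ahead with a constant-velocity model -- the naive baseline that learned
# (deep) trajectory predictors try to improve on.

from dataclasses import dataclass

@dataclass
class TrackedObject:
    x: float   # position east of the car, meters
    y: float   # position north of the car, meters
    vx: float  # velocity east, meters per second
    vy: float  # velocity north, meters per second

def predict_positions(objects, horizon_s=5.0, step_s=1.0):
    """Return a list of (time, x, y) waypoints for each object."""
    trajectories = []
    for obj in objects:
        waypoints = []
        t = step_s
        while t <= horizon_s:
            # Straight-line extrapolation: position + velocity * time.
            waypoints.append((t, obj.x + obj.vx * t, obj.y + obj.vy * t))
            t += step_s
        trajectories.append(waypoints)
    return trajectories

# A pedestrian crossing at 1.5 m/s and a car ahead moving at 10 m/s.
scene = [TrackedObject(x=4.0, y=10.0, vx=-1.5, vy=0.0),
         TrackedObject(x=0.0, y=30.0, vx=0.0, vy=10.0)]

for path in predict_positions(scene):
    print(path[-1])  # each object's predicted position at the 5-second horizon
```

The weakness of this baseline is exactly what Brannon’s example points at: real pedestrians and cars turn, stop and react, which is why a learned model of typical motion can forecast a messy city scene far better than straight-line extrapolation.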
Brannon has been studying the interaction between software simulations and hardware since his days at Auburn, where he worked with the Mechanical Engineering department on creating CAD designs for 3D printers.
His senior year, he interned with Toyota Racing at its North Carolina headquarters, working on controls for a world-class racing simulator that could give engineers critical test-track time without ever visiting the track. Not only would it reduce wear and tear on the vehicle, but it would also acclimate drivers well ahead of race day. He calls it “the craziest videogame you’ve ever seen.”
“What they wanted to do was create a controller to actually [drive] the vehicle in their simulation environment at professional-level speeds,” said Brannon. “The idea was to recreate the driver, and then change the parameters to where the controller performs as well as possible.”
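The tuning loop Brannon describes can be imagined in miniature like this: a toy lane-keeping simulation scores each candidate steering gain, and the sweep keeps whichever parameter tracks the line best. This is an assumed, stripped-down sketch; Toyota’s simulator and controller are far more sophisticated.

```python
# Toy parameter-tuning sketch (illustrative only): pick the steering gain
# that keeps a simulated car closest to the lane centerline.

def simulate_lap(gain, steps=200, dt=0.1):
    """Run one short simulation and return the accumulated tracking error."""
    offset, heading = 2.0, 0.0   # start 2 m off the centerline, pointing straight
    total_error = 0.0
    for _ in range(steps):
        # Proportional steering toward the centerline, lightly damped on heading.
        steer = -gain * offset - 0.5 * heading
        heading += steer * dt
        offset += heading * dt * 10.0   # lateral drift at 10 m/s forward speed
        total_error += abs(offset) * dt
    return total_error

# Sweep the parameter and keep the best-performing controller,
# mirroring "change the parameters to where the controller performs
# as well as possible."
candidate_gains = [0.05, 0.1, 0.2, 0.4, 0.8]
best_gain = min(candidate_gains, key=simulate_lap)
print(best_gain)
```

In practice the “parameters” span far more than one gain, and the simulation is a full vehicle-dynamics model, but the shape of the workflow—simulate, score, adjust, repeat—is the same.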
Besides learning more about vehicle dynamics, he found himself paying extra attention to minute details, like the traction of race car tires, that he now incorporates into his work at Coast Autonomous.
“I got the car to spin out a lot of times that summer before I was able to get it to go all the way around the track without sliding around. That actually presented a big challenge at first—on a golf cart, you’re not really going to deal with that going only 15 miles an hour.”
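The spin-outs Brannon fought that summer have a simple physical core: a tire can deliver only about μ·g of sideways acceleration, so the top speed through a corner of radius r is roughly √(μ·g·r). The friction coefficients and corner radius below are illustrative assumptions, not measured values.

```python
# Back-of-the-envelope traction limit (illustrative numbers): the fastest a
# car can corner without sliding is v = sqrt(mu * g * r), where mu is the
# tire-road friction coefficient, g is gravity, and r is the corner radius.

import math

def max_corner_speed_mph(mu, radius_m, g=9.81):
    """Highest speed (mph) a car can hold through a corner without sliding."""
    v_ms = math.sqrt(mu * g * radius_m)
    return v_ms * 2.237  # convert meters/second to miles/hour

radius = 100.0  # assumed corner radius in meters
print(round(max_corner_speed_mph(mu=1.4, radius_m=radius)))  # warm racing slicks
print(round(max_corner_speed_mph(mu=0.7, radius_m=radius)))  # ordinary street tires
```

Run the other direction, the same formula shows why the golf cart never slides: at 15 mph on street-tire grip, the corner would have to tighten to a radius of roughly 6 to 7 meters before the tires ran out of traction, while a race car at simulator speeds lives right at that limit on every turn.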