Michigan’s New Motor City. That’s what The New York Times called Ann Arbor last year. The city is home to some central pieces of an expanding autonomous vehicle R&D ecosystem that rivals Detroit’s storied leadership in advancing the automobile.
Established firms and startups alike have set up shop in southeast Michigan, lured by a mix of singular testing sites, forward-looking state policies, and proximity to an entrenched supply chain, university resources, a burgeoning venture capital scene and an experienced workforce.
Some examples: Toyota has opened an autonomous driving research institute here. Waymo, Google’s self-driving vehicle arm, has offices in Ann Arbor and Detroit. French firm NAVYA, builder of all-electric, fully autonomous shuttles, opened its first U.S. production plant in Saline in 2017. Ann Arbor-based Michigan Engineering startup May Mobility raised $11.5 million in seed funding in 2018 and partnered with supplier Magna to release its first fleet of low-speed autonomous shuttles. And Ford has put $15 million toward the Ford Motor Company Robotics Building, slated to open on North Campus in early 2020. Ford engineers will occupy that building’s fourth floor. The list goes on.
Mcity – a public-private partnership headquartered in the U-M Office of Research and by now a household name among Michigan Engineers – has become a hub at the university. It counts among its major investors Aptiv, Denso, Econolite, Ford, GM, Honda, Intel, LG, State Farm, Toyota and Verizon. Each of these firms has pledged $1 million over three years.
“The state of Michigan has led the traditional auto industry for the last 100 years, and we’re well positioned to be at the forefront of the next revolution,” said Kirk Steudle, who was director of the Michigan Department of Transportation at publication time and stepped down at the end of October. MDOT helped to fund Mcity’s unique test facility.
“We have a lot of assets that play nicely together on a global stage. We have a glut of end customers – multiple OEMs with assembly plants and R&D centers. Ninety-two of the top 100 global auto suppliers have a presence in Michigan. So you can develop and test in other places if you like, but the truth is if you want it on a vehicle or mass produced it’s got to touch Michigan at some point.”
More and more of these efforts are taking root here from the start. U-M and Michigan Engineering are helping to draw them, through institutional partnerships and faculty-led research projects.
And they’re not just about technology. Even the most earth-shattering new tech won’t, on its own, revolutionize transportation. The ecosystem around the technology will also need to change. We’ll need new education and workforce training programs, shifts in societal thinking, advanced testing sites, and deployment strategies that put the vehicles out in the world in a way that makes a positive difference. U-M, including Michigan Engineering, is working on all of these fronts. That work touches the full life cycle of this industry’s transformation.
EDUCATION AND WORKFORCE DEVELOPMENT
When Uber moved into autonomy research a few years ago, it poached 40 people from Carnegie Mellon’s robotics center – four faculty members and 36 researchers and technicians. The company took some heat in the headlines for “gutting” the lab. But where else can you get that kind of workforce today?
In 2015, the state of Michigan’s Connected and Autonomous Vehicle Task Force surveyed 50 employers in southeast Michigan about the autonomous vehicle workforce. Their most common need was for “connected systems engineers” – a specialty that includes software engineering, systems engineering and electrical engineering, and pays roughly $90,000 a year.
“Industry still needs people who have deep expertise in a single field, but the need for people with knowledge in multiple fields is more urgent,” said Huei Peng, director of Mcity and the Roger L. McCarthy Professor of Mechanical Engineering. “They need expertise in traditional fields such as dynamics, controls, signal processing, and coding, as well as in the relatively new fields of artificial intelligence, big data, and cybersecurity.”
Across the globe, companies are looking to hire thousands of this new kind of engineer.
U-M and other entities in the region are working to establish a pipeline to these high-demand, high-paying jobs. The interest is there. In 2017, Michigan Engineering began offering the interdisciplinary course Self Driving Cars: Perception and Control. Instructors planned for 30 students. They got 400. A similar number turned out this fall.
Last year, the U-M Department of Civil and Environmental Engineering launched a master’s degree in Transportation Systems Engineering and a PhD in Next Generation Transportation Systems. And starting in 2020, faculty members in the Robotics Institute will be housed in the new Ford Motor Company Robotics Building, just down the road from the Mcity Test Facility, a proving ground for autonomous and connected vehicles. The Robotics Institute’s master’s and PhD programs are ranked #2 in the nation.
In TechLab, which is run by the Center for Entrepreneurship in partnership with Mcity, undergrads from across the College work with autonomous vehicle startups. Last year, a team worked with NAVYA on the Mcity Driverless Shuttle, which started operating this spring on a one-mile route. And the U-M Transportation Research Institute (UMTRI) leads the Center for Connected and Automated Transportation, a consortium of six Midwestern universities that includes Washtenaw Community College. Among its charges is to develop courses and programs to educate tomorrow’s autonomous vehicle workforce. That includes future auto technicians and mechanics, who will also need a new kind of training.
VEHICLE TECHNOLOGY
Sensors
Lidar, radar and cameras work together to take in an autonomous vehicle’s environment in 360 degrees and to pinpoint its location. While today’s cameras and radar are robust and cost-effective, lidar sensors aren’t there yet. They range in price from $7,000 to $70,000, and in some cases they last only hundreds of miles – far from the auto-grade standard of 100,000 miles, or the lifetime of a car. U-M researchers have a possible solution. Ryan Eustice, professor of naval architecture and marine engineering and senior vice president of automated driving at the Toyota Research Institute – Ann Arbor, used video game technology to turn pre-recorded maps into 3D visualizations that make it possible for an autonomous vehicle to rely on inexpensive cameras, rather than lidar, to pinpoint its location.
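The article doesn’t detail the mechanics, but the general shape of map-based camera localization can be sketched as render-and-compare: for each candidate pose, render a synthetic view of the pre-recorded map, score it against the live camera image, and keep the best-scoring pose. The sketch below is a toy illustration under that assumption – the map, the renderer and the scoring function are hypothetical stand-ins, not Eustice’s actual pipeline.

```python
import numpy as np

def render_map_view(prior_map, pose):
    """Stand-in for a game-engine render of the pre-recorded 3D map at `pose`.
    Here the 'map' is just a 2D intensity grid and the 'view' is a shifted crop."""
    x, y = pose
    return prior_map[y:y + 64, x:x + 64]

def score(rendered, camera_image):
    """Similarity between a rendered map view and the live camera image
    (zero-mean normalized cross-correlation)."""
    a = rendered - rendered.mean()
    b = camera_image - camera_image.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def localize(prior_map, camera_image, candidate_poses):
    """Return the candidate pose whose rendered view best matches the camera."""
    return max(candidate_poses,
               key=lambda p: score(render_map_view(prior_map, p), camera_image))

# Toy example: the vehicle is "really" at (30, 40); the search should recover it.
rng = np.random.default_rng(0)
prior_map = rng.random((256, 256))
camera_image = render_map_view(prior_map, (30, 40)) + 0.05 * rng.standard_normal((64, 64))
candidates = [(x, y) for x in range(20, 41, 5) for y in range(30, 51, 5)]
print(localize(prior_map, camera_image, candidates))  # expected: (30, 40)
```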
Perception Software
In order for autonomous vehicles to understand what their sensors take in, researchers are turning to a combination of classical computer vision and the younger field of deep learning. Where traditional computer vision relies on models that focus on edges and other defining features that humans find meaningful, deep learning takes what some call a “brute force” approach. It involves feeding the system an immense set of annotated images that it can learn from.
“At the moment there are publicly available datasets to test deep learning systems and they have several thousand images – all annotated by a human. People go in and draw boxes around all the people and cars and sidewalks and stop signs, for example. But we need millions of these images to train these algorithms well,” said Ram Vasudevan, assistant professor of mechanical engineering and co-leader of the U-M/Ford Center for Autonomous Vehicles.
He and his colleague and co-leader Matthew Johnson-Roberson, associate professor of naval architecture and marine engineering, are working to streamline the process. Video games come to the rescue again. Grand Theft Auto, it turns out, looks enough like the real world to train a system. They were able to develop automated image annotation algorithms and then, overnight, extract and mark up ten million scenes, which they used to improve the accuracy of their system.
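The key advantage of a game or simulator is that it already knows where every object in a rendered frame sits, so labels can be generated programmatically rather than drawn by hand. A minimal sketch of that idea follows; the simulator objects and annotation format are hypothetical stand-ins, not the team’s actual tooling.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SimObject:
    """Ground-truth object reported by a hypothetical driving simulator."""
    category: str   # "car", "pedestrian", ...
    bbox: tuple     # (x_min, y_min, x_max, y_max) in image pixels

def auto_annotate(frame_id, sim_objects):
    """Turn the simulator's ground truth into an annotation record,
    replacing the bounding boxes a human labeler would otherwise draw."""
    return {
        "frame": frame_id,
        "annotations": [asdict(obj) for obj in sim_objects],
    }

# Toy example: one rendered frame with two objects the simulator already knows about.
frame_objects = [SimObject("car", (120, 80, 260, 180)),
                 SimObject("pedestrian", (300, 90, 330, 170))]
print(json.dumps(auto_annotate(frame_id=0, sim_objects=frame_objects), indent=2))
```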
The team has also developed an algorithm that can find a pedestrian in a scene and zoom in on their hands, which can be used to make predictions about what they’ll do next.
Connectivity
To be as safe as possible, autonomous vehicles should talk to each other and to the infrastructure around them.
Dedicated Short Range Communications, or DSRC, lets vehicles send messages about their location, direction, speed and more at the rate of 10 per second, and at a distance of up to 1,500 feet. DSRC isn’t restricted to line-of-sight, like a camera or lidar. The technology has undergone testing for more than a decade, and it’s ready for market, even on human-driven vehicles. It’s being piloted on a grand scale around Ann Arbor right now. A federal government mandate, which has stalled, would advance adoption, says Jim Sayer, director of UMTRI.
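As a rough illustration of what that broadcast looks like in software, the sketch below sends a simplified safety message – position, heading and speed – ten times a second over a stubbed radio. The message fields and the radio interface are placeholders, not the standardized message encoding DSRC actually uses.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SafetyMessage:
    """Simplified stand-in for a DSRC safety message (not the real encoding)."""
    vehicle_id: str
    latitude: float
    longitude: float
    heading_deg: float
    speed_mps: float
    timestamp: float

def broadcast(radio_send, read_vehicle_state, rate_hz=10, duration_s=1.0):
    """Send the vehicle's state over the (stubbed) radio `rate_hz` times per second."""
    interval = 1.0 / rate_hz
    end = time.time() + duration_s
    while time.time() < end:
        radio_send(json.dumps(asdict(read_vehicle_state())))
        time.sleep(interval)

# Toy example with a fixed vehicle state and a print-based "radio".
state = lambda: SafetyMessage("veh-001", 42.2936, -83.7196, 90.0, 13.4, time.time())
broadcast(radio_send=print, read_vehicle_state=state)
```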
“Every year that we wait to put connected vehicle technology in place, we’re losing tens of thousands of lives,” Sayer said. “And I don’t believe you can have highly automated vehicles without connectivity.”
Some automakers are moving ahead with plans to install DSRC ahead of any mandates.
Beyond reducing crashes, connectivity could curb traffic jams and lead to dramatic improvements in energy efficiency. Gabor Orosz, an assistant professor of mechanical engineering, has shown that the smoother transitions a connected, automated vehicle makes between braking and accelerating can boost energy efficiency by as much as 19 percent.
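A back-of-the-envelope way to see where savings like that come from: if only positive tractive work costs fuel, a profile with repeated hard braking and re-acceleration wastes far more energy than a gentler profile a connected vehicle can plan in advance. The toy comparison below is an illustration, not Orosz’s model; the numbers are made up.

```python
# Toy comparison: positive tractive work (a crude fuel proxy) for a jerky vs. a
# smoothed speed profile over the same time window.
MASS_KG = 1500.0

jerky = [20, 14, 20, 14, 20, 14, 20]    # m/s: repeated hard braking and recovery
smooth = [20, 18, 17, 17, 17, 18, 20]   # m/s: gentler, anticipatory adjustments

def positive_work_kj(speeds):
    """Sum kinetic-energy increases; braking is treated as pure loss."""
    work = 0.0
    for v0, v1 in zip(speeds, speeds[1:]):
        work += max(0.5 * MASS_KG * (v1**2 - v0**2), 0.0)
    return work / 1000.0

print(f"jerky profile : {positive_work_kj(jerky):.0f} kJ of tractive work")   # ~459 kJ
print(f"smooth profile: {positive_work_kj(smooth):.0f} kJ of tractive work")  # ~83 kJ
```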
Path-Planning
How will autonomous vehicles decide how to get where they’re going – not just where to turn, but when to change lanes, when to brake hard and when to speed up? Prior mapping will be central to navigation. It involves loading the vehicle with detailed surveys so it knows where to expect traffic signals and trees, for example, reducing the need for on-the-fly perception. Not only does this tell the car where it is in the world, it frees the vehicle to pay more attention to things that aren’t on its map. Eustice and Edwin Olson, a U-M associate professor of computer science and engineering, worked to develop streamlined, robust prior mapping approaches that let vehicles localize themselves, with centimeter precision, even when the road is covered in snow.
If a vehicle could predict what will happen around it, it could make better decisions. “Predicting the future is hard,” said Jason Corso, associate professor of electrical and computer engineering. “Today most methods are able to detect a lane change only when a vehicle has signaled that it’s going to change lanes.”
Corso can best that, though. He recently showed that, based on traffic flow and vehicle speeds, he can anticipate the trajectories of nearby vehicles up to five seconds before a vehicle signals its intent.
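Corso’s model isn’t spelled out here, but the flavor of anticipating a lane change from traffic flow can be illustrated with a simple heuristic: a vehicle closing quickly on a slower leader, with an open lane beside it, is a likely lane-change candidate well before it signals. The sketch below is a hypothetical rule of thumb, not the published method.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    speed_mps: float
    gap_to_leader_m: float   # distance to the slower vehicle ahead in its lane
    leader_speed_mps: float
    adjacent_lane_free: bool

def lane_change_likely(v: Vehicle, horizon_s: float = 5.0) -> bool:
    """Toy heuristic: predict a lane change if, within `horizon_s`, the vehicle
    would close the gap to its leader and has room to move over."""
    closing_speed = v.speed_mps - v.leader_speed_mps
    if closing_speed <= 0 or not v.adjacent_lane_free:
        return False
    return v.gap_to_leader_m / closing_speed < horizon_s

print(lane_change_likely(Vehicle(30.0, 35.0, 22.0, True)))  # True: closing fast, lane open
print(lane_change_likely(Vehicle(25.0, 80.0, 24.0, True)))  # False: plenty of room
```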
He’s also working on ways to get autonomous vehicles to understand verbal commands. “Imagine at some point, you want your vehicle to change course,” he said. “You don’t want to have to use special language or a dashboard controller.”
U-M is also the birthplace of a radically different way of planning vehicle behaviors using a technique known as multi-policy decision making. In this approach, the vehicle doesn’t plan a path at all – it uses a library of driving strategies and runs a real-time “election” to pick the best one for a particular situation. Olson and his team are pushing this technology to produce increasingly human-like behavior, and he’s commercializing it through his startup, May Mobility.
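In rough terms, the election works by forward-simulating each policy in the library over a short horizon, scoring the predicted outcome, and picking the winner. Here is a minimal sketch of that loop; the policy library, the one-step simulator and the scoring function are illustrative stand-ins, not May Mobility’s implementation.

```python
# Minimal sketch of multi-policy decision making: forward-simulate each policy,
# score the predicted outcome, and "elect" the best one.

POLICY_LIBRARY = {
    "follow_lane": {"accel": 0.0,  "lane_shift": 0},
    "slow_down":   {"accel": -2.0, "lane_shift": 0},
    "change_left": {"accel": 0.0,  "lane_shift": -1},
}

def simulate(state, policy, horizon_s=3.0):
    """Crude forward simulation: project speed, lane and following gap."""
    speed = max(state["speed"] + policy["accel"] * horizon_s, 0.0)
    lane = state["lane"] + policy["lane_shift"]
    if policy["lane_shift"] != 0:
        gap = 50.0  # stand-in: assume the adjacent lane is open
    else:
        gap = state["gap_ahead"] + (state["lead_speed"] - speed) * horizon_s
    return {"speed": speed, "lane": lane, "gap_ahead": gap}

def score(outcome, state):
    """Reward progress, penalize small gaps and unnecessary lane changes."""
    s = outcome["speed"]                 # progress
    if outcome["lane"] == state["lane"]:
        s += 1.0                         # mild preference for staying put
    if outcome["gap_ahead"] < 10.0:
        s -= 100.0                       # unsafe following distance
    return s

def elect_policy(state):
    """Run the 'election': pick the policy whose simulated outcome scores highest."""
    return max(POLICY_LIBRARY,
               key=lambda name: score(simulate(state, POLICY_LIBRARY[name]), state))

state = {"speed": 15.0, "lane": 1, "gap_ahead": 12.0, "lead_speed": 10.0}
print(elect_policy(state))  # -> "change_left": keeps speed without closing on the leader
```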
SOCIETAL ASPECTS
Moving society to next-generation mobility systems – and making sure the shift doesn’t lead to unintended consequences – will require new laws, city designs, business models, cybersecurity measures, and also public acceptance. Across U-M, scholars are delving into these broader aspects.
On the law front, driverless cars will likely shift fault for accidents from drivers to auto manufacturers, raising new liability questions. They will also collect more data than we’re used to, leading to privacy issues. In a step toward exploring these quandaries, the U-M Law School and Mcity, in collaboration with the Michigan governor’s office, launched the Law and Mobility Project in June.
“Our goal is to analyze not only the direct and obvious legal and regulatory issues – such as tort liability and federal or state regulation of driverless cars – but also to think more broadly about the many significant ways in which the coming mobility revolution will reshape law, regulation, culture, economics and many other domains of life,” said Daniel Crane, professor at the U-M Law School and editor of the project’s new Journal of Law and Mobility.
“Connected and automated vehicles will have a disruptive impact on our transportation system,” said Henry Liu, professor of civil and environmental engineering, who leads the Center for Connected and Automated Transportation. “While these technologies will continue their steady advance toward public roadway systems, there are a variety of open questions and issues on technology development, policy and planning, and system design and operations that require answers and resolution.”
One of the biggest opportunities driverless cars will bring is wheels for those who don’t have access to reliable transportation – the elderly, disabled and economically disadvantaged. “We need to think about accessibility,” said Carrie Morton, deputy director of Mcity. “How do we take this unique moment in time when we’re re-envisioning transportation from the ground up to make sure we design it to move all society forward, and that means thinking about socioeconomic mobility, access to healthcare, to grocery stores – all of those things.”
Mcity is doing that, through collaborations across campus. But sometimes a driverless car won’t be the answer.
“It’s not all about connected and autonomous vehicles,” said UMTRI Director Sayer. “They’re one tool. The goal really needs to be improving mobility and that can mean reducing the need to move. Rather than dragging an elderly person to a doctor, why not take the service to them in other ways?”
EARLY STAGE TESTING
Rain and snow can cloud driverless car perception systems just like they cloud your vision. So Ford Motor Company turned to the Mcity Test Facility to evaluate its algorithms in winter weather. That’s an example of the kind of work that’s possible in this first-of-its-kind proving ground. At the test facility, industry and faculty researchers can put their vehicles through potentially dangerous situations that self-driving cars must master before they can take the place of human drivers. The 32-acre site, with more than 16 acres of roads and traffic infrastructure, is a safe place to test before taking technologies onto public roads – which is legal in Michigan.
Mcity, a campus-wide, public-private initiative, opened the test facility in 2015. Since then, the partnership has grown to more than 60 industry partners, and the facility’s capabilities have expanded.
“We’ve added infrastructure connectivity, the ability to use augmented reality to create a more robust testing environment and the ability to monitor Mcity’s traffic in real time,” said Mcity Director Peng. The state-of-the-art Michigan Traffic Lab, the traffic control center for Mcity, can monitor and control all infrastructure. The traffic lab also enables augmented reality testing at Mcity. That combines the real-world environment with simulated connected vehicles to serve as realistic background traffic. It’s a good way to fine-tune control parameters before involving a lot of real vehicles, said Professor Liu, who leads the lab.
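At the message level, the augmented reality idea can be sketched as merging two streams – broadcasts from real vehicles on the track and broadcasts generated for simulated background traffic – into a single feed the test vehicle consumes. The functions and message fields below are hypothetical stand-ins, not how Mcity’s system is actually wired.

```python
import random

def real_vehicle_messages():
    """Stand-in for connected-vehicle broadcasts received from the physical track."""
    return [{"id": "real-1", "x": 12.0, "y": 3.5, "speed": 8.0, "simulated": False}]

def simulated_vehicle_messages(n=3):
    """Stand-in for virtual background traffic generated by a traffic simulator."""
    return [{"id": f"sim-{i}", "x": random.uniform(0, 100), "y": random.choice([0.0, 3.5]),
             "speed": random.uniform(5, 15), "simulated": True}
            for i in range(n)]

def augmented_traffic_feed():
    """Merge real and simulated traffic into the single stream the test vehicle sees."""
    return real_vehicle_messages() + simulated_vehicle_messages()

for msg in augmented_traffic_feed():
    print(msg)
```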
PRODUCT DEVELOPMENT
Technology that’s graduating from prototype to product can take the next steps at the American Center for Mobility. ACM’s more than 500-acre campus hosts a 2.4-mile highway-speed loop, multi-decker bridges, a tunnel, a 6-by-6 lane intersection and a full-scale boulevard designed with the Michigan Department of Transportation’s help.
“It is the real world,” said John Maddox, former ACM president and CEO. “For the verification and validation aspects we’re focusing on, that’s what you need. If you want to operate your vehicle on a highway, you have to teach it on a highway. Otherwise, you haven’t taught it.”
Validation and verification involve ensuring that a design meets its specifications in real-world conditions and for intended customers. Individual companies can set up their own scenarios or participate in events like “Plug fest,” where ACM invites manufacturers to connect to other products and test interoperability and security.
ACM, at Willow Run in Ypsilanti, is less than 10 miles from Mcity, and the facilities complement each other. “Their proximity is a clear example of our leadership in this area,” Maddox said. “There’s no other place like this in the world.”
While Mcity focuses on the earlier-stage testing, some product development happens there as well. At TechLab, students work with startups to improve their products. The last cohort included CARMERA, of New York and Seattle, which provides real-time 3D maps for autonomous vehicles; and Zendrive, of San Francisco, which uses smartphone sensors to identify driving behaviors and provide insights and coaching to help you drive more safely.
DEPLOYMENT
Two autonomous shuttle systems with ties to U-M are now on the road.
Faculty startup May Mobility began its first operation in October 2017 and expects to have multiple deployments in 2018. Its first market is self-driving shuttles in compact environments.
“May Mobility was the first self-driving company to replace an existing bus service, and we did it in downtown Detroit,” Olson said.
And the Mcity Driverless Shuttle research project began on North Campus this spring. It’s the first research project in the U.S. focusing on autonomous vehicle user behavior. The shuttle ferries students, staff and faculty to the North Campus Research Complex from more distant parking. Its cameras record what’s going on in and around the vehicle.
“We’re not focusing on the technology with the shuttle, but on understanding consumer acceptance,” said Morton, Mcity deputy director. “We’re studying how passengers and others on the road experience this. Questions like: How does their trust change over time? How do pedestrians react? Are other vehicles impatient?”
And through the “living laboratory” of the Ann Arbor Connected Vehicle Test Environment, thousands of vehicles across the city are communicating with one another and infrastructure like road signs, traffic lights and crosswalks. AACVTE, as it’s called, is run by UMTRI, in collaboration with Mcity, the city of Ann Arbor, the U.S. Department of Transportation and the Michigan Economic Development Corporation. It was born out of Safety Pilot, which, in 2012, was the largest connected vehicle deployment in the world.