Turns out, we may not actually want driverless cars to drive like us. That’s according to researchers at the University of Michigan, who say they’ve found three core “personality” traits autonomous vehicles need to have to make people feel safer with them, even if people themselves don’t have those same traits.
Getting people to trust, and use, driverless cars
Car makers already know some drivers still feel weird about the idea of sharing the road with an AV, much less actually using one.
“That’s partly because [autonomous vehicles] are designed purely from the technical perspectives, and those don’t necessarily comply with our human social norms,” says X. Jessie Yang, a professor in U of M's Department of Industrial and Operations Engineering, and the School of Information.
“Like consider you’re a pedestrian who wants to cross the street,” she adds. “When you’re interacting with a human driving a car, you’ll have eye contact. You might wave at the driver, and the driver will signal: please go ahead, it’s safe to cross. But those social cues will be gone [with AVs].”
So just telling people these cars are safe and fuel efficient won’t cut it. They’ll need to feel like the car is more human.
“A large part of trust and safety is this projection of a personality,” says Lionel Robert Jr., associate professor at the School of Information and core faculty member at U of M’s Robotics Institute.
We all think cars have personalities
We already project a personality onto our current cars, Robert says. “If we have a car and it breaks down, you're mad at the car, you know? The car let you down. The car's not doing this. And so we have this idea of humanizing technology, especially technology that can engage in an interactive behavior, whether we like it or not.”
That will probably be especially true for AVs, which can make decisions, like whether to speed up at a yellow light or cut somebody off in traffic. But the research on human/robot interactions is mixed, Robert says: some of it says people like when robots have personalities similar to their own, while other studies say it’s more of an “opposites attract” situation. (This team’s research paper also points to a previous study showing it actually depends on the context of what we want the robot to do.)
The experiment starts with taking a personality test
So these researchers recruited more than 400 drivers and had them fill out a personality test based on the “Big 5” traits in social science: extroversion, agreeableness, conscientiousness, emotional stability, and openness to experience.
Then, each driver watched four video simulations of an AV driving, with the participant sitting in the front seat. In each video, the AV had a different “personality” in terms of how it handled road conditions and traffic (like “sunny and aggressive” vs. “snowy and aggressive” driving).
“So we may have a condition where the road is snowy, there's ice on the road, and you expect the car to slow down,” Robert says. “And if it didn't, you might have seen it as being emotionally unstable or even not very considerate...So think about it in terms, let’s imagine if the car had a driver, and then the human drove like that, what would be your impression of the human? That's how people go and project that personality onto a car.”
Most people preferred the same kind of car
After the videos, participants answered a series of questions ranking which cars they perceived to be safest. Cars perceived to be high in agreeableness, conscientiousness, and emotional stability consistently got the best ratings, even if the participants themselves didn’t have those same qualities.
“That’s not what I expected,” Yang says. Instead of aggressive drivers wanting aggressive cars, nearly everyone seemed to prefer the same type of AV. Meanwhile, it didn’t seem to make a difference to people whether or not the car seemed “open to new experiences” or “extroverted.”
And that was surprising to Robert. “Typically in the literature on human/robot interaction, the biggest personality trait that matters is extroversion. People generally like robots that are outgoing or social.”
The takeaway for manufacturers, Robert says, is that people don’t necessarily want AVs like themselves.
“And that was sort of a big thing: are there traits that would be so universal for everybody, that we can say [to car makers] ‘Go out and design a car that behaves like this?’ Or do we say, ‘Hey, it depends on the person behind, you know, sitting in a car, and it should be very dependent on that individual,’” he says.
“And these results said that it really is not. That basically, we have these three traits that people want, irrespective of their own personality. Which really makes things a lot easier when it comes to designing autonomous vehicles.” It also underscores the need to build AVs with people in mind, Yang says. “They shouldn’t just think about the technical aspects. The human and social aspects should be taken into consideration, too.”