
Avoiding carsickness when the cars drive themselves

The day is approaching when commuters stuck in soul-crushing traffic will be freed from the drudgery of driving. Companies are investing billions to devise sensors and algorithms so we motorists can turn our attention to where we like it these days: our phones.

But before the great promise of multitasking on the road can be realized, we need to overcome an age-old problem: motion sickness. “The autonomous vehicle community understands this is a real problem it has to deal with,” said Monica Jones, a transportation researcher at the University of Michigan. “That motivates me to be very systematic.”

So starting in 2017, Jones led a series of studies in which more than 150 people were strapped into the front seat of a 2007 Honda Accord. They were wired with sensors and sent on a ride that included roughly 50 left-hand turns and other maneuvers.

Each subject was then tossed along the same twisty route a second time, this time while also being asked to complete a set of 13 simple cognitive and visual tasks on an iPad Mini. About 11% of the riders got nauseated or, for other reasons, asked that the car be stopped. Four percent vomited.

A participant wears a headband during a series of studies with a goal of helping people avoid motion sickness in self-driving cars, at the University of Michigan in Ann Arbor, Mich., on Jan. 16, 2020. If the future lets people focus on work instead of driving during the daily commute, many of us will have to conquer motion sickness to read memos (or tweets). Researchers are working on some fixes. (Ryan Debolski/The New York Times)

Jones takes no joy in documenting her subjects getting dizzy, hyperventilating or losing their lunch. She feels their pain. Jones, a chronic sufferer of motion sickness, has experienced those discomforts in car back seats all her life. “I don’t remember not experiencing it,” she said. “As I’m getting older, it’s getting worse.”

It’s also getting worse for the legions of commuters hailing Ubers or taxis and hopping in, barely lifting their gaze from a screen in the process.

The University of Michigan subjects were recruited to represent not only those with histories of getting carsick, like Jones, but also passengers along a spectrum of susceptibility. An equal number of men and women were tested.

The first 20-minute test drives were conducted at MCity, an ersatz city managed by the University of Michigan’s Transportation Research Institute. But more recently, the Accord merged with local traffic for one-hour drives. Test riders will eventually be relocated to the back seat, where Americans increasingly find themselves.

In the study, subjects narrated their levels of nausea during the route. Video cameras and wired sensors captured facial expressions, heart rate, skin temperature and changes in body and head posture. Those were indexed against precise metrics about the vehicle’s movement.

Jones wants to help people avoid and treat motion sickness. But at this early stage of her research, she’s merely aiming to better understand the “fundamentals of human response.” For example, there might be clues in how riders who get carsick hold their heads, maintain their posture or position the mobile devices they’re using. “I’m not out for the engineering solution directly,” Jones said.

But Florian Dauth, an automated-driving engineer for the ZF Group of Germany — one of the world’s largest automotive suppliers — is in the business of devising engineering solutions. He has been working for more than two years on strategies to reduce motion sickness in autonomous vehicles.

“We are developing algorithms that self-learn based on bodily reactions,” he said, referring to the machine-generated code that determines the vehicle’s path. To navigate the road safely, automated vehicles already receive and combine data from an arsenal of radar, laser, video and ultrasonic sensors. ZF said data about the passenger’s well-being should be added to the algorithm.

Dauth is collecting passengers’ biological data via cabled inputs, like measurements of brain activity from electrodes placed on a rider’s scalp and similar monitoring of the heart. When put into production, the self-driving biofeedback system would most likely be reduced to cameras powered by facial-detection software or perhaps wearable devices.

“Let’s say the car takes a strong left curve and then brakes very roughly at a red traffic light. We are recording all the vehicle movements and the passenger’s reactions in parallel,” Dauth said. “If you react in a way that gives you symptoms, then in the future we will avoid these maneuvers.” In other words, the self-driving car’s AI learns how to drive in a way that doesn’t make you sick.
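As a rough illustration of the idea Dauth describes, a comfort model might log each maneuver alongside the passenger's measured reaction and learn to prefer the motion profiles riders tolerate. The sketch below is a minimal, hypothetical Python example; the class, maneuver labels and scoring are assumptions made for illustration, not ZF's actual algorithm.

```python
from collections import defaultdict
from statistics import mean


class ComfortModel:
    """Hypothetical sketch: pair recorded maneuvers with passenger reactions,
    then penalize the maneuver styles that correlate with discomfort.
    Names, labels and scores are illustrative assumptions."""

    def __init__(self):
        # maneuver label -> list of observed discomfort scores (0.0 to 1.0)
        self.history = defaultdict(list)

    def record(self, maneuver: str, discomfort: float) -> None:
        """Log one maneuver (e.g. 'sharp_left', 'hard_brake') together with the
        passenger's reaction, e.g. normalized from heart-rate or facial data."""
        self.history[maneuver].append(discomfort)

    def penalty(self, maneuver: str) -> float:
        """Average discomfort observed so far for this maneuver style."""
        scores = self.history.get(maneuver)
        return mean(scores) if scores else 0.0

    def choose(self, candidates: dict) -> str:
        """Pick the candidate maneuver with the lowest combined cost:
        the planner's base driving cost plus the learned comfort penalty."""
        return min(candidates, key=lambda m: candidates[m] + self.penalty(m))


model = ComfortModel()
model.record("sharp_left", 0.8)    # rider reacted badly to a strong left curve
model.record("gentle_left", 0.1)   # a wider, slower line caused no symptoms

# Given two ways to take the next turn, prefer the one riders tolerate better.
print(model.choose({"sharp_left": 0.2, "gentle_left": 0.3}))  # -> gentle_left
```

In this toy version, the "learning" is just an average of past reactions per maneuver type; a production system would fold richer sensor data into the vehicle's path-planning cost function, as Dauth suggests.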

ZF might want automated cars to become calmer drivers, but back in Michigan, Jones' research places some of the responsibility for avoiding motion sickness on a rider's common sense. As you might expect, not reading a book (or Twitter) helps one avoid motion sickness.

But Brian Lathrop, a technologist at Volkswagen with a doctorate in cognitive psychology, doesn’t harbor hope that passengers will put down their phones. “If you’re talking about a Level 4 autonomous vehicle, you have to ask yourself, what are people going to be doing in the car?” he said. In a so-called Level 4 car, passengers don’t need to pay any attention to a steering wheel or the road.

“The easy answer is, they’ll still use their smartphones,” he said. “But you also have to anticipate the high probability that they will be using some sort of virtual reality or augmented reality system.” That’s right. We’re facing a brave new automotive world in which people zoom down the road in a self-driving vehicle while wearing fully immersive VR headgear.

Lathrop, working with fellow technologists at Volkswagen’s Innovation and Engineering Center California in the heart of Silicon Valley, is trying to eliminate motion sickness when using VR in a moving automobile. Lathrop said the unease happens when there’s a disconnect between the signals sent to your brain from your inner ear and what you’re seeing. “I wanted to look at, how could you address that disconnect between the visual signals and the stimulus signal?” he said.
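One widely discussed way to close that gap, offered here only as a sketch of the general idea rather than Volkswagen's actual system, is to drive the virtual camera with the car's measured motion, so the scene the rider sees accelerates and turns in step with what the inner ear feels. A minimal Python illustration, with hypothetical class and field names:

```python
from dataclasses import dataclass


@dataclass
class ImuSample:
    """One reading from the car's inertial sensors (hypothetical names)."""
    dt: float        # seconds since the previous sample
    accel_x: float   # forward acceleration, m/s^2 (negative while braking)
    yaw_rate: float  # turning rate, rad/s


class VrCameraRig:
    """Minimal sketch: integrate the car's measured motion into the virtual
    camera so the visual scene agrees with the vestibular signals."""

    def __init__(self):
        self.speed = 15.0   # virtual forward speed, m/s (assumed starting value)
        self.heading = 0.0  # virtual heading, rad

    def apply(self, imu: ImuSample) -> None:
        # When the real car brakes or turns, the virtual world does too.
        self.speed += imu.accel_x * imu.dt
        self.heading += imu.yaw_rate * imu.dt


rig = VrCameraRig()
rig.apply(ImuSample(dt=0.1, accel_x=-3.0, yaw_rate=0.2))  # braking into a turn
print(round(rig.speed, 2), round(rig.heading, 2))  # -> 14.7 0.02
```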

Source: www.chicagotribune.com
