IAA spotlight: Evolution of In-Cabin Experience

I’m getting a lot of inquiries about interior sensors, interior UX and smart in-cabin technologies, and about what the interior of the future will look like.

What does that mean? Is it a massage seat? Is it a touchscreen instead of conventional switches? Or even the display in the middle of the steering wheel?

Yes! And No!

Current systems are vehicle- and brand-specific: preconfigured and limited. We have a driver seat and passenger seats with different purposes, maybe a massage function, seat heating and/or cooling. Air conditioning now offers individual zones, but that is where the individualization ends.

No real privacy, no personal content, no prediction of needs. And the usability is, well, challenging. Different, at best.

Besides conventional buttons and switches, touch and sometimes speech are used. But all of this has been implemented rather half-heartedly so far.

In a stressful situation, no one wants to hear that a command was not understood, whether the complaint was justified or not. We are stressed!

Gesture control is underused, and so are cameras for interaction. Multimodal communication with the user must become the norm.

But where are health, safety and security? Exterior cameras could identify the user and, if they stagger on approach, request proof of fitness to drive.
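As a toy sketch of how the "staggering" part could be approximated, assume a pose estimator on the exterior camera already delivers the user's lateral hip position per video frame; excessive side-to-side sway then flags instability. The metric and threshold below are invented for illustration and would need real-world calibration:

```python
import statistics

def sway_score(hip_x_positions: list[float]) -> float:
    """Toy gait-stability metric: population standard deviation of the
    lateral hip position (normalized image coordinates) across frames.
    A steady walk tracks a fairly straight line; staggering sways."""
    return statistics.pstdev(hip_x_positions)

steady  = [0.50, 0.51, 0.50, 0.49, 0.50, 0.51]
swaying = [0.42, 0.58, 0.45, 0.61, 0.40, 0.57]

THRESHOLD = 0.05   # assumed; depends on camera distance and angle
print(sway_score(steady)  > THRESHOLD)   # False
print(sway_score(swaying) > THRESHOLD)   # True
```

A production system would of course fuse many more cues (step cadence, path curvature, posture), but the principle is the same: turn camera observations into a fitness signal before the drive starts.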

There is so much to improve in seating and the in-cabin experience. Of course, basic passenger comfort has become vanilla by now, provided it was part of the vehicle configuration and therefore paid for up front.

With advanced sensors, cabins are starting to evolve into individual cocoons: interactive, adaptive and caring systems that combine the best of these technologies for a human-centric experience, monitoring driver and passengers for fatigue, motion sickness, indisposition and anger.

Inside, cameras combined with microphones can already track where speech is coming from and even lip-read to pick out the active speaker.
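Speaker localization of this kind typically starts with a time-difference-of-arrival estimate between microphones. A minimal sketch using the GCC-PHAT cross-correlation method follows; the microphone signals here are simulated with a known delay, whereas a real cabin would use a calibrated mic array:

```python
import numpy as np

def gcc_phat(sig, ref, fs):
    """Estimate the delay (seconds) of `sig` relative to `ref`
    using GCC-PHAT (cross-correlation with phase transform)."""
    n = len(sig) + len(ref)
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-12            # keep phase only, discard magnitude
    cc = np.fft.irfft(R, n=n)
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs

# Simulate a speaker closer to mic B: mic A hears the sound 5 samples later.
fs = 16_000
rng = np.random.default_rng(0)
speech = rng.standard_normal(2048)
mic_a = np.concatenate((np.zeros(5), speech))   # delayed copy
mic_b = np.concatenate((speech, np.zeros(5)))
delay = gcc_phat(mic_a, mic_b, fs)
print(f"estimated delay: {delay * fs:.0f} samples")
```

The sign of the delay tells the system which seat the voice came from; fusing that with the camera's lip-movement cue disambiguates overlapping speakers.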

But what about additional interaction capabilities like brain-machine interfaces (BMI), which can measure focus and distraction far more accurately than cameras? Mercedes has already shown a use case, but unfortunately missed the point completely.

The implementation with a headband was unfortunate; in a real vehicle it creates more problems than it solves.

On top of that, a rather primitive technology was used that requires a lot of training, and the use case was only a gimmick, so the technology deters instead of inspires. A smart headrest would have been the better choice. They should look at market leaders like Neiry instead of using a toy.

Alcohol, diabetes and fever have a significant impact on driving. With a smart steering wheel, a smart seat belt, in-cabin video and thermography, or additional seat sensors, these conditions can be detected and appropriate actions triggered. A smart seat with medical monitoring and first-responder features: how does that sound?
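As a toy illustration of how such readings might feed an escalation policy: the sensor fields, thresholds and action names below are all assumptions for the sketch, not any real vehicle's API.

```python
from dataclasses import dataclass

@dataclass
class CabinVitals:
    skin_temp_c: float      # thermography estimate
    heart_rate_bpm: float   # smart seat belt
    breath_alcohol: float   # hypothetical per-mille estimate from a cabin sensor

def fitness_check(v: CabinVitals) -> str:
    """Toy escalation policy: map cabin vitals to an action tier."""
    if v.breath_alcohol >= 0.5:
        return "block_start"             # refuse to start, offer to call a ride
    if v.heart_rate_bpm < 40 or v.heart_rate_bpm > 150:
        return "alert_first_responder"   # possible medical emergency
    if v.skin_temp_c >= 38.5:
        return "warn_and_suggest_stop"   # likely fever
    return "ok"

print(fitness_check(CabinVitals(37.0, 72, 0.0)))   # ok
print(fitness_check(CabinVitals(39.0, 80, 0.0)))   # warn_and_suggest_stop
print(fitness_check(CabinVitals(36.8, 75, 0.9)))   # block_start
```

Real systems would need confidence scores, sensor cross-checks and legal review before ever blocking a start, but the structure, sensing plus a graded response, is the point.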

Key technologies are edge computing, NLP & NLU, cognitive computing, interior radar, health sensors, smart surfaces, XR, and the integration of our wearables to support the automotive systems. Multimodal interaction, a minimalist dashboard, information delivered when and how the user needs it instead of the distracting “mouse cinema”.

The next steps in the in-cabin experience point toward after-sales configuration, where all the hardware is already embedded and can be unlocked via pay-per-use or subscriptions. This experience has to become brand-independent and human-oriented. In the future it will not matter whether you are a passenger or the driver; access to the same systems is granted. Interactive smart surfaces will display functions when needed, perhaps on morphable surfaces.

A BMW driver will have access to the same features in a Mercedes or a Cadillac. Their personal device becomes part of a PAN (personal area network) for individual interaction and entertainment.
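A minimal sketch of what such a brand-independent profile could look like when a vehicle pairs with the user's device. Every class, field name and the `apply_profile` call are assumptions for illustration, not an existing standard:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Portable, brand-independent cabin preferences."""
    user_id: str
    seat_massage: str = "off"       # off | gentle | firm
    climate_c: float = 21.0
    language: str = "en"
    media_sources: list = field(default_factory=list)

class Vehicle:
    """Minimal stand-in for any brand's cabin controller."""
    def __init__(self, brand: str):
        self.brand = brand
        self.settings = {}

    def apply_profile(self, p: UserProfile) -> None:
        # Each vehicle maps the portable profile onto its own hardware.
        self.settings = {
            "seat_massage": p.seat_massage,
            "climate_c": p.climate_c,
            "language": p.language,
        }

me = UserProfile("anna", seat_massage="gentle", climate_c=20.5)
for brand in ("BMW", "Mercedes", "Cadillac"):
    car = Vehicle(brand)
    car.apply_profile(me)
    print(brand, car.settings["climate_c"])   # same settings in every brand
```

The hard part, of course, is not the mapping but agreeing on the schema and on who holds the data, which is exactly the ownership question below.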

Of course, data ownership has to be resolved, and given this interoperability it should remain with the human. This becomes all the more important the more personal data we use: from advanced and biometric sensors, from built-in devices like the driver-monitoring camera, and from brought-in devices like the smartphone or smartwatch.

Takeaway: The machine aka vehicle must follow its user.