Lei Zhang, Global Vice President, Digital Cockpit and Software Development, NIO
Lei Zhang leads the global Digital Cockpit team for NIO in Silicon Valley. The team is responsible for the single, highly capable domain controller and the holistic digital experience it enables across NIO's products. Lei has over ten years of experience in the technology industry and specializes in Android, embedded Linux, automotive IVI, mobile applications, and web services. Previously, Lei was VP, Software and Chief Architect at Huami Corp. and Tech Lead Manager at Google. He holds a Master's degree in Computer Engineering from Peking University.
Many automotive industry observers are debating how advances in onboard digital assistants can enable future innovations for electric vehicles and, eventually, for autonomous cars. This debate is helping us define what a "smart car" will be for its users.
Current consumer digital assistants such as Apple's Siri, Amazon Alexa, and Google Home provide convenient services for consumers to manage their home networks, interact with entertainment systems, oversee security, and reduce energy consumption around the house. However, our behaviors in the car are very different from those in the household. Drivers engage in high-speed travel with a need for rapid decisions about navigation routes, charging and maintenance requirements, and extensive safety protections.
As autonomous cars are transformed into mobile living spaces, we envision that digital assistants will evolve over time and come to understand users, helping them make better use of the vehicles and services provided.
Through voice-activated conversations and machine learning, these assistants will eventually embody the vehicle itself and serve as a trusted companion for drivers and their families on the road.
We have already started on this path toward a more welcoming experience for drivers and passengers. NIO's onboard digital assistant, NOMI, lights up the car's exterior when its computer vision system recognizes people approaching it. NOMI also automatically adjusts the cabin airflow vents, seating positions, and mirror settings based on each user's unique preferences.
In another new feature, NOMI lights up the darkened dashboard touchscreen when it anticipates that someone is trying to talk to it. Over time, we will refine NOMI's machine learning algorithms to improve each user's driving experience by further anticipating their desires and actions. Again, we perceive NOMI to be the embodiment of the car, and when the user is talking to NOMI, he or she is talking directly to the car.
Today, some OEMs offer a voice-recognition interface in their electric cars, but it is designed more as a passive interaction for drivers to input their destinations or play certain songs. We believe this function must go further to become the user's active companion in the digital cockpit. Other competitors in China recognize this will be the future for all auto manufacturers.
Yet NOMI is not only a digital voice assistant – it is a smart companion that can understand the user's patterns within the mobile living space. This breakthrough is based on complex artificial intelligence technologies that learn to anticipate what a user wants over time while enhancing the user's sense of control, comfort, and security. Other companies are working aggressively to deliver voice interactions that enable natural conversations between the user and the car's voice assistant, much like NOMI.
Building the Digital Companion
Today, NIO is integrating NOMI with arrays of interior cameras and sensors so that it can understand what a user is saying and perceive where that user is sitting in the car – in the driver's seat, a passenger seat, or the backseat. Furthermore, NOMI has been designed to respond only to the specific person who initiated the conversation, within what we call the multi-modal "audio zone." Conversations in an audio zone occur only between NOMI and the user who started them; the voice engine is smart enough not to be interrupted or confused by other passengers speaking in the background or by an infant crying in the baby seat.
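The audio-zone isolation described above can be sketched as a simple session policy: once a wake word locks a conversation to one seat, utterances localized to other seats are ignored. This is a minimal illustrative sketch – the zone names and session interface are assumptions for exposition, not NIO's actual design.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative "audio zone" session isolation. Zone names and the API
# shape are hypothetical, not NOMI's real implementation.
ZONES = ("driver", "front_passenger", "rear_left", "rear_right")

@dataclass
class AudioZoneSession:
    active_zone: Optional[str] = None
    transcript: List[str] = field(default_factory=list)

    def on_utterance(self, zone: str, text: str, is_wake_word: bool) -> bool:
        """Return True if the utterance is accepted into the session."""
        if self.active_zone is None:
            if is_wake_word:
                self.active_zone = zone   # lock the conversation to this seat
                return True
            return False                  # no session yet; ignore chatter
        if zone != self.active_zone:
            return False                  # background speech from other seats
        self.transcript.append(text)
        return True

session = AudioZoneSession()
session.on_utterance("driver", "Hi NOMI", is_wake_word=True)
accepted = session.on_utterance("rear_left", "(infant crying)", is_wake_word=False)
print(accepted)  # False: the conversation stays locked to the driver
```

In a real system the per-seat localization would come from beamforming microphone arrays; the point of the sketch is only the session-locking policy.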
NOMI will instantly know who the user is based on facial ID and image recognition technologies that can detect someone's identity and even what kind of mood or emotion they feel through sentiment recognition tools. For example, NOMI's cameras could recognize if the user is tired after a long day at work. So, in addition to coordinating essential functions such as navigating the car back home, NOMI might also play some relaxing music and initiate a casual, cheerful conversation with the user, perhaps even sharing some corny personalized jokes to lighten the mood.
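The scenario above – combining who the user is with how they seem to feel to pick a set of cabin actions – can be sketched as a small decision function. The identity labels, sentiment labels, and action names here are assumptions for illustration only.

```python
from typing import List

def plan_actions(user_id: str, sentiment: str, time_of_day: str) -> List[str]:
    """Map (hypothetical) face-ID and sentiment outputs to cabin actions."""
    actions = [f"greet:{user_id}"]            # always greet the recognized user
    if sentiment == "tired" and time_of_day == "evening":
        # The "long day at work" case: head home, relax, lighten the mood.
        actions += ["navigate:home", "play:relaxing_music", "chat:cheerful"]
    elif sentiment == "happy":
        actions.append("play:favorite_playlist")
    return actions

print(plan_actions("alice", "tired", "evening"))
# ['greet:alice', 'navigate:home', 'play:relaxing_music', 'chat:cheerful']
```

In practice the sentiment label would be the output of a learned classifier rather than a string passed in, and the policy itself would be personalized over time.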
Of course, NOMI will recognize whether it is speaking with an adult or a child and tailor its responses accordingly to be understandable to different audiences. For example, NOMI might become a storyteller to engage a bored kid's attention or serve up some educational games on the backseat touchscreen. We have taken this concept of the "digital nanny" even further to protect children or pets from being accidentally left alone in the car on a hot day. In such cases, NOMI will send an immediate alert to the user and even activate the air conditioning or lower windows to keep loved ones cool and safe until the owner can return.
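The "digital nanny" safeguard reduces to a simple rule: if a vulnerable occupant is detected with no adult present and the cabin is hot, alert the owner and start cooling. The threshold, occupant labels, and action names in this sketch are illustrative assumptions, not NOMI's actual parameters.

```python
from typing import List

ALERT_TEMP_C = 28.0  # hypothetical cabin-temperature threshold

def nanny_check(occupants: List[str], cabin_temp_c: float,
                owner_present: bool) -> List[str]:
    """Return safeguard actions for a child or pet left alone in a hot cabin."""
    actions = []
    vulnerable = any(o in ("child", "pet") for o in occupants)
    if vulnerable and not owner_present:
        actions.append("alert_owner")         # immediate notification first
        if cabin_temp_c >= ALERT_TEMP_C:
            actions += ["start_ac", "lower_windows_slightly"]
    return actions

print(nanny_check(["child"], cabin_temp_c=33.5, owner_present=False))
# ['alert_owner', 'start_ac', 'lower_windows_slightly']
```

The real feature would rely on interior sensing to classify occupants and would escalate (e.g., contacting emergency services) if the owner does not respond; the sketch shows only the core decision logic.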
The onboard digital assistant will evolve to become much more than a convenience that enables voice commands. Eventually, the onboard companion will serve as an essential intelligence that drivers depend upon to meet their heightened expectations in the coming era of autonomous driving.