China Automotive Multimodal Interaction Development Research Report, 2024
Product Code: 1613812
Research Company: ResearchInChina
Publication Date: December 2024
Pages: English, 270 Pages
License & Price (VAT excluded)
US $3,800 / ₩5,498,000
Unprintable PDF (Single User License)
A license for use of the PDF report by a single user. Printing is not permitted, and copying and pasting of the text is also not permitted.
US $5,700 / ₩8,247,000
Printable & Editable PDF (Enterprise-wide License)
A license for use of the PDF report by everyone within the same company. Printing is permitted, and the scope of use of printed copies is the same as that of the PDF.



Multimodal interaction research: AI foundation models deeply integrate into the cockpit, helping perceptual intelligence evolve into cognitive intelligence

China Automotive Multimodal Interaction Development Research Report, 2024 released by ResearchInChina combs through the interaction modes of mainstream cockpits, the application of interaction modes in key vehicle models launched in 2024, and the cockpit interaction solutions of OEMs/suppliers, and summarizes the development trends of cockpit multimodal interaction fusion.

1. Voice recognition dominates cockpit interaction, and integrates with multiple modes to create a new interaction experience.

Among current cockpit interaction applications, voice interaction is used most widely and most frequently in intelligent cockpits. According to the latest statistics from ResearchInChina, from January to August 2024, automotive voice systems were installed in about 11 million vehicles, a year-on-year increase of 10.9%, with an installation rate of 83%. Li Tao, General Manager of Baidu Apollo's intelligent cockpit business, pointed out that "the frequency of people using cockpits has increased from 3-5 times a day at the beginning to double digits today, and has even reached nearly three digits on some models with leading voice interaction technology."
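
As a rough sanity check, the implied base figures can be back-calculated from the numbers quoted above (about 11 million installations, +10.9% year-on-year, 83% installation rate). The short Python sketch below only reproduces that arithmetic and adds no data beyond the report's own figures.

```python
# Back-of-the-envelope check of the voice-system figures quoted above
# (Jan-Aug 2024: ~11 million installations, +10.9% YoY, 83% installation rate).
installed_2024 = 11_000_000        # voice systems installed, Jan-Aug 2024
yoy_growth = 0.109                 # year-on-year increase
installation_rate = 0.83           # share of new vehicles fitted with a voice system

installed_2023 = installed_2024 / (1 + yoy_growth)        # implied Jan-Aug 2023 volume
vehicle_base_2024 = installed_2024 / installation_rate    # implied Jan-Aug 2024 vehicle base

print(f"Implied Jan-Aug 2023 installations: {installed_2023:,.0f}")    # ~9.9 million
print(f"Implied Jan-Aug 2024 vehicle base:  {vehicle_base_2024:,.0f}")  # ~13.3 million
```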

The frequent use of the voice recognition function not only greatly optimizes the user's interactive experience, but also promotes the trend of fusing voice with other interaction modes such as touch and face recognition. For example, the full-cabin memory function of NIO Banyan 2.4.0 is based on face recognition: NOMI actively greets occupants who have recorded their information (e.g., "Good morning, Doudou"). Zeekr 7X integrates voice recognition with eye contact, so the driver can look at a control and speak to operate it, or tilt his/her head to control the car via voice.
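
To make the "see and speak" fusion concrete, the sketch below shows one generic way such a feature can be structured: a deictic voice command ("open that") is resolved against whatever the gaze tracker last reported the driver looking at. This is an illustrative toy under assumed names (GazeEvent, resolve_command), not Zeekr's or NIO's implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GazeEvent:
    """What the driver-monitoring camera last reported the driver looking at."""
    target: str          # e.g. "left_window", "sunroof", "center_display"
    timestamp_s: float

def resolve_command(utterance: str, gaze: Optional[GazeEvent], now_s: float,
                    max_gaze_age_s: float = 2.0) -> str:
    """Fuse a voice command with recent gaze to resolve deictic references.

    If the utterance contains a pointer word ("that", "this", "there") and the
    gaze sample is fresh, substitute the gazed-at target; otherwise execute the
    utterance as spoken.
    """
    deictic = any(word in utterance.lower() for word in ("that", "this", "there"))
    gaze_is_fresh = gaze is not None and (now_s - gaze.timestamp_s) <= max_gaze_age_s
    if deictic and gaze_is_fresh:
        return f"execute: open {gaze.target}"
    return f"execute: {utterance}"

# Driver glances at the left window and says "open that" 0.8 s later.
print(resolve_command("open that", GazeEvent("left_window", 10.0), now_s=10.8))
```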

2. BYD launched palm vein recognition, and Sterra in-cabin health monitoring debuted

Compared with mature interaction modes such as voice and face recognition, biometric technologies such as fingerprint, vein, and heart rate recognition are still in the early stage of exploration and development, but they are gradually being mass-produced and applied. For example, BYD launched a palm vein recognition function in 2024, enabling convenient vehicle unlocking. Genesis and Mercedes-Benz introduced fingerprint recognition systems in the 2025 Genesis GV70 and the 2025 Mercedes-Benz EQE BEV respectively, allowing users to complete a range of operations such as identification, vehicle start, and payment with fingerprints alone. In addition, Exeed Sterra continues to use visual perception technology provided by ArcSoft in the new ET model, realizing an in-cabin intelligent health monitoring function that outputs health reports covering five major physical indicators: heart rate, blood pressure, blood oxygen saturation, respiratory rate, and heart rate variability.

The introduction of biometric technology not only improves driving convenience, but also significantly enhances the safety protection performance of vehicles, effectively preventing potential safety hazards such as fatigued driving and car theft. In the future, these biometric technologies will be more widely integrated into the development of intelligent and connected vehicles, providing drivers with a safer and more personalized mobility experience.

Case 1: The fingerprint recognition system of the Genesis 2025 GV70 allows users to quickly apply personalized settings (seat position, etc.) through fingerprint authentication, and also supports vehicle start/drive authorization. In addition, it offers personalized linkage functions such as simplified operation, fingerprint payment, and valet mode.
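
In implementation terms, a feature like this is essentially a lookup from the matched fingerprint identity to a stored per-user profile whose settings and permissions are then applied. The fragment below is a hypothetical illustration of that flow (names such as DriverProfile and on_fingerprint_match are invented), not Genesis code.

```python
from dataclasses import dataclass

@dataclass
class DriverProfile:
    seat_memory_slot: int
    allow_start: bool      # fingerprint authorized to start/drive the vehicle
    allow_payment: bool    # in-car fingerprint payment enabled
    valet_mode: bool       # restricted-access mode for parking attendants

PROFILES = {"driver_a": DriverProfile(seat_memory_slot=2, allow_start=True,
                                      allow_payment=True, valet_mode=False)}

def on_fingerprint_match(user_id: str) -> list[str]:
    """Apply the matched user's personalized settings and unlock permitted functions."""
    profile = PROFILES.get(user_id)
    if profile is None:
        return ["reject: unknown fingerprint"]
    actions = [f"seat.recall({profile.seat_memory_slot})"]
    if profile.allow_start:
        actions.append("ignition.enable()")
    if profile.allow_payment:
        actions.append("payment.enable()")
    if profile.valet_mode:
        actions.append("valet_mode.activate()")
    return actions

print(on_fingerprint_match("driver_a"))
```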

Case 2: BYD's palm vein recognition system uses a camera to read palm vein data, recognizing palms at a distance of 8-20 cm, through 360 degrees horizontally and 15 degrees vertically. It uses a professional image acquisition module to capture images of the vein pattern, extracts and stores features through algorithms, and finally performs identification and recognition. In the future, it may first be installed in models of the high-end Yangwang brand.
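
The description above amounts to a standard biometric pipeline: validate that the hand is inside the capture envelope, acquire a vein image, extract a fixed-length feature template, store it at enrollment, and later match a live template against the enrolled ones. The sketch below illustrates that flow with a deliberately simplified feature extractor (block-averaged intensities compared by cosine similarity); BYD's actual acquisition module and matching algorithm are not disclosed in the report.

```python
import numpy as np

def capture_ok(distance_cm: float, vertical_deg: float) -> bool:
    """Accept captures only inside the stated envelope: 8-20 cm distance and
    +/-15 degrees vertically (horizontal orientation is unconstrained, 360 deg)."""
    return 8.0 <= distance_cm <= 20.0 and abs(vertical_deg) <= 15.0

def extract_template(palm_image: np.ndarray, grid: int = 8) -> np.ndarray:
    """Toy feature extractor: a grid of block-mean intensities, zero-mean and
    unit-normalized. Real systems extract dedicated vein-pattern features."""
    h = (palm_image.shape[0] // grid) * grid
    w = (palm_image.shape[1] // grid) * grid
    blocks = palm_image[:h, :w].reshape(grid, h // grid, grid, w // grid)
    blocks = blocks.mean(axis=(1, 3))
    vec = blocks.flatten() - blocks.mean()
    return vec / (np.linalg.norm(vec) + 1e-9)

def identify(template: np.ndarray, enrolled: dict[str, np.ndarray],
             threshold: float = 0.8) -> str | None:
    """Return the enrolled user with the most similar template, if similar enough."""
    scores = {user: float(np.dot(template, ref)) for user, ref in enrolled.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Enrollment, then recognition of a slightly noisier capture of the same palm.
rng = np.random.default_rng(0)
palm = rng.random((240, 320))
enrolled = {"owner": extract_template(palm)}
if capture_ok(distance_cm=12.0, vertical_deg=5.0):
    live = extract_template(np.clip(palm + rng.normal(0, 0.01, palm.shape), 0, 1))
    print(identify(live, enrolled))   # expected: "owner"
```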

Case 3: The Exeed Sterra ET model is equipped with the DHS intelligent health monitoring function. Based on an advanced visual multimodal algorithm, it analyzes health status in real time from the body surface, measures the five major physical indicators of heart rate, blood pressure, blood oxygen saturation, respiratory rate, and heart rate variability, and outputs a health report.
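
Camera-based vital-sign estimation of this kind is generally built on remote photoplethysmography (rPPG): tiny frame-to-frame colour changes of exposed skin are treated as a pulse signal and analysed in the frequency domain. The sketch below shows that general principle for heart rate only, on synthetic data; ArcSoft's production algorithm is not described in the report, so this is purely illustrative.

```python
import numpy as np

def estimate_heart_rate(skin_signal: np.ndarray, fps: float) -> float:
    """Estimate heart rate (BPM) from a time series of mean skin-pixel intensity
    (e.g. the green channel of a face region), using the dominant spectral peak
    in the physiologically plausible 0.7-3.0 Hz (42-180 BPM) band."""
    signal = skin_signal - skin_signal.mean()             # remove the DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0

# Synthetic 20-second clip at 30 fps containing a 72 BPM pulse plus camera noise.
fps, bpm = 30.0, 72
t = np.arange(int(fps * 20.0)) / fps
trace = 0.5 * np.sin(2 * np.pi * (bpm / 60.0) * t) + 0.1 * np.random.randn(t.size)
print(f"Estimated heart rate: {estimate_heart_rate(trace, fps):.1f} BPM")   # ~72 BPM
```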

3. AI foundation models lead cockpit interaction innovation, and perceptual intelligence evolves into cognitive intelligence

The China Society of Automotive Engineers clearly defines and classifies intelligent cockpits in its jointly released white paper. The classification system is based on the capabilities achieved by intelligent cockpits, comprehensively considers the three dimensions of human-machine interaction capability, scenario expansion capability, and connected service capability, and subdivides intelligent cockpits into five levels from L0 to L4.

With the wide adoption of AI foundation models in intelligent cockpits, HMI capabilities have crossed the boundary of L1 perceptual intelligence and entered a new stage of L2 cognitive intelligence.

Specifically, in the perceptual intelligence stage, the intelligent cockpit mainly relies on in-cabin sensor systems such as cameras, microphones and touch screens to capture and identify the behavior, voice and gesture information of the driver and passengers, and then converts that information into machine-recognizable data. However, limited by preset rules and algorithm frameworks, the cockpit interaction system at this stage still lacks the capability for independent decision-making and self-optimization, which is mainly reflected in its passive response to input information.

After entering the cognitive intelligence stage, intelligent cockpits can comprehensively analyze multiple data types such as voice, vision and touch by virtue of the powerful multimodal processing capabilities of foundation model technology. This makes intelligent cockpits highly intelligent and humanized: they can actively think and serve, keenly perceive the actual needs of the driver and passengers, and provide users with personalized HMI services.
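
The difference between the two stages can be pictured as a fixed rule table versus a model that reasons over fused multimodal context. Below is a minimal, hypothetical sketch of that contrast; ask_foundation_model is a placeholder returning a canned plan, not any real vendor API.

```python
# Perceptual-intelligence style: recognized inputs map to actions via fixed rules.
RULES = {
    ("voice", "turn on the ac"): "hvac.on",
    ("gesture", "swipe_left"): "media.next_track",
}

def perceptual_cockpit(modality: str, recognized: str) -> str:
    """Passive response: act only when the input matches a predefined rule."""
    return RULES.get((modality, recognized), "no_action")

def ask_foundation_model(prompt: str) -> str:
    """Placeholder for a multimodal foundation-model call (hypothetical).
    Returns a canned plan so the sketch stays self-contained."""
    return "lower_music_volume; raise_rear_ac_temperature; set_drive_mode_comfort"

def cognitive_cockpit(voice: str, cabin_vision: str, touch: str) -> list[str]:
    """Cognitive-intelligence style: fuse voice, vision and touch context and let
    the foundation model decide on (possibly proactive) actions."""
    context = (f"Driver said: {voice!r}. Cabin camera: {cabin_vision!r}. "
               f"Last touch input: {touch!r}. Decide helpful cockpit actions.")
    return [action.strip() for action in ask_foundation_model(context).split(";")]

print(perceptual_cockpit("voice", "turn on the ac"))                  # rule hit
print(cognitive_cockpit("", "child asleep in rear seat", "none"))     # model-planned
```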

Case 1: SenseAuto introduced an intelligent cockpit AI foundation model product, A New Member For U, at the 2024 SenseAuto AI DAY. It can be regarded as the "Jarvis" on the vehicle: it can weigh up occupants' words, observe their expressions, and actively think, serve, and plan. For example, on the road it can actively turn up the air conditioner temperature and lower the music volume for a child sleeping in the rear seat, and switch the chassis and driving mode to comfort mode to create a more comfortable sleeping environment. In addition, it can actively detect the physical condition of occupants, find the nearest hospital for a sick occupant, and plan the route there.

Case 2: NOMI Agents, NIO's multi-agent framework, uses AI foundation models to reconstruct NOMI's cognition and complex task processing capabilities, allowing it to learn to use tools, for example calling search, navigation, and reservation services. Meanwhile, according to the complexity and time span of a task, NOMI is able to perform complex planning and scheduling. For example, among NOMI's six core multi-agent functions, "NOMI DJ" recommends a playlist that suits the context based on users' needs and actively creates an atmosphere, while "NOMI Exploration" reasons about spatial orientation, matches it with map data and world knowledge, and answers children's questions such as "What is the tower on the side?"
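
The tool-use behaviour described here follows the now-common agent pattern: a planner chooses among registered tools (search, navigation, reservation) and composes them to complete a task. The sketch below shows that dispatch pattern in generic form; it is not NIO's NOMI Agents code, and choose_tools is a stub standing in for the foundation model's planning step.

```python
from typing import Callable

# Tools the agent is allowed to call (stubs so the sketch is runnable).
TOOLS: dict[str, Callable[[str], str]] = {
    "search":      lambda q: f"search results for {q!r}",
    "navigation":  lambda q: f"route planned to {q!r}",
    "reservation": lambda q: f"reservation placed: {q!r}",
}

def choose_tools(task: str) -> list[tuple[str, str]]:
    """Stand-in for the foundation model's planning step: map a user task to an
    ordered list of (tool, argument) calls. A real agent would generate this plan."""
    if "restaurant" in task:
        return [("search", "highly rated restaurants nearby"),
                ("reservation", "table for two this evening"),
                ("navigation", "the chosen restaurant")]
    return [("search", task)]

def run_agent(task: str) -> list[str]:
    """Execute the planned tool calls in order and collect their results."""
    return [TOOLS[name](arg) for name, arg in choose_tools(task)]

for step in run_agent("book a restaurant for tonight"):
    print(step)
```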

Table of Contents

1 Overview of Cockpit Multimodal Interaction

2 Cockpit Single-modal Interaction

3 Cockpit Multimodal Interaction Solutions of OEMs

4 Cockpit Multimodal Interaction Solutions of Suppliers

5 Application Cases of Multimodal Interaction Solutions in Benchmarking Vehicle Models

6 Summary and Development Trends of Multimodal Interaction

Global Information, Inc. 02-2025-2992 kr-info@giikorea.co.kr
© Copyright Global Information, Inc. All rights reserved.