Improve Customer Experience in Automotive Industry Through Advanced Driver Assistant Systems

Luca Giraldi

University of Macerata

This manuscript explores the impact of emerging technologies in the automotive industry, specifically focusing on enhancing the customer experience and driving safety. The study investigates the advantages of incorporating emotion-tracking technologies like facial coding and affective computing algorithms into Advanced Driver Assistant Systems (ADAS). A simulated experiment involving 20 participants uses facial coding systems to track emotional responses and statistical analysis to establish a connection between emotional states and driving behaviour. The research reveals a strong correlation between negative emotions and unsafe driving behaviour. The study proposes an ADAS system that utilizes emotional tracking to provide real-time feedback to drivers and adjust the driving environment accordingly. Although the study highlights the potential benefits of emotional tracking technology, it emphasizes the need for further research to refine and validate the proposed ADAS system and address privacy concerns. Overall, this research offers an innovative approach to improving driving experience and safety, contributing to integrating new technologies in the automotive sector.

Keywords: customer experience, driver experience, automotive, sensors, advanced driver assistant systems

INTRODUCTION

Digitalisation and new technologies have profoundly influenced the proposition of products and their delivery in all sectors and aspects of daily life. New technologies have been integrated, such as the Internet of Things for the exploitation of large amounts of data (Lee et al., 2022), Artificial Intelligence (e.g., Karim et al., 2022), Blockchain (e.g., Xu et al., 2019), or emerging techniques such as tokenisation (e.g., Wang et al., 2021), which are “hidden” and non-invasive, profoundly impacting the customer journey and the purchasing behaviour of consumers.

The customer journey concerns understanding and analysing the consumer experience (customer experience, CX), defined as a consumer’s inner and subjective reaction to any direct or indirect contact with a company (Meyer & Schwager, 2007). It is particularly relevant in areas such as shopping (Shi, Wang, Chen, & Zhang, 2020), service delivery (e.g., in finance, Buckley & Webster, 2016), and manufacturing sectors such as automotive (Genzlinger et al., 2020). In this sector, a phase of the customer journey of particular interest is the driving experience, for instance in self-driving (Elliott, Meng, & Hall, 2021), as it determines the comfort of the journey and influences road safety. This experience is strongly influenced by the driver’s emotions, which can significantly increase or decrease their attention to driving and their driving pleasure.

For example, a customer experience scenario for a driver can be represented by how much a given danger affects their emotions or how much environmental factors affect satisfaction and safety (Pêcher et al., 2011). Indeed, road safety is recognised as a central issue by consumers, since driving is an activity influenced by the environment, the road infrastructure, the mechanical-electrical status of the car, and the emotional and cognitive state of the driver, which in some cases poses risks to oneself and others (e.g., Müller-Seitz, Dautzenberg, Creusen, & Stromereder, 2009). To reduce some common causes of accidents, the automotive industry is developing innovative ways to provide safety through advanced driver assistance systems (ADAS) (Sullivan, 2016), which help reduce road accidents and promote smoother and more efficient transportation. Such systems monitor external factors such as the road, the amount of light, and the distance from other cars, alerting the driver to the existence of a risk or actively assisting in reducing its impact. Most ADAS, both active and passive, mainly focus on monitoring the performance of the vehicle and the surrounding environment without evaluating the emotional and cognitive state of the driver, which, as mentioned above, can be crucial for comfort and safety. The benefits of an ADAS implementation are potentially considerable because of the significant decrease in human suffering or stress, economic costs and pollution (Nasr et al., 2021).

For this reason, this research aims to investigate whether and how new emotion tracking systems, which integrate face coding and affective computing algorithms, can improve the driving experience, laying the foundation for their future integration into ADAS. It seeks to determine whether such systems can measure emotions and how much they can influence the driver’s decisions. The experiment presented in the article seeks to answer the research question. Then it provides a new concept of ADAS capable of overcoming the limitations of current ones and significantly improving the customer experience of driving.

LITERATURE REVIEW

Literature is unanimous in stating that behaviour and decision-making processes are influenced by emotions (Kopton & Kenning, 2014); in turn, emotions are influenced by the surrounding environment and by sensory and contextual stimuli (Chaudhuri & Micu, 2018). Emotions occur in every situation, including driving. Negative emotions such as anger and sadness have been shown to activate unrestrained driver actions, contributing to unsafe driving (Jeon, 2016). Driving, in turn, is affected by several external variables and environmental conditions, such as the weather, situational conditions (traffic, traffic jams, accidents), and interactional conditions; therefore, the driver’s affective state is affected and changes continuously (Steyer, Ferring, & Schmitt, 1992).

The results of the study conducted by Tavakoli, Boukhechba, & Heydarian (2020) suggest that drivers are less stressed when driving in non-rainy conditions. In addition, it was also indicated that high traffic intensity may cause frustration in the driver, which can result in more abrupt braking while driving.

Listening to music also affects the listener’s emotional state and thus induces mood changes in various situations. Navarro, Osiurak, Gaujoux, Ouimet, & Reynaud (2019) observed how listening to music influences driving behaviour using a driving simulator. The results showed that listening or not listening to music while driving affects subjective mood, physiological arousal level and driving performance. Listening to one’s favourite music behind the wheel was found to have a positive impact on mood, leading to higher scores of pleasantness, arousal and a positive mood.

However, driving behaviour was not significantly modified, as revealed by the inter-vehicular time. Also, according to Simmons, Franklin, and Casey (2019), music is an effective tool for influencing behaviour and emotions, especially among women.

The critical point is determining when and where to use this information and how to introduce it to design and create a unique and personalised customer experience. To this end, developing emotion analysis systems is a focal point for organisations to understand how to design products and services based on the contingent emotional states of target consumers (Bagozzi, Gopinath, & Nyer, 1999). In the automotive industry, many scholars have developed systems to evaluate the impact of emotions on driving performance, allowing vehicles to adapt the driving experience according to the preferences and comfort levels of drivers and passengers.

An advanced ADAS system can support the driver while driving through an empathetic and adaptive response to the driver’s emotional state.

Traditional research methods such as pre- and post-drive questionnaires and interviews are still widely used. However, they can be considered invasive as they directly request preferences or feelings/sensations from the individual (Telpaz, Webb, & Levy, 2015), leading to doubts about their actual effectiveness in detection (Breuer, Scherndl, & Ortner, 2020). Behavioural research shows difficulties in understanding emotional reactions using traditional interviews and questionnaires (Iloka & Onyeke, 2020).

In recent years, there has been growing confidence among scholars that using innovative research tools can enable the collection of additional data, such as feelings, emotions, values, memories, or even judgments, using non-invasive methodologies (Lim, 2018; Plassmann, Venkatraman, Huettel, & Yoon, 2015). Specifically, the literature focuses on three methods for recognising emotions while driving:

  • Recognition of emotions based on changes in tone of voice, intensity, and speech energy (Douglas-Cowie et al., 2003) through the installation of a microphone attached to the seatbelt.
  • Recognition of emotions through physiological data such as body temperature, heart rate, and skin reaction (Shu et al., 2018). However, this tracking approach requires invasive tools such as fixed, mobile or wearable devices (Haag, Goronzy, Schaich, & Williams, 2004), and, thanks to technological advances, several scholars (Shu et al., 2018; Torres, Torres, Hernández-Álvarez, & Yoo, 2020) consider methods such as ECG or EEG obsolete, as they are invasive and alter the spontaneity and, consequently, the emotions that users experience (Phan et al., 2021).
  • Facial coding systems recognise emotions using a camera that records and analyses images and videos (Cheng, Wang, Jiang, Hou, & Qin, 2019). This allows for identifying the human state at any moment during driving (e.g., in a dangerous situation for the driver) through the identification of facial expressions.

The analysis of emotions can be integrated by studying other elements helpful in understanding the user, such as eye tracking, which studies people’s eye movement and behaviour based on this movement (Jacob & Karn, 2003). Different technologies can be used to measure human eye movement: the most common ones are those that measure the observation of controlled stimuli at fixed points in photos, videos, and other interactions that users have through the screen of a computer (e.g., Holmqvist et al., 2011).

Thanks to technological progress, it is now possible to perform eye-tracking analysis entirely remotely from a simple webcam on any electronic device, without asking participants to wear special equipment (e.g., Yang & Krajbich, 2023). Today, cameras are everywhere, in cities, homes, and cars, making it possible to track human emotions in real time, even while driving. This has led to the development of new systems such as the Driver Monitoring System (DMS), which uses eye-tracking technology to detect drowsiness and distraction and provides alerts to the driver (e.g., Gupta, Smith, & Shalley, 2006).
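A common way such driver-monitoring systems quantify drowsiness is a PERCLOS-style measure, i.e., the proportion of recent frames in which the eyes are (nearly) closed. The following Python sketch illustrates that idea under stated assumptions; the thresholds and function names are hypothetical and do not describe any specific DMS.

# Illustrative PERCLOS-style drowsiness check: the fraction of recent frames in
# which the eyes are (nearly) closed. Thresholds and names are hypothetical.
def perclos(eye_openness, closed_below=0.2):
    """eye_openness: per-frame values in [0, 1], where 0 means fully closed."""
    closed_frames = sum(1 for o in eye_openness if o < closed_below)
    return closed_frames / len(eye_openness)

def is_drowsy(eye_openness, alarm_level=0.15):
    """Flag drowsiness when the eyes are closed for more than 15% of the window."""
    return perclos(eye_openness) > alarm_level

# Ten recent frames: three show (nearly) closed eyes, so PERCLOS = 0.3 -> drowsy.
window = [0.9, 0.8, 0.1, 0.05, 0.85, 0.9, 0.1, 0.95, 0.9, 0.88]
print(perclos(window), is_drowsy(window))  # 0.3 True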

Integrating eye-tracking technology with other technologies, such as facial expression recognition and physiological sensors, can create a more comprehensive system to track emotional states and improve the driving experience (Liu, We, & Xu, 2019). One of the key factors for the success of these systems is the ability to measure and interpret emotional states accurately. In recent years, affective computing, which aims to develop algorithms and systems capable of recognising, interpreting, and responding to human emotions, has gained significant attention (Picard, 1997). Affective computing has been used in various applications, including gaming, education, and healthcare, to improve user experience and outcomes (e.g., Fonseca et al., 2021; Yadegaridehkordi et al., 2019).

In the automotive industry, affective computing can create a more personalised and emotionally engaging driving experience by identifying and responding to the driver’s emotional state (Oh et al., 2021).

In conclusion, the integration of new technologies such as eye-tracking and affective computing has the potential to revolutionise the automotive industry by creating more personalised and emotionally engaging driving experiences, as well as improving safety. The present study aims to investigate the potential of such technologies in improving the driving experience and safety by measuring emotional states. The findings of this study can contribute to the development of more advanced and comprehensive ADAS and ultimately provide a more satisfying and safe driving experience for consumers.

METHODOLOGY

An exploratory research design was adopted to understand the integrability of non-intrusive technologies for driver monitoring and to utilise the data to enhance safety through ADAS. The study involved installing an advanced facial and ocular recognition system, typically used in neuromarketing, in a driving simulator to test its effectiveness for the above-mentioned purposes. The software system used in this research was developed by EMOJ, a spin-off of the Polytechnic of Marche established in 2017.

The method comprises three modules:

  • The first module analyses all human-system interactions at every touchpoint by encoding faces captured by any camera in the environment.
  • The second module collects data and calculates essential indicators to assess user experiences, such as satisfaction, attention, and engagement.
  • The third module selects a series of actions to improve the experience. It activates a series of real-time reactions, such as changes in lighting, content adaptation, and other stimuli, to achieve a different emotional state so that the experience is perceived as more enjoyable, comforting, and engaging.

More specifically, the first module of the software enables the acquisition of various types of information related to customer behaviour. It implements multiple technologies such as facial recognition, recognition of facial expressions, gaze tracking, head orientation monitoring, and body posture monitoring. The second module provides a collection of smart analytics tools that allow rapid and easy processing of the collected data to extract valuable insights for better understanding the touchpoint characteristics and cues that most influence the customer experience. The last module utilises the system described in Ceccacci, Generosi, Giraldi, and Mengoni (2018) to implement artificial intelligence (machine learning) algorithms based on inductive inference: decisions are made according to logical rules derived from a knowledge base that links customer profiles to the adaptive features of products, services, and cues, defined through the analysis of historical data or the results of psychological and marketing studies.
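As an illustration of this kind of rule-based inference, the following Python sketch maps an estimated user state to an adaptive cue through a handful of logical rules drawn from a hypothetical knowledge base; the profile fields, thresholds, and cue names are assumptions for illustration and do not reproduce the EMOJ implementation.

# Minimal sketch of a rule-based decision step: logical rules from a small,
# hypothetical "knowledge base" map an estimated user state to an adaptive cue.
KNOWLEDGE_BASE = [
    # (predicate over the observed state, adaptive action)
    (lambda s: s["attention"] < 0.4,                        "play_attention_cue"),
    (lambda s: s["valence"] < -0.4 and s["arousal"] > 0.6,  "soften_lighting"),
    (lambda s: s["engagement"] < 0.3,                       "adapt_content"),
]

def decide(state):
    """Return the first adaptive action whose rule matches the observed state."""
    for rule, action in KNOWLEDGE_BASE:
        if rule(state):
            return action
    return "keep_current_experience"

# Example: an attentive but negatively aroused user triggers the lighting rule.
print(decide({"engagement": 0.7, "valence": -0.6, "arousal": 0.8, "attention": 0.9}))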

In this way, the system can monitor customers every time they approach any touchpoint along their path and react by adapting the experience based on their emotions and behaviours. The above-described software system was installed in a driving simulator (Figure 1) equipped with a 4k camera positioned on the dashboard to monitor the driver’s emotional and cognitive state in real time while driving on a simulated route, during which different sounds with solid emotional connotations were introduced to elicit different reactions.

Twenty participants with an average age of 27.5 years, all with valid driver’s licenses and normal hearing abilities, were involved in the study. Each trial in the present study was 20 minutes for the control group and 15 minutes for the experimental group.

Before the main driving task commenced, all participants underwent a 5-minute familiarisation period in the driving simulator. Subsequently, both groups completed a 6-minute driving task, during which the ten participants in the experimental group received six acoustic stimuli; the remaining ten (the control group) drove without any acoustic stimulation.

We focused on sound stimuli because many studies have examined driving and music: 91% of music exposure occurs during transportation transits, and heart rate mediates the relationship between auditory stimuli and driving performance. Music can affect relaxation, speed, or even driver stress while driving (Febriandirza, Chaozhong, Zhong, Hu, & Zhang, 2017). One of the pioneering studies (Brown, 1965) examined the effects of background music, speech and silence during light and heavy traffic. Music (sound having rhythm, melody or harmony) may be more distracting than noise (an unwanted signal or disturbance). Using a 3D auditory display to present information from advanced driver assistance systems reduces eyes-off-road time by exploiting the human capability to associate sounds with positions in space (Bellotti, Berta, De Gloria, & Margarone, 2002).

This study explored the effect of acoustic stimulation on emotion in a driving context, using a convolutional neural network model to recognise Ekman’s six universal emotions. The classification data for the six emotions were combined to create an engagement index, a metric of emotional involvement as opposed to the absence of emotions. Furthermore, the emotion indices were combined to differentiate emotional engagement according to the arousal-valence model.

FIGURE 1

DRIVING SIMULATOR

On this basis, four new indices were created; a minimal computational sketch follows their definitions:

Engagement classified on the arousal axis:

  • Active engagement (Surprise, Fear, Anger, Happiness);
  • Passive engagement (Sadness and Disgust).

Engagement classified on the valence axis:

  • Positive engagement (Happiness and Surprise);
  • Negative engagement (Anger, Disgust, Sadness, Fear).
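To make the aggregation concrete, the following Python sketch combines per-frame classification scores for the six universal emotions into the four indices defined above; the simple summation is an illustrative assumption rather than the exact metric used by the system.

# Hedged sketch: deriving the four engagement indices from per-frame classification
# scores for Ekman's six universal emotions. The summation is an assumption.
def engagement_indices(scores):
    """scores: per-frame classification probabilities for the six universal emotions."""
    active   = sum(scores[e] for e in ("surprise", "fear", "anger", "happiness"))
    passive  = sum(scores[e] for e in ("sadness", "disgust"))
    positive = sum(scores[e] for e in ("happiness", "surprise"))
    negative = sum(scores[e] for e in ("anger", "disgust", "sadness", "fear"))
    return {"active": active, "passive": passive,
            "positive": positive, "negative": negative}

frame = {"anger": 0.05, "disgust": 0.02, "fear": 0.10,
         "happiness": 0.55, "sadness": 0.03, "surprise": 0.20}
print(engagement_indices(frame))  # active ~0.90, passive 0.05, positive 0.75, negative ~0.20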

While the users were driving, the simulator recorded the steering wheel movements, pedal force, and exact position of the car in the simulated scenario and sent the camera footage to the EMOJ system for subsequent processing. The seven sounds played were selected from the IADS (International Affective Digitized Sounds) dataset, each intended to elicit different emotions. The first analysis verified the emotional effects of the acoustic stimuli on the driver by measuring the emotional state, i.e., recognising Ekman’s (2005) six emotions (anger, surprise, disgust, enjoyment, fear, and sadness) and calculating the positive and negative engagement indices in the two groups of users.

The second analysis concerned the effects of emotional state on driving behaviour, evaluating driving performance under the different emotional conditions recorded by the software system. At the end of the tasks, all participants were asked to complete a NASA-TLX (NASA Task Load Index) survey to rate perceived workload, a method for estimating workload while a group of people performs a task or immediately afterwards. NASA-TLX was initially designed for use in aviation, but in recent years a growing share of studies (about 8%) has focused on automobile drivers.
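For reference, the raw (unweighted) NASA-TLX score is the mean of the six subscale ratings. The Python sketch below illustrates this computation on the 0–20 scale reported in the Results; the ratings shown are invented and are not data from the study.

# Hedged sketch of an unweighted (raw) NASA-TLX score: the average of the six
# subscale ratings on a 0-20 scale. The example ratings are made up.
SUBSCALES = ("mental_demand", "physical_demand", "temporal_demand",
             "performance", "effort", "frustration")

def raw_tlx(ratings):
    """Average of the six subscale ratings, each on a 0-20 scale."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

example = {"mental_demand": 9, "physical_demand": 4, "temporal_demand": 7,
           "performance": 6, "effort": 11, "frustration": 5}
print(raw_tlx(example))  # 7.0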

RESULTS

The emotion recognition system effectively identified distinct emotional states following the reproduction of the acoustic stimuli. For instance, for sounds typical of a road accident, human vomiting, or a vehicle malfunction, the prevailing emotions recorded were sadness and anger. In contrast, for a child’s laughter or flatulence, joy was the prevailing emotion, as shown in Figure 2. The analysis of the level of engagement between the two groups, with and without acoustic stimuli, showed a difference in emotional reaction: the absence of stimuli produced reduced engagement and vice versa. This evidence supports the importance of introducing alarms and sounds in ADAS to improve driving attention and, consequently, safety. Stimuli of the same kind were presented in different ways to test the consistency of the testers’ reactions.

FIGURE 2

RESULTS OF THE EXECUTED TESTS

During the tests performed by the two groups, the internal monitoring system in the driving simulator recorded the SDLP, i.e., the standard deviation of the vehicle’s lateral position on the road, as a performance indicator. SDLP is typically an index of ‘weaving’: a stable measure of driving performance with high test–retest reliability. Comparing the two groups showed greater accuracy when acoustic stimuli were present than in their absence (Ceccacci et al., 2020). The second primary emotion monitored was sadness, associated with five stimuli (car crash, human vomiting, zombie, car horn and scream of pain), followed by happiness, which showed a higher level for two stimuli (laughing child and fart/raspberry).
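SDLP is straightforward to derive from the simulator’s logged lane position: it is the standard deviation of the vehicle’s lateral offset over a driving segment, as in the illustrative Python sketch below (the sample values are invented).

# Minimal sketch of computing SDLP (standard deviation of lateral position) from
# simulator samples; the sample values are illustrative, not data from the study.
import statistics

def sdlp(lateral_positions_m):
    """Standard deviation of the vehicle's lateral position (in metres) within the lane."""
    return statistics.stdev(lateral_positions_m)

# Lateral offsets (m) from the lane centre sampled over a short segment of driving.
samples = [0.05, 0.12, -0.03, 0.20, 0.08, -0.10, 0.15, 0.02]
print(round(sdlp(samples), 3))  # lower SDLP = less weaving, i.e. more stable lane keeping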

The cognitive workload for each group was assessed using the NASA-TLX, which defines workload as “the sources of loading imposed by different tasks” (Hart, 2006). The overall workload rating in this study was medium-low, falling in the bottom half of the NASA-TLX scale in all experimental conditions (values range from 0 to 20). Workload was higher overall in the control group (driving task without acoustic stimuli), with a mean value of 8.18, whereas the experimental group (driving task with acoustic stimuli) reported a mean workload of 5.3. For both groups, the item with the highest rating was Effort: 11 in the experimental condition and 6.5 in the control group (Ceccacci et al., 2020).

These results are consistent with previous studies showing the potential of audio-based emotional detection and its application in driving safety (Han, He, Wu, Peng, & Li, 2019; Schuller, Steidl, Batliner, Schiel, & Krajewski, 2011). Furthermore, they support that audio-based emotional detection can enhance driving safety by alerting drivers to potential hazards and helping them maintain their attention on the road (Han et al., 2019). In conclusion, our study highlights the potential of audio-based emotional detection in improving driving safety. The findings support the importance of audio stimuli in ADAS to enhance drivers’ attention to the road and promote safe driving. Further research is needed to explore the full potential of audio-based emotional detection in driving safety.

New ADAS Concept

The tests on the driving simulator described above laid the foundation for designing a new concept of passive ADAS that acts on the driver’s emotional response, increasing their attention to driving and involvement and producing positive effects on safety. The project for the new ADAS, called HU-DRIVE, resulted from the collaboration between EMOJ and RE:Lab, a hi-tech company specialising in interaction engineering for the automotive sector.

HU-DRIVE, schematised in Figure 3, is an innovative assisted driving system that dynamically adapts the vehicle’s interface to the driver’s emotional and cognitive conditions.

FIGURE 3

HU-DRIVE ASSISTED DRIVING SYSTEM

HU-DRIVE consists of two main software modules:

  • The first module deals with Face Coding, i.e., recognising the six Ekman emotions and the gaze direction from face images, and then measuring the levels of emotional valence, arousal, involvement, and attention. The images come from a Full HD camera positioned above the steering wheel in the centre of the dashboard and are processed by Artificial Intelligence algorithms integrated into the car’s control unit (AI Edge).
  • The second module selects a series of nudges, i.e., activators of emotional reactions, such as sounds and alarms, infotainment graphics, and the colours and intensity of the cluster, infotainment, and internal vehicle environment, to implement emotional state regulation strategies that calm the driver or stimulate them in case of fatigue (a simplified sketch of this selection logic follows).
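The sketch below illustrates, under stated assumptions, how the second module’s regulation strategies could be selected from the valence, arousal, and attention estimates produced by the first module; the thresholds and nudge names are hypothetical and only exemplify the kind of mapping described above.

# Hypothetical sketch of HU-DRIVE-style nudge selection: map valence/arousal/attention
# estimates to a regulation strategy. Thresholds and nudge names are assumptions.
def select_nudges(valence, arousal, attention):
    """valence in [-1, 1]; arousal and attention in [0, 1]."""
    nudges = []
    if attention < 0.4:
        nudges.append("audible_alert")                          # bring attention back to the road
    if valence < -0.3 and arousal > 0.6:
        nudges += ["soft_cluster_colours", "calming_playlist"]  # calming strategy
    elif arousal < 0.2:
        nudges += ["brighter_cluster", "energetic_playlist"]    # counteract fatigue
    return nudges or ["no_action"]

# Example: an agitated but attentive driver receives only the calming nudges.
print(select_nudges(valence=-0.6, arousal=0.8, attention=0.9))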

The HU-DRIVE technology can provide feedback to the driver according to the emotional reaction, as shown in Figures 4 and 5.

Whenever the system detects a critical state, the second module activates reactions accompanying the driver in controlling their emotions and bringing their attention back to driving. The HU-DRIVE concept, schematised in Figure 6, is currently in the prototyping phase and will first be tested in the simulator as mentioned above and then brought to the control unit of a car to realise the new generation of “emotional” ADAS.

FIGURE 4

HU-DRIVE SOFTWARE EXAMPLE PICTURE #1

FIGURE 5

HU-DRIVE SOFTWARE EXAMPLE PICTURE #2

FIGURE 6

CONCEPTUAL INFRASTRUCTURE OF HU-DRIVE

DISCUSSION

The objective of the present study was to examine the impact of emotions on different driving experiences using novel, non-intrusive technological systems that support safe driving. Specifically, we investigated the utility of new emotion-tracking methods, which can increase driver attention and, thus, safety through acoustic signals.

Furthermore, we highlighted the reliability and accuracy of the driver’s emotion recognition system in the automotive context, shedding light on which emotions occur more frequently in specific driving contexts.

The real-time driver tracking system becomes integral to the development of an intelligent ADAS called HU-DRIVE, which obtains and processes information about the driver’s emotional state to personalise the driving experience.

The potential improvement that tools like HU-DRIVE can bring to the driving experience is evident: they automatically monitor the driver’s experience in real time and in a non-intrusive manner, while also providing companies with valuable information about the driver, including spontaneous or subconscious emotions, which offers insight into the usability and user experience of driving.

Measurements taken and analysis of emotional responses, including verification of driving performance, confirmed that acoustic stimulation can act as a facilitator and that emotions induced by variations in lights and sounds within the vehicle can modulate the driver’s state and increase attention and, thus, safety. Moreover, they confirmed the reliability and accuracy of the employed emotion recognition system.

CONCLUSION

In this contribution, we have highlighted the potential of ADAS innovations to enhance driving safety and reduce automotive accidents. By incorporating a reliable and accurate emotional response recognition system and a set of active stimuli, it is feasible to facilitate the driver’s attention state and consequently promote driving safety. However, to validate and further advance the results achieved, future research will aim to enhance the eye-tracking technology, as it has been demonstrated that the permissible mean error can still be improved. In addition, our efforts will be directed towards improving the accuracy of detecting micro-facial expressions. The present study’s findings offer a promising avenue for developing more effective ADAS solutions to improve driving safety and mitigate the risk of automotive accidents. ADAS in heavy vehicles has garnered increased attention recently (Blades, Douglas, Early, Lo, & Best, 2020), and commercial fleets have increasingly implemented ADAS to improve safety. ADAS impacts not only the driving experience but also how drivers feel whilst driving (Eckoldt, Knobel, Hassenzahl, & Schumann, 2012). Continued research in this area is crucial to advancing the field and preserving lives on the road.

REFERENCES

Bagozzi, R.P., Gopinath, M., & Nyer, P.U. (1999). The Role of Emotions in Marketing. Journal of the Academy of Marketing Science, 27(2). https://doi.org/10.1177/0092070399272005

Bellotti, F., Berta, R., De Gloria, A., & Margarone, M. (2002). Using 3D Sound to Improve the Effectiveness of the Advanced Driver Assistance Systems. Personal and Ubiquitous Computing, 6(3), 155–163. https://doi.org/10.1007/s007790200016

Blades, L., Douglas, R., Early, J., Lo, C.Y., & Best, R. (2020, April 14). Advanced Driver-Assistance Systems for City Bus Applications. https://doi.org/10.4271/2020-01-1208

Breuer, S., Scherndl, T., & Ortner, T.M. (2020). Effects of Response Format on Psychometric Properties and Fairness of a Matrices Test: Multiple Choice vs. Free Response. Frontiers in Education, 5. https://doi.org/10.3389/feduc.2020.00015

Brown, I.D. (1965). Effect of a car radio on driving in traffic. Ergonomics, 8(4), 475–479. https://doi.org/10.1080/00140136508930828

Ceccacci, S., Generosi, A., Giraldi, L., & Mengoni, M. (2018). Tool to Make Shopping Experience Responsive to Customer Emotions. International Journal of Automation Technology, 12(3), 319–326. https://doi.org/10.20965/ijat.2018.p0319

Ceccacci, S., Mengoni, M., Andrea, G., Giraldi, L., Carbonara, G., Castellano, A., & Montanari, R. (2020). A Preliminary Investigation Towards the Application of Facial Expression Analysis to Enable an Emotion-Aware Car Interface. https://doi.org/10.1007/978-3-030-49108-6_36

Chaudhuri, A., & Micu, D. (2018). Consumer, Product and Situational Effects on Willingness to Help Churches: The Role of Emotion and Reason. Journal of Marketing Development and Competitiveness, 12(3). https://doi.org/10.33423/jmdc.v12i3.66

Cheng, Q., Wang, W., Jiang, X., Hou, S., & Qin, Y. (2019). Assessment of Driver Mental Fatigue Using Facial Landmarks. IEEE Access, 7, 150423–150434. https://doi.org/10.1109/ACCESS.2019.2947692

Douglas-Cowie, E., Cowie, R., Sneddon, I., Cox, C., Lowry, O., McRorie, M., . . . Karpouzis, K. (n.d.). The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data. In Affective Computing and Intelligent Interaction (pp. 488–500). Berlin, Heidelberg: Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-540-74889-2_43

Eckoldt, K., Knobel, M., Hassenzahl, M., & Schumann, J. (2012). An Experiential Perspective on Advanced Driver Assistance Systems. it – Information Technology, 54(4), 165–171. https://doi.org/10.1524/itit.2012.0678

Ekman, P. (2005). Basic Emotions. In Handbook of Cognition and Emotion (pp. 45–60). Chichester, UK: John Wiley & Sons, Ltd. https://doi.org/10.1002/0470013494.ch3

Elliott, K., Meng, J.G., & Hall, M. (2021). An Integrated Approach for Predicting Consumer Acceptance of Self-Driving Vehicles in the United States. Journal of Marketing Development and Competitiveness, 15(2). https://doi.org/10.33423/jmdc.v15i2.4330

Febriandirza, A., Chaozhong, W., Zhong, G., Hu, Z., & Zhang, H. (2017). The Effect of Natural Sounds and Music on Driving Performance and Physiological. Engineering Letters, 25(4), 1–9.

Fonseca, X., Slingerland, G., Lukosch, S., & Brazier, F. (2021). Designing for meaningful social interaction in digital serious games. Entertainment Computing, 36, 100385. https://doi.org/10.1016/j.entcom.2020.100385

Gupta, A.K., Smith, K.G., & Shalley, C.E. (2006). The Interplay Between Exploration and Exploitation. Academy of Management Journal, 49(4), 693–706. https://doi.org/10.5465/amj.2006.22083026

Haag, A., Goronzy, S., Schaich, P., & Williams, J. (2004). Emotion Recognition Using Bio-sensors: First Steps towards an Automatic System. https://doi.org/10.1007/978-3-540-24842-2_4

Han, X., He, H., Wu, J., Peng, J., & Li, Y. (2019). Energy management based on reinforcement learning with double deep Q-learning for a hybrid electric tracked vehicle. Applied Energy, 254, 113708. https://doi.org/10.1016/j.apenergy.2019.113708

Hart, S.G. (2006). Nasa-Task Load Index (NASA-TLX); 20 Years Later. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 50(9), 904–908. https://doi.org/10.1177/154193120605000909

Holmqvist, K., Nyström, M., Andersson, R., Dewhurst, R., Jarodzka, H., & Van de Weijer, J. (2011). Eye-tracking: A comprehensive guide to methods, paradigms and measures. Oxford: Oxford University Press.

Iloka, B.C., & Onyeke, K.J. (2020). Neuromarketing: A historical review. Neuroscience Research Notes, 3(3), 27–35. https://doi.org/10.31117/neuroscirn.v3i3.54

Jacob, R.J.K., & Karn, K.S. (2003). Eye Tracking in Human-Computer Interaction and Usability Research. In The Mind’s Eye (pp. 573–605). Elsevier. https://doi.org/10.1016/B978-044451020-4/50031-1

Jeon, M. (2016). Don’t Cry While You’re Driving: Sad Driving Is as Bad as Angry Driving. International Journal of Human–Computer Interaction, 32(10), 777–790. https://doi.org/10.1080/10447318.2016.1198524

Kopton, I.M., & Kenning, P. (2014). Near-infrared spectroscopy (NIRS) as a new tool for neuroeconomic research. Frontiers in Human Neuroscience, 8. https://doi.org/10.3389/fnhum.2014.00549

Lim, W.M. (2018). Demystifying neuromarketing. Journal of Business Research, 91, 205–220. https://doi.org/10.1016/j.jbusres.2018.05.036

Liu, M., We, K., & Xu, J.J. (2019). How Will Blockchain Technology Impact Auditing and Accounting: Permissionless versus Permissioned Blockchain. American Accounting Association, 13(2), 19–29.

Meyer, C., & Schwager, A. (2007, February). Understanding Customer Experience. Harvard Business Review.

Müller-Seitz, G., Dautzenberg, K., Creusen, U., & Stromereder, C. (2009). Customer acceptance of RFID technology: Evidence from the German electronic retail sector. Journal of Retailing and Consumer Services, 16(1), 31–39. https://doi.org/10.1016/j.jretconser.2008.08.002

Navarro, J., Osiurak, F., Gaujoux, V., Ouimet, M.C., & Reynaud, E. (2019). Driving Under the Influence: How Music Listening Affects Driving Behaviors. Journal of Visualized Experiments, (145). https://doi.org/10.3791/58342

Oh, G., Ryu, J., Jeong, E., Yang, J.H., Hwang, S., Lee, S., & Lim, S. (2021). DRER: Deep Learning–Based Driver’s Real Emotion Recognizer. Sensors, 21(6), 2166. https://doi.org/10.3390/s21062166

Phan, T.-D.-T., Kim, S.-H., Yang, H.-J., & Lee, G.-S. (2021). EEG-Based Emotion Recognition by Convolutional Neural Network with Multi-Scale Kernels. Sensors, 21(15), 5092. https://doi.org/10.3390/s21155092

Picard, R.W. (1997). Affective Computing. MIT Press.

Plassmann, H., Venkatraman, V., Huettel, S., & Yoon, C. (2015). Consumer Neuroscience: Applications, Challenges, and Possible Solutions. Journal of Marketing Research, 52(4), 427–435. https://doi.org/10.1509/jmr.14.0048

Schuller, B., Steidl, S., Batliner, A., Schiel, F., & Krajewski, J. (2011). The INTERSPEECH 2011 speaker state challenge. Proceedings of Interspeech 2011, 3201–3204. ISCA. https://doi.org/10.21437/Interspeech.2011-801

Shi, S., Wang, Y., Chen, X., & Zhang, Q. (2020). Conceptualization of omnichannel customer experience and its impact on shopping intention: A mixed-method approach. International Journal of Information Management, 50, 325–336. https://doi.org/10.1016/j.ijinfomgt.2019.09.001

Shu, L., Xie, J., Yang, M., Li, Z., Li, Z., Liao, D., . . . Yang, X. (2018). A Review of Emotion Recognition Using Physiological Signals. Sensors, 18(7), 2074. https://doi.org/10.3390/s18072074

Simmons, M., Franklin, J., & Casey, B. (2019). Influential Article Review – Understanding Links Between Emotions and Women’s Product Preferences. Journal of Marketing Development and Competitiveness, 13(6).

Steyer, R., Ferring, D., & Schmitt, M.J. (1992). States and traits in psychological assessment. European Journal of Psychological Assessment, 8(2), 79–98.

Tavakoli, A., Boukhechba, M., & Heydarian, A. (2020). Personalized Driver State Profiles: A Naturalistic Data-Driven Study. https://doi.org/10.1007/978-3-030-50943-9_5

Telpaz, A., Webb, R., & Levy, D.J. (2015). Using EEG to Predict Consumers’ Future Choices. Journal of Marketing Research, 52(4), 511–529. https://doi.org/10.1509/jmr.13.0564

Torres, E.P., Torres, E.A., Hernández-Álvarez, M., & Yoo, S.G. (2020). EEG-Based BCI Emotion Recognition: A Survey. Sensors, 20(18), 5083. https://doi.org/10.3390/s20185083

Wang, Q., Li, R., Wang, Q., & Chen, S. (2021). Non-fungible token (NFT): Overview, evaluation, opportunities and challenges. arXiv:2105.07447. https://doi.org/10.48550/arXiv.2105.07447

Xu, X., Lu, Q., Liu, Y., Zhu, L., Yao, H., & Vasilakos, A.V. (2019). Designing blockchain-based applications a case study for imported product traceability. Future Generation Computer Systems, 92, 399–406. https://doi.org/10.1016/j.future.2018.10.010

Yadegaridehkordi, E., Noor, N.F.B.M., Ayub, M.N. Bin, Affal, H.B., & Hussin, N.B. (2019). Affective computing in education: A systematic review and future research. Computers & Education, 142, 103649. https://doi.org/10.1016/j.compedu.2019.103649

Yang, X., & Krajbich, I. (2023). Webcam-based online eye-tracking for behavioral research. Judgment and Decision Making, 16(6), 1485–1505. https://doi.org/10.1017/S1930297500008512