{"id":13313,"date":"2024-12-30T22:00:00","date_gmt":"2024-12-30T22:00:00","guid":{"rendered":"https:\/\/modernsciences.org\/staging\/4414\/?p=13313"},"modified":"2024-12-13T07:27:59","modified_gmt":"2024-12-13T07:27:59","slug":"ai-emotion-recognition-claims-science-doesnt-stack-up-december-2024","status":"publish","type":"post","link":"https:\/\/modernsciences.org\/staging\/4414\/ai-emotion-recognition-claims-science-doesnt-stack-up-december-2024\/","title":{"rendered":"Tech companies claim AI can recognise human emotions. But the science doesn\u2019t stack up"},"content":{"rendered":"\n<div class=\"theconversation-article-body\">\n    <figure>\n      <img  decoding=\"async\"  src=\"data:image\/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABAQMAAAAl21bKAAAAA1BMVEUAAP+KeNJXAAAAAXRSTlMAQObYZgAAAAlwSFlzAAAOxAAADsQBlSsOGwAAAApJREFUCNdjYAAAAAIAAeIhvDMAAAAASUVORK5CYII=\"  class=\" pk-lazyload\"  data-pk-sizes=\"auto\"  data-pk-src=\"https:\/\/images.theconversation.com\/files\/634895\/original\/file-20241128-15-2pj7uh.jpg?ixlib=rb-4.1.0&#038;rect=686%2C0%2C5681%2C4168&#038;q=45&#038;auto=format&#038;w=754&#038;fit=clip\" >\n        <figcaption>\n          \n          <span class=\"attribution\"><a class=\"source\" href=\"https:\/\/www.shutterstock.com\/image-photo\/collage-close-male-female-eyes-isolated-2016262469\" target=\"_blank\" rel=\"noopener\">Master1305\/Shutterstock<\/a><\/span>\n        <\/figcaption>\n    <\/figure>\n\n  <span><a href=\"https:\/\/theconversation.com\/profiles\/natalie-sheard-1268322\" target=\"_blank\" rel=\"noopener\">Natalie Sheard<\/a>, <em><a href=\"https:\/\/theconversation.com\/institutions\/la-trobe-university-842\" target=\"_blank\" rel=\"noopener\">La Trobe University<\/a><\/em><\/span>\n\n  <p>Can artificial intelligence (AI) tell whether you\u2019re happy, sad, angry or frustrated? <\/p>\n\n<p>According to technology companies that offer AI-enabled emotion recognition software, the answer to this question is yes. 
<\/p>\n\n<p>But this claim does not stack up against mounting scientific evidence.<\/p>\n\n<p>What\u2019s more, emotion recognition technology poses a range of legal and societal risks \u2013 especially when deployed in the workplace. <\/p>\n\n<p>For these reasons, the European Union\u2019s <a href=\"https:\/\/eur-lex.europa.eu\/legal-content\/EN\/TXT\/?uri=CELEX:32024R1689#cpt_II\" target=\"_blank\" rel=\"noopener\">AI Act<\/a>, which <a href=\"https:\/\/commission.europa.eu\/news\/ai-act-enters-force-2024-08-01_en\" target=\"_blank\" rel=\"noopener\">came into force in August<\/a>, bans AI systems used to infer emotions of a person in the workplace \u2013 except for \u201cmedical\u201d or \u201csafety\u201d reasons. <\/p>\n\n<p>In Australia, however, there is not yet specific regulation of these systems. As I argued in my <a href=\"https:\/\/consult.industry.gov.au\/ai-mandatory-guardrails\/submission\/view\/206\" target=\"_blank\" rel=\"noopener\">submission<\/a> to the Australian government in its most recent round of consultations about high-risk AI systems, this urgently needs to change. <\/p>\n\n<h2 id=\"a-new-and-growing-wave\">A new and growing wave<\/h2>\n\n<p>The global market for AI-based emotion recognition systems is <a href=\"https:\/\/www.marketsandmarkets.com\/Market-Reports\/emotion-detection-recognition-market-23376176.html\" target=\"_blank\" rel=\"noopener\">growing<\/a>. It was valued at US$34 billion in 2022 and is expected to reach US$62 billion by 2027.<\/p>\n\n<p>These technologies work by making predictions about a person\u2019s emotional state from biometric data, such as their heart rate, skin moisture, voice tone, gestures or facial expressions. 
<\/p>\n\n<figure class=\"align-center zoomable\">\n            <a href=\"https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip\" target=\"_blank\" rel=\"noopener\"><img  decoding=\"async\"  alt=\"Woman wearing a hoodie, sweating.\"  src=\"data:image\/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABAQMAAAAl21bKAAAAA1BMVEUAAP+KeNJXAAAAAXRSTlMAQObYZgAAAAlwSFlzAAAOxAAADsQBlSsOGwAAAApJREFUCNdjYAAAAAIAAeIhvDMAAAAASUVORK5CYII=\"  class=\" pk-lazyload\"  data-pk-sizes=\"auto\"  data-ls-sizes=\"(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px\"  data-pk-src=\"https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip\"  data-pk-srcset=\"https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=338&amp;fit=crop&amp;dpr=1 600w, https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=338&amp;fit=crop&amp;dpr=2 1200w, https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=338&amp;fit=crop&amp;dpr=3 1800w, https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=424&amp;fit=crop&amp;dpr=1 754w, https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=424&amp;fit=crop&amp;dpr=2 1508w, https:\/\/images.theconversation.com\/files\/638076\/original\/file-20241212-17-4mbr4v.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=424&amp;fit=crop&amp;dpr=3 2262w\" ><\/a>\n            <figcaption>\n              
<span class=\"caption\">Someone\u2019s skin moisture is not a reliable predictor of their emotional state.<\/span>\n              <span class=\"attribution\"><a class=\"source\" href=\"https:\/\/www.shutterstock.com\/image-photo\/sweaty-selfconfident-woman-training-outdoors-2254675929\" target=\"_blank\" rel=\"noopener\">Domenico Fornas<\/a><\/span>\n            <\/figcaption>\n          <\/figure>\n\n<p>Next year, Australian tech startup <a href=\"https:\/\/intruth.io\/\" target=\"_blank\" rel=\"noopener\">inTruth Technologies<\/a> plans to launch a wrist-worn device that it claims can track a wearer\u2019s emotions in real time via their <a href=\"https:\/\/intruth.io\/\" target=\"_blank\" rel=\"noopener\">heart rate and other physiological metrics<\/a>. <\/p>\n\n<p>inTruth Technologies founder Nicole Gibson <a href=\"https:\/\/www.theage.com.au\/technology\/the-bold-aussie-start-up-using-ai-to-track-your-emotions-20241024-p5kl47.html\" target=\"_blank\" rel=\"noopener\">has said<\/a> this technology can be used by employers to monitor a team\u2019s \u201cperformance and energy\u201d or their mental health to predict issues such as post-traumatic stress disorder. <\/p>\n\n<p>She has also said inTruth can be an \u201cAI emotion coach that knows everything about you, including what you\u2019re feeling and why you\u2019re feeling it\u201d. <\/p>\n\n<h2 id=\"emotion-recognition-technologies-in-australian-workplaces\">Emotion recognition technologies in Australian workplaces<\/h2>\n\n<p>There is little data about the deployment of emotion recognition technologies in Australian workplaces. <\/p>\n\n<p>However, we do know some Australian companies used a video interviewing system offered by a US-based company called <a href=\"https:\/\/www.hirevue.com\/\" target=\"_blank\" rel=\"noopener\">HireVue<\/a> that incorporated face-based emotion analysis.<\/p>\n\n<p>This system used facial movements and expressions to assess the suitability of job applicants. 
For example, applicants were assessed on whether they expressed excitement or how they responded to an angry customer. <\/p>\n\n<p>HireVue <a href=\"https:\/\/www.wired.com\/story\/job-screening-service-halts-facial-analysis-applicants\/\" target=\"_blank\" rel=\"noopener\">removed emotion analysis from its systems in 2021<\/a> following a formal complaint in the United States.<\/p>\n\n<p>Emotion recognition may be on the rise again as Australian employers <a href=\"https:\/\/www.parliament.vic.gov.au\/news\/infrastructure\/newtech\" target=\"_blank\" rel=\"noopener\">embrace artificial intelligence-driven workplace surveillance technologies<\/a>.<\/p>\n\n<figure class=\"align-center zoomable\">\n            <a href=\"https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=1000&amp;fit=clip\" target=\"_blank\" rel=\"noopener\"><img  decoding=\"async\"  alt=\"Office workers looking at computers.\"  src=\"data:image\/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABAQMAAAAl21bKAAAAA1BMVEUAAP+KeNJXAAAAAXRSTlMAQObYZgAAAAlwSFlzAAAOxAAADsQBlSsOGwAAAApJREFUCNdjYAAAAAIAAeIhvDMAAAAASUVORK5CYII=\"  class=\" pk-lazyload\"  data-pk-sizes=\"auto\"  data-ls-sizes=\"(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px\"  data-pk-src=\"https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip\"  data-pk-srcset=\"https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=1 600w, https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=2 1200w, 
https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=3 1800w, https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=1 754w, https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=2 1508w, https:\/\/images.theconversation.com\/files\/638034\/original\/file-20241212-17-42nbhx.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=3 2262w\" ><\/a>\n            <figcaption>\n              <span class=\"caption\">AI-enabled emotion recognition technology can be used in workplaces to monitor workers\u2019 emotional state.<\/span>\n              <span class=\"attribution\"><a class=\"source\" href=\"https:\/\/www.shutterstock.com\/image-photo\/aachen-germany-november-8-2022-selective-2472131301\" target=\"_blank\" rel=\"noopener\">BalkansCat\/Shutterstock<\/a><\/span>\n            <\/figcaption>\n          <\/figure>\n\n<h2 id=\"lack-of-scientific-validity\">Lack of scientific validity<\/h2>\n\n<p>Companies such as inTruth claim emotion recognition systems are objective and <a href=\"https:\/\/intruth.io\/\" target=\"_blank\" rel=\"noopener\">rooted in scientific methods<\/a>. <\/p>\n\n<p>However, scholars have raised concerns that these systems involve a return to the discredited fields of <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3889454\" target=\"_blank\" rel=\"noopener\">phrenology<\/a> and <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3927300\" target=\"_blank\" rel=\"noopener\">physiognomy<\/a>. That is, the use of a person\u2019s physical or behavioural characteristics to determine their abilities and character. 
<\/p>\n\n<p>Emotion recognition technologies are <a href=\"https:\/\/www.article19.org\/emotion-recognition-technology-report\/\" target=\"_blank\" rel=\"noopener\">heavily reliant on theories<\/a> which claim inner emotions are measurable and universally expressed. <\/p>\n\n<p>However, recent evidence shows that how people communicate emotions varies widely across cultures, contexts and individuals. <\/p>\n\n<p>In 2019, for example, <a href=\"https:\/\/journals.sagepub.com\/doi\/10.1177\/1529100619832930\" target=\"_blank\" rel=\"noopener\">a group of experts<\/a> concluded there are \u201cno objective measures, either singly or as a pattern, that reliably, uniquely, and replicably\u201d identify emotional categories. For example, someone\u2019s skin moisture might go up, down or stay the same when they are angry.<\/p>\n\n<p>In a statement to The Conversation, inTruth Technologies founder Nicole Gibson said \u201cit is true that emotion recognition technologies faced significant challenges in the past\u201d, but that \u201cthe landscape has changed significantly in recent years\u201d. <\/p>\n\n<h2 id=\"infringement-of-fundamental-rights\">Infringement of fundamental rights<\/h2>\n\n<p>Emotion recognition technologies also endanger fundamental rights without proper justification.<\/p>\n\n<p>They have been found to discriminate on the basis of <a href=\"https:\/\/papers.ssrn.com\/sol3\/papers.cfm?abstract_id=3281765\" target=\"_blank\" rel=\"noopener\">race<\/a>, <a href=\"https:\/\/arxiv.org\/abs\/2103.11436\" target=\"_blank\" rel=\"noopener\">gender<\/a> and <a href=\"https:\/\/journals.sagepub.com\/doi\/10.1177\/14614448221109550\" target=\"_blank\" rel=\"noopener\">disability<\/a>. 
<\/p>\n\n<p>In <a href=\"https:\/\/theconversation.com\/emotion-reading-tech-fails-the-racial-bias-test-108404\" target=\"_blank\" rel=\"noopener\">one case<\/a>, an emotion recognition system read black faces as angrier than white faces, even when both were smiling to the same degree. These technologies may also be less accurate for people from demographic groups <a href=\"https:\/\/proceedings.mlr.press\/v81\/buolamwini18a.html\" target=\"_blank\" rel=\"noopener\">not represented in the training data<\/a>.<\/p>\n\n<figure class=\"align-center \">\n            <img  decoding=\"async\"  alt=\"Large crowd of people standing in the sunshine.\"  src=\"data:image\/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABAQMAAAAl21bKAAAAA1BMVEUAAP+KeNJXAAAAAXRSTlMAQObYZgAAAAlwSFlzAAAOxAAADsQBlSsOGwAAAApJREFUCNdjYAAAAAIAAeIhvDMAAAAASUVORK5CYII=\"  class=\" pk-lazyload\"  data-pk-sizes=\"auto\"  data-ls-sizes=\"(min-width: 1466px) 754px, (max-width: 599px) 100vw, (min-width: 600px) 600px, 237px\"  data-pk-src=\"https:\/\/images.theconversation.com\/files\/638065\/original\/file-20241212-15-93ub7o.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;fit=clip\"  data-pk-srcset=\"https:\/\/images.theconversation.com\/files\/638065\/original\/file-20241212-15-93ub7o.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=1 600w, https:\/\/images.theconversation.com\/files\/638065\/original\/file-20241212-15-93ub7o.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=2 1200w, https:\/\/images.theconversation.com\/files\/638065\/original\/file-20241212-15-93ub7o.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=600&amp;h=400&amp;fit=crop&amp;dpr=3 1800w, https:\/\/images.theconversation.com\/files\/638065\/original\/file-20241212-15-93ub7o.jpg?ixlib=rb-4.1.0&amp;q=45&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=1 754w, 
https:\/\/images.theconversation.com\/files\/638065\/original\/file-20241212-15-93ub7o.jpg?ixlib=rb-4.1.0&amp;q=30&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=2 1508w, https:\/\/images.theconversation.com\/files\/638065\/original\/file-20241212-15-93ub7o.jpg?ixlib=rb-4.1.0&amp;q=15&amp;auto=format&amp;w=754&amp;h=503&amp;fit=crop&amp;dpr=3 2262w\" >\n            <figcaption>\n              <span class=\"caption\">Research has shown emotion recognition technology discriminates on the basis of race, gender and disability.<\/span>\n              <span class=\"attribution\"><a class=\"source\" href=\"https:\/\/www.shutterstock.com\/image-photo\/madrid-sep-13-people-audience-show-217152226\" target=\"_blank\" rel=\"noopener\">Christian Bertrand\/Shutterstock<\/a><\/span>\n            <\/figcaption>\n          <\/figure>\n\n<p>Gibson acknowledged concerns about bias in emotion recognition technologies. But she added that \u201cbias is not inherent to the technology itself but rather to the data sets used to train these systems\u201d. She said inTruth is \u201ccommitted to addressing these biases\u201d by using \u201cdiverse, inclusive data sets\u201d. <\/p>\n\n<p>As a surveillance tool, emotion recognition systems in the workplace pose serious threats to privacy rights. Such rights may be violated if sensitive information is collected without an employee\u2019s knowledge.  
<\/p>\n\n<p>There will also be a <a href=\"https:\/\/www.oaic.gov.au\/privacy\/australian-privacy-principles\/australian-privacy-principles-guidelines\/chapter-3-app-3-collection-of-solicited-personal-information\" target=\"_blank\" rel=\"noopener\">failure to respect privacy rights<\/a> if the collection of such data is not \u201creasonably necessary\u201d or by \u201cfair means\u201d.<\/p>\n\n<h2 id=\"workers-views\">Workers\u2019 views<\/h2>\n\n<p>A <a href=\"https:\/\/drive.google.com\/file\/d\/1Teo8eOYoAucdb7hb2CUakfVElauQizVE\/view\" target=\"_blank\" rel=\"noopener\">survey published earlier this year<\/a> found that only 12.9% of Australian adults support face-based emotion recognition technologies in the workplace. The researchers concluded that respondents viewed facial analysis as invasive. Respondents also viewed the technology as unethical and highly prone to error and bias. <\/p>\n\n<p>In a <a href=\"https:\/\/theconversation.com\/emotion-tracking-ai-on-the-job-workers-fear-being-watched-and-misunderstood-222592\" target=\"_blank\" rel=\"noopener\">US study<\/a> also published this year, workers expressed concern that emotion recognition systems would harm their wellbeing and impact work performance. <\/p>\n\n<p>They were fearful that inaccuracies could create false impressions about them. In turn, these false impressions might prevent promotions and pay rises or even lead to dismissal. <\/p>\n\n<p>As one participant stated: <\/p>\n\n<blockquote>\n<p>I just cannot see how this could actually be anything but destructive to minorities in the workplace.<\/p>\n<\/blockquote>\n\n  <p><span><a href=\"https:\/\/theconversation.com\/profiles\/natalie-sheard-1268322\" target=\"_blank\" rel=\"noopener\">Natalie Sheard<\/a>, Researcher and Lawyer, <em><a href=\"https:\/\/theconversation.com\/institutions\/la-trobe-university-842\" target=\"_blank\" rel=\"noopener\">La Trobe University<\/a><\/em><\/span><\/p>\n\n  <p>This article is republished from <a href=\"https:\/\/theconversation.com\" target=\"_blank\" rel=\"noopener\">The Conversation<\/a> under a Creative Commons license. 
Read the <a href=\"https:\/\/theconversation.com\/tech-companies-claim-ai-can-recognise-human-emotions-but-the-science-doesnt-stack-up-243591\" target=\"_blank\" rel=\"noopener\">original article<\/a>.<\/p>\n<\/div>\n\n","protected":false},"excerpt":{"rendered":"Master1305\/Shutterstock Natalie Sheard, La Trobe University Can artificial intelligence (AI) tell whether you\u2019re happy, sad, angry or frustrated?&hellip;\n","protected":false},"author":1037,"featured_media":13315,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"nf_dc_page":"","fifu_image_url":"https:\/\/images.pexels.com\/photos\/8090263\/pexels-photo-8090263.jpeg?auto=compress&cs=tinysrgb&w=1260&h=750&dpr=1","fifu_image_alt":"","footnotes":""},"categories":[16],"tags":[3142,3145,3150,3146,3147,3149,3155,3156,3151,3144,3148,3152,3154,3153,474,3143],"class_list":{"0":"post-13313","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-tech","8":"tag-ai-emotion-recognition","9":"tag-ai-driven-workplace-tools","10":"tag-bias-in-ai-systems","11":"tag-biometric-data-analysis","12":"tag-cultural-emotion-variability","13":"tag-emotion-recognition-technology","14":"tag-employee-monitoring","15":"tag-ethical-issues-in-ai-emotion-tools","16":"tag-facial-expression-bias","17":"tag-phrenology-concerns","18":"tag-physiognomy-concerns","19":"tag-privacy-rights-in-ai","20":"tag-public-perception-of-ai-surveillance","21":"tag-race-and-gender-discrimination-in-ai","22":"tag-the-conversation","23":"tag-workplace-surveillance","24":"cs-entry","25":"cs-video-wrap"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/posts\/13313","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/types\/post"}],"author":[{"
embeddable":true,"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/users\/1037"}],"replies":[{"embeddable":true,"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/comments?post=13313"}],"version-history":[{"count":1,"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/posts\/13313\/revisions"}],"predecessor-version":[{"id":13314,"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/posts\/13313\/revisions\/13314"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/media\/13315"}],"wp:attachment":[{"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/media?parent=13313"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/categories?post=13313"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/modernsciences.org\/staging\/4414\/wp-json\/wp\/v2\/tags?post=13313"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}