Emotional Attitudes Towards Components of the Digital Environment (Based on Text Analysis of Online Comments)
- Authors: Yu.M. Kuznetsova
Affiliations:
- Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences
- Issue: Vol. 19, No. 2 (2022): Digital Society as a Cultural-Historical Context of Personality Development
- Pages: 253-281
- Section: PERSONALITY IN THE DIGITAL AGE: DEVELOPMENT, COGNITION, COMMUNICATION
- URL: https://journal-vniispk.ru/2313-1683/article/view/326105
- DOI: https://doi.org/10.22363/2313-1683-2022-19-2-253-281
- ID: 326105
Abstract
The formation of specific relationships between a person and the cyber environment and its components is one of the psychological effects of digitalization. The paper presents the results of a study of the emotional component of attitudes towards components of the digital environment, conducted using emotive-predicate analysis, a new method of computational text processing implemented in the TITANIS tool. The method automatically extracts from texts descriptions of emotional situations in which components of the digital environment act as either the cause or the subject of 68 emotional states. The material for the analysis consisted of the texts of 2,048 online discussions of videos posted on Russian-language YouTube. It is shown that emotional situations involving various components of the digital environment are fairly typical even of online discussions whose topics are far removed from digital technology. The components of the digital environment mentioned as participants in emotional situations in such non-thematic discussions fall into three groups: general concepts from the field of digital technologies, digital devices, and activities mediated by digital technologies. Lexemes of the latter group, denoting various aspects of online communication, appear in the overwhelming majority of descriptions of emotional situations involving components of the digital environment, and six times more often as causes of emotions than as subjects of emotional states. In terms of valence, the emotional attitude towards components of the cyber environment is on the whole balanced, with no marked predominance of negative or positive emotions; however, negative states are attributed to components of the cyber environment more often when they act as subjects than when they act as causes of emotions. The practical significance of the described method of text analysis, as a means of assessing the emotional component of attitudes towards components of the cyber environment, follows from the influence that users' affective reactions have on the demand for technical innovations and on the direction of their development.
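To make the analysis scheme described in the abstract more concrete, below is a minimal illustrative sketch of the tallying step only. It is not the TITANIS tool or its API; the record format, the component names, and the toy data are all hypothetical. It assumes that each extracted emotional situation has already been reduced to a triple (digital-environment component, the component's role as cause or subject of the emotional state, and the valence of that state), and shows how role frequencies and the valence balance reported in the study could then be counted.

```python
from collections import Counter

# Hypothetical pre-extracted records: (component, role, valence).
# In the study itself, extraction is performed automatically by TITANIS;
# this sketch only illustrates the subsequent frequency tallying.
situations = [
    ("комментарий", "cause", "negative"),
    ("канал", "cause", "positive"),
    ("интернет", "subject", "negative"),
    ("смартфон", "cause", "positive"),
    ("чат", "cause", "negative"),
]

# How often a component appears as the cause vs. the subject of an emotion.
role_counts = Counter(role for _, role, _ in situations)

# Valence balance within each role (positive vs. negative states).
valence_by_role = Counter((role, valence) for _, role, valence in situations)

print("Component as cause of emotion:", role_counts["cause"])
print("Component as subject of emotion:", role_counts["subject"])
for (role, valence), n in sorted(valence_by_role.items()):
    print(f"{role:>7} / {valence:<8}: {n}")
```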
About the authors
Yulia Mikhailovna Kuznetsova
Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences
Corresponding author.
Email: kuzjum@yandex.ru
ORCID iD: 0000-0001-9380-4478
Cand. Sci. (Psychology), Senior Researcher
9 Prospekt 60-letiya Oktyabrya, Moscow, 117312, Russian Federation