Legal issues in the use of chatbots in socio-political communications

 
PII S102694520024146-1-1
DOI 10.31857/S102694520024146-1
Publication type Article
Status Published
Authors
Occupation: Head of the Center for Legal Studies of Digital Technologies, State Academic University for the Humanities (GAUGN)
Affiliation: State Academic University for the Humanities (GAUGN)
Address: Russian Federation, Moscow
Occupation: Dean of the Faculty of Law, State Academic University for the Humanities (GAUGN)
Affiliation: State Academic University for the Humanities (GAUGN)
Address: Russian Federation, Moscow
Journal name Gosudarstvo i pravo
Edition Issue 1
Pages 68–78
Abstract

Political bots have become an important tool for political technologists. The active use of chatbots on social media has been documented in all major electoral processes around the world, and researchers have identified the negative effects of such bots on political processes. The destructive nature of political bots lies in their use as a digital tool for manipulating public consciousness. There is therefore a need to establish a legal framework for the use of such artificial intelligence systems in socio-political communications. The article analyses the experience of foreign countries and substantiates the need for legislative consolidation of the principle of transparency of artificial intelligence systems, which would make it possible to oblige the developers and owners of such systems to label them. Informing users that they are interacting with an artificial intelligence system will reduce the risk of manipulation of public consciousness.

Keywords artificial intelligence, political chatbots, social media, political technology, legal regulation of AI, transparency of AI, labelling of bots
Acknowledgment Project No. 122101000041-6, implemented at the State Academic University for the Humanities (GAUGN) following the selection of research projects conducted by the Ministry of Science and Higher Education of the Russian Federation and EISI.
Received 07.11.2022
Publication date 20.02.2023
Number of characters 34719
