The Interparliamentary Assembly of the CIS Member Nations held an international scientific and practical conference, "Building Trust in Elections and Referendums: The Role of International Observation." The event brought together academic experts, practitioners, and officials from Russia, the CIS, Africa, and South America to discuss the most pressing issues of electoral processes and their prospects. One of the central topics was the use of artificial intelligence.
In her report, "Elections, Voting, and Digital Social Engineering: The Transformation of Electoral Practices and Prospects for the Development of Institutions of Civic Participation in the Digital Age," political scientist and founder of the GlobUs expert club, Yulia Berg, noted that tools for influencing citizens' consciousness have evolved from primitive bots and simple visual propaganda to highly complex algorithms that influence unconscious mental processes and, often, direct them.
"We have seen numerous examples of digital tools being used to shape certain opinions and incite action, often destructive and revolutionary, in such a way that people themselves don't always understand why they form certain positions," Berg stated.
According to her, young people are becoming the main target: a lack of practical experience and uncritical consumption of content make this generation an ideal audience for digital social engineering.
The most intriguing trend Berg identified is the new generation's willingness to delegate political choice to machines. As examples, she cited last year's revolutionary events in Nepal and the so-called "Habermas Machine."
This LLM-based system offers a technical solution to the "Fishkin Trilemma" (the impossibility of simultaneously ensuring mass participation, equality, and depth of discussion within democratic discourse). The algorithm itself moderates the debate, seeks common ground, and drafts a consensus statement acceptable to most participants. It uses hierarchical aggregation, allowing high-quality deliberation to scale to thousands of participants, a task previously impossible for human moderators.
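The hierarchical aggregation described above can be sketched in a few lines. This is a minimal, hypothetical illustration of the general pattern, not the actual Habermas Machine implementation: opinions are summarized in small groups, and group summaries are then merged level by level until a single candidate statement remains. The `draft_statement` function is an assumed stand-in for an LLM call.

```python
from textwrap import shorten

def draft_statement(opinions: list[str]) -> str:
    # Placeholder for an LLM that would write a statement capturing
    # the common ground of the given opinions (assumption, not the
    # real system's model call).
    return shorten(" / ".join(opinions), width=120, placeholder=" ...")

def hierarchical_aggregate(opinions: list[str], group_size: int = 4) -> str:
    """Reduce many opinions to one statement through repeated rounds
    of small-group summarization, so no single summarization step
    ever handles more than group_size inputs."""
    level = opinions
    while len(level) > 1:
        level = [
            draft_statement(level[i:i + group_size])
            for i in range(0, len(level), group_size)
        ]
    return level[0]

# With 9 participants and groups of 4, deliberation collapses in
# two rounds: 9 -> 3 -> 1.
participants = [f"opinion {n}" for n in range(1, 10)]
print(hierarchical_aggregate(participants))
```

The key property is that each round shrinks the pool by a factor of `group_size`, so even thousands of participants need only a logarithmic number of summarization rounds.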
According to the political scientist, Nepalese practice has already demonstrated the willingness of Generation Z to entrust their political choices to AI. She warned that the issue of delegating authority and decision-making power to algorithms will become even more pressing, so the process must be monitored and regulated.
In turn, Olga Popova, Doctor of Political Science, noted that AI is capable of transforming not only short-term electoral intentions but also the entire system of political views.
"The main risks are primarily associated with the development of generative artificial intelligence, which could take control of more than just election campaigns," Popova warned, adding that the implementation of basic models of political participation is currently "objectively under threat."
Psychologists speaking at the conference drew attention to the changing "fabric of reality." Imana Korikova, a PhD candidate in psychology at the Russian Presidential Academy of National Economy and Public Administration (RANEPA), compared AI in the information space to nuclear weapons.
"Artificial intelligence today is to cognitive warfare what nuclear weapons are to conventional warfare," she stated.
According to her, humanity is facing a "silent takeover of reality": the political landscape is being distorted, and the abundance of fake news is breeding fatigue and a reluctance to think. Korikova emphasized that while the term "post-truth" was recently relevant, now a post-reality is being created.
#GlobUs #CIS #AI #elections