Thank you, Prime Minister Jacinda Ardern.
Thank you, ladies and gentlemen, for being here. This new Christchurch Call summit has really allowed us to move forward, behind closed doors as is the rule, in discussing with the major platforms and social networks what I will call digital public policy. Here I want to acknowledge the work that began with the Aqaba Process, especially the strong involvement of the King of Jordan, and the work we started in recent years with the Tech for Good summit. And indeed, as the Prime Minister has just recalled, there is what we started together in Paris in May 2019, after the terrible terrorist attacks in Christchurch, with this desire to bring together states, NGOs, academics, civil society, social networks and online services.
It is a fundamental discussion and, essentially, a challenge that is ours: in this very space where our children, and we ourselves, spend more and more of our lives every day, how do we build a public order? This means rules that make it possible to prevent hate speech, which can go as far as terrorist acts, and to protect people. In this regard, this work contributed to the adoption of the Digital Services Act, the European framework for the regulation of digital content, creating liability regimes and better protection. The context of the war in Ukraine and its consequences at every level, including the spread of disinformation and propaganda online, which are often the root cause of violence, shows how important this debate is. Since 2019, as the Prime Minister reminded us, we have achieved real progress and real results. Since 2019, the Christchurch Call has demonstrated that it is possible to remove violent or terrorist content online. The famous “golden hour” works, and we have been able to experience it. We have now put in place regulations in several of our countries that are, I would say, binding. But we have also had proactive behaviour from platforms that helped us technically remove online content, ensuring removal within an hour, as we saw with the Buffalo attack, for example. And we learned all of this from previous attacks, especially just after Christchurch, when we saw millions of people trying to put the most brutal scenes online.
Another real success is that we have put in place a governance structure, the GIFCT (Global Internet Forum to Counter Terrorism), that works and for which we secured more funding. We will continue to mobilize it, because that is precisely what makes it possible to operationalize, if I may say so, these rules, with continuous monitoring and therefore a real accountability mechanism. And our community is growing, with more than 50 partner states and new partner digital platforms like Zoom, Clubhouse and Roblox, and is therefore really advancing on new digital frontiers.
Now we want to go further. The Prime Minister reminded us: to fight disinformation and propaganda more effectively by increasing positive interventions on the networks. We want to go further along several axes. The first is the work that is needed on algorithms, which we launched a few months ago. The agreement that has been announced between New Zealand, the United States, Twitter and Microsoft is very important, because it involves in particular an exchange of algorithms and data held by Twitter, and I want to acknowledge their commitment. It is a step forward and a very important element. Basically, what we are saying is that as long as we know nothing about algorithms, we can have no guarantee that they will not lead to harmful behaviour, or that they are as effective as we would like at prevention. Improving prevention therefore means deciding to have a real partnership in the field of algorithms, to share information, and to achieve the most effective approach with all platforms. Until now, many technical objections stood in our way. A lot of work has been done, and we will continue to do it to make things better.
The second key element is artificial intelligence. Whether it is social media or messaging, whether they use algorithms or not, it is clear that with AI we can do a much better job of preventing exactly the path that leads from hate to online radicalization and then to the risk of terrorist acts. And so we will continue to push for very clear commitments on this. The third point is research, and public research in particular; commitments were made in this direction at this meeting, and they are very important. We have done this in the field of health, for example, and we want to continue developing this model of public research on these topics. Why? Because I was talking about a digital public order. It is very simple: our partners, the platforms and social networks, must accept that researchers may have access to their data and information for a certain period of time; that researchers conduct independent, transparent and public research, which is the very nature of research; and that this research is therefore not only proprietary, but helps us understand better and prevent better. And we have made progress on that point as well.
We have also discussed, and together we want to go further on, the topic of the dark web, where it is clear that today we have terrorists who upload content that is not merely questionable but extremely problematic and dangerous in our eyes, and who somehow take advantage of the lack of regulation of this space. And so we want the Christchurch Call and our forum to enter that space as well. As you well know, we have succeeded, or are in the process of succeeding (there is still work to be done, and we need to consolidate the resources we have put into it), in removing so-called terrorist content much faster, in qualifying it, and in having the same definitions and oversight mechanisms. We want to improve their efficiency by asking social networks to contribute more funding. Now we are putting all our energy into preventing radicalization, hate speech and everything that leads to terrorism, and that is why we have this work on algorithms, the dark web, artificial intelligence and public research.
The last point is the protection of children. This is a very important topic, closely linked to the one we are discussing. We have long discussed the fact that many of our children are exposed to inappropriate content today, but above all that the development of many games and immersive spaces has an impact on our children, on their relationship to violence, and on behaviour that can lead to the worst. In this regard, the young age of many perpetrators of terrorist acts around the world cannot leave us without a response. On this topic, as you know, we launched a call to protect children in the digital age almost a year ago at the Paris Peace Forum. We will continue this work in November, working very closely with the Prime Minister, to empower our children and protect them from online harassment and from the content, games and systems that are considered dangerous for them. What we want to do in that regard is to look at how the protocols we have been able to develop within the Christchurch Call can be used to protect children online. It is clear that today we need to improve the prevention and education of our children. We need to improve our regulation, and we are doing so in several countries, but we need to coordinate. We also need to improve moderation: platforms are not doing enough to moderate content in our languages to protect our children. And we want to advocate that these games and immersive systems be designed with our children's interests and protection in mind, taking into account the fact that they sometimes reach minds that are still forming, that do not yet have the codes to interpret them, and that can be led to decisions and attitudes that then feed this very violence, online and in the real world. That was another axis I wanted to add to the work being done.
I want to thank the Prime Minister once again for the wonderful cooperation, for the initiatives that we are constantly renewing and, I believe, for the importance and strength of this Christchurch Call.