Speech by State Secretary Van Huffelen at SXSW, 14 March 2023

14-03-2023

Speech by State Secretary Van Huffelen at SXSW, 14 March 2023: Dutch Value-Driven Policy Design to Tame Big Tech

Cyberspace could be humane and fair, but right now apps lure kids into pro-anorexia clips, ChatGPT advises people to divorce, government algorithms discriminate and democracy is under attack from disinformation. The Dutch value-driven policy agenda to change this includes conducting privacy and human rights assessments on services such as Google Workspace and Facebook Pages, publishing and regulating government algorithms, and setting the bar high when it comes to inclusion, trust and control over your own digital life. Join us in this movement!

Why, you may wonder, would a Dutch minister for the digital transition visit SXSW? To be frank: I am here to start a movement. A movement of citizens, industry and government. A movement to ask questions about technology. Are you in control? Do you trust it? Is everyone included? Asking the right questions and demanding that we build a digital society we want to live in. Guaranteeing that digital technology is responsible by design. Because technology can bring so much. Here at South by Southwest we are more aware of that than ever. There are so many examples here showing technology is about creativity, about imagining new worlds and about changing the world.

Reminding us of the ideals many had at the start of the Internet, stated for example in the famous Cyberspace Declaration: an open, free internet built upon principles. The principle of treating others as you want to be treated yourself. The principle of governments not interfering, the principle of radical equality between everyone online, and the principle of radical freedom online to be who you want to be.

But boy, did that work out differently. With big tech not living up to these principles. With the open cyberspace now closed and being run by some of the biggest companies in the history of the world. With those companies using and selling our data for profit. And with governments using technology not just for good.

Last week Wired showed that an algorithm used by the Dutch city of Rotterdam singled out single mothers as frauds. Journalists showed that certain characteristics, such as being a parent, being a woman, being young, not being fluent in Dutch, or struggling to find work, increased someone's risk score. That score was then used to identify potential frauds. This is far from the digital society that we want to live in.

When starting as minister for the digital transition last year, I already knew firsthand about the negative impact algorithms can have on the lives of citizens. When I joined the government in 2020, I joined amid a major scandal: more than 20,000 people were wrongly accused of fraud when applying for child care benefits, because of a faulty algorithm used by the Dutch tax authorities. For years, Dutch citizens were unjustly labeled as frauds. The government really failed in using technology for good.

But companies have not been doing much better.
Apps now lure kids into pro-anorexia clips, into addiction, depression, and into conspiracy theories. ChatGPT advises people to divorce. Democracy is under attack from disinformation. It spreads easily online, and is being weaponized in wars. Citizens have become polarized online, and our political systems have become fragmented.
While digital technology brings many benefits for insiders, it is harmful to vulnerable people. Think of hate speech against women, against the LGBTQ community and against refugees. I hesitate to read my own Twitter timeline. Every day is a new low on social media. What should we do about this?

It all starts by asking the right questions. Asking in what digital world we want to live. That is what we do in a democracy. Offline and online. That is why we have regulations in the offline world on products such as peanut butter or pharmaceuticals. And that is why there also must be more accountability, more transparency and more regulation in the digital world.

It might not be a message you all would like to hear but yes: we have to tame ourselves as digital governments. We have to tame big tech companies. We have to do so by asking the right questions and demanding that they are answered responsibly. Regulation is about forcing the conversation. Not to stop innovation, but to steer it in the right direction. To guarantee responsibility by design.

Let us all ask the questions, let us all demand they are answered. Let us do so together and live up to the ideals we had at the very start of the Internet. Starting a movement, forcing the conversation.

In that movement I see fundamental rights and values, such as non-discrimination and privacy, as central to our digital society. These values are the foundation of the digital agenda that the Netherlands published in November last year. Citizens should be able to trust that the rule of law protects them online, as it does offline. They must be sure that we can carry our democracy forward into the future.

Let us all ask the questions. In our agenda we ask three fundamental questions. Do we leave no one behind in the digital transition? Can we trust the digital world? And do we have control over our own digital lives?

Let me discuss these three in turn: participation, trust and control.
Firstly, participation. We want to leave no one behind in the digital world, and we therefore work hard to educate and design for digital skills.
Right now, around three million Dutch people have insufficient digital and administrative skills, such as those needed to apply for benefits and to do business.
That is why digital skills will be an integral part of the school curriculum, both in primary and secondary school. And that is why in libraries all over the country, citizens can now ask for help with digital skills.

At the same time, it must also be possible for people to be offline. To file your taxes in writing, to speak to a person and not a computer, and to develop social relations in schools without being online. There is now a wide societal debate around a ban on phones in schools. So that digital participation becomes a choice that everyone is free to make.
Now I come to the subject of trust. Citizens must be able to trust the digital world. Trust that they are safe from threats, safe from discrimination, safe from hate speech. And trust the information they find online. We are stepping up our efforts to tackle disinformation in order to increase trust online. Right now, more than five million Dutch adults struggle to consciously, critically and actively engage with digital social media. They struggle to deal with algorithms on social media platforms that often show you only the information supporting your own viewpoints, or push crazy conspiracy theories.
So we will do more to enhance media literacy and to make sure platforms follow the new laws we have made and will continue to make in Europe. Laws such as the Digital Services Act, which forces companies to take down illegal content when notified by citizens. And which allows people to object when platforms take down their content, to protect their free speech. This is how we force the conversation.

The law also forces very large online platforms to fight the risks their services pose to the rights of citizens, to our democracy, to children and to minorities. Independent audits must be done. Those audits must be transparent. And large fines can follow when the law is violated. So we can have an honest conversation.

We are also ensuring that researchers, small companies and creatives get more access to the workings of these platforms. So they can ask the right questions. So they can have honest conversations. Conversations that allow us to really understand what is going on online. Conversations that are necessary for citizens to trust the digital world again.
It is not enough to shape what already exists. We have to lead by example. That is why we are also adding trustworthy places to that digital world ourselves. We are building a public digital social medium that meets the highest standards for public values. Creating trustworthy digital spaces for all public institutions and for all citizens.

Thirdly, as a government we must also lead the way in giving people more control over their lives online. That is why we are developing our own open source ID wallet. With standards that set the bar high for data protection, for reliability and for accessibility. Realizing a public open source wallet that ensures that our citizens can take back control over their digital lives. In a way that meets our values and makes it easier to do business digitally. We do so fully transparently and in the open. Everyone can follow us, on GitHub and elsewhere.

From my own experience I also know that governments must do better when using algorithms. We must have more checks and balances, more transparency and more accountability for AI systems as their power increases.

That is why we have set up a national algorithm registry and a Dutch Algorithm Authority. All government algorithms that have a high impact on our citizens' lives will be published in our open source public registry. An independent watchdog will make sure citizens are protected. We are at the start of this, but we are taking the lead.

With our European partners, we are making the AI Act. It will force developers of high-risk AI systems to be more transparent, to do human rights assessments and to pass the highest standards of certification. AI that exploits the vulnerable or that builds towards a society of social scoring will be banned.

To summarize: we have to leave no one behind, we have to ensure that the digital world is trustworthy and we have to guarantee that people have control over their own digital life. By having the right conversations, by asking the right questions.

Questions we also have to ask of big tech. Are we in control? Can we trust your technology? Is everyone included? Asking the right questions and demanding that we build a digital society we want to live in. To force the right conversation. To force change.
You might have read the article in the NY Times on how the Netherlands is taming big tech. This new Dutch approach apparently did not go unnoticed. We were portrayed as a small country that made big companies live and work by strong privacy rules and thus change their way of working. David vs. Goliath, as the paper put it. It's a nice metaphor, but too stark for my taste.

Because in the original story David killed Goliath. And killing tech and innovation would be a very bad idea. Instead we should shape them so that tech and innovation adhere to our values. So what we have been doing is ensuring that when the public sector uses digital technology, for example in schools, libraries or in government services, we set our own terms. On what data is shared, how children are protected, and how algorithms determine content. We are using our negotiation power to have the conversations that ensure public values for our citizens in public spaces.

We do so by conducting Data Protection and Human Rights Impact Assessments on all digital services that have a big impact on our citizens. We had a privacy impact assessment made on public use of Microsoft Teams, OneDrive and SharePoint. Microsoft took measures to remedy the six high risks that were found. Because the assessment was based on the GDPR, these measures helped Microsoft comply in 27 EU countries. It brought them a competitive advantage. That is why they applied the changes worldwide. What is good enough for Europe is good enough for the world.

We did the same on Google Workspace for Education. Google then mitigated the eight high privacy risks found. So our children can safely enter our digital society. We also did the same on Facebook Pages for public communication. Asking the right questions again. Do governments have any control over the data processing that happens when they publish a page on Facebook? What recommendations do algorithms make for citizens when governments post about the importance of getting vaccinated? Do these recommendations help spread disinformation about vaccinations to citizens? And what happens to the personal data of citizens when they engage with campaigns on sensitive issues, such as women's rights?

These impact assessments identified multiple high risks when they were published. This is how we forced the conversation. This is how we forced negotiation. This is how we are trying to force change. Just as we did earlier with Microsoft. Because as a government we want to build a digital society we all want to live in. We will continue to conduct Data Protection and Human Rights Impact Assessments on all digital services that have a big impact on our citizens. A big impact on our children in particular.

So today I announce that we have started working with schools in the Netherlands to ask the right questions about the metaverse. We will once again conduct a data protection and a human rights assessment. This time asking the tough questions about the metaverse. When children use the Meta Quest VR headset, is their sensitive biometric data protected? What data goes in and out of these devices? And how do we prevent children from being manipulated by the technology? By asking the right questions we are demanding that we build a digital society we want to live in.

Ladies and gentlemen,
I hope you all join me in asking the right questions. In forcing the conversation. In building a digital society we all want to live in. Let this be the start of a movement. Join us.
I am very much looking forward to vibrant discussions on these issues. Asking the right questions, having the right conversations and in doing so building the digital society together. Guiding big tech to responsibly design a digital society that leaves no one behind, a digital society that can be trusted and a digital society that ensures that we have control over our own digital lives. And stepping in as a government to lead by example.

So I hope you will join me in establishing a value-driven digital society. I wish you all lots of inspiration and a bit of fun along the way.

Thank you very much!
