Wikimedia Foundation External Trends: 2023-2024
The Wikimedia movement has created an amazing collection of knowledge over the past two decades. In order to protect the work accomplished by contributors during that period, as well as all the knowledge yet to be compiled, we must keep an eye on the world around us and how it may affect our mission.
Last year, the Wikimedia Foundation shared a list of external trends that were part of an internal reflection to shape annual planning. This year, a group of Foundation staff set out to update the list and is asking the wider movement to share its thoughts on these topics. When considering the world around us, a variety of perspectives makes for a clearer picture of reality and better-informed decisions. We welcome and encourage your feedback on the draft ideas below.
This is an opportunity to look outward: as a movement, we need to keep asking, “What does the world need from us now?” It is also a starting point for shared understanding, even where views differ: trends analysis requires us to take a long-term view and to track what is important to us, even if we disagree about how to address it.
With that said, we live in a complex, fast-changing world. This is not a comprehensive list of threats and opportunities facing our movement, but rather a few of the most pressing issues we face.
Search & Content
Update from 2022: social platforms continue to disrupt traditional search engines, but AI threatens even more significant disruption
- Personality-driven experiences are increasingly drawing younger audiences to social platforms (TikTok, Instagram) and away from traditional search engines.
- Social platforms are testing new search features to keep users engaged. Traditional search engines are testing different strategies to stay competitive and remain destinations in their own right, strategies that reduce the ranking of links to external content in search results.
- The explosion of sophisticated AI could benefit knowledge creation and consumption, but creates uncertainty about our role in the knowledge ecosystem.
- In just 2 months, ChatGPT became the fastest-growing consumer web application of all time. New AI models and tools built on them could benefit knowledge creation and consumption, but the nature of AI exacerbates challenges around attribution and disintermediation.
- These trends could help us advance our mission – or erode our sustainability.
- Increased competition in search and the rise of AI present opportunities to bring free knowledge to more of the world (in new ways) than ever before. They also present major open questions and risks to our organization, projects, and movement.
Disinformation
- Information warfare is intensifying.
- The use of information warfare as a political and geopolitical weapon by governments and political movements is intensifying and growing more complex and subtle, while also becoming more dangerous (disinformation campaigns are increasingly accompanied by physical threats, blackmail, arrests, etc.).
- Machine-generated content is expanding.
- The ability of artificial systems to generate quality content is expanding, and, crucially, these systems are rapidly becoming mainstream in most major markets. How Wikimedia positions itself could help shape the field.
- Encrypted disinformation and misinformation attack vectors are growing.
- Fears about digital privacy and information warfare are pushing disinformation further into closed channels, where encryption makes monitoring and prediction more challenging, allowing disinformation to thrive and polarization to advance further. As an open platform that runs counter to this damaging trend (monitorable, stewarded, cared for, public, and open), we can show that our model functionally addresses the problem.
- Wikimedia has become a legitimate target.
- In 2022, disinformation narratives and dedicated attacks against the movement, individual volunteers, and the Foundation also increased, creating ever more severe risks for our volunteers and for the Foundation's reputation.
Regulation
Wikimedia has become more international as an organization, which means that more laws of more countries apply to us.
To protect our projects and people, we will need to comply with an increasingly broad range of laws around the world. We must be prepared to fight against harmful government actions in court and publicly advocate against harmful laws in even more countries as we continue to grow.
- More is demanded of hosting providers than ever before.
- Governments are under increased political pressure to address a wide range of perceived harms and biases online. This year, the US Supreme Court will hear two cases challenging Section 230 of the Communications Decency Act (CDA 230), a seminal internet law, potentially disrupting well-established intermediary protections that platforms like Wikipedia rely upon. Meanwhile, penalties for hosting harmful content are growing, including criminal liability under some laws, such as the UK Online Safety Bill.
- Lawmakers aren’t thinking about Wikipedia.
- Legislation continues to conflate Wikimedia with for-profit platforms. Few policymakers understand Wikimedia’s volunteer-led content moderation model. We need to educate governments and policy influencers about Wikimedia’s model, and about how laws should protect and support it.
- Our relationship with for-profit tech platforms is important and complicated.
- We need each other. But to protect Wikimedia’s model, projects, and people from harmful regulation, lawmakers and policy influencers must understand how we are different from large for-profit platforms. We can accomplish this with positive messaging about how our volunteer-led model works and about the movement’s positive role in society.
Open call: Artificial Intelligence in Wikimedia
The Foundation will be hosting an open call to gather perspectives about AI products like ChatGPT and their implications for us as a free knowledge movement. The goal is to start the conversation around the opportunities, risks, and open questions related to AI technology so that we can make informed decisions that reflect the thinking of communities.
- Details: 23 March, 18:00 UTC on Zoom (find your local time). Email answers@wikimedia.org for the link.
- Interpretation will be provided for languages where there are 3 or more interested community members. Email answers@wikimedia.org with interpretation requests.
Notes from the call
You can see the notes from the March 23 call.