Talk:Wikimedia Foundation/Chief Executive Officer/Maryana’s Listening Tour
This page is for discussions related to the Wikimedia Foundation/Chief Executive Officer/Maryana’s Listening Tour page. Please remember to:
We welcome speakers of all languages in this discussion. Please comment here in any language you wish; staff or other volunteers will translate your comments to English if possible.
For prior conversation during Maryana's Listening Tour, see: Talk:Wikimedia Foundation Chief Executive Officer/Maryana's Listening Tour/Overview
Feedback
Welcome, and thanks for this "essay" of the insights you have gathered thus far. I believe you have caught much of the essence of how we think and work, that your insights are almost spot on, and that your priorities for the next six months look very promising! Yger (talk) 17:12, 14 January 2022 (UTC)
- Thanks, @Yger! RWeissburg (WMF) (talk) 15:47, 23 February 2022 (UTC)
A few comments
Hi! I hope this comment comes out the right way, and please consider that English is not my first language. I really loved the effort to share the outcomes of the listening tour, but I'm somewhat confused by the "External trends" section. It seems to land on analyzing Wikipedia purely as a "tech platform". Why is it relevant to analyze Facebook, TikTok, etc., all the social media platforms, as if Wikipedia were competing with them? Is Wikipedia in the business of social media? Wikipedia is an encyclopedia, so it's actually worth analyzing the real information and knowledge ecosystem, which is not Facebook, TikTok, YouTube, or Google, or at least not exclusively. These companies do work with information and knowledge, but they are a fraction of the ecosystem of information and knowledge producers. Moreover, this analysis is heavily skewed towards "information consumption". That is an important aspect, but I'd argue that Wikipedia does much more than just "offer content". There's actual knowledge being produced on this platform.
For example, a more interesting trend to analyze in the knowledge ecosystem could have been what's happening right now all over the world with investigative journalism being de-prioritized and de-funded in favor of clickbait, the lack of strong public media in countries that urgently need it as a way to build sources and references for Wikipedia, or the rising threats to journalists in contexts where Wikipedia operates, which could seriously backfire on editors -- e.g., in Mexico, only two months into 2022, five journalists have already been killed. Wikipedia needs good-quality journalism as a way to build knowledge too, because we depend on reliable sources to write Wikipedia articles. There are a ton of things that the WMF could do to bring its leverage to bear on this problem, like partnering with organizations such as IFEX or ClimateTracker, to name a few.
In a similar fashion, what happens with universities, libraries and cultural heritage institutions being de-funded left and right all over the world? Not that I'm advocating for focusing (yet again) on the US, but the whole GLAM ecosystem in the US was seriously affected by the pandemic, with lay-offs and people leaving the sector and never coming back. GLAM partnerships have been one of the major sources of good-quality content and contributions to Wikipedia, but there's no analysis of what this situation of rampant defunding and overworked GLAM professionals might mean for Wikipedia.
The content translation section does not take into account the fact that the most important thing the community is doing is not just "translating content". A very important part of what we do as a community has to do with actually trying to bring more local, contextual knowledge to the platform. While I personally use the content translation tool heavily for certain areas, it would be awful to just translate a bunch of stuff written from the perspective of editors who are based in the US or in Europe and heavily use Western sources -- we need to bring more local perspectives in, not just try to meet some crazy, capitalist demand to push content into more languages without actually offering knowledge perspectives that are not Western-based or Western-biased.
The misinformation section does not seem to take into account what the community has already produced on the topic, e.g., what Yumiko Sato wrote about Japanese Wikipedia or the extensive reports produced by Marco Silva for the BBC on climate change misinformation, for which several members of the Wikipedia community were interviewed. Instead, that whole section reads as "tech platforms have a misinformation problem, but they're modelling themselves after us", when the reality is that Wikipedia does have a misinformation problem; it just happens to be outside English Wikipedia (and in all fairness, there are several teams at the org. working on this, but their potential analysis of this problem doesn't appear here).
Last but not least, the mention of learning initiatives from TikTok does not take into account other problems that I think are more pressing. For example, more and more activists are turning their attention to using Instagram, TikTok or Facebook as platforms to share knowledge and information about pressing issues because some of their audiences are there, but they don't seem to see Wikipedia as a platform to contribute some of the (amazing) content they are creating, while ignoring all the problems that those platforms have (e.g., the lack of easy information retrieval or archiving functions). Or, for that matter, content producers on those platforms have more incentives to create content there because they might eventually be able to monetize it (although the things that can get monetized might not really be fit for Wikipedia). The fact that there are more audiences on those platforms is not necessarily a threat to Wikipedia, but the fact that knowledge producers are increasingly concentrating their efforts on those platforms is.
TL;DR: I could go on and on and on, but I'll stop here. While I understand the communication goals of a report that tries to summarize key trends in a sector, I'm confused about why social media platforms are the sector that has been prioritized for analysis. Scann (talk) 13:59, 18 February 2022 (UTC)
- Hi @Scann - Yael here, Director of Strategic Partnerships at the Foundation. My team led the work that is (very briefly) summarized here.
- I want to thank you *so much* for this input. While I personally sit in the Bay Area of the U.S. (aka Silicon Valley), my team is spread all over the world, from Jakarta to Accra, Mexico City to Amsterdam. We make every effort not to be overly tech-centric, and to take a wider lens of the world that Wikimedia projects operate in. At the same time, I know that where we sit (in this case, me personally) by nature defines the lens we look through, and the longer I've spent in the Bay Area, the more I see the world through that lens. I imagine some of my contextual bias has therefore shaped this research. One of the reasons we wanted to share this more broadly with the community is precisely to get the kind of input you're giving - to help us see our blind spots, take an even wider lens, and look at trends from different perspectives. So, a sincere thank you for offering us that so clearly here.
- On a personal note, I will say that while I personally am not a TikTok or much of a Facebook user, I *am* seeing the impact these huge social media platforms are having on Wikimedia projects. So, while Wikipedia is an encyclopedia, for better or worse we are not competing with Britannica or others for attention - especially from a younger audience. We're "competing" with social media for time and attention. So I do think that while we may see Wikipedia and TikTok as apples and oranges (to use an American metaphor), the two are in fact more related to each other than they may have been a decade ago.
- AND, I agree with you that there's more to look at here: I love your point about journalism trends, and I'd like to add that to the next round of research. If you're open to it, I'd like to actually interview you and others in the movement who you think may have insight into this, about your perception of these and other trends.
- I'd also like to share the full(er), still draft, version of the research with you and others if you're open to it. We shared a very short summary here, which doesn't do justice to the work. AND, I know that even our first draft of research doesn't fully cover your points, and you're giving us good feedback about where to turn next. Please let me know how best to share this with you - either sharing a link to the fuller work here, or sending it directly.
- I have one additional comment on your excellent feedback on dis/misinformation, which I'll share below.
- Again, thank you, @Scann, for engaging. This is what I was personally hoping for in sharing this work, and I'm feeling energized by your contributions as it's helping me articulate our next steps in this work. Thank you! RWeissburg (WMF) (talk) 17:35, 22 February 2022 (UTC)
- Hi @RWeissburg (WMF), thanks for taking the time to reply.
- So, when you say: "I am seeing the impact these huge social media platforms are having on Wikimedia projects", would you mind, to the extent that it's possible to share in this public space, describing what this impact is and what it looks like? What are those differences you're mentioning? I believe the summary being shared here is not capturing that. Scann (talk) 21:33, 22 February 2022 (UTC)
- You're right. The summary is definitely not capturing that. I'm not sure who originally posted the summary, but I'm happy to flesh it out with more context. But, to answer your question:
- One potential impact (documented on Phab) is declining pageviews on Wikipedia. We don't know the causation here, of course, but in a world where >95% of our external traffic is referred to us from text-based search, and 90% of that search traffic comes from Google, AND where Google has now been surpassed by TikTok as the most visited site on the internet, it stands to reason that there's a correlation between the rise of social media and the decline in pageviews. To be clear, we have not determined this to be the case by any means, but the data we *do* have seems to indicate that we exist in a world where, whether we like it or not, we are competing with TikTok for attention.
- P.S. I'm trying to add links to citations but it's getting blocked for some reason. Happy to share another way or be guided on how to add external links and not get blocked by Spam filters :) RWeissburg (WMF) (talk) 15:44, 23 February 2022 (UTC)
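For anyone who wants to look at the underlying trend themselves, here is a minimal, illustrative sketch (not part of the research summarized above) that pulls monthly totals of human pageviews for English Wikipedia from the public Wikimedia Pageviews REST API; the project, agent filter, date range, and User-Agent string are just example choices.

```python
import json
import urllib.request

# Example query: monthly aggregate pageviews for English Wikipedia,
# human ("user") traffic only, January 2019 through January 2022.
# Timestamps for monthly granularity use the YYYYMMDDHH format.
URL = (
    "https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate/"
    "en.wikipedia.org/all-access/user/monthly/2019010100/2022010100"
)

# The User-Agent value is arbitrary; the API just expects one to be set.
request = urllib.request.Request(URL, headers={"User-Agent": "pageview-trend-sketch"})
with urllib.request.urlopen(request) as response:
    data = json.load(response)

# One line per month: timestamp and total pageviews, to eyeball the trend.
for item in data["items"]:
    print(item["timestamp"], item["views"])
```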
- Hello @RWeissburg (WMF), again, thanks for taking the time to reply. If you're using bit.ly or shortened URLs from Google, that might be what's preventing the links from being shared.
- I appreciate your response, but I still fail to see the connection you're trying to build with the arguments presented in the report. There's no connection between the question "what does the world need from us now", an "analysis of the information ecosystem" (as suggested at the beginning of the report), and "our declining pageviews due to how TikTok is behaving". These are three entirely different things.
- What does the world need from us now? That we keep on providing reliable, high-quality, well-curated, culturally appropriate, contextually relevant knowledge (not content). There are vast swaths of human knowledge where we're failing miserably, but they are barely mentioned in this report (to name only two as examples: women's reproductive rights and agriculture). I personally think that the "how we do this" suggestions presented in this report are problematic, to say the very least, and have little to do with what we might need to do as a movement. And again, this report does not mention a single one of the knowledge producers we have depended on to provide this reliable, high-quality information, such as public media, GLAM institutions, journalists and librarians. What the WMF might need to do with the platform to make sure that this content is provided to the audiences that need it is an entirely different issue, but the difference between the WMF and the movement is blurred when the report opens by saying "as a movement we need to ask this" and then "leaders at the WMF wrote this".
- What does the decline in pageviews mean, and how does it impact different areas of what the WMF and the movement do? To me, it's clear that I disagree with where this report is landing, where there seem to be suggestions that "sticky content", "gaining attention" and "creating a stay-on platform" are the ways to go. There are a ton of ways in which this problem can be approached, so this is heavily skewed towards a perspective that has little to do with what I believe Wikipedia is and what the movement does and cares about. Moreover, it's actually worrisome that we might think for a second of deploying any of the tactics that social media deploys to keep people scrolling to unhealthy levels (as in young teenage girls having their self-esteem smashed thanks to Instagram). I won't argue against the WMF putting more love into creating better video and image narratives, but it's a long stretch to think that the people who go to TikTok or Instagram to look for videos of cats are the same people who come to Wikipedia to look for answers to their questions, and even if they are, it's a long stretch to assume they have the same information patterns towards the platforms. There was research done in Latin America, presented recently at one of the research office hours (video here), where this same question was explored, and even young people have a very distinctive way of approaching the information they find on these different platforms, because they recognize that the information and the medium are actually quite different.
- The problem is not agreeing that social media competes for people's attention (although, for that matter, so does the gendered divide in domestic housework, which probably affects how women and men use Wikipedia). The problem is the conclusions this paper arrives at about what to do about it. They are completely at odds with what Wikipedia does. Again, these platforms have very specific engagement tactics (and missions) that have nothing to do with what we collectively do here.
- Of course, if you still want me to read the full report I can do it, but be aware I have a tendency to be very critical and I might disagree strongly with your data collection methods for this report. You can share it with me at scannopolis [at] gmail [dot] com. Scann (talk) 21:54, 26 February 2022 (UTC)
Search has fundamentally changed
editこんにちは。私は受動的であるため、ウィキメディア運動で活動する方に感謝します。
このセクションの主旨がよくわからないのですが、検索方法が変化しているため、それに合わせてリッチコンテンツにしていこうということでしょうか。だとすれば、私はあまり良いこととは思えません。人が便利で楽な方に流されていくのは仕方のないことで、画像や映像で表面的に理解したつもりになるより、現在はニッチな方に見えるテキストで深層を知ることが、長期的には大事なのではないかと思うからです。
私の誤解に因る一方的な意見になってしまっていたら、お詫び申し上げます。温厚知新 (talk) 08:27, 16 April 2022 (UTC)
Hello. I'm a passive user, so I appreciate anyone who is active in the Wikimedia movement. I'm not sure what the main point of this section is, but is it that search methods are changing and we should therefore move toward richer content? If so, I don't see that as a good thing. It's inevitable that people drift toward whatever is convenient and easy, but I think what matters in the long run is gaining a deep understanding through text, which may look niche now, rather than a superficial understanding through images and videos. I apologize if this is a one-sided opinion based on a misunderstanding on my part. translation by --YShibata (WMF) (talk) 11:12, 18 April 2022 (UTC)
Meeting the global demand for content
Disinformation and misinformation are on the rise
editIn addition to my comments above...
At the risk of speaking too much on this issue, I'd like to add a couple more thoughts. I must admit I'm upset about the misinformation section, because it seems like a very shallow analysis. A Scientific American article is quoted, but that source does not do the trick.
Part of the problem is that misinformation on social media has been treated by these companies as if it were an "externality" of their platforms, rather than a key, core component of their business models. In other words, it's not that Facebook or YouTube have a misinformation problem; it's rather that they literally thrive on polarization, on undermining public discourse, on people spreading misinformation, and on profiting from ads with outright lies. None of these companies have a misinformation problem; they are the misinformation problem.
For them, it's very convenient to treat misinformation as an "externality". It follows the playbook of manufacturing industries across the 20th century that made people sick with their products, spilled their waste into rivers, destroyed whole ecosystems, and then left it up to disempowered, ill people and grassroots organizations and activists to denounce what was happening and clean up the mess. We are part of the grassroots movements suffering from the externalities, not a player on equal footing with these companies. In other words, all these companies are using Wikipedia links to point people to "potentially contentious material", relying on the backs of volunteers to keep on making their profit from selling piles and piles of ... lies.
Moreover, I'm worried that Wikipedia is being put in that same bag of "trends in misinformation". There is a spectrum and different flavors of knowledge and information: on one side of the spectrum there is privatization of knowledge and information feudalism, and on the other side there is public knowledge and the democratization of free speech. There are gradients, but Wikipedia is much closer to public knowledge than to the privatization of knowledge.
All these companies not only create this problem; it's their core business. That's why they barely do anything to address it, as Frances Haugen's revelations have pointed out, over and over. There has been extensive, serious research on this topic that keeps proving that polarization and misinformation allow several of these companies to drive more engagement on their platforms, which in turn allows them to make more profit by selling more ads.
It's not so much that Facebook etc. have this problem and haven't done anything to solve it, but rather that they are actively working *against* our mission.
I'd also be very, very cautious about treating misinformation just as a problem of tech platforms. It is not. Misinformation is a social problem that has a lot of complex dynamics involved in it (as anyone who lives in an ex-USSR country can testify). This section treats misinformation as a problem of "changes in user behavior", as if a change in reading patterns were the core issue that suddenly makes people buy into a pile of ... lies. This is simplistic, to say the very least. One of the main problems is that misinformation thrives when there is a lack of public trust in knowledge, social and state institutions, including science. Economic inequality, lack of funding for public schools, etc., are increasingly part of the enabling conditions for misinformation.
These social media companies receive money to further undermine public trust, creating silos and information bubbles and letting deeply troubling worldviews go unchallenged in the name of "free speech" and "free thinkers". In turn, people who could be doing something more productive with their time have to explain that, "no, seriously, ivermectin won't prevent you from catching COVID".
I'm also concerned that, in not taking into account the extensive and careful research that has been done on this issue, it sweeps over the mounting controversies and evidence showing that companies such as FB have repeatedly ignored human rights violations that were being enabled by their platform, such as the Rohingya massacre (source). And FB is doing it again with climate change misinformation; see this as just one example.
All in all, I'm worried that this analysis could provide grounds for any type of policy. Here I'll make a disclaimer that, with another colleague, I had to write an issue brief on "Climate Justice & the Knowledge Commons: Opportunities for the digital rights space", commissioned by a set of US and European digital rights funding organizations, which is not yet published but will be soon. On this particular point, this is what we wrote:
"The conversation about online misinformation is still predominantly framed in an Anglo-European context, which obscures where and in which languages people have the greatest problems regarding misinformation and access to basic scientific research (Saudelli 2021; The Arab Weekly 2021). Underrepresented languages in the online climate conversation correspond to the same regions that already suffer the most severe impacts of climate change. These regions have less, if any, access to basic scientific climate research (largely written in English), scientific literacy, and public media to help with interpreting or communicating research. This information gap affects large swaths of the public, as well as policymakers (Ryan et al. 2019), so climate activists turn to social media to bridge the knowledge gaps on climate change in their own languages and contexts: their audiences need accessible, clear, and accurate information about the climate crisis, how it will affect their lives, and how society can adapt. However, for climate change, in particular, languages other than English and countries outside Western Europe and the U.S. are being targeted by misinformation campaigns that use subtle tactics to sow doubt around scientific evidence about the existence and impacts of climate change (Coan et al. 2021; Stinson 2021; Silva 2021). With misinformation on climate change rampant in their countries or languages, these climate activists are working in an uneven playing field.
(...)
To address misinformation on climate change, multiple approaches should be adopted, from community participation to public policy advocacy to coalition building, and there is ample room here for ideation and collaboration between digital rights groups and climate activists. It is worth pointing out that more research is needed in this area, particularly in languages other than English and in regions outside the U.S. and Europe, where we have less clarity on the extent of the problem."
I'm not advocating that WMF should take this path, but I do worry that if misinformation is identified as a big problem, then WMF needs to carefully pick its allies on solving that issue. To me, the WMF should be joining forces with other organizations that are asking accountability from these companies to stop spreading misinformation, such as this one on climate change misinformation.
More than ever, organizations that are working with a public-good mission should speak truth to power and use their leverage to actually bring about the change that is needed to address some of these very complex social and human issues. --Scann (talk) 13:54, 19 February 2022 (UTC)
- Yael here again. The only thing I have to say to this comment is 100% yes. This part from you:
- "I'm not advocating that WMF should take this path, but I do worry that if misinformation is identified as a big problem, then WMF needs to carefully pick its allies on solving that issue. To me, the WMF should be joining forces with other organizations that are asking accountability from these companies to stop spreading misinformation, such as this one on climate change misinformation."
- is as close as I can get to articulating my personal opinion in this work. I am not a public policy, legal, or mis/disinformation expert. Neither is anyone who ultimately drafted this work (though we consulted with our Public Policy and Trust & Safety colleagues on it). If I could summarize our perspective on this work it's this: WMF, and Wikimedia projects more broadly, do not have a coherent mis/disinformation strategy off-Wiki. And we as a community need to decide if this is something we should have (I personally think the answer is yes), or if we believe that our work should be limited to fighting mis/disinformation on-Wiki only. And if others agree with me (again, my personal opinion, not the Foundation's), then where do we go from here? And that's a conversation for all of us, not for my team, or even the Foundation.
- Happy (and curious) to hear your thoughts on this! RWeissburg (WMF) (talk) 17:42, 22 February 2022 (UTC)
I think we have done fairly well in dealing with the "alternative facts" era of politics, certainly better than some other sites. I think I see a change coming in the information sphere and in global politics: people are learning to handle political lies more effectively. But we must not get complacent, and we need to start thinking about how we combat "deep fakes", and also how we work with those who want to turn around a discredited news site or politician and start saying things worth taking seriously. Our emphasis needs to move towards the question: what would the Daily Mail need to do for us to treat it as a reliable source? WereSpielChequers (talk) 10:34, 3 February 2023 (UTC)
- Thanks, @WereSpielChequers. It sounds like you're suggesting that WMF proactively reaching out to problematic/perennial sources like the Daily Mail and making the case to them for how/why they should improve their journalistic standards could be an important way to address the spread of mis/disinformation online. Not sure if that idea has been raised before, and I wonder if those kinds of outlets would be responsive to a good-faith dialog (or if they're too attached to the traffic they get from misleading clickbait to care about the traffic they lose from not being citable on Wikipedia), but it's something to consider as part of a broader misinformation/disinformation strategy. MPinchuk (WMF) (talk) 23:27, 8 February 2023 (UTC)
- Hi MPinchuk, I wasn't thinking of going so far as outreach, more documenting what the process would be. I think the analogy should be with the EN:WP:Standard offer we make on the English language Wikipedia to people who have been banned. Document the process so that those who want it have a route back. WereSpielChequers (talk) 19:29, 17 February 2023 (UTC)
Distrust and disagreements
In my opinion, the Wikimedia movement's biggest external risk is public distrust. And it's happening already: many people no longer trust our projects.
Of course, this is part of a worldwide phenomenon of massive disinformation, where snake oil salesmen are succeeding in being more trusted than scientists. And as a major knowledge resource, Wikipedia is being actively targeted both to introduce biased context and to become untrustworthy.
The Wikimedia community has made a huge effort to keep the neutral point of view. However, some key community members still do not take such precautions. For example, Trump's English Wikipedia article simply states that "he falsely claimed that there was widespread electoral fraud". What do you think Republican Party supporters make of such an article? It should state that Trump's claims were considered false by federal judges, but the regular editors think that this clarification is unnecessary.
However, even if we acknowledge the existence of distortive forces, there's an underlying problem: people disagree on all sorts of issues, so it's pretty much impossible to write a text that everybody agrees with. Even neutrality is an opinion. How do you write an article about Donald Trump, Vladimir Putin or Nicolás Maduro and prevent anyone from saying it's biased? Be it right, left or both, someone will almost surely disagree with any text we publish, whether for criticising their idolised leaders or for not denouncing their sworn enemies.
So yes, the Wikimedia community can do better to regain and keep trust. But even keeping dark forces away won't be enough. We can't just write facts and think that people will blindly believe them. We must prove that we are truly neutral. --NaBUru38 (talk) 18:59, 28 February 2022 (UTC)
- We need to differentiate between places and subjects where we need to regain trust lost because we have failed to meet our standards, and people whose trust we have lost because they trusted Trump over election denial, or they trusted the anti-vax lobby or the climate change denial lobby. The WMF has an important role to play in regaining communities and projects that have been subverted by governments and political parties. But neutrality is not the midpoint between two sides; sometimes we need to state facts, such as that Trump lied when he claimed he was cheated by fraud in 2020. We need those facts to be well cited, but it isn't like comparing reviews of a ballet where one critic thought the approach was innovative and another dismissed it as ponderous. Sometimes it is as simple as the guy who lost an election telling his supporters they'd been cheated. WereSpielChequers (talk) 10:00, 3 February 2023 (UTC)
Puzzle 4: Multilingual in name only?
Hello, it is clear that the subject is well understood, but at the Foundation's last appearance (Conversation with the Trustees) there was no effort in this area: no translation was provided or guaranteed, and to get it we were obliged to ask for it beforehand! Cordially, Nehaoua (talk) 20:23, 25 February 2022 (UTC)
- Hi Nehaoua, nice to see you in last week's Conversation with the Trustees and thanks for following up here. We'd love to get to a place where we offer simultaneous interpretation into multiple languages for these sessions, but that will depend on community interest. We have created the current system to ensure we are hiring interpreters where they will actually be utilized by communities. Using the Movement Strategy discussions as a guide, and keeping in mind the cost of interpretation, we commit to hiring interpreters wherever there is interest from 5 or more community members. If you and your community are interested in attending these events in the future, I of course would be happy to personally coordinate the needed interpretation. I hope to get to a future where there is enough interest to allow for these meetings to have live interpretation into multiple languages every time, but again we rely on participation from community members to help us make that determination. --ELappen (WMF) (talk) 19:41, 28 February 2022 (UTC)
- @ELappen (WMF): How about just automatic translation of speech-to-text? There are Google Translate plugins for this, for Jitsi and other videochat platforms. Or (a communitarian approach) have an audio channel for each session where multilingual community members are invited to hang out and translate questions from other languages [many people are able to generally understand the audio, especially with auto-generated translated subtitles, but need help translating their own comments, which requires just basic translation ability and not simultaneous-interpretation capacity]. –SJ talk 14:03, 11 July 2022 (UTC)
Much of the debate is in English because that's our most widely used language and the default headquarters language of an overly centralised multinational organisation. But the solution to the WMF being overly centralised, when compared to institutions such as the Red Cross and Red Crescent, is to decentralise the WMF. Devolve more of the budget to national chapters in those countries where we can operate such chapters, and have the central fundraising team be an enabler and a disseminator of good practice; the money donated in India or Spain should by default be donations to our legal entities in India or Spain. We'd probably still have some global conversations defaulting to English, but it doesn't need to be the problem it is. As long as the volunteer parts of our movement are radically decentralised on a language basis, but the paid parts of our movement are tightly controlled from the USA, we will have built-in, unneeded tension in the movement.
As for technology, we've come further in the last twenty years than critics might think. But I look at my wife's Facebook feed, and most of her friends and family write in Georgian using the Latin script because they use devices with qwerty keyboards. We have money and could invest in the technology so that people editing our Georgian projects, and any other projects with non-Latin scripts, have the option to have their writing converted from qwerty to the relevant script on the fly. WereSpielChequers (talk) 10:21, 3 February 2023 (UTC)
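To make the suggestion above concrete, here is a minimal sketch of the kind of on-the-fly transliteration being described, assuming a deliberately simplified one-letter-to-one-letter phonetic table; a real input method would cover the full layout, digraphs such as "sh" and "ch", and the Shift-key pairs that distinguish similar Georgian consonants.

```python
# Sketch of qwerty-to-Georgian (Mkhedruli) transliteration. The table is
# partial and illustrative, not an existing MediaWiki feature.
LATIN_TO_GEORGIAN = {
    "a": "ა", "b": "ბ", "g": "გ", "d": "დ", "e": "ე", "v": "ვ",
    "z": "ზ", "i": "ი", "k": "კ", "l": "ლ", "m": "მ", "n": "ნ",
    "o": "ო", "p": "პ", "r": "რ", "s": "ს", "t": "ტ", "u": "უ",
    "f": "ფ", "q": "ქ", "x": "ხ", "j": "ჯ", "h": "ჰ", "c": "ც",
}

def to_georgian(text: str) -> str:
    """Replace each mapped Latin letter with its Georgian counterpart,
    leaving anything unmapped (spaces, punctuation) unchanged."""
    return "".join(LATIN_TO_GEORGIAN.get(ch, ch) for ch in text.lower())

print(to_georgian("gamarjoba"))  # გამარჯობა ("hello")
```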
Priority 3: Refresh the organisational values of the Wikimedia Foundation to guide our way
It is often difficult for an organisation to combine volunteers and paid staff, or at least to do so without tension. This is especially true when you add the cultural barriers of global interaction via the internet. The WMF hasn't always been a bunch of twenty-somethings talking to a bunch of greybeards under the misapprehension that they are talking to adolescents. But it is the stereotype. The WMF would benefit from looking at the values of other organisations that combine paid and unpaid staff, and at the values that enable that to work elsewhere, such as "we employ people to do the things that volunteers want to happen but aren't volunteering to do". It is possible for an NGO to have values that are more in accord with the ethos of its volunteers. WereSpielChequers (talk) 20:59, 3 February 2023 (UTC)