IP Editing: Privacy Enhancement and Abuse Mitigation

Introduction

Over the past few years, internet users have become more aware of the importance of understanding how their personal data is collected and used. Governments in several countries have created new laws in an effort to better protect user privacy. The Wikimedia Foundation's public policy and legal teams continuously monitor developments in privacy laws around the world, considering how we can best protect user privacy, respect user expectations, and uphold the values of the Wikimedia movement. Against that backdrop, they asked us to investigate and undertake a technical improvement to the projects. We need to do this together with you.

MediaWiki stores and publishes the IP addresses of unregistered contributors (as part of their signature, in page histories, and in logs) and makes them visible to anyone who visits our sites. Publishing these IP addresses compromises the safety and anonymity of these users. In some cases, it can even put people at risk of government persecution. While we tell users that their IP address will be visible, few understand the ramifications of this. We are working on better privacy protection for unregistered contributors by hiding their IP addresses when they contribute to the projects, much as the average reader cannot see the IP of a registered user. This will involve creating a "masked IP" username, which will be automatically generated but human-readable. We have different ideas about how best to implement this. You can comment to let us know what you need.

Wikimedia projects have a very good reason to store and publish IP addresses: they play a critical role in keeping vandalism and harassment off our wikis. It is very important that patrollers, admins, and functionaries have tools that can identify and block vandals, sockpuppets, editors with conflicts of interest, and other bad actors. Working with you, we want to find ways to protect our users' privacy while keeping our anti-vandalism tools performing at least as well as they do now. The most important part of this is developing new tools to help fight vandalism. Once that is done, we hope to work on protecting IP addresses on our wikis, which includes restricting the number of people who can see other users' IP addresses and reducing the amount of time IP addresses are stored in our databases and logs. It is important to note that a critical part of this work will be ensuring that our wikis continue to have access to the same, or a better, level of anti-vandalism tooling and are not put at risk of abuse.

The Wikimedia Foundation's goal is to build a set of moderation tools that removes the need for everyone to have direct access to IP addresses. With this evolution of our moderation tools, we will be able to mask the IPs of unregistered accounts. We are very aware that this change will affect current moderation workflows, and we want to make sure that the new tools enable effective moderation, protect the projects from vandalism, and support community oversight.

We can only reach a decisive stage by working collaboratively with checkusers, stewards, admins, and other moderators.

This is a very challenging problem, with risks to our ability to protect our wikis if we get it wrong, which is why it has been postponed over the years. But in light of evolving data privacy standards on the internet, new laws, and changing user expectations, the Wikimedia Foundation believes that now is the time to address it.

Updates

10 June 2021

Hello everyone. It has been a few months since our last update on this project. We have taken this time to talk to a lot of people, in the editing community and within the Foundation. We have paid particular attention to weighing all the concerns raised in our discussions with experienced community members about the impact this will have on anti-vandalism efforts across our projects. We have also heard from a significant number of people who support this proposal as a step towards improving the privacy of unregistered editors and reducing the legal threat that exposing IPs to the world poses to our projects.

When we talked about this project in the past, we did not have a clear idea of the shape it would take. Our intention was to understand how IP addresses are useful to our communities. Since then, we have received a lot of feedback on this front from a series of conversations in different languages and in different communities. We are very grateful to all the community members who took the time to tell us how moderation works on their wikis or in their specific cross-wiki environment.

Proposal for sharing IP addresses with those who need access

We now have a more concrete proposal for this project that we hope will allow most anti-vandalism work to happen smoothly while restricting access to IP addresses to people who do not need to see them. I want to emphasize the word "proposal" because it is in no way a final verdict on what will happen. Our intention is to seek your feedback on this idea. What do you think will work? What do you think will not work? What other ideas could make this better? We developed these ideas during several discussions with experienced community members and refined them in collaboration with our Legal department. Here is the outline:

  • Checkusers, stewards and admins should be able to see complete IP addresses by opting in to a preference where they agree not to share them with others who do not have access to this information.
  • Editors who take part in anti-vandalism activities, as vetted by the community, can be granted a right to see IP addresses so they can continue their work. This could be handled in a manner similar to adminship on our projects. Community approval is important to ensure that only editors who truly need this access can get it. These editors will need an account that is at least a year old and has at least 500 edits.
  • All users with accounts over a year old and at least 500 edits will be able to access partially unmasked IP addresses without permission. This means an IP address will appear with its tail octet(s) – the last digits – hidden. This will be accessible via a preference where they agree not to share it with others who do not have access to this information. (A minimal sketch of such partial masking follows this list.)
  • All other users will not be able to access IP addresses for unregistered users.
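
To make the third point concrete, here is a minimal sketch of what partial masking could look like, assuming the simplest interpretation of the proposal (hide the trailing IPv4 octet, or the trailing groups of an IPv6 address); the exact number of hidden octets is still an open question.

```python
import ipaddress

def partially_mask(ip_string: str) -> str:
    """Hide the trailing octet of an IPv4 address, or the trailing half of an IPv6 address."""
    ip = ipaddress.ip_address(ip_string)
    if ip.version == 4:
        octets = ip_string.split(".")
        return ".".join(octets[:3] + ["xxx"])
    # IPv6: keep the first four groups of the expanded form, hide the rest.
    groups = ip.exploded.split(":")
    return ":".join(groups[:4] + ["xxxx"] * 4)

# partially_mask("198.51.100.27") -> "198.51.100.xxx"
# partially_mask("2001:db8::1")   -> "2001:0db8:0000:0000:xxxx:xxxx:xxxx:xxxx"
```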

IP address access will be logged so that due scrutiny can be performed when needed. This is similar to the log we maintain for checkuser access to private data. This is how we hope to balance the need for privacy with the communities' need for access to information to fight spam, vandalism and harassment. We want to give the information to those who need it, but we need a process, we need it to be opt-in so that only those with a genuine need can see it, and we need the accesses to be logged.

We would like to hear your thoughts about this proposal. You can leave your comments on the talk page.

  • What do you think will work?
  • What do you think will not work?
  • What other ideas could improve the proposal?

Tool development update

As you might already know, we are working on building some new tools, partly to soften the impact of IP masking, but also simply to build better anti-vandalism tools for everyone. It is no secret that the current state of moderation tools on our projects does not give the communities the tools they deserve. There is a lot of room for improvement. We want to build tools that make it easier for anti-vandalism fighters to work effectively. We also want to lower the barrier of entry to these roles for non-technical contributors.

We have talked about ideas for these tools before and I will provide a brief update on these below. Note that progress on these tools has been slow in the last few months as our team is working on overhauling SecurePoll to meet the needs of the upcoming WMF Board elections.

IP Info feature

 
Mockup for IP Info

We are building a tool that will display important information about an IP address that is commonly sought during investigations. Typically, patrollers, admins and checkusers rely on external websites to provide this information. We hope to make this process easier for them by integrating information from reliable IP information vendors into our websites. We recently built a prototype and conducted a round of user testing to validate our approach. We found that a majority of the editors in the interview set found the tool helpful and indicated they would like to use it in the future. There is an update on the project page that I would like to draw your attention to. Here are the key questions we would like your feedback on, on the project talk page:

  • When investigating an IP what kinds of information do you look for? Which page are you likely on when looking for this information?
  • What kinds of IP information do you find most useful?
  • What kinds of IP information, when shared, do you think could put our anonymous editors at risk?

Editor matching feature

This project has also been referred to as "Nearby editors" and "Sockpuppet detection" in earlier conversations. We are trying to find a suitable name for it that is understandable even to people who don't understand the word sockpuppetry. We are in the early stages of this project. Wikimedia Foundation Research has a project that could assist in detecting when two editors exhibit similar editing behaviors. This will help connect different unregistered editors when they edit under different auto-generated account usernames. We heard a lot of support for this project when we started talking about it a year ago. We also heard about the risks of developing such a feature. We are planning to build a prototype in the near term and share it with the community. There is a malnourished project page for this project. We hope to have an update for it soon. Your thoughts on this project are very welcome on the project talk page.

Data on Portuguese Wikipedia disabling IP edits

Portuguese Wikipedia banned unregistered editors from making edits to the project last year. Over the last few months, our team has been collecting data about the repercussions of this move on the general health of the project. We have also talked to several community members about their experience. We are working on the final bits to compile all the data that presents an accurate picture of the state of the project. We hope to have an update on this in the near future.

Previous updates

30 October 2020

We have updated the FAQ with more questions that have been asked on the talk page. The Wikimedia Foundation Legal department added a statement on request to the talk page discussion, and we have added it here on the main page too. On the talk page, we have tried to explain roughly how we think about giving the vandal fighters access to the data they need without them having to be CheckUsers or admins.

15 October 2020

This page had become largely out of date and we decided to rewrite parts of it to reflect where we are in the process. This is what it used to look like. We've updated it with the latest info on the tools we're working on and our research, fleshed out the motivations, and added a couple of things to the FAQ. Especially relevant are probably our work on the IP info feature, the new CheckUser tool which is now live on four wikis, and our research into the best way to handle IP identification: let us know what you need, the potential problems you see, and whether a combination of an IP and a cookie could be useful for your workflows.

Statement from the Wikimedia Foundation Legal department

This statement from the Wikimedia Foundation Legal department was written on request for the talk page and comes from that context. For visibility, we wanted you to be able to read it here too.

Hello All. This is a note from the Legal Affairs team. First, we’d like to thank everyone for their thoughtful comments. Please understand that sometimes, as lawyers, we can’t publicly share all of the details of our thinking; but we read your comments and perspectives, and they’re very helpful for us in advising the Foundation.

On some occasions, we need to keep specifics of our work or our advice to the organization confidential, due to the rules of legal ethics and legal privilege that control how lawyers must handle information about the work they do. We realize that our inability to spell out precisely what we’re thinking and why we might or might not do something can be frustrating in some instances, including this one. Although we can’t always disclose the details, we can confirm that our overall goals are to do the best we can to protect the projects and the communities at the same time as we ensure that the Foundation follows applicable law.

Within the Legal Affairs team, the privacy group focuses on ensuring that the Foundation-hosted sites and our data collection and handling practices are in line with relevant law, with our own privacy-related policies, and with our privacy values. We believe that individual privacy for contributors and readers is necessary to enable the creation, sharing, and consumption of free knowledge worldwide. As part of that work, we look first at applicable law, further informed by a mosaic of user questions, concerns, and requests, public policy concerns, organizational policies, and industry best practices to help steer privacy-related work at the Foundation. We take these inputs, and we design a legal strategy for the Foundation that guides our approach to privacy and related issues. In this particular case, careful consideration of these factors has led us to this effort to mask IPs of non-logged-in editors from exposure to all visitors to the Wikimedia projects. We can’t spell out the precise details of our deliberations, or the internal discussions and analyses that lay behind this decision, for the reasons discussed above regarding legal ethics and privilege.

We want to emphasize that the specifics of how we do this are flexible; we are looking for the best way to achieve this goal in line with supporting community needs. There are several potential options on the table, and we want to make sure that we find the implementation in partnership with you. We realize that you may have more questions, and we want to be clear upfront that in this dialogue we may not be able to answer the ones that have legal aspects. Thank you to everyone who has taken the time to consider this work and provide your opinions, concerns, and ideas.

Tools

As mentioned previously, our foremost goal is to provide better anti-vandalism tools for our communities, which will provide a better moderation experience for our vandal fighters while also working towards making the IP address string less valuable for them. Another important reason to do this is that IP addresses are hard to understand and are really only useful to tech-savvy users. This creates a barrier for new users without a technical background to enter functionary roles, as working with IP addresses involves a steep learning curve. We hope to get to a place where we have moderation tools that anyone can use without much prior knowledge.

The first thing we decided to focus on was making the CheckUser tool more flexible, powerful and easy to use. It is an important tool that serves the need to detect and block bad actors (especially long-term abusers) on a lot of our projects. The CheckUser tool was not very well maintained for many years, and as a result it appeared quite dated and lacked necessary features.

We also anticipated an uptick in the number of users who opt in to becoming a CheckUser on our projects once IP masking goes into effect. This reinforced the need for a better, easier CheckUser experience for our users. With that in mind, the Anti-Harassment Tools team spent the past year working on improving the CheckUser tool – making it much more efficient and user-friendly. This work has also taken into account many outstanding feature requests from the community. We have continually consulted with CheckUsers and stewards over the course of this project and have tried our best to deliver on their expectations. The new feature is set to go live on all projects in October 2020.

The next feature that we are working on is IP Info. We decided on this project after a round of consultation on six wikis, which helped us narrow down the use cases for IP addresses on our projects. It became apparent early on that there are some critical pieces of information that IP addresses provide which need to remain available for patrollers to perform their roles effectively. The goal for IP Info, thus, is to quickly and easily surface significant information about an IP address. IP addresses provide important information such as location, organization, the possibility of being a Tor/VPN node, rDNS, and listed range, to mention a few examples. By showing this quickly and easily, without the need for external tools that not everyone can use, we hope to make it easier for patrollers to do their job. The information provided is high-level enough that we can show it without endangering the anonymous user. At the same time, it is enough information for patrollers to make quality judgements about an IP address.
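
As an illustration of the kind of data IP Info could surface, here is a minimal sketch using only the Python standard library. It is not the actual IP Info implementation; in practice, geolocation, organization and proxy status would come from an external vendor API, which is not shown here.

```python
import ipaddress
import socket

def basic_ip_facts(ip_string: str) -> dict:
    """Return a few facts a patroller might want at a glance."""
    ip = ipaddress.ip_address(ip_string)
    facts = {
        "version": ip.version,      # 4 or 6
        "is_global": ip.is_global,  # False for private/reserved ranges
        "reverse_dns": None,        # rDNS often hints at the ISP or hosting provider
    }
    try:
        facts["reverse_dns"] = socket.gethostbyaddr(ip_string)[0]
    except OSError:
        pass  # no PTR record, or the lookup failed
    return facts
```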

After IP Info we will be focusing on a feature for finding similar editors. We'll be using a machine learning model, built in collaboration with CheckUsers and trained on historical CheckUser data, to compare user behavior and flag when two or more users appear to be behaving very similarly. The model will take into account which pages users are active on, their writing styles, editing times, etc. to make predictions about how similar two users are. We are doing our due diligence to make sure the model is as accurate as possible.
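
For illustration only, the sketch below shows the general idea of behavioural comparison using a simple cosine-similarity heuristic over pages edited and editing hours. The actual system is a machine-learning model trained on historical CheckUser data and is not reproduced here; the feature weights below are arbitrary.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def editor_similarity(edits_a, edits_b) -> float:
    """edits_* are lists of (page_title, hour_of_day) tuples, one per edit."""
    pages_a, pages_b = Counter(p for p, _ in edits_a), Counter(p for p, _ in edits_b)
    hours_a, hours_b = Counter(h for _, h in edits_a), Counter(h for _, h in edits_b)
    # Weight page overlap more heavily than time-of-day overlap (illustrative weights).
    return 0.7 * cosine(pages_a, pages_b) + 0.3 * cosine(hours_a, hours_b)
```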

Once it’s ready, there is a lot of scope for what such a model can do. As a first step we will be launching it to help CheckUsers detect socks easily without having to perform a lot of manual labor. In the future, we can think about how we can expose this tool to more people and apply it to detect malicious sockpuppeting rings and disinformation campaigns.

You can read more and leave comments on our project page for tools.

Motivation

We who are working on this are doing this because the legal and public policy teams advised us that we should evolve the projects’ handling of IP addresses in order to keep up with current privacy standards, laws, and user expectations. That’s really the main reason.

We also think there are other compelling reasons to work on this. If someone wants to help out but doesn't understand the ramifications of their IP address being publicly stored, their desire to make the world and the wiki a better place results in them inadvertently sharing their personal data with the public. This is not a new discussion: we've had it for about as long as the Wikimedia wikis have been around. An IP address can be used to find out a user's geographical location, institution, and other personally identifiable information, depending on how the IP address was assigned and by whom. This can sometimes mean that an IP address can be used to pinpoint exactly who made an edit and from where, especially when the editor pool in a geographic area is small. Concerns around exposing IP addresses on our projects have been raised repeatedly by our communities, and the Wikimedia movement as a whole has been talking about how to solve this problem for at least fifteen years. Here's a (non-exhaustive) list of some of the previous discussions that have happened around this topic.

We acknowledge that this is a thorny issue, with the potential for causing disruptions in workflows we greatly respect and really don't want to disrupt. We would only undertake this work, and spend so much time and energy on it, for very good reason. These issues are important independently, and together they have inspired this project: our own need and desire to protect those who want to contribute to the wikis, developments in the world we live in, and the online environment in which the projects exist.

Research

See also: IP masking impact report.
 
A Wikimedia Foundation-supported report on the impact that IP masking will have on our community.

IP masking impact

IP addresses are valuable as a semi-reliable partial identifier, which is not easily manipulated by the associated user. Depending on provider and device configuration, IP address information is not always accurate or precise, and deep technical knowledge and fluency are needed to make the best use of IP address information, though administrators are not currently required to demonstrate such fluency to have access. This technical information is used to support additional information (referred to as “behavioural knowledge”) where possible, and the information taken from IP addresses significantly impacts the course of administrative action taken.

On the social side, the issue of whether to allow unregistered users to edit has been a subject of extensive debate. So far, communities have erred on the side of allowing unregistered users to edit. The debate is generally framed around a desire to halt vandalism versus preserving the ability to edit pseudo-anonymously and lowering the barrier to edit. There is a perception of bias against unregistered users because of their association with vandalism, which also appears as algorithmic bias in tools such as ORES. Additionally, there are major communication issues when trying to talk to unregistered users, largely due to a lack of notifications and because there is no guarantee that the same person will be reading the messages sent to that IP talk page.

In terms of the potential impact of IP masking, it will significantly impact administrator workflows and may increase the burden on CheckUsers in the short term. If or when IP addresses are masked, we should expect our administrators' ability to manage vandalism to be greatly hindered. This can be mitigated by providing tools with equivalent or greater functionality, but we should expect a transitional period marked by reduced administrator efficacy. In order to provide proper tool support for our administrators’ work, we must be careful to preserve or provide alternatives to the following functions currently fulfilled by IP information:

  • Block efficacy and collateral estimation
  • Some way of surfacing similarities or patterns among unregistered users, such as geographic similarity, certain institutions (e.g. if edits are coming from a high school or university)
  • The ability to target specific groups of unregistered users, such as vandals jumping IPs within a specific range
  • Location or institution-specific actions (not necessarily blocks); for example, the ability to determine if edits are made from an open proxy, or public location like a school or public library.

Depending on how we handle temporary accounts or identifiers for unregistered users, we may be able to improve communication to unregistered users. Underlying discussions and concerns around unregistered editing, anonymous vandalism, and bias against unregistered users are unlikely to significantly change if we mask IPs, provided we maintain the ability to edit projects while logged out.

CheckUser Workflow

We interviewed CheckUsers on multiple projects throughout our process for designing the new Special:Investigate tool. Based on interviews and walkthroughs of real-life cases, we broke down the general CheckUser workflow into five sections:

  • Triaging: assessing cases for feasibility and complexity.
  • Profiling: creating a pattern of behaviour which will identify the user behind multiple accounts.
  • Checking: examining IPs and user agents using the CheckUser tool.
  • Judgement: matching this technical information against the behavioural information established in the Profiling step, in order to make a final decision about what kind of administrative action to take.
  • Closing: reporting the outcome of the investigation on public and private platforms where necessary, and appropriately archiving information for future use.

We also worked with staff from Trust and Safety to get a sense for how the CheckUser tool factors into Wikimedia Foundation investigations and cases that are escalated to T&S.

The most common and obvious pain points all revolved around the CheckUser tool's unintuitive information presentation and the need to open every single link in a new tab. This caused massive confusion as tab proliferation quickly got out of hand. To make matters worse, the information that CheckUser surfaces is highly technical and not easy to understand at first glance, making the tabs difficult to track. All of our interviewees said that they resorted to separate software or physical pen and paper in order to keep track of information.

We also ran some basic analyses of English Wikipedia's Sockpuppet Investigations page to get some baseline metrics on how many cases they process, how many are rejected, and how many sockpuppets a given report contains.

Patroller use of IP addresses

Previous research on patrolling on our projects has generally focused on the workload or workflow of patrollers. Most recently, the Patrolling on Wikipedia study focuses on the workflows of patrollers and on identifying potential threats to current anti-vandal practices. Older studies, such as the New Page Patrol survey and the Patroller work load study, focused on English Wikipedia. They also looked solely at the workload of patrollers, and more specifically at how bot patrolling tools have affected patroller workloads.

Our study tried to recruit from five target wikis, which were:

  • Japanese Wikipedia
  • Dutch Wikipedia
  • German Wikipedia
  • Chinese Wikipedia
  • English Wikiquote

They were selected for known attitudes towards IP edits, percentage of monthly edits made by IPs, and any other unique or unusual circumstances faced by IP editors (namely, use of the Pending Changes feature and widespread use of proxies). Participants were recruited via open calls on Village Pumps or the local equivalent. Where possible, we also posted on Wiki Embassy pages. Unfortunately, while we had interpretation support for the interviews themselves, we did not extend translation support to the messages, which may have accounted for low response rates. All interviews were conducted via Zoom, with a note-taker in attendance.

Supporting the findings from previous studies, we did not find a systematic or unified use of IP information. Additionally, this information was only sought out after a certain threshold of suspicion. Most further investigation of suspicious user activity begins with publicly available on-wiki information, such as checking previous local edits, Global Contributions, or looking for previous bans.

Precision and accuracy were less important qualities for IP information: upon seeing that one chosen IP information site returned three different results for the geographical location of the same IP address, one of our interviewees mentioned that precision in location was not as important as consistency. That is to say, so long as an IP address was consistently exposed as being from one country, it mattered less if it was correct or precise. This fits with our understanding of how IP address information is used: as a semi-unique piece of information associated with a single device or person, that is relatively hard to spoof for the average person. The accuracy or precision of the information attached to the user is less important than the fact that it is attached and difficult to change.

Our findings highlight a few key design aspects for the IP info tool (a small data sketch follows the list):

  • Provide at-a-glance conclusions over raw data
  • Cover key aspects of IP information:
    • Geolocation (to a city or district level where possible)
    • Registered organization
    • Connection type (high-traffic, such as data center or mobile network versus low-traffic, such as residential broadband)
    • Proxy status as binary yes or no
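
A minimal sketch of the at-a-glance summary these aspects suggest, using hypothetical field names; the real IP Info feature may structure this information differently.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IPInfoSummary:
    geolocation: Optional[str]      # city or district level where possible
    organization: Optional[str]     # registered organization for the IP range
    connection_type: Optional[str]  # e.g. "data center", "mobile", "residential"
    is_proxy: Optional[bool]        # plain yes/no; None when unknown
```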

As an ethical point, it will be important to be able to explain how any conclusions are reached, as well as the inaccuracy or imprecision inherent in pulling IP information. While this was not a major concern for the patrollers we talked to, if we are to create a tool that will be used to justify administrative action, we should be careful to make clear what the limitations of our tools are.

Frequently asked questions

Q: Will users with advanced permissions such as CheckUsers, Admins, Stewards still have access to IP addresses after this project is complete?

A: We don’t yet have a definitive answer to this question. Ideally, IP addresses should be exposed to as few people as possible (including WMF staff). We hope to restrict IP address exposure to only those users who need to see it.

Q: How would anti-vandalism tools work without IP addresses?

A: There are some potential ideas for achieving this goal. For one, we may be able to surface to functionaries other pertinent information about the user, instead of the IP, that provides the same amount of information. In addition, it may be possible to automatically verify whether two separate user accounts link to the same IP without exposing the IP itself – for instance, in sockpuppet investigations. It's also possible that anti-vandalism tools will continue to use IP addresses, but with restricted access. We will need to work closely with the community to find the optimal solutions.
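
For illustration only, one possible mechanism for the second idea is sketched below: comparing keyed hashes of IP addresses so that equality can be checked without ever displaying the IPs. The answer above does not commit to this or any other specific approach.

```python
import hashlib
import hmac

SERVER_SECRET = b"replace-with-a-real-secret"  # hypothetical, kept server-side and never published

def ip_fingerprint(ip: str) -> str:
    """Keyed hash of an IP address; the raw IP is never shown to the user."""
    return hmac.new(SERVER_SECRET, ip.encode(), hashlib.sha256).hexdigest()

def same_ip(ip_a: str, ip_b: str) -> bool:
    # Tells an investigator "these two edits came from the same address"
    # without revealing what the address actually is.
    return hmac.compare_digest(ip_fingerprint(ip_a), ip_fingerprint(ip_b))
```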

Q: If we don’t see IP addresses, what would we see instead when edits are made by unregistered users?

A: Instead of IP addresses, users will be able to see a unique, automatically-generated, human-readable username. This can look something like “Anonymous 12345”, for example.

Q: Will a new username be generated for every unregistered edit?

A: No. We intend to implement some method to make the generated usernames at least partially persistent, for example, by associating them with a cookie, the user’s IP address, or both.
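
As an example of what "partially persistent" could mean, the sketch below derives a stable, human-readable name from the IP address and a server-side secret, so repeat edits from the same IP map to the same "Anonymous NNNNN" name. This is purely hypothetical; the actual persistence mechanism (cookie, IP, or both) has not been decided.

```python
import hashlib
import hmac

SECRET = b"server-side-secret"  # hypothetical

def masked_username(ip: str) -> str:
    """Derive a stable 'Anonymous NNNNN' name from an IP without exposing the IP."""
    digest = hmac.new(SECRET, ip.encode(), hashlib.sha256).digest()
    number = int.from_bytes(digest[:4], "big") % 100_000
    return f"Anonymous {number:05d}"

# masked_username("203.0.113.7") always returns the same "Anonymous NNNNN" string.
```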

Q: Will you also be removing existing IP addresses from the wikis as part of this project?

A: We will not be hiding any existing IP addresses in history, logs or signatures for this project. It will only affect future edits made after this project has been launched.

Q: Is this project the result of a particular law being passed?

A: No. Data privacy standards are evolving in many countries and regions around the world, along with user expectations. We have always worked hard to protect user privacy, and we continue to learn from and apply best practices based on new standards and expectations. This project is the next step in our own evolution.

Q: What is the timeline on this project?

A: As mentioned above, we will not be making any firm decisions about this project until we have gathered input from the communities. We'd like to figure out sensible early steps that a development team could work on soon, so we can get started on what we think will be a long project, but we're not hurrying to meet a particular deadline.

Q: How do I get involved?

A: We would love to hear if you have ideas or feedback about the project! We would especially like to hear if you have any workflows or processes that might be impacted by this project. You can drop your thoughts on the talk page or fill out this form and we’ll reach out to you. Some of us will be at Wikimania and would love to meet you there as well.

Q: Why is this proposal so unclear?

A: It's not really a proposal and shouldn't have been described as such. We don't have a solution, but are trying to work out the best solutions with the communities. It might be helpful to understand this as a technical investigation trying to figure out how IP masking could work.

Q: Why don’t you just turn off the ability to edit without registering an account?

A: Unregistered editing works differently across different Wikimedia wikis. For example, Swedish Wikipedia has discussed unregistered editing in light of this investigation and decided they still want it. Japanese Wikipedia has a far higher percentage of IP editing than English Wikipedia, but the revert rate of those edits is only about a third as high – 9.5% compared to 27.4% – indicating that those edits are also far more useful. We think that deciding for all wikis that they can't have IP editing is a destructive solution. The research done on IP editing also indicates that IP editing is important for editor recruitment.

Q: Who will have access to IPs of unregistered users now?

A: We are not going to leave this burden to e.g. the CheckUsers and the stewards alone. We will have a new user right or the ability to opt in to see the IP if you fulfill certain requirements. Others could potentially see partial IP addresses. We are still talking to the communities about how this could best work.

Q: Has this been decided?

A: Yes. The Wikimedia Foundation's Legal department has stated that this is necessary. As the entity legally responsible for protecting the privacy of Wikimedia users, the Wikimedia Foundation has accepted this advice and is now working to find the best way to implement it while supporting and listening to the user communities. Some Wikimedians will be unhappy with this, but legal decisions like these have not been a matter of community consensus. What the communities can be part of deciding is how we do this. That very much needs to be defined with the Wikimedia communities.

Q. Will masks be global, for all Wikimedia wikis, or local, for one wiki?

A: Global. A masked IP will look the same across all Wikimedia wikis.

Q: Will all unregistered users be unblocked when this happens? If not, you could track the information in the logs.

A: No. This would wreak havoc on the wikis. This solution will have to be a compromise. We have to balance the privacy of our unregistered editors with our ability to protect the wikis.