Research:Analysis and Response Toolkit for Trust
This page is an incomplete draft of a research project.
Information is incomplete and is likely to change substantially before the project starts.
Key personnel
- Connie Moon Sehat (Hacks/Hackers) and co-principal investigators:
- Ariel Cetrone (Wikimedia DC)
- Andrea Bras (Hacks/Hackers)
- Alexandra Bornhoft (Hacks/Hackers)
- Peter B Meyer (Wikimedia DC)
Project summary
The Analysis and Response Toolkit for Trust (ARTT) project aims to provide people in online communities with tools that support trust-building conversations when discussing vaccine efficacy and other topics. One aspect of the ARTT project involves examining the issue of quality vaccine journalism. Through a collaboration with Wikimedia, this aspect of the project aims to develop and test software tools and resources, with the goal of helping Wikimedia editors and "community connectors" on social media platforms to discuss vaccine efficacy. The tools will help evaluate article quality, detect and address misinformation, and aid in talk page discussion.
Through August 2022 (Phase I), the team is collecting input on what makes vaccine articles reliable, what makes talk page and social media conversations about vaccines constructive, and how software tools can help. The ARTT software tool is envisioned to integrate components of this research, for example to:
- evaluate Wikimedia articles and other pages as sources of reliable information on vaccines
- develop lists of reliable and unreliable sources in this area, along with methods for evaluating them
- understand the structures and page relationships on Wikimedia and other platforms, so as to use related information from, for example, talk pages, article history pages, user histories, and ORES scores of articles (see the sketch after this list)
- help improve such articles
- in the long run, improve the accuracy and effectiveness of responses and reduce the workload of community connectors
We invite responses to our preliminary questionnaire at bit.ly/ARTTform.
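As one illustration of the platform signals mentioned in the list above, the sketch below queries the ORES article-quality model for a single revision of an English Wikipedia article. This is a minimal sketch, assuming the public ORES v3 REST endpoint and the requests library; the revision ID is a placeholder, and the function name article_quality is our own, not part of any ARTT code.

```python
import requests

# Minimal sketch: ask ORES for its article-quality prediction for one
# revision of an English Wikipedia article. The revision ID used below
# is a placeholder; substitute the ID of a real vaccine-related revision.
ORES_URL = "https://ores.wikimedia.org/v3/scores/{wiki}/{revid}/{model}"

def article_quality(revid, wiki="enwiki", model="articlequality"):
    """Return ORES's predicted quality class (Stub, Start, C, B, GA, or FA)."""
    resp = requests.get(
        ORES_URL.format(wiki=wiki, revid=revid, model=model), timeout=10
    )
    resp.raise_for_status()
    scores = resp.json()[wiki]["scores"][str(revid)]
    return scores[model]["score"]["prediction"]

if __name__ == "__main__":
    print(article_quality(1000000000))  # placeholder revision ID
```

A real integration would presumably batch revision IDs and cache responses; this single-revision call simply shows the shape of the signal a tool could draw on.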
The following research questions are being explored in Phase I:
- Can ARTT strengthen a reliable source guide, such as a perennial sources list? (See the sketch after this list.)
- Is it possible to increase the quality of vaccine-related Wikipedia articles to the point where ARTT may recognize them as reliable sources? What would it take for an article, itself, to become a reliable source?
- How can the tool encourage collegial exchanges among editors, e.g. on talk pages? If there were a tool that recommended tactics for effective communication with other editors, would you be inclined to use it?
- Would a tool that recommends quality sources be of use to trainers?
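For the first question above, one building block might be a lookup of a citation's domain against a reliability table in the spirit of the perennial sources list. The sketch below is hypothetical: the table, its status labels, and the function source_status are illustrative stand-ins, not actual ARTT or Wikipedia data.

```python
from urllib.parse import urlparse

# Hypothetical sketch of a perennial-sources-style lookup. The table below
# is an illustrative hand-coded subset, not actual ARTT or Wikipedia data;
# a real guide would be far larger and versioned.
SOURCE_STATUS = {
    "who.int": "generally reliable",
    "nejm.org": "generally reliable",
    "example-health-blog.com": "no consensus",
    "dailymail.co.uk": "deprecated",
}

def source_status(url):
    """Return the recorded reliability status for a cited URL, if any."""
    host = urlparse(url).netloc.lower()
    # Strip a leading "www." so www.who.int matches who.int.
    if host.startswith("www."):
        host = host[4:]
    return SOURCE_STATUS.get(host, "unlisted")

if __name__ == "__main__":
    print(source_status("https://www.who.int/news-room/fact-sheets"))
```

Keying the table on bare domains keeps it easy to audit and to update as community consensus about a source changes.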
Methods
- Focus groups
- Interviews and snowball surveys, conducted opportunistically
- Evaluate related tools, designs, and source code such as ORES, Cite Unseen, and Busy Squirrel
Dissemination
Wikimedia Policies, Ethics, and Human Subjects Protection
- Generally, we will not use personally identifying information without the permission of the respondent.
Benefits for the Wikimedia community
In the long run, the tool will help evaluate and monitor vaccine-related materials on Wikimedia projects.
Timeline
- NSF funding runs through August 2022 for the research/design phase (Phase I).
Funding
- US National Science Foundation
References
- ARTT description at the University of Washington Computer Science department, with links to datasets and tools for diagnosing reliable information and misinformation
- "Hacks/Hackers, Partners Awarded Funding to Participate in the 2021 National Science Foundation's Convergence Accelerator", NewsQ (News Quality Initiative), September 22, 2021
- Justine Zhang, Dario Taraborelli, et al.
- ORES, a machine-learning service that scores Wikipedia revisions and article quality
- Cite Unseen, a user script that adds categorization icons to citations in Wikipedia articles
- Busy Squirrel, a game built around a program's positivity/sentiment inferences about Wikimedia talk pages
- Sheldon Krimsky. 2012. "Do financial conflicts of interest bias research? An inquiry into the 'funding effect' hypothesis." Science, Technology, & Human Values 38(4): 566-587. Discusses biases in chemical/pharmaceutical research, also called "funder effects" or "funder COI"; note that Wikipedia articles would intrinsically include such biases, and a Wikipedia article on a new topic could not be perfect.