
Generative AI will help Wikipedia editors moderate, translate, and onboard newcomers

Image: GPT-Image-1, prompted by THE DECODER

Key Points

  • The Wikimedia Foundation has introduced a new AI strategy aimed at supporting Wikipedia's volunteer editors with tools for moderation, translation, and onboarding, choosing to lighten workloads rather than replace human contributors.
  • The Foundation will prioritize tools that help editors work more efficiently and bring in new contributors, while holding off on widespread AI-generated content to avoid overwhelming moderators and to protect content quality.
  • While Wikimedia prefers open-source AI, limited resources mean the organization is also considering "open-weight" models, and there are no current plans to build its own foundation model, open source or otherwise.

The Wikimedia Foundation unveils its new AI strategy focused on empowering volunteer contributors with tools for moderation, translation, and onboarding.

The Wikimedia Foundation has rolled out a new AI strategy designed to support Wikipedia's volunteer community. Instead of automating away the work of human editors, the plan aims to use AI to reduce the workload and make it easier for more people to get involved.

With broad, site-wide use of AI out of reach for now, the Foundation is focusing its limited resources on a handful of key areas. That means rolling out AI-powered tools for moderation, translation, and smoother onboarding for newcomers.

AI will also help editors surface information faster, freeing up time for the discussion and consensus-building that are at the heart of Wikipedia. The hope is that by lowering technical barriers, even more people will feel empowered to contribute.


"Our future work with AI will not only be determined by what we do, but also how we do it," write Chris Albon and Leila Zia, who lead AI and research at the Wikimedia Foundation.

Content generation might come down the road

The new AI strategy doesn’t rule out AI-generated content, but the Foundation is prioritizing content integrity for now.

"If we invest heavily in new content generation before moderation, the new content will overwhelm the capacity of the editors to moderate them. This balance might well shift in time as the needs of moderation vs. new content shifts," the strategy explains.

When it comes to technology choices, Wikimedia prefers open-source AI models. But given resource constraints, the Foundation is also open to "open-weight" models: AI systems whose weights are publicly available even if the code isn't fully open source. Developing a foundation model of its own isn't on the table for now.



Source: Wikimedia | Meta-Wiki