The Wikimedia Foundation unveils its new AI strategy focused on empowering volunteer contributors with tools for moderation, translation, and onboarding.
The Wikimedia Foundation has published a new AI strategy designed to support Wikipedia's volunteer community. Rather than automating away the work of human editors, the plan aims to use AI to reduce their workload and make it easier for more people to get involved.
With broad, site-wide use of AI out of reach for now, the Foundation is focusing its limited resources on a handful of key areas: AI-powered tools for moderation and translation, and smoother onboarding for newcomers.
AI will also help editors surface information faster, freeing up time for the discussion and consensus-building that are at the heart of Wikipedia. The hope is that by lowering technical barriers, even more people will feel empowered to contribute.
"Our future work with AI will not only be determined by what we do, but also how we do it," write Chris Albon and Leila Zia, who lead AI and research at the Wikimedia Foundation.
Content generation might come down the road
The new AI strategy doesn’t rule out AI-generated content, but the Foundation is prioritizing content integrity for now.
"If we invest heavily in new content generation before moderation, the new content will overwhelm the capacity of the editors to moderate them. This balance might well shift in time as the needs of moderation vs. new content shifts," the strategy explains.
When it comes to technology choices, Wikimedia prefers open-source AI models. Given resource constraints, though, the Foundation is also leaning on "open-weight" models, systems whose trained weights are publicly available even if the code and training data aren't fully open source. Developing its own open-source foundation model isn't on the table for now.