Danish Kapoor

Artificial intelligence-induced traffic decline challenges the future of Wikipedia

According to 2025 data from the Wikimedia Foundation, Wikipedia page views have fallen by roughly 8%. The foundation attributes the decline mainly to AI-powered search summaries and chatbots built on large language models, which it says are changing how users access information.

In an analysis written by Marshall Miller, the foundation stated that this trend could affect not only Wikipedia’s visibility but also the sustainability of its knowledge production. Miller says the decline in direct visits could weaken volunteer contribution rates, donation amounts and, indirectly, content accuracy. Even so, Wikipedia remains one of the largest pillars of the verifiable, unbiased information ecosystem on the Internet.

According to the statement, the Wikimedia team recently improved its data analysis systems to separate bot traffic more cleanly. Actual human traffic measured after this overhaul was significantly lower than in previous years. The drop observed particularly in countries such as the USA, India and Germany is seen as a consequence of AI systems inserting themselves directly into how information is produced and shared. In addition, search engines now present users with Wikipedia-based summaries, which reduces the rate of direct visits to the site.

AI systems have also sharply increased how often they crawl content. According to Wikimedia’s technical reports, crawling by LLM bots has doubled since the second half of 2024. These bots feed their own response models by scraping Wikipedia’s open data, so users obtain the information without ever visiting the site. Miller describes this as the information chain “weakening its own source.”

Wikipedia calls for direct access

Miller issued a more pointed call to AI-powered platforms. In his view, making the source visible is essential if users are to trust the information. However advanced a chatbot or search summary may be, engagement with the source weakens once direct access to the information disappears. Wikimedia therefore wants major search engines to refer more visitors to Wikipedia, and it calls for more transparent linking policies so that users can re-engage with the content production process.

On top of all this, the cancellation of an AI summary feature planned for Wikipedia last summer is read as a sign of how sensitive the community is about participation. Volunteer editors argued that AI-generated summaries could upset the balance between content integrity and accuracy. The project was shelved before it even launched, preserving Wikipedia’s human-centered editing principle.

Despite the decline, the Wikimedia Foundation is putting several technical and community-based measures on its agenda for the future. These include limiting bot traffic, modernizing volunteer contribution tools, and developing new forms of content access for younger users. Increasing mobile access rates and integrating Wikipedia’s video and image-based content into social platforms are also under consideration.

Even with these initiatives, the traffic lost to artificial intelligence is not expected to return in the short term. Wikipedia remains the primary source of information for many AI models, yet in the long run this creates a paradox: as AI produces answers from Wikipedia data, users have less need to visit the site directly. Wikipedia’s volunteer base thus shrinks, the verification chain weakens, and the currency of the content is put at risk.

Nevertheless, Wikimedia remains committed to its philosophy of open access to information and participatory production. The foundation aims to treat the transformation brought by artificial intelligence as an opportunity for restructuring rather than a loss. Its newly developed measurement systems and open-source bot-monitoring tools could help place future information sharing within a more transparent framework.

