Artificial Intelligence
Google’s AI problem is a problem for us all
From government documents to news reports, commerce, music, and social interactions, much of the world’s information is now online. Google, founded in 1998 with the mission “to organize the world’s information and make it universally accessible and useful,” has become the gateway to this vast repository of knowledge and culture.
Yet even as Google Search has become essential infrastructure, Google itself is undermining that infrastructure's integrity in ways that demand a strong regulatory response.
On May 14, 2024, Google announced a revamp of its core search website to integrate generative AI content, aiming to “re-imagine” search. One of the first implementations, AI Overviews, uses a large language model (LLM) to generate authoritative-sounding responses to queries, removing the need for users to visit other websites.
To be clear, LLMs are not a form of intelligence, artificial or otherwise. They cannot reason or discern truth. They are high-powered pattern-recognition machines, generating responses based on statistical probabilities derived from their training data. This fundamental limitation has led to humorous yet concerning errors, such as AI Overviews suggesting adding glue to pizza sauce to prevent cheese from sliding off, advising people to eat small rocks, and claiming there are no African countries starting with the letter K.
These are not mere bugs to be patched but consequences of how LLMs function: they do not evaluate truth claims; they produce statistically probable outputs based on their training data. Although Google has acknowledged the criticism and promised improvements, the statistical nature of LLMs means AI Overviews will likely remain flawed.
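To make that concrete, here is a deliberately crude sketch in Python of the statistical move at the heart of text generation. The toy bigram “model” and its two-sentence corpus are invented for illustration; real LLMs are vastly larger neural networks, but they too choose each next word by likelihood, with nothing in the loop that checks whether the result is true.

```python
# Illustrative sketch only: a toy bigram "language model" built from a tiny,
# made-up corpus. The point is the mechanism, not the scale -- the next word
# is chosen because it is statistically likely, not because it is correct.
import random
from collections import Counter, defaultdict

corpus = ("the cheese slides off the pizza . "
          "the glue holds the cheese on the pizza .").split()

# Count how often each word follows each other word.
follow_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow_counts[prev][nxt] += 1

def next_word(prev: str) -> str:
    """Sample the next word in proportion to how often it followed `prev`."""
    counts = follow_counts[prev]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a "plausible" continuation. Nothing here evaluates truth, safety,
# or common sense -- only statistical likelihood.
word, output = "the", ["the"]
for _ in range(8):
    word = next_word(word)
    output.append(word)
print(" ".join(output))
```

Scale this mechanism up by billions of parameters and you get fluent, authoritative-sounding prose, but the underlying indifference to truth remains.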
These failures raise disturbing questions about our reliance on Google to organize and access information. Two fundamental flaws in Google Search are becoming increasingly problematic.
First, Google’s dependence on ad revenue has compromised its search functionality. The prioritization of paid advertisements over organic search results has degraded the user experience, favoring advertisers’ interests over those of people seeking reliable information. The ad-driven model also puts Google in direct competition with media companies for advertising dollars, further complicating the knowledge ecosystem.
This conflict was a key reason for the Canadian government’s Online News Act, requiring companies like Google and Meta to negotiate payments to Canadian news media organizations. The introduction of AI Overviews, designed to keep users on Google rather than directing them to other websites, exacerbates this issue.
Second, Google’s approach to knowledge, rooted in an ideology known as “dataism,” values data correlations over context and accuracy. This anti-science worldview ignores fundamental scientific standards of validity and reliability. Unlike librarians, who curate and categorize works with careful consideration, Google’s search algorithm ranks results by popularity rather than expert judgment.
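The contrast with expert curation is easiest to see in the idea Google was built on. The sketch below runs the textbook PageRank recurrence on an invented link graph; it is not Google’s actual, proprietary ranking system, which blends many signals, but it captures the founding logic: a page rises because other well-linked pages point to it, not because anyone has judged its content.

```python
# Illustrative sketch only: textbook PageRank on a hypothetical link graph.
# Rank is earned through incoming links, i.e. popularity, not expertise.
links = {
    "viral-listicle": ["meme-site"],
    "meme-site":      ["viral-listicle"],
    "peer-reviewed":  ["viral-listicle"],
    "library-guide":  ["peer-reviewed"],
}

damping = 0.85
rank = {page: 1.0 / len(links) for page in links}

# Power iteration: each page passes its rank along its outgoing links,
# so heavily linked pages accumulate score regardless of their accuracy.
for _ in range(50):
    new_rank = {page: (1 - damping) / len(links) for page in links}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:16s} {score:.3f}")
```

On this made-up graph, the mutually linking “viral-listicle” and “meme-site” end up outranking the “peer-reviewed” page, precisely the kind of outcome a librarian’s judgment would prevent.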
The societal damage from relying on a corrupted knowledge-organizing process is significant. Access to sound knowledge is crucial for societal function. Google’s advertising dependence and dataist ideology are actively sabotaging our knowledge ecosystem, necessitating a regulatory response.
Google Search should be managed by people with the ethics of librarians, not tech executives driven by profit. Governments must establish minimum standards for search quality, including separating advertising from search results and preventing the use of search data for personalized advertising. In addition, search companies and global platforms should be subject to domestic democratic oversight while maintaining interoperability across borders, in cooperation with like-minded democratic countries.
Implementing these measures will be challenging, but we cannot continue to delegate the organization of the world’s information to a profit-driven entity indifferent to the truth. Effective regulation is essential to ensure access to reliable information and maintain the integrity of our knowledge ecosystem.