10.06.2025 08:59

Controversial, Yet…

The way people interact with information online is changing how they perceive their own knowledge, and the rise of AI assistants has added a new layer to this phenomenon.

A recent study explored how different methods of accessing information — traditional search engines versus conversational AI assistants — impact self-assessed knowledge. The findings are surprising, but they’ve sparked some disagreement.

Researchers conducted two experiments to compare the effects of these tools. When people used traditional search engines, they tended to overestimate their knowledge, conflating the information they found online with their own understanding, a phenomenon known as the "Google effect."

Essentially, they believed they knew the information all along because it was so readily accessible.

However, when participants interacted with AI assistants designed to behave like conversational partners, the opposite occurred. They didn’t inflate their self-assessed knowledge — in fact, they underestimated it.

The study suggests that AI interactions help people better distinguish between what they truly know and what they’ve learned from external sources. The paradox is striking: traditional internet searches breed overconfidence, while AI assistants foster humility and self-awareness.

But here’s where we diverge. Based on our own subjective observations, the reality seems flipped. People who’ve shifted to AI-driven searches often exhibit a surge in vanity. They take pride in having "trained" the AI, claiming they’ve "taught" it to find the "right" answers, "guided" it, or even "co-created" with it.

This interaction fuels a sense of superiority, as if they’re imparting wisdom to an unrefined AI, reinforcing their belief that they already know what’s correct and are merely shaping the AI to reflect that. The dialogue with AI becomes a mirror for their ego, amplifying their confidence in their own expertise.

In contrast, those who stick to traditional Google searches seem more grounded. They understand that search results are just a starting point — links to be critically evaluated. They’re more likely to recognize their own knowledge gaps, approaching the process with skepticism and a willingness to question their assumptions.

The act of sifting through results, reading, and verifying forces a humility that AI interactions often lack, as the latter can feel more like a collaborative partnership than a neutral tool.


The study’s findings run counter to these observations, and that tension makes for a contentious debate. On one hand, the research suggests AI assistants could be a humbling force, helping users better understand their limits.

On the other, real-world behavior indicates that AI interactions might inflate egos, fostering a false sense of mastery over both the technology and the knowledge it provides.

The truth likely lies in a gray area, shaped by individual attitudes and the design of the AI itself. What’s clear is that the way we access information is reshaping not just what we know, but how we see ourselves — and that’s a topic worth debating.

