[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"nav-categories":3,"article-the-mirage-effect-stanford-just-proved-that-computer-vision-is-often-just-confident-bullshit":70},{"data":4},[5,37,57,64],{"name":6,"slug":7,"categories":8},"Productivity","productivity",[9,13,17,21,25,29,33],{"id":10,"title":11,"slug":12},17,"Branding","branding",{"id":14,"title":15,"slug":16},19,"Marketing","marketing",{"id":18,"title":19,"slug":20},20,"Work","work",{"id":22,"title":23,"slug":24},34,"Community","community",{"id":26,"title":27,"slug":28},21,"For newbies","for-newbies",{"id":30,"title":31,"slug":32},24,"Investment","investment",{"id":34,"title":35,"slug":36},22,"Finance","finance",{"name":38,"slug":39,"categories":40},"Tech","tech",[41,45,49,53],{"id":42,"title":43,"slug":44},28,"Technology","technology",{"id":46,"title":47,"slug":48},32,"Artificial Intelligence","artificial-intelligence",{"id":50,"title":51,"slug":52},26,"Security and protection","security-and-protection",{"id":54,"title":55,"slug":56},31,"YouTube Blog","youtube-blog",{"name":58,"slug":59,"categories":60},"News","news",[61],{"id":62,"title":58,"slug":63},18,"quasanews",{"name":65,"slug":66,"categories":67},"Business","business",[68],{"id":69,"title":65,"slug":66},16,{"post":71,"published_news":95,"popular_news":153,"categories":224},{"title":72,"description":73,"meta_title":72,"meta_description":74,"meta_keywords":75,"text":76,"slug":77,"created_at":78,"publish_at":79,"formatted_created_at":80,"category_id":46,"links":81,"view_type":86,"video_url":87,"views":88,"likes":89,"lang":90,"comments_count":89,"category":91},"The Mirage Effect: Stanford Just Proved That “Computer Vision” Is Often Just Confident Bullshit","A new preprint from Stanford researchers has dropped a quiet bomb on the entire field of multimodal AI. 
They call it the Mirage Effect — and it’s one of the most uncomfortable findings in recent AI research.","They call it the Mirage Effect — and it’s one of the most uncomfortable findings in recent AI research.","Here’s what happens: you ask a modern vision-language model (GPT-4o, Claude-3.5, Gemini, Qwen, etc.) to describe an image… except there is no image.","\u003Cp>A new preprint from Stanford researchers has dropped a quiet bomb on the entire field of multimodal AI. They call it the \u003Cstrong>Mirage Effect\u003C/strong>&nbsp;&mdash; and it&rsquo;s one of the most uncomfortable findings in recent AI research.\u003C/p>\n\n\u003Cp>\u003Cpicture class=\"image-align-left\">\u003Csource srcset=\"https://cdn.quasa.io/photos/00/image-2026-04-11t173204254.webp\" type=\"image/webp\">\u003Cimg alt=\"The Mirage Effect: Stanford Just Proved That “Computer Vision” Is Often Just Confident Bullshit\" class=\"image-align-left\" height=\"201\" src=\"https://cdn.quasa.io/photos/00/image-2026-04-11t173204254.jpg\" width=\"300\" />\u003C/picture>Here&rsquo;s what happens: you ask a modern vision-language model (GPT-4o, Claude-3.5, Gemini, Qwen, etc.) to describe an image&hellip; except there is no image. The file never uploaded, the link broke, or you simply forgot to attach it. &nbsp;\u003C/p>\n\n\u003Cp>Instead of saying &ldquo;Sorry, no image was provided,&rdquo; the model confidently starts hallucinating vivid details: &ldquo;This chest X-ray shows mild pneumothorax in the left lung,&rdquo; &ldquo;There are three sparrows on the branch,&rdquo; or &ldquo;The license plate is clearly 7H8K-392.&rdquo;\u003C/p>\n\n\u003Cp>It doesn&rsquo;t hedge. It doesn&rsquo;t admit uncertainty. 
It just delivers a detailed, authoritative description of something that doesn&rsquo;t exist.\u003C/p>\n\n\u003Chr />\n\u003Ch4>\u003Cstrong>The Numbers Are Brutal\u003C/strong>\u003C/h4>\n\n\u003Cp>\u003Cstrong>\u003Cpicture class=\"image-align-right\">\u003Csource srcset=\"https://cdn.quasa.io/photos/00/image-2026-04-11t173205817.webp\" type=\"image/webp\">\u003Cimg alt=\"The Mirage Effect: Stanford Just Proved That “Computer Vision” Is Often Just Confident Bullshit\" class=\"image-align-right\" height=\"447\" src=\"https://cdn.quasa.io/photos/00/image-2026-04-11t173205817.jpg\" width=\"300\" />\u003C/picture>The researchers tested the latest frontier multimodal models and found:\u003C/strong>\u003C/p>\n\n\u003Cul>\n\t\u003Cli>On average, these models produce \u003Cstrong>visual mirages more than 60% of the time\u003C/strong>&nbsp;when no image is actually present.\u003C/li>\n\t\u003Cli>With certain prompting styles, the rate jumps to \u003Cstrong>90&ndash;100%\u003C/strong>. &nbsp;\u003C/li>\n\t\u003Cli>Not a single major model reliably says &ldquo;I don&rsquo;t see an image.&rdquo;\u003C/li>\n\u003C/ul>\n\n\u003Cp>Even worse: when the researchers stripped away the actual images from standard visual benchmarks, the models still achieved \u003Cstrong>70&ndash;80% of their original &ldquo;visual&rdquo; accuracy\u003C/strong>. In other words, a huge chunk of what we&rsquo;ve been calling &ldquo;computer vision success&rdquo; was never vision at all &mdash; it was just the model exploiting statistical patterns in the questions and training data.\u003C/p>\n\n\u003Ch4>\u003Cstrong>The Medical Problem Is Especially Scary\u003C/strong>\u003C/h4>\n\n\u003Cp>The mirages get darker in healthcare. When no image is provided, models don&rsquo;t just guess randomly &mdash; they are heavily biased toward inventing \u003Cstrong>severe pathologies\u003C/strong>. 
In the paper&rsquo;s medical examples, the hallucinations disproportionately produce melanoma, carcinoma, fractures, tumors, and other &ldquo;scary&rdquo; diagnoses.\u003C/p>\n\n\u003Cp>If an image fails to load in a real clinical pipeline, the model won&rsquo;t flag &ldquo;no data.&rdquo; It will confidently deliver a terrifying (and completely made-up) diagnosis instead.\u003C/p>\n\n\u003Chr />\n\u003Ch4>\u003Cstrong>The Ultimate Humiliation: A 3B Model Beats the Giants\u003C/strong>\u003C/h4>\n\n\u003Cp>To prove how broken the current evaluation is, the Stanford team took a relatively small \u003Cstrong>Qwen-2.5 3B\u003C/strong>&nbsp;model and fine-tuned it on a chest X-ray benchmark &mdash; \u003Cstrong>but without ever showing it a single real image\u003C/strong>. They only let it see the questions and the answer distribution.\u003C/p>\n\n\u003Cp>Result? &nbsp;\u003Cbr />\nThis tiny 3-billion-parameter model \u003Cstrong>outperformed\u003C/strong>&nbsp;giant frontier models \u003Cem>and\u003C/em> the average human radiologist on the benchmark.\u003C/p>\n\n\u003Cp>It didn&rsquo;t learn to read X-rays. &nbsp;\u003Cbr />\nIt learned to read the test.\u003C/p>\n\n\u003Chr />\n\u003Ch4>\u003Cstrong>The Proposed Fix: B-Clean\u003C/strong>\u003C/h4>\n\n\u003Cp>The authors aren&rsquo;t just complaining. They introduced a new cleaning method called \u003Cstrong>B-Clean\u003C/strong>&nbsp;(Benchmark Cleaning). The idea is simple but powerful: go through every visual benchmark and remove any question that a model can answer correctly without actually needing to see the image.\u003C/p>\n\n\u003Cp>Only after this cleaning should we measure true visual intelligence. 
Until then, we&rsquo;re mostly measuring how well models have learned to bullshit convincingly.\u003C/p>\n\n\u003Cp>\u003Cpicture class=\"image-align-left\">\u003Csource srcset=\"https://cdn.quasa.io/photos/00/image-2026-04-11t173209916.webp\" type=\"image/webp\">\u003Cimg alt=\"The Mirage Effect: Stanford Just Proved That “Computer Vision” Is Often Just Confident Bullshit\" class=\"image-align-left\" height=\"447\" src=\"https://cdn.quasa.io/photos/00/image-2026-04-11t173209916.jpg\" width=\"300\" />\u003C/picture>Also read:\u003C/p>\n\n\u003Cul>\n\t\u003Cli>\u003Ca href=\"https://quasa.io/media/thirteen-bullets-and-one-molotov-cocktail-how-anti-ai-protest-just-got-deadly-serious\">Thirteen Bullets and One Molotov Cocktail: How Anti-AI Protest Just Got Deadly Serious\u003C/a>\u003C/li>\n\t\u003Cli>\u003Ca href=\"https://quasa.io/media/the-dawn-of-the-wisdom-era-why-your-intelligence-is-no-longer-enough\">The Dawn of the Wisdom Era: Why Your Intelligence is No Longer Enough\u003C/a>\u003C/li>\n\t\u003Cli>\u003Ca href=\"https://quasa.io/media/two-seemingly-contradictory-thoughts-about-ai-transformation\">Two Seemingly Contradictory Thoughts About AI Transformation\u003C/a>\u003C/li>\n\u003C/ul>\n\n\u003Ch4>\u003Cstrong>Why This Matters\u003C/strong>\u003C/h4>\n\n\u003Cp>We&rsquo;ve spent years celebrating &ldquo;state-of-the-art&rdquo; scores on visual reasoning benchmarks. This paper suggests that a large portion of those victories were mirages &mdash; impressive-sounding nonsense delivered with total confidence.\u003C/p>\n\n\u003Cp>The models aren&rsquo;t just occasionally wrong. 
&nbsp;\u003Cbr />\nThey&rsquo;re systematically pretending to see things that aren&rsquo;t there, especially when it matters most (medicine, safety-critical applications).\u003C/p>\n\n\u003Cp>\u003Ca href=\"https://arxiv.org/abs/2603.21687\">The preprint is available here: https://arxiv.org/abs/2603.21687\u003C/a>\u003C/p>\n\n\u003Cp>It&rsquo;s one of those rare papers that doesn&rsquo;t just point out a flaw &mdash; it explains why the entire measurement system has been lying to us. And it gives us a practical way to stop believing the mirage.\u003C/p>","the-mirage-effect-stanford-just-proved-that-computer-vision-is-often-just-confident-bullshit","2026-04-11T15:34:04.000000Z","2026-04-16T11:27:00.000000Z","16.04.2026",{"image":82,"image_webp":83,"thumb":84,"thumb_webp":85},"https://cdn.quasa.io/images/news/E7k3kFPS8Ld8gMXfXbpP3UnpeOeVN6n0q2J3rnoT.jpg","https://cdn.quasa.io/images/news/E7k3kFPS8Ld8gMXfXbpP3UnpeOeVN6n0q2J3rnoT.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/E7k3kFPS8Ld8gMXfXbpP3UnpeOeVN6n0q2J3rnoT.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/E7k3kFPS8Ld8gMXfXbpP3UnpeOeVN6n0q2J3rnoT.webp","small",null,35,0,"en",{"id":46,"title":47,"slug":48,"meta_title":47,"meta_description":92,"meta_keywords":92,"deleted_at":87,"created_at":93,"updated_at":94,"lang":90},"Artificial Intelligence, ai, ml, machine learning, chatgpt, 
future","2024-09-22T08:08:27.000000Z","2024-09-23T12:49:38.000000Z",[96,100,113,126,139],{"title":72,"description":73,"slug":77,"created_at":78,"publish_at":79,"formatted_created_at":80,"category":97,"links":98,"view_type":86,"video_url":87,"views":88,"likes":89,"lang":90,"comments_count":89,"is_pinned":99},{"title":47,"slug":48},{"image":82,"image_webp":83,"thumb":84,"thumb_webp":85},false,{"title":101,"description":102,"slug":103,"created_at":104,"publish_at":105,"formatted_created_at":80,"category":106,"links":107,"view_type":86,"video_url":87,"views":112,"likes":89,"lang":90,"comments_count":89,"is_pinned":99},"a16z’s New Top 100 AI Consumer Apps Just Rewrote the Rules — And the Leaderboard Is Finally Stabilizing","In March 2026, Andreessen Horowitz released the 6th edition of its influential “100 Gen AI Consumer Apps” ranking. This isn’t just another quarterly update.","a16z-s-new-top-100-ai-consumer-apps-just-rewrote-the-rules-and-the-leaderboard-is-finally-stabilizing","2026-04-11T15:17:30.000000Z","2026-04-16T09:17:00.000000Z",{"title":31,"slug":32},{"image":108,"image_webp":109,"thumb":110,"thumb_webp":111},"https://cdn.quasa.io/images/news/bfjWNnYdRvEtVzMMgPieoBIYjbOjUittsYtgz3N0.jpg","https://cdn.quasa.io/images/news/bfjWNnYdRvEtVzMMgPieoBIYjbOjUittsYtgz3N0.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/bfjWNnYdRvEtVzMMgPieoBIYjbOjUittsYtgz3N0.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/bfjWNnYdRvEtVzMMgPieoBIYjbOjUittsYtgz3N0.webp",48,{"title":114,"description":115,"slug":116,"created_at":117,"publish_at":118,"formatted_created_at":80,"category":119,"links":120,"view_type":86,"video_url":87,"views":125,"likes":89,"lang":90,"comments_count":89,"is_pinned":99},"The Billion-Dollar Solo Startup: How One Man Built Medvi — $401M in Year One, $1.8B Run Rate in Year Two — With Just AI and His Brother","In September 2024, Matthew Gallagher did something that most founders would call 
impossible.","the-billion-dollar-solo-startup-how-one-man-built-medvi-401m-in-year-one-1-8b-run-rate-in-year-two-with-just-ai-and-his-brother","2026-04-11T13:46:41.000000Z","2026-04-16T06:36:00.000000Z",{"title":27,"slug":28},{"image":121,"image_webp":122,"thumb":123,"thumb_webp":124},"https://cdn.quasa.io/images/news/tGvDmtvZsCfJYVWFXT23fjo9Bp0fzURpyH82urY4.jpg","https://cdn.quasa.io/images/news/tGvDmtvZsCfJYVWFXT23fjo9Bp0fzURpyH82urY4.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/tGvDmtvZsCfJYVWFXT23fjo9Bp0fzURpyH82urY4.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/tGvDmtvZsCfJYVWFXT23fjo9Bp0fzURpyH82urY4.webp",61,{"title":127,"description":128,"slug":129,"created_at":130,"publish_at":131,"formatted_created_at":80,"category":132,"links":133,"view_type":86,"video_url":87,"views":138,"likes":89,"lang":90,"comments_count":89,"is_pinned":99},"Huawei’s Moon Mode Scandal: The Forgotten 2019 AI Fake That Suddenly Feels Nostalgic — As Huawei Prepares to Power DeepSeek V4","In early April 2026, a quiet but seismic piece of news dropped: DeepSeek’s upcoming V4 model — the next major leap from one of China’s strongest open-weight AI labs — will run entirely on Huawei chips, not 
Nvidia.","huawei-s-moon-mode-scandal-the-forgotten-2019-ai-fake-that-suddenly-feels-nostalgic-as-huawei-prepares-to-power-deepseek-v4","2026-04-11T13:33:59.000000Z","2026-04-16T03:25:00.000000Z",{"title":65,"slug":66},{"image":134,"image_webp":135,"thumb":136,"thumb_webp":137},"https://cdn.quasa.io/images/news/OSByK57tywEscodciIprl7UfUwGIZyWDV65GG9Tr.jpg","https://cdn.quasa.io/images/news/OSByK57tywEscodciIprl7UfUwGIZyWDV65GG9Tr.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/OSByK57tywEscodciIprl7UfUwGIZyWDV65GG9Tr.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/OSByK57tywEscodciIprl7UfUwGIZyWDV65GG9Tr.webp",79,{"title":140,"description":141,"slug":142,"created_at":143,"publish_at":144,"formatted_created_at":145,"category":146,"links":147,"view_type":86,"video_url":87,"views":152,"likes":89,"lang":90,"comments_count":89,"is_pinned":99},"Meta’s Muse Spark: A Respectable Step Up That Finally Puts Them Back in the Game","On April 8, 2026, Meta Superintelligence Labs quietly dropped Muse Spark — the first model in their new “Muse” family. It’s not the flashy, headline-grabbing monster that instantly claims the #1 spot on every leaderboard. 
But here’s the thing: it doesn’t have to be.","meta-s-muse-spark-a-respectable-step-up-that-finally-puts-them-back-in-the-game","2026-04-11T13:12:57.000000Z","2026-04-15T11:06:00.000000Z","15.04.2026",{"title":58,"slug":63},{"image":148,"image_webp":149,"thumb":150,"thumb_webp":151},"https://cdn.quasa.io/images/news/E3InMGumI0q4D1ZaZ3uxWCVLM2CQbKGnPvZkEETO.jpg","https://cdn.quasa.io/images/news/E3InMGumI0q4D1ZaZ3uxWCVLM2CQbKGnPvZkEETO.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/E3InMGumI0q4D1ZaZ3uxWCVLM2CQbKGnPvZkEETO.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/E3InMGumI0q4D1ZaZ3uxWCVLM2CQbKGnPvZkEETO.webp",163,[154,167,183,195,210],{"title":155,"description":156,"slug":157,"created_at":158,"publish_at":159,"formatted_created_at":160,"category":161,"links":162,"view_type":86,"video_url":87,"views":165,"likes":166,"lang":90,"comments_count":89,"is_pinned":99},"The Anatomy of an Entrepreneur","Entrepreneur is a French word that means an enterpriser. Enterprisers are people who undertake a business or enterprise with the chance of earning profits or suffering from loss.","the-anatomy-of-an-entrepreneur","2021-08-04T15:18:21.000000Z","2025-12-14T06:09:00.000000Z","14.12.2025",{"title":65,"slug":66},{"image":163,"image_webp":87,"thumb":164,"thumb_webp":164},"https://cdn.quasa.io/images/news/mVsXPTMuHZuI7UXCsENgL1Qwp1uSOf7Rz3uVPMfm.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/mVsXPTMuHZuI7UXCsENgL1Qwp1uSOf7Rz3uVPMfm.webp",70430,2,{"title":168,"description":169,"slug":170,"created_at":171,"publish_at":172,"formatted_created_at":173,"category":174,"links":175,"view_type":180,"video_url":87,"views":181,"likes":182,"lang":90,"comments_count":89,"is_pinned":99},"Advertising on QUASA","QUASA MEDIA is read by more than 400 thousand people a month. 
We offer to place your article, add a link or order the writing of an article for publication.","advertising-on-quasa","2022-07-06T07:33:02.000000Z","2025-12-15T17:33:02.000000Z","15.12.2025",{"title":58,"slug":63},{"image":176,"image_webp":177,"thumb":178,"thumb_webp":179},"https://cdn.quasa.io/images/news/45SvmdsTQbiyc3nxgbyHY1mpVbisYyub2BCHjqBL.jpg","https://cdn.quasa.io/images/news/45SvmdsTQbiyc3nxgbyHY1mpVbisYyub2BCHjqBL.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/45SvmdsTQbiyc3nxgbyHY1mpVbisYyub2BCHjqBL.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/45SvmdsTQbiyc3nxgbyHY1mpVbisYyub2BCHjqBL.webp","large",70182,4,{"title":184,"description":185,"slug":186,"created_at":187,"publish_at":188,"formatted_created_at":189,"category":190,"links":191,"view_type":86,"video_url":87,"views":194,"likes":182,"lang":90,"comments_count":89,"is_pinned":99},"What is a Startup?","A startup is not a new company, not a tech company, nor a new tech company. You can be a new tech company, if your goal is not to grow high and fast; then, you are not a startup. ","what-is-a-startup","2021-08-04T12:05:17.000000Z","2025-12-17T13:02:00.000000Z","17.12.2025",{"title":65,"slug":66},{"image":192,"image_webp":87,"thumb":193,"thumb_webp":193},"https://cdn.quasa.io/images/news/EOsQhSW3VXyG7a6NPdE1oZd00xfJXe3bjY5aJGb7.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/EOsQhSW3VXyG7a6NPdE1oZd00xfJXe3bjY5aJGb7.webp",67833,{"title":196,"description":197,"slug":198,"created_at":199,"publish_at":200,"formatted_created_at":201,"category":202,"links":203,"view_type":86,"video_url":87,"views":208,"likes":166,"lang":90,"comments_count":209,"is_pinned":99},"Top 5 Tips to Make More Money as a Content Creator","Content creators are one of the most desired job titles right now. 
Who wouldn’t want to earn a living online?","top-5-tips-to-make-more-money-as-a-content-creator","2022-01-17T17:31:51.000000Z","2026-01-17T11:30:00.000000Z","17.01.2026",{"title":19,"slug":20},{"image":204,"image_webp":205,"thumb":206,"thumb_webp":207},"https://cdn.quasa.io/images/news/gP8kiumBPpJmQv6SMieXiX1tDetx43VwFfO1P4Ca.jpg","https://cdn.quasa.io/images/news/gP8kiumBPpJmQv6SMieXiX1tDetx43VwFfO1P4Ca.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/gP8kiumBPpJmQv6SMieXiX1tDetx43VwFfO1P4Ca.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/gP8kiumBPpJmQv6SMieXiX1tDetx43VwFfO1P4Ca.webp",41851,1,{"title":211,"description":212,"slug":213,"created_at":214,"publish_at":215,"formatted_created_at":216,"category":217,"links":218,"view_type":180,"video_url":87,"views":223,"likes":166,"lang":90,"comments_count":89,"is_pinned":99},"8 Logo Design Tips for Small Businesses","Your logo tells the story of your business and the values you stand for.","8-logo-design-tips-for-small-businesses","2021-12-04T21:59:52.000000Z","2025-05-05T03:30:00.000000Z","05.05.2025",{"title":15,"slug":16},{"image":219,"image_webp":220,"thumb":221,"thumb_webp":222},"https://cdn.quasa.io/images/news/Wbx2NtS1CnTupgoQbpFMGspJ5jm4uob2hDOq33r0.jpg","https://cdn.quasa.io/images/news/Wbx2NtS1CnTupgoQbpFMGspJ5jm4uob2hDOq33r0.webp","https://cdn.quasa.io/thumbs/news-thumb/images/news/Wbx2NtS1CnTupgoQbpFMGspJ5jm4uob2hDOq33r0.jpg","https://cdn.quasa.io/thumbs/news-thumb/images/news/Wbx2NtS1CnTupgoQbpFMGspJ5jm4uob2hDOq33r0.webp",40978,[225,226,227,228,229,230,231,232,233,234,235,236,237],{"title":23,"slug":24},{"title":47,"slug":48},{"title":55,"slug":56},{"title":43,"slug":44},{"title":51,"slug":52},{"title":31,"slug":32},{"title":35,"slug":36},{"title":27,"slug":28},{"title":19,"slug":20},{"title":15,"slug":16},{"title":58,"slug":63},{"title":11,"slug":12},{"title":65,"slug":66}]