• HazardousBanjo@lemmy.world · 14 hours ago

    When the collapse of Stack Overflow happens because of this shit, say goodbye forever to researching solutions to your coding difficulties.

    • Kaligalis@lemmy.world · 11 hours ago

      It’s been a while since I needed to visit Stack Overflow. When I need a coding question answered, I ask Claude Code. Just like back when the answers came from Stack Overflow, I still do the sanity checks and verification using my own natural neural network. AI is pretty useful as a coding assistant if you’re able to properly review the generated code.

  • Tiral@lemmy.world · 15 hours ago

    I feel bad because I use this a lot. It really isn’t fair to the websites, though.

    • Teppa@lemmy.world · 2 days ago (edited)

      Maybe it could actually fix Microslop’s crappy online documentation library; that’s pro-customer.

      Relink all the broken links, and fix the outdated information from patches they never bothered to update.

  • Just inside that door is a Home Alone level of ad boobytraps you have to endure before the shop owner smashes you in the shins with an email newsletter appeal. Only then can you actually shop for a bit before a pop-up ad spider falls on you from the rafters.

    • Raiderkev@lemmy.world · 2 days ago

      Yes, but surely Google’s long-term plan is to be the one placing those ad booby traps and taking the money from that shopkeep, but obviously not in an evil way, that’d be weird.

  • merc@sh.itjust.works · 2 days ago

    Google doesn’t try to stop you from visiting a website. It tries to answer your query directly, which may mean it’s no longer necessary to visit the website.

    A more realistic scenario is someone asking, “hey, what’s 20 ounces in grams?” Then there’s a “website” that wants to invite you in and tell you all about unit conversion, and show you tables for how many tonnes are in a ton, etc. Meanwhile “Google” just says “566.99”. It started doing that sort of thing back in 2012, long before the AI boom started. Many of those info cards (like unit conversions) don’t use LLMs and are actually really handy.
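
    For what it’s worth, that conversion figure is easy to sanity-check yourself. Here’s a quick Python sketch (the 28.349523125 g/oz factor is the exact avoirdupois-ounce definition):

    ```python
    # Sanity check on the "20 ounces in grams" figure quoted above.
    OUNCE_IN_GRAMS = 28.349523125  # avoirdupois ounce, exact by definition

    def ounces_to_grams(oz: float) -> float:
        return oz * OUNCE_IN_GRAMS

    print(round(ounces_to_grams(20), 2))  # 566.99 -- matches the info card
    ```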

    Having said that, yeah, it’s devastating to websites that were free to use and ad supported and depended on traffic to survive. And, because humans are thrifty, websites that weren’t free to use mostly disappeared a long time ago. I don’t know what the solution is. But, I don’t think it’s “prevent Google from answering your question if it is capable of doing so”.

    • RememberTheApollo_@lemmy.world · 2 days ago (edited)

      In your example, many browsers will do simple math or conversions in the address bar without AI, and Google gave that same answer prior to AI with a simple conversion box that appeared, as you said. It still does sometimes, depending on the question.

      More realistically it would be “How long is a Boeing 747?” Now an AI will give you a length range and offer that there are different 747 models manufactured in different years, etc.

      So instead of clicking on an ad-supported “aviation info” site like you say, odds are the asker will just take the provided summary and not go any further, or refine the question to a specific model that, again, an AI will probably answer. Ironically, it may even scrape the content from the very same “aviation info” site that would have received your click 5 years ago, but now Google gets the view and the site doesn’t.

  • coolfission@lemmy.world · 2 days ago

    Worst is when it uses phrases like “widely accepted” for open-ended questions I ask. Also, in my experience Gemini completely makes up stuff when it comes to configuring settings in DAWs and DaVinci Resolve.

  • slazer2au@lemmy.worldM · 2 days ago

    How about I summarise it? That way you don’t have to deal with a cookie popup, a newsletter popup, a sign-in-with-Google popup, a request to send notifications to your browser, a back button that doesn’t take you back to your previous page, and ads between every other paragraph.

    • Kaligalis@lemmy.world · 2 days ago

      This, and the newspaper writing style that endlessly beats around the bush, are the real reasons the mainstream accepts AI summaries. AI summaries only work because the Internet is a hellscape of intentionally bad UX, with site owners being hostile to their users.

      • merc@sh.itjust.works · 2 days ago

        Real newspaper writing style doesn’t beat around the bush.

        SEO-optimized writing style does beat around the bush, because the writers have to “organically” mention every keyword that might bring someone to the page. They also need to make the article longer so there are more places to insert ads.

    • jballs@sh.itjust.works · 2 days ago

      My wife works at a library. People constantly come in asking to use the library fax machine, because Google’s AI says they have one.

      They don’t have one. Their website says they don’t have one. But LLMs have determined it’s plausible for libraries to have a fax machine, so Google tells people that they have one.

      You’d be surprised at the number of people who can’t accept that people working at the library know more about the library than Google.

      • BozeKnoflook@lemmy.world · 2 days ago

        I’ve had this experience myself. I’m an American living in the Netherlands, and sometimes I just don’t know the name of the thing I need, nor where to buy one. LLM bots are fine for the translation part, but they make wild assumptions, like telling me I can buy a kitchen strainer at the hardware store, or food spices at a place called Kruidvat, which translates to basically “spice-bucket” but is actually most like CVS without the pharmacy and doesn’t sell any food besides some candy and chips.

        It’s hilarious how quickly these bots can swing from super useful to actually harmful to trust.