• r4venw@sh.itjust.works · 4 points · 7 hours ago

      I don't want to speak for OP, but I think they meant it's not generating the search results using an LLM

      • faythofdragons@slrpnk.net · 1 point · 7 hours ago

        Maybe I just don’t know what “generating results” means. You query a search engine, and it generates results as a page of links. I don’t understand how generating a page of links is fundamentally different from generating a summation of the results?

        • r4venw@sh.itjust.works · 6 points · 7 hours ago

          It's a very different process. Having worked on search engines before, I can tell you that the word “generate” means something different in this context. In simple terms, it means matching your search query against a bunch of results, gathering links to those results, and then sending them to the user to be displayed
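The matching step described above can be sketched roughly like this. This is a toy inverted index in Python; the function name, index structure, and data are all made up for illustration and are not any real engine's code:

```python
# Toy sketch of a search engine "generating" results: match query terms
# against an inverted index, gather links, and return them for display.
# Everything here is illustrative, not how a production engine works.

def generate_results(query, index):
    """index maps lowercase terms to lists of (url, title) pairs."""
    matched, seen = [], set()
    for term in query.lower().split():
        for url, title in index.get(term, []):
            if url not in seen:  # de-duplicate pages matched by several terms
                seen.add(url)
                matched.append({"url": url, "title": title})
    return matched  # this list is what gets sent to the user as a page of links
```

A real engine also ranks these matches by relevance before returning them; the point is only that “generate” here means assembling links to existing pages, not producing new text.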

          • faythofdragons@slrpnk.net · 2 points · 7 hours ago

            then send them to the user to be displayed

            This is where my understanding breaks. Why would displaying it as a summary mean the backend process is no longer a search engine?

            • MudMan@fedia.io · 3 points · 6 hours ago

              The LLM is going over the search results, taking them as a prompt and then generating a summary of the results as an output.

              The search results are generated by the good old search engine, the “AI summary” option at the top is just doing the reading for you.

              And of course, if the answer isn't trivial, it's very likely generating an inaccurate or incorrect output from those inputs.

              But none of that changes how the underlying search engine works. It’s just doing additional work on the same results the same search engine generates.
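That two-step flow (unchanged search first, then an LLM pass over the same results) can be sketched as follows. `search` and `summarize` are invented placeholders for illustration, not DDG's actual code:

```python
# Hypothetical sketch of a results page with an AI summary bolted on.
# Step 1 is the classic search engine, unchanged; step 2 just does extra
# work on the same results. Both functions are made-up stand-ins.

def search(query):
    # stands in for the ordinary search engine: returns a page of links
    return [{"url": "https://example.org/a", "snippet": "..."}]

def summarize(prompt):
    # stands in for an LLM call; in practice this step can be inaccurate
    return f"Summary of {prompt.count('http')} result(s)."

def results_page(query):
    results = search(query)                      # step 1: generate results
    prompt = " ".join(r["url"] + " " + r["snippet"] for r in results)
    return {"results": results,
            "ai_summary": summarize(prompt)}     # step 2: read them for you
```

Note that `results` is computed before the summary even exists: removing the `ai_summary` line would leave the search results byte-for-byte identical.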

              EDIT: Just to clarify, DDG also has a “chat” service that, as far as I can tell, is just a UI overlay over whatever model you select. That works the same way as all the AI chatbots you can use online or host locally, and I presume it's not what we're talking about.

              • faythofdragons@slrpnk.net · 2 points · 6 hours ago

                I see, you're splitting the UI and the backend into two different things, and I'm seeing them as parts of a whole.

                • MudMan@fedia.io · 3 points · 6 hours ago

                  Well, yeah, there are multiple things feeding into the results page they generate for you, not just two. There are the search results; there's an algorithmic widget that shows different things (a calculator if you input some math, a translation box if you input a translation request, a summary of Wikipedia or IMDB if you search for a movie or a performer, that type of thing); and now there's a pop-up window with an LLM-generated summary of the search results.

                  Those are all different pieces. Your search results for “3 divided by 7” aren't different just because a calculator also pops up for you at the top of the page.
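The assembly described above might look like this, with every name invented for illustration: the widget and the search results are computed independently, so bolting one on never changes the other:

```python
# Illustrative sketch: the results page is assembled from independent
# pieces. All names are made up; no real engine works exactly this way.

def search(query):
    return [{"url": "https://example.org/a"}]  # placeholder search results

def calculator_widget(query):
    # algorithmic widget: only fires when the query looks like "X divided by Y"
    tokens = query.split()
    if len(tokens) == 4 and tokens[1:3] == ["divided", "by"]:
        try:
            return {"answer": float(tokens[0]) / float(tokens[3])}
        except ValueError:
            return None
    return None

def build_results_page(query):
    page = {"results": search(query)}  # the search results, same either way
    widget = calculator_widget(query)
    if widget is not None:
        page["widget"] = widget        # shown on top; results untouched
    return page
```

Here "3 divided by 7" gets a calculator box alongside its results, while "cats" gets none, and the results list is built the same way in both cases.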

                  • faythofdragons@slrpnk.net · 3 points · 6 hours ago

                    Yeah, for some reason I was thinking you were trying to say that bolting on widgets made it no longer a search engine.