• RageAgainstTheRich@lemmy.world · 24 hours ago

    It says 16 GB of VRAM in the first line of the article. My 8 GB kills me. It's a beast of a card, but as soon as I go over the VRAM limit, it slows to a crawl.

    • John Richard@lemmy.world · 23 hours ago

      Their top-tier 7900 XTX had 24 GB. Most AI models need at least 24 GB, but preferably 32. Guess they don't feel the need to try when NVIDIA isn't either, even though adding more VRAM wouldn't be very expensive.
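      For scale, here's a rough weights-only estimate (a sketch: it assumes unquantized fp16 weights at 2 bytes per parameter and ignores the KV cache and runtime buffers, so real usage runs higher):

      ```python
      # Weights-only VRAM for unquantized (fp16) models: 2 bytes per parameter.
      # KV cache and runtime buffers are ignored, so actual usage is higher.
      for params_b in (7, 13, 30):
          gb = params_b * 2  # params (billions) * 2 bytes/param = GB of weights
          print(f"{params_b}B model @ fp16: ~{gb} GB of weights")
      # 7B ~14 GB, 13B ~26 GB, 30B ~60 GB
      ```

      A 13B model at fp16 already needs ~26 GB for weights alone, which is roughly where the "at least 24, preferably 32" figure comes from when nothing is quantized.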

      • Sickday@kbin.earth · 22 hours ago

        > Most AI models need at least 24 but preferably 32.

        Where are you getting this information from? Most models with fewer than 16B parameters will run just fine in less than 24 GB of VRAM. This GitHub discussion thread for open-webui (a frontend for Ollama) has a decent reference for VRAM requirements.
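        And once you quantize, the picture changes completely. Here's a sketch of the same arithmetic with quantized weights; the bits-per-parameter figures and the 1.2x overhead factor for KV cache and buffers are rough assumptions, and estimate_vram_gb is an illustrative helper, not something from Ollama or open-webui:

        ```python
        # Rough VRAM estimate with quantized weights, as llama.cpp-style
        # runners (which Ollama wraps) commonly use. The 1.2x overhead
        # factor for KV cache and buffers is a guess, not a measured figure.
        def estimate_vram_gb(params_billions: float, bits_per_param: float,
                             overhead: float = 1.2) -> float:
            # params (billions) * bits/8 gives gigabytes of weights
            return params_billions * bits_per_param / 8 * overhead

        for bits, label in ((16, "fp16"), (8, "q8_0"), (4, "q4_K_M")):
            print(f"16B @ {label}: ~{estimate_vram_gb(16, bits):.1f} GB")
        # 16B @ 4-bit lands around 10 GB, comfortably under 24 GB
        ```

        At 4-bit, even a 16B model fits in roughly 10 GB, which is why sub-16B models run fine well under 24 GB.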

        • John Richard@lemmy.world · edited · 15 hours ago

          I should have been more specific: I meant the home models that actually compete with paid ones in both accuracy & speed. Please don't be one of those people who exaggerate & pretend it works just as well with much less. It simply doesn't.