• Sickday@kbin.earth
    22 hours ago

    Most AI models need at least 24 but preferably 32.

    Where are you getting this information from? Most models under 16B parameters will run just fine with less than 24 GB of VRAM (rough back-of-the-envelope math below). This GitHub discussion thread for open-webui (a frontend for Ollama) has a decent reference for VRAM requirements.
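    For context, here is a minimal sketch of how parameter count and quantization translate into a VRAM estimate. The bytes-per-weight figures and the 1.2x overhead factor are assumptions for illustration, not numbers from the linked thread; real usage varies with context length, backend, and quantization scheme.

    ```python
    # Rough, back-of-the-envelope VRAM estimate for running a local LLM.
    # Assumption: weights dominate memory use, and a fixed ~1.2x overhead
    # covers KV cache and runtime buffers.

    BYTES_PER_PARAM = {
        "fp16":   2.0,  # unquantized half precision
        "q8_0":   1.0,  # ~8 bits per weight
        "q4_K_M": 0.6,  # ~4.5-5 bits per weight, a common Ollama default
    }

    def estimate_vram_gb(params_billion: float, quant: str = "q4_K_M",
                         overhead: float = 1.2) -> float:
        """Approximate VRAM in GB for a model of the given size and quant."""
        weight_gb = params_billion * BYTES_PER_PARAM[quant]
        return weight_gb * overhead

    if __name__ == "__main__":
        for size in (7, 13, 16, 34, 70):
            for quant in ("q4_K_M", "q8_0", "fp16"):
                print(f"{size:>3}B @ {quant:<7} ~ {estimate_vram_gb(size, quant):5.1f} GB")
    ```

    By this estimate, a 16B model at a 4-bit quant lands around 11-12 GB; you only cross 24 GB at fp16 or at much larger parameter counts.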

    • John Richard@lemmy.world
      15 hours ago

      I should have been more specific: I meant the home models that actually compete with paid ones in both accuracy & speed. Please don’t be one of those people who exaggerate & pretend they work just as well with much less VRAM. They simply don’t.