• weew@lemmy.ca · 2 days ago

    …I guess I really should just play my Steam backlog before upgrading anyways

  • 9point6@lemmy.world · 4 days ago (edited)

    Well, that’s pretty shitty. I assumed it could only be a CUDA blob at this point anyway, rather than any specific hardware, so why drop support?

    Edit: ah, so it’s 32-bit CUDA in general they’re killing off, which makes a bit more sense, as that probably does come down to hardware differences.

    Hopefully they open source it, at least.

    • JustEnoughDucks@feddit.nl · 2 days ago

      Hahahahaha Nvidia open sourcing anything? They literally fight tooth and nail against any form of open source every single chance they get. Only under a ton of pressure will they give any ground to open source.

      Nvidia is one of the more anti-open-source companies.

  • Stelus42@lemmy.ca · 3 days ago

    I thought my 3080 was an irresponsible splurge when I bought it, but every day I love that thing more and more.

    • Lesrid@lemm.ee · 3 days ago

      Shit, with the current 50 series pricing and availability, the 4090 I got myself for Christmas is looking responsible too. It doesn’t even need a 1000 watt PSU.

  • MeaanBeaan@lemmy.world · 3 days ago

    I was playing Arkham Knight last night on GeForce Now, and I could not for the life of me get the fancy fog and debris to work. Every time I’d turn them on, the game would tell me I needed to restart. Once I did, the settings would just revert, even though I had the option turned on to have GeForce Now save my game configs. At the time I thought it was a bug, since AFAIK the 4080s they’re using on their rigs support these features fine. Now I’m wondering if it was an intentional choice to disable those features on GeForce Now so as not to make the 50 series cards look bad.

  • Boomkop3@reddthat.com · 3 days ago

    And it’s not the first downgrade. I’ve noticed a decline over the generations of releases, going back to before the 900 series.

    • VindictiveJudge@lemmy.world · 3 days ago

      You could definitely drop in an old GPU just for PhysX; the driver still supports that, and it wouldn’t even need to be a good one. You could also go into the driver settings and make the CPU run PhysX if you have enough cores.

  • Gutek8134@lemmy.world · 4 days ago

    Ah, the classic:

    What’re you gonna use your $1000 GPU for? Locally hosting an LLM? Video editing? 3D graphics? …Running new games on the highest settings?

    Nah, I’m gonna replay this 10+ year old game.

      • Gutek8134@lemmy.world · 3 days ago

        Oh, I am in this group. I have a mid-range PC (Ryzen 5 3600, GTX 1660S) and still mostly play indies or 5+ year old games, because they’re (usually) patched and dirt cheap.

        • FeelzGoodMan420@eviltoast.org · 3 days ago

          Oh, I think the double sarcasm was missed in your previous comment. My bad, man. It sounded like you were making fun of people who play old games on high-end PCs.