• Evil_Shrubbery@lemm.ee
    link
    fedilink
    arrow-up
    5
    ·
    edit-2
    52 minutes ago

    Lmao what shitty title (about a game engine that works on phones).

    So no, but it also often takes years to develop a game, and even if the engine were unoptimised for the two most recent hardware generations, that would be fine too.

    • soulsource@discuss.tchncs.de
      link
      fedilink
      arrow-up
      1
      ·
      3 hours ago

      I would not say “lazy”.

      There are a lot of bold promises in Unreal Engine 5 advertising that get taken up by publishers and producers - and then end up baked into the game budgets…

      And then, near the end of the project, when it turns out that performance isn’t good because the advertised promises were a bit too bold, there is no money left for optimization…

    • FizzyOrange@programming.dev
      link
      fedilink
      arrow-up
      2
      arrow-down
      12
      ·
      1 day ago

      Not true. It takes advantage of hardware features that are available on consoles but not on PC. That isn’t laziness.

      • Rusty Shackleford@programming.dev
        link
        fedilink
        English
        arrow-up
        2
        ·
        6 hours ago

        My comment on a different post relates to this well:

        I think a lot of the plugin tooling for Unreal promotes bad practices with asset management, GPU optimization, and memory management. What I’m trying to say is that it allows shitty/lazy developers and asset designers to rely on overly expensive hardware to lift their unoptimized dogshit code, blueprints, models, and textures to acceptable modern fps/playability standards. This has been prevalent for a few years, but it’s especially egregious now: young designers with polygon and vertex counts that are out of control, extraneous surfaces and naked edges, uncompressed audio, unbaked lighting systems, memory leaks.

        In my own experimenting with Unreal, I’ve found the priority is the same as in developing DNNs and parametric CAD modelling applications for my day job: effective resource, memory, and parallelism management from the outset of a project is (or should be) axiomatic.

        I think Unreal 5 runs exceptionally well when that’s the case. A lot of the time, you can turn off all of the extra hardware-acceleration and AI frame-generation crap if your logic systems and assets are designed well.

        I know this is a bit of an “old man yells at cloud” rant, but if you code and make models like ass, of course your game is gonna turn out like ass. And then they turn around and say “tHe EnGiNe SuCkS”.

        No. Fuck you. You suck.

      • Mia@lemmy.blahaj.zone
        link
        fedilink
        arrow-up
        8
        ·
        20 hours ago

        Which? Because consoles just use AMD APUs which have the exact same hardware features as their current CPUs and GPUs. UE5 games run like crap on consoles too.

        • FizzyOrange@programming.dev
          link
          fedilink
          arrow-up
          3
          ·
          19 hours ago

          It says so in the article: hardware I/O controllers that handle compression. I guess this is related to DirectStorage, but it doesn’t seem like that takes advantage of dedicated hardware on PC (as far as I know, none exists), and apparently only a handful of games actually use it.

          They also have unified RAM (like Apple’s M-series laptops).

          • Mia@lemmy.blahaj.zone
            link
            fedilink
            arrow-up
            3
            ·
            edit-2
            12 hours ago

            All of which is completely irrelevant to why games run like crap. Those things have zero impact on a game’s framerate; they only affect asset loading and streaming, and even then they do pretty much nothing from what I can see.

            I’m not gonna say it’s just marketing, but it comes close imo. I personally benchmarked Ratchet and Clank: Rift Apart’s loading times on a PS5, an NVMe SSD, and a SATA SSD. Literally no difference, save for the SATA one being a fraction of a second slower. And that was one of the games that was supposed to showcase what that technology can do! (I know it doesn’t run on UE5, but it’s just an example.)

            UE5 runs like garbage on all platforms. You can load assets as fast as you want, but if the rendering pipeline is slow as hell, it doesn’t matter - games will still run like garbage regardless.

  • Björn Tantau@swg-empire.de
    link
    fedilink
    arrow-up
    24
    ·
    2 days ago

    Hellblade 2 looked and ran great on my Steam Deck. Fortnite must be running great or millions of kids would complain. So I posit that it is not the engine’s fault.

    • fibojoly@sh.itjust.works
      link
      fedilink
      arrow-up
      1
      ·
      8 hours ago

      I’d love to know why Fortnite crashes so damn much on my machine despite being so non-demanding a game. But yeah, it’s rarely the engine’s fault.

    • dormedas@lemmy.dormedas.com
      link
      fedilink
      arrow-up
      12
      ·
      2 days ago

      Yep, developers will optimize their game for their own wants and needs. It’s not the engine’s fault; it’s the developers’, for using techniques that don’t perform well on current hardware. Unreal 5 could run well, at launch, on then-current technology.