The cards aren’t affected*; it’s the custom cables people are using.
I wonder why he didn’t do a temp test without the extensions, using just Nvidia’s adapter? Those parts aren’t hot and haven’t caused any issues in any reports so far. Every failure has ALWAYS involved a custom extra cable. Increasing the length of a cable increases its resistance, and resistance turns current into heat, it’s the basics of electricity. So yeah… increasing cable length and adding extra connections means more power is wasted as heat in the cable itself, and the system has to draw more to compensate…… it’s not surprising it’s only happening on systems with the extra cable length and connections that cause that loss.
Also, the headline is misleading: he’s blaming the cards, when the only pieces that have ever melted are custom, non-approved parts.
Even in that scenario, using custom cables would still lead to cables melting.
It comes from increasing the cable length, which adds resistance, so more power is dissipated as heat along the run while the device still pulls the amount it needs. It already happens with 8-pin systems as well; this isn’t unique to this power-delivery design.
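To put a rough number on the "longer cable = more heat" point, here's a quick back-of-the-envelope I²R estimate. All the figures (16 AWG wire, a 0.3 m extension, ~5 mΩ per extra connector contact, 600 W split evenly over 6 power pins) are illustrative assumptions, not measured values from any actual cable:

```python
# Rough I^2*R sketch: extra heat added per pin by a cable extension.
# Assumed numbers only -- real cables and connectors vary.

RHO_CU = 1.68e-8      # resistivity of copper, ohm*m
AWG16_AREA = 1.31e-6  # 16 AWG conductor cross-section, m^2 (assumed gauge)

def extra_loss_w(current_a, length_m, area_m2=AWG16_AREA, contact_r=0.0):
    """Heat (W) dissipated in one added conductor plus one extra contact."""
    r_wire = RHO_CU * length_m / area_m2   # resistance of the added wire
    return current_a ** 2 * (r_wire + contact_r)

# 600 W at 12 V over 6 power pins -> ~8.33 A per pin (assumed even split)
i_per_pin = 600 / 12 / 6

# A 0.3 m extension adds wire resistance plus one more mating point,
# assuming ~5 mOhm of contact resistance at that extra connector.
extra = extra_loss_w(i_per_pin, 0.3, contact_r=0.005)
print(f"~{extra:.2f} W extra heat per pin")  # roughly 0.6 W per pin
```

Fractions of a watt per pin sounds small, but it's concentrated at the connector contacts, which is exactly where the melting shows up.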
Using too many extension cords has the exact same effect… it’s like blaming the heater because the cords melted. But no, let’s not blame physics, let’s blame the GPU (heater) -.-