Question: RTX 5090 Astral OC won't pull more than 450W. Why?

syykii

Hello, I just got the RTX 5090 Astral OC. I flipped the BIOS switch while the card was powered off, but when playing games it won't pull more than 430-450W. On the Heaven benchmark it pulls 480W max, yet in FurMark it uses the full 600W, which is strange. My monitor is 1440p and I've overclocked the card to 3000MHz. My PSU is a Corsair 1000W RMx or something like that, and I'm using the included adapter to power the card. My CPU is a Ryzen 7 9800X3D. Thanks.
 
My PSU is a Corsair 1000W RMx or something like that, and I'm using the included adapter to power the card.
Take the side panels off the case and inspect the stickered sides of the PSU. How old is the PSU?

I just got the RTX 5090 Astral OC and flipped the BIOS switch while keeping the card off
Which mode is the BIOS switch set to? What GPU were you using before this one? Did you use DDU before swapping GPUs?
 
My PSU is mounted on its side, so I can't really see the date without taking it out, but I bought it on 2023-07-16. I pushed the switch to the right. I had an RTX 4090 before. Yes, I used DDU, though not in Safe Mode: there's a bug where Safe Mode asks for my password and says it's incorrect, even though I use that same password daily to log into my PC.
 
Run a benchmark that uses the RT cores, tensor cores, and shaders together. Heaven is older and only exercises the shaders. A game might not use everything on the card at once.

Is the performance low compared to others with a 5090?
Everyone is testing it at 4K, but I don't have a 4K monitor yet, so I'm not sure. However, one person on YouTube stress-tested his 5090 in the Heaven benchmark and hit 600W. I copied his settings and his OC; mine pulls 480W max and scores higher at 4K, which doesn't really make sense.
 
Do you have Cyberpunk 2077? Run it maxed out with path tracing. That will pull 500W+ even at 1080p. Make sure you do not have any FPS limiters; uncapped FPS will let the card stretch its legs. As someone above said, make sure you are not on the Quiet vBIOS, because that pulls less power to stay quiet. The vBIOS switch should be flipped to the left for P Mode (Performance).
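To check whether the card is actually hitting its power cap while a game runs, you can poll `nvidia-smi`. Here is a minimal sketch that parses its CSV power query; the sample line and the 15W tolerance are illustrative assumptions, not readings from the card in this thread.

```python
# Sketch: decide whether a GPU is sitting at its board power limit,
# based on the output of:
#   nvidia-smi --query-gpu=power.draw,power.limit --format=csv,noheader,nounits

def parse_power(csv_line: str) -> tuple[float, float]:
    """Parse 'power.draw, power.limit' (both in watts) from one CSV line."""
    draw, limit = (float(field.strip()) for field in csv_line.split(","))
    return draw, limit

def at_power_limit(draw: float, limit: float, tolerance_w: float = 15.0) -> bool:
    """True if the card is within tolerance_w of its configured power limit."""
    return limit - draw <= tolerance_w

# Illustrative sample line (not a real capture from this card):
sample = "478.5, 600.0"
draw, limit = parse_power(sample)
print(at_power_limit(draw, limit))  # False: 478W against a 600W cap is not power-limited
```

If this prints False during a heavy game, the card is being held back by something other than its power limit (FPS cap, vBIOS mode, or a CPU-side limit).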
 
I'll give it a try. Are you sure it has to be on the left, not the right?

Fire up a game and set it up for supersampling. You can render any game at 4K using the Nvidia Control Panel; it then downsamples to your native resolution.
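For a quick sanity check on why supersampling at 4K pushes power draw up, compare the per-frame pixel counts:

```python
# 4K shades 2.25x the pixels of 1440p, so GPU load per frame scales up roughly
# in proportion (shading cost is not exactly linear in pixels, but it's close).
res_1440p = 2560 * 1440   # 3,686,400 pixels
res_4k = 3840 * 2160      # 8,294,400 pixels
print(res_4k / res_1440p)  # 2.25
```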
If it pulls 600W at 4K, does that mean I'm bottlenecked at 1440p?
 
Yes, but that does not mean we ignore what it is: a bottleneck. The bottleneck in PC gaming is almost always the CPU or the GPU. At lower resolutions the CPU is typically the bottleneck; at higher resolutions like 4K, it's the GPU.
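The CPU-vs-GPU trade-off above can be sketched as a toy model: each frame waits on whichever side is slower, and raising resolution scales the GPU's work while the CPU's per-frame work stays roughly constant. The millisecond figures below are illustrative, not measurements.

```python
# Toy bottleneck model: frame time is gated by the slower of the CPU's work
# (simulation, draw calls) and the GPU's work (rasterization, shading).
def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    return max(cpu_ms, gpu_ms)

# At 1440p the GPU finishes its smaller workload before the CPU does:
print(frame_time_ms(cpu_ms=6.0, gpu_ms=4.5))          # 6.0 -> CPU-bound
# At 4K the GPU workload scales with pixel count (2.25x here),
# while CPU work per frame stays about the same:
print(frame_time_ms(cpu_ms=6.0, gpu_ms=4.5 * 2.25))   # 10.125 -> GPU-bound
```

This is why a 5090 can sit well under its power limit at 1440p: the GPU spends part of each frame idle, waiting on the CPU.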
What if the game used only three cores, those cores were fully loaded, and the card was also fully loaded?

I think after about 30 years I don't need somebody to tell me what a limiting factor on a PC is.
"Limiting factor" is a much better term.
 
"Bottleneck" is a colloquial term widely used in gaming; I don't much care what it's called. If a CPU and a GPU are equally loaded (almost impossible), then the system has a bottleneck elsewhere, for instance RAM or storage speeds limiting the CPU as it prepares frames. If every resource the PC has at its disposal is equally loaded, yet the PC still has resources going unused, like extra cores, then the problem is the game not adequately taking advantage of the hardware.

I am not telling you anything you don't know; however, you contradicted my post, so I feel I have to explain my reasoning so as not to look like I'm giving false information.