Upgrade Your Monitor, Not Your GPU

I was in this exact situation: I had an old 1080p monitor and upgraded to a 27" 144 Hz 1440p one. It had better colors and viewing angles, and the text was crisper, but that's about it for me; I see no big difference when playing games. What I do see is that I suddenly need a lot more GPU power to drive the increased resolution and the higher frame rate. So, in the end, a better monitor will require a far better GPU: 1.78x the pixels at 2.4x the FPS works out to over 4x the GPU power for the combo. Since I don't experience a big benefit from higher FPS, and don't have unlimited money, I choose graphics quality over FPS.
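A rough sanity check of that math, as a minimal sketch in Python; it assumes GPU load scales linearly with pixels rendered per second, which real games only approximate (CPU limits, fixed per-frame costs, and upscalers all change the picture):

```python
# Back-of-the-envelope GPU-load multiplier for a monitor upgrade.
def gpu_load_multiplier(old_res, new_res, old_hz, new_hz):
    pixel_ratio = (new_res[0] * new_res[1]) / (old_res[0] * old_res[1])
    fps_ratio = new_hz / old_hz
    return pixel_ratio * fps_ratio

# 1080p @ 60 Hz -> 1440p @ 144 Hz: ~1.78x the pixels at 2.4x the frame rate
print(gpu_load_multiplier((1920, 1080), (2560, 1440), 60, 144))  # ~4.27
```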
 
I think the most important thing is the quality of the panel, mainly the colors and the image. Second, the response time. Third, the refresh rate; 75-144 Hz is fine. Even at 60 Hz, with a 4 ms response time you could get great immersion.

VA panels are garbage because of the image smearing. TN monitors are bad as well: limited viewing angles, low color accuracy, and poor contrast.
 
In 2018/2019 I upgraded to a 32" 4K@60Hz monitor, mostly for productivity, but thinking 4K gaming would pick up and 4K-capable cards would get cheaper.
Oh, how wrong I was.
Today Nvidia promotes 1080p@60Hz ray-tracing gaming.

Now I've "downgraded" to an ultrawide 1440p@165Hz.
 
Hold on a sec, this advice needs a nuance check! Upgrading a monitor can be awesome, but it's not a one-size-fits-all solution. Upgrading your GPU might be the way to go depending on your situation.
 
Reality check: did I just read the suggestion to stay on your old GPU and get a higher-resolution monitor?
Like 4K? Remind me how many old GPUs can run 4K at settings that aren't a nightmare...
 

The argument for a new, higher-resolution monitor on an older graphics card is mostly about better image scaling (more pixels to work with), and about gaining G-Sync/FreeSync/VRR if you don't already have it. I would personally say going from no VRR to having it is easily the best upgrade path you can get.

Also, with 4K you can do pixel-perfect 1080p if you set your GPU to handle the scaling, because it's an evenly divisible resolution: each 1080p pixel maps to an exact 2x2 block of panel pixels. 1440p is only divisible by 720p, and most people aren't willing to go *that* low anymore.
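A quick way to check which resolutions integer-scale cleanly on a given panel, as a minimal sketch; it assumes the GPU or monitor does plain integer scaling rather than bilinear stretching:

```python
# Which lower resolutions map to a whole number of panel pixels in both
# axes? Only those can be integer-scaled without blurring.
def integer_scales(native_w, native_h, candidates):
    return [(w, h) for (w, h) in candidates
            if native_w % w == 0 and native_h % h == 0]

lower_res = [(1920, 1080), (1280, 720)]
print(integer_scales(3840, 2160, lower_res))  # 4K fits both 1080p and 720p
print(integer_scales(2560, 1440, lower_res))  # 1440p only fits 720p
```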
 

My 6950 XT is doing just fine at 4K, and it's still available for $500. Cyberpunk was probably the only game where I had to lower some settings, and those had no discernible effect on graphics quality.
 
4080 Supers are $1K, not $1.2K. You also don't NEED one; a 6950 XT or anything in between can drive a 4K monitor.

I would not spend a lot on a graphics card right now, since even the current ones with DP 2.1 are not UHBR20. If you're getting a DP 2.1 monitor, you're pretty much buying a temporary card. Unless you don't care about DSC; in that case, just use the HDMI 2.1 output. If you want an uncompressed image, hopefully Nvidia will add DP 2.1 UHBR20 to the upcoming 5 series later in the year. It wouldn't make sense to spend top money and have only one piece of the chain be UHBR20. That's probably why only Gigabyte added DP 2.1; the rest will probably refresh next year to add it. With such low availability this year, this feels like a low-production test run to work out the kinks.
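For context on why UHBR20 matters, a rough bandwidth comparison as a minimal sketch; it counts active pixels only (real links also carry blanking overhead, so the numbers are optimistic), and the effective link rates are approximate:

```python
# Uncompressed video bandwidth vs. approximate link payload capacity.
def video_gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel / 1e9

UHBR20_GBPS = 77.4   # DP 2.1: 4 lanes x 20 Gbps, 128b/132b encoding
HDMI21_GBPS = 42.7   # HDMI 2.1 FRL: 48 Gbps raw, 16b/18b encoding

need = video_gbps(3840, 2160, 240, 30)  # 4K 240 Hz, 10-bit RGB
print(f"4K 240 Hz 10-bit: ~{need:.1f} Gbps uncompressed")  # ~59.7
print("fits DP 2.1 UHBR20:", need < UHBR20_GBPS)  # True
print("fits HDMI 2.1:", need < HDMI21_GBPS)       # False -> DSC needed
```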
 
Dropped by to say: I was this guy, with a 3080 and a multi-monitor 1080p/60Hz setup from several builds ago.

I since went to 1440p/144Hz on a Black Friday deal for $200 and couldn’t believe how much more fun I was having.

It encouraged me to save up and go OLED ultrawide, and that has been another whole level of color and immersion, but the $200 jump to 1440p/144Hz was honestly the bigger bang for the buck.

If you are sitting on a reasonable computer with an old monitor, do yourself a favor and follow the author’s advice.
 
Great advice for those who own an old mid-tier GPU. I'm also saving to upgrade my monitor instead of my GPU, since I have a 3070 Ti, which can handle 1440p quite well.
 
My 34" QD-OLED is the single best upgrade I've ever made. Games went from looking really good to flat out amazing. The visual quality of true HDR and infinite contrast cannot be overstated.
 
My previous monitor failed about 5 years ago, and Walmart had a 1080p LED monitor on Black Friday special. It was an ELEMENT with an off-white bottom bezel, which is annoying. What I regret most is the foggy, washed-out look in the corners, like when you squeeze an LCD/LED screen with your thumb. The viewing is bad, and worse if you look from the side. Performance mode helps a little, but it makes the monitor run really hot; you can seriously feel the heat.
 
Upgrading to a higher-resolution, higher-refresh-rate monitor while keeping an old GPU?
A little late for an April Fools' article, isn't it...
 
One monitor does tend to last multiple generations of GPUs. That said, I am still satisfied with my 4-year-old 48-inch CX OLED, which went from a 2080 Ti hybrid to a 3090 hybrid to now a 4090 hybrid, and potentially a 5090 if it doesn't burn in. I'll wait for the market to flood with competing DP 2.1 monitors. Although if LG comes out with a 48-inch MLA OLED, even at 144 Hz, I would get that instead (like their G lineup).
 
The only thing I could agree on is that monitors are cheap. Even the LG UltraGear 48GQ900-B fell as low as $594.99 with free shipping on LG's website (down from $1,499.99, a $905.00/60% saving, coupon HECA14701), but it's currently out of stock. Hopefully a foreshadowing of things to come.
 
Over the last couple of years I have transitioned from gaming monitors to TVs because of the ridiculous prices of gaming monitors; best choice ever for my wallet, and the visual quality is still great. $300 for a 43" 4K TV is a far better call than absurd 4K PC monitor prices.
 
A little opinionated, and obviously the advice varies greatly depending on your current monitor, but still maybe a nice wake-up call for someone who's forgotten to check out the monitor scene for a few years.

As someone who both games and works on my PC, I care a lot about non-gaming text clarity too, which I understand is still an issue with many (all?) OLED monitors.

"The sun will have a blinding brightness, street lights will look incredibly punchy on a dark street at night, and muzzle flashes will dazzle."
Yep. And that's why my HDR mode is off 95%+ of the time: it literally makes my eyes hurt after more than 15 minutes. I was playing one shooter where it all looked great, but then I realized I had stopped using grenades because it hurt too much. Maybe this feature is more for your TV than for the monitor two feet or less in front of you.
 
144 Hz TN monitor here :D It has better contrast than my 3 IPS monitors and my IPS Sony TV. It's hard to explain why, but it does: blacks are darker and colors pop a lot more. Obviously, 144 Hz at 1 ms (or 2 ms without overdrive) makes games look great in motion. However, as amazing as this is... (yes, I'm talking about TN lol), I still prefer to play on a big, slow, 60 Hz 40 ms giant 4K TV. I know newer TVs are down to 10 ms, and OLEDs are fast as hell. Sure, but the cheapest OLED here costs 1350, even the 42-inch one. Stores are robbing us hard, that and the EU tax; I've seen the same LG TVs in the US sell for like 700-800.

Paying anything more than 700 is too much for me. Also, I like how 4K makes icons and UI look far cleaner. It's not all about the pixels and sharpness of games; I like the text, buttons, spell effects, and character stats in games. All of that stuff looks horrible on a 1080p screen; I have compared it. It is true that once you see how sharp 4K is, it's hard to go back. I might be willing to go back to 30 FPS if it's stable and smooth, and 4K of course. Graphics over feeling good/smooth!

P.S. Yep, my TN has that viewing-angle problem too. Hard to avoid that with TN, especially on a 27-inch monitor. It's far harder to notice on my old 20-inch one; in fact, I can barely see the dimming. Anything bigger than 24... yeah, it shows. The top 20% of the screen is way darker than the middle. Still beats IPS glow any day of the week tho! ;p
 
I don't agree with the logic. The GPU market is awful, and the way to mitigate that is to wait longer between purchases. The monitor market is in an innovation phase where products keep getting better and cheaper; therefore, as long as your monitor works, you should wait for the stagnation period. You will get an excellent monitor for a great price.
 
Upgrading your monitor generally leads to upgrading your GPU; otherwise, features like 4K and/or a 240/360 Hz refresh rate are not going to be utilized. So I am not sure the headline is even factual.

What I do think is that many people are chasing high-end GPUs so they can enjoy RT at reasonable performance. However, I feel that upgrading to a monitor with good HDR support uplifts visuals more than some fancy RT. Of course, if you have the money to enjoy both HDR and RT, that's great. But for the likes of me, I had to choose one, and going with a good mini-LED monitor is more beneficial when it comes to visuals. And you don't really lose FPS using HDR, which allows for a more impactful presentation.
 
Is this not one of the biggest facepalm moments in TechSpot journalistic history? I know there's a passing mention of it, but any monitor upgrade is pretty well certain to push you into a GPU upgrade; it stands to reason.
 