Technology question about screen resolutions

Preach

Well-Known Member
#1
When you run a game (in full-screen mode) at a resolution different from your Windows desktop resolution and you ALT-TAB between the two, the resolution actually changes as you ALT-TAB, right? I'm trying to figure out whether the Windows resolution in any way affects the performance of an application running at a different resolution - for example, whether a game at 800x600, run in Windows at 1280x1024, is somehow a downscaled version of the latter.
 

Prize Gotti

Boots N Cats
Staff member
#2
The whole graphics card switches resolution. The game switches the setting as you load/exit it, so while you have a game running full screen at 800x600, so is everything else behind it while you play.
 

Preach

Well-Known Member
#3
So if running in fullscreen mode, theoretically the screen has to change resolution every time you alt-tab, unless your game resolution is the same as the Windows resolution?

Another related question: does running at a screen's native resolution somehow affect performance? I'm ignorant of the specifics of the technology, but I can imagine producing the native resolution is somehow less of a strain than upscaling/downscaling a lower or higher resolution.
 

masta247

Well-Known Member
Staff member
#4
Original question - it depends on how your graphics card drivers are set to act. Usually, if you set a specific resolution in your game, the graphics card switches to that resolution and displays everything in it; if you alt-tab to the desktop, the resolution will change back to the desktop resolution - or not, depending on how your drivers react. Some will only go back to the original desktop resolution after you exit the game.

Native resolution is the resolution of your LCD screen; it has nothing to do with your graphics card, which doesn't care what resolution you'd like your games displayed in unless it's something too big to handle (too many objects to render at a specific time / not enough texture memory).
As you probably know, LCDs have a fixed number of pixels. You SHOULD play at your native resolution because it looks much better - every single pixel your graphics card sends to your LCD monitor gets displayed as it should by the corresponding pixel on the LCD panel. If you launch a game at a different resolution, your screen tries to adjust, but considering that it can't divide a pixel into smaller parts (duh), the image will look bad and not sharp. It doesn't force your graphics card to work harder, though. It's just your display trying to work out how to show the different signal your GPU sends; the GPU doesn't give a shit because it did its job, which is to send the signal you asked for. Actually, a smaller resolution means fewer objects to render, which means less work for your graphics card - but it will look like shit because of how your LCD acts.
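The "smaller resolution means less work" point is easy to put in numbers - the GPU's per-frame pixel workload scales with the pixel count. A quick Python sketch, using the two resolutions from this thread as examples:

```python
# Pixel counts for the two resolutions discussed in this thread.
# Fewer pixels rendered per frame means less work for the GPU.
def pixel_count(width, height):
    return width * height

native = pixel_count(1280, 1024)  # desktop resolution
game = pixel_count(800, 600)      # in-game resolution

print(native, game)
print(f"800x600 renders only {game / native:.0%} of the pixels of 1280x1024")
```

So the game at 800x600 pushes roughly a third of the pixels the desktop resolution would, even though it ends up looking worse on a 1280x1024 panel.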
Back in the CRT days it didn't matter: a smaller resolution just meant bigger pixels, because a CRT doesn't have fixed pixels. An LCD does. That's why, if you want your image to look fine and sharp, you should display it at your LCD's native resolution. The second acceptable resolution (sharp but pixelated) would be a quarter of your native resolution - half in each dimension - so that every 2x2 block of panel pixels works as a single pixel.
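The "quarter of native" case works because each game pixel maps to a whole number of panel pixels. A small Python sketch of that check (the helper name `integer_scale` is just made up for illustration):

```python
# A resolution stays sharp (just blocky) on an LCD when each game pixel
# covers a whole, equal number of panel pixels in both dimensions.
def integer_scale(native, target):
    nw, nh = native
    tw, th = target
    return nw % tw == 0 and nh % th == 0 and nw // tw == nh // th

native = (1280, 1024)
print(integer_scale(native, (640, 512)))  # each game pixel = a 2x2 block of panel pixels
print(integer_scale(native, (800, 600)))  # 1.6x horizontally - pixels can't line up
```

800x600 on a 1280x1024 panel fails the check, which is exactly the blurry case described above.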

Now, you can't display a higher resolution than your screen's native resolution - it wouldn't make any sense and would only force your graphics card to work harder. You can't display more pixels than your LCD physically has.
 

masta247

Well-Known Member
Staff member
#5
This is the result of displaying a non-native (lower than native) resolution on your LCD:

[image: screenshot of a lower-than-native resolution displayed on an LCD]
And this is why it happens:

[image: diagram of a low-resolution signal mapped onto a higher-resolution pixel grid]
As you can see, the monitor gets a lower-resolution signal that it tries to display on a higher-resolution screen. Since it can't divide a pixel, each panel pixel is either displaying something or not. With a low-resolution signal - for example, 1 pixel of an 800x600 image on a natively 1280x1024 screen corresponds to more than 1 panel pixel, but fewer than 4 - your screen will try to adjust it to its own pixel grid, often lighting 2/3/4 pixels but with smaller intensity wherever only part of a pixel should be lit. That makes the artifacts less obvious, but it also makes the image look blurry rather than sharp.
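That intensity-spreading can be sketched in a few lines of Python. This simulates linearly upscaling one row of pixels from 800 to 1280 columns (a 1.6x stretch, as in the example above) - a single bright pixel bleeds into its neighbours at reduced intensity, which is the blur described:

```python
# Sketch: linear upscaling of one row of pixel intensities.
# A bright source pixel gets spread across several output pixels
# at partial intensity, instead of mapping to exactly one.
def upscale_linear(row, out_len):
    scale = (len(row) - 1) / (out_len - 1)
    out = []
    for i in range(out_len):
        pos = i * scale           # position in the source row
        left = int(pos)
        right = min(left + 1, len(row) - 1)
        frac = pos - left
        # blend the two nearest source pixels by distance
        out.append(row[left] * (1 - frac) + row[right] * frac)
    return out

# One bright pixel (255) among dark ones, stretched 5 -> 8 columns:
row = [0, 0, 255, 0, 0]
print(upscale_linear(row, 8))
```

In the output, the single 255 has turned into several values between 0 and 255 - partial-intensity pixels either side of where the bright pixel "should" be.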
Like this, native vs slightly smaller than native:

[image: side-by-side comparison of text at native resolution and at a slightly lower resolution]
While it's not as obvious with text, without this blending you'd basically see a mosaic on curved lines.
 

Preach

Well-Known Member
#6
About the alt-tabbing: I asked because I currently use a PC that's a bit slow, so I'm trying to min-max performance. Your explanation confirms what I needed to know, so thanks. As for native resolution, maybe I was tired, because when I think about it I kinda already knew. I guess I became unsure about whether it was the monitor itself or the video card that was "running" the LCD cells or whatever. But thanks again. Now go to my new thread I'm about to make and give me more advice pls :-D
 
