Gaming 900p Vs 1080p Screen: A Detailed Review of the Differences and Similarities
- lotusticas30111d2
- Aug 18, 2023
- 7 min read
You should keep in mind that with 900p gaming on 1080p monitors, you might have to sacrifice screen clarity a little. Meanwhile, previous-gen consoles work equally well on a 900p display.
On the other hand, despite being the halfway point between 1080p and 720p, 900p never really became the industry standard for content producers or monitor and TV manufacturers. Most YouTube videos, shows, and movies are made to be watched in 4K, 1080p, or 720p, but not in 900p.
This is important to understand, since it puts into perspective how much better 1080p is. With 1,080 rows of pixels, and roughly 1.78 times as many columns as rows (the 16:9 aspect ratio), there are 1,920 columns of pixels on the screen. Multiply the number of rows and columns together and you get a total of 2,073,600 pixels.
With all of that out of the way, the question remains: is 1080p better than 900p? The differences between the two are easiest to see by looking at their typical uses.
900p is a solid option if you want to live-stream your gaming sessions, play everyday computer games, or play games on older consoles. This display resolution requires less bandwidth and processing power, and such displays are often cheaper than 1080p screens, albeit a little harder to find.
The same math applies to any resolution in the 16:9 aspect ratio. If 1080p means 1,080 rows and 1,920 columns for a total of 2,073,600 pixels, then 900p means 900 rows and 1,600 columns for a total of 1,440,000 pixels.
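To make that arithmetic concrete, here's a minimal Python sketch that derives the column count and total pixel count for any 16:9 vertical resolution; the three resolutions listed are just the ones discussed above.

```python
# Columns in a 16:9 image are rows * 16/9; total pixels = rows * columns.
for rows in (720, 900, 1080):
    cols = round(rows * 16 / 9)       # 1280, 1600, 1920
    total = rows * cols               # pixels the GPU must fill per frame
    print(f"{rows}p: {cols} x {rows} = {total:,} pixels")
```

Running it confirms the totals above: 921,600 pixels at 720p, 1,440,000 at 900p, and 2,073,600 at 1080p, so 900p asks the GPU to fill only about 69% as many pixels as 1080p.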
Hello, I currently own an MSI GTX 960 and an old Samsung 1600 x 900 monitor. My question is: how much of an FPS difference do I gain since I'm not even gaming at 1080p, and how bad/blurry will the image be because it's not 1080p? Is there a big difference in either?
The crispness of the image depends on both the pixel count and the screen size. A 46" 1080p TV will be a lot less crisp than a 480p phone viewed from the same distance. You may be able to use the Nvidia Control Panel to force your monitor to run at 1080p (only virtually; you won't magically get more resolution) and decide for yourself whether the framerate hit is tolerable.
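That crispness can be put in numbers as pixels per inch (PPI): the diagonal pixel count divided by the diagonal screen size. Below is a small Python sketch; the 4.5-inch 854x480 phone is an assumed example (the original post doesn't specify one), while the 46-inch 1080p TV matches the comparison above.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution over diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'46" 1080p TV  : {ppi(1920, 1080, 46.0):.0f} PPI')  # ~48 PPI
print(f'4.5" 480p phone: {ppi(854, 480, 4.5):.0f} PPI')    # ~218 PPI
```

Viewed from the same distance, the phone packs over four times the pixel density, which is why it looks so much crisper despite the far lower resolution.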
Hmmm, in that case I should get a cheap 1080p monitor in the coming months, perhaps for Christmas. The best time would be now, while cheap $70 ones are going around, but I still need to get the rest of my system, so that's not really going to work. I thought I would be getting at least 10 more FPS. Oh well; going to 1080p will get me a lot more on screen, so I'll upgrade the monitor first once my system is built.
Fairly often we hear from developers, publishers, members of the press, or even just a friend a block down the road that it's impossible to distinguish between 1080p, 900p, or 720p resolutions under certain conditions, normally depending on the size of the screen and/or the distance from it.
We picked seven PC games that are a perfect testbed for this kind of experiment, as they let us change resolution at will. We grabbed a screenshot of each in 720p, 900p, and 1080p, with every other detail setting maxed to ensure parity of conditions. Finally, we resized the 720p and 900p screenshots to 1080p to simulate the upscaling done by your console of choice.
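For anyone who wants to reproduce that methodology, here is a rough sketch using the Pillow imaging library; the filenames are hypothetical, and bilinear resampling is only a stand-in for whatever scaler a given console actually uses.

```python
from PIL import Image  # pip install Pillow

# Hypothetical capture filenames; substitute your own screenshots.
for name in ("shot_720p.png", "shot_900p.png"):
    img = Image.open(name)
    upscaled = img.resize((1920, 1080), Image.BILINEAR)
    upscaled.save(name.replace(".png", "_1080p.png"))
```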
As for daily work on a computer, most of us do office or school tasks, spending the day in spreadsheets and doc files. Whether your resolution is 720p, 900p, or 1080p, those spreadsheets, docs, and emails will look much the same, so a 900p monitor is not a bad choice compared to a 1080p display.
Most modern consoles can render games in 4K rather than 1080p; the PlayStation 5, for example, gives you the option to play in 4K. If an older console struggles to play games in 1080p, try setting the resolution to 900p instead.
You can play games at higher resolutions, but you need more RAM and a high-end CPU and GPU; it takes real horsepower to get the maximum framerate out of a game at 1080p. Older rigs and PCs with average components will struggle to deliver high FPS, so switching to 900p can give you higher FPS and smoother gameplay.
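As a back-of-the-envelope estimate, assuming GPU cost scales linearly with pixel count (an idealized upper bound; real games are rarely purely resolution-bound, and the 45fps starting point here is hypothetical):

```python
pixels_1080p = 1920 * 1080          # 2,073,600
pixels_900p = 1600 * 900            # 1,440,000
scale = pixels_1080p / pixels_900p  # ~1.44x fewer pixels to fill

fps_at_1080p = 45                   # hypothetical measured framerate
print(f"Best-case 900p estimate: {fps_at_1080p * scale:.0f} FPS")  # ~65
```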
It can be hard to tell a 900p and a 1080p monitor apart in a side-by-side comparison, but there is a slight difference: the 1080p picture will look a little sharper and more pleasing.
Yes, you can play games in 900p, but the visuals won't be crystal clear. You may get higher FPS by dropping the resolution to 900p, but you can expect somewhat blurry images while gaming.
CD Projekt RED recently held a preview event for The Witcher 3: Wild Hunt, inviting gaming publications and online reporters to test the current build of the game running on the latest PlayStation 4 and Xbox One consoles and on PC. According to details shared by the German video game magazine GameStar, the highly anticipated RPG ran at full HD 1080p resolution on the PlayStation 4 and at 900p resolution on the Xbox One during the event. After previewing all three versions of the game, the magazine gave its final verdict as: "PC > PS4 > Xbox One."
"On the Xbox One, Geralt also hunts monsters at 30fps, but only in 900p, and the textures look slightly more washed out than on the PS4. However, by CD Projekt's own account, the Xbox version is still being polished; it may yet be pushed up to 1080p, but nothing is promised. As it stands, the PS4 image is the sharper one, with stronger colors."
High-resolution monitors remain relatively rare (and are often prohibitively expensive). This is also true in the laptop gaming space, where internal hardware has come down in price but high-resolution panels are still extremely pricey, meaning a lot of manufacturers are turning out powerful portable machines paired with 1080p displays.
While you could of course simply scale up 720p to fill a 4K screen, the results often aren't flattering. Games at this resolution tend to look blurry and soft, with the scaling tech to preserve sharpness absent on many TVs. 1080p and above content fares better, so that's what we'll be targeting here - at a minimum, around double the pixels of the Steam Deck's internal display. A true native 4K is going to elude us except in simple titles, but we should be able to push image quality quite a bit regardless.
First up, we're going to be looking at some older and less demanding games - seventh-generation console titles are often a good fit thanks to meagre performance demands and solid gamepad support. Half-Life 2 is a good example, running at 4K 60fps at max settings without MSAA. Similarly, Deus Ex: Human Revolution hits 1440p60 just fine at medium settings, where image quality is reasonable, performance is solid, and the artwork holds up - and you can even go for 4K 30 if you prefer. Valkyria Chronicles and Dishonored both perform in a similar range at default settings at 1440p, though framerate dips may prompt you to opt for 1080p instead for a better 60fps lock. Both titles hold up perfectly fine, though, and even compare favorably to their eighth-gen console ports - a big win for the Deck. Other games of a similar vintage fare worse, such as Alan Wake, which requires 900p to hit 60fps, and Mass Effect Legendary Edition, which is probably best played on Deck at 1080p30 - equal with PS4 and Xbox One, but not ideal for a 4K TV.
Finally, and perhaps most interestingly, we have games that use second-gen reconstruction techniques with aggressive temporal upsampling to produce higher image detail, namely Unreal's TSR and AMD's FSR 2.0. God of War has an implementation of AMD's new upsampling tech, but the results are a bit mixed. Image quality in static or slow-moving areas of the screen is good and looks similar to 1080p, despite rendering with less than half the pixels internally. The downside is that the image is covered in popping and fizzling disocclusion artifacts when Kratos moves quickly or stops obscuring a screen element, and artifacts also crop up in hair and particle effects. 1080p 30fps is just about doable with FSR 2.0 on balanced mode, but ultimately I preferred the cleaner presentation of a lower resolution.
So at least in these titles, the results are somewhat mixed. God of War's FSR 2.0 reconstruction isn't quite good enough to really deliver a convincing 1080p picture, while Ghostwire is too demanding to allow us to target a 1080p output in the first place, though its reconstruction is very good. I would have loved to have shown off Deathloop as well, but that title has some long-standing stability issues on the Steam Deck and currently fails to load past the title screen for me.
However, updates to aid docked play are arriving regularly. For example, it was originally impossible to run games in SteamOS's gaming mode at resolutions higher than 1280x800, even when connected to a 1080p or 4K display. After a June update, however, it's now possible to set the display resolution to anything between 640x400 and full 4K, although this applies to both portable and docked play and may need to be changed per title, which I had to do for our testing.