
1080p 28 inch monitor or 1440p 28incher?


  • 1080p 28 inch monitor or 1440p 28incher?

    I want a bigger PC monitor, around the 28 inch size, but I need help deciding whether to stick with 1080p or shell out the money for 1440p.

    Give me your opinions (by which I mean: Belimawr, tell me what to do, oh tech guru).

  • #2
    1440 will give a crisper picture, assuming you have the graphics card to run comfortably at 1440.

    other than that it comes down to the quality of the monitor. if it is a choice between an IPS 1080p screen and a TN edge-lit LED 1440 screen, then the 1080 will be the better of the two.

    the same obviously being true if your graphics card can only run comfortably at 1080, as you always want to run a monitor at its native/max res.

    also I would try to make sure it is a 16:9 aspect ratio. at 1080 or 1440 it should be, but some firms still like to push 16:10, which can cause black bars with some ports.

    so really IPS > TN. also a higher refresh rate won't hurt: even if your frame rate can't match the hz, it will reduce tearing when not using Vsync. response time (the ms figure) isn't a massive selling point either; a 60hz screen only actually needs a response time of about 16ms, a 120hz screen only needs about 8ms, and you need to get to 500hz before you actually need 2ms.
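    if you want to see where those ms numbers come from, here's a quick Python sketch (nothing monitor specific, just the arithmetic: a panel only has to settle within one refresh interval):

        # one refresh interval in ms for each refresh rate
        for hz in (60, 120, 500):
            print(f"{hz}hz -> one refresh every {1000 / hz:.1f}ms")
        # 60hz -> 16.7ms, 120hz -> 8.3ms, 500hz -> 2.0ms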

    the other problem with some screens is input lag, but the only way you will find that out is in reviews.



    • #3
      Definitely go for 1440p. You'll only be wanting to upgrade a year or two down the line if you don't get one now. You get much more desktop real estate, a nice crisp picture as Beli said, and at 28 inches 1080p will start to look pixelated at the close distances we sit from our monitors anyway. But yeah, make sure it's an IPS panel or one of its derivatives like PLS.
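      if you want to put numbers on the pixelation, here's a rough Python sketch of pixel density (assuming the 28 inches is the diagonal, as it normally is):

          import math

          def ppi(w_px, h_px, diag_in):
              # pixels per inch along the diagonal
              return math.hypot(w_px, h_px) / diag_in

          print(ppi(1920, 1080, 28))  # ~78.7 ppi - coarse at desk distance
          print(ppi(2560, 1440, 28))  # ~104.9 ppi - noticeably crisper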

      I disagree about 16:9 over 16:10. The latter is a much nicer aspect for desktops, but unfortunately 16:10 monitors are very uncommon these days given the prevalence of 16:9 HDTVs.

      I have an Asus PB278Q which I've been very happy with for the last couple of years and would recommend, and you can get one for about £400 now. But I'm sure there are newer models worth looking into as well.

      tftcentral is the place to go if you want really in-depth reviews.

      It's 2015, there's no point buying a 1080p monitor.



      • #4
        you say there is no reason for a 1080 screen, but I would easily take a good quality Gsync 1080 screen over something bought just for the 1440 (obviously a Gsync 1440 screen would cost a fortune and need more than one high-end card).



        • #5
          Thanks for the replies.

          I'm leaning towards 1440 now.

          I have some questions though. What are Gsync and Vsync, and how important are they? What do IPS and TN mean? Is 120Hz noticeably better than 60Hz? Why is refresh rate important?

          My graphics card is also a GeForce GTX 660
          Last edited by Daniel; 08-04-15, 18:13.



          • #6
            Vsync is vertical sync: it stops tearing by locking the frame rate to the refresh rate of the monitor. Gsync does the same job, but the screen comes with a chip that talks to newer Nvidia cards and syncs on the fly, making sure a full frame is always shown so you don't get tearing; it is a lot more accurate as it is an entirely hardware solution that keeps the signal intact.
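            if it helps, here's a rough Python sketch of what vsync is doing with frame timing (purely an illustration of the idea, not how any driver is actually written):

                import math

                REFRESH_HZ = 60
                INTERVAL = 1.0 / REFRESH_HZ  # ~16.7ms between refreshes

                def display_time(frame_ready_s, vsync=True):
                    # without vsync the frame is swapped the moment it's ready,
                    # which can land mid-scan and show two half frames (tearing);
                    # with vsync the swap waits for the next refresh boundary
                    if not vsync:
                        return frame_ready_s
                    return math.ceil(frame_ready_s / INTERVAL) * INTERVAL

                print(display_time(0.020))         # ~0.0333 - held to the next refresh
                print(display_time(0.020, False))  # 0.020 - shown immediately, may tear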

            as for IPS and TN, they are just two different types of LCD panel. TN is the standard panel for most screens, but IPS can reproduce colour much better and has much larger viewing angles before you get colour distortion, so if you can get an IPS it will look better than a TN panel of similar quality. then there is PLS: it is generally faster than IPS, and while it is still considerably better than a TN panel, the colours aren't as rich as an IPS panel. (obviously a shoddy IPS will be worse than a well-made TN.)

            on refresh rates it's somewhat dealer's choice. other than a placebo effect (more on that at the end, as I found it funny) your eyes will never see the difference; the real benefit is that when you are running at over 60fps, a 120hz screen can take up to 120 frames before it really starts to show tearing.

            on the GTX660: while it is still a fairly good card, at 1440p you may see low frame rates. if your motherboard supports it and you can pick up a second one dirt cheap, it would probably be worth it if you go the 1440p route. even a 2gig card will be pushed on some newer games, as the higher the res the more VRAM you generally need; it's why some games are starting to claim to need 3-4gig of VRAM.
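            here's a rough back-of-envelope for why VRAM demand climbs with resolution (a Python sketch; real games keep many buffers and textures on top of this, so treat the numbers as illustrative):

                def buffer_mb(w, h, bytes_per_px=4):
                    # one 32-bit colour buffer; games hold several of these
                    return w * h * bytes_per_px / 1024**2

                print(buffer_mb(1920, 1080))  # ~7.9 MB per buffer at 1080p
                print(buffer_mb(2560, 1440))  # ~14.1 MB per buffer at 1440p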


            anyway, on the refresh rate placebo: a friend of mine was insistent that 120hz screens were better and crisper, so I made some test programs that lied. in the first two tests I ran the screen at 60hz and 120hz with Vsync on and the frame rate counter off; he struggled to pick the difference and in the end chose the 60 as the faster one. in the third and fourth tests I played the exact same thing again with a faked frame counter that showed the 60hz test as 120fps and the 120hz test as 60fps; again he picked the 60hz screen because he thought it was faster. then I told him what I had done and said I would run a fair test, so he went into the final round sceptical that I was conning him again, but this time the numbers were correct. he played both speeds three times and was still convinced I must have faked the numbers and that the 60hz test was really the 120hz one.

            the fact is our eyes only actually see about 25fps; the reason it seems like we see more is that our brains fill in the gaps. it's why some people will always have terrible hand-eye coordination: their brain is just incapable of guessing where an object will be, as what we see is actually a fraction of a second behind what is really happening.

            so if you don't want to use Vsync I would go for 120hz, as it lets you have twice as many frames before tearing starts to crop up; but in a true blind test pretty much no one can pick the difference between the two, it's just pot luck. the thing that confuses some people is that they go from a crap screen that ghosts to a better quality 120hz screen that doesn't, which gives the impression of a crisper picture; ultimately a good quality 60hz screen would have lost the ghosting as well, but since they only buy the 120hz screen they never find that out.
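            for anyone who wants to repeat that blind test, the gist is just randomised trials with the real setting hidden until the end (a Python sketch; switch_refresh_rate is a hypothetical stand-in, since actually changing modes is OS-specific):

                import random

                def switch_refresh_rate(hz):
                    pass  # hypothetical stand-in - changing the mode is OS-specific

                def blind_test(trials=6):
                    correct = 0
                    for _ in range(trials):
                        actual = random.choice([60, 120])
                        switch_refresh_rate(actual)  # the tester must not see this value
                        guess = int(input("was that 60 or 120? "))
                        correct += (guess == actual)
                    print(f"{correct}/{trials} right (pure guessing averages half)")

                blind_test()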



            • #7
              How do you find out how much VRAM you have? If I use the dxdiag tool to look up my GPU it says I have 4GB. Is this VRAM?



              • #8
                dxdiag can be iffy, as quite often it includes shared memory, or in cases like my 980 it just reports way off: dxdiag claims my 980 has 3gig dedicated and 1gig shared when it is a 4gig card. the best place to check is the Nvidia Control Panel (it has to be the Nvidia Control Panel, not GeForce Experience): hit the System Information button, then under the info on your card it should list dedicated memory, which is what is actually on the card.
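                if you'd rather check from a script, the nvidia-smi tool that ships with the driver also reports the dedicated total (a minimal Python sketch, assuming nvidia-smi is on your PATH):

                    import subprocess

                    # query the GPU name and total dedicated memory in CSV form
                    out = subprocess.run(
                        ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
                        capture_output=True, text=True,
                    )
                    print(out.stdout)  # e.g. "GeForce GTX 660, 2048 MiB"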

                [attached screenshot: dedicated memory.png]



                • #9
                  I have 2GB of dedicated memory then, not 4GB. Would that be a big problem for 1440p?

                  I'm going to be keeping my 1080p monitor anyway, so I can go back to using two screens again, and I could use that one for newer, more demanding games, but that kind of defeats the point.



                  • #10
                    chances are with stuff like GTA5 you will see slowdown or limits on what you can run (but saying that, I'd put money on my 980 struggling at 1440, given R*'s record on ports).

                    but then again, stuff like Shadow of Mordor says it needs a 6gig graphics card to run the HD textures, yet people were running them fine on much smaller cards.

                    so it's hit and miss, but chances are that if a game you play already drops frames, going to 1440 will make those drops even worse. that is the main thing to remember: the more pixels, the faster your card hits its limit, and with the big jumps in required VRAM in recent games it will likely become a problem, because as I said before, the more pixels, the more power and VRAM you need. going from 1080p to 1440p is an increase of about 78% in pixels, so as you can imagine it hits quite hard.
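                    the 78% figure is easy to sanity check (quick Python arithmetic):

                        p1080 = 1920 * 1080   # 2,073,600 pixels
                        p1440 = 2560 * 1440   # 3,686,400 pixels
                        print(p1440 / p1080)  # ~1.78 - about 78% more pixels every frame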

                    but there are a lot of people saying that even with 770s or 780s they are having to reduce settings to run newer games at 1440p (some of those reports from early last year), so it looks like if you want to max your graphics at 1440 you're likely looking at a pair of 770s (or better) to run at ultra. so if you did get 1440p, I would probably count on looking at a new graphics card in the next few months.

