  • DigitalFreak - Wednesday, January 28, 2015 - link

    This is why I ended up with a 21:9, 3440x1440 monitor. It's more expensive, but you can run with 100% scaling, and use 1440p resolution for games that don't support 21:9. I can also hit 60fps easily with GTX 980 SLI with all options maxed. No G-Sync, but I don't like being locked into a GPU vendor when I only replace my monitor every 5 years or so.
  • keatre - Wednesday, January 28, 2015 - link

    Also looking into the 3440x1440 spectrum. Out of curiosity, which monitor did you go with?
  • Mondozai - Wednesday, January 28, 2015 - link

    Acer is coming out with a 34" 144 Hz ultra-wide 1440p monitor with G-Sync. So that could be an alternative.
  • Mondozai - Wednesday, January 28, 2015 - link

    Oh, and LG has the 34UM67, a FreeSync IPS 1440p ultrawide 34" monitor. It's going to cost about $500, so the prices are coming down fast.
  • JarredWalton - Wednesday, January 28, 2015 - link

    $500 -- you have a source for that? If they get IPS 3440x1440 34" for that price, I'll be extremely surprised. After all, their non-FreeSync option currently costs over $900:
    http://www.amazon.com/LG-Electronics-34UM95-34-Inc...
  • jackstar7 - Wednesday, January 28, 2015 - link

    Need to jump in and say there are zero confirmed 3440x1440 FreeSync or G-Sync monitors. There are rumors, but that is all.

    Right now, the best 3440x1440 options appear to be the curved Dell and LG, but I'm also waiting to read more testing of the non-curved AOC and the curved Samsung.
  • JarredWalton - Wednesday, January 28, 2015 - link

    AMD had an LG at CES... however I think it may have been 2560x1080.
  • jackstar7 - Wednesday, January 28, 2015 - link

    Indeed it was only 1080.

    People are taking a couple of "stories" about new models, in which the authors write that they "believe" the monitors will have 3440x1440, and running with that "belief". Facts are thus far not present.
  • Black Obsidian - Thursday, January 29, 2015 - link

    Unless something's changed in the last few days, there's no official confirmation of the 34UM67 being 1440p.

    On the contrary, given that the 34UM65 is *1080p* (while the 34UM95 and 34UM97 are 1440p), there's unfortunately good reason to believe that the 34UM67 will be 1080p FreeSync.
  • Black Obsidian - Thursday, January 29, 2015 - link

    Oh, and if the 34UM67 is indeed 1080p, that would make a ~$500 price tag more reasonable. The 34UM9x 1440p parts are still north of $900, but the 1080p 34UM65 can regularly be found much closer to that $500 mark.
  • DigitalFreak - Wednesday, January 28, 2015 - link

    Dell U3415W
  • DigitalFreak - Wednesday, January 28, 2015 - link

    If you look around for coupon codes, you should be able to get it for under $1000.
  • Frenetic Pony - Wednesday, January 28, 2015 - link

    G-Sync seems like a dead end anyway. It's both GPU vendor-locked and more expensive than "FreeSync", which is also part of an open standard.
  • eddman - Thursday, January 29, 2015 - link

    "No G-Sync, but I don't like being locked into a GPU vendor when I only replace my monitor every 5 years or so."

    That doesn't make sense. When did you buy your monitor? A year ago? You could've bought a G-Sync monitor and enjoyed the syncing whenever you ended up with an nvidia card in your computer, but now you can't have either of them for a few more years anyway, unless you change your routine and replace your monitor too.
  • Narg - Friday, January 30, 2015 - link

    I easily hit 60fps on my 1440p monitor with only a GTX 970 on most games. Not sure why people spend so much on hardware at times.
  • IdBuRnS - Thursday, February 19, 2015 - link

    "I can also hit 60fps easily with GTX 980 SLI with all options maxed."

    Well I'd surely hope so...
  • Mondozai - Wednesday, January 28, 2015 - link

    Jarred, a quick note:
    "A solution to this might be G-SYNC to enable gaming that looks smooth even when running below 60Hz"

    That should be fps, not Hz, as the panel is at 60 Hz all the time.
  • paradeigmas - Wednesday, January 28, 2015 - link

    You do know that the fundamental feature of G-Sync is its ability to drop the refresh rate to match the fps, right? Which means if your game is running at 45fps, your G-Sync monitor will refresh at 45Hz.
  • Antronman - Wednesday, January 28, 2015 - link

    But the usage of the word "Hertz" is still incorrect.
  • JarredWalton - Wednesday, January 28, 2015 - link

    Fixed.
  • inighthawki - Friday, January 30, 2015 - link

    In what way is it incorrect?
  • perpetualdark - Wednesday, February 4, 2015 - link

    Hertz refers to cycles per second, and with G-Sync the display matches its number of cycles per second to the frames per second the graphics card is able to send to the display. So in actuality, Hertz is indeed the correct term and it is being used correctly: at 45fps, the monitor is also at a 45Hz refresh rate.
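    A quick way to see the equivalence (just the reciprocal relationship between frame rate and frame time, nothing monitor-specific):

        fps = 45
        frame_time_ms = 1000 / fps            # ~22.2 ms between frames from the GPU
        refresh_hz = 1000 / frame_time_ms     # the panel redraws once per frame -> 45 Hz
        print(f"{fps} fps -> {frame_time_ms:.1f} ms/frame -> {refresh_hz:.0f} Hz")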
  • edzieba - Wednesday, January 28, 2015 - link

    "We are still using DisplayPort 1.2 which means utilizing MST for 60Hz refresh rates." Huh-what? DP1.2 has the bandwidth to carry 4k60 with a single stream. Previous display controllers could not do so unless paired, but that was a problem at the sink end. There are several 4k60 SST monitors available now (e.g. P2415Q)..
  • TallestJon96 - Wednesday, January 28, 2015 - link

    G-Sync is a great way to make 4K more stable and usable. However, it's proprietary, costs more, and 4K scaling is just OK. Anyone interested in this is better off waiting for a better, cheaper solution that isn't stuck with NVIDIA.
    As mentioned before, the SWIFT is simply a better option: better performance at 1440p, better UI scaling, higher maximum FPS. The only downside is the lower resolution, but 1440p certainly isn't bad.
    A very niche product at a premium, but all that being said, I bet Crysis at 4K with G-Sync is amazing.
  • Tunnah - Wednesday, January 28, 2015 - link

    "Other 4K 28” IPS displays cost at least as much and lack G-SYNC, making them a much worse choice for gaming than the Acer. "

    But you leave out the fact that 4K 28" TN panels are a helluva lot cheaper. Gamers typically look for TN panels anyway because of refresh issues, so the comparison should be to other TN panels, not to IPS, and by that comparison G-SYNC is extremely expensive. It's a neat feature and all, but I would argue it's much better to spend the extra on competent graphics cards that can sustain 60fps rather than on a monitor that handles framerate drops better.
  • Tunnah - Wednesday, January 28, 2015 - link

    Response time issues even
  • Midwayman - Wednesday, January 28, 2015 - link

    If it ran 1080p @ 144Hz as well as 4K @ 60Hz, this would be a winning combo. Getting stuck with 60Hz really sucks for FPS games. I wouldn't mind playing my RPGs at 40-60fps with G-Sync though.
  • DanNeely - Wednesday, January 28, 2015 - link

    "Like most G-SYNC displays, the Acer has but a single DisplayPort input. G-SYNC only works with DisplayPort, and if you didn’t care about G-SYNC you would have bought a different monitor."

    Running a second or third cable and hitting the input-switch button on your monitor when you occasionally need to put a real screen on a second box is a lot easier than swapping the cable behind the monitor, and a lot cheaper than a non-VGA KVM (and the only 4K-capable options on the market are crazy expensive).

    The real reason is probably that nVidia was trying to keep the price premium from getting any higher than it already is, and avoiding a second input helped simplify the chip design. (In addition to the time element for a bigger design, big FPGAs aren't cheap.)
  • JarredWalton - Wednesday, January 28, 2015 - link

    Well, you're not going to do 60Hz at 4K with dual-link DVI, and HDMI 2.0 wasn't available when this was being developed. A second input might have been nice, but that's just an added expense and not likely to be used a lot IMO. You're right on keeping the cost down, though -- $800 is already a lot to ask, and if you had to charge $900 to get additional inputs I don't think most people would bite.
  • Mustalainen - Wednesday, January 28, 2015 - link

    I was waiting for the Dell P2715Q but decided to get this monitor instead (about two weeks ago). Before that I borrowed an ASUS ROG SWIFT PG278Q that I used for a couple of weeks. The SWIFT was probably the best monitor I had used up to that point, but to be completely honest, I like the XB280HK better. The colors, viewing angles, and so on are pretty much the same (in my opinion), based on my "noob" comparison. My monitor has some minor backlight bleed at the bottom, barely noticeable, while the SWIFT seems flawless. The SWIFT felt as if it was built better and used better materials. Still, 4K was the deciding factor for me. The picture just looks so much better compared to 1440p. The difference between 1440p and 4K? Well, after using the XB280HK I started to think that my old 24" 1200p was broken; it just looked as if it had these huge pixels. That never happened with the SWIFT. And the hertz? Well, I'm not a gamer. I play some RPGs now and then, but most of the time my screen is filled with text and code, and 60Hz seems to be sufficient in those cases. I got the XB280HK for 599 euro, and compared to other monitors in that price range it felt like a good option. I'm very happy with it and dare to recommend it to anyone thinking about getting a 4K monitor. If IPS is your thing, wait for the Dell; not having the patience to wait for it is probably the only regret I have.

    I would also like to point out that the hype about what it takes to run a 4K monitor seems to be exaggerated. I manage to run my games at medium settings with a single GTX 660. Considering that I run three monitors at different resolutions and still have playable fps, you clearly don't need a 980 or a 295 to power one of these things (maybe if the settings are maxed out and you want max fps).
  • JarredWalton - Wednesday, January 28, 2015 - link

    I suppose the question on 4K gaming is this: would you rather have 4K medium or QHD high settings (possibly even QHD ultra)? There are certainly games where 4K high or ultra is possible with a more moderate GPU, but most of the big holiday releases come close to using 3GB RAM for textures at ultra settings, and dropping to high in many cases still isn't enough. I think people really after 4K gaming in the first place will want to do it at high or ultra settings, rather than to juggle quality against resolution, but to each his own.
  • DigitalFreak - Wednesday, January 28, 2015 - link

    I had the Dell P2715Q for a bit and swapped it for the U3415W. I really didn't like the trade-offs you have to make with 4k (performance, etc.), and didn't really notice that much of a difference in graphics quality.
  • Mustalainen - Thursday, January 29, 2015 - link

    I also looked at that monitor (the U3415W). It is beautiful, but it came down to the fact that it was priced at 990 euro. In hindsight I think I'm happier with 4K, as text is so sharp. I also like having two or more monitors, since I can run one application full screen on one monitor while still seeing what's happening in the other applications on the other monitors. I don't know if I would want to put yet another monitor beside that 34"; maybe it works great, maybe not, so I won't comment on that. The important thing is that everyone gets the hardware that fits them best.

    I'm mostly happy that companies seem to be releasing a variety of monitors at reasonable price points. It felt like 20"-22" monitors were stuck at 1080p forever while mobile phone screens improved every month. Let's hope the improvements continue in both markets.
  • Mustalainen - Thursday, January 29, 2015 - link

    Oh, I can't edit. It was supposed to say 20"-27" were stuck...
  • Mustalainen - Thursday, January 29, 2015 - link

    Jarred, you are probably correct. I just wanted to give an alternative opinion for those who are looking at 4K, lean towards work that involves a lot of text, and are happy with not maxing out the graphics. I feel happy with 4K. It feels like "something new", I have a lot of area to work with, and scaling is almost a non-issue in Win 8.1 (with most of the applications I use).
  • Taristin - Wednesday, January 28, 2015 - link

    Acers always have that blue tint problem. I have three Acer monitors on my desktop and each of them leans too far into the blue spectrum, even after playing with calibration. It leads to some rapid eyestrain.
  • B3an - Thursday, January 29, 2015 - link

    TN panel? Nope.

    FreeSync or fuck off.
  • Pork@III - Thursday, January 29, 2015 - link

    Yes! Indeed!

    Write "TN" and we already know its only use: fencing in my pigsty.
  • Pork@III - Thursday, January 29, 2015 - link

    Wow, I just read this article: http://gamenab.net/2015/01/26/truth-about-the-g-sy...
    Cheers to those who paid a lot of money for a display with a G-Sync module!
  • JarredWalton - Thursday, January 29, 2015 - link

    Interesting, though a bit too laced with conspiracy theory stuff to convince me he's not off his rocker. I'd like to see a game with clear videos of VSYNC Off, On, and G-SYNC modes on that laptop. Part of the issue, of course, is that from a cell phone video of a display it's difficult to tell whether the refresh rate is really 50Hz, 60Hz, or, more importantly, variable. The pendulum demo is a bit too staged to count as "proof".
  • MrSpadge - Thursday, January 29, 2015 - link

    Jarred, please test his claims and modded drivers! He surely comes across as dubious, but if he's correct, that's a real bomb waiting to explode.
  • SkyBill40 - Thursday, January 29, 2015 - link

    There's one huge thing he's doing that makes his claim patently false: he's running that game WINDOWED, and G-Sync only works full screen. Period. So, in essence, while he does make an interesting point... he's full of shit.
  • JarredWalton - Thursday, January 29, 2015 - link

    The current (updated) video is running the pendulum fullscreen, but again... his claims are dubious at best. "Look, it has an Altera FPGA. The only thing that's good for is security!" Ummm... does he even know what FPGA means? Field Programmable Gate Array, as in, you can program it to do pretty much anything you want within the confines of the number of gates available. Also, the suggestion that G-SYNC (which was released before AMD ever even talked about FreeSync) is the same as FreeSync is ludicrous.

    FWIW, I've seen laptop displays that can run at 50Hz before, so with this demo apparently running at a static 50 FPS, it's not really that crazy for "modded drivers" to work. Sure, the drivers apparently let you turn G-SYNC on or off, but he could mod the drivers to actually turn triple buffering on/off and I doubt most of us could tell the difference via an Internet video.

    He needs to show it running a game with a variable FPS (with a FRAPS counter), and he needs to zoom out enough that we can see the full laptop and not just a portion of the screen. Take a high speed video of that -- with the camera mounted on a tripod and not in his hands -- and someone could actually try stepping through the frames to see how long each frame is on screen. It would be a pain in the butt for certain, but it would at least make his claims plausible.

    My take is that if G-SYNC is basically hacked, it would have come to light a long time ago. Oh, wait -- the random guy on the Internet with his modded drivers (anyone have time to do a "diff" and see what has changed?) is smarter than all of the engineers at AMD, the display companies, etc.
  • SkyBill40 - Thursday, January 29, 2015 - link

    I agree with you and appreciate your more in depth commentary on it. I still, like you, find his claim(s) to be quite dubious and likely to be pure crap.
  • JarredWalton - Saturday, January 31, 2015 - link

    Turns out this is NOT someone doing a hack to enable G-SYNC; it's an alpha leak of NVIDIA's drivers where they're trying to make G-SYNC work with laptops. PCPer did a more in-depth look at the drivers here:
    http://www.pcper.com/reviews/Graphics-Cards/Mobile...

    So, not too surprisingly, it might be possible to get most of the G-SYNC functionality with drivers alone, but it still requires more work. It also requires a laptop without Optimus (for now), and you need a better-than-average display to drive it.
  • Will Robinson - Thursday, January 29, 2015 - link

    Thanx for reposting that link, Pork. I posted it yesterday, but it seems some people would rather accuse him of being a conspiracy theorist, or of somehow not being of sound mind, than evaluate his conclusions with an open mind.
    I wondered if we would get an official response from AT.
  • nos024 - Thursday, January 29, 2015 - link

    Technically, if AMD is the only one supporting "FreeSync", you'll still be so-called "vendor-locked", no?

    As a PC gamer you only have two choices for high-performance gaming video cards, so I don't understand this so-called vendor-lock debate with G-Sync and FreeSync. Just because G-Sync comes in the form of a chip and FreeSync comes with the new version of DisplayPort, it's still the same deal.
  • SkyBill40 - Thursday, January 29, 2015 - link

    No, it's not the same thing. G-Sync is wholly proprietary, and its effects will *not* work without a G-Sync capable video card; by contrast, FreeSync is just that: free for whatever card you have, no matter the vendor. It's an open standard, so there are no proprietary chips in the design. It just works. Period.
  • nos024 - Thursday, January 29, 2015 - link

    What do you mean it just works? If Nvidia decides not to support it, AMD becomes the only one to support it, which means vendor-lock anyways.

    So you are saying that if I decide to use Intel's IGP (given that it comes with the correct DisplayPort version), I need no additional support from Intel (drivers) and FreeSync will just work? I don't think it's THAT easy. Bottom line: you will be locked into AMD graphics cards IF AMD is the only one supporting it. It doesn't matter how it is implemented in the hardware; it's all about support.

    The only thing it has going for it is that, from a monitor manufacturer's point of view, there's no royalty paid to AMD to adopt the technology.
  • Black Obsidian - Thursday, January 29, 2015 - link

    And no additional (monitor) hardware required.
    And it's part of the DisplayPort spec.
    And any GPU manufacturer that wants to support it is free to do so.

    The only thing that Freesync has going *against* it is that nVidia might decide to be a bag of dicks and refuse to support an open standard in favor of their added-cost (read: added-profit) proprietary solution.
  • JarredWalton - Thursday, January 29, 2015 - link

    This remains to be seen. Adaptive-Sync is part of DisplayPort now, but I don't believe it's required -- it's optional. Which means that it almost certainly requires something in addition to just supporting DisplayPort. What FreeSync has going against it is that it is basically a copy of something NVIDIA created, released as an open standard, but the only graphics company currently interested in supporting it is AMD. If NVIDIA hadn't created G-SYNC, would we even have something coming in March called FreeSync?

    My bet is FreeSync ends up requiring:
    1) Appropriate driver level support.
    2) Some minimum level of hardware support on the GPU (i.e. I bet it won't work on anything prior to GCN cards)
    3) Most likely a more complex scaler in the display to make adaptive sync work.
    4) A better panel to handle the needs of adaptive sync.

    We'll see what happens when FreeSync actually ships. If Intel supports it, that's a huge win. That's also very much an "IF" not "WHEN". Remember how long it took Intel to get the 23.97Hz video stuff working?
  • Black Obsidian - Thursday, January 29, 2015 - link

    I agree with you on 1 and 2, and *possibly* 3, but I wouldn't bet on that last one myself, nor on #4. The monitor reviewed here--with a 60Hz maximum and pixel decay under 40Hz--would seem to suggest that a better panel isn't at all necessary.

    I also completely agree that, absent G-SYNC, Freesync very likely wouldn't exist. But that's often the way of things: someone comes up with a novel feature, the market sees value in it, and a variant of it becomes standardized.

    G-SYNC is brilliant, but G-SYNC is also clumsy, because of the compromises that are necessary when you can't exert control over all of the systems you depend on. Now that a proper standard exists, those compromises are no longer necessary, and the appropriate thing to do is to stop making them and transfer those resources elsewhere. This, of course, assumes that Freesync doesn't come with greater compromises of its own, but there's presently no reason to expect that it does.

    As for Intel, the 23.97Hz issue persisted as long as it did because you could round down the number of people who really cared to "nobody." It's possible that the number of people who care about Freesync in an IGP rounds similarly too, of course.
  • andrewaggb - Thursday, January 29, 2015 - link

    Freesync in an IGP for laptops and tablets would be a big deal I think.
  • nos024 - Friday, January 30, 2015 - link

    That's exactly what I am saying. Basically, we have only two GPU choices for PC gaming, nVidia or AMD. I'd understand the vendor-lock argument if there was a third and fourth player, but if nVidia doesn't support Free-Sync, you are basically locked into AMD GPUs for Freesync gaming.

    I'm sure nVidia can reduce the royalty fee or eliminate it completely, but you know what? There's nothing competing against it right now.

    nVidia seems to get away with lots of things. E.g., for a motherboard to implement SLI, it needs to license it, and it only comes on enthusiast chipsets (Z77/Z87/Z97). Crossfire comes free with all Intel chipsets -- yet SLI is still pretty popular... just saying.
  • anubis44 - Tuesday, February 3, 2015 - link

    I say let nVidia be a bag of dicks and refuse to support the open standard. Then we'll see their true colours and know to boycott them, the greedy bastards.
  • SkyBill40 - Thursday, January 29, 2015 - link

    I think BO and Jarred have it pretty much covered.
  • anubis44 - Tuesday, February 3, 2015 - link

    Somebody'll hack the nVidia drivers to make nVidia cards work with Freesync, kind of like the customized Omega drivers for ATI/AMD graphics cards a few years ago. You can count on it. nVidia wants to charge you for something that can easily be done without paying them any licensing fees. I think we should just say no to that.
  • MrSpadge - Thursday, January 29, 2015 - link

    If Intel were smart they'd simply add FreeSync support to their driver. nVidia gamers could use Virtu to route the signal from their cards through the Intel IGP and get FreeSync, non-gamers would finally get stutter-free video, and G-Sync would be dead.

    Whether or not Intel ever takes this route, they could, and hence FreeSync is not "vendor-locked".
  • phoenix_rizzen - Thursday, January 29, 2015 - link

    "The high resolution also means working in normal applications at 100% scaling can be a bit of an eyestrain (please, no comments from the young bucks with eagle eyes; it’s a real concern and I speak from personal experience), and running at 125% or 150% scaling doesn’t always work properly. Before anyone starts to talk about how DPI scaling has improved, let me quickly point out that during the holiday season, at least three major games I know of shipped in a state where they would break if your Windows DPI was set to something other than 100%. Oops. I keep hoping things will improve, but the software support for HiDPI still lags behind where it ought to be."

    This is something I just don't understand. How can it be so hard?

    An inch is an inch is an inch, it never changes. The number of pixels per inch does change, though, as the resolution changes. Why is it so hard for the graphics driver to adapt?

    A 12pt character should be the exact same size on every monitor, regardless of the DPI, regardless of the resolution, regardless of the screen size.

    We've perfected this in the print world. Why hasn't it carried over to the video world? Why isn't this built into every OS by default?

    Just seems bizarre that we can print at 150x150, 300x300, 600x600, 1200x1200, and various other resolutions in between without the characters changing size (12pt is 12pt at every resolution) and yet this doesn't work on computer screens.
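    The point-to-pixel arithmetic itself is trivial; a quick sketch (the DPI values here are just illustrative):

        # pixels = points / 72 * dpi  (1 pt = 1/72 inch)
        def pt_to_px(points, dpi):
            return points / 72 * dpi

        for dpi in (96, 157, 300):   # ~24" 1080p desktop, ~28" 4K desktop, a laser printer
            print(f"12 pt at {dpi} DPI = {pt_to_px(12, dpi):.0f} px")
        # -> 16 px, 26 px, 50 px: same physical size, different pixel counts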
  • DanNeely - Thursday, January 29, 2015 - link

    It's not the drivers; it's the applications. The basic win32 APIs (like all mainstream UI APIs from the era) are raster based and use pixels as the standard item size and spacing unit. This was done because on the slower hardware of the era the overhead from trying to do everything in inches or cm was an unacceptable performance hit when the range of DPIs that they needed to work on wasn't wide enough for it to be a problem.

    You can make applications built on them work with DPI scaling, but it would be a lot of work. At a minimum, everywhere you do layout/size calculations you'd need to multiply the numbers you're computing for sizes and positions by the scaling factor, roughly as in the sketch at the end of this comment. I suspect that if you wanted to avoid occasional bits of low-level jerkiness when resizing, you'd probably need to add a bunch of twiddles to manage the remainders you get when scaling doesn't give integral sizes (e.g. 13 * 1.25 = 16.25). If you have any custom controls that you're drawing yourself, you'd need to redo their paint methods as well. It didn't help that prior to Windows 8 you had to log out and back in to change the DPI scaling level, which made debugging very painful for anyone who tried to make it work.

    Newer interface libraries are pixel-independent and do all the messy work for you, but swapping one in is a major rewrite. For Windows, the first one from MS was Windows Presentation Foundation (WPF), which launched in 2006 and was .NET only. You can mix C/C++ and .NET in a single application, but it's messy and annoying to do at best. Windows 8 was the first version to offer a descendant of WPF to C++ applications directly, but between the lack of compatibility with Win7 systems (meaning the need to maintain two different UIs) and the general dislike of the non-windowed nature of Metro applications, it hasn't gained much traction in the market.

    Disclosure: I'm a software developer whose duties include maintaining several internal or single-customer line-of-business applications written in .NET using the non-DPI-aware Windows Forms UI library. Barring internal systems being upgraded to Win 8 or higher (presumably Win10) and high-DPI displays, or a request from one of our customers to make it happen (along with enough money to pay for it), I don't see any of what I maintain getting the level of rewrite needed to retrofit DPI awareness.
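    To give a feel for that multiply-by-the-scale-factor retrofit, here's a minimal sketch (the function names are made up for illustration, not any real toolkit's API):

        def scale(px, dpi):
            # Round half up so fractional results don't drift; 96 DPI == 100% scaling on Windows
            return int(px * dpi / 96 + 0.5)

        def layout_button(x, y, width, height, dpi):
            # A formerly pixel-hard-coded layout call, now multiplied through by the scale factor
            return (scale(x, dpi), scale(y, dpi), scale(width, dpi), scale(height, dpi))

        print(layout_button(10, 10, 13, 23, dpi=120))   # 125% scaling -> (13, 13, 16, 29)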
  • NotLupus - Thursday, January 29, 2015 - link

    Get a high-speed camera and test input lag then.
  • AnnonymousCoward - Thursday, January 29, 2015 - link

    Yeah. The input activation could be tied to an LED, like the one Caps Lock already has, or via a modified mouse. The camera then measures the LED versus the on-screen change.
  • AnnonymousCoward - Thursday, January 29, 2015 - link

    This Acer is so much fail!
    -TN at 28" is bad. Viewing angles are inherently a problem.
    -4K is too many pixels for today's GPUs.
    -They capped the refresh rate at 60Hz. This monitor would be far more interesting if it at least went up to 75Hz, and DisplayPort 1.2 has the bandwidth.
    -70% gamut is poor for games.

    Older screen tech with G-Sync added and a faster TCON would have been better, even if it was simply:
    -30" 2560x1600, >90% gamut, with G-Sync at up to 100Hz.
  • yefi - Sunday, February 1, 2015 - link

    I'd really like to see 30" 1600p monitors with g-sync. Unfortunately today, everything seems to be 1440p or 28" 4k. Whatever happened to vertical height?
  • tsk2k - Friday, January 30, 2015 - link

    The real question is: where are the goddamn OLED monitors?!?! LG????
  • anubis44 - Tuesday, February 3, 2015 - link

    Another issue for GTX970 users, of course, is that 4K will usually require more than 3.5GB of graphics card memory, which will put their frame rates in the toilet, so paying extra for a 4K G-Sync monitor for these users is adding insult to injury.
  • perpetualdark - Wednesday, February 4, 2015 - link

    This article left out the other Acer G-Sync option, the $599 XB270H. The downside compared to the ASUS ROG is that it is only 1080p, but then I never run anything higher than that in games anyway, as it is more than enough resolution for my eyes. Plus you still need a pretty powerful graphics card to run at 1440p reliably. It is $200 less than the Swift or the Acer 4K. Yes, it is TN, but frankly it is the best-looking TN panel I have ever seen, and unless I am trying to game from 10 feet off center (why on earth would anyone??) it looks perfectly fine. Color is fantastic once I dialed it in a little, the difference in games is spectacular, and it runs at up to 144Hz. Like the ROG Swift, you can also choose to run in LightBoost mode, which makes your LCD look more like a CRT, although LightBoost doesn't work at the same time as G-Sync (changing strobe rates to match everything else would be a nightmare). The downside is you have to play a G-Sync enabled game. Of course, if it isn't G-Sync capable, just turn on LightBoost and/or run at 120 or 144Hz and you still get far better gaming performance than on other monitors.

    I use an older GTX 660 card, and so far every game I have played pretty much runs liquid smooth at the highest settings. Lately I have been playing World of Tanks, and cranked all the way up, I can spin, zoom, pan, scroll, etc. as fast as I want and I have yet to see a hitch or a tear, and I have tried to force it. I can go into the garage with all those details on and spin my view back and forth as fast as I can whip the mouse, with no tearing at all. It is fantastic, and to me worth every penny.

    I am sure some people will say things like 1080 is way too low of a resolution, or that TN sucks, or that $599 is still too high for a 27", but until something better is out, it is the best 27" gaming monitor you can get for less than $600. Sure, I could have gotten the Swift, but frankly I didn't need 1440 and that extra $200 got me a shiny new Samsung 850 pro ssd, and I am quite happy with it.
  • looncraz - Thursday, February 5, 2015 - link

    "AMD GPUs don’t properly scale the resolution to fill the whole screen"

    This is a setting, pure and simple.

    CCC->My Digital Flat Panels->Properties->Select the proper monitor->Scale image to full panel size

    The latest AMD drivers also have "virtual super resolution" which allows you to run any game (or your desktop) at higher resolution than your monitor supports.

    I play Battlefield 4 @ 3200x1800 @ 60 FPS - I disabled anti-aliasing and couldn't tell a difference at this scaling (when I could clearly see it - and be annoyed by it - at 1080p).

    I will be doing this with every game that supports it now, super awesomeness.
  • Clorex - Tuesday, February 24, 2015 - link

    What is pixel decay?

    From page 2 of the article:
    "while G-SYNC can refresh the panel at rates as low as 30Hz, I find that anything below 40Hz will start to see the pixels on the screen decay, resulting in a slight flicker; hence, the desire to stay above 40 FPS"
