120hz.NET - Source for 120hz 1440p Monitors - Latest News
  • Latest News

    Published on 02-06-2015 01:51 AM

    Hey everyone. HyperMatrix here. I decided to make use of this website to share some information regarding GSYNC and FreeSync. There appear to be tons of differing opinions on various aspects of the two technologies all over the internet. I'm here to share a bit of what I know, and a bit of what I think. As there are currently no FreeSync monitors out there, anyone who writes about it is merely speculating. I, too, have the power of speculation, and I intend to use it.




    WHAT IS IT: Adaptive Sync

    Adaptive Frame Synchronization sounds simple enough, and it makes you wonder why we're only seeing it now. NVIDIA actually drew up a pretty good graph demonstrating how it works. I don't have any fancy graphs, so I will try to explain it in words. Your standard LCD monitor has a 60Hz refresh rate. That means that every second, the image you see on the screen is updated 60 times. Or rather, every 16.67ms the display is refreshed with a new image. (For the purposes of this article, we're not going to get into pixel response time, which is how long it takes the pixels to actually transition to a new image after it is sent to the monitor.)


    So every 16.67ms, your screen displays a new picture. The problem can be seen here. Below are the first 5 refreshes and the times at which they appear on your screen:


    0 Seconds - Refresh 1
    16.67ms - Refresh 2
    33.33ms - Refresh 3
    50ms - Refresh 4
    66.67ms - Refresh 5


    Now here is an example of what frame generation looks like (how long between each frame, based on your current FPS):


    0 seconds - Frame 1
    12ms - Frame 2
    30ms - Frame 3
    40ms - Frame 4
    53ms - Frame 5


    So what does this all mean when you try to match up when a frame is generated on your computer with when the display is updated to show it? Well, look here (note: the delay between when each frame is generated and the next refresh that can display it is shown between < and > below):


    0 seconds - Frame 1
    <0ms delay>
    0 Seconds - Refresh 1


    12ms - Frame 2
    <4.67ms delay>
    16.67ms - Refresh 2


    30ms - Frame 3
    <3.33ms delay>
    33.33ms - Refresh 3


    40ms - Frame 4
    <10ms delay>
    50ms - Refresh 4


    53ms - Frame 5
    <13.67ms delay>
    66.67ms - Refresh 5


    So what you can see is that there is no correlation between when your computer generates a frame and when your monitor updates the display with that frame. In fact, depending on how the timing works out, your display (what you're looking at) could be anywhere from 0.1ms to 16.6ms behind what your computer has generated. This directly adds to input lag, and mistimed frame generation and fluctuating fps can also cause the same frame to sit on screen for 2 or even 3 refresh cycles, which takes away from smoothness in motion.


    Now...on to Adaptive Sync. It's really quite simple. Your monitor still has a maximum refresh rate, whether that be 60Hz, 120Hz, or 144Hz. But your monitor no longer needlessly updates every 16.67ms as in the 60Hz example above. With Adaptive Sync, the display waits for your computer to generate a frame, then refreshes with just that frame as soon as it arrives. Whether you are getting 30fps, 45fps, or 60fps, your monitor updates the screen the moment a frame is generated! This eliminates screen tearing, for sure, but it also reduces input lag. You no longer end up with that variable 0.1ms to 16.6ms delay between frame generation and screen update. This...is huge.
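
    If you want to play with the example numbers above, here is a rough Python sketch (using the same made-up frame times from the tables) that computes how long each frame waits for the next fixed 60Hz refresh, versus an idealized Adaptive Sync display that refreshes the moment the frame is ready (ignoring the panel's minimum/maximum refresh limits):

    # Rough sketch: how long each frame waits to be shown on a fixed 60Hz display
    # versus an idealized Adaptive Sync display. Frame times are the made-up
    # example values from the tables above; real frame times fluctuate far more.
    import math

    REFRESH_INTERVAL_MS = 1000 / 60          # 16.67ms per refresh at 60Hz
    frame_times_ms = [0, 12, 30, 40, 53]     # when each frame finishes rendering

    for n, t in enumerate(frame_times_ms, start=1):
        # Fixed refresh: the frame has to wait for the next scheduled refresh tick.
        next_refresh = math.ceil(t / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
        fixed_delay = next_refresh - t
        # Adaptive Sync: the display refreshes as soon as the frame is ready,
        # so the wait is effectively zero (within the panel's supported range).
        adaptive_delay = 0.0
        print(f"Frame {n} at {t}ms -> fixed 60Hz wait {fixed_delay:.2f}ms, "
              f"Adaptive Sync wait {adaptive_delay:.2f}ms")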






    GSYNC: What the heck is it?

    GSYNC, at its core, is simply Adaptive Sync. But it's also more. But only sometimes. Ok, here goes. In order to get your GPU and your monitor to play nice together, Nvidia had to create a module that they named GSYNC. But when you go to the trouble of creating your own monitor scaler (GSYNC), you ask yourself if there's anything else you can do with the opportunity. And in this case, the answer is that there's absolutely more you can do. Nvidia borrowed a bit from its Nvidia 3D Vision supported monitors. Certified Nvidia 3D Vision monitors had a feature called "Lightboost" that was intended for use in a 3D environment. With some driver hacks, some very clever people realized they could activate Lightboost to help reduce motion blur in 2D gaming. Nvidia also saw the benefit of this, and they've added support for an enhanced version of Lightboost, called ULMB (Ultra Low Motion Blur). ULMB is a strobed backlight that seeks to simulate some of the benefits of old CRT monitors. If you check out the BlurBusters website, Mark Rejhon has some great tests and articles showcasing the amazing benefits of ULMB.


    The downside to a strobing backlight is that it is most beneficial at higher refresh rates. Current monitors have a minimum 85Hz requirement to enable ULMB. This means that while the GSYNC module supports ULMB, if you have a 60Hz monitor, ULMB will not be active. A backlight strobing at 60Hz could actually be quite annoying for many people to look at. So this feature will be of greatest benefit in the 120Hz+ suite of monitors. It should be noted that due to the nature of strobing backlights, there is currently no way to activate ULMB at the same time as GSYNC. The general idea is that for games where you have lower, fluctuating FPS, you keep GSYNC on. For games where you have the GPU power to run at high FPS, nothing beats ULMB. But the important thing to take away from this is that the GSYNC module doesn't just give you Adaptive Sync; it also gives you ULMB, and potentially support for Nvidia 3D Vision as well (such as with the ASUS ROG Swift).
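
    A rough way to see why strobing pays off at higher refresh rates: a normal sample-and-hold LCD keeps each frame lit for the entire refresh interval, which your eye smears into motion blur, while a strobed backlight only flashes each frame briefly, but that flash repeats at the refresh rate, so at 60Hz it reads as flicker. Here's a quick sketch in Python (the pulse width is my own illustrative number, not an Nvidia spec):

    # Why strobing matters more at high refresh rates: without strobing, each
    # frame stays lit for the whole refresh interval (sample-and-hold blur);
    # with strobing, it is lit only for a short pulse, but that pulse repeats
    # at the refresh rate, which is why 60Hz strobing looks like flicker.
    PULSE_MS = 1.5  # assumed strobe pulse length, purely for illustration

    for hz in (60, 85, 120, 144):
        hold_ms = 1000 / hz  # per-frame persistence without strobing
        print(f"{hz:>3}Hz: sample-and-hold {hold_ms:.1f}ms/frame, "
              f"strobed ~{PULSE_MS}ms/frame (strobe flickers at {hz}Hz)")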






    AMD FreeSync: So what's the difference?


    Nothing! Well...we don't know for sure. The basic functionality of AMD's FreeSync is just Adaptive Sync. However, it is absolutely imperative that I mention the following about FreeSync: FreeSync is NOT free. Calling it "Free" Sync was a brilliant bit of marketing by AMD. Just like the GSYNC module, FreeSync requires a custom ASIC scaler. The difference is that unlike Nvidia, AMD is not charging a licensing fee to use it. And FreeSync doesn't include any stereoscopic 3D support or ULMB, which means the basic hardware should be cheaper to make than GSYNC, with no licensing fee to add to the cost. Although that also means that for enthusiast-level gaming, your *standard* FreeSync monitor will fall short of what GSYNC offers.






    PRICE: So will FreeSync be cheaper?


    Probably...but likely not as cheap as you may think. There are a few issues. This gets into "business/economics" speculation. So here is what we know about FreeSync:


    Why it may be cheaper:
    - No Licensing Fee
    - Less expensive hardware due to fewer features
    - VESA DisplayPort support (it is an OPTIONAL component of DP. This is a very important fact)


    Why it may NOT be cheaper:
    - Basic integration doesn't cover ULMB support. Manufacturers can add ULMB on their own, but as a separate module, which adds to cost/complexity (Note: Nvidia currently works with manufacturers to integrate GSYNC)
    - Nvidia has no current plans to support FreeSync. Why is this important? Because it limits the market for FreeSync monitors, which is an important consideration for manufacturers
    - Intel does not/cannot currently support it. There has been no communication from them about whether they will or will not support FreeSync in the future.
    - AMD is only sitting at 30% of the discrete GPU sales market


    My thoughts are as follows. Even with GSYNC monitors, the cost of the module is $100 or less for the bigger manufacturers. However, if you take a look at pricing, the difference between standard and GSYNC monitors is far more than $100. The reason for this is that in business, when you spend time on, invest in, and take a risk with a feature like GSYNC, you're thinking of how much more you can earn on that feature. So Nvidia tells ASUS "Hey...I've got this awesome module that will make your monitors rock," and ASUS thinks "Hmmm...if I pay an extra $100 per monitor to buy this module for enthusiasts, I need a return on my $100 investment as well. So let's up that price by another $100." This is common practice and is not some greedy move by Asus or anyone else.


    When making niche products, you take into account the cost of development and manufacture of a much smaller run. So when Asus decided to make the ROG Swift, they took the cost to develop the monitor, spread it across the expected number of sales, and voila. Now with GSYNC, they were making a product that technically 70% of new gamers could use (due to Nvidia's market share). So let's say the development of the monitor cost just $1,000,000 (completely arbitrary numbers here), and Asus estimated they would sell 25,000 units in the first year. That means they could distribute the cost of development across 25,000 units at $40 apiece. So the cost of GSYNC integration would be $100 for the module/license, $100 markup for having integrated this niche feature, and $40 for development. That's a $240 premium over the same monitor without GSYNC. Now if Asus were doing the same thing with FreeSync (which they are doing right now, but haven't released prices for yet), then you could say a $50 custom ASIC scaler, a $100 markup for a niche feature, and, based on the 70%/30% split between Nvidia/AMD market share, an expected 10,800 units sold within the first year, bringing the development cost distribution to $92 per monitor. So $50 + $100 + $92 = $242 premium over the same monitor without FreeSync.
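
    If you want to follow (or tweak) the back-of-the-envelope math above, here it is as a tiny Python sketch; every figure in it is one of the arbitrary numbers from the paragraph above, not real pricing data:

    # Back-of-the-envelope premium estimate, using the arbitrary figures above.
    def monitor_premium(module_cost, niche_markup, dev_cost, expected_units):
        """Per-unit premium = module/scaler cost + markup + amortized development."""
        return module_cost + niche_markup + dev_cost / expected_units

    # GSYNC example: $100 module, $100 markup, $1,000,000 dev cost over 25,000 units.
    gsync = monitor_premium(100, 100, 1_000_000, 25_000)

    # FreeSync example: $50 scaler, $100 markup, same dev cost over 10,800 units
    # (the smaller addressable market from the ~30% AMD share assumed above).
    freesync = monitor_premium(50, 100, 1_000_000, 10_800)

    print(f"GSYNC premium:    ${gsync:.2f}")     # 240.00
    print(f"FreeSync premium: ${freesync:.2f}")  # 242.59, rounded to ~$242 above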


    The example above is important, because until Intel and Nvidia sign on to support DisplayPort Adaptive-Sync, all these "FreeSync" monitors will look like standard monitors to their users, and no one is going to pay a premium for a feature they can't use. Even if manufacturers decide to stick to a $50 markup instead of a $100 markup, that's still only a $50 price difference between the GSYNC and FreeSync models. And if the FreeSync monitor also adds in ULMB integration, then that $50 price difference is gone yet again.






    So...FreeSync is pointless?


    No. Absolutely not. There are still a lot of variables in play. Even if Nvidia doesn't sign on to DisplayPort Adaptive-Sync, Intel could. If Intel supports it, you're going to see FreeSync monitors come down in price a lot, as tons of people will be able to use them. That means easier returns on investment to recover development costs, and fewer risks associated with launching a FreeSync product. I envision FreeSync could become a fairly standard feature in monitors if Intel supports it. But I can't see Nvidia giving in to FreeSync or giving up on GSYNC. Unless, of course, Intel does. And don't forget...if you're an AMD user, FreeSync is your only option. And vice versa with Nvidia users and GSYNC. Which actually makes it funny when you see people arguing or fighting over GSYNC vs. FreeSync, when people on either side of the argument don't really have a choice to use the other technology anyway. We are all just taking what they hand us.




    CONCLUSION:


    Both FreeSync and GSYNC are going to bring a much-needed feature to LCD monitors: Adaptive Sync. And while AMD has won a major marketing battle with the naming of FreeSync, making people think it is going to be substantially cheaper than GSYNC when similarly equipped, that isn't the case. And with ULMB support baked in, GSYNC will be the better option for gaming, even if it turns out it may not be the best option for your wallet. Nvidia is betting that the same people paying the big bucks for its more expensive GPUs will line up and pay the premium for GSYNC monitors, while AMD, with its value approach to CPUs, GPUs, and now monitors, is hoping the lower cost of implementation can help it steal some market share away from Nvidia.


    All I know for sure is...Adaptive Sync is amazing. Whatever package it comes in.




    Monitors of note:
    ASUS ROG Swift (GSYNC, 27", 144Hz, 1440p, TN)
    Acer XB270HU (GSYNC, 27", 144Hz, 1440p, IPS)
    Acer Predator XR341CK (GSYNC, 34" Ultrawide, 144Hz, 3440x1440, IPS)
    ASUS MG279Q (Adaptive Sync -unbranded FreeSync-, 27", 120Hz, 1440p, IPS)
    Published on 01-28-2014 04:29 PM

    So as you may know, I was a big fan of the new ASUS ROG Swift PG278Q. For those who aren't familiar with it, it's a 27" 1440p 120Hz+ beast with DisplayPort, GSYNC, etc. It is absolutely beautiful.


    Now the problem is two-fold. One is that it's a TN display. I consider that a tradeoff: you get lower image quality in exchange for a faster, more fluid panel. And it's a native 8-bit panel, instead of the traditional 6-bit TN. That one fact I can try to overlook. However...the main issue is that the monitor is being released with a heavy matte coating. So you go from a current glossy IPS or PLS display with rich, vibrant colours to a matte TN display.

    That's a complete no-go for me. So I posed a question to Asus. The first thing they replied with was some general marketing BS:

    "Based on the feedback of ownser of IPS/PLS and TN panels we have produced the majority of users have not had negative feedback about the Polarizer used ( and generally are not even aware of it ). "

    Now...you'd think he's saying that people like us have barely even noticed the matte polarizer used. But he doesn't mention users of glossy displays. He only mentions IPS/PLS/TN users. They may have surveyed only people who have matte displays. Either way, I still don't buy it.

    He then goes on to compare the matte vs. glossy preference to PWM flicker sensitivity, as you can see here:

    "All in all this something specific to users and does not have a consistently present level of feedback across the majority of users. This similar to how many users are not bothered by PWM flicker but there are some users very sensitive to it. As a whole feedback on the current type of AG Polarizers has been solid with minimal negative feedback."

    When I mentioned that the screen being both TN and having a matte anti-glare filter would discourage many people who have purchased the various current-gen overclockable 1440p displays from moving over, he replied:

    " From a volume perspective that quantity of user transitioning form those other monitors is very small and not representative of the majority of the market."

    That response seems odd to me, because those users (us) currently happen to be the majority of the $500+ 1440p gaming monitor market, which is obviously who they're targeting with this monitor. Your average Joe doesn't spend $800 on a monitor. And people who do spend $800 on a monitor often have a lot of money invested in their computers. We are enthusiasts. And we care about things like glossy displays.

    So at this point I recommended at least doing a limited run of glossy displays that would be available only through a limited number of channels. He gave me a rather odd response:

    "As to your recommendation of running concurrent models this is complicated as it requires double the investment in the initial design and development and validation. As such it is not a realistic approach. As of now the SWIFT will come with AG Polarizer that i feel confident the majority of users / gamers will be satisfied with. "

    I have no idea how they run their business or how they handle development. But we've been dealing with monitors in the same housing, using both glossy and matte AG coatings, with no issue. That should be the only difference in the builds. However, they don't seem to see it that way.

    So in conclusion...as excited as I was for this display...I am going to have to pass. From what I understand, Nvidia will be designing a DisplayPort GSYNC board for our LG panels, so there should be an option to upgrade our displays in the future. I had been ready to proclaim the ROG Swift the new king of gaming displays. But due to its price point, and the rather severe reduction in image quality from the use of a TN panel and a heavy anti-glare coating, it will have to remain a "Don't Buy" in my books, so long as plans for a GSYNC board for our monitors don't get scrapped.

    Thank you, Asus, for making a niche product (due to the price) while ignoring what the majority of people who have been buying these types of displays at this price point want.

    Published on 11-25-2013 05:44 PM


    A heck of a great deal: Seiki's 39" 4K LED-backlit display for just $489 through Amazon. Unfortunately, as you may know, it runs through HDMI 1.4, not 2.0, so the bandwidth is limited. That means you only get 30Hz at 4K resolution. However, you do get 120Hz at 1080p. And 4K at 30Hz for video content or other work is still amazing at this price. Personally, I'm waiting until an HDMI 2.0/DisplayPort model comes out that can run a full 60Hz through a single input. I think I could settle for some 4K 60Hz gaming with prices this low. Check it out here if you're interested:


    http://www.amazon.com/dp/B00DOPGO2G/...SIN=B00DOPGO2G
    Published on 11-20-2013 08:16 PM


    Ok, so I don't love everything about Microsoft. I hated that they took away my Start Menu. I hate that they're overcharging for an inferior Xbox One. I hate them for being so slow with Windows Phone updates. And I hate them for creating a unified Win8 system that isn't really unified...but rather is fragmented across Desktop, Tablet, Phone, and TapTop/DeskLet combo monstrosities. But all of that has been due to their ineptitude. They do have a few brilliant ideas, though. They've just started selling "Don't get Scroogled" merchandise on the actual Microsoft Store. Check it out here:

    http://www.microsoftstore.com/store/...oryID.67575900

    I am definitely going to buy this when it's back in stock.
    Published on 11-19-2013 03:43 AM

    Ok, so I had a really fun time watching this video of PC Gamer's new superfluous gaming rig. On the surface...you may think nothing of it. I'm guessing they have a budget of around $12k for that rig. What caught my attention, though, was the number of amateur mistakes I could spot. Have a look and see how many things you can spot that you'd never do if you were building the "ultimate gaming rig" (in terms of performance) and had $12k and your reputation to blow. After a few days I'll share my observations with you! Remember, don't be an Alienware gamer. Flashy lights don't increase your frame rate or kill/death ratio. - HyperMatrix

    Published on 10-30-2013 01:01 AM



    As many of you know, I've been anticipating the arrival of the Nokia Lumia 1520. It is the first Windows Phone with a 1080p display. It is also the first 6" Windows Phone, and the first quad-core Windows Phone. It is the first Windows Phone with 4x HAAC microphones for superb audio capture even under loud conditions, such as a concert, along with stereo/positional audio capture. And it is the first Windows Phone to attract this much attention since the Lumia 1020 and its massive 41MP camera. But, alas, it is also one of the last Lumia devices to be made by Nokia prior to the takeover by Microsoft.

    Most of the amazing features of this phone were already known to me through the many leaks that have come out over the past few weeks. 12 hours of Wi-Fi browsing? 25 hours of talk time? Windows Phone 8 GDR3? Built-in wireless charging? All great features. But there was one bit that eluded me. Most rumours were claiming that Nokia would equip the Lumia 1520 with a Samsung-made Super AMOLED HD display. There are pros and cons to that. Over-saturated colours and deep blacks do make those screens more fun, even if inaccurate. But the downside is less realistic colour reproduction; movies and pictures would be off as a result. And visibility in sunlight would be quite poor unless even more power was pumped into the display, which again means less battery life.

    Nokia caught me off guard by using an IPS display with their own deep-black technology and the ever-wonderful PureMotion 60Hz tech, and by introducing a technology I had read about years ago, around the release of the iPhone 3GS. There was a company back then that posed a question: why are we just increasing backlight power to increase visibility? It really doesn't help much, and it kills the battery. They then demoed a screen that had the entire content of the display adjusted on a per-pixel level to look best under various lighting conditions. This is all some very magical stuff that should become standard equipment on most high-end phones in the near future.

    The best way I can explain it is this: if you're playing a game that's intended to be dark, with monsters hiding in corners and jumping out at you, increasing the brightness on your monitor doesn't help much, because you're just shining even more light behind a set of black pixels. But once you increase the gamma or brightness through the in-game menu, all those dark black areas become gray and quite visible as a result. Apical does that same sort of adjustment, but in a far more sophisticated manner, and on a per-pixel level as opposed to a full-screen effect, in direct response to the light hitting the screen. This is yet another reason to be excited about the upcoming Lumia 1520!
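
    To be clear, I don't know the details of Apical's actual algorithm, so treat the following Python sketch as a toy illustration of the general idea only: instead of one global brightness/backlight boost, each pixel gets its own tone lift, with darker pixels lifted more and the whole effect scaled by the ambient light level.

    # Toy illustration only -- not Apical's actual algorithm. Idea: instead of
    # one global brightness boost, lift each pixel's tone individually, with
    # darker pixels boosted more, and scale the effect by the ambient light.
    def adapt_pixel(value, ambient):
        """value: pixel luminance 0.0-1.0; ambient: 0.0 (dark room) to 1.0 (direct sun)."""
        gamma = 1.0 - 0.5 * ambient  # exponent < 1 lifts shadows more than highlights
        return value ** gamma

    def adapt_frame(frame, ambient):
        # The per-pixel curve is applied to the image itself, not the backlight.
        return [[adapt_pixel(p, ambient) for p in row] for row in frame]

    dark_scene = [[0.02, 0.05, 0.80],
                  [0.10, 0.01, 0.30]]
    # In bright light, shadow detail comes up a lot while highlights barely move.
    print(adapt_frame(dark_scene, ambient=0.9))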
    Published on 10-30-2013 12:10 AM



    I was quite excited a few weeks ago when I heard Nvidia talk about using their onboard NVENC H.264 hardware encoder for game capture with next to no hit on performance. Normally I'm skeptical of such claims, but this is exactly what the PS4 is doing as well. And now I can't believe this wasn't done sooner.

    To use Nvidia's "ShadowPlay" feature you have to download the Nvidia GeForce Experience app, which many people generally avoid as it can feel like bloatware to those who are already adept at customizing 3D settings on a per-game basis. But this makes it all worthwhile.

    ShadowPlay is currently in beta, so there are some limitations. Currently you can choose between Low, Medium, and High recording quality settings. However, the capture resolution and frame rate are locked at 1920x1080 @ 60fps. There are also two recording modes: a Manual mode that allows you to start/stop video capture at will, and a Shadow mode that automatically keeps the last X minutes of gameplay, where X can be anywhere from 1 to 20 minutes. A key combination can be pressed to save the last X minutes of gameplay at any time. So you can game as you normally do, and if you happen to see or do something epic, you can save it. Just like your PVR. Simple as that.

    The software requires 7.5GB of space for 20 minutes of "Shadow" mode recording at high quality. Recording 60 seconds of Batman: Arkham Origins gameplay in Manual mode took up only 180MB of space. This is a far better recording platform than any software solution, or even hardware solution, you may currently be using. Now let's hope there's some way to enable higher-resolution recording, even if at a lower frame rate.
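
    For the curious, those storage figures work out to roughly the following bitrates (simple arithmetic on the numbers above, not an official Nvidia spec):

    # Rough bitrate math from the storage figures above (decimal GB/MB assumed).
    def bitrate_mbps(size_bytes, seconds):
        return size_bytes * 8 / seconds / 1_000_000

    shadow_20min = bitrate_mbps(7.5e9, 20 * 60)   # 7.5GB over 20 minutes
    manual_60s = bitrate_mbps(180e6, 60)          # 180MB over 60 seconds

    print(f"Shadow mode (high quality): ~{shadow_20min:.0f} Mbps")  # ~50 Mbps
    print(f"Manual mode (Batman clip):  ~{manual_60s:.0f} Mbps")    # ~24 Mbps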

    You can download GeForce Experience and access ShadowPlay through here: http://www.geforce.com/geforce-experience

    This is only for GTX 600 series cards and above. Happy Recording!
