PDA

View Full Version : Gtx 690 - And Catleap 120hz



whitespider
05-03-2012, 10:27 AM
From my understanding, the only graphics card that can output 120Hz+ is a single GTX 680. As powerful as that card is, getting anywhere near 120fps in more demanding games would require SLI GTX 680s, and adding a second card reduces the refresh rate to 100Hz.

So here is the question: is a GTX 690 limited by the same 100Hz cap, or can it run 120Hz like the GTX 680?

If so, this would certainly be the ultimate card for this monitor.

Sn0_Man
05-03-2012, 10:38 AM
Nobody has a GTX 690 to test, so nobody knows for sure. However, the GTX 690 IS an implementation of SLI, so I'm fairly certain it will suffer the same 400MHz pixel clock limitation (~101Hz refresh rate at 1440p). HyperMatrix said that when he was testing SLI 680s, the 400MHz clock limit felt hard-coded into the drivers.

Basically, everything I can find says the 690 is functionally two 680s with slightly lower base clocks, but they hit the same max when OC'd. So I wouldn't rush out to buy a 690 in hopes of a 120Hz refresh rate on one of these screens.

whitespider
05-03-2012, 11:12 AM
It certainly would be interesting to find out, one way or the other. I'm mostly considering the 690 over 680 SLI for its 2.0 frame metering tech, which is exclusive to the GTX 690. There are already some translated overseas reviews showing huge differences in frame times between CrossFire, GTX 680 SLI, and the GTX 690, with the 690 nearly on par with a single card, i.e. little to no microstutter. So in some form or another there are architectural differences between a 680 and a 690, and by the sounds of it, not just the usual single-GPU-to-multi-GPU changes either.

Sure, chances are 95% that the 690 will act exactly like SLI GTX 680s with its 400MHz pixel clock; I'm just curious about the 5% chance that it might work.

http://www.computerbase.de/artikel/grafikkarten/2012/test-nvidia-geforce-gtx-690/8/#abschnitt_mikroruckler


Chrome has an auto-translate feature; not sure about Firefox. Anyway, that's the link to one of the articles.

hikikotaku
05-03-2012, 04:19 PM
Yeah, I'm interested in this too. Depending on the prices of the 120Hz monitors, I might order this card. A 400MHz pixel clock cap would suck. I don't think this card has it, given its more advanced SLI and the fact that it's aimed at 3D and multi-monitor setups. One can hope; hopefully someone orders one and tests it out.

HyperMatrix
05-03-2012, 04:39 PM
Yeah, I'm interested in this too. Depending on the prices of the 120Hz monitors, I might order this card. A 400MHz pixel clock cap would suck. I don't think this card has it, given its more advanced SLI and the fact that it's aimed at 3D and multi-monitor setups. One can hope; hopefully someone orders one and tests it out.

I know Scribby is hoping to get his hands on one. Personally, I don't understand the card. It's priced at exactly 2x what the GTX 680 costs, and I don't remember dual-GPU cards ever costing exactly double. But I guess when your 680s are flying off the shelves that quickly, you can do whatever you like with the even rarer 690.

If being a dual-GPU board has somehow resolved the 400MHz pixel clock limitation, however, I can see it being quite advantageous in that regard. Let's hope Scribby can pick one up and test it. ;)

x980
05-03-2012, 05:41 PM
It doesn't make sense to me either, as these cards never overclock as well as two separate cards. I guess it's good for the space-limited crowd that wants to do quad SLI but doesn't want the big monster case a 4x SLI setup requires. The other problem I have is that 4GB voltage-unlocked cards will start to roll out in about 3 weeks, and they will be overclocking beasts; the Zotac one broke 1400 on air! I would be happy with two that would do 1300ish.

hikikotaku
05-03-2012, 06:27 PM
The micro-stutter on the 690 is a lot better than on the 680; it also runs cooler and quieter, and hits the same speed overclocked.
Also, three dual-link DVI outputs are perfect for three 120Hz Catleaps.

whitespider
05-03-2012, 11:07 PM
I know Scribby is hoping to get his hands on one. Personally, I don't understand the card. It's priced at exactly 2x what the GTX 680 costs, and I don't remember dual-GPU cards ever costing exactly double. But I guess when your 680s are flying off the shelves that quickly, you can do whatever you like with the even rarer 690.

If being a dual-GPU board has somehow resolved the 400MHz pixel clock limitation, however, I can see it being quite advantageous in that regard. Let's hope Scribby can pick one up and test it. ;)

Well, it's a bunch of things, really. They released it very close to the GTX 680's timeframe; it's the first NVIDIA dual-GPU card to be pretty much 1:1 with a 680 on a per-GPU basis; the reviews seem to indicate it can overclock as well as, or only slightly worse than, two individual GTX 680s; and it's got hardware-based frame metering as an exclusive feature for those who want near-single-card smoothness. It also looks really nice (not that I care how it looks; I just like smoothness in framerate), and it's got a better cooler (I don't care about this either, but some do).

These smaller things all add up. Except for the hardware frame metering: for me personally, that's a HUGE thing. I am definitely one of those people who can perceive microscopic frame shifts (as I have found out the hard way with my 6990). All in all, I still don't think it's good value, even for those who can afford it.

I think I am becoming a fan. I just need more than the review I linked, with its Serious Sam microstutter levels (or mikroruckler levels, lol), to confirm they have better microstutter mitigation than GTX 680 SLI configs (plus, as the review said, they were using a newer driver for the GTX 690). If it pans out, and the 690 does in fact present more even frames across a wide range of games, then consider me sold. I just hope it's still being sold when I can finally afford it.

P.S. That's one of the reasons I am so interested in the 120Hz Catleap monitor: I am very sensitive to framerate/smoothness.

120HzNET
05-03-2012, 11:29 PM
Well, I had the 690 in my cart this morning, all ready to press "PLACE ORDER", and didn't. Why? I couldn't tell you. Something in my gut said this isn't the card I really want. Now don't get me wrong, I DO want it, but for all the wrong reasons: I find it so damn sexy. I mean seriously, that is a great-looking card! I build my systems not only for function, but for form. I like the aesthetics of the build, with the technical pieces and flash out there to see. The 690 was perfect in that regard.

However, I love to SLI on water. That goes to the look thing, but it also serves a serious function for how I use my home system. The 690, while I could buy two, would kill me, since I would gut them, throw on some water blocks, and really kill the form aspect of the build (at least for the emotional side of me, which is why I want a 690 in the first place: looks and performance).

So where does that leave me? No idea. Right now my index shopping finger is itching for two 4GB 680 Classifieds. To me, that will be the best of form and function given Tiamat's current build-out (and with me deciding once and for all to ignore Ivy).

whitespider
05-05-2012, 06:13 AM
http://www.tweaktown.com/articles/4710/nvidia_geforce_gtx_670_2gb_video_card_performance_preview/index15.html

I wonder if the GTX 670 will be able to reach 120Hz like the 680 can, or if those disabled clusters will impact the maximum pixel clock.

darklunatica
05-06-2012, 10:46 PM
Guys, between the GeForce GTX 680 JetStream 4GB and the Asus GTX 680 DirectCU II 2GB, which one should I get if I want a higher refresh rate at max resolution?

The Asus runs at a much lower temperature, but the Palit has 4GB of VRAM... any input on this?

edit: hmmm, according to videocardz.com and TweakTown:
"The PALIT GeForce GTX 680 4GB JetStream is in fact a very good solution. But, as we predicted earlier, 4GB won't have a huge effect on your framerates; it will, however, provide slightly better performance at high resolutions."

So should I go for the 4GB card instead? :x

Shadman
05-06-2012, 11:16 PM
Guys, between the GeForce GTX 680 JetStream 4GB and the Asus GTX 680 DirectCU II 2GB, which one should I get if I want a higher refresh rate at max resolution?

The Asus runs at a much lower temperature, but the Palit has 4GB of VRAM... any input on this?

edit: hmmm, according to videocardz.com and TweakTown:
"The PALIT GeForce GTX 680 4GB JetStream is in fact a very good solution. But, as we predicted earlier, 4GB won't have a huge effect on your framerates; it will, however, provide slightly better performance at high resolutions."

So should I go for the 4GB card instead? :x
With these monitors? Oh hell yeah!

darklunatica
05-06-2012, 11:22 PM
Hmmm, but I am in no way going for Surround (my desk is too small for it).
However, since the price difference between the two cards in Singapore is just a few dollars, perhaps it would be wise for me to get the Palit 4GB... :x

GranTurismo
05-09-2012, 09:09 AM
Funny: like Scribby, I had a 690 in my cart at http://www.dabs.ie/products/evga-geforce-gtx-690-915mhz-4gb-pci-express-3-0-hdmi-82PB.html?q=gtx%20690&src=16

all ready to buy this morning, and I chickened out. I bought a 590 (and a 1200W PSU, as my 700W Corsair wouldn't cut it) last October for BF3. I want to hit Catleap native res at 100fps with Ultra on in BF3 (it's all I play), and I'm not even sure I'll get that with a 590, but adding another in a few months might do the trick.

Anyway, one question I need answered before I take the plunge on a 690: how much is my 590 worth second-hand? I had a GTX 295 before that and gave it to a friend, but I'm not sure my generosity will stretch quite that far this time around.

GranTurismo
05-09-2012, 09:11 AM
Of course, I could get a second-hand 590 and add it in, but where would that leave me FPS-wise?

Sn0_Man
05-09-2012, 09:43 AM
A quick check online says GTX 590 SLI is about 70% faster than a single GTX 590 at 2560x1600 (a comparable resolution). So take whatever you get with your 590 and add 70%. What's that? You don't have a monitor to test with? Google tells me you should be looking at around 50fps for a 590 at this resolution, assuming max settings (maybe excluding HBAO/SSAO; I don't know much about those options). So if you play around with your AA and ambient occlusion settings, I'm willing to bet two 590s would put you at 80+fps, maybe 90+.

Is that enough? Well, that is for you to decide. Sorry, it's the best I can do.
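The estimate above is just arithmetic; here it is as a tiny sketch, where the ~50fps single-card baseline and the ~70% SLI scaling figure are the rough numbers from this post, not benchmarks:

```python
# Back-of-the-envelope SLI estimate from the numbers above. Both the
# ~50fps single-card baseline and the ~70% second-card scaling are
# rough figures from online searches, not measurements.

def sli_estimate(single_card_fps, scaling=0.70):
    """Projected framerate after adding a second identical card in SLI."""
    return single_card_fps * (1 + scaling)

print(sli_estimate(50))  # 85.0 -> in the "80+, maybe 90+" ballpark
```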

whitespider
05-09-2012, 09:59 AM
GranTurismo

The 6990 and 590 are about to decline rapidly in value, however, so it might be a good idea to switch. Then again, I can't say when the 685/7xx cards are coming out. Next year? Later this year?

The fact is, you could just drop a whole bunch of settings in games and get close to that fps: use FXAA rather than standard AA, things like that. Or you could get a 690, which will definitely give you better performance (once you have 4x GPUs, things become messy). Another question is whether the next single-GPU NVIDIA card will be powerful enough to outperform a 690.

Personally? I am going to upgrade from my 6990 to 670 SLI; it should be 80% faster than my current, already powerful GPU. I do have the extra incentive of getting the bonus Hz, however. You already have that.

Sn0_Man
05-09-2012, 10:03 AM
FYI, both the 6990 and the 590 *should* hit 100Hz. The 670 would hit 120Hz, but if you SLI them you're stuck at 100Hz again, unless NVIDIA makes some big SLI driver changes.

whitespider
05-09-2012, 10:52 AM
FYI, both the 6990 and the 590 *should* hit 100Hz. The 670 would hit 120Hz, but if you SLI them you're stuck at 100Hz again, unless NVIDIA makes some big SLI driver changes.

No way will my 6990 hit 100Hz. :( I mean... I would obviously love it if it did. However, it's essentially a CrossFire 6970, and CrossFire seems to be a bit of a problem. Time will tell; for now I will keep my expectations in check. 85Hz is the target; 100Hz would be a nice surprise.

Sn0_Man
05-09-2012, 11:10 AM
Well then, I won't get your hopes up, but I do strongly recommend that you at least test the monitor with your 6990 before selling it :)

For Science, if nothing else.

goji
05-09-2012, 11:30 AM
Doesn't FXAA blurring the screen bother you? I'd rather play with no AA than FXAA.

whitespider
05-09-2012, 11:50 AM
Doesn't FXAA blurring the screen bother you? I'd rather play with no AA than FXAA.

I was originally against FXAA for that very reason; however, I have found it actually improves the general shading of most game engines. You lose some texture filtering, but at no point is the majority of the detail lost (unlike ENB); your pixels are still being respected. It's just performing a more scene-wide sweep. It's kind of like dropping anisotropic filtering from 16x to an effective 7-8x; in exchange, the scene seems... deeper somehow.

At 2560x1440, the resolution we are all aiming for, the blur is far more minimal as well, at least in my experience with my U2711.

For me, it's like enabling an extra layer of post-processing (that's essentially what it is): games that have weak vanilla post-processing benefit from it, while games whose engines already use a lot inherently don't.
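The "edge-aware post-processing layer" idea can be shown in toy form. This is a deliberately simplified, hypothetical sketch on a grayscale grid; real FXAA (Lottes' algorithm) works on luma with sub-pixel logic and is far more involved, but the trade-off is the same: only pixels that look like edges get softened, which is exactly where the slight loss of sharpness comes from.

```python
# Toy illustration of an FXAA-style pass: detect high local contrast
# (a likely aliased edge) and blend those pixels with their neighbors.
# A simplified sketch, not the real FXAA algorithm.

def fxaa_like(img, threshold=0.25):
    """Blend high-contrast pixels of a grayscale image with neighbors."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            n = [img[y-1][x], img[y+1][x], img[y][x-1], img[y][x+1]]
            if max(n) - min(n) > threshold:             # looks like an edge
                out[y][x] = (img[y][x] + sum(n)) / 5.0  # soften it
    return out

# A hard vertical edge gets smoothed; flat regions are left untouched.
edge = [[0.0, 0.0, 1.0, 1.0] for _ in range(4)]
smoothed = fxaa_like(edge)
```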

HyperMatrix
05-09-2012, 04:07 PM
I haven't noticed FXAA blurring the screen. It may be happening; I just never noticed it. Except when I have MSN Messenger open, because FXAA kicks in and blurs the hell out of my text. But it doesn't do the same to text in games, etc., which goes back to me not noticing any blurring.

GranTurismo
05-09-2012, 05:41 PM
I don't know what to do then. Do you reckon I'd be able to sell my 590 for $400? I'm not pushed about 120Hz; 101Hz will do nicely.

HyperMatrix
05-09-2012, 06:02 PM
I don't know what to do then. Do you reckon I'd be able to sell my 590 for $400? I'm not pushed about 120Hz; 101Hz will do nicely.

Easily. Look here: http://www.ebay.com/itm/EVGA-GeForce-GTX-590-Classified-03G-P3-1598-AR-/270969362623?pt=PCC_Video_TV_Cards&hash=item3f170858bf

But going from a 590... I'd just suck it up for a few months until the 780s come out, because if you want to max out graphics and get max framerate, you need to do one of the following:

2x GTX 690 = $2000 and brute force your way with current technology

or

2x GTX 780 3GB/6GB + i7 3770K + new mobo to support/take advantage of the 512-bit bus and PCIe 3.0 = $1800, with a few months of waiting.

Though... I do have two GTX 680s now, lol. But I'll be selling/upgrading once the 780s come out. Those cards should finally be able to handle any game maxed out at 120fps (assuming NVIDIA fixes the SLI 100Hz limit by then!).

120HzNET
05-09-2012, 09:09 PM
I have a buddy with a 690 setup; we are running a few tests on my 2B this weekend to see what is what.

whitespider
05-10-2012, 02:50 AM
I have a buddy with a 690 setup; we are running a few tests on my 2B this weekend to see what is what.

Sweet. Keep us updated!

GranTurismo
05-10-2012, 05:42 AM
Thanks, guys. I've seen what people ask for them on eBay, but most are new. The thought of whacking in a second 590 and chewing up all that power (700W wasn't enough for my system plus one card) seems like a noisy way to hike up my energy bills... lol.

I'd be anxious to see those 690 results, Scribby.

whitespider
05-10-2012, 05:21 PM
I didn't want to make a whole new thread about this 690 review, so I thought I would post it here instead. This is by far the most impressive review I have ever read: they ACTUALLY FOCUS ON MICROSTUTTER (frame times) in EVERY SINGLE GAME TESTED.

Look at how terribly my 6990 performs. No freaking wonder I have been complaining about MS for so long; everyone was telling me that I was "imagining it".

http://techreport.com/articles.x/22922/1

The 690 has the best micro-stutter mitigation ever, as well. That hardware-level frame metering was NOT just marketing jargon; it seems VERY real.

bandite
05-24-2012, 08:55 AM
I'm a bit stunned! I got my GTX 690 today. Plugged it in and installed the 301.42 WHQL driver. Opened the NVIDIA Control Panel and OC'd to 100Hz, no problem of course. Then I thought, what the hell, I must try 120Hz just for fun... and it works?! I thought 100Hz was the maximum for SLI implementations? I'm not joking; see the screenshot:

http://img715.imageshack.us/img715/1930/690at120hz.jpg (http://imageshack.us/photo/my-images/715/690at120hz.jpg/)

Could it be that I'm only running on one GPU?

bandite
05-24-2012, 09:45 AM
It turns out that I was running on just one GPU, BUT... when I enabled the other one, I was still able to do 120Hz! Although the system became very "choppy": everything freezes for a few seconds, then returns to normal, then freezes again, and so on. Completely unusable, but it was still 120Hz with both GPUs enabled. I had to restart to get rid of the freezing. I'm now at 100Hz with both GPUs enabled.

HyperMatrix
05-24-2012, 04:31 PM
It turns out that I was running on just one GPU, BUT... when I enabled the other one, I was still able to do 120Hz! Although the system became very "choppy": everything freezes for a few seconds, then returns to normal, then freezes again, and so on. Completely unusable, but it was still 120Hz with both GPUs enabled. I had to restart to get rid of the freezing. I'm now at 100Hz with both GPUs enabled.

Yes, this is what happens with all SLI builds that go over a 400MHz pixel clock.

bandite
05-25-2012, 01:17 AM
Ok I see. This is my first multi-gpu setup so I have much to learn. :)

Syan48306
07-29-2012, 11:14 AM
I guess rather than starting a whole separate thread, I'd just ask here.

I've got a GTX 690, and I get that it's limited to 100Hz due to the pixel clock limitation. What I don't understand is why it, and all SLI setups, have trouble with the overclock yet are perfectly fine on true 1080p 120Hz monitors. If they can handle those 120Hz monitors, what's fundamentally different about the Catleaps?

jedi95
07-29-2012, 02:01 PM
I guess rather than starting a whole separate thread, I'd just ask here.

I've got a GTX 690, and I get that it's limited to 100Hz due to the pixel clock limitation. What I don't understand is why it, and all SLI setups, have trouble with the overclock yet are perfectly fine on true 1080p 120Hz monitors. If they can handle those 120Hz monitors, what's fundamentally different about the Catleaps?

The limit is not "100Hz" directly; it's a 400MHz pixel clock. At 2560x1440 that just happens to work out to 100Hz. At 1920x1080 it works out to something like 180Hz, so you don't run into this problem running 120Hz on a 1080p display.
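Those numbers can be sanity-checked. Refresh rate is pixel clock divided by the *total* pixel count per frame, blanking included; the totals below assume CVT reduced-blanking timings, so actual monitors (and EDID overrides) will differ slightly:

```python
# Sanity check of the 400MHz pixel clock ceiling. Refresh rate is
# pixel_clock / (h_total * v_total), where the totals include the
# blanking intervals. The totals below assume CVT reduced-blanking
# timings (an approximation; exact figures depend on the monitor).

PIXEL_CLOCK_CAP = 400_000_000  # Hz, the SLI driver limit

def max_refresh_hz(h_total, v_total, clock=PIXEL_CLOCK_CAP):
    """Highest refresh rate a given pixel clock can drive."""
    return clock / (h_total * v_total)

print(round(max_refresh_hz(2720, 1481)))  # 2560x1440: ~99 -> the "100Hz" cap
print(round(max_refresh_hz(2080, 1111)))  # 1920x1080: ~173 -> 120Hz is easy
```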

necriss
07-29-2012, 11:10 PM
It's a driver limitation: beta driver versions cap out around 85Hz (a ~330MHz pixel clock), and NVIDIA can set this value at will, which means it could be removed with some hex editing, in a similar fashion to ToastyX's AMD patcher.
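In spirit, a patcher like the one described above only has to find and rewrite a constant in the driver binary. Everything concrete below is a guess for illustration: the kHz unit, the plain little-endian 32-bit encoding, and the file name are all assumptions, since nobody in this thread has located the real value in NVIDIA's driver.

```python
# Hypothetical sketch of a ToastyX-style patch: locate a hard-coded
# pixel clock cap in a driver binary and raise it. The kHz unit, the
# little-endian 32-bit encoding, and the file name are assumptions
# made for illustration, not known facts about NVIDIA's driver.
import struct

OLD_CAP = struct.pack("<I", 400_000)  # assumed: 400MHz cap stored in kHz
NEW_CAP = struct.pack("<I", 500_000)  # assumed: raised 500MHz cap

def patch_pixel_clock_cap(blob: bytes) -> bytes:
    """Replace every occurrence of the old cap constant in the blob."""
    return blob.replace(OLD_CAP, NEW_CAP)

# Hypothetical usage against a copy of the driver file:
# data = open("nvdriver.dll", "rb").read()
# open("nvdriver_patched.dll", "wb").write(patch_pixel_clock_cap(data))
```

In practice this is exactly why such patches break on every driver update: the constant (or its surroundings) moves, and the patcher has to be re-targeted.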

Syan48306
07-30-2012, 08:19 AM
Well, has anyone had any luck with the hex editing, i.e. enabling 120Hz on an SLI 680 setup?

Sneaky
07-30-2012, 08:26 AM
Well, has anyone had any luck with the hex editing, i.e. enabling 120Hz on an SLI 680 setup?
No, not yet. Maybe never, unless NVIDIA fixes it.
I said maybe...

OskyATL
09-15-2012, 02:56 AM
I'm always confused by discussions that focus on GPUs and CPUs in the $1000 price range. Am I wrong to be under the impression that the Radeon 7970 focuses on smarter GPU processing to better handle higher resolutions and settings, and that the 690's approach is brute force, basically an SLI configuration on a single card? It seems like a waste of money, considering that within a year's time there will probably be cards matching its performance while being drastically cheaper. Yeah, you can buy the brute force now and have an artificial sense of being on the edge of technology, but if the Radeon's performance is viable, game manufacturers will jump on it and NVIDIA will be playing catch-up. I enjoy the idea that two 680s can compete with or even outperform the 690.

That's not to say it makes any sense to get two 7970s in anticipation of them being supported, either. It's simply an inversion of the same problem as the 690.

Am I wrong to assume that ensuring you're on the actual edge of this technology means being in the right place at the right time, spending a moderately inflated price right before the market changes to support the technology?

Of course, I could be wrong about all of this.

With that being said, I would place my bets on the Radeon 7970's future iterations. The community here seems poised to latch on to the next big thing that will handle 1440p at 120Hz and won't have as many compatibility issues.

After a long time away from the tech community, I feel like I'm in the same place technology always is: chasing the Red Queen. Not wanting to jump into technology too soon and feel the penalty in my wallet, with its excessive diminishing returns come the following year; nor wanting to buy at the tail end of exiting technology; instead, trying to find the fuzzy warm middle where I can ensure I'll be comfy for at least a few years.

Now, if I could just figure out when and what to buy.

edit: just read Hyper's post, so I think I'll be going to look at some 980's :D
If I made any errors, I'd appreciate someone correcting me.

HyperMatrix
09-15-2012, 03:32 AM
Some 980's? I'm confused. But the current generation of cards is adequate in SLI. The 780s will provide that extra 25%-75% performance boost over the 680, which will be amazing. The boost depends on what size die they end up using in the 780 design. They probably don't want to do a 75% performance boost when they can get away with a 25-33% boost at much less cost per card, while saving their big boost for yet another card.

I'd personally get a couple of 670s, OC them, then swap to 780s when they come out; you shouldn't lose much $$ on the trade-in. SLI 780s will take care of you on the GPU side of things. As for whether your CPU will bottleneck your GPU in some games... well, don't expect a big CPU boost any time soon. Damn MMOs.

OskyATL
09-15-2012, 04:19 AM
Some 980's? I'm confused. But the current generation of cards is adequate in SLI. The 780s will provide that extra 25%-75% performance boost over the 680, which will be amazing. The boost depends on what size die they end up using in the 780 design. They probably don't want to do a 75% performance boost when they can get away with a 25-33% boost at much less cost per card, while saving their big boost for yet another card.

I'd personally get a couple of 670s, OC them, then swap to 780s when they come out; you shouldn't lose much $$ on the trade-in. SLI 780s will take care of you on the GPU side of things. As for whether your CPU will bottleneck your GPU in some games... well, don't expect a big CPU boost any time soon. Damn MMOs.

Sorry, I meant 780's. Any word on the price range?

BTW... the more I look, the more I see people bashing Tom's Hardware. Didn't they used to be good?

HyperMatrix
09-15-2012, 04:36 AM
Sorry, I meant 780's. Any word on the price range?

BTW... the more I look, the more I see people bashing Tom's Hardware. Didn't they used to be good?

Price shouldn't be more than the same $500-$650 range you see on the 680 cards.