PDA

View Full Version : Going green and still unsure of steps needed



benzite
10-29-2013, 03:27 PM
I need some advice on switching over to Nvidia from an AMD card.
With the 290X GPU's noise level being totally unacceptable, I am going with the GTX 780 or the 780 Ti, which comes out very soon.
What would be the proper steps to take to make the switch?
I am not too sure how to proceed now that my monitor is overclocked.
I am using a Qnix at 120Hz with ToastyX's AMD patcher.
Using Windows 7 as well.

With the new price drops on the 780, I guess it's a no-brainer at this point to pull the trigger.
Thanks all.

BlackOctagon
10-29-2013, 04:08 PM
Hey there. Firstly, for my own curiosity can I ask if you have actually bought and used a 290x?

Secondly, here's what I would do:
1) Set monitor to 60Hz, remove all custom resolutions except 60Hz, unpatch the drivers and restart
2) Uninstall AMD drivers using Atiman Uninstaller (http://www.mediafire.com/download/0jdko53gk5npzo0/Atiman+Uninstaller+v.7.0.2.msi)*
3) After it's done its thing and restarted, run a pass of CCleaner (including the registry cleaner function)
4) Install latest NVIDIA drivers
5) OC with EVGA's PrecisionX, which is meant to be very straightforward
6) If for some reason PrecisionX does not work, try the 'old fashioned' way using ToastyX's various programs from MonitorTests.com

*Note: If you have an AMD CPU, this might not be the best idea because it could affect your chipset drivers. In that case, just uninstall the normal way and then run Driver Sweeper

benzite
10-29-2013, 05:11 PM
I personally did not, but someone I know does, and he seems to complain a lot about the noise level.
I just do not have the heart to agree with him.
I am leaning towards the EVGA ACX 780, so I would imagine that Precision X would work perfectly with it.
I did not know that it could be used to OC the monitor itself, unless I misunderstood.

In any case thanks for the advice.

HyperMatrix
10-30-2013, 01:41 AM
The 290X is almost the single best GPU out there right now. But if you're going to SLI, I'd still go with the Nvidia cards, especially after the recent price cuts; AMD is still having issues with microstutter. And the biggest seller for me with Nvidia right now is ShadowPlay. Hardware-encoded game capture with almost no performance hit is a godsend... I did some tests and I don't think I dropped more than 1 or 2 fps while recording 1080p at 60fps, with Batman: Arkham Origins at 2560x1440 and everything completely maxed out.

benzite
11-27-2013, 06:57 PM
A big thanks to you both.
Had held off for a while in hopes of seeing some new 290X cards with aftermarket coolers before making the move.
Sadly, none.
I did jump on the EVGA 780, and boy was I caught off guard by how simple it was to OC the monitor using Precision X.
The price cuts made it an easy decision.
Thanks again.

BlackOctagon
11-28-2013, 01:15 AM
Enjoy. Yeah, I really struggle to see the logic in the timing of the 290X release. It's fine to have a small wait between reference and AIB models, but this is too long. I feel like the hype has already died down. Could be wrong, though. Maybe they're aiming at the Christmas dollar?

Sent from dumbphone (pls excuse typos and dumbness)

Fimconte
11-28-2013, 03:55 PM
Pre-Christmas release would make a lot of sense tbh.

And considering how much the Accelero Xtreme III helped in the Tom's Hardware (http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html) article, I'm salivating at the thought of upgrading to dual 290s.

Shadman
11-28-2013, 05:38 PM
I'd like to know why they wait so long as well. Maybe because "the new cards are supposed to run at 95C"...

We'll see

HyperMatrix
11-28-2013, 07:38 PM
Even the Titans scared me when I saw it was considered "normal" to run them at 85C with the stock cooler. It doesn't matter if the card was designed to run at a certain temp, whatever that may mean; it will still result in faster degradation. The fact that a card with a smaller die than the Titan uses more power than the Titan is scary. On the other side of things... if they're designed to be stable at 95C, I wonder what that means for overclocking capability under water. It could be interesting.

Fimconte
12-06-2013, 09:43 PM
I've seen quite a few posts about watercooled 290s doing 1300 core / 6000-6800+ memory at very reasonable temperatures.

Selling my 7950s (prices are insane due to the LTC hype; I'm losing only 50€ after using them for almost a year) and then either waiting for non-reference 290s or getting a reference 290 and installing an aftermarket cooler on it (one of the Gelid coolers is quite cheap and offers very good results over the stock cooler).

I'm tempted by water as well, but I didn't want to jump on that train before my next system overhaul (DDR4 via Haswell-E or Broadwell).
Will have to see what happens, though.

As for Crossfire microstutter:
Supertile fixes it completely (at the cost of GPU usage being almost always at 99% in newer games, resulting in higher temps and a bigger electric bill), and the Frame Pacing option is getting improved a lot for AFR.

whitespider
12-10-2013, 07:08 PM
Personally I think gsync is way, way, way, way, way bigger than Mantle. Although I really do like Mantle, it's just not in the same ballpark as gsync.

The quality of a frame is immeasurably more important than the QUANTITY of frames. Quality is all about a frame's timing, and the VALUE of that timing is all about the presentation rate, not the output rate, of that frame. Examples:


44fps min framerate on 780 with gsync = 100% render-and-see, thus 100% quality. Minus 3% looks due to ghosting: 97% output quality.

56fps min framerate on 290x with Mantle = 10% render-and-see, due to delay frames (linked to not keeping refresh), and thus 10% quality.

98fps min framerate on 290x Crossfire with Mantle = 15% render-and-see, due to delay frames (linked to not keeping refresh), and thus 15% quality. (It would have been 18%, but imperfect frame pacing gets in the way of mGPU perception on top of the normal problems.)

78fps min framerate on 780 SLI without gsync = 13% render-and-see, due to delay frames (linked to not keeping refresh), and thus 13% quality.


(I cannot speak for SLI + GSYNC, as I am not sure how frame pacing will interact with gsync)



The above is about the perception of smoothness, the thing we measure 'silky smooth' with. Not input rates, which are entirely separate, although gsync is as fast as vsync off in that regard. Naturally, if you had 4000 dollars worth of QUAD SLI goodness, your input rate would be great, because you would be pumping 200fps in even the most demanding games. But that would only be useful if ALL, and I mean ALLLLLL, you cared about was finger press to gunfire. And if that was the case, you would not own an IPS; you would have a 144Hz TN panel with a 1ms reaction time.


So then, what is the POINT of expensive hardware when gsync arrives for IPS? Well, silky smooth is not a SINGLE entity; there are degrees of silky smooth. Each framerate is the ACTUAL framerate, so assuming that SLI frame pacing works well with gsync, giving comparable silky smoothness to a single GPU (unknown until reviews), then ideally having that 98fps on SLI, or 150fps on quad SLI, will produce that much MORE of a silky experience, and it will also remove ghosting. So the improvement will still be there, again assuming SLI works properly with gsync. Whereas even if Crossfire PERFECTS frame pacing, which I hope they do, it won't be in the same world or universe as what gsync can produce, because there will be zero communication from the frames to the refresh of the frames, leaving horrible delays, which create the perception of a low/high framerate hybrid at a constantly high framerate.


Here are some more examples of how SILLY much better I think gsync is than everything else.

GSYNC + 150 dollar (compatible) graphics card >>> smoother than >>> Quad Titans >> In any game

(This is hypothetical.) Say there is a bug in Star Citizen that keeps Nvidia cards from going much over 49fps, no matter the hardware. ANY (compatible) Nvidia solution with gsync would still be smoother than Mantle-enabled Crossfire X at 55-180fps on 290X custom-PCB, 500MHz-overclocked, custom-cooled cards on a 144Hz monitor.





What I am getting at here is that gsync is the monitor-and-graphics-card equivalent of Jesus.

BlackOctagon
12-11-2013, 11:44 AM
AFAIK it is yet to be definitively confirmed that either G-Sync or Mantle 'must' remain limited to one GPU team forever. Maybe for the first generation or two, but after that I would not be surprised to see both technologies available via GPUs from both companies

Sent from dumbphone (pls excuse typos and dumbness)

Shadman
12-11-2013, 03:41 PM
Well, gsync as it currently stands is Nvidia-only, and the kits and the monitors will only work with Nvidia GPUs. If AMD wanted a similar technology, they would need their own monitor kits.

Mantle, on the other hand, Nvidia is free to use, but because Nvidia uses specialized hardware to power their cards while AMD uses highly general computing hardware (compute), it wouldn't be worth using on Nvidia, except maybe on the 780 and up, which it is unlikely they will do.
(Side note: this is why AMD cards are 10x better for bitcoin/litecoin mining)

whitespider
12-11-2013, 07:35 PM
That just means Nvidia needs to find another way to leverage their hardware for the people who don't understand that numbers aren't everything. So perhaps they could do the same thing with CPU-to-GPU as they did with monitor-to-GPU, and make the two work better together. That might give them some numbers across ALL games, rather than just Mantle ones.

Imagine that: Nvidia boosts all CPU-to-GPU performance by xx%, plus gsync.

Shadman
12-11-2013, 09:09 PM
It's one thing to just say "find another way" and another to engineer it. However, I don't think Mantle should be a threat, as Nvidia has many features, i.e. ShadowPlay, integration with Shield, and of course gsync. They might have something else sometime soon, but I feel like those are an OK tradeoff to having higher raw performance, Mantle or not. Like I said, AMD compute kicks ass and is a separate feature altogether as well; it kicks CUDA's butt in several programs.

So the choice just depends on the user's needs.

whitespider
12-12-2013, 04:43 AM
Plus Nvidia has some hidden features that I personally love. Forcing SSAO was always awesome for making older games look newer and far better shaded, and most of you probably don't know that Nvidia recently changed the "high quality" SSAO setting into HBAO+. Before, high quality SSAO was simply high-quality traditional SSAO, but HBAO+ is far less costly (actually about as costly as low-quality SSAO), does not flicker, and looks far less fake.

I ran HBAO+ on Psychonauts, and very little of the game had that blotchy SSAO halo around it. But if you look in the distance, every single object has a little shadowing, which makes the world look far more alive. That was using the Skyrim profile.

Not to mention all the crazy anti-aliasing you can do on an Nvidia card, even in DirectX 11 now. Basically, Nvidia knows drivers. And they let the advanced users willing to tweak do some pretty incredible things with all that extra GPU grunt in older games.

toilaai19923
04-17-2015, 02:45 AM
Great post! Do you have Facebook so we can discuss more easily? :D