Widescreen Gaming Forum

A web community dedicated to ensuring PC games run properly on your tablet, netbook, personal computer, HDTV and multi-monitor gaming rig.

All times are UTC [ DST ]




 Post subject: ATI Eyefinity in action
PostPosted: 10 Sep 2009, 18:19 
Editors

Joined: 10 Jun 2005, 21:24
Posts: 1371
http://www.youtube.com/watch?v=dkwVw-azZ0M

Sweet! :D

Looks like it's been made private. Here's a picture:



AnandTech's review is up:
http://www.anandtech.com/video/showdoc.aspx?i=3635

That's six Dell 30" displays, each with an individual resolution of 2560 x 1600. The game is World of Warcraft, and the man crouched in front of the setup is Carrell Killebrew; his name may sound familiar.

Driving all of this is AMD's next-generation GPU, which will be announced later this month. I didn't leave out any letters; there's a single GPU driving all of these panels. The actual rendered resolution is 7680 x 3200; WoW got over 80 fps with the details maxed.
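For the curious, the numbers in that quote do add up. Here's a quick sketch (my own illustration, not from the article) of how a 3x2 grid of 2560x1600 panels becomes one big surface:

```python
def sls_resolution(cols, rows, panel_w, panel_h):
    """Total Single Large Surface resolution for a cols x rows grid
    of identical panels (bezel compensation ignored)."""
    return cols * panel_w, rows * panel_h

w, h = sls_resolution(3, 2, 2560, 1600)
print(w, h)                    # 7680 3200
print(round(w * h / 1e6, 1))   # 24.6 megapixels
```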



 Post subject: ATI Eyefinity in action
PostPosted: 10 Sep 2009, 21:29 

Joined: 24 Jun 2005, 22:58
Posts: 1045
Yeah, I noticed mention of native triple-screen support when I was looking up when the 5800 series would be released. But six!? Even more impressive :D

Not that I'll ever be using the feature, I barely have room for the two screens over here.


Btw, I've already decided the HD 5870 is first on my wishlist atm, for when I get a new PC.


 Post subject: ATI Eyefinity in action
PostPosted: 10 Sep 2009, 21:51 

Joined: 28 Jun 2009, 22:17
Posts: 760
I just wish ATI would unlock their Catalyst AI internal profiles, so that users could build any profile they want, à la nHancer.


 Post subject: ATI Eyefinity in action
PostPosted: 10 Sep 2009, 23:56 

Joined: 14 Nov 2006, 15:48
Posts: 2356
OH MY!!!!!!!!
If a game pulls its resolution list from Windows, it'll work perfectly with Eyefinity.
^ So basically all TH2G games will work with it. This is GREAT news for us.



AMAZING. AMAZING.

I WANT THIS PROJECTOR.



http://hardocp.com/article/2009/09/09/amd_next_generation_ati_radeon_eyefinity_technology

The software layer makes it all seamless. The displays appear independent until you turn on SLS mode (Single Large Surface). When on, they'll appear to Windows and its applications as one large, high resolution display. There's no multimonitor mess to deal with, it just works. This is the way to do multi-monitor, both for work and games.


First person shooters pretty much dictate that you'll need an odd number of displays to avoid your crosshairs spanning multiple monitors. With three displays you can begin to get the immersion effect, but buy five and you'll be completely surrounded by your game. And as I mentioned before, it doesn't require any special application or OS support, the drivers take care of everything: it just appears as a single, large, surface.
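The crosshair point is easy to see with a bit of arithmetic. A toy sketch (mine, not from the article): with an even number of columns the centre of the combined surface lands exactly on a bezel seam, while an odd count puts it in the middle of a panel:

```python
def crosshair_column(cols, panel_w):
    """Index of the panel containing the horizontal centre of the
    combined surface, plus whether that centre lands on a bezel seam."""
    center_x = cols * panel_w / 2
    on_seam = center_x % panel_w == 0
    return int(center_x // panel_w), on_seam

print(crosshair_column(3, 2560))  # (1, False) - centre of the middle panel
print(crosshair_column(2, 2560))  # (1, True)  - dead on the bezel
```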


 Post subject: ATI Eyefinity in action
PostPosted: 11 Sep 2009, 01:21 

Joined: 02 Jan 2006, 18:49
Posts: 913
LOL, ATI needs to sort out whether they want to fit the budget gaming niche or the elite crowd, especially now that Intel has what appears to be a definitive Phenom killer in the i5.

I also feel AnandTech are not the go-to place for tech advice that many seem to think, especially concerning their recent glowing review of the new i5/i7 CPUs. They had this to say about the new 1156 CPUs...

"I'm going to go ahead and say it right now, there's no need for any LGA-1366 processors slower than a Core i7 965."

...but in reality this comment from a poster on Bjorn3D makes more sense...

"The i5 750 is a good CPU but all those charts show it barely topping a 965 running at stock 3.2 with turbo boost disabled.

920 can hit 3.2 without breaking a sweat, get it to about 3.4 3.6 and it's outrunning the i5 750 when the 750 is maxed out thermally speaking."


This was in answer to reviews showing the i5 benched at 3.85GHz, where it's thermally maxed. AnandTech doesn't seem to acknowledge that this, plus the better upgrade path of socket 1366 (which will fit the upcoming hex-core i9s), still makes it very worthwhile to purchase the 1366 i7s they so quickly dismiss as obsolete.


 Post subject: ATI Eyefinity in action
PostPosted: 11 Sep 2009, 12:04 

Joined: 14 Nov 2006, 15:48
Posts: 2356
More good info from our friends over at PCGames Hardware!

http://www.pcgameshardware.com/aid,694855/Ati-Eyefinity-Extreme-gaming-with-24-displays-Crysis-and-WoW-at-7-680-x-3-200-pixels-246-megapixels/News/


 Post subject: ATI Eyefinity in action
PostPosted: 11 Sep 2009, 12:50 
Editors

Joined: 14 Oct 2003, 13:52
Posts: 5706
LOL, ATI needs to sort out whether they want to fit the budget gaming market niche or the elite crowd, esp now that Intel has what appears to be a definitive Phenom killer in the i5.

Why are the two mutually exclusive? They've got both covered anyway.

Just because EyeFinity can do better than TH2Go without (apparently) too many problems, doesn't mean ATi are abandoning single monitor gaming, or the low end. GPU and CPU divisions are separate, don't forget. There is nothing stopping ATi from running these insane monitor-count setups on an i7 system; I know I intend to do it with my X58 rig, rather than run a Phenom II. Although I'll admit it's likely they'll market it with a "Runs best on Phenom II" tagline (probably when they get the hexa-core CPUs to desktop, or possibly the 12-cores; some posts on XtremeSystems show a 2.2GHz 12-core Opteron hitting 3.2GHz without much trouble at all...)

Regardless of that, i5 is anything but a Phenom II killer for one very important reason: multi-GPU. The single PCI-E gen 1.1 16x link that the onboard PCI-E controller can provide will kill GPU bandwidth in any scenario with more than one GPU. Multi-GPU is a scenario where Phenom II is ahead of Core i5, and nipping at the heels of Core i7. Sure, i7 can encode a lot faster, but for pure gamers there is no reason to avoid Phenom II.

But basically, anyone who wants this sort of ultra-high-end rig is going to buy it. It'll probably keep AMD alive if Bulldozer isn't the killer they're making it out to be. Anyone who doesn't want ridiculous numbers of screens is going to go with either ATi or nVidia, as they're not going to care.

The fact that a single 5870 can drive six screens, and apparently at playable framerates, is going to make nVidia nervous, I bet. If RV870/890 is as good as that, GT300 is going to have to be stormingly good, or nVidia are going to have to cut their own throats to get them to sell (I can't see a huge monolithic die like GT300 being cheap to make, nor can I see yields being good), much like the huge price cuts shortly after the GT200 launch, when ATi's cards were 90% of the speed for 50-60% of the price.

That being said, ATi sure are gonna have to get their drivers 100% sorted out... I'd hate to try troubleshooting driver issues across 6 to 24 screens. Furthermore, that six-screen 5870 had better come with all six of those Mini DisplayPort to DVI adaptors in the box, or there's gonna be trouble. If that 24-screen, four-5870 monster can be run off a 1200W PSU, that says good things for the power draw of RV870, too; maybe they finally got that GDDR5 idle power draw issue under control. Especially if the card is supposed to draw <190w loaded and <30w idle. That would be really amazing. Now, if Stanford could get Folding to run well on ATi hardware again (remember, with the X1900 series it would run on ATi cards at a time when it wouldn't run on nVidia cards), this gets even more exciting.
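Quick sanity check on those power numbers (the 190w load figure is the claimed per-card number; the headroom split is my own back-of-envelope guess, not anything measured):

```python
# Back-of-envelope power budget for the rumoured 24-screen, quad-5870 rig.
cards = 4
load_per_card_w = 190   # claimed worst-case board power per 5870
psu_w = 1200            # PSU said to run the whole rig

gpu_total_w = cards * load_per_card_w
headroom_w = psu_w - gpu_total_w
print(gpu_total_w, headroom_w)  # 760 440 - plenty left for CPU, board, drives
```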

EyeFinity will be amazing for lots of things other than games, though... I think nVidia's screen-spanning SLI on their Quadro cards just got a serious competitor in the CAD/CAM and medical fields. EyeFinity might just force nVidia to unlock their Quadro spanning-SLI tech on normal cards as well, rather than making people buy crazily priced cards.


 Post subject: ATI Eyefinity in action
PostPosted: 11 Sep 2009, 12:59 
Editors

Joined: 30 May 2005, 14:21
Posts: 964
Where is nVidia while all this has been happening... lol...

Also, is that DisplayPort they're using?

_________________
I still visit occasionally.


 Post subject: ATI Eyefinity in action
PostPosted: 11 Sep 2009, 13:43 
Editors

Joined: 14 Oct 2003, 13:52
Posts: 5706
Mini DisplayPort, yeah.

nVidia seem to have been arguing with Intel about QPI licensing and worrying about GT300.

To be fair, though... ATi kept EyeFinity damned quiet until a very short time ago. It seems they've managed to put a lid on the leaks they used to have...


 Post subject: ATI Eyefinity in action
PostPosted: 12 Sep 2009, 02:27 

Joined: 02 Jan 2006, 18:49
Posts: 913
Just because EyeFinity can do better than TH2Go without (apparently) too many problems, doesn't mean ATi are abandoning single monitor gaming, or the low end.
Having an X1950 Pro that they recently dropped from support to focus on other GPUs, I'm definitely not seeing it as you do. It's apparent to me they have their priorities out of whack. I suspect they got some investment money from corporations wanting them to make this Eyefinity tech, because it's near useless to their consumer base, nor do I see their corporate base making them any more money long-term than their consumer base does, as some have implied. What's worse is that consumers are being easily impressed with this tech, even though the vast majority will never be able to afford such a setup. I say spend the time and money making something most consumer gamers can actually use, especially when they're dropping support in some places.

So where is Nvidia in all this? They're probably seeing it as an even bigger waste of R&D than acquiring and pushing PhysX was. Lo and behold, the much-maligned MS will come along with DX11 and set things back on the path of practicality, because that and W7 are what is going to help consumer gaming most in the months to come. DX11 and W7 will do far more for multithreading than either camp's CPUs, but I think you seriously understated the i5, even though I'm passing it over for a 920 (they can be had for $200 now).

Benches show the i5, when OCed, beating even a stock 965. Most reviewers are calling it trouble for AMD's Phenom. Also, the on-chip PCI-E controller can run two PCI-E slots at 8x each simultaneously. That's easily enough to feed two top-shelf single-GPU cards to their fullest with NO CPU/GPU lag.

In reality, though, the on-chip PCI-E controller and P55's PCH are really just cheaper ways to build in speed than QPI, Hyper-Threading and the D0 stepping, all of which keep the i7 900 series well ahead of the new 1156 chips when it comes to thermal threshold, OCing and overall speed potential. The truth is you have to push an i5 to its thermal limit, which is much lower than the 920's, just to beat the 3.2GHz 965. A 920 could do that without breaking a sweat.

There's also the fact that the 900 series, being on socket 1366, is a much more future-ready platform, as the hex-core i9s are slated for it. So the i5 and the new 800 series can easily compete with and outdo the Phenoms in a similar price range, and Intel has their top-flight platform, which is miles ahead too, no competition.






Powered by phpBB® Forum Software © phpBB Group