Widescreen Gaming Forum

Web community dedicated to ensuring PC games run properly on your tablet, netbook, personal computer, HDTV and multi-monitor gaming rig.

All times are UTC [ DST ]




Post new topic Reply to topic  [ 11 posts ]  Go to page 1, 2  Next
Author Message
PostPosted: 14 Mar 2005, 17:10 
Offline

Joined: 10 Jan 2005, 09:14
Posts: 3
Having just invested in my first LCD and my first wide-aspect display, I've spent a lot of time tweaking and finding the right spot for gaming to my satisfaction. I upgraded from a 21" Sony CRT and 9800XT to a Dell 2405FPW and X850XT PE.

I was initially disappointed by the results I was getting when comparing the CRT to the LCD. My disappointment was due to my sensitivity to tearing. Vsync has improved these results, as has the program described below.

First off, I am using the Catalyst 5.3 drivers with the standard control panel, not CCC. I'm running at 2XAA and 8XAF with vsync set to application preference, at the display's native res of 1920x1200. The display is rated for a 60 Hz refresh rate, but I override the defaults and set XP to 75 Hz. Not sure if this will harm the display; games seem to load more smoothly at 75 Hz, but there's no noticeable performance boost in actual gameplay, just in the initial launch.

In UT2K4, I edited the .ini file to force vsync on. I set the desired refresh rate to 75 Hz, set reduce mouse lag to false, and set my FOV to 110 (this is user preference, but use at least 100). I set the menu and game view to 1920x1200 and all details to high. Also, turning on triple buffering made a significant improvement.
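For anyone who wants to make the same edits, here is a sketch of the relevant entries. The section and key names are from memory and may vary by patch, so check them against your own UT2004.ini and User.ini rather than pasting blindly:

```ini
; UT2004.ini -- renderer settings (names approximate)
[D3DDrv.D3DRenderDevice]
UseVSync=True
UseTripleBuffering=True
DesiredRefreshRate=75
ReduceMouseLag=False

[WinDrv.WindowsClient]
FullscreenViewportX=1920
FullscreenViewportY=1200

; User.ini -- FOV is stored per player
[Engine.PlayerController]
DesiredFOV=110.000000
DefaultFOV=110.000000
```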

HL2 doesn't support triple buffering natively. I thought it did, but apparently it does not. Forcing vsync on eliminated the tearing, which was what initially disappointed me about games on an LCD. Tearing doesn't bother some people, and they'd rather have the higher FPS. I understand this, but I had always had vsync off back when my CRT gave me a 120 Hz refresh; I guess I just couldn't see the tearing prior to this display.

When I used this program for HL2, it gave me a 50% boost in frames in some scenes.

The program, from here: http://www.nonatainment.de/portal/D...ndex=7&tabid=19 is in beta, but it lets you set up HL2 and use triple buffering.

I simply downloaded the program and set up a "Program" called HL2, with the path pointing to HL2.exe.

Then, activate the "Present changer" plugin and set the backbuffer "Count" to 2 for triple buffering. The rest you can leave at application default or empty.

I'm so much happier now with this display and widescreen gaming. To each his own, but the combination of vsync and triple buffering seems to be the sweet spot for me. Obviously, I'm limited to 60 FPS, but things are smooth and no tearing occurs. I can get higher FPS without vsync, but I'm too sensitive to the tearing on this new display; the vibrance and quality are so nice that it shows tearing up much more than my CRT did.

Dim-Ize


 Post subject: 2405FPW
PostPosted: 16 Mar 2005, 21:08 
Offline

Joined: 16 Mar 2005, 19:12
Posts: 1
I'm going to try your fixes tonight. I just got my 2405 last night and have a lot of tearing in Half-Life. I am also a CRT-to-LCD first-timer.

I think they should start a separate category just for this monitor - by all indications they are selling fast...


PostPosted: 17 Mar 2005, 04:32 
Offline

Joined: 08 Mar 2005, 00:28
Posts: 1
Dim-Ize,

My 2405FPW arrived yesterday, and I have a similar reaction to yours. I am an HL2 addict, and while this monitor is stunning visually, and I am chortling with glee using it for spreadsheets (Tag&Rename for my MP3 collection is great on this widescreen!), I see lots of effects, which I assume are tearing, when I fire up Half-Life 2.

My card is one step back from yours - I am running a 9800 Pro on a 2.8GHz P4.

Having never felt the need to tweak my graphics setup with my CRT, I would like to ask you (or others who may want to jump in) some noobish questions.

1) What is the relationship between in-game and video card control settings? I went to the standard ATI control panel, and in the 3D settings for Direct3D I have unchecked both "Application Preference" and "Temporal Anti-Aliasing" under AA. I have the slider on 2X, and there is an unselectable area that says "Maximum Resolution: 1600x1200". There are radio buttons for "Performance" and "Quality"; did those make a difference in your tests?

Any idea what this maximum resolution means? Is that consistent with what you see on your control panel?

Under AF I have the box unchecked for Application Preference and 8X on the slider. The rest of the sliders are "Texture Preference - High Quality", "Mipmap Detail Level - High Quality", "Wait for Vertical Sync - Application Preference" and "TRUFORM - Always Off".

I assume that with this setting it does not matter what I select in HL2 for AA or AF? Correct?

2) The link to the triple buffering tool you provide results in the following message: "Das System kann die angegebene Datei nicht finden" ("The system cannot find the file specified"). That sounds nicht so gut.

I am still wrestling with getting the FOV commands to play right on HL2 -- off to do more fiddling.

Ken


PostPosted: 17 Mar 2005, 06:12 
Offline

Joined: 10 Jan 2005, 09:14
Posts: 3
Congrats on your purchase, Marsmda & Bozo. I hope that you enjoy your new display.

Bozo, I will attempt to answer your questions the best I can. I'll do so, by referring to your question numbers as they were written.

Prior to answering your questions, in case you haven't done so already, I recommend getting the latest drivers from ATI for your video card (I prefer the standard control panel over the Catalyst Control Center, but that's personal preference), the latest version of DirectX from MS, and loading the 2405FPW drivers for your new display (either from the CD or from Dell's site). Set your desktop to 1920x1200x32 and your refresh rate to 60 Hz (I have mine set to 75 at present, but it doesn't affect game performance, only game-loading smoothness; I may set it back in a day or so).

Some people aren't bothered by tearing, but I am. To eliminate tearing, you have to force vsync on. Vsync comes with a performance hit, but it makes for a much cleaner gaming experience. I recommend using triple buffering in addition to vsync, as it can improve performance, in some cases by as much as 50%. Much of this depends on the amount of memory on your card (128 MB vs. 256 MB+) and how much of that memory is in use by textures; if you have the headroom, triple buffering is essentially free and improves performance while using vsync. Here are two good threads on the topic: http://www.rage3d.com/board/showthread.php?t=33801133&page=1&pp=30 and http://www.beyond3d.com/node/13390
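To see why the gain can be that large, here is a toy simulation (mine, not from this thread; the numbers and function names are illustrative) of vsync frame pacing with and without a third buffer:

```python
import math

REFRESH_HZ = 60
VBLANK = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh

def fps_double_buffered(render_time):
    # With vsync + double buffering, the GPU must wait for the next
    # vblank before it can reuse the back buffer, so the effective
    # frame interval rounds UP to a whole number of refresh periods.
    interval = math.ceil(render_time / VBLANK) * VBLANK
    return 1.0 / interval

def fps_triple_buffered(render_time):
    # With a third buffer the GPU keeps rendering while the display
    # scans out, so the displayed rate is capped only by the refresh.
    return min(REFRESH_HZ, 1.0 / render_time)

# A frame that takes 20 ms to render (50 fps raw):
print(round(fps_double_buffered(0.020)))  # 30: quantized down to 60/2
print(round(fps_triple_buffered(0.020)))  # 50: the raw render rate
```

This is the mechanism behind the big jumps: double buffering under vsync quantizes the framerate to 60, 30, 20..., so a card rendering at 50 fps displays only 30, while triple buffering lets it show all 50.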

1) The settings you select in your video card's control panel supersede your in-game selections for AA, AF and vsync, unless you choose application preference in the control panel, in which case the game's own settings take over. These selections are personal preference; they determine the amount of antialiasing and anisotropic filtering used in games. Presently, I use 2XAA and 8XAF while selecting application preference for vsync, and then enable vsync in games. I regret to inform you that your card will likely not be able to use AA or AF at 1920x1200 with vsync and still deliver very good performance. As for why it shows only 1600x1200: that may be because you haven't loaded the correct drivers for your card or your display. You should see 1920x1200 there.

I recommend, for your card, that you select application preference for AA, AF and vertical sync, and use HL2's own settings, since you will be able to easily tell which visual settings give you the best performance. Vsync will eliminate the tearing but cap your frames at 60 FPS, since it locks the FPS to the refresh rate of your display, which is 60 Hz. I would also turn texture preference and mipmap detail level down to either performance or quality in the control panel, and set TRUFORM to always off. Your card won't have the overhead necessary to run those features with vsync at any decent performance, but you may find a happy medium there for your taste.

2) I can't tell what that error message says. My only hunch is that you must have the .NET 1.1 Framework installed in order to run this program, along with the latest DirectX version from MS and current video drivers.

Again, the link is: http://www.nonatainment.de/portal/DesktopDefault.aspx?tabindex=7&tabid=19
The file itself is: http://www.nonatainment.de/portal/DesktopModules/Downloader.aspx?itemId=10&filename=dxtweaker.zip

I hope some of this helps you. I spent some time finding just the right mix for myself. I had upgraded from a 9800XT to an X850XT PE (only because the PE was the only version I could find in AGP, or I wouldn't have opted for it). My 9800XT couldn't drive vsync + AA + AF, and frankly, the X850 is pushed hard too.

Many people comment that tearing is no biggie, because you can exceed 60 FPS in HL2 with 2XAA/8XAF. In some scenes with those settings, I get 130 FPS at 1920x1200 without vsync. But the FPS fluctuates all over the place and I get tearing. So I've opted for vsync on with triple buffering enabled to help with the overhead. Frames are locked at 60, sometimes dipping to 55 or so in vast scenes, but it is much smoother gameplay for me.

Good luck,

Dim-Ize


PostPosted: 27 Jul 2005, 02:51 
Offline

Joined: 26 Jul 2005, 23:02
Posts: 1
To clear one thing up from your first post, Dim-Ize: setting the refresh rate to 75 Hz shouldn't affect anything on an LCD. The phosphors on a CRT fade almost immediately, so they have to be re-excited many times a second by the electron beam as it scans across the screen. If the refresh rate goes low enough, you can see the screen flicker; exactly how low depends on the person, how aware your brain is of it, and the amount of ambient light.

LCDs work differently: each pixel contains liquid crystals that twist under an applied voltage, acting like blinds that block, pass or attenuate light from the backlight. Because each pixel holds its state until it is told to change, the panel doesn't flicker and doesn't need to be refreshed in the CRT sense. If a pixel doesn't change color from frame to frame, it does nothing, whereas a CRT redraws every pixel on every pass. But since the panel still only accepts new frames at its (lower) refresh rate, horizontal tearing can occur when the frame buffer changes mid-update, as you pointed out.


PostPosted: 26 Aug 2006, 08:46 
Offline

Joined: 30 Jun 2006, 07:46
Posts: 119
Dim-Ize wrote:
The display is set to a 60 Hz refresh rate, but I over-ride the defaults and set XP to 75 Hz. Not sure if this will harm the display, but games seem to load smoother from a visual perspective initially, but no real noticeable game performance boost. Just the initial launch of games seem smoother with 75 Hz not actual game play.


I don't quite get this. What does the refresh rate have to do with "the initial launch" of a game? Seems to me that the only impact, gaming-wise, that your refresh rate could have would be in "actual game play." Unless you're talking about, what, the loading screen? The menu screen?

Maybe I'm just not understanding what you meant here.


PostPosted: 15 Oct 2006, 12:29 
Offline

Joined: 16 May 2006, 14:55
Posts: 238
OK, I've gone over this many, many times with CS fanatics.



#1, if your framerate is above your refresh rate, you won't see anything past your refresh rate.

#2, if you want to run above 60 FPS, you should find out how to uncap the framerate in a game so it matches your refresh rate. The reason: say you're playing Doom 3, which has a capped framerate of 60, at the most common setup of 1280x1024 @ 85 Hz (just an example; the same reasoning applies to every resolution at its normal refresh rate). The problem is that only 60 frames will ever be displayed each second, so you're actually forcing the monitor to show 25 of those frames more than once, which in turn makes motion not smooth. What you should strive for is to have your refresh rate and framerate be exactly the same, no matter what.
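The 85 Hz / 60 FPS mismatch above can be checked with a quick script (my sketch, nothing game-specific): it counts how many refreshes each frame stays on screen and shows the uneven pacing.

```python
import math

REFRESH_HZ = 85   # monitor refresh rate from the example above
FPS_CAP = 60      # Doom 3's built-in framerate cap

# For each of the 60 frames produced in one second, find the vblank
# (refresh tick) on which it first reaches the screen.
scanout_tick = [math.ceil((i + 1) * REFRESH_HZ / FPS_CAP) for i in range(FPS_CAP)]

# How many refreshes each frame stays on screen before the next lands:
durations = [b - a for a, b in zip(scanout_tick, scanout_tick[1:])]

print(sorted(set(durations)))  # [1, 2]: some frames last one refresh, others two
```

That alternation between one-refresh and two-refresh frames is exactly the "not smooth" judder described above; matching framerate to refresh rate makes every duration identical.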

Also, id was evil and set anisotropic filtering to 8x in the config without telling anyone :P . I know a lot of people who ran out and upgraded because aniso was at 8x, and they didn't know that if they turned it off, their current card could run the game fine. Sneaky bastards.


So for HL2, you would want to open your console and type fps_max *whatever your refresh rate is*
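For example, with the 2405FPW's 60 Hz refresh that would be the following (you can also drop the line into an autoexec.cfg in HL2's cfg folder so it persists; the exact path varies by install):

```
// In the HL2 developer console:
fps_max 60
```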


The reason I posted this is that once you understand this logic, you will get better performance, especially with vsync on.


PostPosted: 23 Jun 2007, 09:49 
Offline

Joined: 17 Jun 2007, 19:37
Posts: 1
I'm running Painkiller at 1680x1050 with all settings maxed and AA at 4x and AF at 8x and getting something around 280 FPS. I can see tearing when moving quickly--if I stare at it with the intent to see it--but it doesn't distract me at all since when I'm moving quickly I'm too busy in the action to stare at the tearing and be bugged by it. Now, when I turn Vsync and triple buffering on, my FPS is locked at 60 and the action is sluggish and totally unacceptable. It is very smooth, yes, but the lack of frame rate gets it to the point of why even have a powerful computer if you're going to force it to 60 FPS?


PostPosted: 23 Jun 2007, 15:58 
Offline

Joined: 16 May 2006, 14:55
Posts: 238
Turning VSync on does not lock the framerate at 60; it locks your framerate to your refresh rate to prevent screen tearing. If screen tearing does not bother you, then don't even worry about it... you're lucky.

And you claim to be running it at 280 fps... without vsync you're sort of actually seeing the 280 fps, but most of the frames aren't aligned with refreshes, because I doubt it can maintain a 280 minimum.

If screen tearing does not bother you, then ignore my last post and just leave VSync off.

What I said before was to explain how to get much better performance with VSync on.

Why would you run a game at 280 fps? You really can't see past 60, but that's another topic. I would stick with 85 (or 60 if the game is capped) and never have it slow down, and also not have screen tearing. The only time I tell people NOT to do what I said is when, even with VSync OFF, they can't average around 60 fps and they're not quality whores like me.

Also, if turning it on locked your framerate to 60, that means it's the LCD's limit at 1680x1050... but it's really just the fact that it's an LCD, and/or Painkiller has a built-in framecap at 60, though I doubt that.

_________________

A note to everyone else: "Mountainbry" is correct... you will see more frames with VSync off. It's just a matter of whether or not screen tearing bugs you; if so, then read my two posts for better performance with it on.


PostPosted: 23 Jun 2007, 18:28 
Offline
User avatar

Joined: 29 May 2006, 02:23
Posts: 873
Have you reinstalled your drivers since you upgraded your card? You probably should. The 2405 is a fantastic panel; a lot of users here love it more than its newer brother, the 2407. I've never gamed with it, but I have gamed with a 2005, which has similar specs, and I never noticed tearing except in Prey. HL2 acted fine, and this was on a vanilla NVIDIA 6800, so it's not much more powerful than yours.


Powered by phpBB® Forum Software © phpBB Group