==Introduction==
Nvidia Profile Inspector is a tool for pulling up and editing application profile settings within the Nvidia drivers, much like the "3D settings" page in Nvidia Control Panel, but more in-depth and much more responsive. Nvidia Control Panel requires you to rely on Nvidia for just about everything; the problem is that it doesn't expose every useful feature, may make some wrong assumptions about any given game, and its interface is cumbersome.

NPI was spun off of the original Nvidia Inspector, which featured an overclocking utility and required you to create a shortcut just to launch the profile editor.

==Setting Up==
===Instructions===
When you run Nvidia Profile Inspector for the first time you will be greeted with this screen: the "Global" profile.

''These are the settings that globally apply to any application. It is '''VERY IMPORTANT''' that you minimize what you force globally to only basic things such as Texture Filtering settings.'' [[File:NPI2016.jpg|none|left]]
<br>
===Why force [[Glossary:Anisotropic filtering (AF)|Anisotropic Filtering]]?===
[http://images.nvidia.com/geforce-com/international/comparisons/just-cause-3/just-cause-3-nvidia-control-panel-anisotropic-filtering-interactive-comparison-001-on-vs-off-rev.html This is an example] of why you should set Texture Filtering globally and forget it. Some developers' own solutions are not as high quality as they should be.
 
<br>If any game has a conflict with driver-forced Texture Filtering, Nvidia usually sets a special flag on that game's profile that disables driver overriding of texture filtering (Far Cry 3 and Doom (2016) are examples of this).
<br>
 
[http://pcgamingwiki.com/images/1/10/Globalprofile.jpg Here] is an example of a modified Global Profile. You don't have to copy it verbatim, but at the very minimum it is recommended to set AF to override and High Quality.
 
 
It is worth noting, however, that whether you use <code>Adaptive</code> power management or <code>Prefer Max Performance</code> is up to you. Adaptive can cause problems with many games: the card will not maintain its base clock as a minimum and will drop back toward idle clocks, causing performance issues. On the flip side, Adaptive will reduce idle temperatures depending on individual circumstances, and the card will downclock correctly when not playing games (see [[#Power management mode|Power management mode]] below).

<br>So if you want to use Adaptive, just remember to set <code>Prefer Maximum Performance</code> in the profile of each game you play.
 
==First steps for tweaking driver settings==
 
*'''The first thing you will always want to do for any given game is search to see whether Nvidia has already made a profile for it.'''
To do this, left click in the white "Profiles" box and start typing the name of the game. The application will search through the list and narrow it down the more characters you type.
*'''If your game does not have a profile, you will need to create one by clicking on the Sun icon.'''
It will prompt you to name the profile.
*'''You then will need to add the game <code>.exe</code> to this new profile by hitting the folder with the green + icon.'''
*'''If you attempt to add an executable to a profile and it prompts you with a message that the <code>Insert Here.exe</code> already belongs to a profile:'''
Pay attention to the name of the profile(s) it gives you. For example, if the game you were looking for didn't show up in a search, it's possible you worded it differently than it is named in the driver.
However, if the profiles it mentions aren't related to the game at all, you will need to set up the profile to use an <code>.exe</code> based on directory. When you go to select an <code>.exe</code>, click the drop-down box in the right hand corner and set it to ''"Absolute Application Path"''.
*'''Now you are ready to change settings on the profile. Don't forget to hit Apply Changes for any changes you make!'''
 
  
 
==List of basic options and explanations==
===Compatibility===
====Ambient Occlusion Compatibility====
This is where you would enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get HBAO+ to work with any given game.<br>
There is an official list of flags built into the driver configured for specific games (though not necessarily well for older titles). These flags are not a combination of functions put together to form a flag as with Anti-Aliasing, but rather point to a set of programmed parameters for each flag.<br>
The first 2 bytes of the flag are reserved for flags pertaining to DX10 and above, while the second 2 bytes are reserved for flags pertaining to DX9 only.<br>
<code>0x'''0000'''0000 < DX10+, 0x0000'''0000''' < DX9</code><br>
Each of the 8 digits can take one of 16 values:<br>
<code>0,1,2,3,4,5,6,7,8,9,A,B,C,D,E,F</code><br>
giving a total of 65,536 potential flags for each set of APIs.<br>
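A minimal sketch of how the two halves of such a flag can be read; the flag value used below is made up purely for illustration and does not correspond to any real game profile:
<pre>
# Illustration only: split a 32-bit AO compatibility flag into its DX10+ and DX9 halves.
flag = 0x00210003                    # made-up value, not a real game flag

dx10_part = (flag >> 16) & 0xFFFF    # first 2 bytes: DX10 and above
dx9_part  = flag & 0xFFFF            # last 2 bytes: DX9 only

print(f"DX10+: 0x{dx10_part:04X}, DX9: 0x{dx9_part:04X}")
# DX10+: 0x0021, DX9: 0x0003  (each half has 16^4 = 65,536 possible values)
</pre>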
====Antialiasing Compatibility====
This is where you would enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get various forms of Anti-Aliasing to work with any given DX9 game.
Unlike AO Compatibility, this hex code is actually a combination of possible functions that tell the driver what kind of buffer formats to look for, how to process them, among other things.<br>
====Antialiasing compatibility (DX1x)====
This is where you would enter or paste an 8-digit hexadecimal code to force AA in DirectX 10+... if you had one.
Nvidia's effort here sadly did not continue. There are few functions available for this and none of them work in any meaningful way for games other than the ones listed, often restricted to MSAA only.<br>
====Antialiasing Fix====
This setting is somewhat of a mystery. It was apparently made originally for an issue relating to Team Fortress 2 (and as such was originally known as the TF2 fix), but as it turned out, it affects a very large number of games.

Currently the only description of the function available is <code>FERMI_SETREDUCECOLORTHRESHOLDSENABLE</code> (Fermi > Set Reduce Color Thresholds Enable). This would suggest it is a Fermi issue, but it really applies to at least Fermi and everything after it. It's also interesting that turning the AA fix on disables this function: the default setting of "Off" is actually a value of <code>0x00000001</code>, while the On value is <code>0x00000000</code> (Team Fortress 2).<br>
''Do not enable this globally'', as what it does depends on the game. The Anti-Aliasing flags thread notes whether a game needs it or whether it causes issues.
====SLI Compatibility bits====
This is where you would enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get SLI working in DX9 applications, if the application doesn't already have a flag in the driver, or if the official flag doesn't work well and is of poor quality.

<br>
 
===Sync and Refresh===
====Frame Rate Limiter====
This setting enables the driver's built-in frame rate limiter, using a series of pre-defined, non-whole-number values.<ref>{{Refurl|url=http://forums.guru3d.com/showpost.php?p=5250232&postcount=108|title=Guru3D Forums - Nvidia Inspector introduction and guide, post # 108|date=2016-10-12}}</ref>
{{ii}} The 60 FPS setting is actually 59.7 or 60.7; Vsync may or may not clamp this value.
Some people may prefer [https://www.guru3d.com/files-details/rtss-rivatuner-statistics-server-download.html Unwinder's RTSS], as it lets you set the limit yourself.
====GSYNC settings====
*'''GSYNC Application Mode'''
*'''GSYNC Requested State'''
*'''GSYNC Global Mode'''
*'''GSYNC Indicator Overlay'''
====Maximum pre-rendered frames====
This setting controls how many frames the CPU can process ahead of the frame currently being drawn by the GPU.<ref>{{Refsnip|url=https://forums.geforce.com/default/topic/476977/sli/maximum-pre-rendered-frames-for-sli/post/3405124/#3405124|title=Maximum Pre-Rendered frames for SLI|date=2016-10-12|snippet=The 'maximum pre-rendered frames' function operates within the DirectX API and serves to explicitly define the maximum number of frames the CPU can process ahead of the frame currently being drawn by the graphics subsystem. This is a dynamic, time-dependent queue that reacts to performance: at low frame rates, a higher value can produce more consistent gameplay, whereas a low value is less likely to introduce input latencies incurred by the extended storage time (of higher values) within the cache/memory. Its influence on performance is usually barely measurable, but its primary intent is to control peripheral latency.}}</ref>
{{ii}} Valid values are 1 through 8.
{{ii}} The default is 3, and it is recommended not to go above this.{{cn|date=October 2016}}
{{ii}} Values of 1 and 2 will help reduce input latency in exchange for greater CPU usage.<ref>http://www.displaylag.com/reduce-input-lag-in-pc-games-the-definitive-guide/</ref>
{{ii}} When Vsync is set to <code>1/2 Refresh Rate</code>, a value of 1 is essentially required due to the introduced input latency.{{cn|date=October 2016}}
====Triple buffering====
{{ii}} [http://www.anandtech.com/show/2794/2 Explanatory article]
Enables Triple Buffering for Vsync in OpenGL applications.
{{ii}} If you wish to enable triple buffering for D3D APIs, you can download and use [http://community.pcgamingwiki.com/files/file/84-d3doverrider/ D3DOverrider].
{{ii}} This setting is unused when using GSYNC.
====Vertical Sync Smooth AFR behavior====
{{ii}} Dynamically adjusts Vsync behavior when using SLI.<ref>{{Refsnip|url=https://nvidia.custhelp.com/app/answers/detail/a_id/3283/~/what-is-smooth-vsync%3F|title=NVIDIA - What is Smooth Vsync?|date=2016-10-12|snippet=Smooth Vsync is a new technology that can reduce stutter when Vsync is enabled and SLI is active.<br>When SLI is active and natural frame rates of games are below the refresh rate of your monitor, traditional vsync forces frame rates to quickly oscillate between the refresh rate and half the refresh rate (for example, between 60Hz and 30Hz). This variation is often perceived as stutter. Smooth Vsync improves this by locking into the sustainable frame rate of your game and only increasing the frame rate if the game performance moves sustainably above the refresh rate of your monitor. This does lower the average framerate of your game, but the experience in many cases is far better.}}</ref>
====Vertical Sync Tear Control (Adaptive Frame Rate)====
This controls whether, when a frame drop is detected, Vsync should be disabled to maintain performance or the sync should drop to the next sync rate.<ref>http://www.geforce.com/hardware/technology/adaptive-vsync/technology</ref>
{{ii}} Without the <code>Adaptive</code> setting the frame rate at 60 Hz will drop to 45 or 30 FPS, because those are the next available sync rates (3/4 and 1/2 of the monitor refresh rate, respectively).
{{ii}} This setting is unused when using GSYNC.
====Vertical Sync====
Controls whether Vsync can be enabled for any given application.
{{ii}} Defaults to <code>Application Controlled</code>, which means it's up to the individual application to enable, disable, or offer the option for Vsync.
{{ii}} Can be useful when an application forces a specific setting, such as [[Fallout 4]], which forces Vsync without an option to disable it.
 
 
{| class="wikitable"
|-
| 1/4 Refresh Rate || Forces 1/4 rate Sync || 144 Hz/36, 120 Hz/30, 60 Hz/15, 50 Hz/not possible with whole numbers
|-
| Fast Sync || Enables Nvidia's Fast Sync.<ref>https://www.technopat.net/sosyal/konu/video-what-is-nvidia-fast-sync.329258/</ref> Does not work with DX12 games. || Fast Sync eliminates tearing when playing games without Vsync and reduces input latency significantly. However, to work best you need to be maintaining a multiple of your refresh rate. Designed for Pascal GPUs, but some have had success with Maxwell GPUs; your mileage may vary.
|}
 
<br>
===Antialiasing===
====Antialiasing - Behavior Flags====
These mostly exist as a method of governing the usage of AA from Nvidia Control Panel, though they also affect Inspector.
{{ii}} Make sure to clear out any flags in this field for a game profile when forcing AA, as they will interfere and can prevent forced AA from working.
====Antialiasing - Gamma Correction====
Gamma correction for MSAA.
{{ii}} Introduced with [[Half-Life 2]] in 2004.
{{ii}} Defaults to <code>ON</code> starting with Fermi GPUs.
{{ii}} Should not be set to <code>OFF</code> on modern hardware.<ref>{{Refurl|url=http://www.anandtech.com/show/2116/12|title=NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10 - What's Gamma Correct AA?|date=2016-10-12}}</ref>
====Antialiasing - Mode====
 
This has 3 settings:
{| class="wikitable"
|-
| Application Controlled || The application controls its own AA; the driver does not force anything.
|-
| Override Application Setting || Overrides the application and forces the AA method selected under [[#Antialiasing - setting|Antialiasing - setting]].
|-
| Enhance Application Setting || Enhances application AA settings (e.g. enhance in-game MSAA with TrSSAA).
|}
=====Notes on the "Override" setting=====
When overriding AA in any game you will want to set "Override" and not either of the other two; enhancing AA is entirely dependent on the implementation of MSAA in the game you are modifying a profile for.
More often than not, especially in modern DX10+ games, this is a total crap shoot: either it doesn't work, breaks something, or looks very bad.<br>

Red Faction: Guerrilla is another good example: you can't force AA in this game.<br>
=====Using the "Enhance" setting=====
When overriding in-game MSAA fails, you can instead replace (enhance) the in-game MSAA with the NVIDIA MSAA technique of your choice.
====Antialiasing - setting====
{{mm}} [http://community.pcgamingwiki.com/topic/1858-aa-af-vsync-and-forcing-through-gpu/?p=8482 Explanation of MSAA methods]
This is where the specific method of forced/enhanced MSAA is set.
====Antialiasing - Transparency Multisampling====
{{mm}} [http://http.download.nvidia.com/developer/SDK/Individual_Samples/DEMOS/Direct3D9/src/AntiAliasingWithTransparency/docs/AntiAliasingWithTransparency.pdf Technical explanation]
{{mm}} [http://screenshotcomparison.com/comparison/149642 Interactive example]
{{ii}} This enables and disables the use of transparency multisampling.
====Antialiasing - Transparency Supersampling====
{{mm}} [http://community.pcgamingwiki.com/topic/1858-aa-af-vsync-and-forcing-through-gpu/?p=8482 Further reference]
{{ii}} This sets whether <code>Supersampling</code> or <code>Sparse Grid Supersampling</code> is used.
=====Supersampling=====
{{++}} Inexpensive
{{++}} Does not require MSAA to work
{{--}} Only works on alpha-tested surfaces
=====Full-Screen Sparse Grid Supersampling=====
{{++}} Extremely high-quality supersampling
{{++}} Works on the entire image
{{--}} Requires MSAA to work
{{--}} Very expensive
'''Notes'''
{{ii}} The level of supersampling must match a corresponding level of MSAA. For example, if 4x MSAA is used, then 4x Sparse Grid Supersampling must be used.
{{ii}} In OpenGL, <code>Supersampling</code> is equivalent to <code>Sparse Grid Supersampling</code>. Ergo, 4xMSAA+4xTrSSAA will give you 4xFSSGSSAA in OpenGL games that work.
{{ii}} Do note that these usually require AA compatibility flags to work properly in DX9 games.
====Enable Maxwell sample interleaving (MFAA)====
{{mm}} [http://www.geforce.com/whats-new/articles/multi-frame-sampled-anti-aliasing-delivers-better-performance-and-superior-image-quality Introduction to MFAA]
This enables Nvidia's Multi Frame Anti-Aliasing mode. It only works in DXGI (DX10+) and requires either the game to have MSAA enabled or MSAA to be forced.
 
What it does is change the sub-sample grid pattern every frame, which is then reconstructed in motion with a "Temporal Synthesis Filter", as Nvidia calls it.<br>
There are some caveats to using this, though:
{{ii}} It is not compatible with SGSSAA as far as shown in limited testing.{{cn|date=October 2016}}
{{ii}} Depending on the game, MFAA causes visible flickering on geometric edges and other temporal artifacts. Part of this is nullified with downsampling.{{cn|date=October 2016}}
{{ii}} It has a minimum framerate requirement of about 40 FPS; otherwise MFAA will degrade the image.
{{--}} Visible sawtooth patterns in screenshots and videos captured locally.{{cn|date=October 2016}}
{{--}} Incompatible with SLI.{{cn|date=October 2016}}
===Fast Approximate Anti-Aliasing (FXAA)===
====Nvidia Predefined FXAA usage====
Tells the driver whether FXAA is allowed to be turned on in Nvidia Control Panel (primarily) or Nvidia Inspector.
====Toggle FXAA indicator on or off====
If enabled, this displays a small green icon in the upper left corner showing whether FXAA is enabled or not.
====Toggle FXAA on or off====
Turns FXAA on or off.
 
  
 
===Texture Filtering===
  
====Anisotropic Filtering (AF) mode====
Tells the driver whether the driver controls AF for the application or the application is allowed to do its own thing.
{{ii}} It is recommended to leave this set to <code>User Defined/Off</code>, as many games do not have Texture Filtering options and many more have mediocre or intentionally-mediocre Texture Filtering.{{cn|date=October 2016}}
====Anisotropic Filtering setting====
If the AF mode is set to <code>User Defined/Off</code>, this determines what level of Texture Filtering is forced on an application. <code>16x</code> is the best setting and is recommended.
====Prevent Anisotropic Filtering====
Similar to AA Behavior Flags, if this is set to <code>On</code> it will ignore user-defined driver overrides. Some games default to this, such as [[Quake 4]], [[Rage]], [[Doom (2016)]], [[Far Cry 3]] and [[Far Cry 3: Blood Dragon]].
====Texture Filtering - Anisotropic filter optimization====
"Anisotropic filter optimization improves performance by limiting trilinear filtering to the primary texture stage where the general appearance and color of objects is determined. Bilinear filtering is used for all other stages, such as those used for lighting, shadows, and other effects. This setting only affects DirectX programs."{{cn|date=October 2016}}
====Texture Filtering - Anisotropic sample optimization====
"Anisotropic sample optimization limits the number of anisotropic samples used based on texel size. This setting only affects DirectX programs."{{cn|date=October 2016}}
====Texture Filtering - Driver Controlled LOD Bias====
When using SGSSAA, enabling this allows the driver to compute its own negative LOD bias for textures to help improve sharpness (at the cost of potentially more aliasing) for those who prefer it. It is generally less than the fixed amounts that are recommended.{{cn|date=October 2016}}
{{ii}} Manual LOD bias will be ignored when this is enabled.
====Texture Filtering - LOD Bias (DX)====
{{mm}} [http://screenshotcomparison.com/comparison/159382 Interactive example]
{{mm}} [http://naturalviolence.webs.com/lodbias.htm Explanation of LOD bias]
The Level of Detail Bias setting for textures in DirectX backends. This normally only works under two circumstances; for both, "Driver Controlled LoD Bias" must be set to <code>Off</code>:
#When overriding or enhancing AA.
#If you leave the <code>Antialiasing Mode</code> setting at <code>Application Controlled</code> or <code>Enhance Application Setting</code> (try Application Controlled first!) but set the AA and Transparency settings to SGSSAA (e.g. 4xMSAA and 4xSGSSAA; TrSSAA in OpenGL), then you can freely set the LOD bias and the changes will work without forcing AA. This has the side effect that in some games with MSAA, it will act as if you were "Enhancing" the game setting even when using <code>Application Controlled</code>.
'''Notes:'''
{{ii}} If you wish to use a negative LOD bias when forcing SGSSAA, these are the recommended amounts:
#2xSGSSAA (2 samples): -0.5
#4xSGSSAA (4 samples): -1.0
#8xSGSSAA (8 samples): -1.5
{{ii}} Do not use a negative LOD bias when using OGSSAA and HSAA modes, as these already have their own automatic LOD bias (which can cause problems in some games).<ref>{{Refurl|url=http://naturalviolence.webs.com/sgssaa.htm|title=Natural Violence - SGSSAA|date=2016-10-12}}</ref>
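The recommended values above appear to follow the common rule of thumb for supersampled LOD adjustment, a bias of -0.5 × log2(sample count); a quick illustrative sketch of that relationship (not part of the driver):
<pre>
# Illustration only: the recommended SGSSAA LOD bias values match -0.5 * log2(samples).
import math

for samples in (2, 4, 8):
    bias = -0.5 * math.log2(samples)
    print(f"{samples}xSGSSAA -> LOD bias {bias:+.1f}")
# 2xSGSSAA -> LOD bias -0.5
# 4xSGSSAA -> LOD bias -1.0
# 8xSGSSAA -> LOD bias -1.5
</pre>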
====Texture Filtering - LOD Bias (OGL)====
Identical to [[#Texture Filtering - LOD Bias (DX)|Texture Filtering - LOD Bias (DX)]], but applies to OpenGL applications and games instead.
====Texture Filtering - Negative LOD bias====
Controls whether negative LODs are clamped (not allowed) or allowed.
{{ii}} Has no effect on Fermi GPUs and beyond; clamps by default on those GPUs.
====Texture Filtering - Quality====
This controls AF optimizations meant to improve AF performance on legacy hardware at the cost of quality.
{{ii}} Should be set to or left at <code>High Quality</code> if not using a GeForce 8 series GPU or earlier.
====Texture Filtering - Trilinear Optimization====
Same as [[#Texture Filtering - Quality|Texture Filtering - Quality]] in practice.{{cn|October 2016}}
{{ii}} Disabled if [[#Texture Filtering - Quality|Texture Filtering - Quality]] is set to <code>High Quality</code>.
 
===Common===
====Ambient Occlusion setting====
This setting determines what kind of Ambient Occlusion is used in games which do not support it.
{{ii}} Has three modes:
#<code>Performance</code>
#<code>Quality</code>
#<code>High Quality</code>
<code>Quality</code> and <code>High Quality</code> are nearly identical, while <code>Performance</code> noticeably lowers the resolution and precision of the effect in many games, with less-accurate and stronger shading.{{cn|October 2016}}
{{ii}} Certain games need <code>Performance</code> to support forced Ambient Occlusion, such as [[Oddworld: New 'n' Tasty!]].{{cn|October 2016}}
====Ambient Occlusion usage====
When using forced Ambient Occlusion, set this to <code>On</code>.
====Extension limit====
"Extension limit indicates whether the driver extension string has been trimmed for compatibility with particular applications. Some older applications cannot process long extension strings and will crash if extensions are unlimited. If you are using an older OpenGL application, turning this option on may prevent crashing. If you are using a newer OpenGL application, you should turn this option off."{{cn|October 2016}}
====Multi-display/mixed-GPU acceleration====
This controls GPU-based acceleration in OpenGL applications and will not have any effect on performance on DirectX platforms.<ref>https://forums.geforce.com/default/topic/449234/sli/mulit-display-mixed-gpu-acceleration-how-to-use-/</ref>
====Power management mode====
"Setting Power management mode from "Adaptive" to "Maximum Performance" can improve performance in certain applications when the GPU is throttling the clock speeds incorrectly."<ref>https://nvidia.custhelp.com/app/answers/detail/a_id/3130/~/setting-power-management-mode-from-adaptive-to-maximum-performance</ref>
=====Adaptive vs. Max Performance power setting=====
'''Adaptive'''
{{ii}} Automatically determines whether to use a lower clock speed for games and apps.
{{++}} Works well when not playing games.
{{--}} Frequently causes problems in games.{{cn|date=October 2016}}
{{--}} Nvidia GPU Boost GPUs tend to incorrectly use a lower clock speed.{{cn|date=October 2016}}
'''Prefer Max Performance'''
{{ii}} Prevents the GPU from switching to a lower clock speed for games and apps.
{{++}} Works well when playing games.
{{--}} Wasteful when not using GPU-intensive applications or games.
So if you want to use Adaptive by default, remember to set <code>Prefer Maximum Performance</code> in the profile of each game you play.
"Shaders are used in almost every game, adding numerous visual effects that can greatly improve image quality (you can see the dramatic impact of shader technology in Watch Dogs here). Generally, shaders are compiled during loading screens, or during gameplay in seamless world titles, such as Assassin's Creed IV, Batman: Arkham Origins, and the aforementioned Watch Dogs. During loads their compilation increases the time it takes to start playing, and during gameplay increases CPU usage, lowering frame rates. When the shader is no longer required, or the game is closed, it is disgarded, forcing its recompilation the next time you play.<br>
+
====Shader cache====
 +
Enables or disables the shader cache, which saves compiled shaders to your hard drive to improve game performance.<ref>{{Refsnip|url=http://www.geforce.com/whats-new/articles/nvidia-geforce-337-88-whql-watch-dog-drivers|title=GeForce 337.88 WHQL, Game Ready Watch Dogs Drivers Increase Frame Rates By Up To 75%|date=2016-10-12|snippet=Shaders are used in almost every game, adding numerous visual effects that can greatly improve image quality (you can see the dramatic impact of shader technology in Watch Dogs here). Generally, shaders are compiled during loading screens, or during gameplay in seamless world titles, such as Assassin's Creed IV, Batman: Arkham Origins, and the aforementioned Watch Dogs. During loads their compilation increases the time it takes to start playing, and during gameplay increases CPU usage, lowering frame rates. When the shader is no longer required, or the game is closed, it is disgarded, forcing its recompilation the next time you play.<br>
 
In today's 337.88 WHQL drivers we've introduced a new NVIDIA Control Panel feature called "Shader Cache", which saves compiled shaders to a cache on your hard drive. Following the compilation and saving of the shader, the shader can simply be recalled from the hard disk the next time it is required, potentially reducing load times and CPU usage to optimize and improve your experience.<br>
 
In today's 337.88 WHQL drivers we've introduced a new NVIDIA Control Panel feature called "Shader Cache", which saves compiled shaders to a cache on your hard drive. Following the compilation and saving of the shader, the shader can simply be recalled from the hard disk the next time it is required, potentially reducing load times and CPU usage to optimize and improve your experience.<br>
By default the Shader Cache is enabled for all games, and saves up to 256MB of compiled shaders in %USERPROFILE%\AppData\Local\Temp\NVIDIA Corporation\NV_Cache. This location can be changed by moving your entire Temp folder using Windows Control Panel > System > System Properties > Advanced > Environmental Variables > Temp, or by using a Junction Point to relocate the NV_Cache folder. To change the use state of Shader Cache on a per-game basis simply locate the option in the NVIDIA Control Panel, as shown below. "<ref>http://www.geforce.com/whats-new/articles/nvidia-geforce-337-88-whql-watch-dog-drivers</ref><br>
+
By default the Shader Cache is enabled for all games, and saves up to 256MB of compiled shaders in %USERPROFILE%\AppData\Local\Temp\NVIDIA Corporation\NV_Cache. This location can be changed by moving your entire Temp folder using Windows Control Panel > System > System Properties > Advanced > Environmental Variables > Temp, or by using a Junction Point to relocate the NV_Cache folder. To change the use state of Shader Cache on a per-game basis simply locate the option in the NVIDIA Control Panel, as shown below.}}</ref>
*'''Threaded optimization'''
+
{{ii}}Added in driver 337.88
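Relocating the <code>NV_Cache</code> folder with a junction point, as the quote above mentions, could look like the following sketch; the destination path is an assumption for illustration, and it should be run from an elevated prompt:
<pre>
# Hypothetical sketch: move NV_Cache to another drive and leave an NTFS junction behind.
import os, shutil, subprocess

cache  = os.path.expandvars(r"%USERPROFILE%\AppData\Local\Temp\NVIDIA Corporation\NV_Cache")
target = r"D:\NV_Cache"   # assumed destination, adjust as needed

shutil.move(cache, target)                                                 # move the existing cache
subprocess.run(["cmd", "/c", "mklink", "/J", cache, target], check=True)   # create the junction
</pre>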
====Threaded optimization====
{{ii}} Unknown feature which works in both DirectX and OpenGL games; it defaults to <code>Auto</code>.
{{ii}} Can cause issues in certain games.{{cn|date=October 2016}}
 
 
===SLI===
====Antialiasing - SLI AA====
This determines whether an Nvidia SLI setup splits the MSAA workload between the available GPUs.<ref>{{Refsnip|url=http://www.nvidia.com/object/slizone_sliAA_howto1.html|title=NVIDIA - SLI Antialiasing|date=2016-10-12|snippet=SLI AA essentially disables normal AFR rendering, and in 2-way mode will use the primary GPU for rendering+forced AA, while the secondary GPU is only used to do AA work.<br>In SLI8x mode for example, each GPU would then do 4xMSAA after which the final result becomes 4xMSAA+4xMSAA=8xMSAA.<br>This can be useful in games without proper SLI support, so at least the second GPU is not just idling.<br>However it unfortunately only works correctly in OpenGL, and there will be no difference in temporal behavior between for example normal forced 4xMSAA+4xSGSSAA and SLI8x+8xSGSSAA in DX9.}}</ref>
====Disable SLI (Explicitly set through NVAPI)====
====Number of GPUs to use on SLI rendering mode====
====NVIDIA predefined number of GPUs to use on SLI rendering mode====
====NVIDIA predefined number of GPUs to use on SLI rendering mode on DX10====
====NVIDIA predefined SLI mode====
====NVIDIA predefined SLI mode on DX10====
====SLI indicator====
====SLI rendering mode====
  
  

Revision as of 15:12, 12 October 2016

This page may require cleanup to meet basic quality standards. The specific problem is: Simplify and shorten everything. You can help by modifying the article. The discussion page may contain useful suggestions.
Nvidia Profile Inspector
Nvidia Profile Inspector cover

Useful Links

Github project page
Download the latest build
Anti-Aliasing Compatibility Flags
HBAO+ Compatibility Flags
SLI Compatibility Flags

Introduction

Nvidia Profile Inspector is a tool created for pulling up and editing Application Profile settings within the Nvidia drivers. Much like the "3D settings" page in Nvidia Control Panel.But not only more in-depth, but much more responsive. NVCP requires you to rely on Nvidia for just about everything. Problem is they don't expose every useful feature,might make some wrong assumptions about any given game and the interface is cumbersome.

NPI was spun off of the original Nvidia Inspector which featured an overclocking utility and required you to create a shortcut just to launch the profile editor.

Setting Up

Instructions

When you run Nvidia Profile Inspector for the first time you will be greeted with this screen, showing the "Global" profile.

These are the settings that apply globally to any application. It is VERY IMPORTANT that you minimize what you force globally to only basic things such as Texture Filtering settings.

[Screenshot: NPI2016.jpg]


Why force Anisotropic Filtering?

This is an example of why you should set Texture Filtering globally and forget it. Some developers' own solutions are not as high quality as they should be.
If a game conflicts with driver-forced Texture Filtering, Nvidia usually sets a special flag on that game's profile that disables driver overriding of texture filtering (Far Cry 3 and Doom (2016) are examples of this).
Here is an example of a modified Global Profile. You don't have to copy it verbatim, but at the very minimum it is recommended to set AF to override with High Quality.

List of basic options and explanations

Compatibility

Ambient Occlusion Compatibility

This is where you would enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get HBAO+ to work with any given game.
There is an official list of flags built into the driver, configured for specific games (though not necessarily well for older titles). Unlike the Anti-Aliasing flags, these are not combinations of individual functions; each flag instead points to a set of pre-programmed parameters in the driver.
The first two bytes of the flag are reserved for DX10 and above, while the last two bytes are reserved for DX9 only:
0xXXXXYYYY, where XXXX covers DX10+ and YYYY covers DX9.
Each of the 8 slots is a hexadecimal digit with 16 possible values (0,1,2,3,4,5,6,7,8,9,A,B,C,D,E,F), giving exactly 16^4 = 65536 possible flags for each set of APIs.
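
As a minimal sketch (plain Python, not any Nvidia API), the two halves of such a flag can be separated like this; the example value is hypothetical:

    # Minimal sketch, not an Nvidia API: splitting an Ambient Occlusion
    # compatibility flag into its DX10+ and DX9 halves (example value is hypothetical).
    flag = 0x002F0001

    dx10_half = (flag >> 16) & 0xFFFF   # first two bytes: DX10 and above
    dx9_half = flag & 0xFFFF            # last two bytes: DX9 only

    print(f"DX10+: 0x{dx10_half:04X}, DX9: 0x{dx9_half:04X}")
    print(16 ** 4)  # 65536 possible values for each half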

Antialiasing Compatibility

This is where you would enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get various forms of Anti-Aliasing to work with any given DX9 game. Unlike the AO compatibility flags, this hex code is a combination of individual functions that tell the driver, among other things, what kind of buffer formats to look for and how to process them.
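
Conceptually such a flag is a bitmask built by combining individual function bits with bitwise OR. The sketch below is illustrative only; the names and bit positions are invented, and the real bit meanings are documented in the Anti-Aliasing Compatibility Flags thread listed under Useful Links:

    # Conceptual sketch only: the names and bit positions below are made up
    # for illustration and are NOT the driver's real definitions.
    CHECK_DEPTH_BUFFER = 0x00000004   # hypothetical function bit
    ALTERNATE_RESOLVE = 0x00100000    # hypothetical function bit

    flag = CHECK_DEPTH_BUFFER | ALTERNATE_RESOLVE  # combine the functions
    print(f"0x{flag:08X}")  # -> 0x00100004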

Antialiasing compatibility (DX1x)

This is where you would enter or paste an 8-digit hexadecimal code to force AA in DirectX 10 and newer... if you had a working flag. Nvidia's effort here sadly did not continue: there are few functions available, and none of them work in any meaningful way for games other than the ones already listed, often being restricted to MSAA only.

Antialiasing Fix

This one is still somewhat of a mystery. It was apparently created to fix an issue with Team Fortress 2 (and as such was originally known as the TF2 fix), but as it turned out, it affects a very large number of games.

Currently the only description of the function available is FERMI_SETREDUCECOLORTHRESHOLDSENABLE (Fermi > Set Reduce Color Thresholds Enable).
This would suggest a Fermi-specific issue, but it applies to Fermi and every architecture after it. Note that the values are inverted: the default "Off" setting is actually a value of 0x00000001, while the "On (Team Fortress 2)" setting is 0x00000000, so turning the AA fix on disables the color-threshold reduction.
Do not enable this globally, as what it does depends on the individual game. The Anti-Aliasing flags thread notes whether a game needs it or whether it causes issues.

SLI Compatibility bits

This is where you would enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get SLI working in DX9 applications, if the application doesn't already have a flag in the driver, or if the official flag works poorly.

Like the AA compatibility bits, these flags are a combination of functions within the driver.

  • SLI compatibility bits (DX10+DX11)

This is where you would enter or paste an 8-digit hexadecimal code (always with the prefix "0x") to get SLI working in DX10 and DX11 applications, if the application doesn't already have a flag in the driver, or if the official flag works poorly.

  • SLI compatibility bits (DX12)

This one is new, and it is assumed to work the same way as the other two. Currently there are only two flags in the driver, but more will appear as more DX12 games are released, so it will be interesting to see how it plays out.


Sync and Refresh

Frame Rate Limiter

This setting will enable the driver's built-in frame rate limiter, using a series of pre-defined, non-whole-number values.[1] Some people may find Unwinder's RTSS more useful, as it lets you set the limit yourself.

GSYNC settings

  • GSYNC Application Mode
  • GSYNC Requested State
  • GSYNC Global Feature
  • GSYNC Global Mode
  • GSYNC Indicator Overlay

Maximum pre-rendered frames

This setting controls how many frames the CPU can process ahead of the frame currently being drawn by the GPU.[2]

Valid values are 1 through 8
Default is 3, and it is recommended not to go above this[citation needed]
Values of 1 and 2 will help reduce input latency in exchange for greater CPU usage (see the illustration below).[3]
When Vsync is set to 1/2 Refresh Rate, a value of 1 is essentially required due to the introduced input latency.[citation needed]
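
As a rough, worst-case illustration of why lower values reduce input latency (a simplification; the queue is dynamic and only fills when the game is GPU-bound), the latency added by the queue is on the order of the queue depth multiplied by the frame time:

    # Rough illustration only: worst-case extra latency contributed by the
    # pre-rendered frame queue when the GPU is the bottleneck.
    def queue_latency_ms(pre_rendered_frames, fps):
        frame_time_ms = 1000.0 / fps
        return pre_rendered_frames * frame_time_ms

    for frames in (1, 2, 3):
        print(frames, "frame(s) at 60 FPS ->", round(queue_latency_ms(frames, 60), 1), "ms")
    # 1 -> 16.7 ms, 2 -> 33.3 ms, 3 -> 50.0 ms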

Triple buffering

Explanatory article

Enables Triple Buffering for Vsync in OpenGL applications

If you wish to enable TB for D3D APIs you can download and use D3DOverrider.
This setting is unused when using GSYNC.

Vertical Sync Smooth AFR behavior

Dynamically adjusts Vsync behavior when using SLI.[4]

Vertical Sync Tear Control (Adaptive Frame Rate)

When a frame drop is detected, this controls whether Vsync should be disabled to maintain performance or whether syncing should drop to the next available sync rate.[5]

Without the Adaptive setting the frame rate will drop to 45 or 30FPS because they are the next available sync rate (3/4 and 1/2 monitor refresh rate, respectively).
This setting is unused when using GSYNC.

Vertical Sync

Controls whether Vsync can be enabled for any given application.

Defaults to Application Controlled which means it's up to the individual application to enable, disable, or offer the option for Vsync.
Can be useful when an application forces a specific setting, such as Fallout 4, which forces Vsync without an option to disable it.
  • Use 3D Application Setting: Uses the application's own setting; the application decides whether Vsync is enabled.
  • Force Off: Forces the application not to use Vsync, even if it has no option for it. Will not stop Vsync from the Windows compositor in windowed mode on Windows 8/10. In some games this will unlock the framerate; in others it will simply cause tearing and reduce input latency.
  • Force On: Forces the application to use Vsync, even if it has no option for it. Will not work on games that are borderless windowed/windowed only.
  • 1/2 Refresh Rate: Forces syncing at half the refresh rate (144Hz/72, 120Hz/60, 60Hz/30, 50Hz/25). Very useful for playing at 30FPS, or for 30FPS games that often have microstutter problems when syncing at 60Hz. Remember to use a lower pre-rendered frames setting when using it.
  • 1/3 Refresh Rate: Forces syncing at a third of the refresh rate (144Hz/48, 120Hz/40, 60Hz/20; not possible with whole numbers at 50Hz).
  • 1/4 Refresh Rate: Forces syncing at a quarter of the refresh rate (144Hz/36, 120Hz/30, 60Hz/15; not possible with whole numbers at 50Hz).
  • Fast Sync: Enables Nvidia's Fast Sync.[6] Does not work with DX12 games. Fast Sync eliminates tearing when playing without Vsync and reduces input latency significantly, but to work best the framerate needs to be a multiple of the refresh rate. Designed for Pascal GPUs, though some have had success with Maxwell GPUs; your results will vary.
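
The fractional targets above are simple division (refresh rate divided by 2, 3, or 4); a quick sketch of the arithmetic:

    # Quick arithmetic behind the fractional refresh-rate options: the target
    # frame rate is refresh_rate / divisor, and the option only makes sense
    # when that result is a whole number.
    def sync_target(refresh_hz, divisor):
        target = refresh_hz / divisor
        return int(target) if target == int(target) else None  # None = not a whole number

    for hz in (144, 120, 60, 50):
        print(hz, "Hz:", {d: sync_target(hz, d) for d in (2, 3, 4)})
    # 50 Hz has no whole-number 1/3 or 1/4 target, matching the list above.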


Antialiasing

Antialiasing - Behavior Flags

These mostly exist as a method of governing the use of AA from Nvidia Control Panel, though they affect Inspector as well.

Make sure to clear any flags in this field on a game profile when forcing AA, as they can interfere and prevent forced AA from working.

Antialiasing - Gamma Correction

Gamma correction for MSAA

Introduced with Half-Life 2 in 2004.
Defaults to ON starting with Fermi GPUs.
Should not be set to OFF on modern hardware.[7]

Antialiasing - Mode

This has 3 settings:

  • Application Controlled: The application decides whether to use AA.
  • Override Application Setting: The driver overrides the application setting, allowing you to force AA from the driver. As a general rule you have to disable any in-game AA/MSAA when using this to avoid conflicts; the exceptions are generally noted in the AA flags list.
  • Enhance Application Setting: Enhances the application's own AA settings (e.g. enhance in-game MSAA with TrSSAA).
Notes on the "Override" setting

When overriding AA in any game you will want to set "Override" and not either of the other two. Enhancing AA depends entirely on the implementation of MSAA in the game you are modifying a profile for; more often than not, especially in modern DX10+ games, it is a total crap shoot: either it doesn't work, it breaks something, or it looks very bad.

However, there are a few exceptions. In certain games/engines, such as Monolith Productions' mid-2000s LithTech-based games that use some sort of MSAA/FSAA hybrid, the above three settings will generally not matter.

Example: with F.E.A.R. 2, if you have the correct AA flag on the profile, leave the mode at Application Controlled but set 8xQ MSAA and 8xSGSSAA below it, and enable F.E.A.R. 2's FSAA, the driver will enhance the FSAA.
In this specific case, the result actually looks much better than 8xSGSSAA or FSAA by themselves!

Another good example is Red Faction: Guerrilla; you can't force AA in this game.

Using the "Enhance" setting

When overriding in-game MSAA fails, you can instead replace (enhance) the in-game MSAA with the NVIDIA MSAA technique of your choice.

Antialiasing - setting

Explanation of MSAA methods

This is where the specific method of forced/enhanced MSAA is set.

Antialiasing - Transparency Multisampling

Technical explanation
Interactive example
This enables and disables the use of transparency multisampling.

Antialiasing - Transparency Supersampling

Further Reference
This sets whether Supersampling (TrSSAA) or Sparse Grid Supersampling (SGSSAA) is used.
Supersampling (TrSSAA):
  • Inexpensive
  • Does not require MSAA to work
  • Only works on alpha-tested surfaces
Full-Screen Sparse Grid Supersampling (SGSSAA):
  • Extremely high-quality supersampling
  • Works on the entire image
  • Requires MSAA to work
  • Very expensive

Notes

The level of supersampling must match a corresponding level of MSAA. For example, if 4x MSAA is used, then 4x Sparse Grid Supersampling must be used.
In OpenGL Supersampling is equivalent to Sparse Grid Supersampling. Ergo, 4xMSAA+4xTrSSAA will give you 4xFSSGSSAA in OpenGL games that work.


Do note that these usually require AA compatibility flags to work properly in DX9 games!

Enable Maxwell sample interleaving (MFAA)

Introduction to MFAA

This enables Nvidia's Multi-Frame Anti-Aliasing mode. It only works in DXGI (DX10+) and requires MSAA to be either enabled in-game or forced. MFAA changes the sub-sample grid pattern every frame and reconstructs the result in motion with what Nvidia calls a "Temporal Synthesis Filter".
There are some caveats to using this though.

It is not compatible with SGSSAA as far as shown in limited testing.[citation needed]
Depending on the game, MFAA causes visible flickering on geometric edges and other temporal artifacts. Part of this is nullified with downsampling.[citation needed]
It has a minimum framerate requirement of about 40FPS. Otherwise MFAA will degrade the image.
Produces visible sawtooth patterns in screenshots and videos captured locally.[citation needed]
Incompatible with SLI.[citation needed]

Fullscreen Approximate Anti-Aliasing (FXAA)

Nvidia Predefined FXAA usage

Simply tells the driver whether FXAA is allowed to be turned on in Nvidia Control Panel (Primarily) or Nvidia Inspector.

Toggle FXAA indicator on or off

If enabled, this displays a small green indicator in the upper-left corner of the screen showing whether FXAA is active.

Toggle FXAA on or off

Turns FXAA on or Off.

Texture Filtering

Anisotropic Filtering (AF) mode

Determines whether the driver or the application controls Anisotropic Filtering.

It is recommended you leave this set to User Defined/Off as many games do not have Texture Filtering options and many more games also have mediocre or intentionally-mediocre Texture Filtering.[citation needed]

Anisotropic Filtering setting

If the AF mode is set to User Defined/Off, this determines what level of Texture Filtering is forced on an application. 16x offers the highest quality and is the recommended setting.

Prevent Anisotropic Filtering

Similar to AA Behavior Flags, if this is set to On it will ignore user-defined driver overrides. Some games default to this, such as Quake 4, Rage, Doom (2016), Far Cry 3 and Far Cry 3: Blood Dragon.

Texture Filtering - Anisotropic filter optimization

"Anisotropic filter optimization improves performance by limiting trilinear filtering to the primary texture stage where the general appearance and color of objects is determined. Bilinear filtering is used for all other stages, such as those used for lighting, shadows, and other effects. This setting only affects DirectX programs."[citation needed]

Texture Filtering - Anisotropic sample optimization

"Anisotropic sample optimization limits the number of anisotropic samples used based on texel size. This setting only affects DirectX programs."[citation needed]

Texture Filtering - Driver Controlled LOD Bias

When using SGSSAA, enabling this allows the driver to compute its own negative LOD bias for textures to help improve sharpness (at the cost of potentially more aliasing) for those who prefer it. The driver's value is generally less aggressive than the fixed amounts recommended below.[citation needed]

Manual LOD bias will be ignored when this is enabled.

Texture Filtering - LOD Bias (DX)

Interactive example
Explanation of LOD bias

The Level of Detail Bias setting for textures in DirectX backends. This normally only works under 2 circumstances.
For both, "Driver Controlled LoD Bias" must be set to "Off"

  1. When Overriding or Enhancing AA.
  2. If you leave the Antialiasing Mode set to Application Controlled or Enhance Application Setting (try Application Controlled first!) but set the Antialiasing and Transparency Supersampling settings to SGSSAA (e.g. 4xMSAA and 4xSGSSAA; TrSSAA in OpenGL), you can freely set the LOD bias and the changes will apply without forcing AA. This has the side effect that in some games with in-game MSAA it will act as if you were "Enhancing" the game's setting, even when using Application Controlled.

Notes:

If you wish to use a negative LOD bias when forcing SGSSAA, these are the recommended amounts:
  1. 2xSGSSAA (2 samples): -0.5
  2. 4xSGSSAA (4 samples): -1.0
  3. 8xSGSSAA (8 samples): -1.5
Do not use a negative LOD bias when using OGSSAA and HSAA as they have their own LOD bias.[8]
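
The recommended amounts above follow a simple rule of thumb (not an official Nvidia formula): each doubling of the sample count lowers the bias by a further 0.5, i.e. bias = -0.5 * log2(samples). A quick check:

    # Rule of thumb only (not an official Nvidia formula): recommended
    # negative LOD bias for SGSSAA as a function of the sample count.
    import math

    def sgssaa_lod_bias(samples):
        return -0.5 * math.log2(samples)

    for s in (2, 4, 8):
        print(f"{s}xSGSSAA: {sgssaa_lod_bias(s):+.1f}")
    # 2xSGSSAA: -0.5, 4xSGSSAA: -1.0, 8xSGSSAA: -1.5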

Texture Filtering - LOD Bias (OGL)

Identical to Texture Filtering - LOD Bias (DX), but applies to OpenGL applications and games instead.

Texture Filtering - Negative LOD bias

Controls whether negative LOD bias values are clamped (not allowed) or allowed.

Has no effect on Fermi GPUs and beyond; negative LOD bias is clamped by default on those GPUs.

Texture Filtering - Quality

This controls AF optimizations meant to improve AF performance on legacy hardware at the cost of quality.

Should be set to or left at High Quality if not using a GeForce 8 series GPU or earlier.

Texture Filtering - Trilinear Optimization

Same as Texture Filtering - Quality in practice.[citation needed]

Disabled if Texture Filtering - Quality is set to High Quality.

Common

Ambient Occlusion setting

This setting determines what kind of Ambient Occlusion is used in games which do not support it.

Has three modes:
  1. Performance
  2. Quality
  3. High Quality

Quality and High Quality are nearly identical, while Performance noticeably lowers the resolution and precision of the effect, producing less accurate and stronger shading in many games.[citation needed]

Certain games need Performance to support forced Ambient Occlusion, such as Oddworld: New 'n' Tasty!.[citation needed]

Ambient Occlusion usage

When using forced Ambient Occlusion, set this to On.

Extension limit

"Extension limit indicates whether the driver extension string has been trimmed for compatibility with particular applications. Some older applications cannot process long extension strings and will crash if extensions are unlimited.
If you are using an older OpenGL application, turning this option on may prevent crashing. If you are using a newer OpenGL application, you should turn this option off"[citation needed]

Multi-display/mixed-GPU acceleration

This controls GPU-based acceleration in OpenGL applications and will not have any effect on performance on DirectX platforms.[9]

Power management mode

"Setting Power management mode from "Adaptive" to "Maximum Performance" can improve performance in certain applications when the GPU is throttling the clock speeds incorrectly."[10]

Adaptive vs. Max Performance power setting

Adaptive

Automatically determines whether to use a lower clock speed for games and apps.
Works well when not playing games.
Frequently causes problems in games[citation needed]
Nvidia GPU Boost GPUs tend to incorrectly use a lower clock speed.[citation needed]

Prefer Max Performance

Prevents the GPU from switching to a lower clock speed for games and apps.
Works well when playing games.
Wasteful when not using GPU-intensive applications or games.


So, if you want to use Adaptive by default, remember to set Prefer Maximum Performance in each profile for each game you play.

First steps for tweaking driver settings
  • The first thing you will always want to do for any given game is search to see whether Nvidia has already made a profile for it.

To do this, left-click in the white "Profiles" box and start typing the name of the game. The application will narrow the list down as you type.

  • If your game does not have a profile, you will need to create one by clicking on the Sun icon.

It will prompt you to name the profile.

  • You then will need to add the game .exe to this new profile by hitting the folder with the green + icon.

  • If you attempt to add an executable to a profile and it prompts you with a message that the executable (e.g. Insert Here.exe) already belongs to another profile:

Pay attention to the name of the profile(s) it gives you. For example, if the game you were looking for didn't show up in a search, you may have worded it differently than it is named in the driver. However, if the profiles it mentions aren't related to the game at all, you will need to set the profile to use the .exe based on its directory.
When you go to select an .exe, click the drop-down box in the right-hand corner and set it to "Absolute Application Path".

  • Now you are ready to change settings on the profile. Don't forget to hit Apply Changes after making any changes!

Shader cache

Enables or disables the shader cache, which saves compiled shaders to your hard drive to improve game performance.[11]

Added in driver 337.88
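
As a purely conceptual illustration of the idea (not how the Nvidia driver actually implements it, and with a hypothetical cache location), a compiled artifact can be stored on disk keyed by a hash of its source so it only has to be compiled once:

    # Conceptual illustration only, not the driver's implementation: cache a
    # "compiled" artifact on disk, keyed by a hash of its source text.
    import hashlib, os

    CACHE_DIR = "shader_cache"  # hypothetical location

    def compile_shader(source):
        return source.encode()  # stand-in for a real, expensive compile step

    def get_shader(source):
        os.makedirs(CACHE_DIR, exist_ok=True)
        path = os.path.join(CACHE_DIR, hashlib.sha256(source.encode()).hexdigest())
        if os.path.exists(path):             # cache hit: skip compilation
            with open(path, "rb") as f:
                return f.read()
        binary = compile_shader(source)      # cache miss: compile and store
        with open(path, "wb") as f:
            f.write(binary)
        return binary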

Threaded optimization

Unknown feature which works in both DirectX and OpenGL games
Can cause issues in certain games.[citation needed]

SLI

Antialiasing - SLI AA

This determines whether a Nvidia SLI setup splits the MSAA workload between the available GPUs.[12]

Disable SLI (Explicitly set through NVAPI)

Number of GPUs to use on SLI rendering mode

NVIDIA predefined number of GPUs to use on SLI rendering mode

NVIDIA predefined number of GPUs to use on SLI rendering mode on DX10

NVIDIA predefined SLI mode

NVIDIA predefined SLI mode on DX10

SLI indicator

SLI rendering mode

References

  1. Guru3D Forums - Nvidia Inspector introduction and guide, post # 108 - last accessed on 2016-10-12
  2. Maximum Pre-Rendered frames for SLI - last accessed on 2016-10-12
    "The 'maximum pre-rendered frames' function operates within the DirectX API and serves to explicitly define the maximum number of frames the CPU can process ahead of the frame currently being drawn by the graphics subsystem. This is a dynamic, time-dependent queue that reacts to performance: at low frame rates, a higher value can produce more consistent gameplay, whereas a low value is less likely to introduce input latencies incurred by the extended storage time (of higher values) within the cache/memory. Its influence on performance is usually barely measurable, but its primary intent is to control peripheral latency."
  3. http://www.displaylag.com/reduce-input-lag-in-pc-games-the-definitive-guide/
  4. NVIDIA - What is Smooth Vsync? - last accessed on 2016-10-12
    "Smooth Vsync is a new technology that can reduce stutter when Vsync is enabled and SLI is active.
    When SLI is active and natural frame rates of games are below the refresh rate of your monitor, traditional vsync forces frame rates to quickly oscillate between the refresh rate and half the refresh rate (for example, between 60Hz and 30Hz). This variation is often perceived as stutter. Smooth Vsync improves this by locking into the sustainable frame rate of your game and only increasing the frame rate if the game performance moves sustainably above the refresh rate of your monitor. This does lower the average framerate of your game, but the experience in many cases is far better.
    "
  5. http://www.geforce.com/hardware/technology/adaptive-vsync/technology
  6. https://www.technopat.net/sosyal/konu/video-what-is-nvidia-fast-sync.329258/
  7. NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10 - What's Gamma Correct AA? - last accessed on 2016-10-12
  8. Natural Violence - SGSSAA - last accessed on 2016-10-12
  9. https://forums.geforce.com/default/topic/449234/sli/mulit-display-mixed-gpu-acceleration-how-to-use-/
  10. https://nvidia.custhelp.com/app/answers/detail/a_id/3130/~/setting-power-management-mode-from-adaptive-to-maximum-performance
  11. GeForce 337.88 WHQL, Game Ready Watch Dogs Drivers Increase Frame Rates By Up To 75% - last accessed on 2016-10-12
    "Shaders are used in almost every game, adding numerous visual effects that can greatly improve image quality (you can see the dramatic impact of shader technology in Watch Dogs here). Generally, shaders are compiled during loading screens, or during gameplay in seamless world titles, such as Assassin's Creed IV, Batman: Arkham Origins, and the aforementioned Watch Dogs. During loads their compilation increases the time it takes to start playing, and during gameplay increases CPU usage, lowering frame rates. When the shader is no longer required, or the game is closed, it is disgarded, forcing its recompilation the next time you play.
    In today's 337.88 WHQL drivers we've introduced a new NVIDIA Control Panel feature called "Shader Cache", which saves compiled shaders to a cache on your hard drive. Following the compilation and saving of the shader, the shader can simply be recalled from the hard disk the next time it is required, potentially reducing load times and CPU usage to optimize and improve your experience.
    By default the Shader Cache is enabled for all games, and saves up to 256MB of compiled shaders in %USERPROFILE%\AppData\Local\Temp\NVIDIA Corporation\NV_Cache. This location can be changed by moving your entire Temp folder using Windows Control Panel > System > System Properties > Advanced > Environmental Variables > Temp, or by using a Junction Point to relocate the NV_Cache folder. To change the use state of Shader Cache on a per-game basis simply locate the option in the NVIDIA Control Panel, as shown below.
    "
  12. NVIDIA - SLI Antialiasing - last accessed on 2016-10-12
    "SLI AA essentially disables normal AFR rendering, and in 2-way mode will use the primary GPU for rendering+forced AA, while the secondary GPU is only used to do AA work.
    In SLI8x mode for example, each GPU would then do 4xMSAA after which the final result becomes 4xMSAA+4xMSAA=8xMSAA.
    This can be useful in games without proper SLI support, so at least the second GPU is not just idling.
    However it unfortunately only works correctly in OpenGL, and there will be no difference in temporal behavior between for example normal forced 4xMSAA+4xSGSSAA and SLI8x+8xSGSSAA in DX9.
    "