
Nvidia Profile Inspector

Developers: Orbmu2k

Nvidia Profile Inspector ("NPI") is a third-party tool created for pulling up and editing application profiles within the Nvidia display drivers. Much like the 3D settings page in the Nvidia Control Panel, but more in-depth and exposes settings and functionality not available through the normal control panel for the display driver.

The tool was spun off of the original Nvidia Inspector, which featured an overclocking utility and required you to create a shortcut to launch the profile editor.

Useful Links

Releases
GitHub
In-depth Guide

Compatibility Flags

Anti-Aliasing Compatibility Flags
HBAO+ Compatibility Flags
SLI Compatibility Flags (German)

Related articles

Nvidia
Nvidia Control Panel

Installation

Main window of the global/base profile.
Instructions
  1. Download the latest build.
  2. Save and extract it to an appropriate permanent location.
  3. Run nvidiaProfileInspector.exe to start the application. It may be worth making a shortcut and pinning it to your start menu or desktop.

Status indicators

  • White cog - Global default / Unchanged.
  • Grey cog - Global override through the base _GLOBAL_DRIVER_PROFILE profile.
  • Blue cog - Profile-specific user-defined override.
  • Green cog - Profile-specific Nvidia predefined override.

Tweaks

Why force Anisotropic Filtering?

This is an example of why you should set texture filtering globally and forget it: some developers' own solutions are not as high quality as they should be.
If a game conflicts with driver-forced texture filtering, Nvidia usually sets a special flag on its profile that disables the driver override for that game (Far Cry 3 and Doom (2016) are examples of this).
Here is an example of a modified global profile. It does not have to be copied verbatim, but at a minimum setting anisotropic filtering to override and High Quality is recommended.

Settings

Keep changes to the base _GLOBAL_DRIVER_PROFILE profile limited, as overrides there are applied to all games and applications.

Compatibility

Compatibility parameters that allow ambient occlusion, anti-aliasing, and SLI to be used in games when given an appropriate value.
Ambient Occlusion/HBAO+ Compatibility Flags
Anti-Aliasing Compatibility Flags
SLI Compatibility Flags (German)
Ambient Occlusion compatibility: Allows HBAO+ to work with any given game.
Antialiasing compatibility: Allows various forms of anti-aliasing to work with any given DirectX 9 game.
Antialiasing compatibility (DX1x): Allows various forms of anti-aliasing to work with DirectX 10/11/12 games. Often restricted to MSAA and generally sees little use due to its limited capability.
Antialiasing Fix: Avoids black edges in some games when using MSAA.[1] Enabled by default for Team Fortress 2 and a few other games.
SLI compatibility bits: Allows SLI to work in any given DirectX 9 game.
SLI compatibility bits (DX10+DX11): Allows SLI to work in any given DirectX 10/11 game.
SLI compatibility bits (DX12): Allows SLI to work in any given DirectX 12 game.

Sync and Refresh

The built-in G-Sync status indicator overlay.
The built-in frame rate limiter of the Nvidia display drivers introduces 2+ frames of delay, while RTSS (and possibly other alternatives) only introduces 1 frame of delay.[2]
AnandTech - Triple Buffering: Why We Love It - Recommended reading about triple buffering/Fast Sync.
Flip Indicator: Enables an on-screen indicator for frames presented using the flip model.
Frame Rate Limiter: Enables the built-in frame rate limiter of the display drivers at the specified FPS.
Frame Rate Limiter Mode: Controls which mode of the frame rate limiter is used.
GSYNC - Application Mode: Controls whether G-Sync is active in fullscreen only or in windowed mode as well.
GSYNC - Application Requested State: Unknown.
GSYNC - Application State: Selects the technique used to control the refresh policy of an attached monitor.
  • Allow enables the use of G-Sync and synchronizes the monitor refresh rate to the GPU's render rate.
  • Force off and Disallow disable the use of G-Sync.
  • Fixed Refresh Rate is the traditional fixed refresh rate monitor technology.
  • Ultra Low Motion Blur (ULMB) uses backlight pulsing at a fixed refresh rate to minimize blur.

It is highly recommended to change this through Nvidia Control Panel > Manage 3D Settings > Monitor Technology instead, which properly configures this and related parameters.

GSYNC - Global Feature: Toggles the global G-Sync functionality. It is recommended to change this through Nvidia Control Panel > Set up G-SYNC instead, which properly configures this and related parameters.
GSYNC - Global Mode: Controls whether the G-Sync feature is active globally in fullscreen only or in windowed mode as well.
GSYNC - Indicator Overlay: Enables a semi-transparent on-screen status indicator showing when G-Sync is being used. If G-Sync is not in use, the indicator is not shown at all.
Maximum pre-rendered frames: Controls how many frames the CPU can prepare ahead of the frame currently being drawn by the GPU. Increasing this can result in smoother gameplay at lower frame rates, at the cost of additional input delay.[3] This setting does not apply to SLI configurations.
  • The default is Use the 3D application setting, and it is recommended not to go above 3. Values of 1 and 2 help reduce input latency in exchange for greater CPU usage.[4]
  • When V-Sync is set to 1/2 Refresh Rate, a value of 1 is essentially required due to the introduced input latency.[citation needed]
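
As a rough illustration of the latency trade-off (an approximation, not an official Nvidia figure): each additionally queued frame can add up to about one frame-time of input delay, so the worst-case extra delay is roughly

  added delay ≈ queued frames × (1000 / frame rate) ms

At 60 FPS that is up to roughly 50 ms with a queue of 3 frames versus roughly 17 ms with a queue of 1, which is why lower values are suggested for latency-sensitive games.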
Preferred Refresh Rate: Controls the refresh rate override of the display drivers for games running in exclusive fullscreen mode. This also dictates the upper limit of the G-Sync range: at frame rates above the configured refresh rate, G-Sync goes inactive and the screen starts to tear, or V-Sync caps the frame rate to the refresh rate if enabled.
  • Highest available - Overrides the refresh rate of the exclusive fullscreen game to the highest available on the monitor. This setting is automatically used when G-Sync is enabled. Note that this override might not work for all games, in which case an alternative such as Special K might be needed.
  • Use the 3D application setting / Application-controlled - Uses the refresh rate requested by the application. If using G-Sync, frame rates above the requested refresh rate will result in screen tearing as G-Sync goes inactive, or in V-Sync capping the frame rate to the refresh rate if enabled.

This setting is only exposed in the Nvidia Control Panel for G-Sync monitors supporting refresh rates above 60 Hz.

Triple buffering: Toggles triple buffering in OpenGL games. Enabling this can improve performance when vertical sync is also turned on.
Vertical Sync: Controls vertical sync, or enables Fast Sync. Fast Sync is essentially triple buffering for DirectX games and only takes effect at frame rates above the refresh rate while regular V-Sync is disabled.
Vertical Sync Smooth AFR behavior: Toggles V-Sync smoothing behavior when V-Sync is enabled and SLI is active.

Official description:[5]

Smooth Vsync is a new technology that can reduce stutter when Vsync is enabled and SLI is active.

When SLI is active and natural frame rates of games are below the refresh rate of your monitor, traditional vsync forces frame rates to quickly oscillate between the refresh rate and half the refresh rate (for example, between 60Hz and 30Hz). This variation is often perceived as stutter. Smooth Vsync improves this by locking into the sustainable frame rate of your game and only increasing the frame rate if the game performance moves sustainably above the refresh rate of your monitor. This does lower the average framerate of your game, but the experience in many cases is far better.

Vertical Sync Tear Control: Toggles Adaptive V-Sync when vertical sync is enabled. Not available when G-Sync is enabled.

Official description:[6]

Nothing is more distracting than frame rate stuttering and screen tearing. The first tends to occur when frame rates are low, the second when frame rates are high. Adaptive VSync is a smarter way to render frames using NVIDIA Control Panel software. At high framerates, VSync is enabled to eliminate tearing. At low frame rates, it's disabled to minimize stuttering.

Antialiasing

Antialiasing - Behavior Flags: These mostly exist to govern how AA can be used from the Nvidia Control Panel, though they affect Inspector as well. Clear any flags in this field on a game profile when forcing AA, as leftover flags can interfere and prevent forced AA from working.
Antialiasing - Gamma Correction: Gamma correction for MSAA. Introduced with Half-Life 2 in 2004. Defaults to ON starting with Fermi GPUs and should not be set to OFF on modern hardware.[7]
Antialiasing - Line gamma: Unknown.
Antialiasing - Mode: This has three settings:
  • Application Controlled - The application decides whether to use AA.
  • Override Application Setting - The driver overrides the application setting, allowing AA to be forced from the driver. As a general rule, disable any in-game AA/MSAA when using this to avoid conflicts; exceptions are generally noted in the AA compatibility flags list.
  • Enhance Application Setting - Enhances the application's AA setting (for example, enhancing in-game MSAA with TrSSAA).
Antialiasing - Setting: This is where the specific method of forced/enhanced MSAA is set. See: Explanation of MSAA methods.
Antialiasing - Transparency Multisampling: Enables or disables the use of transparency multisampling. See: Technical explanation, Interactive example.
Antialiasing - Transparency Supersampling: Sets whether Supersampling or Sparse Grid Supersampling is used. See: Further reference.
Enable Maxwell sample interleaving (MFAA): Enables Nvidia's Multi-Frame Anti-Aliasing mode (see: Introduction to MFAA). It only works in DXGI (DX10+) titles and requires MSAA to either be enabled in-game or forced. It changes the sub-sample grid pattern every frame and reconstructs the result in motion with what Nvidia calls a "Temporal Synthesis Filter". Caveats:
  • Not compatible with SGSSAA, as far as limited testing has shown.[citation needed]
  • Depending on the game, MFAA causes visible flickering on geometric edges and other temporal artifacts. This is partly nullified with downsampling.[citation needed]
  • Requires a minimum frame rate of about 40 FPS; below that, MFAA degrades the image.
  • Produces visible sawtooth patterns in locally captured screenshots and videos.[citation needed]
  • Incompatible with SLI.[citation needed]
Nvidia Predefined FXAA Usage: Tells the driver whether FXAA is allowed to be turned on in the Nvidia Control Panel (primarily) or Nvidia Inspector.
Toggle FXAA indicator on or off: When enabled, displays a small green icon in the upper-left corner showing whether FXAA is active.
Toggle FXAA on or off: Turns FXAA on or off.
Notes on the "Override" setting

When forcing AA in a game, set "Override" rather than either of the other two modes. Enhancing AA depends entirely on how MSAA is implemented in the game you are creating a profile for, and, especially in modern DX10+ games, the result is unpredictable: it may not work, may break something, or may look very bad.

However, there are a few exceptions. In certain games and engines, such as Monolith Productions' mid-2000s LithTech-based games that use some sort of MSAA/FSAA hybrid, the above three settings generally do not matter.

Example: with F.E.A.R. 2, if the correct AA flag is set on the profile and the mode is left at Application Controlled, but 8xQ MSAA and 8xSGSSAA are selected below it and the game's own FSAA is enabled, the driver will enhance the FSAA.
In this specific case, the result actually looks much better than 8xSGSSAA or FSAA by themselves!

Another good example is Red Faction: Guerrilla, where AA cannot be forced at all.

Using the "Enhance" setting

When overriding in-game MSAA fails, you can instead replace (enhance) the in-game MSAA with the NVIDIA MSAA technique of your choice.

Supersampling
  • Inexpensive
  • Does not require MSAA to work
  • Only works on alpha-tested surfaces

Full-Screen Sparse Grid Supersampling
  • Extremely high-quality supersampling
  • Works on the entire image
  • Requires MSAA to work
  • Very expensive

Notes

The level of supersampling must match a corresponding level of MSAA. For example, if 4x MSAA is used, then 4x Sparse Grid Supersampling must be used.
In OpenGL, Supersampling is equivalent to Sparse Grid Supersampling, so 4xMSAA + 4xTrSSAA gives 4xFSSGSSAA in OpenGL games where forcing works.


Note that these usually require AA compatibility flags to work properly in DX9 games.

Texture Filtering

Anisotropic Filtering (AF) mode

Controls whether the application or the driver determines the level of anisotropic filtering used.

It is recommended to leave this set to User Defined/Off, as many games do not have texture filtering options, and many more have mediocre or intentionally limited texture filtering.[citation needed]

Anisotropic Filtering setting

If the AF mode is set to User Defined/Off, this determines the level of texture filtering forced on an application. 16x is the highest quality setting and is the recommended value.

Prevent Anisotropic Filtering

Similar to AA Behavior Flags, if this is set to On it will ignore user-defined driver overrides. Some games default to this, such as Quake 4, Rage, Doom (2016), Far Cry 3 and Far Cry 3: Blood Dragon.

Texture Filtering - Anisotropic filter optimization

"Anisotropic filter optimization improves performance by limiting trilinear filtering to the primary texture stage where the general appearance and color of objects is determined. Bilinear filtering is used for all other stages, such as those used for lighting, shadows, and other effects. This setting only affects DirectX programs."[citation needed]

Texture Filtering - Anisotropic sample optimization

"Anisotropic sample optimization limits the number of anisotropic samples used based on texel size. This setting only affects DirectX programs."[citation needed]

Texture Filtering - Driver Controlled LOD Bias

When SGSSAA is forced and this is enabled, the driver computes its own negative LOD bias for textures to help improve sharpness (at the cost of potentially more aliasing) for those who prefer it. The automatic bias is generally smaller than the fixed amounts recommended below.[citation needed]

Manual LOD bias will be ignored when this is enabled.

Texture Filtering - LOD Bias (DX)

Interactive example
Explanation of LOD bias

The Level of Detail bias setting for textures in DirectX backends. This normally only works under two circumstances; for both, "Driver Controlled LoD Bias" must be set to "Off":

  1. When overriding or enhancing AA.
  2. When Antialiasing - Mode is left at Application Controlled or Enhance Application Setting (try Application Controlled first), but the AA and transparency settings are set to SGSSAA (e.g. 4xMSAA and 4xSGSSAA, or TrSSAA in OpenGL). The LOD bias can then be set freely and the change will work without forcing AA. A side effect is that in some games with built-in MSAA, this acts as if the game setting were being enhanced, even with Application Controlled selected.

Notes:

If you wish to use a negative LOD bias when forcing SGSSAA, these are the recommended amounts:
  1. 2xSGSSAA (2 samples): -0.5
  2. 4xSGSSAA (4 samples): -1.0
  3. 8xSGSSAA (8 samples): -1.5
Do not use a negative LOD bias when using OGSSAA or HSAA, as they apply their own LOD bias.[8]
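
These recommended values follow the common rule of thumb of lowering the LOD bias by 0.5 for every doubling of the SGSSAA sample count, i.e. bias = -0.5 × log2(samples); this is a community convention rather than an official Nvidia formula. A minimal C sketch reproducing the table above:

  #include <math.h>
  #include <stdio.h>

  /* Rule of thumb: recommended LOD bias = -0.5 * log2(sample count).
     Reproduces the values listed above (2x -> -0.5, 4x -> -1.0, 8x -> -1.5). */
  static float recommended_lod_bias(int sgssaa_samples)
  {
      return -0.5f * log2f((float)sgssaa_samples);
  }

  int main(void)
  {
      for (int samples = 2; samples <= 8; samples *= 2)
          printf("%dxSGSSAA: %.1f\n", samples, recommended_lod_bias(samples));
      return 0;
  }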

Texture Filtering - LOD Bias (OGL)

Identical to Texture Filtering - LOD Bias (DX), but applies to OpenGL applications and games instead.

Texture Filtering - Negative LOD bias

Controls whether a negative LOD bias is clamped (not allowed) or allowed.

Has no effect on Fermi GPUs and later; the bias is clamped by default on those GPUs.

Texture Filtering - Quality

This controls AF optimizations meant to improve AF performance on legacy hardware at the cost of quality.

Should be set to or left at High Quality on anything newer than a GeForce 8 series GPU.

Texture Filtering - Trilinear Optimization

Same as Texture Filtering - Quality in practice.[citation needed]

Disabled if Texture Filtering - Quality is set to High Quality.

Common

Ambient Occlusion setting

This setting determines what kind of Ambient Occlusion is used in games which do not support it.

Has three modes:
  1. Performance
  2. Quality
  3. High Quality

Quality and High Quality are nearly identical, while Performance noticeably lowers the resolution and precision of the effect, producing stronger and less accurate shading in many games.[citation needed]

Certain games need Performance to support forced Ambient Occlusion, such as Oddworld: New 'n' Tasty!.[citation needed]

Ambient Occlusion usage

When using forced Ambient Occlusion, set this to On.

Extension limit

"Extension limit indicates whether the driver extension string has been trimmed for compatibility with particular applications. Some older applications cannot process long extension strings and will crash if extensions are unlimited.
If you are using an older OpenGL application, turning this option on may prevent crashing. If you are using a newer OpenGL application, you should turn this option off."[citation needed]

Multi-display/mixed-GPU acceleration

This controls GPU-based acceleration in OpenGL applications and will not have any effect on performance on DirectX platforms.[9]

Power management mode

"Setting Power management mode from "Adaptive" to "Maximum Performance" can improve performance in certain applications when the GPU is throttling the clock speeds incorrectly."[10]

Adaptive vs. Max Performance power setting

Adaptive

Automatically determines whether to use a lower clock speed for games and apps.
Works well when not playing games.
Frequently causes problems in games.[citation needed]
GPUs with Nvidia GPU Boost tend to incorrectly drop to lower clock speeds.[citation needed]

Prefer Max Performance

Prevents the GPU from switching to a lower clock speed for games and apps.
Works well when playing games.
Wasteful when not using GPU-intensive applications or games.


So, if you want to use Adaptive by default, remember to set Prefer Maximum Performance in each profile for each game you play.

First steps for tweaking driver settings
  • The first thing you will always want to do for any given game is to check whether Nvidia has already created a profile for it.

To do this, left-click in the white "Profiles" box and start typing the name of the game. The application will narrow down the list the more characters you type.

  • If your game does not have a profile, you will need to create one by clicking on the Sun icon.

It will prompt you to name the profile.

  • You will then need to add the game's .exe to this new profile by clicking the folder icon with the green plus.

  • If you attempt to add an executable to a profile and are prompted with a message that the executable already belongs to another profile:

Pay attention to the name of the profile(s) it gives you. For example, if the game you were looking for did not show up in a search, it is possible the profile is worded differently in the driver than you expected. However, if the profiles it mentions are not related to the game at all, you will need to set up the profile to use an .exe based on its directory:
When selecting the .exe, click the drop-down box in the right-hand corner and set it to "Absolute Application Path".

  • Now you are ready to change settings on the profile. Don't forget to hit Apply Changes after making any changes. (A scripted equivalent of this workflow is sketched below for reference.)
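
For reference, the same bookkeeping (create a profile, attach an executable, set a value, save) can also be done programmatically through Nvidia's public NVAPI driver-settings (DRS) interface, which is the same mechanism Inspector itself builds on. The sketch below is a minimal illustration only, assuming the NVAPI SDK is available and the code is built on Windows with MSVC; the profile name, executable name, and the PREFERRED_PSTATE_* constants are placeholder assumptions to verify against your copy of NvApiDriverSettings.h, not a recommendation.

  #include <string.h>
  #include "nvapi.h"                 /* NVAPI SDK header */
  #include "NvApiDriverSettings.h"   /* setting ID constants (assumed present) */

  int main(void)
  {
      NvDRSSessionHandle session = 0;
      NvDRSProfileHandle profile = 0;

      if (NvAPI_Initialize() != NVAPI_OK) return 1;
      if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK) return 1;
      NvAPI_DRS_LoadSettings(session);                  /* read current driver profiles */

      /* Create a new application profile (names are hypothetical examples). */
      NVDRS_PROFILE profileInfo = {0};
      profileInfo.version = NVDRS_PROFILE_VER;
      wcscpy((wchar_t *)profileInfo.profileName, L"My Game");
      NvAPI_DRS_CreateProfile(session, &profileInfo, &profile);

      /* Attach the game executable to the profile. */
      NVDRS_APPLICATION app = {0};
      app.version = NVDRS_APPLICATION_VER;
      wcscpy((wchar_t *)app.appName, L"mygame.exe");
      NvAPI_DRS_CreateApplication(session, profile, &app);

      /* Apply one DWORD setting; the constants used here are assumptions to verify. */
      NVDRS_SETTING setting = {0};
      setting.version = NVDRS_SETTING_VER;
      setting.settingId = PREFERRED_PSTATE_ID;                 /* "Power management mode" */
      setting.settingType = NVDRS_DWORD_TYPE;
      setting.u32CurrentValue = PREFERRED_PSTATE_PREFER_MAX;   /* "Prefer maximum performance" */
      NvAPI_DRS_SetSetting(session, profile, &setting);

      NvAPI_DRS_SaveSettings(session);                  /* the equivalent of "Apply Changes" */
      NvAPI_DRS_DestroySession(session);
      NvAPI_Unload();
      return 0;
  }

Exporting a profile from Inspector as a .nip file and inspecting it is a convenient way to see which setting IDs the options described above map to.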

Shader cache

Enables or disables the shader cache, which saves compiled shaders to your hard drive to improve game performance.[11]

Added in driver 337.88

Threaded optimization

A largely undocumented optimization that works in both DirectX and OpenGL games.
Can cause issues in certain games.[citation needed]

SLI

Antialiasing - SLI AA

This determines whether a Nvidia SLI setup splits the MSAA workload between the available GPUs.[12]

Disable SLI (Explicitly set through NVAPI)

Number of GPUs to use on SLI rendering mode

NVIDIA predefined number of GPUs to use on SLI rendering mode

NVIDIA predefined number of GPUs to use on SLI rendering mode on DX10

NVIDIA predefined SLI mode

NVIDIA predefined SLI mode on DX10

SLI indicator

SLI rendering mode


References

  1. Geeks3D Forums - Nvidia Inspector 1.9.7.2 Beta - last accessed on 2018-08-29
    "added FERMI_SETREDUCECOLORTHRESHOLDSENABLE as "Antialiasing fix" to avoid black edges in some games while using MSAA (R320.14+)"
  2. Blur Busters - G-SYNC 101: In-game vs. External FPS Limiters - last accessed on
    "Needless to say, even if an in-game framerate limiter isn’t available, RTSS only introduces up to 1 frame of delay, which is still preferable to the 2+ frame delay added by Nvidia’s limiter with G-SYNC enabled, and a far superior alternative to the 2-6 frame delay added by uncapped G-SYNC."
  3. Maximum Pre-Rendered frames for SLI - last accessed on 2016-10-12
    "The 'maximum pre-rendered frames' function operates within the DirectX API and serves to explicitly define the maximum number of frames the CPU can process ahead of the frame currently being drawn by the graphics subsystem. This is a dynamic, time-dependent queue that reacts to performance: at low frame rates, a higher value can produce more consistent gameplay, whereas a low value is less likely to introduce input latencies incurred by the extended storage time (of higher values) within the cache/memory. Its influence on performance is usually barely measurable, but its primary intent is to control peripheral latency."
  4. DisplayLag - Reduce Input Lag in PC Games: The Definitive Guide - last accessed on 2018-08-29
  5. Nvidia - What is Smooth Vsync? - last accessed on 2018-08-29
  6. Nvidia - Adaptive VSync - last accessed on 2018-08-29
  7. NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10 - What's Gamma Correct AA? - last accessed on 2016-10-12
  8. Natural Violence - SGSSAA - last accessed on 2016-10-12
  9. https://forums.geforce.com/default/topic/449234/sli/mulit-display-mixed-gpu-acceleration-how-to-use-/
  10. https://nvidia.custhelp.com/app/answers/detail/a_id/3130/~/setting-power-management-mode-from-adaptive-to-maximum-performance
  11. GeForce 337.88 WHQL, Game Ready Watch Dogs Drivers Increase Frame Rates By Up To 75% - last accessed on 2016-10-12
    "Shaders are used in almost every game, adding numerous visual effects that can greatly improve image quality (you can see the dramatic impact of shader technology in Watch Dogs here). Generally, shaders are compiled during loading screens, or during gameplay in seamless world titles, such as Assassin's Creed IV, Batman: Arkham Origins, and the aforementioned Watch Dogs. During loads their compilation increases the time it takes to start playing, and during gameplay increases CPU usage, lowering frame rates. When the shader is no longer required, or the game is closed, it is disgarded, forcing its recompilation the next time you play.
    In today's 337.88 WHQL drivers we've introduced a new NVIDIA Control Panel feature called "Shader Cache", which saves compiled shaders to a cache on your hard drive. Following the compilation and saving of the shader, the shader can simply be recalled from the hard disk the next time it is required, potentially reducing load times and CPU usage to optimize and improve your experience.
    By default the Shader Cache is enabled for all games, and saves up to 256MB of compiled shaders in %USERPROFILE%\AppData\Local\Temp\NVIDIA Corporation\NV_Cache. This location can be changed by moving your entire Temp folder using Windows Control Panel > System > System Properties > Advanced > Environmental Variables > Temp, or by using a Junction Point to relocate the NV_Cache folder. To change the use state of Shader Cache on a per-game basis simply locate the option in the NVIDIA Control Panel, as shown below.
    "
  12. NVIDIA - SLI Antialiasing - last accessed on 2016-10-12
    "SLI AA essentially disables normal AFR rendering, and in 2-way mode will use the primary GPU for rendering+forced AA, while the secondary GPU is only used to do AA work.
    In SLI8x mode for example, each GPU would then do 4xMSAA after which the final result becomes 4xMSAA+4xMSAA=8xMSAA.
    This can be useful in games without proper SLI support, so at least the second GPU is not just idling.
    However it unfortunately only works correctly in OpenGL, and there will be no difference in temporal behavior between for example normal forced 4xMSAA+4xSGSSAA and SLI8x+8xSGSSAA in DX9.
    "