* Low Latency Mode – you can keep this on On or Ultra, as it helps limit queued frames and therefore lowers input lag in games that do not use NVIDIA Reflex technology. Having it on Ultra might affect your FPS a bit, since your CPU has to work harder. If your games do not use more than 85% of your GPU, leave Low Latency Mode off.
* Power management mode – set to Prefer maximum performance to allow the GPU to boost to its highest clock and keep it there for the best latency and performance. If you are on a laptop, or on a PC with thermal issues, and are willing to sacrifice performance, choose a different value that will help with power saving.
* Refresh rate – highest available, to use the maximum potential of your monitor. Since we are minimizing latency and maximizing FPS, G-Sync is off and the maximum refresh rate is used. The setting will change if you use Adaptive Sync.
* Shader cache / Shader cache size (on newer drivers) – keep it on, or on the driver default value. It "controls the maximum amount of disk space the driver may use for storing shader compiles. Shader compiles are normally performed each time a game runs and are a common cause of game-play stuttering. The shader cache stores these compiled shaders so that subsequent runs of the same game do not need to perform the shader compilation." This setting works well with SSDs but is not recommended on HDDs.
* The texture filtering settings do not matter much in terms of FPS, so just make sure to select High performance on the Quality setting.

On a previous laptop, the performance/powersave mode under 16.04 worked exceedingly well. I had an ASUS RoG 17.3" with two HDDs and a GTX 860M running at 9–11 W while running office applications. (I of course used TLP.) This meant about 5.5–7 hr of battery life on a gaming laptop! It would outlast my classmates' MacBooks and tablets.

Currently, I have found some strange things happen under the X server since 18.04 and a new laptop. On performance mode, consumption would run 25–40 W. For instance, as indicated by PowerTop, when the X server is set to powersave mode, usage is about 29–31 W at idle, very high for my 15.4". When set to performance mode, usage is generally 19–22 W, which is somewhat marginal, but lower! I have a hunch the GPU is not disabled under powersave mode and is running at unwarranted stress levels, as my fans kick up while the CPU is still running at a low 800–805 MHz, as shown by grep and managed by TLP. Though my current setup is a GTX 1070, the above problems do not make sense to me, since powersave causes massive power consumption. I am also reluctant to attempt Bumblebee, as many have indicated it is currently shady with 18.04. I use this machine for work and games, so I don't want to risk breaking the install by switching drivers frequently. Is there a simple but reliable way to force the Intel HD graphics and disable the GPU?

I have recently found I was having the very same issue on my laptop with an NVIDIA 1050 Ti, on either of the NVIDIA drivers 435.21 and 440.59. As observed with PowerTop, the NVIDIA PCI device was still consuming power when Intel was selected, whether via the NVIDIA X Server Settings tool or via the 'prime-select intel' command. And the laptop was pretty hot even if only a web browser with static content was open. The answer above by Sylvain helps to power off the NVIDIA chip, but unfortunately that does not persist after a system restart: it has to be done manually every time the laptop starts fresh. I was searching for a solution that would not require running the manual command after each system start, and wish to present it for those who want that as well.
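Sylvain's exact command is not reproduced in this excerpt, so as a sketch only: a common way to make a boot-time power-off of the dGPU persist is a systemd oneshot unit. Everything below is an assumption, not the original answer – the unit name is made up, the PCI address `0000:01:00.0` must be replaced with whatever `lspci | grep -i nvidia` reports on your machine, and writing `1` to the device's sysfs `remove` node is just one known way to detach a PCI device.

```ini
# /etc/systemd/system/disable-nvidia.service  (hypothetical unit name)
[Unit]
Description=Detach the NVIDIA dGPU from the PCI bus at boot
After=multi-user.target

[Service]
Type=oneshot
# 0000:01:00.0 is an assumption -- check `lspci | grep -i nvidia` for your address
ExecStart=/bin/sh -c 'echo 1 > /sys/bus/pci/devices/0000:01:00.0/remove'

[Install]
WantedBy=multi-user.target
```

After creating the file, `sudo systemctl daemon-reload && sudo systemctl enable disable-nvidia.service` would run it on every boot, which is exactly the "no manual command after each system start" behaviour being asked for.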
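To see whether the NVIDIA chip really is powered down (the same information PowerTop's device view gives), the kernel exposes a per-device runtime-PM status in sysfs. A minimal Python sketch, assuming the dGPU sits at PCI address `0000:01:00.0` (the helper name is illustrative, not from the original post):

```python
# Hypothetical check: read the kernel's runtime power-management status
# for a PCI device. "suspended" means the device has powered down;
# "active" means it is still drawing power.
from pathlib import Path

def gpu_runtime_status(pci_addr="0000:01:00.0"):
    """Return the runtime-PM status string for a PCI device, or None if absent."""
    p = Path(f"/sys/bus/pci/devices/{pci_addr}/power/runtime_status")
    return p.read_text().strip() if p.exists() else None

print(gpu_runtime_status())  # e.g. 'suspended' on a powered-down dGPU, None if no such device
```

If this keeps reporting `active` after `prime-select intel`, that matches the symptom described above: the driver switch alone is not putting the chip to sleep.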