How can I make my computer use the NVIDIA GPU instead of the Intel GPU?
I posted before about the editor crashing whenever I tried to start it: https://answers.unrealengine.com/questions/16431/ue4editor-cmdexe-has-crashed.html
I have since found a solution to the crash, which is to use a DLL from this thread: https://answers.unrealengine.com/questions/14856/ue4editor-cmdexe-crashed-after-launch-1.html
However, my problem now is that the editor is horribly slow because it is using my Intel GPU instead of my NVIDIA GPU, and nothing I try forces UE4 to switch to the NVIDIA GPU.
I have configured the NVIDIA Control Panel, both the global settings and application-specific settings for ue4editor.exe, ue4editor-cmd.exe, and ue4game.exe, as shown in the screenshots I made.
My computer specs are: Intel Core i7-2630QM CPU, 8GB RAM, NVIDIA GT555M, Windows 7 64-bit.
Unreal Engine 4 refuses to use my NVIDIA card even though I explicitly set it in the NVIDIA Control Panel.
I have never had this problem with any other program than Unreal Engine 4.
As proof that it is not using the NVIDIA GPU, I have taken a screenshot of the NVIDIA helper application that shows GPU activity: http://images.allprog.nl/viewimage.php?img=8088_1395922601.png It should show the icon of each application that uses the NVIDIA GPU, but it only shows Visual Studio 2012.
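In case it helps anyone reproduce this, below is a rough sketch of my own (assuming a Windows build with the DXGI headers and dxgi.lib available) that lists every graphics adapter Windows exposes to Direct3D. If Optimus is set up correctly, both the Intel and the NVIDIA adapter should appear in the list:

    // Minimal sketch: enumerate the graphics adapters Windows exposes
    // through DXGI, so you can see whether the NVIDIA GPU is visible
    // to Direct3D at all. Build as a Win32 console app.
    #include <windows.h>
    #include <dxgi.h>
    #include <cstdio>
    #pragma comment(lib, "dxgi.lib")

    int main()
    {
        IDXGIFactory* factory = nullptr;
        if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
            return 1;

        IDXGIAdapter* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC desc;
            adapter->GetDesc(&desc);
            // Description is a wide string, hence %ls.
            printf("Adapter %u: %ls\n", i, desc.Description);
            adapter->Release();
        }
        factory->Release();
        return 0;
    }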
There's an answer marked as correct, but the problem wasn't solved. Does Epic have a say on this? It's really affecting the performance of a Kinect project at my company. We bought a very expensive laptop to use at an event, and Unreal's performance is being crippled because it runs on the Intel HD Graphics!
answered Aug 20 '15 at 08:32 PM
Which video card is your monitor plugged into? Are you using a laptop? I've heard of setups where the NVIDIA GPU outputs through the Intel GPU's screen buffer as a way of saving laptop battery power, although I'm not familiar with it.
If you're on a desktop, I'd suggest plugging your monitor into your NVIDIA card's output.
answered Mar 30 '14 at 08:57 PM
I've actually seen this problem before, but with the Source engine. The solution there was to disable Secure Boot in the BIOS.
answered Mar 31 '14 at 02:34 AM
Hi, I'm using a Lenovo and I had the same problem, but the Lenovo support website has a special installer for Optimus (separate from the NVIDIA driver).
I now have a little icon in the notification area that tells me which software uses my GPU...
answered Apr 01 '14 at 08:47 AM
After running WildStar (a game) for the first time, I noticed that it does not use my NVIDIA card either.
What WildStar and UE4 have in common is that they are both 64-bit programs. So I figured my NVIDIA Optimus implementation might not work properly for 64-bit programs.
To confirm this, I compiled my own 32-bit and 64-bit OpenGL programs (sketched below) and saw that the 32-bit program uses my NVIDIA card while the 64-bit one does not.
According to this, the problem might not be with UE4 but with my laptop...
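For reference, this is roughly the kind of test program I mean; a minimal sketch, assuming a Win32 build with opengl32.lib and gdi32.lib linked. The exported NvOptimusEnablement variable is NVIDIA's documented hint asking Optimus to route an application to the discrete GPU. Compiling the same source as 32-bit and 64-bit and comparing the printed GL_RENDERER strings reproduces my test:

    // Minimal sketch: create a throwaway OpenGL context and print which
    // GPU provides it. NvOptimusEnablement = 1 is NVIDIA's documented
    // export asking Optimus to prefer the discrete GPU.
    #include <windows.h>
    #include <GL/gl.h>
    #include <cstdio>
    #pragma comment(lib, "opengl32.lib")
    #pragma comment(lib, "gdi32.lib")
    #pragma comment(lib, "user32.lib")

    extern "C" {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    }

    int main()
    {
        // A hidden top-level window is enough to get a device context.
        HWND hwnd = CreateWindowA("STATIC", "gl-probe", 0, 0, 0, 1, 1,
                                  nullptr, nullptr, nullptr, nullptr);
        HDC dc = GetDC(hwnd);

        PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd) };
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 32;
        SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);

        HGLRC rc = wglCreateContext(dc);
        wglMakeCurrent(dc, rc);
        // "Intel(R) HD Graphics ..." vs. "GeForce ..." tells you which
        // GPU the context landed on.
        printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));

        wglMakeCurrent(nullptr, nullptr);
        wglDeleteContext(rc);
        ReleaseDC(hwnd, dc);
        DestroyWindow(hwnd);
        return 0;
    }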
answered May 18 '14 at 11:24 AM
Yeah, I made it work. I have a laptop using NVIDIA Optimus. No matter what I set for r.GraphicsAdapter, Unreal always chose the Intel HD 4600. I noticed in the logs that this might be because there is in fact no output on the GeForce: all output goes through the Intel GPU, so all displays are connected to it.
So I made a fake display connected to the GeForce according to this: http://www.helping-squad.com/fake-connect-a-monitor/
Now that there is an output on both adapters, UE finally chooses the GeForce!
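For reference, this is where I was setting the cvar; a sketch assuming the adapter index reported in your editor log (the value 1 here is just an example, not necessarily your NVIDIA adapter):

    ; Sketch: <YourProject>/Config/ConsoleVariables.ini (or the engine's
    ; Engine/Config/ConsoleVariables.ini). As far as I can tell from the
    ; D3D11 RHI: -2 = first non-Intel adapter, -1 = prefer a
    ; non-integrated GPU (the default), 0 and up = explicit adapter index.
    [Startup]
    r.GraphicsAdapter=1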
answered Dec 01 '15 at 11:55 PM