Opened 11 years ago

Closed 11 years ago

Last modified 11 years ago

#1952 closed defect (fixed)

Graphics card detection fails with NVIDIA Optimus on Windows

Reported by: Evans
Owned by: ben
Priority: Must Have
Milestone: Alpha 14
Component: Core engine
Keywords: crashlog
Cc: Jan Wassenberg
Patch:

Description

I have both Intel integrated graphics and an NVIDIA GT 630M in my laptop. With the integrated Intel graphics the game works fine, but when I switch to the NVIDIA card, it crashes on startup.

Details: unhandled exception (Access violation reading 0x00000000)

Location: unknown:0 (?)

Call stack:

61007600

errno = 0 (No error reported here) OS error = 126 (The specified module could not be found.)

Attachments (3)

crashlog.txt (14.5 KB ) - added by Evans 11 years ago.
crashlog.dmp (84.7 KB ) - added by Evans 11 years ago.
graphics.jpg (112.2 KB ) - added by Evans 11 years ago.


Change History (18)

by Evans, 11 years ago

Attachment: crashlog.txt added

by Evans, 11 years ago

Attachment: crashlog.dmp added

comment:1 by historic_bruno, 11 years ago

Cc: Jan Wassenberg added

It's probably specific to Windows 8 and/or the drivers; it looks like the latest NVIDIA drivers are being used. The crash occurs when we try to enumerate graphics cards.

Call stack:

> 	nvd3d9wrap.dll!732f89b4() 	
 	[Frames below may be incorrect and/or missing, no symbols loaded for nvd3d9wrap.dll]	
	pyrogenesis.exe!wmi_GetClassInstances(const wchar_t * className=0x01443a28, std::vector<std::map<std::basic_string<wchar_t,std::char_traits<wchar_t>,std::allocator<wchar_t> >,tagVARIANT,std::less<std::basic_string<wchar_t,std::char_traits<wchar_t>,std::allocator<wchar_t> > >,std::allocator<std::pair<std::basic_string<wchar_t,std::char_traits<wchar_t>,std::allocator<wchar_t> > const ,tagVARIANT> > >,std::allocator<std::map<std::basic_string<wchar_t,std::char_traits<wchar_t>,std::allocator<wchar_t> >,tagVARIANT,std::less<std::basic_string<wchar_t,std::char_traits<wchar_t>,std::allocator<wchar_t> > >,std::allocator<std::pair<std::basic_string<wchar_t,std::char_traits<wchar_t>,std::allocator<wchar_t> > const ,tagVARIANT> > > > > & instances=[1]([...](...))  Line 123	C++
 	pyrogenesis.exe!wgfx_CardName(wchar_t * cardName=0x00bdeab8, unsigned int numChars=128)  Line 143 + 0x13 bytes	C++
 	pyrogenesis.exe!gfx::CardName()  Line 45 + 0x17 bytes	C++
 	pyrogenesis.exe!RunHardwareDetection()  Line 226 + 0xa bytes	C++
 	pyrogenesis.exe!InitGraphics(const CmdLineArgs & args=, int flags=)  Line 933	C++
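
For context, the WMI path in that call stack boils down to a COM query against the Win32_VideoController class. The sketch below is purely illustrative (the names and structure are assumptions, not the engine's actual wmi_GetClassInstances/wgfx_CardName code, and error handling is omitted); the enumeration step is where the Optimus wrapper nvd3d9wrap.dll gets involved and the null-pointer access is reported.

    // Illustrative WMI video-controller query (not the engine's actual code).
    #include <windows.h>
    #include <wbemidl.h>
    #include <comdef.h>
    #include <cstdio>
    #pragma comment(lib, "wbemuuid.lib")

    int main()
    {
        CoInitializeEx(nullptr, COINIT_MULTITHREADED);
        CoInitializeSecurity(nullptr, -1, nullptr, nullptr,
            RPC_C_AUTHN_LEVEL_DEFAULT, RPC_C_IMP_LEVEL_IMPERSONATE,
            nullptr, EOAC_NONE, nullptr);

        IWbemLocator* locator = nullptr;
        CoCreateInstance(CLSID_WbemLocator, nullptr, CLSCTX_INPROC_SERVER,
            IID_IWbemLocator, reinterpret_cast<void**>(&locator));

        IWbemServices* services = nullptr;
        locator->ConnectServer(_bstr_t(L"ROOT\\CIMV2"), nullptr, nullptr,
            nullptr, 0, nullptr, nullptr, &services);

        // Enumerating Win32_VideoController instances is roughly where the
        // reported crash occurs, with nvd3d9wrap.dll on top of the stack.
        IEnumWbemClassObject* enumerator = nullptr;
        services->ExecQuery(_bstr_t(L"WQL"),
            _bstr_t(L"SELECT Name FROM Win32_VideoController"),
            WBEM_FLAG_FORWARD_ONLY | WBEM_FLAG_RETURN_IMMEDIATELY,
            nullptr, &enumerator);

        IWbemClassObject* obj = nullptr;
        ULONG returned = 0;
        while (enumerator->Next(WBEM_INFINITE, 1, &obj, &returned) == WBEM_S_NO_ERROR)
        {
            VARIANT name;
            obj->Get(L"Name", 0, &name, nullptr, nullptr);
            wprintf(L"Card: %s\n", name.bstrVal);
            VariantClear(&name);
            obj->Release();
        }

        enumerator->Release();
        services->Release();
        locator->Release();
        CoUninitialize();
        return 0;
    }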

comment:2 by Jan Wassenberg, 11 years ago

Interesting that WMI calls the graphics driver directly; I would have thought they only do that once and cache the result. Heh, we'd like to check for that particular card/driver combo, but that's exactly what's failing :P We could use GL_RENDERER etc. to detect this case, avoid the WMI call, and return less-specific information.

Does it help to completely disable your Intel card, rather than just enabling the NVidia one?

by Evans, 11 years ago

Attachment: graphics.jpg added

comment:3 by Evans, 11 years ago

I don't really have an option to disable the Intel card. I can either set one of the cards to handle 3D rendering for all applications, or I can choose individually, per application. I tried disabling the Intel card in Device Manager, but that screws up the display for the entire OS. I have attached a screenshot of the NVIDIA control panel.

Last edited 11 years ago by Evans

comment:4 by Jan Wassenberg, 11 years ago

Ooh, sounds like an unhealthy hybrid. I was thinking of disabling at the BIOS level - that's the only way to truly hide it from the OS. Do you have that possibility? Why use integrated graphics at all when a decent graphics card is there? (I suppose the answer is power usage, and hope you aren't actually roaming.)

comment:5 by zoot, 11 years ago

The card being enabled per application is standard for NVIDIA Optimus cards. It still uses integrated graphics to output the final image, so we can't ask people to disable the latter.

Last edited 11 years ago by zoot

comment:6 by Jan Wassenberg, 11 years ago

Definitely an unhealthy hybrid.. yikes. I guess we need to skip this code on Windows 8 (ideally only after consulting the other mechanisms in wgfx.cpp and finding both NVidia and Intel driver files). Alternatively, anyone up for reporting this to NVidia and/or MS? I really don't think this is our bug.

in reply to:  5 comment:7 by historic_bruno, 11 years ago

evans, since this is a Dell laptop, are you using the latest drivers from the Dell support site? If you enter your service tag there, it looks like the latest they offer are older than what you're currently using. Maybe there is some incompatibility with NVIDIA's generic drivers, if you're using those? You could try uninstalling your current NVIDIA+Intel drivers and installing the ones from Dell instead. It's a shot in the dark, but worth a try IMO.

Replying to zoot:

The card being enabled per application is standard for NVIDIA Optimus cards. It still uses integrated graphics to output the final image, so we can't ask people to disable the latter.

I still think it's worth looking in the BIOS for a setting to disable Optimus or something, so we can narrow down the problem.

comment:8 by Evans, 11 years ago

This seems to be quite common in Dell laptops - having both an Intel card and an NVIDIA card. Having just an NVIDIA card in a laptop might be a severe battery drain.

I checked again and there are no options in the BIOS to disable either Integrated graphics or the NVIDIA card.

I downloaded the older driver from the Dell website. It appears to be just a Dell application wrapper around an older version of the generic NVIDIA driver. The app simply extracts the driver and installs it. I will try reverting to it, but I don't think that will make a difference.

I have a question, though - When I choose "auto-select graphics processor" for the game, it is assigned Integrated graphics, and hence the game works. Shouldn't it be assigned the NVIDIA card, as the game is a 3D application?

in reply to:  8 comment:9 by zoot, 11 years ago

Replying to evans:

I have a question, though - When I choose "auto-select graphics processor" for the game, it is assigned Integrated graphics, and hence the game works. Shouldn't it be assigned the NVIDIA card, as the game is a 3D application?

The criteria are described here: http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf

I'm not sure if we currently do any of that.
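
For reference, that document describes how Optimus decides which GPU handles an application (driver application profiles, use of certain APIs, and, in newer drivers, a symbol exported from the executable); a plain OpenGL application with no profile typically stays on the integrated GPU, which would explain the "auto-select" behaviour evans sees. The commonly cited opt-in looks like the sketch below - shown only to illustrate the mechanism, not something the game necessarily does or should do:

    // Illustrative only: exporting this symbol from the .exe asks the Optimus
    // driver to prefer the discrete NVIDIA GPU (see the linked policies PDF).
    #include <windows.h>

    extern "C"
    {
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    }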

comment:10 by B. Guns, 11 years ago

I have the same problem on my laptop (ASUS): an NVIDIA GT 630M graphics card plus Intel Graphics 4000. I've had this problem with all recent NVIDIA drivers (I can't remember whether I experienced it with the previous series of drivers).

comment:11 by historic_bruno, 11 years ago

Priority: Should Have → Must Have
Summary: Game Crashes if NVIDIA Graphics Card is used → Graphics card detection fails with NVIDIA Optimus on Windows

comment:12 by historic_bruno, 11 years ago

Seems we're not the only ones having this problem: https://devtalk.nvidia.com/default/topic/549225/opengl/bug-with-nvoptimusenablement-dword-export-in-320-18-and-320-49-drivers-/

The game works fine on Michael's new laptop which also has Nvidia Optimus (GT 730M) and Windows 8, but there's another crash report with a GT 610M and Windows 7 here. Also relevant: #998

I think the safest thing to do for now, as a workaround, is to disable the WMI-based graphics query and only use GL_VENDOR/GL_RENDERER. If we use GL to detect NVIDIA cards first, then it seems we've lost one advantage of the WMI approach (not needing GL to be initialized - does that ever matter?). I don't know about more or less specific info, but on my system GL provides an almost identical vendor and card name.
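
A GL-based query along those lines could look roughly like the following (a minimal sketch assuming an active GL context; names here are illustrative and the actual change may differ):

    // Sketch: report the card via OpenGL strings instead of WMI.
    // Assumes a GL context has already been created.
    #include <windows.h>
    #include <GL/gl.h>
    #include <string>
    #pragma comment(lib, "opengl32.lib")

    static std::wstring GetCardNameGL()
    {
        const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));
        const char* renderer = reinterpret_cast<const char*>(glGetString(GL_RENDERER));

        std::string name;
        if (vendor)
            name = vendor;
        if (renderer)
        {
            if (!name.empty())
                name += " ";
            name += renderer;
        }
        // GL vendor/renderer strings are plain ASCII, so naive widening suffices.
        return std::wstring(name.begin(), name.end());
    }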

Last edited 11 years ago by historic_bruno

comment:13 by Jan Wassenberg, 11 years ago

WMI not needing OpenGL is relevant for other apps but not 0ad. Using GL info sounds fine!

comment:14 by ben, 11 years ago

Owner: set to ben
Resolution: fixed
Status: new → closed

In 13536:

Changes graphics card name detection on Windows to use OpenGL, as a workaround for certain WMI related crashes on Nvidia Optimus and netbook systems. Fixes #1952, #1575

comment:15 by historic_bruno, 11 years ago

Keywords: crashlog added; crash removed

#2036 was a duplicate report.
