Debugging Optimized Code – New in Visual Studio 2012

For years (decades?) one of the most requested features in Visual C++ has been better support for debugging optimized code. Visual Studio's debug information is so limited that in a program consisting of nothing but main(argc, argv) the VS debugger can't accurately display argc and argv in an optimized build. All I wanted was for Visual Studio to stop lying to me about the values of local variables and function parameters in optimized builds.

It turns out that Microsoft shipped this feature in Visual Studio 2012, but forgot to tell anyone. This could be the most important improvement to Visual Studio in years but it’s been almost top-secret.

Update: Visual Studio 2013 Update 3 makes this feature official! It’s now under /Zo (that’s lowercase ‘o’, not zero or uppercase ‘o’, despite what the VS blog says). It works really well, as long as you don’t accidentally check the option that quietly disables this awesome feature (more details in the update at the end).

Updated update: Visual Studio 2015 makes this feature even more official. Now optimized debugging can even coexist with Edit-and-continue!

So download VS 2013 Update 3 or higher and give /Zo a try. However, be aware that the VS debugger is still missing support for one handy enhanced debugging feature. For that you need to debug with windbg, which can actually show you virtual callstack entries from inline functions when you build with /Zo. This is a big deal. Stepping through overloaded operators and STL functions will never be the same again. This screenshot is from windbg, stepping through three layers of inlining:

[Screenshot: windbg call stack showing virtual frames for three layers of inlined functions]

It’s beautiful.
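If you want to try this yourself, here is a minimal sketch (my own example, not the code from the screenshot – the function names are made up for illustration) that gives the debugger three layers of inlining to display when built with optimizations and /Zo:

    // inline_test.cpp – build with: cl /O2 /Zi /Zo inline_test.cpp /link /DEBUG
    // Three layers of inlining: main -> scale -> twice -> add. With /Zo the
    // windbg call stack can show virtual frames for all three.
    #include <cstdio>

    inline int add(int a, int b) { return a + b; }    // innermost layer
    inline int twice(int x) { return add(x, x); }     // middle layer
    inline int scale(int x) { return twice(x) + 1; }  // outermost layer

    int main(int argc, char* argv[]) {
        (void)argv;  // unused
        printf("%d\n", scale(argc));
        return 0;
    }

Step into the scale() call in windbg and the stack should show add, twice, and scale as inline frames above main.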

Okay, back to the original article:

Here’s what Visual Studio 2010 looks like by default when you debug the simplest possible project with the default release-build optimizations. I don’t understand why VS has such trouble with this program given that argc and argv are quite plainly sitting on the stack. I don’t think that argc is really 0x00a31330.

[Screenshot: VS 2010 Locals window showing bogus values for argc and argv in a default release build]

Note: looking at the stack shows that the debugger is looking for the variables four bytes away from where they are located. So close…

[Screenshot: stack memory view showing argc and argv four bytes away from where the debugger is looking]
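For reference, reproducing this takes nothing more elaborate than a program along these lines (a reconstruction – the original test source isn't shown), built in a default Release configuration:

    // main.cpp – default Release build (/O2 /Zi), no extra switches.
    // In VS 2010 the Locals window shows garbage for both parameters.
    #include <cstdio>

    int main(int argc, char* argv[]) {
        printf("argc=%d argv[0]=%s\n", argc, argv[0]);  // keep them alive
        return 0;
    }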

Before somebody fires up Visual Studio 2012 and tells me that it lies just as much as previous versions let me clarify my claim. Visual Studio 2012 ships with support for optimized debugging, but this support is off by default.

For reasons that I can only speculate about (perhaps the feature wasn’t considered done?) the compiler support for this feature is hidden behind a cryptic and undocumented compiler command line switch – /d2Zi+ (now exposed as /Zo). The default release builds of VS 2012 behave identically to VS 2010, but once you add the magic command-line switch and rebuild then things start to look a lot more sensible:

[Screenshot: VS 2012 Locals window showing correct values for argc and argv after rebuilding with /d2Zi+]

If your Internet attention span has been exceeded already then feel free to stop now. Just remember to upgrade to Visual Studio 2012 or beyond and add /d2Zi+ (/Zo) to your compiler command line. The next time you are looking at a crash dump from a customer, a release-only bug, or you just happen to be debugging optimized code, you’ll thank yourself for doing this preparatory work.
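Concretely, that means a compile line along these lines (file names are placeholders):

    rem VS 2012 and early VS 2013: the undocumented spelling
    cl /O2 /Zi /d2Zi+ myfile.cpp /link /DEBUG

    rem VS 2013 Update 3 and later: the official spelling
    cl /O2 /Zi /Zo myfile.cpp /link /DEBUG

In the IDE the switch goes under C/C++ > Command Line > Additional Options, since (at least in these versions) there is no dedicated checkbox for it.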

This feature is important enough to justify upgrading Visual Studio. I wish I’d realized it was there six months ago. Better late than never.

Optimized debugging of main() is not particularly compelling, but the feature works well beyond that. Here is a shot of the locals window while debugging Fractal eXtreme after doing the upgrade and /d2Zi+ dance. The displayed values are now accurate where before they were wrong, and the debugger now knows when variables have been optimized away:

[Screenshot: Locals window while debugging optimized Fractal eXtreme, with accurate values and some variables marked as optimized away]

Note that there is a difference between “optimized away” and “out of scope”. In an optimized build a variable may cease to exist prior to the end of its scope. This would be a flagrant violation of the C++ standard but the “as-if” rule permits this vital optimization.
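As a hedged illustration (my example, not from the original post), consider a local whose last use comes well before the end of its scope:

    #include <cstdio>

    int compute(const int* data, int n) {
        int sum = 0;
        for (int i = 0; i < n; ++i)
            sum += data[i];
        printf("sum = %d\n", sum);   // last use of 'sum'

        // 'sum' is dead from here on, so the optimizer may reuse its
        // register. A /Zo-aware debugger reports it as optimized away
        // instead of displaying a stale or random value.
        int product = 1;
        for (int i = 0; i < n; ++i)
            product *= data[i];
        return product;
    }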

VS 2010 together with windbg had some modest support for this capability, but it really didn’t work well. This is much better.

Details schmetails

The /d2Zi+ switch has actually been mentioned previously, by Sasha Goldstein in 2011 and by Andrew Hall in 2013. It’s not an officially documented flag, so it could go away at any moment, but hey, carpe diem, and debug optimized code while the iron is hot. I’m assuming that it will become more official looking in some future release – I don’t think they’ll actually take it away.

Sasha’s 2011 article talks about how the enhanced debug information tracks variables better but suggests that the built-in C++ debugger in Visual Studio doesn’t support the enhanced information (update: because at the time it didn’t). That’s why I missed this crucial feature initially, despite seeing Sasha’s blog post. You don’t need to use windbg – the default VC++ debugger works fine.

Andrew Hall’s post points to some other capabilities of /d2Zi+ – it records inlining information that the profiler can use to attribute time to functions. That’s pretty cool, but currently the only debugger that supports that extra information is windbg.

TANSTAAFL, so there's got to be a catch. The optimized debugging feature works by storing additional information in the .pdb files – presumably tracking when variables move to a new memory address or register, and when they disappear. This extra information takes space, so you should expect your PDB files to get larger when compiling with /d2Zi+. I haven't done exhaustive tests, but a quick analysis of eight PDBs showed a size increase so small that I couldn't separate it from the noise. One caveat on measuring this yourself: PDB files never shrink, so if you don't delete them between links then your results will be invalid. Your mileage may vary. The actual code generated should be unchanged – my tests show no differences.

Upgrading

Upgrading Fractal eXtreme was incredibly trivial. I loaded the VS 2010 solution file into VS 2012 and said yes when it asked if I wanted to upgrade. The code compiled with no errors or warnings. After verifying that release-mode debugging was still a poor experience I added /d2Zi+. As if by magic I started being able to see what was going on. Optimized code is still weird – inlining, constant propagation, and code rearrangement are essential complexities of this task – but at least the accidental complexity is now mostly removed.

Upgrading large projects at work is much more complicated, but as far as upgrades go it’s really not bad. I’m quite glad that they kept the project file format consistent – the diffs are quite small. It also helped that all of our code at work is compiled (but not linked) with VC++ 2012 every day, as described at Two Years (and Thousands of Bugs) of Static Analysis.

It’s worth pointing out that while Visual Studio 2012 will upgrade the .vcxproj files, it does not modify the .sln files. This means that when you double-click the .sln file it still opens in Visual Studio 2010, which doesn’t know how to use the v110 (VS 2012) toolset. You can fix this by opening the .sln file and changing the third line from “# Visual Studio 2010” to “# Visual Studio 2012”. That’s it.
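For reference, the top of the upgraded-but-unedited solution file looks something like this (the first line of the file is blank, which is why the comment is the third line):

    Microsoft Visual Studio Solution File, Format Version 11.00
    # Visual Studio 2010

Change that comment to "# Visual Studio 2012" and double-clicking the .sln will launch the right version.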

(for details on fractal math optimizations on modern CPUs see the fractals section)

Windows XP

The default Visual Studio 2012 toolset (v110) generates code that won’t run on Windows XP. You have to use the v110_xp toolset for that, after downloading the appropriate VS 2012 update. Alas, the v110_xp toolset doesn’t support /analyze, so I’ve configured our project creation system to select the right toolset for the job. Be sure to automate this boring task.
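In the .vcxproj files the toolset is a single element per configuration; a sketch of the relevant element (the condition string and other properties will vary with your project):

    <PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'" Label="Configuration">
      <ConfigurationType>Application</ConfigurationType>
      <!-- v110_xp for XP-compatible builds, v110 for /analyze builds -->
      <PlatformToolset>v110_xp</PlatformToolset>
    </PropertyGroup>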

gdb and gcc

For the record, yes, I am aware that gcc/gdb already offers a comparable experience. As a Windows developer it’s good to see Visual Studio catching up. And I do like having the registers/locals/disassembly/memory/call-stack windows updating as I step through the code.

There goes the neighborhood

I’m quite capable of dropping down to assembly language to see what is really going on. There is something satisfying about tracking the flow of execution and data while you’re on the trail of a cool bug.

But my god, what a waste of time. Tracking bugs this way is inefficient, and it reduces the pool of developers who are qualified to look at crash dumps coming in from customers. I’d fear being put out of work by these newly sophisticated computers, but somehow I think I’ll find something else to work on.

VC++ improvement requests

Earlier this week I asked people to vote on the two VC++ improvements that I think are most important. I would have included optimized debugging on the list except that I found out last week that it was already supported, so there was no point putting it on the list. But it's not too late to vote for getting the most valuable /analyze warnings in the regular compile.

Reddit discussion is here – vote it up if you want others to know about this.

Update, August 2014

A lot of people, myself included, have found that the support for optimized debugging in VS 2013 is flaky. For a while I assumed that they had broken some aspects of it, but when Update 3 came out and the feature was still unreliable I filed a bug. The response from Microsoft was basically “it works on my machine”. That meant it was time to get scientific. I asked a bunch of my coworkers to try the test project, and I got back a stream of reports that the feature was indeed broken. And then one solitary voice in the wilderness said “it works on my machine”. With on-site proof that it did work we investigated and found the crucial difference. The option that quietly disables this awesome feature is… Edit and Continue.

[Screenshot: debugger settings dialog with the native Edit and Continue checkboxes highlighted]

If you enable native Edit and Continue in the debugger (which is separate from compiling with it) then debugging of optimized code stops working. In other words, if you check the two boxes to the right then the optimized debugging experience will be as painful as it has been for decades. If you leave one of them unchecked then… Nirvana.

I should have guessed. I’d previously found that this setting chooses between autoexp.dat and .natvis support, so I already knew that it had non-obvious side-effects.

It appears that when you enable native Edit and Continue you are actually selecting a different, older, debugging engine. This old debugging engine supports Edit and Continue and autoexp.dat. The new debugging engine supports .natvis visualizers and optimized debugging information.

Update: with VS 2015 this problem goes away. Edit and Continue and Optimized Debugging now coexist. See the comment from Ramkumar Ramesh.

It also appears (thanks to Jeremy in the comments) that if you set the Debugger Type to “Mixed” instead of “Native-Only” then you also get the old debugging engine, and therefore no support for the optimized debug information.

So, the advice is clear: build with /Zo (or /d2Zi+ on older compilers), leave the native Edit and Continue checkboxes unchecked in the debugger options, and set the Debugger Type to Native-Only rather than Mixed.

It's clunky, but it works.