It must be about time for a 'the state of C++ tooling' rant...

I tend to develop code with JITT (Just In Time Testing); it's like TDD when I'm doing it, but it doesn't always get done. What does get done, generally, is the "first test" for each class. This ensures that subsequent testing is possible, as the code doesn't have hidden dependencies, and it gives me a test harness that's ready to go when I find that I need it. More complex code ends up being developed with TDD; the easier bits end up as JITT, where the tests are only written once I've wasted time banging my head on the desk trying to debug problems the "old fashioned way". This approach tends to limit the number of really horrible bugs (except race conditions!) that I have to deal with, and I can't remember the last time I had memory corruption issues and the like.
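
For the avoidance of doubt, a "first test" really is as minimal as it sounds; something like the sketch below, where the Connection class is a made-up stand-in for a real class and I'm using a bare assert() rather than any particular test framework:

```cpp
#include <cassert>

// Stand-in for a real class; in practice the "first test" would #include
// the production header instead. The point is simply proving the class can
// be constructed in isolation, without hidden dependencies.
class Connection
{
public:
   Connection() : m_open(false) {}

   bool IsOpen() const { return m_open; }

private:
   bool m_open;
};

int main()
{
   Connection connection;          // constructs cleanly, in isolation

   assert(!connection.IsOpen());   // a trivial sanity check

   return 0;                       // the harness is now ready for real tests
}
```

If that compiles and links without dragging in half of the codebase then the class is testable, and everything else can build on it later.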

Client code, that is, code that my clients write, tends to be different, and every so often I'll have a client who has got themselves so twisted up with 'weirdness' that they ask me to track down their memory issues for them. At this point I root around in my tool box for the 'memory validation' and 'code analysis' tools and dust them off. I then spend a few moments in that heavenly state that you can get into when you believe the hype about something, before coming down to earth with a bang and writing one of these "state of C++ tooling" rants.

Let's start with the good experience… The first thing I tried was static analysis with Gimpel's PC-Lint, driven by Visual Lint, as that's the only way to be productive with the tool. Unfortunately PC-Lint doesn't really like modern C++. There's a beta program for a new version which DOES play nicely with modern C++, but I hadn't really used it that much; I didn't need it for The Server Framework code, as we have to deal with some very old compilers and so the code tends to use older C++ functionality. My client's projects were VS2015, and running Visual Lint on them generated PC-Lint config files that warned of incompatibilities between the version of PC-Lint I was using and the VS2015 header files… The linting failed spectacularly on the content of the compiler's header files, specifically the newer C++ stuff.

I was a little confused because I'd run the tools on my framework code in VS2015 with no problems. Then I noticed that, when run on the framework code, Visual Lint generated a config file for VS2012 rather than VS2015 and used the 2012 headers. That allowed PC-Lint to work, but it wasn't right.

A few tweets to Anna-Jayne and Beth at Riverblade and they'd worked out what was going on: I have some wacky Visual Studio projects that have been upgraded through every possible compiler release, and these contain some artifacts from VS2012 which Visual Lint recognises and which cause it to generate the wrong config. A fix is on the way, which is brilliant and as responsive as I've come to expect from them.

Unfortunately, all that the fix to Visual Lint will do is mean that PC-Lint will fail on ALL of my VS2015 projects rather than just some of them. The real fix for this issue is to use the PC-Lint Plus beta, which should understand the VS2015 C++ headers.

Well, the beta is a beta. It's not really ready for prime time, at least not on my client's codebase, and I can't run it on the 'clean' code of my framework release as I'm waiting on the Visual Lint fix.

The mediocre… I then configured Visual Lint to run CppCheck, an open source C++ static analysis tool. That ran, and gave the framework code a pretty clean bill of health; but, in a way, it was too clean. With PC-Lint I've had to annotate the code with annoying comment warts to tell it to let me work outside of its rules, and I've had to tweak its config files so that it really understood things. With CppCheck I ran it and it came up with some suggestions about classes which had single argument constructors that were not marked as 'explicit'. It's a good catch, but it's the only thing it suggested, which makes me doubt that it's trying hard enough. For other files, seemingly at random, it complains about 'missing include files' but doesn't tell me which ones it thinks are missing. Since these files include pretty much the same set of headers as the files that work, it's hard to debug the issue… I know it's open source. I know I can go and look and work out what it's checking. I know I can add my own checks if I don't think it's checking enough. But I just want it to find my client's bugs…
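
The 'explicit' warning is worth heeding, though. Here's a minimal sketch of the problem it catches; the class and function names are mine, not anything CppCheck produces:

```cpp
#include <cstddef>
#include <vector>

class Buffer
{
public:
   // Without 'explicit', the commented-out call in main() really would
   // compile, silently constructing a temporary Buffer from an integer.
   explicit Buffer(std::size_t size) : m_data(size) {}

private:
   std::vector<char> m_data;
};

void Process(const Buffer & /*buffer*/)
{
}

int main()
{
   Process(Buffer(1024));   // fine: the conversion is spelled out

   // Process(1024);        // would compile if the constructor were not
                            // 'explicit', hiding an unintended conversion;
                            // with 'explicit' it's a compile error
   return 0;
}
```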

The bad… Since I'm looking for a memory issue, and it looks like static analysis will tell me nothing, I decided to try some memory-specific tools.

First on the list was Software Verify's Memory Validator. I always really want to like these tools, and I always end up disappointed. I always think I could spend ages talking to the developer, and that he might fix the issues I find, but I never do, because actually I want to get my job done and, unfortunately, if the tool isn't helping then it's just wasting my time.

Memory Validator did pretty much the same thing it had done the last time I tried it, which was to get all confused as soon as the target process started to create threads. I managed to get a bit further by letting the target start up and attaching the validator afterwards, but it didn't seem to be seeing lots of memory allocations that it should have been seeing.

The slow… Next was a trial version of DevPartner Studio. This subsumed Bounds Checker, which was my go-to tool for this kind of work back in the day. As regular readers may remember, I've had issues with DevPartner Studio ever since it went to that place that Enterprise Software goes to die.

Obtaining a trial version was easy and the trial installed OK. I ran it on my client's server and, 14 hours later when I got up, the server was still crawling through the initial start-up phase where it loads some data from the database. Everything seemed to be working, but it was working very, very slowly.

I turned off the memory checking options and ran the server up again and it started quite quickly, so it looks like the tool will work for me as long as I don't want to use it for what I need it for… Until it doesn't work for me, of course.

The enterprisey… Back in the day, if Bounds Checker couldn't find the issue I'd switch to Purify. Purify wasn't quite as slick and didn't quite work as well as Bounds Checker, but sometimes it was preferable.

Purify still exists, but appears to be owned by what seems to be another Enterprise Software Graveyard. The new owners have obviously realised that you get lots of support calls from small development shops where the guy doing the purchasing is the same as the guy doing the development, and so they appear to want to sell only to large corporate entities that will buy multiple licenses.

Cynically, I think that both Purify and DevPartner can be sold nicely at management level in big enterprises but rarely deliver the expected bang for the buck on real-world code. The fact that the developers in large enterprises don't do the purchasing means that the tools can be purchased, installed and licensed for everyone, and then used by nobody…

So, where does that leave me? As usual in these situations, I've ended up in a position where it's likely quicker to do a manual, visual code review of the code in question and then pull some of the more questionable code out into unit tests; something like the sketch below. It may be possible to run DevPartner on the isolated unit tests, if necessary, but I expect that the 7 day trial will have expired by the time I get to the point where I can use the tool again… It will be nice to get the fix for Visual Lint. It will be nice to try the Gimpel PC-Lint Plus beta on my framework code and, once I've configured the tool, fixed any issues and got that code coming up with a clean report, I can try it on my client's additional code…
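
The value of the isolation step is that a heavyweight tool only needs to instrument a tiny driver rather than the whole server. A minimal sketch of the idea; CopyAndTrim() is a made-up stand-in for whatever suspect code gets pulled out:

```cpp
#include <string>

// Stand-in for the suspect production code; in practice this would be the
// real code, extracted behind its existing interface.
std::string CopyAndTrim(const std::string &input)
{
   std::string result(input);

   while (!result.empty() && result.back() == ' ')
   {
      result.pop_back();
   }

   return result;
}

int main()
{
   // Hammer just the suspect path; a tool like DevPartner only has to
   // instrument this tiny driver rather than a server that takes hours
   // to start up.
   for (int i = 0; i != 100000; ++i)
   {
      const std::string trimmed = CopyAndTrim("some input   ");

      (void)trimmed;
   }

   return 0;
}
```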

I expect my client wants a fix before then, though; it's lucky I like visual code reviews…