Hexus seems to think so. There's been at least one PR blunder on Futuremark's part (formerly MadOnion) in the past year or so, not to mention questions raised over how realistic the benchmark actually is. The guys at Hexus take a deeper look into what's wrong with the 3DMark series, specifically the '06 incarnation. The reason websites and print publications get hardware evaluation more right than any single benchmark does is that they use as many real-world shipping games as is reasonable, benchmark in as many ways as they can in the scenarios they think work best, and pair all of that with theoretical testing to build the final picture. That procedure covers a multitude of games and game content, in multiple scenarios on lots of hardware, using multiple rendering techniques with differing CPU interaction. It's all about the size of the data set used to make the judgement. The dissection of the benchmark is over here. Personally, the dubious record of the 3DMark series is enough to warrant it being pulled from our small set of tests in reviews anyway, but if you want an in-depth explanation of what's severely wrong with it, this is a good article to read.
Opinions on alternative benchmarks? I found Aquamark to be good, but then again I haven't benchmarked a system for over a year.
But, unlike 3DMark, Aquamark is based on the Aquanox game, which you can actually get. There is no game based on the 3DMark engine. Furthermore, there are some issues with the scoring, which aren't down to the nature of the benchmark so much as to companies getting it in to Futuremark before the deadline. From what I can see, the push was to get it out rather than to get it right.