Numeric constants starting with 0 define octal values in many languages; hell, they do so even in Java. Does that mean all those languages are buggy? I'd say you're going to find features that don't make sense to anyone who's never used them... pretty much anywhere.
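For anyone who hasn't run into this: a bare leading zero silently switches the base. A quick illustration in JavaScript (the legacy `010` form is sloppy-mode only; strict mode rejects it, and ES6 added the explicit `0o` prefix instead):

```javascript
// Octal literals: the value is base 8, not base 10.
const modernOctal = 0o10;        // explicit ES6 octal literal
console.log(modernOctal);        // 8, not 10
console.log(0o777);              // 511 (the classic Unix permission mask)
console.log(parseInt("10", 8));  // 8: parsing a string with an explicit radix
```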
I would think the issue is more the behavior of "parse as much as you can until you can't, and return what you got so far", which is what causes the unexpected result of returning 0 instead of bailing out. However, that apparently wasn't fixed in ECMAScript 5, presumably for backwards compatibility.
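That "parse as much as you can" behavior is easy to demonstrate, and it does still hold in modern engines; what ES5 dropped was only the octal *guessing*:

```javascript
// parseInt consumes leading characters that are valid in the given radix
// and silently ignores the rest -- it only returns NaN if no digit parses.
console.log(parseInt("123abc", 10)); // 123: stops at 'a', keeps what it got
console.log(parseInt("abc", 10));    // NaN: nothing parsed at all
// Pre-ES5 engines guessed radix 8 from the leading zero, so "08" parsed
// just the "0" (8 isn't an octal digit) and returned 0 instead of bailing.
console.log(parseInt("08"));         // 8 in ES5+ engines; 0 historically
```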
I'd say that the entire concept of octal literals is a bug, yes. (It's almost _entirely_ useless, a leftover from the days of yore, when for Christmas we got naked, wrapped ourselves in punch-tape versions of the latest "editor", and danced around the Christmas tree while setting the line printer on fire. But I digress... ;)
But fine, if I at least got "undefined" (well, null) as a result of parsing an invalid octal string, it wouldn't be quite as bad. So yes, if we're talking root causes, you've put your finger on it. The anachronism of octal parsing just exposes it.
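A parser with that stricter behavior is easy to sketch; `parseOctalStrict` is a hypothetical helper, not anything the spec provides. Validate the whole string first and return undefined rather than a partial result:

```javascript
// Hypothetical strict octal parser: returns undefined for any string that
// isn't entirely valid octal digits, instead of parseInt's partial result.
function parseOctalStrict(s) {
  return /^[0-7]+$/.test(s) ? parseInt(s, 8) : undefined;
}

console.log(parseOctalStrict("17")); // 15
console.log(parseOctalStrict("08")); // undefined, not 0
```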
It does _not_ make perfect sense to anybody who's never used octal. That is presumably the vast majority of anyone born after, say, 1980.
As such, it _is_ a bug in the sense that it violates expectations. It's just a bug in the spec, not in the code.