Hacker News

As far as I understand, Grove was at the helm when Intel started the Itanium project. Granted, he didn't see it to "completion", but poor choices were made even in his time.

He even stated the following in "Only the Paranoid Survive": One, don’t differentiate without a difference. Don’t introduce improvements whose only purpose is to give you an advantage over your competitor without giving your customer a substantial advantage. The personal computer industry is characterized by well-chronicled failures when manufacturers, ostensibly motivated by a desire to make “a better PC,” departed from the mainstream standard. But goodness in a PC was inseparable from compatibility, so “a better PC” that was different turned out to be a technological oxymoron.

One might think Itanium goes against that.



Just because Itanium failed as a bet doesn't mean it was wrong to try. Intel gambled that compilers would keep up - they didn't. Today it might work: compiler technology has improved enormously (.NET/Java payloads, dynamic recompilation...). It was a software problem, and software problems are inherently easy to underestimate. Itanium also had its roots in a time when Big Iron was more relevant than it is today.


Itanium was an excellent idea that needed some continued persistence. It was a bit too early for its time.


It really wasn't. Itanium was a gamble that hard-coding static parallelism into the ISA would beat dynamic out-of-order (OoO) execution. And just like architectural delay slots, it inevitably failed: spending a few more transistors on dynamic scheduling isn't that expensive when the alternative is wasting cycles doing nothing at all.

"Compilers just need to keep up" was Intel's marketing apologia, not reality.


Compilers simply could not keep up - that was reality, not just marketing. Ideally, Intel should have written the compilers too and smoothed out the rough edges. But they were a bit too slow, and AMD ate Itanium's target market.

You have to admit, though, that the EPIC (Explicitly Parallel Instruction Computing) model was quite innovative. Its philosophy influenced the LLVM project, and some of its principles live on in GPUs and AI accelerator chips, even if hardware-based dynamic scheduling won the game.
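To make the EPIC idea concrete, here is a toy Python sketch (all names and the instruction format are hypothetical, not real IA-64 tooling) of the core compiler-side job: group independent operations into fixed-width issue bundles at compile time, so the hardware can execute each bundle in parallel without doing any dependency analysis of its own.

```python
# Toy EPIC-style static scheduler (illustrative sketch, not IA-64).
# Each instruction is (dest, srcs): it writes `dest` and reads `srcs`.
# The "compiler" greedily packs instructions whose inputs are already
# available into bundles of at most `width` slots.

def schedule(instrs, inputs, width=3):
    ready = set(inputs)        # values available before this code runs
    pending = list(instrs)
    bundles = []
    while pending:
        # All instructions whose sources are ready may issue together.
        bundle = [i for i in pending if set(i[1]) <= ready][:width]
        if not bundle:
            raise ValueError("dependency cycle in instruction list")
        for i in bundle:
            pending.remove(i)
        bundles.append(bundle)
        ready |= {dest for dest, _ in bundle}   # results land after the bundle
    return bundles

# Example: a=x+y, b=x*2, c=a+b, d=c+1, e=y-1 (hypothetical program).
program = [("a", ("x", "y")), ("b", ("x",)),
           ("c", ("a", "b")), ("d", ("c",)), ("e", ("y",))]
bundles = schedule(program, inputs={"x", "y"})
# a, b, e are mutually independent, so they share one bundle;
# c and d each wait on an earlier result: 3 bundles instead of 5 cycles.
```

The hard part Itanium ran into is visible even in this sketch: the schedule is only as good as the compiler's compile-time knowledge. A load that misses the cache blows up the assumed latencies and stalls the whole bundle, which is exactly the situation OoO hardware reschedules around at runtime.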


I think they did, though. I remember several buddies from UIUC, back when I was in Folsom, working solely on compilers.



