That works both ways, though -- I went through college from 2000 to 2004, and aside from some brief forays into other languages (Ruby and Perl, from what I remember) we were taught exclusively in C. I had some self-taught VB experience from middle school, but at my day job we use C and Ada exclusively ... and I'd never really had a reason to learn another language.

A few years back, I decided I should learn some new languages for a) fun, and b) the diversity I'd need if I ever wanted out of my industry. I taught myself Java, then the Android framework, then C# and VB.NET. What really forced me to start learning new things was teaching -- I started as an adjunct at a local college after work (the only favor that getting my MS has ever done me), and then I had to learn material to fit their curriculum.

Anyway, my point is that it's just as easy for older people to be oblivious to new tech as it is for young people to have a complete lack of understanding of the fundamentals. I actually teach binary number formats, boolean algebra, virtual machine code, etc. to my students just to make sure they're not completely unprepared ... and I will say that your point is valid -- many of them have had no prior exposure.