> In the late 1990s and early 2000s, there were a lot of claims that there was no such thing as a "slow language", that all languages can be run as quickly as C if you just built a sufficiently smart compiler and/or runtime.
I think "slow language" never meant that way. It was more like a counterpoint to the claim that there are inherent classes of languages in terms of performance, so that some language is (say) 100x or 1000x slower than others in any circumstances. This is not true, even for Python. Most languages with enough optimization works can be made performant enough that it's no slower than 10x C. But once you've got to that point, it can take disproportionally more works to optimize further depending on specific designs.
> If I were designing a language to be slow, but not like stupidly slow just to qualify as an esolang, but where the slowness still contributed to things I could call "features" with a straight face, it would be hard to beat Python.
Ironically, Python's relative slowness came from its uniform design, which is generally a good thing. This is distinct from the Tcl-style "everything is a string" you mentioned, because the uniform design had a good intent of its own.
If you have used Python long enough, you may know that Python originally had two kinds of classes---there was a transition period where you had to write `class Foo(object):` to get the newer kind. Python wanted to remove the blur between built-in types and user-defined classes, and eventually did so. But no one at that time knew that the blur was actually a good thing for optimization. Python tried to be a good language and is suffering today as a result.
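For context, a minimal sketch of that transition, written in Python 2 syntax (in Python 3 every class is already "new-style"):

```python
# Python 2: old-style vs. new-style classes.

class Old:                  # old-style class: all instances share the type 'instance'
    pass

class New(object):          # new-style class: unified with the built-in type system
    pass

print(type(Old()))          # <type 'instance'>  -- the class itself is invisible here
print(type(New()))          # <class '__main__.New'>
print(type(New()) is New)   # True: user classes now behave like built-ins such as int
```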
"If you have used Python long enough you may know that Python originally had two types of classes"
Yes, because that's also the era where this claim was flying around.
I'd say that each individual person may have their own read on what the claim meant, but certainly the way it was deployed at anyone who vaguely complained that Python was kind of slow shows that plenty of people in practice read it as I've described... that if we just waited long enough and put enough work into it, there would be no performance difference between Python and C. In 2023 we can look back with the perspective that this seems obviously false, so they couldn't possibly have meant that, but they didn't have that perspective, and so yes, they could have meant that. "Sufficiently smart compiler" was only just starting to become a derogatory term. I also remember, and lightly participated in, the c2.com discussions on the topic, which also supports my point that, yes, there definitely were people who truly believed "sufficiently smart compilers" could exist and were just a matter of time.
As for proportions, it's impossible to tell. Internet discussions (yea verily including this very one) are in general a poor way to ascertain that, because almost by definition only outliers participate in the discussion at all. Measured against the bulk of programmers, most had simply never considered the question at all.
Yeah, I agree everyone may have different anecdotes; in my case, though, I heard more of the "Python as glue code" arguments and never heard that Python proper could be as fast as C. I've used Python since 2.3, so maybe that argument was more prevalent before then?
I think "slow language" never meant that way. It was more like a counterpoint to the claim that there are inherent classes of languages in terms of performance, so that some language is (say) 100x or 1000x slower than others in any circumstances. This is not true, even for Python. Most languages with enough optimization works can be made performant enough that it's no slower than 10x C. But once you've got to that point, it can take disproportionally more works to optimize further depending on specific designs.
> If I were designing a language to be slow, but not like stupidly slow just to qualify as an esolang, but where the slowness still contributed to things I could call "features" with a straight face, it would be hard to beat Python.
Ironically, Python's relative slowness came from its uniform design which is generally a good thing. This is distinct from a TCL-style "everything is a string" you've said, because the uniform design had a good intent by its own.
If you have used Python long enough you may know that Python originally had two types of classes---there was a transition period where you had to write `class Foo(object):` to get the newer version. Python wanted to remove a blur between builtin objects and user objects and eventually did so. But no one at that time knew that the blur is actually a good thing for optimization. Python tried to be a good language and is suffering today as a result.