> a singularity would almost surely be preceded by a world in which machines are 0.01% intelligent (say)
I'm not sure that fractional intelligence makes sense. If it did, then Boole's Laws of Thought, clockwork mechanisms, even books, even counting, and many other ideas would have to count as capturing some "fraction" of intelligence.
I think it's a threshold, like Turing equivalence: either you can compute anything or you can't. Once over that threshold, percentages make a difference, e.g. a machine that needs 10,000 seconds of computation to produce 1 second of human-level thought.
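To make the threshold analogy concrete, here is a toy sketch: a minimal Brainfuck interpreter in Python. Brainfuck is Turing-complete, so even these few lines cross the computability threshold; given enough tape and time they can compute anything any computer can. What separates them from a fast machine is only a (very large) constant factor, which is exactly where the percentages live.

```python
# Toy sketch: a minimal Brainfuck interpreter. Brainfuck is Turing-complete,
# so this crosses the computability threshold: given unbounded tape and time,
# it can compute anything any computer can. The only thing it lacks is speed.

def run(program: str, tape_size: int = 30_000) -> str:
    tape = [0] * tape_size      # the machine's working memory
    out, ptr, pc = [], 0, 0     # output, data pointer, program counter

    # Pre-match brackets so loop jumps take O(1) time.
    stack, jump = [], {}
    for i, c in enumerate(program):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jump[i], jump[j] = j, i

    while pc < len(program):
        c = program[pc]
        if c == '>':
            ptr += 1
        elif c == '<':
            ptr -= 1
        elif c == '+':
            tape[ptr] = (tape[ptr] + 1) % 256
        elif c == '-':
            tape[ptr] = (tape[ptr] - 1) % 256
        elif c == '.':
            out.append(chr(tape[ptr]))
        elif c == '[' and tape[ptr] == 0:
            pc = jump[pc]   # skip the loop
        elif c == ']' and tape[ptr] != 0:
            pc = jump[pc]   # repeat the loop
        pc += 1
    return ''.join(out)

# Prints "HI": clumsy and slow, but already past the threshold.
print(run('++++++++[>+++++++++<-]>.+.'))
```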
There's a democratizing aspect to this position: all humans possess human-level intelligence, not just geniuses (or researchers).