

It strikes me as quite arrogant to assume that those are the only possibilities. People, even experts in a field, disagree about topics and the implications of evidence all the time. Arguing that honest disagreement must reduce to one of the three categories you list amounts to saying "my point of view is so obviously correct that only bad thinkers or bad people could disagree". But that's almost certainly not the case.


I suspect you're conflating the concepts of intelligence and consciousness. It is completely unsurprising that a Turing machine can have intelligence.


It is trivially true that a Turing machine can simulate a human mind - this is the quantum Church-Turing thesis. Since a Turing machine can numerically solve any system of Schrodinger equations to arbitrary precision, it can solve the system describing every atom in the human body.[1]
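To make "trivially true" concrete, here is a minimal sketch (in Python; the grid size, potential, and timestep are all illustrative choices of mine, not anything canonical) of a classical machine numerically integrating a one-particle Schrodinger equation. The point is only that the algorithm exists, not that it scales:

    import numpy as np
    from scipy.linalg import expm

    # Illustrative parameters: 1D grid, hbar = m = 1 units.
    N, dx, dt = 200, 0.1, 0.01
    x = (np.arange(N) - N / 2) * dx

    # Hamiltonian H = -(1/2) d^2/dx^2 + V(x), with a finite-difference Laplacian.
    lap = (np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
           - 2 * np.eye(N)) / dx**2
    V = 0.5 * x**2                          # a harmonic trap, for concreteness
    H = -0.5 * lap + np.diag(V)

    U = expm(-1j * H * dt)                  # one-timestep propagator exp(-iHdt)

    psi = np.exp(-(x - 1.0)**2)             # displaced Gaussian wavepacket
    psi = psi.astype(complex) / np.linalg.norm(psi)

    for _ in range(500):                    # classical, deterministic evolution
        psi = U @ psi

    print((np.abs(psi)**2).sum())           # total probability stays ~1.0

The catch is that the analogous state vector for n interacting particles is exponentially large in n, which is exactly where the feasibility worries below come in.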

The problem is that this might take more energy than the Sun for any physical computer. What is far less obvious is whether there exist any computable higher-order abstractions of the human mind that can be more feasibly implemented. Lots of layers to this - is there an easily computable model of neurons that encapsulates cognition, or do we have to model every protein and mRNA?

It may be analogous to integration: we can numerically integrate almost anything, but most functions are not symbolically integrable and most differential equations lack closed-form solutions. Maybe the only way to model human intelligence is "numerical."
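The analogy is easy to make concrete. For instance, exp(-x^2) has no elementary antiderivative (its integral is the non-elementary error function), yet numerical quadrature handles it without complaint - a sketch using scipy, with the integrand and interval picked purely for illustration:

    import numpy as np
    from scipy.integrate import quad

    # No closed-form ("symbolic") antiderivative exists in elementary terms,
    # but the "numerical" answer is cheap and accurate.
    value, error_estimate = quad(lambda t: np.exp(-t**2), 0.0, 1.0)
    print(value, error_estimate)   # ~0.746824..., with a tiny error bound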

In fact I suspect higher-order cognition is not Turing computable, though obviously I have no way of proving it. My issue is very general: Turing machines are symbolic, and one cannot define what a symbol actually is without using symbols, so any definition is circular - which means it cannot really be defined at all. "Symbol" seems to be a primitive concept in humans, and I don't see how to transfer it to a Turing machine / ChatGPT reliably.

As a more minor point, our internal "common sense physics simulator" is qualitatively very powerful despite being quantitatively weak (the exact opposite of Sora/Veo/etc.), which again does not seem amenable to a purely symbolic formulation: consider "if you blow on the flame lightly it will flicker, if you blow hard it will go out." These symbols communicate the result without any insight into the computation.

[1] This doesn't have anything to do with Penrose's quantum consciousness stuff; it just assumes humans don't have metaphysical souls.


> It is trivially true that a Turing machine can simulate a human mind - this is the quantum Church-Turing thesis. Since a Turing machine can numerically solve any system of Schrodinger equations to arbitrary precision, it can solve the system describing every atom in the human body. The problem is that this might take more energy than the Sun for any physical computer.

Feynman's "Simulating Physics with Computers" [0] goes beyond that: he posits that any classical simulation of quantum-mechanical properties would need space exponential in the number of particles to track the full state space, which exceeds the storage capacity of the entire observable universe once you are dealing with mere hundreds of particles.

So while yes, the Turing machine model presupposes an infinite tape, that is not realizable in practice.
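The arithmetic behind that claim is worth spelling out. A rough sketch (the particle counts are mine, and I'm counting only the memory for one complex128 amplitude per basis state, ignoring everything else):

    # Memory to store the full state vector of n two-level systems:
    # 2**n amplitudes at 16 bytes each. For comparison, the observable
    # universe contains something like 10**80 atoms; n = 300 already
    # needs ~10**91 bytes.
    for n in (10, 50, 100, 300):
        amplitudes = 2**n
        print(n, "particles:", f"{16 * amplitudes:.2e}", "bytes")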

He actually goes further:

    Can a quantum system be probabilistically simulated by a classical
    (probabilistic, I'd assume) universal computer? In other words, a
    computer which will give the same probabilities as the quantum system
    does. If you take the computer to be the classical kind I've described
    so far (not the quantum kind described in the last section) and
    there're no changes in any laws, and there's no hocus-pocus, the
    answer is certainly, No! This is called the hidden-variable problem:
    it is impossible to represent the results of quantum mechanics with a
    classical universal device.
In particular, he takes issue with our ability to classically simulate the negative probabilities that give rise to quantum mechanical interference.

[0] There are a number of PDFs shared as handouts for various grad classes; https://s2.smu.edu/~mitch/class/5395/papers/feynman-quantum-... was the first that I came across.


I believe Feynman was basically mistaken on the second point, though of course the field was brand new at the time. It is certainly possible to simulate the measurement statistics of quantum mechanics to arbitrarily high accuracy on a classical computer with pseudorandom number generation, and if you replace the pseudorandomness with a physical random number generator, it might even be formally equivalent to a quantum computer (I believe that's an open question; I haven't tracked the developments in a while).
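For what it's worth, here is what "simulating the measurements" means in the simplest case - a single qubit, with the Born-rule probabilities computed classically and outcomes drawn from a PRNG (the state and seed are arbitrary choices of mine):

    import numpy as np

    rng = np.random.default_rng(0)       # swap in a hardware RNG here if desired
    state = np.array([1, 1j]) / np.sqrt(2)   # amplitudes for |0> and |1>

    probs = np.abs(state)**2             # Born rule -> [0.5, 0.5]
    samples = rng.choice([0, 1], size=100_000, p=probs)
    print(samples.mean())                # ~0.5, matching the quantum statistics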

"Negative probabilities" is not quite right - towards the end of his life Feynman wondered about generalizing probability but that was just about intermediate calculations: he declared physical events cannot have nonnegative probabilities (in the same sense that physically I can't have negative three apples, -3 is a nonphysical abstraction used to simplify accounting). Negative probabilities are not part of modern quantum mechanics, where probabilities are always nonnegative and sum to 1. Quantum states can have negative/complex amplitudes but the probabilities are positive (and classical computers are just as good/bad at complex arithmetic as they are any other).

The "hidden variables" comment makes me think Feynman was actually a bit confused about the philosophy of computation - a classical computer cannot simulate how a quantum particle "truly" evolves over time, but that's also the case for a classical particle! Ultimately it's just a bunch of assembly pushing electrons around, that has nothing to do with a ball rolling down a hill. Computers only have Schrodinger's equation or Newton's laws, which don't care how the motion "truly" works, they just care that the measurement at the end is correct. If a computer gets the correct measurements then by definition we say it simulates the phenomenon.

Edit: clarifying this last point - Newton's laws do have a known "hidden variables" theory, in the sense that we know how an ensemble of high-temperature quantum particles can "average out" into Newton's laws, there is an electrostatic theory of mechanical contact, and so on. This does not (and seemingly cannot) exist for quantum mechanics, but merely having a quantum computer wouldn't by itself help us figure out what's going on: the outputs of a quantum computer are the "visible" variables, a.k.a. the observables. The fact that quantum computers are truly using the non-observables, whatever those might be, seemingly cannot be experimentally distinguished from a sufficiently accurate classical computer doing numerical quantum mechanics. If it turned out there were a serious experimental difference between the results of quantum computers and classical qubit simulators, that would suggest an inadequacy in the foundations of QM.


"Negative probabilities" is being imprecise -- as you allude to, what we really mean are quantum mechanical amplitudes that are out-of-phase relative to each other, such that we get constructive and destructive interference when you convert them into concrete probabilities. (Feynman also acknowledges this lack of precision in terminology, but ultimately this text was not intended to be rigorous scientific proof but rather building intuition for this problem that he was deeply interested in.)

I believe Feynman's discussion of hidden variables is a reference to the EPR paradox (see Einstein's famous quip that "God does not play dice") and the various Bell tests (which by that point in time had experimentally demonstrated that local hidden-variable theories are inadequate for describing QM). If you continue in the paper, he goes on to describe one of those experiments, involving entangled photons.

(I believe he described this setup: https://en.wikipedia.org/wiki/Bell_test#A_typical_CH74_(sing...)

In particular, what we definitely can't do is generate random numbers for the measurements of individual particles while assuming that they're independent of each other. So we have to consider the ensemble of particles, and in particular the relative phases among them - and now we're back to the same exponential blowup that caused problems when we tried to simulate the evolution of the wavefunction from first principles.
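To illustrate the first sentence (a sketch; the rotation convention and angles are my own choices, not anything from the paper): for a singlet pair, the joint outcome distribution has to be computed from the full two-particle state vector, and it reproduces the E(a, b) = -cos(a - b) correlation that no independent per-particle sampling can match for all angle pairs. With n particles, that joint vector has 2^n amplitudes - exactly the blowup above.

    import numpy as np

    rng = np.random.default_rng(1)

    def singlet_correlation(theta_a, theta_b, shots=200_000):
        # (|01> - |10>)/sqrt(2), in the computational basis 00, 01, 10, 11.
        singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

        def to_measurement_basis(t):   # measure spin along angle t in a plane
            c, s = np.cos(t / 2), np.sin(t / 2)
            return np.array([[c, s], [-s, c]])

        # The crucial step: a joint rotation of the *pair's* 4-amplitude state.
        joint = np.kron(to_measurement_basis(theta_a),
                        to_measurement_basis(theta_b)) @ singlet
        probs = np.abs(joint)**2
        outcomes = rng.choice(4, size=shots, p=probs)
        a = np.where(outcomes // 2 == 0, 1, -1)   # first particle's +/-1 result
        b = np.where(outcomes % 2 == 0, 1, -1)    # second particle's +/-1 result
        return np.mean(a * b)

    # Quantum mechanics predicts E = -cos(theta_a - theta_b):
    print(singlet_correlation(0.0, np.pi / 3))    # ~ -0.5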



