I didn't say that internal life has nothing to do with behavior. I said you can't infer internal life from behavior.
> I'm not asserting they have an internal life, the person I responded to was saying they have no internal life and I said they cannot rigorously justify such a claim.
The only claim that I can "rigorously justify" concerning internal lives is that I have one. And the proof is subjective. Clearly this is not a discussion that admits the kind of rigor you're demanding...
We have good reason to believe that people and animals have internal lives whereas rocks and computers don't. That we can't be certain of this isn't a reason to believe the opposite. If your goal here is to insist that all statements about internal lives (other than "I have an internal life") can't be "rigorously justified," fine, I agree.
> I said you can't infer internal life from behavior.
Behaviour can't tell you there isn't internal life, but complex behaviour sure is a big red flag that there might be internal life there.
> We have good reason to believe that people and animals have internal lives whereas rocks and computers don't.
For rocks, yes; for computer programs, no, we don't, and that's the point. There is no reason to suppose an information processing system that can converse intelligently doesn't have some kind of internal experience, and you have presented not a single argument suggesting that we can draw such a conclusion. I presented a simple deduction for rocks; what is the corresponding deduction for computer programs?
> Behaviour can't tell you there isn't internal life, but complex behaviour sure is a big red flag that there might be internal life there.
I agree, behavior is evidence, but there are good reasons to suppose that, despite their behavior, computers lack internal lives. To state my argument more clearly:
I know I have an internal life (due to subjective evidence). My internal life is related to the structure of my body in an unknown way. I'm willing to grant that other humans and similar, evolved organisms also have internal lives because they are the same kind of evolved thing as me.
If someone wants to argue that a designed, thoroughly understood object (like a computer) has an internal life, they need to explain how the structure of the object gives rise to an internal life. Behavior isn't enough.
> My internal life is related to the structure of my body in an unknown way
That's right, an unknown way. So we know the internal structures of computers and their programs, but we don't know the internal structure of humans or how it leads to sentience. Yet somehow this translates into good reasons for thinking the relationships in computers are not isomorphic to, or meaningfully related to, the unknown structures that lead to human sentience? Don't you see the big blank there? This just does not follow, and it is exactly the first point I made in this thread.
All I've been pointing out is that people making strong statements about LLMs lacking sentience are leaping to conclusions.
> That's right, an unknown way. So we know the internal structures of computers and their programs, but we don't know the internal structure of humans or how it leads to sentience. Yet somehow this translates into good reasons for thinking the relationships in computers are not isomorphic to, or meaningfully related to, the unknown structures that lead to human sentience? Don't you see the big blank there? This just does not follow, and it is exactly the first point I made in this thread.
It absolutely follows. The evidence for consciousness is subjective human experience, so it makes sense to assume that other humans and living things have consciousness (because living things are all related to each other).
Meanwhile it doesn't make sense to assume that a computer running some program is conscious just because it mimics human conversation. It isn't evolved life and doesn't have the same kinds of structures as evolved life.
It would be unbelievably weird if we reverse engineered consciousness without understanding how our bodies generate consciousness.
You changed the deduction I described thus moving the goalposts. I'm not going to argue a point I never made. Suffice it to say, the only people assuming anything about LLMs are the ones claiming they're not sentient.
> It would be unbelievably weird if we reverse engineered consciousness without understanding how our bodies generate consciousness
I find it weird that you think this would be weird. Most discoveries are like this. We mastered fire before we understood chemistry, we mastered flight before we understood how birds fly, we have a booming pharmaceutical industry and we still don't really know how most drugs work. Theory and formal understanding often follow discovery and practical applications, not the other way around.
Consciousness and/or sentience will be created before we understand it, and we will only retrospectively recognize that fact. In fact, we may have already created it.
Great dialog. Thanks to both of you. The boundaries we are trying to define are becoming more and more blurred. In my view the brain is a tool to improve fitness, and consciousness/self-consciousness is just a relatively simple higher-order recursive system that helps us model the behaviors of others and spill out long sentences like this one. Nothing too special, and nothing that a tweaked LLM with a set of good supervisor algorithms to modulate attention (and perhaps move a body) could not manage.