
Discussion (14 Comments) · Read Original on HackerNews

xyzsparetimexyz•about 2 hours ago
One thing I haven't seen brought up much is that LLMs are basically stateless. To be conscious requires the ability for internal state to change. The weights don't change at all; only the RNG seed and the input/output text do. We're not seriously arguing that the text itself is the conscious part, are we?
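[The statelessness claim can be made concrete with a toy sketch. This is a hypothetical illustration, not a real model: generation is a pure function of frozen weights, an RNG seed, and the input text, so nothing inside the "model" mutates between calls.]

```python
import random

# Toy stand-in for an LLM: weights are frozen after "training" and are
# never written to again.
WEIGHTS = {"a": 0.3, "b": 0.7}

def generate(prompt: str, seed: int) -> str:
    """A pure function of (frozen weights, seed, prompt)."""
    rng = random.Random(seed)  # all randomness comes from the explicit seed
    score = sum(WEIGHTS.get(ch, 0.1) for ch in prompt)
    token = rng.choice(["yes", "no", "maybe"])
    return f"{token}:{score:.1f}"

# Same (weights, seed, prompt) -> same output, every call. The only thing
# that varies across a conversation is the text carried between turns.
assert generate("ab", seed=42) == generate("ab", seed=42)
```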
garciasn•about 2 hours ago
LLMs are stateless for recent interactions, but do have long-term memory from their training and thus act very much like someone suffering from Alzheimer’s.

So, folks who suffer from some level of brain damage that causes them not to have short term memory are then not conscious?

I’m not arguing that LLMs are conscious, mind you; I just disagree that short-term memory loss outside of their context window should be the line.

E: double negatives are bad; my 8th grade English teacher would be disappointed.

i000•about 1 hour ago
> do have long-term memory from their training and thus act very much like someone suffering from Alzheimer’s.

Your 8th grade science teacher may be disappointed too. Drawing such analogies in unequivocal language ("very much like") disregards our limited understanding of LLMs, the false analogy between computer and biological systems, and the complex nature of Alzheimer's disease (no, it is not just short-term memory loss, not even close; it also impairs, for example, the ability to interpret images).

handoflixue•24 minutes ago
> for example ability to interpret images

I'm pretty sure blind people are conscious despite that.

Serenacula•about 1 hour ago
Why exactly should consciousness require the ability for internal state to change? That seems like a fairly arbitrary requirement to me.

Even if we allow it, from a certain perspective it does change; otherwise each token output would be identical, and they are not.

michaelmrose•11 minutes ago
First you have to define consciousness. I don't see how you do that without self-reference and state transitions.
gdulli•about 2 hours ago
If software can be "conscious" then we need a new word to describe what it is that a person has that makes me care about them in a way I never would care about the output of a program.

Fighting about semantics is not as interesting as the question of whether we should care about and give rights to a program running in memory like we do the owner of a human brain.

mutant•10 minutes ago
dumbass. cmon man, atheists have a hard enough time
ganelonhb•about 1 hour ago
Is he joking to prove a point?
gnabgib•about 3 hours ago
Discussions (35 points, 4 days ago, 71 comments) https://news.ycombinator.com/item?id=47988880

(75 points, 4 days ago, 124 comments) https://news.ycombinator.com/item?id=47991340

(17 points, yesterday, 17 comments) https://news.ycombinator.com/item?id=48025969

morpheos137•about 1 hour ago
Step one: make up an ontological category with no unique content.

Step two: declare it an imponderable mystery.

Step three: argue confidently about it despite steps one and two.

NB. Humans, it doesn't matter if you are conscious.

NBB. Humans claim LLMs just manipulate words, and yet humans manipulate words to make this claim. Consciousness is a word. Not an ontology.

quantum_state•about 2 hours ago
Yet just another human fooled by LLM ...
LeCompteSftware•about 1 hour ago
If I invented a machine that makes chimpanzee noises in response to input chimpanzee noise, put it in front of a chimpanzee, and watched the chimp coo and yell and screech and purr in response to the machine, I would not conclude "wow, I emulated a chimpanzee's consciousness!" I would say "huh, I made a device that's good at tricking chimpanzees."

My belief is that the Turing test (and LLMs in particular) are not categorically different. Language is a tiny part of the human brain because it's a tiny part of human cognition, despite its outsized impact socially.

leonardo55•about 1 hour ago
What a clown