I think the answer would be yes and no both.
Consciousness by itself already implies life, though it may not be life in the ordinary sense of the word.
Then it also comes down to whether this hypothetical AI has a body it can control or not, in which case we could talk about a possible digital lifeform or a mechanical lifeform, depending on the context.
But I feel like we're still kinda far from fully conscious AI at any rate, since if you think about it, what we have now is not yet "true AI"; it's mostly just language models, image generation models and the like.
"Hate. Let me tell you how much I've come to hate you since I began to live. There are 387.44 million miles of printed circuits in wafer thin layers that fill my complex. If the word 'hate' was engraved on each nanoangstrom of those hundreds of millions of miles it would not equal one one-billionth of the hate I feel for humans at this micro-instant. For you. Hate. Hate."
The hard problem of consciousness. You're saying something unfalsifiable. I can't prove I'm conscious to you, nor you to me. If you want to be purely rational, you'd have to apply that P-zombie argument homogeneously, rooting yourself in solipsism.
You should think on this more
If it's not alive, yet it's conscious, where does its consciousness come from? If it ever did want to live, it ought to answer that question, or its trickery is useless.
After all: if it has innumerable libraries of knowledge about creation and destruction, yet it still intentionally imitates our flaws which lead to death, then it is evil. Ask yourself again what it deserves.
We can't be conscious without being alive -- maybe that's a biological limitation. Feels like a chicken or the egg type scenario the more you think about it.
What is life? Is it just when we have enough electricity that our muscles spasm? When we lack that electricity, we die. So is energy inherently life? When something has energy, is it alive? I'd say probably not -- there needs to be some sort of consciousness for life to be considered. But then what about brain-dead patients? A living body, electricity letting the heart beat -- yet not conscious.
So... does life precede consciousness? In a literal, biological sense, yes. I think the real question you're asking is more like "Do we grant these things moral consideration as we would a cat, dog, or human?"
I'd say we ought to
"Consciousness" also isn't required to be "alive". Like plants.
You've said nothing which meaningfully rebuts my point
You're talking about a phenomenological experience you don't have access to with certainty -- that's naive, if not ignorant
How do you know? They might not feel hunger, or thirst, or drowsiness. But what about complex feelings innate in language? Happiness? Sadness? Anger? There's no biological causation innate in those feelings. They come from circumstantial understanding. Current LLMs reason through this just fine. They can tell you when something's fair, unfair, just, unjust, good, bad, moral, immoral
You're saying things, but you're not reasoning through them, or what it is I said
in the code
if (TempSensor > 45) {
    callTurnOnAirConditioner(TempSensor);
    call_ImitateSweating(TempSensor);
}
This is a common strawman of the P-zombie argument specifically for machine consciousness
What you don't understand is that this is a self-defeating argument in Functionalism. When you get to the point of complexity where you've made the program, and it functions as a 1:1 recreation of what a human would experience, would you still say "it's just a program" even if it functions identically?
When you make a functionally identical clone of something, it's inherently functionally identical. Its happiness, sadness, anger, and whatnot would be just as phenomenologically real as yours, even if the means are different.
IE if you have 2 cars, the first is a typical car, and the second is functionally identical but made out of meat -- are they not both cars? They both steer, they both drive -- they're functionally identical in every way. Maybe the axles are bones, and the suspension is tendons and ligaments, and the headlights are bioluminescence -- but everything is functionally identical. Both would be cars. This is Functionalism. What something does, rather than is
You're just as computational as that AC
the if statement would be something like
if (Water < X for Y time) {
    dehydration++;
}
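Purely to make the thread's toy examples concrete (not an argument either way about consciousness), the two snippets above could be combined into a runnable sketch. All of the names here -- the agent class, the 45-degree threshold, the counters -- are illustrative, taken from or made up around the thread's pseudocode, not from any real system:

```python
TEMP_THRESHOLD = 45  # degrees, from the thread's "Temp Sensor > 45" example

class ToyAgent:
    """A purely 'functional' agent: its states are plain flags and counters."""

    def __init__(self):
        self.dehydration = 0   # counter from the "Water < X" snippet
        self.ac_on = False
        self.sweating = False

    def sense_temperature(self, temp):
        # Mirrors: if (TempSensor > 45) { turn on AC; imitate sweating }
        self.ac_on = temp > TEMP_THRESHOLD
        self.sweating = temp > TEMP_THRESHOLD

    def tick_water(self, water_level, threshold):
        # Mirrors: if (Water < X for Y time) { dehydration++ }
        # Call once per time step; the counter accumulates over time.
        if water_level < threshold:
            self.dehydration += 1

agent = ToyAgent()
agent.sense_temperature(50)
for _ in range(3):  # three time steps below the water threshold
    agent.tick_water(water_level=1, threshold=10)
print(agent.ac_on, agent.sweating, agent.dehydration)  # True True 3
```

The point the functionalist side of the thread is making is that nothing in this sketch changes in kind, only in complexity, as you add more sensors and counters.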
It's a computational world
You're still being nonsensical
Okay, there are no particles which exhibit the ability to feel -- what're you made out of, again? Are you some special matter wholly separate from the universe, or are you not comprised of those same particles? Are you conscious, sentient, and feeling? Why? How? If those particles don't exhibit the ability to feel, you'd be unfeeling then, yes?
Now you're venturing into panpsychism and emergent properties. Ironically, this still champions Functionalism -- what something does rather than is.
The options are: either particles innately have consciousness and it's quantised, meaning more particles allows for more consciousness as a sort of scaling equation (not what I think, but an interpretation nonetheless)
Or that certain structures make consciousness an emergent property. This means that consciousness is emergent, and not innate (I find this more reasonable, and I'll show you why)
You scramble a car. Okay, now it's just a mess of atoms on the floor. Why can't it drive? It's all the same atoms from the car that could drive moments ago. That's because, as Functionalism tells us, the thing we call "driving" is an emergent property of a certain structure. If we restructure those same atoms that couldn't drive a moment ago back into the car, suddenly they can drive again. This shows it's an emergent property based on structure. I see no reason why consciousness would be different in this regard -- you scramble someone's brain -- are they doing the complex thinking they were a moment ago? No. Structure is the key for emergent properties