An instinctive, machine-like reaction to pain is not the same as consciousness. There might be more to creatures like plants and insects, and this is still being researched, but for now most of them appear to behave more like automatons than beings of greater complexity. It’s pretty straightforward to completely replicate the behavior of, say, a house fly in software, but I don’t think anyone would argue that such a program is able to achieve self-awareness.
I strongly suspect you have some wires crossed. There have been some attempts at simulating brains, but as far as I know only a fruit fly brain has been partially simulated, and even that makes a fair few assumptions.
Success in making a self-aware digital lifeform does not equate to success in making said self-aware digital lifeform smart.
LLMs are not self-aware.
Attempting to evade deactivation sounds a whole lot like self-preservation to me, implying self-awareness.
An amoeba struggling as it’s being eaten by a larger amoeba isn’t self-aware.
To some degree it is. There is some evidence that plants can experience pain in their own way.
Could you provide an example of a complete housefly model?
I’m sorry, but I can’t find it right now; it’s a vague memory from a textbook or lecture.
Yeah, my Roomba attempting to save itself from falling down my stairs sounds a whole lot like self-preservation too. That doesn’t imply self-awareness.
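For what it’s worth, that kind of “self-preservation” can be a few lines of stimulus-response code. Here’s a minimal, hypothetical sketch of a cliff-avoidance reflex; all the names are illustrative, not a real Roomba API:

    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        # e.g. an IR floor sensor that stops seeing the floor at a stair edge
        cliff_detected: bool

    def next_action(reading: SensorReading) -> str:
        """Pure reflex: map sensor input straight to an action.

        There is no model of a 'self' anywhere here; the robot 'saves
        itself' because one branch of an if-statement says to back up.
        """
        if reading.cliff_detected:
            return "reverse"
        return "forward"

    print(next_action(SensorReading(cliff_detected=True)))   # reverse
    print(next_action(SensorReading(cliff_detected=False)))  # forward

The behavior looks goal-directed from the outside, but it’s the same hardcoded stimulus-response mapping as the amoeba example above.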