If an AI has passed the Turing test, meaning people cannot tell the difference between a human being and a computer by talking to it, then how do you know that the AI is not conscious? Look at it the other way: a human being failed the Turing test. Does that mean human beings are just a load of neurons firing, and it's all just a trick?
"GPT-4.5 could fool people into thinking it was another human 73% of the >time. "..." And 4.5 was even judged to be human significantly *more*
often than actual humans!"
Interestingly, *asterisks* for emphasis have slipped into an HTML article, where there is no need for them.
https://www.livescience.com/technology/artificial-intelligence/open-ai-gpt-4-5-is-the-first-ai-model-to-pass-an-authentic-turing-test-scientists-say