Most people think large language models such as ChatGPT can experience feelings and memories

By Media Relations

Two-thirds of people surveyed think that artificial intelligence (AI) tools like ChatGPT have some degree of consciousness and can have subjective experiences such as feelings and memories, according to a new study from the University of Waterloo.

Large language models (LLMs) like ChatGPT often respond in a conversational, human-like style. This apparent human likeness has spurred debate over whether AI has consciousness.

According to the researchers, if people believe that AI has some level of consciousness, it could ultimately affect how they interact with AI tools, potentially strengthening social bonds and increasing trust. On the other hand, excessive trust can lead to emotional dependence, reduced human interaction, and over-reliance on AI for critical decisions.

“While most experts deny that current AI could be conscious, our research shows that for most of the general public, AI consciousness is already a reality,” said Dr. Clara Colombatto, professor of psychology in Waterloo’s Faculty of Arts.
