Does It Really Work? Perception of Reliability of ChatGPT in Daily Use

Authors

F. Beluzzi, V. Condorelli, G. Giuffrida

DOI:

https://doi.org/10.13136/isr.v14i10S.731

Abstract

How do individuals discriminate between what is human-made and what is produced by Artificial Intelligence (AI)? Despite OpenAI's stated mission to ensure that AI benefits humanity, its flagship technology, ChatGPT, an AI designed to reproduce natural human language, raises several questions about its widespread use.

This contribution aims to answer the following Research Questions: RQ1 - Can users with no specific knowledge of AI distinguish between text produced by ChatGPT or similar language models and text written by humans? RQ2 - Is there a significant correlation between the attribution of a text to AI (or to a human) and specific opinions and attitudes?

This exploratory survey does not aim to generalise its results but to identify opinions and attitudes that may have influenced how participants responded. One hundred people took part in the experiment, which consisted of a survey on their knowledge and perception of ChatGPT followed by a two-shot Turing Test: participants read a series of short paragraphs and tried to recognise which were written by humans and which were generated by AI.

The results showed that the group analysed had severe difficulty recognising whether a sentence was written by an AI or by a human being, that certain perceptual biases interfere with the attribution of a trivially false text, and that the attribution error can be reduced through experience and learning. Although these findings require further investigation, they can help lay the groundwork for studying the effects of human–AI interaction from both a social science and a computer science perspective.

Published

28.07.2024

How to Cite

Beluzzi, F., Condorelli, V., & Giuffrida, G. (2024). Does It Really Work? Perception of Reliability of ChatGPT in Daily Use. Italian Sociological Review, 14(10S), 625–655. https://doi.org/10.13136/isr.v14i10S.731

Issue

Vol. 14 No. 10S (2024)

Section

Articles