The Social Actor Approach to Interpersonal Deception

Björn Bengtsson

Department of Computing Science
University of Umeå
S-901 87 Umeå
bjorn@cs.umu.se

Recent studies by Nass and colleagues (e.g., Nass, Steuer & Tauber, 1994; Nass, Steuer, Henriksen & Dryer, 1994; Nass, Fogg & Moon, 1996) have demonstrated that people respond to computer artifacts and other communication technologies in a fundamentally social way. For instance, users apply politeness norms, notions of "self" and "other", and gender stereotypes when interacting with computers. Also, subjects who are told they are interdependent with a computer affiliate with the computer as a team. These responses are not the result of the mistaken belief that computers are human-like, or that they merely act as proxies for human programmers. Furthermore, a limited set of characteristics normally associated with humans provides sufficient cues to encourage users to exhibit such behavior.

The experimental approach used to establish these results can be described as follows:

  1. Pick a social science finding.
  2. Substitute 'computer' for 'human' in the statement of the theory.
  3. Replace one or more humans with computers in the method of the study.
  4. Provide the computer with characteristics associated with humans.
  5. Repeat the study and see if the results still apply.

The research done by Nass and colleagues is interesting for two reasons. First, the findings in themselves have important implications for the design of information technology in general, and of communication interfaces in particular. Second, the experimental paradigm has the potential to generate a wealth of additional knowledge to guide this design.

The conception of computers as social actors opens up new possibilities for what has been called 'virtual communication' (Janlert, 1996). This is communication in which one or more parties are fictive, or deviate so strongly from the perceived interlocutor that the communication becomes imagined or untrue to reality in one or more respects. Virtual communication arises when, through modern information technology, we encounter information in shapes that, in reality, have no well-defined, simple, or clearly delimited transmitter or source, e.g., synthetic faces and voices. Not surprisingly, we continue to interpret these messages and expressions using the same kind of pragmatics that we use when trying to understand individual, human agents. In this way, 'virtual agents' appear as backward projections of the expressions we observe.

Using modern information technology we can control interfaces to such an extent that what would normally constitute 'real' human-to-human communication is virtualized. The converse is also true: using increasingly sophisticated techniques (e.g., for facial animation and speech synthesis), interactions between people and machines may become deceptively 'real'. We are likely to face a rapid growth of virtual communication, and an increasing population of virtual agents whose messages we will be forced to take seriously. If this is so, a main concern is that of authenticity. Participants in a conversation must negotiate with one another to establish that what is being said is meaningful and true, and that the speakers are sincere. One of the most important factors in computer-interface design is the impact that the interface has on users' abilities to negotiate these claims successfully in the course of conversation. Today's technology and interfaces do not adequately support dialogue with virtual agents, but it is technically possible to improve the interaction. One important question to investigate is under which circumstances this is possible.

Psychologists and communication scholars have long studied issues of trust, authenticity, and validity. One broad class of research concerns judgments about the reality of incoming messages. More specifically, extensive studies have been made of interpersonal deception. Most past research on this topic, however, has focused on noninteractive situations. Recently, Buller and Burgoon (1996) have presented interpersonal deception theory (IDT), which deals specifically with discourse and deception as they occur in interpersonal situations. According to IDT, the goals, priorities, and task requirements of (possibly deceptive) senders and receivers of messages in an interaction lead each to process and understand that same interaction in distinctly different ways. IDT offers a framework for studying which schemes senders may use to perpetrate deception and the extent to which receivers may recognize such devices.

IDT focuses on face-to-face conversations. However, as the virtual communication perspective implies, state-of-the-art computer-mediated communication blurs the boundaries between interpersonal and mediated conversations. Moreover, whether a perceived interlocutor is, in fact, human or computerized (or both) becomes harder to determine as human and computerized agents converge on the concept of virtual agents. Furthermore, the experimental approach taken by Nass and colleagues lends itself beautifully to the investigation of certain aspects of virtual communication. I therefore propose to apply interpersonal deception theory to explore aspects of deception in human-computer interaction and computer-mediated communication. The purpose of such studies is to investigate the interface's impact on users' ability to negotiate claims of external validity in conversation with virtual agents. Knowledge gained from this research will be important in guiding the design of digital communication services, search agents, information filtering devices, etc.

Literature

Buller, D. B., Burgoon, J. K. (1994), Deception: Strategic and Nonstrategic Communication, in Daly, J. A., Wiemann, J. M. (eds.), Strategic Interpersonal Communication, Lawrence Erlbaum, Hillsdale, NJ, 191-223

Buller, D. B., Burgoon, J. K. (1996), Interpersonal Deception Theory, Communication Theory, vol. 6, 203-242

Buller, D. B., Burgoon, J. K., Buslig, A., Roiger, J. (1996), Testing Interpersonal Deception Theory: The Language of Interpersonal Deception, Communication Theory, vol. 6, 268-289

Buller, D. B., Hunsaker, F. G. (1995), Interpersonal Deception: XIII. Suspicion and the Truth Bias of Conversational Participants, in Aitken, J. (ed.), Intrapersonal Communication Processes Reader, Hayden-McNeil, Westland, MI, 239-251

Buller, D. B., Strzyzewski, K. D., Hunsaker, F. G. (1991), Interpersonal Deception: II. The Inferiority of Conversational Participants as Deception Detectors, Communication Monographs, vol. 58, 25-40

Burgoon, J. K., Buller, D. B. (1994), Interpersonal Deception: III. Effects of Deceit on Perceived Communication and Nonverbal Behavior Dynamics, Journal of Nonverbal Behavior, vol. 18, 155-184

Burgoon, J. K., Buller, D. B., Afifi, W., White, C., Buslig, A. (1996), The Role of Immediacy in Deceptive Interpersonal Interactions, paper presented at the annual meeting of the International Communication Association, Chicago, May 1996

Burgoon, J. K., Buller, D. B., Dillman, L., Walther, J. (1995), Interpersonal Deception: IV. Effects of Suspicion on Perceived Communication and Nonverbal Behavior Dynamics, Human Communication Research, vol. 22, 163-196

Burgoon, J. K., Buller, D. B., Guerrero, L. K., Afifi, W., Feldman, C. (1996), Interpersonal Deception: XII. Information Management Dimensions Underlying Deceptive and Truthful Messages, Communication Monographs, vol. 63, 50-69

Janlert, L-E. (1996), Virtual Communication, research proposal, Umeå University, June 1996

Moon, Y., Nass, C. (1996), How 'Real' Are Computer Personalities?: Psychological Responses to Personality Types in Human-Computer Interaction, Communication Research, vol. 23, no. 6, 651-674

Nass, C., Fogg, B. J., Moon, Y. (1996), Can Computers Be Teammates?, International Journal of Human-Computer Studies, vol. 45, 669-678

Nass, C., Steuer, J., Henriksen, L., Dryer, C. (1994), Machines, Social Attributions, and Ethopoeia: Performance Assessments of Computers Subsequent to 'Self-' or 'Other-' Evaluations, International Journal of Human-Computer Studies, vol. 40, 543-559

Nass, C., Steuer, J., Tauber, E. R. (1994), Computers are Social Actors, in Proceedings of CHI '94, Boston, MA, April 1994

