
PERCY: Personal Emotional Robotic Conversational System

  • Zhijin Meng
  • Mohammed Althubyani
  • Shengyuan Xie
  • Imran Razzak
  • Eduardo B. Sandoval
  • Mahdi Bamdad
  • Francisco Cruz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Traditional rule-based conversational robots, constrained by fixed scripts and static response mappings, fundamentally lack the adaptability required for sustained, personalized human interaction. Although large language models (LLMs) such as GPT-4 enable open-domain dialogue, most existing social robot approaches remain deficient in emotional awareness and long-term personalization. To address this gap, we present PERCY (Personal Emotional Robotic Conversational sYstem), a framework that dynamically integrates: (1) real-time affective signals from facial expression recognition, (2) the semantic content of user utterances, and (3) contextual profile data, synthesizing these multimodal inputs into emotion-aware prompts for GPT-4. This integration drives both contextually appropriate verbal responses and synchronized non-verbal robot behaviors. PERCY uses GPT-4 to dynamically model the robot's internal affective state, with non-verbal feedback expressed primarily through facial expressions. The system architecture leverages ROS-based multimodal processing: visual emotion recognition via a fine-tuned MobileNetV2, textual sentiment analysis using NLTK's VADER, decision-level sensor fusion, and GPT-4 prompt conditioning to orchestrate ARI robot behaviors. An empirical evaluation with 30 human participants demonstrated statistically significant improvements in dialogue coherence, contextual relevance, and response diversity compared to baseline systems. PERCY highlights the potential of integrating advanced multimodal perception and personalization to build a scalable foundation for next-generation emotionally intelligent human-robot interaction systems, rooted in contextually conditioned, multimodal affective computing.
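The pipeline the abstract describes (visual emotion recognition, text sentiment, decision-level fusion, and emotion-aware prompt conditioning) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the fusion rule, confidence threshold, label set, and prompt template are all assumptions, and the facial-expression classifier (a fine-tuned MobileNetV2 in the paper) is stubbed out as precomputed inputs.

```python
# Hypothetical sketch of PERCY-style decision-level fusion and prompt
# conditioning. All thresholds, labels, and the template are illustrative
# assumptions, not the published system's actual parameters.

def fuse_emotions(face_label, face_conf, text_compound):
    """Decision-level fusion of a facial-expression label (e.g. from a
    fine-tuned MobileNetV2) with a VADER-style compound sentiment score
    in [-1, 1]. Prefers the visual channel when it is confident,
    otherwise falls back to text sentiment."""
    if face_conf >= 0.6:           # assumed confidence threshold
        return face_label
    if text_compound >= 0.05:      # VADER's conventional positive cutoff
        return "happy"
    if text_compound <= -0.05:     # VADER's conventional negative cutoff
        return "sad"
    return "neutral"

def build_prompt(user_utterance, fused_emotion, profile):
    """Condition an LLM prompt on the fused affective state and a
    lightweight user profile, so the verbal reply and the robot's
    non-verbal behavior can both reflect the user's emotion."""
    return (
        f"You are a social robot talking to {profile['name']}, "
        f"whose interests include {', '.join(profile['interests'])}. "
        f"The user currently appears {fused_emotion}. "
        f"Reply empathetically and suggest a matching facial expression.\n"
        f"User: {user_utterance}"
    )

# Example: a confident visual "sad" overrides mildly positive text.
emotion = fuse_emotions("sad", 0.82, 0.4)
prompt = build_prompt("I failed my exam today.", emotion,
                      {"name": "Alex", "interests": ["chess", "hiking"]})
print(emotion)  # visual channel wins here
```

In a full system the fused label would also select a non-verbal behavior for the ARI robot (e.g. a matching facial animation), with the real channels fed by a ROS node graph rather than function arguments.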

Original language: English
Title of host publication: AI 2025
Subtitle of host publication: Advances in Artificial Intelligence - 38th Australasian Joint Conference on Artificial Intelligence, AI 2025, Proceedings
Editors: Miaomiao Liu, Xin Yu, Chang Xu, Yiliao Song
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 466-478
Number of pages: 13
ISBN (Print): 9789819549719
DOIs
State: Published - 2026
Event: 38th Australasian Joint Conference on Artificial Intelligence, AI 2025 - Canberra, Australia
Duration: 1 Dec 2025 - 5 Dec 2025

Publication series

Name: Lecture Notes in Computer Science
Volume: 16371 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 38th Australasian Joint Conference on Artificial Intelligence, AI 2025
Country/Territory: Australia
City: Canberra
Period: 1/12/25 - 5/12/25

Keywords

  • Cognitive modelling and computer-human interaction
  • Human-Robot Interaction
  • Social Robotics
