Enhancing Personalized User Interaction by Incorporating Contextual Information Such as Conversational History, User Preferences, Emotional State, Location, and Behavioral Patterns for Adaptive and Dynamic Communication
Abstract
This research focuses on enhancing personalized user interaction in intelligent chatbot systems by incorporating contextual information such as conversational history, user preferences, emotional state, location, and behavioral patterns for adaptive and dynamic communication. Traditional chatbot systems often suffer from poor contextual understanding, limited personalization, weak emotional adaptability, and inaccurate response generation, resulting in reduced user satisfaction and inefficient communication. To address these challenges, the proposed framework integrates Artificial Intelligence (AI), Machine Learning (ML), Deep Learning (DL), and transformer-based NLP architectures such as BERT and GPT to improve semantic understanding, contextual reasoning, and personalized conversational response generation. The proposed system utilizes contextual memory management, sentiment analysis, behavioral prediction, and location-aware adaptation to create intelligent and human-like communication. Experimental evaluation demonstrates that the proposed Adaptive Context-Aware Hybrid Model significantly outperforms traditional conversational systems. The framework achieved 97.12% accuracy, 96.54% precision, 96.20% recall, and 96.37% F1-score. In addition, the system achieved a 97.90% dialogue success rate, 98.10% personalization accuracy, and a 98.25% user satisfaction score. The results confirm that integrating contextual computing and transformer-based conversational intelligence improves adaptive communication, emotional understanding, dialogue continuity, and personalized interaction in intelligent chatbot systems.
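The abstract reports accuracy, precision, recall, and F1-score for the proposed model. As a point of reference only, the following minimal sketch shows how those four metrics are conventionally defined for binary classification; the toy labels below are illustrative and are not the paper's data or model outputs.

```python
# Illustrative sketch: standard definitions of accuracy, precision,
# recall, and F1-score from binary true/predicted labels.

def classification_metrics(y_true, y_pred):
    """Return (accuracy, precision, recall, f1) for binary labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return accuracy, precision, recall, f1

# Toy example (hypothetical labels, for illustration only)
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
```

In practice these metrics are usually computed with an established library (e.g. scikit-learn) rather than by hand; the sketch just makes the definitions behind the reported percentages explicit.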
Article Details

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.