Abstract
The rapid convergence of neuroscience, behavioural computing, and adaptive artificial intelligence (AI) offers the potential to transform human-machine interaction. This work presents Psycho-Intelligence, a novel closed-loop system that fuses electroencephalography (EEG) and inertial measurement unit (IMU) signals to adaptively recognise and respond to users' cognitive and affective states. Leveraging low-cost wearable sensors (Muse EEG and MPU-6050), the system performs real-time signal acquisition, sophisticated preprocessing, spectral and statistical feature extraction, and multimodal feature fusion. Dimensionality reduction and feature selection techniques, including Principal Component Analysis and XGBoost gain metrics, optimise the learning process. Multiple machine learning algorithms, including Random Forest, SVM, and XGBoost, are trained to identify engagement states with high accuracy, validated through extensive cross-validation, ROC AUC, and F1-score evaluation. The pipeline is integrated into an adaptive feedback system that can adjust chatbot tone, learning material, or interactive graphics based on detected user states. Statistical validation with linear mixed models confirms the robustness of EEG-derived measures for engagement prediction. This research establishes a new paradigm for emotionally intelligent AI systems and provides a technical foundation for ethical, real-time psycho-behavioural intelligence in communication networks, education systems, and cognitive health monitoring.