Mixture of Experts (MoE) Based Top Performer Segmentation with Multilingual Chatbot Integration

by Dr. Binod Kumar, Dr. Leena S More (Deshmukh)

Published: May 8, 2026 • DOI: 10.51244/IJRSI.2026.1304000149

Abstract

This paper presents a Mixture of Experts (MoE) architecture for workforce segmentation. The proposed framework combines four machine learning models as experts: Support Vector Machine (SVM), Random Forest (RF), XGBoost, and Artificial Neural Network (ANN). A softmax-based gating network dynamically assigns weights to the experts' predictions. The system is evaluated on large-scale HR datasets together with real-time, chatbot-generated appraisal data. Experimental results show classification accuracy above 92.1%, high cluster separability (Silhouette Score = 0.95), and significant improvements in HR efficiency, participation, and fairness. The framework supports inclusive, data-driven talent management in industrial environments.
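To make the softmax-gated combination concrete, the sketch below trains a small mixture of experts in Python. It is a minimal illustration under stated assumptions, not the authors' implementation: the toy dataset, the scikit-learn expert choices, and the gradient-descent gate training are all assumptions made for the example; an XGBoost expert (xgboost.XGBClassifier) would slot in the same way if that dependency were added.

```python
# Minimal sketch of a softmax-gated Mixture of Experts for binary
# "top performer" classification. Toy data and all hyperparameters are
# illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Expert models standing in for the paper's SVM / RF / ANN experts.
experts = [
    SVC(probability=True, random_state=0),        # SVM expert
    RandomForestClassifier(random_state=0),       # RF expert
    MLPClassifier(max_iter=500, random_state=0),  # ANN expert
]
for expert in experts:
    expert.fit(X_tr, y_tr)

# Per-expert probabilities of the positive class on the training set.
P = np.stack([e.predict_proba(X_tr)[:, 1] for e in experts], axis=1)

# Gating network: a linear layer plus softmax over experts, trained by
# gradient descent to minimize the negative log-likelihood of the mixture.
n, E = P.shape
W = np.zeros((X_tr.shape[1], E))
b = np.zeros(E)
lr = 0.1
for _ in range(300):
    logits = X_tr @ W + b
    G = np.exp(logits - logits.max(axis=1, keepdims=True))
    G /= G.sum(axis=1, keepdims=True)                    # gate weights g_k(x)
    mix = np.clip((G * P).sum(axis=1), 1e-9, 1 - 1e-9)   # mixture P(y=1 | x)
    dmix = -(y_tr / mix - (1 - y_tr) / (1 - mix)) / n    # dNLL / dmix
    dlogits = dmix[:, None] * G * (P - mix[:, None])     # backprop through softmax
    W -= lr * (X_tr.T @ dlogits)
    b -= lr * dlogits.sum(axis=0)

# Evaluate: gate weights depend on the input, so each test sample gets its
# own blend of expert opinions.
P_te = np.stack([e.predict_proba(X_te)[:, 1] for e in experts], axis=1)
logits = X_te @ W + b
G_te = np.exp(logits - logits.max(axis=1, keepdims=True))
G_te /= G_te.sum(axis=1, keepdims=True)
pred = ((G_te * P_te).sum(axis=1) >= 0.5).astype(int)
print(f"MoE test accuracy: {(pred == y_te).mean():.3f}")
```

The property this sketch mirrors from the abstract is that the gate's softmax output is input-dependent, so the expert weighting varies per record rather than being a fixed ensemble average.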