Abstract
Soft-loan application programmes increasingly leverage machine-learning (ML) techniques to automate credit-eligibility decisions. Despite high predictive performance, many models retain a "black-box" character, which limits interpretability, hinders regulatory compliance, and raises the risk of bias, particularly for prospective soft-loan customers who often lack a documented credit history or regular bank statements. Previous studies on explainable AI (XAI) in credit approval and risk management have focused largely on traditional loans, with little attention to ethical and explainable decision-making for widely acceptable soft-loan systems. To bridge this gap, this work proposes an explainable soft-loan approval framework that combines predictive models with the SHAP and LIME explainability methods and imposes no restriction on age bracket. In experiments on the Give Me Some Credit dataset, Logistic Regression, Random Forest, and XGBoost models were assessed, and XGBoost achieved the best performance (AUC = 0.8674, Recall = 0.7696). SHAP identified credit utilisation, delinquency history, and debt burden as the most influential predictors, while LIME provided insightful case-level interpretations. A bias assessment showed moderate variation across age brackets (Accuracy STD = 0.1045; Recall STD = 0.0894), emphasising the importance of fairness-aware deployment. The results show that integrating XGBoost with XAI produces an interpretable, auditable, and fair approach to soft-loan decision modelling.
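The age-bracket bias assessment summarised above can be sketched as follows: compute accuracy and recall per bracket, then take the standard deviation across brackets. This is a minimal illustration only; the bracket boundaries, labels, and predictions below are hypothetical placeholders, not the paper's data or results.

```python
from statistics import pstdev

def accuracy(y_true, y_pred):
    # Fraction of predictions matching the true label.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def recall(y_true, y_pred):
    # True positives over all actual positives.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp / (tp + fn) if (tp + fn) else 0.0

# Hypothetical per-age-bracket true labels and model predictions.
groups = {
    "18-30": ([1, 0, 1, 0, 1], [1, 0, 0, 0, 1]),
    "31-50": ([1, 1, 0, 0, 0], [1, 0, 0, 1, 0]),
    "51+":   ([0, 1, 1, 0, 1], [0, 1, 1, 0, 0]),
}

acc = {g: accuracy(y, p) for g, (y, p) in groups.items()}
rec = {g: recall(y, p) for g, (y, p) in groups.items()}

# Population standard deviation across brackets, analogous to the
# reported Accuracy STD / Recall STD fairness metrics (lower = fairer).
acc_std = pstdev(acc.values())
rec_std = pstdev(rec.values())
print(f"Accuracy STD = {acc_std:.4f}, Recall STD = {rec_std:.4f}")
```

A deployment-time check of this kind can gate model updates: if the cross-bracket STD exceeds a chosen threshold, the model is flagged for review rather than released.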