TY - JOUR
T1 - eXplainable Artificial Intelligence (XAI) for improving organisational regility
AU - Shafiabady, Niusha
AU - Hadjinicolaou, Nick
AU - Hettikankanamage, Nadeesha
AU - MohammadiSavadkoohi, Ehsan
AU - Wu, Robert M.X.
AU - Vakilian, James
N1 - Publisher Copyright:
© 2024 Shafiabady et al.
PY - 2024/4
Y1 - 2024/4
AB - Since the pandemic began, organisations have been actively seeking ways to improve their organisational agility and resilience (regility), turning to Artificial Intelligence (AI) as a critical enabler to gain deeper insight and to further enhance both. AI empowers organisations by analysing large data sets quickly and accurately, enabling faster decision-making and building agility and resilience. This strategic use of AI gives businesses a competitive advantage and allows them to adapt to rapidly changing environments. Failure to prioritise agility and responsiveness can result in increased costs, missed opportunities, competitive and reputational damage and, ultimately, loss of customers, revenue, profitability, and market share. This prioritisation can be supported by eXplainable Artificial Intelligence (XAI) techniques, which illuminate how AI models make decisions and make them transparent, interpretable, and understandable. Building on previous research on using AI to predict organisational agility, this study focuses on integrating XAI techniques, such as SHapley Additive exPlanations (SHAP), into the analysis of organisational agility and resilience. By identifying the importance of the different features that affect organisational agility prediction, the study aims to demystify the decision-making processes of the prediction model using XAI. This is essential for the ethical deployment of AI, fostering trust and transparency in these systems. Recognising key features in organisational agility prediction can guide companies in determining which areas to concentrate on to improve their agility and resilience.
UR - http://www.scopus.com/inward/record.url?scp=85191409564&partnerID=8YFLogxK
U2 - 10.1371/journal.pone.0301429
DO - 10.1371/journal.pone.0301429
M3 - Article
C2 - 38656983
AN - SCOPUS:85191409564
SN - 1932-6203
VL - 19
JO - PLoS ONE
JF - PLoS ONE
IS - 4
M1 - e0301429
ER -