With the expansion of remote and hybrid workforces and an increasingly complex technology stack, it is more important than ever to manage the digital employee experience. But how do we measure digital employee experience? One of the most common ways is to use surveys – for example, post-incident surveys or annual customer satisfaction surveys. But surveys, on their own, can be reactive, time-delayed and limited in scope. Is there a way to measure digital experience consistently and automatically, across the broader set of IT functions?

The answer is yes! In the Ivanti Neurons Workspace 2022.2 release, we are introducing the Digital Experience Score (DEX Score for short). We start with a curated set of indicators that affect digital experience, pump them into our proprietary scoring engine and generate a DEX Score between 0 and 100.
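To give a feel for the end-to-end shape – indicators in, a single 0–100 number out – here is a purely illustrative sketch in Python. The domains, sub-scores and equal weights are assumptions made up for the example; they are not the actual scoring engine, whose internals are proprietary.

```python
# Purely illustrative: roll hypothetical per-domain sub-scores (each 0-1)
# up into a single 0-100 score. The real engine is proprietary and far more
# sophisticated than a weighted average.
domain_scores = {
    "service management": 0.90,
    "application": 0.70,
    "device": 0.60,
    "security": 0.95,
}
weights = {domain: 0.25 for domain in domain_scores}   # assumed equal weights

dex_score = 100 * sum(weights[d] * score for d, score in domain_scores.items())
print(round(dex_score))   # ≈ 79 with these assumed numbers
```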

DEX (Digital Experience) Score - Full Demo

Let's look at the input to our scoring engine – indicators – first. Indicators can be, for example, metrics like hard disk storage, Boolean flags like whether antivirus protection is enabled, and events like when a scan happened. We want a scoring system that is representative of the rich digital experience across key IT functions, so we selected indicators from service management, application, device and security.
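As a quick illustration of those three shapes of indicator, here is one way they could be represented. The schema, field names and sample values below are assumptions for the example, not Ivanti's actual data model.

```python
# Illustrative only: one way to represent the three shapes of indicator
# described above - numeric metrics, Boolean flags and timestamped events.
from dataclasses import dataclass
from datetime import datetime
from typing import Union

@dataclass
class Indicator:
    device_id: str
    domain: str        # e.g. "service management", "application", "device", "security"
    name: str
    value: Union[float, bool, datetime]

sample = [
    Indicator("LAP-0042", "device", "free_disk_gb", 118.0),
    Indicator("LAP-0042", "security", "antivirus_enabled", True),
    Indicator("LAP-0042", "security", "last_scan", datetime(2022, 9, 1, 3, 0)),
]
```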

Obviously, when it comes to our scoring engine, we can't reveal too much of our secret sauce, but what I can say is that we use a hybrid model – meaning we combine more than one statistical and machine learning technique to calculate the score.

Our scoring engine does not require system admins to manually adjust thresholds for what makes an indicator good or bad. For quantitative device indicators such as hard disk storage, we analyze past readings and use statistical models to infer the normal range of behavior. Once we know that normal range, we can calculate how similar or different a new data point is from normal behavior. This idea of quantifying normalcy versus irregularity is at the heart of our scoring engine.
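Here is a minimal sketch of that idea, using a robust z-score and a Gaussian-shaped falloff. These particular choices are assumptions for illustration, not the statistical models Ivanti actually uses.

```python
# A minimal, illustrative sketch (not Ivanti's actual model): learn the normal
# range of a numeric indicator from its own history, then quantify how far a
# new reading deviates from that range.
import numpy as np

def normalcy_score(history: np.ndarray, new_value: float) -> float:
    """Return a 0-1 score: 1.0 means typical, values near 0.0 mean far from normal."""
    median = np.median(history)
    mad = np.median(np.abs(history - median))              # robust spread estimate
    scale = 1.4826 * mad if mad > 0 else (np.std(history) or 1.0)
    z = abs(new_value - median) / scale                     # robust z-score
    return float(np.exp(-0.5 * z ** 2))                     # Gaussian-shaped falloff

free_disk_gb = np.array([118, 120, 115, 119, 121, 117, 116, 122, 119, 118])
print(normalcy_score(free_disk_gb, 119))   # close to 1.0 - a typical reading
print(normalcy_score(free_disk_gb, 20))    # close to 0.0 - far outside past behavior
```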

But what about textual indicators like incident subject and description? How do we identify the latent emotion in that text and convert it into a quantifiable measurement? We decided to apply sentiment analysis to open incidents that are linked to a device. Our sentiment analysis model is a deep neural network that classifies incident text into positive, negative and neutral sentiment. The model is pre-trained, so no additional training is required from customers.
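To show what three-class sentiment classification of incident text looks like in practice, here is an illustrative sketch using an off-the-shelf, publicly available model. That public checkpoint is only a stand-in and is unrelated to the pre-trained model shipped with Ivanti Neurons Workspace.

```python
# Illustrative only: classify incident text into negative / neutral / positive
# sentiment with an off-the-shelf pre-trained model. This public checkpoint is
# a stand-in, not Ivanti's proprietary model.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="cardiffnlp/twitter-roberta-base-sentiment-latest",  # any 3-class model works
)

incidents = [
    "Laptop keeps freezing during video calls, extremely frustrating.",
    "Requesting access to the shared marketing drive.",
]

for text in incidents:
    result = classifier(text)[0]   # e.g. {'label': 'negative', 'score': 0.93}
    print(f"{result['label']:>8}  {result['score']:.2f}  {text}")
```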

Lastly, we do not just analyze the indicators individually; instead, we use a data mining algorithm that elegantly takes into account the interactions between those indicators. That means even though individual indicators can fluctuate and spike, or not tell you much on their own, together they tell us how much a device's behavior deviates from normal. Another advantage of this data mining algorithm is that it picks out which indicators are contributing to the abnormal behavior, so we can surface them as potential issues.
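As a rough, purely illustrative stand-in for that idea – not the data mining algorithm Ivanti actually uses – the sketch below scores a device's indicator vector against its peers with an Isolation Forest, which captures interactions between indicators, then surfaces the indicators that contribute most to the anomaly by swapping each one for a typical value. The indicator names and synthetic data are made up for the example.

```python
# Illustrative only: multivariate anomaly scoring plus a crude per-indicator
# attribution. Not Ivanti's proprietary algorithm.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
indicator_names = ["free_disk_gb", "boot_time_s", "app_crashes", "open_incidents"]

# Synthetic fleet history: 500 devices x 4 indicators.
history = rng.normal(loc=[120, 35, 1, 0.5], scale=[30, 8, 1, 0.7], size=(500, 4))

model = IsolationForest(random_state=0).fit(history)

device = np.array([[15.0, 70.0, 6.0, 3.0]])   # today's reading for one device
anomaly = -model.score_samples(device)[0]     # higher means more abnormal
print(f"anomaly score: {anomaly:.3f}")

# Crude attribution: how much would the score drop if an indicator were typical?
medians = np.median(history, axis=0)
for i, name in enumerate(indicator_names):
    patched = device.copy()
    patched[0, i] = medians[i]
    contribution = anomaly - (-model.score_samples(patched)[0])
    print(f"{name:>15}: {contribution:+.3f}")
```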

Strip away the fancy data science terms, and what we are doing is comparing and correlating indicators across time and with each other, then condensing the result into a single, holistic score. With no manual machine learning training and no threshold setting, are you ready for unprecedented insights into the digital experience across IT functions?