Analytics and the Evolving Landscape of Machine Learning for Intelligent Applications
The goal of embedding intelligence into an application is to help users make better decisions faster, especially in times of crisis.
How do you quantify benefits when designing the “best” experience? How can you measure and drive maximum engagement and value? One common measure of success is the amount of time users spend in an application. Applications are built in the hope that users log in every day – multiple times per day – and spend a significant portion of their time in them. After all, if users aren’t using that ‘killer app’, it’s because it delivers no value in terms of time, cost or efficiency.
Our operational intelligence agile scrum development team recently discussed what “using” really means: (a) do we build an application with great visualization and plenty of bells, whistles and configuration options that encourage users to stay in the app, or (b) do we build a low-touch, intelligent application where users are not expected to spend significant time at all? Which would drive higher customer satisfaction and NPS scores?
Machine Learning and Operational Intelligence
A well-implemented intelligent application helps focus the user on preventing a critical event and, when a problem does occur, enables it to be resolved quickly – reducing both MTTD (Mean Time to Detect) and MTTR (Mean Time to Resolve). Embedded operational intelligence can deliver significant value even if users never log in: they simply receive email notifications when problems are resolved, or when operations are drifting out of norm, along with guidance on how to restore normal behavior. Even better, what if that resolution guidance was ‘automagically’ executed?
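To make the “drifting out of norm” idea concrete, here is a minimal sketch of how such a detector might work: compare each new metric reading against a rolling baseline and raise an alert (here, just a formatted message standing in for an email notification) when the reading deviates too far. The class name, window size and threshold are illustrative assumptions for this post, not any real product’s API.

```python
# Hypothetical sketch: flag metric readings that drift out of norm,
# the way an embedded operational-intelligence layer might before
# notifying a user. Names and thresholds are illustrative only.
from collections import deque
from statistics import mean, stdev

class DriftDetector:
    """Flags readings more than `z_limit` standard deviations away
    from a rolling baseline of the last `window` readings."""

    def __init__(self, window=20, z_limit=3.0):
        self.history = deque(maxlen=window)
        self.z_limit = z_limit

    def observe(self, value):
        """Return an alert string if `value` is out of norm, else None."""
        alert = None
        if len(self.history) >= 2:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.z_limit:
                alert = (f"ALERT: reading {value:.1f} deviates from "
                         f"baseline {mu:.1f} (±{sigma:.1f})")
        self.history.append(value)  # fold every reading into the baseline
        return alert

detector = DriftDetector()
readings = [50, 51, 49, 50, 52, 48, 51, 95]
alerts = [a for a in (detector.observe(v) for v in readings) if a]
print(alerts)  # the spike to 95 triggers the single alert
```

In a real system the alert would of course feed a notification pipeline rather than a print statement, and the baseline model would be far richer than a rolling z-score – but the low-touch principle is the same: the user hears from the application only when something needs attention.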
In his blog titled Analytics 3.0, Thomas Davenport does a great job explaining the evolution of analytics over the last couple of decades. We have truly arrived in the age of machine learning and problem prediction. A good real-world example of this evolution is the navigation systems we use. Before a world where every device was connected, learning every minute and GPS-enabled, printed paper maps helped people navigate to their destinations. To improve the user experience, mapmakers highlighted places of interest and created dedicated maps for high-interest areas. Most of these maps, however, were intended for expert users who knew what they were looking for, where they were going and, most importantly, had significant experience reading maps.
Then came the electronic GPS. GPS applications took all the geolocation data that had already been collected and combined it with the power of GPS to provide near-real-time feedback on progress and traffic. The application was now intelligent enough to provide on-demand analysis of the best route to a destination and, when an event occurred, the best alternate route. The focus on user experience shifted to ease of data entry and speed of response to events on the road. Users still required some level of expertise in knowing what to look for and where they were headed.
In today’s connected world, expecting users to know what they are looking for before an event seems a big stretch. Now every phone is a portable GPS device, and combined with software that utilizes map data and real-time traffic streams from others in the community, the device even suggests routes based on the user’s calendar or daily routine. The device has become low touch and usable by anyone who can speak to it, with no other skills required. Real expert systems! It is almost certain that these intelligent navigation apps will be utilized by fully automated self-driving cars that can take a person from point A to point B without any input from the user. Machine learning at its best!
Bringing Machine Learning to the Mainframe
As it relates to the intelligent app mentioned at the beginning of this blog, CA Technologies has delivered a next-generation intelligence platform focused on enterprise systems and database performance monitoring, from mobile to mainframe. We are designing solutions that address the needs of users ranging from domain experts to recent grads, with an intelligence platform that helps users of enterprise mission-critical applications make better decisions faster. These solutions guide users through a similar evolution cycle: experts utilize easily accessible reports to understand what just happened, specialists are provided simple data discovery tools to also identify why it happened, and junior staff are guided to a resolution – with the entire system designed to facilitate easy collaboration across the operations team. We believe the ideal end state is zero-touch applications, where performance monitors are intelligent enough to respond to most events on their own. Just like self-driving cars, the path to such intelligent applications lies in every organization embracing analytics and evolving their plans to deploy “low touch”, self-sustaining solutions. We think it’s possible to make identifying the root cause of an active issue as easy as the notification from your phone informing you that if you don’t get on the road now, you’re going to miss that dinner appointment!
What do you think? I’d love for you to join the conversation!
To learn more, watch this webcast on Machine Learning for the Modern Software Factory.