I remember when I was a resources analyst at a big company that features eyes and bees. One day my boss asked me why our direct expense went up to the tune of 20% quarter-over-quarter. I was stumped! That was a huge increase, and we didn’t have a standard report that immediately surfaced the answer.
Think back to the last time that you were asked a question where you didn’t know the answer right away. How long did it take you to answer that question?
End-to-end process is unnecessarily long
Beyond superficial, specific quantitative questions, answers to complex questions invariably take time to find. But why is this so? Depending on the complexity and severity of the question, it can take hours to weeks for an answer to come together. My boss’s question sounds innocuous, but it was a particularly complex one due to the nature of the analytics structures I knew about.
Why? Process is complex
You see, this organization that I worked for had hundreds of different types of employees across 175 countries and 5 business units (at the time), responsible for hundreds of thousands of customers and billions in revenue. How was I supposed to answer a question that spanned so many dimensions and data points? Long turnaround times don’t exist because anyone is bad at their job – they are due to the fragmented nature of the analytics process. As a resources analyst I used Essbase, TM1, Cognos and wrote SELECT statements against a MySQL database to try to piece together answers.
Purpose-built tools result in analytics silos
From reporting cubes to dashboards to data science tools, disparate BI tools came of age to support specific use cases. As companies try to piece together answers to complex questions that span these disparate systems, (former) analysts like me have to traverse them – moving data, connecting it, cleaning it, modeling it and synthesizing an answer from all of this work. This journey is what drives the unreasonably long time it takes to extract insights that answer your questions.
So – why did direct expense go up?
My journey went something like this:
- First, I checked Essbase – direct expense did, in fact, increase.
- I checked across every dimension available in Essbase – but all I saw was that Commissionable Expense went up. Great, so we paid more in commissions.
- I checked TM1 – within Commissionable Expense in TM1, I found the expense type associated with the increase.
- I sent a coworker a request – could they write a query that returned expense by serial number for this particular expense type? (Something like the sketch after this list.)
- Only after getting this report – days later – did I realize that a particular job role drove a 20% increase.
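For what it’s worth, the request itself was a simple grouped aggregation. Here’s a minimal sketch of that kind of query in Python against a local stand-in database – every table and column name is invented, since the real schema isn’t shown:

```python
import sqlite3  # local stand-in; the real query ran against MySQL

# Hypothetical schema (the real one isn't shown):
#   expense_lines(serial_number, expense_type, amount)
conn = sqlite3.connect("expenses.db")

query = """
    SELECT serial_number, SUM(amount) AS total_expense
    FROM expense_lines
    WHERE expense_type = ?      -- the type isolated in TM1 (name invented here)
    GROUP BY serial_number
    ORDER BY total_expense DESC
"""

for serial_number, total_expense in conn.execute(query, ("Sales Commission",)):
    print(serial_number, total_expense)
```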
Look familiar?
Generally, finding an answer follows that same flow: check one tool, find a partial answer, pivot to the next, hand off a request, and wait.
This is the state of business intelligence today. Questions are asked so that decisions can be made, but answers are either not readily available or paint an incomplete picture. The data we’re working with is broad, but insights from that data remain accessible only through a complex process.
How can I make it easier to find why something is changing?
Artificial Intelligence
What even is AI?
Emerging technologies are helping to automate analysis and extend access to analytics. Artificial intelligence is the systematic application of machine learning. Early machine learning required technical skill and understanding to apply successfully. Artificial intelligence puts a framework around machine learning, making it accessible to everyday users. AutoML tools, both open source and commercial, appeared with much fanfare and gained popularity in the mid-2010s.
Hasn’t this already been done with AutoML?
AutoML tools provide automation in the context of traditional ML techniques – they make it easier to create, run and manage regression models. However, at the end of the day you are still creating models, testing hypotheses and thinking in the mindset of machine learning. Combined with classic visualization tools, they have become the flavor of choice for organizations trying to pull meaningful insights out of their data. Over time, artificial intelligence frameworks have improved to the point that they can support the entire traditional analytics cycle. Instead of automating the technical process of creating machine learning models, the analytics process itself is now automated, supported by completely autonomous machine learning models.
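To make the distinction concrete, here’s a minimal sketch of the kind of loop AutoML tools wrap for you: fit several candidate regression models, score each, keep the best. The data is invented and the candidates are arbitrary – the point is that you’re still hand-managing models and hypotheses:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

# Invented data: predict an expense figure from a few numeric drivers
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = X @ np.array([2.0, 0.5, 0.0, 1.0]) + rng.normal(scale=0.5, size=500)

# The loop AutoML automates: fit candidate models, score them, keep the best
candidates = {
    "linear": LinearRegression(),
    "tree": DecisionTreeRegressor(max_depth=5),
    "forest": RandomForestRegressor(n_estimators=100),
}
scores = {name: cross_val_score(m, X, y, cv=5).mean() for name, m in candidates.items()}
best = max(scores, key=scores.get)
print(best, scores[best])
```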
Decision intelligence = the analytics process, automated
This new type of platform is called decision intelligence. It can apply automated machine learning and statistical models to identify answers to questions in a fraction of the time of the typical analytics process. Instead of an analyst combing through dimensions and waiting on a data scientist to train models such as ARIMA, random forests and decision trees – decision intelligence platforms now exist that expose answers instantly to end users, while taking care of the computational legwork in the background.
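As a toy illustration of the underlying idea (not any particular vendor’s implementation): fit a shallow decision tree on expense records and read off which dimension explains the variance. This is roughly how a driver like “job role” can surface without an analyst combing through dimensions by hand. All data and column names below are invented:

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

# Invented expense records; in my old job the telling dimension was job role
df = pd.DataFrame({
    "job_role":      ["sales", "sales", "support", "support", "engineer", "engineer"] * 50,
    "business_unit": ["a", "b", "a", "b", "a", "b"] * 50,
    "expense":       [120, 125, 80, 82, 60, 61] * 50,
})

# One-hot encode the dimensions, then let a shallow tree rank them
X = pd.get_dummies(df[["job_role", "business_unit"]])
tree = DecisionTreeRegressor(max_depth=3).fit(X, df["expense"])

# Feature importances point at the dimension driving the variance
drivers = sorted(zip(X.columns, tree.feature_importances_), key=lambda t: -t[1])
for name, importance in drivers[:3]:
    print(f"{name}: {importance:.2f}")
```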
Think about the before and after. Before: fragmented, independent tools that were automated only within themselves. Today: platforms that can provide insights into data at scale in seconds by automating the individual, siloed processes of the past.
Natural Language Interface
NLP has come a long way
The second major enhancement of the recent past is natural language processing as an interface. I recall using voice-to-text on my phone circa 2012 for texting – and being disappointed with the results. Today’s natural language processing doesn’t just transcribe speech to text – it can contextualize what is being written and respond accordingly. Whereas current and legacy tools require an arcane, hokey interface littered with nested menus and pseudo-code, NLP provides an interface that removes those barriers.
Not only does NLP replace menus and code – it goes a step further by taking advantage of concepts unique to language. The following are all now possible with natural language processing interfaces (a toy sketch of one follows the list):
- Autocorrection
- Query suggestions
- Intent Recognition
- AI-driven personalization based on a user’s historical syntax
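Here’s that toy sketch of one of these, intent recognition: mapping a free-text question to a metric and an analysis type. Production NLP interfaces use trained models rather than keyword rules, and every name below is invented:

```python
# Toy intent recognizer: keyword rules standing in for a trained model
METRICS = {"expense": "direct_expense", "revenue": "total_revenue"}
INTENTS = {
    "why": "driver_analysis",    # "why did X go up?" -> find drivers
    "trend": "time_series",      # "trend of X" -> plot over time
}

def recognize(question: str) -> dict:
    words = question.lower().split()
    metric = next((m for w in words for k, m in METRICS.items() if k in w), None)
    intent = next((i for w in words for k, i in INTENTS.items() if k in w), "simple_lookup")
    return {"metric": metric, "intent": intent}

print(recognize("Why did direct expense go up this quarter?"))
# -> {'metric': 'direct_expense', 'intent': 'driver_analysis'}
```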
Finally, the barriers to entry for adopting NLP are lower than ever. Legacy applications with an NLP interface still required significant setup work, from metadata mapping to creating complex data dictionaries. Modern NLP takes advantage of advanced computing frameworks that automate metadata mapping, index every item in every column and allow users to start asking questions from minute one.
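A minimal sketch of what that value indexing looks like in principle – build an inverted index from every distinct value to the column it lives in, so a phrase in a question resolves straight to a filter. The table and values here are invented:

```python
from collections import defaultdict

# Invented table: column name -> distinct values in that column
table = {
    "customer_name": ["Acme Corp", "Globex"],
    "country": ["France", "Japan"],
}

# Inverted index: every value points back at the column(s) containing it,
# so "acme corp" in a question resolves to customer_name = 'Acme Corp'
value_index = defaultdict(list)
for column, values in table.items():
    for value in values:
        value_index[value.lower()].append(column)

print(value_index["acme corp"])  # -> ['customer_name']
```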
The benefit of NLP is that users can truly take advantage of the data presented to them, because they can access it through a medium they already know. This further reduces the time to answer a question – every answer is just a question away.
It’s only hard because you make it hard
The analytics landscape came of age with purpose-built tools that supported narrow use cases. Data preparation, visualization and machine learning all grew up separately from each other, serving unique purposes. The maturation of AI and NLP allows these processes to be automated and exposed to end users as a turnkey service – but most organizations are still living in the same time period as I was as a resources analyst.
Real value comes from acting faster. To act faster, you’ve got to iterate on your question-and-answer cycles quickly. When self-service NLP-driven analytics and automated AI live together, they replace the analytics process that disparate tools serve today – and do it in a fraction of the time, compressing the lifecycle of a question by removing handoffs and exposing insights that were previously difficult and time-consuming to find.
If I could go back in time, I’d tell the resources analyst version of myself that it wouldn’t always be that way. Every question wouldn’t take days, and one day I could answer them on my own. That’s what I’m doing today – bringing the future to the past, one day at a time.
To learn more about how Tellius is changing the world, check out our website and try decision intelligence for yourself.