
Are You Getting Real Value from Analytics? Can You Prove It?

IDC estimates that big data and advanced analytics will be a $200 billion market by 2020[i].  For a third of the Fortune 500, revenue from ‘data-driven’ products will double revenue from traditional products, according to New Vantage Partners’ latest executive survey.  Fifty percent of the world’s GDP will be digitized within the next three years[ii].  The economy seems fully dependent on data-driven innovation, and companies are investing at breakneck speed.

So why do 85% of the big data programs being invested in never make it into production?[iii]

There is a disconnect here.  Analysts provide plenty of good data points on the size of the market in aggregate, but guidance on the returns generated by specific analytics projects or programs is much harder to find.  That lack of quantifiability holds companies back from investing in the very real promise of big data and advanced analytics.  However real the potential seems, investment decisions are ultimately made on measurable return.  And the only way to really measure return, whether an increase in revenue, a decrease in fraud or an improvement in customer satisfaction, is to deploy in production and actually measure it!

Quantifying Means Going All the Way

Why, then, with so much promise, are so few big data programs ever implemented?  Because going from theoretical modeling into production is a lot harder than it sounds.  Let’s use a predictive customer attrition model as an example:

Each challenge below pairs a description with how it plays out for that attrition model.

1) Integrating with Production Systems
Description: Modeling in the lab is relatively easy with a static, offline dataset.  Actually deploying a model into production requires some combination of APIs, ETL, data governance, documentation and a lot of testing, not to mention approvals and internal controls.
Example: You have a model that predicts which profitable customers are at risk of leaving your institution; every month it scores the customer population and surfaces the 100 most at-risk customers.  So now what?  The only way to determine whether the model saves customers from leaving is to develop treatment plans for those at-risk customers and try them.  Treatment plans must be integrated into CSR desktops, web and mobile workflows, and other interaction channels.  Outbound offers must be generated and extended.  None of this can happen without system integration.  (A minimal scoring sketch follows this list.)

2) Changing Production Processes
Description: Organizations can update systems, but even with advances in machine learning and robotic process automation, changes to processes and procedures are still required.  Call center representative training manuals, customer-facing documentation, email templates and timing, and IVR menus are just a few examples.
Example: In this case, in addition to any APIs and system integration, customer- and agent-facing processes may also need to change.  To select and deliver a retention offer, agents may need new call scripts or decision models, and fulfilling those offers will need to be integrated with management and regulatory reporting processes.

3) Developing a Rigorous Test & Learn Plan
Description: Credit card issuers used to (and in many cases still do) send massive solicitations through the mail and rely on minuscule response rates to bring in new customers.  These companies are masters of test, learn and tweak: which offers, APRs, even envelope sizes and shapes work in what combination to lift the response rate higher.  Testing a new analytics model is similar; companies must specify success criteria, measure against them and have a plan to improve.
Example: Here, your model has generated the 100 customers most at risk of leaving your institution.  To save those customers, an institution should develop A/B tests with several test paths; depending on complexity, these tests could involve multiple channels or offers.  Organizations that take this step apply a rigorous methodology to both developing and monitoring production tests, and are then able to revise production offers (and the modeling behind them) accordingly.  (See the test-and-learn sketch after this list.)

4) Deploying ‘Analytics Governance’
Description: Organizations active in the big data space know all about data governance, but ‘analytics governance’, meaning an analytics deployment capability, becomes hugely important when operationalizing analytics.  It means thinking through and answering questions such as: Who controls the data science budget?  What do those data scientists and data analysts do?  When and how do they redeploy from model creation to maintenance?  When do they shift focus from an existing model to a new one?  How can stakeholders be convinced of the validity of model output?  The appropriate organizational governance model can answer those questions, along with a host of others.
Example: Analytics governance is a critical, and often overlooked, component of getting a return on your analytics investment.  Essentially, it is a coordinated process to ensure operational control over the analytics process from beginning to end.  Analytics governance ‘councils’ provide an essential organizational overlay on resource allocation and data source selection for analytics use cases, and this council function is essential to successful organizational adoption and integration of analytics.  More on how these councils should be formed, staffed and operated in a subsequent post…

5) Change & Culture Management
Description: The fifth reason productionalization stalls is a failure to recognize and address that true transformation brings massive change to an organization’s culture, up to and including people’s roles, responsibilities and career trajectories.  Affected employees dig in their heels and resist change that feels threatening.  Change and culture management must be considered for any large, big-data-driven analytics project to succeed.
Example: Machine learning and AI, not to mention ‘merely’ advanced statistical modeling, promise generational transformation.  Even a small component of that can bring the fear of more change to come.  Beyond analytics governance, organizations adopting even modest changes should mount a broader change management effort.  Workforce stress and resistance to change can stop even the most advanced, automated modeling dead in its tracks.

Hard as it is to believe, the advanced analytics and data science may, at the end of the day, be the relatively easy part compared to deploying models in production.  Organizations have focused on gathering data and coming up to speed on new technologies for analyzing it for insights, and many companies do this very well.  It’s now time to expand that focus to include an institutional capability to deploy, evaluate, measure and learn from those models in production.  Without that capability, any investment in advanced analytics risks generating insight, but limited return.

—-

[i] IDC Futurescapes 2017

[ii] New Vantage Partners Big Data Executive Survey 2017

[iii] https://www.gartner.com/newsroom/id/3466117
