Curiosity never killed the analytical cat

Advanced analytics is the key to competitive advantage today. Doing it cost-effectively and at scale, however, requires a new approach along three dimensions:

  1. Staffing for Advanced Analytics at Scale
  2. Learning how to “fail fast”
  3. Turning analytics into action

Let’s take a closer look at each of these dimensions to see how they can make — or break — your business prospects when using analytics.

Staffing for Advanced Analytics at Scale

Staffing is perhaps the most complicated component. Advanced analytics usually requires help from highly compensated data scientists who have spent considerable time learning skills such as R, Python, and SAS for advanced data analytics. Furthermore, there is no single overarching description of what a data scientist is. Data scientists come in various personas that range across the business-technical continuum. That’s why staffing has to be nuanced enough to consider the contextual nature of their work and the right mix of skills.

To do advanced analytics at scale, then, there are two approaches from a staffing perspective: (A) hire lots of expensive data scientists, or (B) find a way to leverage people who have business acumen but not necessarily those rare data science skills.

The first approach is certainly not cheap. It cannot scale, either, because there is no abundant supply of data scientists to hire, even with unlimited resources. This becomes particularly hard in many geographies across the world, where building a critical mass of local data science talent is simply not possible.

The second approach focuses on the reasoning and logic that power analytics, without requiring advanced data skills of each and every business user. In practical terms, we can ask data scientists to research ways to solve business problems and then build accessible models that business users can leverage to solve those problems. With these models serving as templates inside purpose-built analytical apps, businesspeople (i.e., non-data scientists) can perform analytics on their own and test new parameters as business logic dictates.


This makes the analytics process more repeatable and stable, and one that need not be constrained by esoteric technical know-how. For example, a customer churn app that embeds a path analysis algorithm can be run each week from an app framework to show what drives customer behaviors over time. We can create similar apps for a wide variety of business use cases, from fraud detection and customer sentiment to manufacturing and supply chain optimization.
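To make the idea concrete, here is a minimal sketch of what the logic inside such a churn app might look like. This is not Teradata’s implementation; the event data, function name, and parameters are hypothetical, chosen only to illustrate how a data scientist’s path analysis can be wrapped behind a few business-friendly knobs.

```python
# A minimal sketch of a parameterized "churn app": count the most common
# event paths that precede churn, exposing only the knobs a business user
# would tune. All data and names here are hypothetical.
from collections import Counter

# Hypothetical event log: (customer_id, ordered events, churned?)
event_log = [
    ("c1", ["login", "billing_error", "support_call", "cancel"], True),
    ("c2", ["login", "purchase", "login", "purchase"], False),
    ("c3", ["billing_error", "support_call", "cancel"], True),
    ("c4", ["login", "billing_error", "support_call", "downgrade"], True),
]

def churn_paths(log, window=3, min_count=2):
    """Return event paths of length `window` that precede churn,
    keeping only paths seen at least `min_count` times."""
    paths = Counter(
        tuple(events[-window:]) for _, events, churned in log if churned
    )
    return {path: n for path, n in paths.items() if n >= min_count}

# A business user reruns the same analysis each week with new parameters:
print(churn_paths(event_log, window=3, min_count=2))
# -> {('billing_error', 'support_call', 'cancel'): 2}
```

The data scientist owns the path-counting logic; the business user only changes `window` and `min_count` as business questions evolve.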

Learning how to “Fail Fast”

We often hear that “failure is not an option.” The truth is, it is an option, provided you embrace a fail-fast approach. Businesspeople can develop hypotheses about what, for example, is driving customer churn and then try to prove or disprove those hypotheses quickly. The faster you disprove a hypothesis, the less time is wasted on it; and if you prove it, you’ve generated value. In fact, the greater the diversity of hypotheses, however outlandish some may be, the better the chances of arriving at the one insight that unlocks real growth for the business.
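To show how lightweight such a test can be, here is a minimal sketch that checks one hypothetical churn hypothesis with a chi-square test from SciPy. The counts, the 0.05 threshold, and the “billing error” hypothesis are illustrative assumptions, not a prescribed method.

```python
# A minimal sketch of "fail fast" hypothesis testing: quickly check whether
# a suspected churn driver is associated with churn, and discard the
# hypothesis if it is not. All counts below are hypothetical.
from scipy.stats import chi2_contingency

# Hypothesis: customers who hit a billing error churn more often.
# Rows: [billing error, no billing error]; columns: [churned, retained].
observed = [[40, 60],
            [25, 175]]

chi2, p_value, dof, expected = chi2_contingency(observed)
if p_value < 0.05:
    print(f"Keep digging: association found (p={p_value:.4f})")
else:
    print(f"Fail fast: no evidence; try the next hypothesis (p={p_value:.4f})")
```

A disproved hypothesis costs minutes here, not weeks, which is what makes a large, diverse portfolio of hypotheses affordable.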

Turning Analytics into Action

Affordable analytics at scale also requires having the right architecture to turn insight into action. I’m speaking here of functional architecture: what it contributes to the analytic process, not necessarily how to build it. I see four parts to such a functional architecture:

  1. Gather and ingest the data
  2. Explore the data and gain insights
  3. Operationalize the insights
  4. Keep it fresh

Data lake architectures based on Hadoop enable you to take in massive amounts of data at scale, tens or even hundreds of petabytes, and to store that data cheaply.
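As a rough illustration, the ingest step can be as simple as landing raw data in the lake untouched and deferring schema decisions until exploration time. The sketch below assumes a PySpark environment and HDFS-backed storage; the paths and dataset are hypothetical.

```python
# A minimal sketch of the ingest step, assuming PySpark and an
# HDFS-backed data lake; paths and data are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("ingest_events").getOrCreate()

# Land raw event data as-is; cheap storage means we keep everything
# and decide on structure later, at exploration time.
raw = spark.read.json("hdfs:///landing/clickstream/2024-06-01/")
raw.write.mode("append").parquet("hdfs:///lake/raw/clickstream/")
```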

After gathering the data, the next move is to explore, test, and analyze it. For example, you might run an analysis to figure out which four or five events in a customer’s journey lead to churn.
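One simple form this exploration can take is ranking candidate events by the churn rate of the customers who experienced them. The sketch below uses pandas on a tiny hypothetical event table; real exploration would run at data-lake scale.

```python
# A minimal sketch of the exploration step: rank events by how strongly
# they separate churners from retained customers. Data is hypothetical.
import pandas as pd

events = pd.DataFrame({
    "customer_id": ["c1", "c1", "c2", "c3", "c3", "c4"],
    "event":       ["billing_error", "support_call", "purchase",
                    "billing_error", "support_call", "purchase"],
    "churned":     [True, True, False, True, True, False],
})

# Churn rate among customers who experienced each event:
churn_rate = events.groupby("event")["churned"].mean().sort_values(ascending=False)
print(churn_rate)  # billing_error and support_call top the list
```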

After you determine the events that lead to churn, you operationalize the insights you’ve gained by building them into your business logic and workflows. If Ted, a valuable customer, engages in two actions that map to the churn model, perhaps the best course is to intervene with an offer before he reaches the third event on the journey and the likelihood of churn rises.
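In code, that intervention can be a simple rule evaluated whenever a customer generates a new event. Everything in the sketch below is hypothetical: the churn path, the value threshold, and the action name are placeholders for whatever your model and offer engine define.

```python
# A minimal sketch of operationalizing the insight: when a valuable
# customer matches the first two events of a known churn path, trigger
# a retention offer before the third event can occur. Names are hypothetical.
CHURN_PATH = ("billing_error", "support_call", "cancel")

def next_action(customer_events, customer_value):
    """Fire a retention offer once the first two churn-path steps match."""
    matched = tuple(e for e in customer_events if e in CHURN_PATH)
    if customer_value > 1000 and matched[:2] == CHURN_PATH[:2]:
        return "send_retention_offer"
    return "no_action"

# Ted has hit a billing error and called support; intervene now.
print(next_action(["login", "billing_error", "support_call"], customer_value=5000))
# -> send_retention_offer
```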

Once insights are operationalized, it is important to close the model feedback loop. The models you build usually change over time: the factors that cause churn this year may not hold a year from now as market and other environmental conditions change. Test your hypotheses to see whether the model still holds or needs adjustment. For example, as new forms of customer interaction (e.g., channels, pricing options) are introduced, the churn driver variables may change as well.
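A minimal way to close that loop is to re-score the deployed model on fresh outcomes and flag it for retraining when performance falls below a floor. The sketch below uses scikit-learn with tiny hypothetical data in which the churn driver shifts from billing errors to price hikes, so last year’s model degrades.

```python
# A minimal sketch of the model feedback loop: evaluate the deployed
# churn model on recent data and flag drift. Data and the 0.70 AUC
# floor are hypothetical.
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Last year: billing errors drove churn. Features: [billing_error, price_hike]
X_old = [[1, 0], [1, 0], [0, 1], [0, 0]]
y_old = [1, 1, 0, 0]
model = LogisticRegression().fit(X_old, y_old)

# This year: price hikes drive churn instead.
X_new = [[0, 1], [0, 1], [1, 0], [0, 0]]
y_new = [1, 1, 0, 0]

auc = roc_auc_score(y_new, model.predict_proba(X_new)[:, 1])
if auc < 0.70:
    print(f"Churn drivers have shifted (AUC={auc:.2f}); retrain the model.")
```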

Many different advanced analytics techniques should become part of your secret sauce. In my many years of doing this, I have found that analytics is best done with an ensemble approach; I call this multi-genre analytics. Doing this kind of impactful analytics, and more importantly operationalizing the insights at scale, requires the smart application of scarce analytic resources. The point is to use all these resources effectively: data scientists and the many people with deep business domain knowledge. Furthermore, creating a culture of failing fast and encouraging experimentation, within reason, is a reliable way to make rapid progress.

Finally, developing the in-house vernacular and tools that make everyone invested in the analytics process is a sure winner. After all, demystifying analytics and helping the business execute on insights is good for everyone in the organization and for customers, and it sharpens your competitive edge in the marketplace. To mercilessly flog a cliché: this is a win-win for all.


Sri Raghavan

Sri Raghavan is a Senior Global Product Marketing Manager at Teradata, working in the big data area with responsibility for the Aster Analytics solution and all ecosystem partner integrations with Aster. Sri has more than 20 years of experience in advanced analytics and has held various senior data science and analytics roles in investment banking, finance, healthcare and pharmaceutical, government, and application performance management (APM) practices. He has two Master’s degrees, in Quantitative Economics and International Relations respectively, from Temple University, PA, and completed his doctoral coursework in business at the University of Wisconsin-Madison. Sri is passionate about reading fiction and playing music and often likes to infuse his professional work with references to classic rock lyrics, AC/DC excluded.
