
A Few Things to Know When You’re Moving to the Cloud
Every few years something comes along that shakes up the analytic industry. The current iteration of this cycle is “the cloud” and all the surrounding fervor that it will be the savior to us all. Unfortunately, time will move forward, and we will once again come to realize there is no silver bullet.  

One of my favorite definitions of the cloud is “it’s just someone else’s computer.” I know that over-simplifies all the cloud offers, but, as with all humor, it is funny because there is a grain of truth to it. At its essence, the cloud is basically someone else’s data center with a whole host of compute and storage. They manage the system integration and provide a good deal of the “care and feeding” so you can focus on your own business. Going to the cloud also allows a shift of corporate resources from the data center into more valuable efforts like customer-directed applications and deeper analytic offerings.

While the cloud offers some very good characteristics, there is a lot that is missing. The smart play is to make the move to the cloud with eyes wide open and leverage the good while remembering to tend to the gaps.

Good things the cloud provides

Probably the most notable benefit of the cloud offerings is the fast provisioning and elasticity of compute and storage needs.

In the past, when all platforms were stashed away in the corporate data center, it was a chore to get more compute and storage resources. Systems had to be ordered and delivered, space had to be found on the floor, equipment had to be installed and tested, and so on. In short, it was a constant struggle between having enough compute to meet peak demand and trying to minimize cost by not overbuying capacity before it was needed. The outcome was usually too much capacity, or more often not enough, which meant workload had to be delayed, prioritized, or discouraged.

Today, with cloud offerings, the pain of the past is gone. You can provision new compute or storage as you need, and even scale down when the peak is over. There are many billing options to pay only for what you use (though you need to exercise caution, as pay-per-use may lead to unexpected costs), and all the data center management issues have been off-loaded to the cloud providers.
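To see why the pay-per-use caution matters, consider a rough back-of-the-envelope comparison. The rates below are hypothetical, not real vendor pricing, and the billing model is deliberately simplified: on-demand bills only the hours you run, while committed capacity bills the full commitment whether used or not.

```python
# Hypothetical hourly rates -- illustrative only, not real vendor pricing
ON_DEMAND_RATE = 4.00   # $ per node-hour, pay-as-you-go
RESERVED_RATE = 2.50    # $ per node-hour, committed capacity

def monthly_cost(nodes, hours_used, rate, committed_hours=None):
    """Simplified monthly bill: on-demand charges only the hours used;
    committed capacity charges the full commitment regardless of use."""
    billable = committed_hours if committed_hours is not None else hours_used
    return nodes * billable * rate

# A bursty workload: 10 nodes, but only 200 of a month's 730 hours
print(monthly_cost(10, 200, ON_DEMAND_RATE))      # 8000.0  -- on-demand wins
print(monthly_cost(10, 200, RESERVED_RATE, 730))  # 18250.0 -- commitment wasted

# The same cluster running around the clock flips the comparison
print(monthly_cost(10, 730, ON_DEMAND_RATE))      # 29200.0 -- on-demand now costs more
print(monthly_cost(10, 730, RESERVED_RATE, 730))  # 18250.0
```

The point of the sketch: elasticity pays off for bursty workloads, but a steady 24x7 workload on on-demand pricing can quietly cost more than committed capacity — hence the caution about unexpected costs.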

The cloud also brings a slew of positives in the application and services space that are part of the cloud architecture. Just a few examples:

•    Object storage (S3, Blobs),
•    Streaming and ETL tools (Glue, Data Factory, Pub/Sub),
•    BI or reporting (Quicksight, Power BI, Looker),
•    AI or advanced data science (SageMaker, Databricks), and
•    Other system management, connectivity, security options.

All of these will come with their own pros and cons, which are beyond the scope of this blog post, but remember cloud providers are a business, not a charity. The goal behind getting companies onto their cloud is to not only provide a platform service, but to then “extend the spend” with these other offerings. Bottom line is that if companies intend to leverage these applications and services, they will need to either migrate over existing processes or create them from the start. This brings us to our next point… 

Good things the cloud does not provide (and you will still need)

But the cloud is still a platform; it is not a solution. You still need to worry about all the boring things like data management, data integration, optimized application development, and overall system and workload governance.

These are not solely technical issues but more process and people issues. The cloud does nothing to address these items.

Many are moving to the cloud as a part of the “modern analytic architecture,” and it covers much of the modern part, but what about the part that matters most, namely analytic capability? To drive enterprise analytics, you need a strong analytic engine that allows for robust data models; complex, integrated operations within single queries; and advanced functions such as pathing, sessionizing, time series, and predictive modeling. You also need to handle data reusability for high concurrency with diverse workloads. Not all users need the same answers, and there are great differences in service expectations, from sub-second operational requests to hour-long queries that determine and drive strategy.
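To make one of those advanced functions concrete, here is a minimal sketch of sessionizing — grouping a user’s time-ordered events into sessions whenever the gap between consecutive events exceeds a threshold. An analytic engine does this at scale inside a query; this is only a toy illustration of the logic, with a 30-minute gap as an assumed default.

```python
from datetime import datetime, timedelta

def sessionize(events, gap_minutes=30):
    """Split time-ordered event timestamps into sessions.

    A new session starts whenever the gap between consecutive
    events exceeds gap_minutes (30 minutes is a common default).
    """
    gap = timedelta(minutes=gap_minutes)
    sessions = []
    current = []
    for ts in sorted(events):
        # Close the current session if the idle gap is too long
        if current and ts - current[-1] > gap:
            sessions.append(current)
            current = []
        current.append(ts)
    if current:
        sessions.append(current)
    return sessions

# Four clicks with a two-hour gap after the second one -> two sessions
clicks = [
    datetime(2021, 6, 1, 9, 0),
    datetime(2021, 6, 1, 9, 10),
    datetime(2021, 6, 1, 11, 15),
    datetime(2021, 6, 1, 11, 20),
]
print(len(sessionize(clicks)))  # 2
```

In a real engine the same idea runs per user across billions of rows in a single query, which is exactly the kind of capability the platform alone does not give you.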

On its own, the cloud does none of that; again, it is the platform. If you already have an on-premises system providing your analytics, are you going to be able to make the move to the cloud with that engine, or will you be starting from scratch with a new analytic engine? That is a decision with lots of negative consequences and possibly years of migration effort, especially if you need to maintain existing business processes and analytics.

Even with a good analytic engine, there are many other considerations and actions that lead to success in business, and that is what having an analytic ecosystem is all about, driving your business better. A solid analytic ecosystem demands not only the platform but also:

•    Data Leadership (Why it is important)
•    Data Governance (What is important)
•    Data Architecture (How to achieve it)

Check out other blogs for each of these topics, but to illustrate how the three items fit together, let’s look at data reusability. Reuse of data across the enterprise provides a full view of the business (the why) and minimizes cost and complexity of the environment (what’s important). The “how” you achieve that is critical: will you create silos and constantly propagate the same data in many places, leading to data drift and confusion, or will you have an architecture that separates compute and storage to optimize costs while allowing cross-platform analytics to occur transparently? While the cloud can help in the execution, you still need to make and manage all these decisions.

Vantage – Your low-risk fast path to the cloud

Utilization of the cloud as part of an overall modernization strategy needs to be well thought out and driven by the value it brings to the business. Whether that value is increased capability provided by cloud tool sets, increased business agility provided by the elasticity of cloud compute and storage, or cost savings provided by taking advantage of high-volume, low-cost object storage, having a clear understanding of what benefits you expect to see and what the true cost of that benefit will be is critical.

The cloud is just another tool/capability in your technology toolbox and if used properly can bring great value to your business, but if deployed without a strategy or clear, measurable goals it can very easily spiral out of control from a cost and complexity perspective.

Current Teradata customers can easily delve into the cloud and start testing out some of the new capabilities via Vantage with very little risk. Since Vantage on-premises is the same product as Vantage in the cloud, there is very little to no migration effort, and if the expected value is not realized it is easy to move back to your on-prem environment. Also, data integration capabilities via QueryGrid and Native Object Store (NOS) will even allow you to start taking advantage of cloud-based object stores without any need to migrate.

Teradata Vantage is a low risk, fast path to the cloud, and Teradata has the business and ecosystem consulting to help make that move with the best value for your business.

Rob Armstrong

Starting with Teradata in 1987, Rob Armstrong has contributed in virtually every aspect of the data warehouse and analytical processing arenas. Rob’s work in the computer industry has been dedicated to data-driven business improvement and more effective business decisions and execution.  Roles have encompassed the design, justification, implementation and evolution of enterprise data warehouses.

In his current role, Rob continues the Teradata tradition of integrating data and enabling end-user access for true self-driven analysis and data-driven actions. Increasingly, he incorporates the world of non-traditional “big data” into the analytical process.  He also has expanded the technology environment beyond the on-premises data center to include the world of public and private clouds to create a total analytic ecosystem.

Rob earned a B.A. degree in Management Science with an emphasis in mathematics and relational theory at the University of California, San Diego. He resides and works from San Diego.
