BUSINESS INSIGHTS

Aug 22, 2017

Why Companies are Failing to Utilize the Cloud for Analytics

Posted by Scott Faculak

Big organizations have multiple departments responsible for providing analytics, including marketing, sales, finance, supply chain, and logistics. Due to the nature of large companies, any analytical technology must come through IT. Many organizations are trying to take advantage of the cost savings and speed of analysis that Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) can provide. However, the common IT approach hinders, or completely blocks, the potential of cloud services for analytical business units outside of IT.


Having witnessed this situation first in a corporate environment and again as a consultant, I can say this problem is becoming ubiquitous in the Fortune 500 world. Analytics teams come in with business cases, and instead of changing processes and adjusting governance, IT tends to charge down the old path: creating a requisition and trying to be agile within a non-agile process.

The next issue is scaling: because analytics projects vary greatly in size and scope, provisioning a one-size-fits-all compute engine is simply not possible. The ability to spin up a project-specific cluster and tear it down when the work is done is where the cloud shines, but not when IT must create yet another requisition to dismantle the cluster.

One example comes from working with a manufacturing company attempting to build sensor analysis across its network of plants. The process monitored the sensor output of multiple machines on multiple lines, looking for outages and excessive cycle times. To spin up the appropriate infrastructure, the U.S.-based team had to submit a requisition to the Poland-based team to provision the hardware, a process that could take a week or more before any work could begin. Similarly, any change to the design of the infrastructure meant starting the process over from the beginning.

From personal experience: as then-VP of Data Warehouse and CRM, I was confronted with transitioning from a data warehouse appliance to an on-premises Hadoop solution because the company was not yet comfortable with cloud-based solutions. With the existing appliance, I was told that in order to double my performance, the vendor would have to remove my half-rack appliance and bring in a full rack, requiring downtime for the infrastructure change and the data migration. The Hadoop solution did provide flexibility for future growth by simply adding nodes; however, there was still lead time required to expand, and very little opportunity to scale back if needed. Another sticking point was the necessity of building a disaster recovery cluster in a colocation facility, given the bandwidth required and the cost of duplicating the cluster.

In hindsight, a cloud-based solution that can scale up or down based on need, and that includes geographic redundancy, would have been a much better choice. Plus, with multiple technology options available, from a traditional relational database management system to a data warehouse appliance to HDInsight (Hadoop), analysts can easily transition their skill sets and choose the best option for the workload.

Luckily, IaaS/PaaS providers have taken note and started offering solutions that make provisioning a thing of the past. Microsoft's latest cloud-based solution, Azure Data Lake Analytics, addresses this by offering a scalable, always-available compute engine.

“Process big data jobs in seconds with Azure Data Lake Analytics. There is no infrastructure to worry about because there are no servers, virtual machines, or clusters to wait for, manage, or tune. Instantly scale the processing power, measured in Azure Data Lake Analytics Units (AU), from one to thousands for each job. You only pay for the processing that you use per job.” Data Lake Analytics. (n.d.). Retrieved August 8, 2017, from https://azure.microsoft.com/en-us/services/data-lake-analytics
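To make the per-job model concrete, here is a minimal sketch of what submitting work to Azure Data Lake Analytics looks like with the Azure CLI. The account name, job name, and script file below are hypothetical placeholders, and the parallelism value is illustrative; the point is that compute is requested per job rather than provisioned in advance.

```shell
# Sketch only: the account, job name, and U-SQL script are hypothetical.
# Submit a U-SQL job, choosing the compute allocation (AUs) for this job
# alone via --degree-of-parallelism. There is no cluster to provision
# first, and billing stops when the job finishes.
az dla job submit \
    --account mydatalakeaccount \
    --job-name "sensor-cycle-time" \
    --script @sensor_analysis.usql \
    --degree-of-parallelism 32

# Review recent jobs afterward; there is nothing to tear down.
az dla job list --account mydatalakeaccount --top 5
```

Contrast this with the requisition cycle described above: the "spin up, run, tear down" lifecycle collapses into a single job submission.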

Recently, data warehouse appliance vendors have changed direction, turning away from selling more units and declaring on-premises appliances end-of-life, making the transition to the cloud a requirement. So, if you find yourself in a position to rethink your infrastructure and want to take advantage of redundancy, scalability, and technical flexibility, I highly suggest looking to the Azure cloud.

The cloud brings tremendous flexibility and speed to analytics, but a change of process and governance is required to make it effective in large organizations. Change is good, however, when it positively affects the company's bottom line and provides solutions for evolving workloads that simply are not possible otherwise. Combining both service offerings yields, in effect, Architecture as a Service and, with the right controls in place, a limitless canvas for current workloads and those yet to be defined. Don't fear the cloud. Prepare, adapt, and reap the rewards.

If you are looking for help with transitioning to the cloud and building a successful governance strategy, contact us!


About The Author

Scott Faculak

Scott Faculak is a recognized technology leader engaged in next-generation, big data Hadoop solutions that ensure best-in-class business intelligence, analytics, and operational reporting. He is a strategic visionary, leading data architecture and solution development efforts. An analytics solution provider with over 15 years of business intelligence practice, he is adept at maximizing financial, operational, and marketing competencies across multiple industries. He effectively leads teams of business intelligence developers, analysts, project managers, data engineers, and support staff, consistently exceeding corporate goals, initiatives, and expectations.