Survey Reveals One Third of Businesses are Exceeding Their Cloud Budgets by as Much as 40 Percent

Pepperdata, the leader in Analytics Stack Performance, announced the results of a new survey of 750 senior enterprise IT professionals in industries ranging from finance and healthcare to automotive, advertising, and other data-intensive businesses. Key findings include that more than one-third of businesses have cloud budget overruns of up to 40 percent, and that one in 12 companies exceeds even that figure.

The survey was conducted to better understand how organizations run their big data applications and workloads in the cloud. Companies polled ranged in size from 500 to more than 5,000 employees and spent from $500,000 to more than $10 million on big data analytics.

The shift to cloud computing is solidly underway. While the cloud offers both the benefits of a pay-as-you-go model and the ability to scale elastically on demand, enterprises almost universally see a rise in costs. This is because IT often lacks sufficient visibility into cloud performance and does not have the tools to optimize applications.

Key findings from the survey include:

  • For 64% of respondents, “cost management and containment” is their biggest concern with running cloud big data technologies and applications.
  • A majority of respondents said the desire to “better optimize current cloud resources” was their highest priority big data cloud initiative.
  • In 2020, for one in three respondents, cloud spend was projected to be over budget by between 20 percent and 40 percent.
  • One in 12 respondents said their cloud spend was expected to be over budget by more than 40 percent.

“This research shows us the importance of visibility into big data workloads. It also highlights the need for automated optimization as a means to control runaway costs,” said Ash Munshi, CEO, Pepperdata. “Significantly more than half the companies surveyed indicate lack of control on cloud spend, making optimization and visibility the keys to getting costs under control.”

Other findings included:

  • Types of cloud: 47 percent of respondents are using a private cloud; 21 percent are in the public cloud; 28 percent are using a combination of both.
  • Biggest concerns when running big data applications in the cloud: 39 percent of respondents cited cost management and containment; 33 percent said increased complexity; 14 percent said CapEx to OpEx; 13 percent said a lack of control.
  • When asked how budgets were impacted by the move to the cloud, 40 percent stated that budgets stayed the same and resided with IT; 31 percent said budgets became shared, with IT administering them; 30 percent said budgets moved into one or multiple business units.
  • When asked how their enterprise measures application/workload performance based on their cloud instance, 30 percent said they were using an application performance monitoring solution; 28 percent said they used cloud provider tools; almost 20 percent said a homegrown solution; 17 percent said an observability solution for insights, alerts and automation; five percent said they don’t monitor at all.
  • Moving to the cloud often means confusion as to which part of the organization owns support and troubleshooting for big data applications: 44 percent said they had a shared support model between ITOps and business units/LOB developers; 35 percent said support stays with ITOps; 22 percent said the dev organization within the business units owns this.
  • Spend: Asked to estimate what they will spend this year on big data analytics in the cloud, 34 percent said between $500,000 and $1M; 26 percent said between $1M and $2M; 15 percent said between $2M and $10M; seven percent said more than $10M.
  • Budgets: For 2020, 44 percent said they’d be on budget; 35 percent expected to exceed budget by 20 to 40 percent; eight percent of respondents believed they’d exceed budget by more than 40 percent.

Cloud optimization delivers big savings. According to Google, even minimal cloud optimization efforts can net a business as much as 10 percent savings per service within two weeks. Cloud services that are fully optimized and run for extended periods (over six weeks) can save more than 20 percent. Key findings surrounding cloud optimization were:

  • Top big data cloud initiatives included better optimizing current cloud resources and continuing to migrate workloads into the cloud. Lower-priority initiatives included increasing visibility into app and workload performance; expanding to other public cloud vendors; more use of containers; and finding a reliable replacement for Hadoop.
  • The types of applications and workloads that consumed the most resources, according to respondents, were: Hive, with 29 percent of the vote; Spark, at 27 percent; MapReduce, at 16 percent; Tez, at 11 percent; other applications, 18 percent.

To cut the waste out of IT operations processes and achieve true cloud optimization, enterprises need observability and automated tuning. This requires machine learning and a unified analytics stack performance platform. Such a setup equips IT operations teams with the cloud tools they need to keep their infrastructure running optimally, while minimizing spend.

View the full Pepperdata Big Data Survey Report


About Author

Leigh Porter's first love is to love people. Beginning her career as a neonatal RN was an obvious choice, until life threw a curveball and she embarked on a new IT endeavor. Pursuing this fresh career was a piece of cake with her resilient and steadfast character. Outside of the office, Leigh also faithfully gives much of her time as a nationally awarded volunteer leader to an organization very dear to her heart.