
Simplify the AWS Cost and Usage Report: A Beginner's Guide

Part 1 of the series "Why zOpt?"



zOpt Cost Visualization

From the get-go, AWS used the analogy of a power utility (read: electricity) to describe its pay-as-you-go model. The analogy is apt (pun intended), since you get a single number as your electricity charge every month, with no further insight. That prevents the bill payer from knowing which appliances are contributing to the cost, and from finding any opportunity to cut it down.


For technology companies (SaaS, ISVs, startups), cloud cost is often the second-highest line item in their cost structure, and a significant contributor to their cost of revenue and gross margin.

Historical perspective:

AWS launched its first service, S3, in 2006, yet only launched Cost Explorer in 2014, and with bare-minimum features at that. It took almost eight years before AWS customers got any insight into their cloud costs or monthly bills, and even then Cost Explorer added features at a slower pace than AWS innovation and customer demand.


AWS had started providing its enterprise support customers a "Detailed Billing Report" ("DBR"), which was so complex that you needed a double PhD to decipher it. A few more years down the road, AWS offered the "Cost & Usage Report" (commonly called the "CUR"), which brought a better version of the DBR to non-enterprise-support customers.


Cost & Usage Report v1.0

CUR v1.0 was offered as a DBR variant. If the DBR needed a double PhD, CUR v1.0 needed a single PhD before you could decipher it and use it to your advantage.


CUR v1.0 was designed around a flexible schema. It was delivered as key-value-pair-style data in which a report could have 200+ columns with no standardization. For example, if your AWS resource tagging policy uses 100 tags, you could get 100 additional columns, one for each unique tag key, leaving the complexity of deciphering them to the user.
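To make the tag-column sprawl concrete, here is a minimal Python sketch, assuming you have already downloaded a CUR v1.0 CSV from S3 (the file name is hypothetical; the "resourceTags/" prefix is how CUR v1.0 names its per-tag columns):

```python
import pandas as pd

# Hypothetical local copy of a CUR v1.0 CSV export pulled from S3.
cur = pd.read_csv("cur-v1-20240101-20240201.csv", low_memory=False)

# CUR v1.0 emits one column per unique tag key, named like
# "resourceTags/user:<TagKey>", so the schema grows with your tagging policy.
tag_columns = [c for c in cur.columns if c.startswith("resourceTags/")]

print(f"{len(cur.columns)} total columns, {len(tag_columns)} of them tag columns")
print(tag_columns[:10])  # e.g. resourceTags/user:Environment, resourceTags/user:Team
```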


Let's imagine you are spending $5K/month on AWS; your CUR v1.0 could easily run to a million rows. The flexible schema and sheer bulk of CUR v1.0 made it hard for a typical customer to use, and gave rise to dozens of AWS cost observability companies, including Apptio (acquired by IBM), CloudHealth (acquired by VMware), RightScale (acquired by Flexera), Turbonomic (acquired by IBM), Minjar (acquired by Nutanix), and many more.


From 2014 to 2021, AWS customers were paying these cloud observability platform companies to get insights into their own AWS spend.

Cost & Usage Report v2.0

AWS started enriching Cost Explorer at a faster pace from 2020 onward, and announced CUR v2.0 at re:Invent 2023. The major CUR v2.0 improvements include a fixed schema (finally, phew...) and SQL query support. The details are in AWS's announcement blog.
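In practice, the SQL story usually means querying the delivered data with Athena. A minimal boto3 sketch, assuming you have already created an Athena table over your CUR v2.0 export; the table, database, and bucket names are hypothetical, and the snake_case column names follow the published CUR 2.0 schema:

```python
import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Total unblended cost per service across the whole export.
query = """
SELECT line_item_product_code,
       SUM(line_item_unblended_cost) AS total_cost
FROM cur2
GROUP BY line_item_product_code
ORDER BY total_cost DESC
"""

response = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "cur_database"},                 # hypothetical
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},  # hypothetical
)
print("Started query:", response["QueryExecutionId"])
```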


If you have finished reading the CUR v2.0 blog, you understand that while v2.0 brings significant improvements, the complexity has only reduced marginally. It is still significantly difficult for average users to get the much-needed value out of CUR v2.0. And let's not forget the users who invested significant effort in CUR v1.0: they must port their internal systems to CUR v2.0, as CUR v1.0 is going away in mid-2024.
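For teams porting from v1, the general rule (per the CUR 2.0 schema) is that v1's "category/ColumnName" becomes snake_case "category_column_name", while most product/* and resourceTags/* columns are consolidated into map-type columns. A hypothetical helper sketching the rename:

```python
# Illustrative CUR v1 -> v2 column renames; this mapping is a small,
# hand-picked sample for demonstration, not the full schema.
V1_TO_V2 = {
    "lineItem/UnblendedCost": "line_item_unblended_cost",
    "lineItem/UsageStartDate": "line_item_usage_start_date",
    "lineItem/ProductCode": "line_item_product_code",
    "bill/BillingPeriodStartDate": "bill_billing_period_start_date",
}

def port_columns(v1_columns):
    """Translate CUR v1 column names to their v2 equivalents where known."""
    return [V1_TO_V2.get(col, col) for col in v1_columns]

print(port_columns(["lineItem/UnblendedCost", "lineItem/ProductCode"]))
```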


Why is CUR complex?

Let's understand why the CUR is so bulky and complex. AWS pricing works along several dimensions:

  1. Committed capacity, such as reserved instances and savings plans

  2. Provisioned resources, such as EC2, EBS, containers, and RDS

  3. Consumed capacity, such as API Gateway, Lambda, inter-AZ and inter-region egress data transfer, disk IO, and DynamoDB read/write capacity units

  4. Committed capacity is charged on a monthly or annual basis; provisioned resources are charged per second, per minute, per hour, or per month; consumed capacity is charged at actuals

  5. Unit prices vary per region and per SKU


The CUR captures every line item of your cloud cost for every SKU, in every region, for each hour of billing. Hence, your CUR report has hundreds of thousands of rows.


For example, say you have one EC2 instance in one AZ running for 730 hours in a month. The CUR has 730 entries for that one EC2 instance, one for each hour of usage. Now imagine you have ten EC2 instances with 10 attached EBS volumes and 10-20-30 snapshots across two regions. You get the point.
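As a rough, hypothetical illustration (counting only one hourly line item per resource, and omitting snapshots, which are billed per GB-month; real CURs add further rows for taxes, credits, and fees):

```python
# Back-of-envelope CUR row count for the fleet sketched above.
HOURS_PER_MONTH = 730

ec2_instances = 10
ebs_volumes = 10
regions = 2

hourly_rows = (ec2_instances + ebs_volumes) * regions * HOURS_PER_MONTH
print(f"~{hourly_rows:,} hourly line items per month")  # ~29,200
```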


Let's ask ChatGPT


chatGPT "how much effort is it to use AWS CUR"
Setting up and using AWS Cost and Usage Reports (CUR) involves several steps, but the effort required can vary based on your specific requirements and familiarity with AWS services. Here's a breakdown of the key steps and the effort involved:
1. Enabling CUR: Effort Level: Low.
2. Configuring S3 Bucket: Effort Level: Low to Medium.
3. Understanding CUR Data Format: Effort Level: Medium.
4. Analyzing Data Using AWS Cost Explorer: Effort Level: Low.
5. Custom Analysis with Third-Party Tools or Programming: Effort Level: Variable.
6. Cost Optimization and Action Planning: Effort Level: Ongoing.
The real effort in using CUR comes from ongoing cost management and optimization. This involves regularly reviewing cost reports, identifying trends, and taking actions to optimize resource usage and costs. This ongoing effort is critical for ensuring cost efficiency in your AWS environment.
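To make step 1's "Effort Level: Low" concrete: enabling a (legacy) CUR delivery is a single API call once your S3 bucket policy allows the billing service to write. A minimal boto3 sketch; the report and bucket names are hypothetical:

```python
import boto3

cur = boto3.client("cur", region_name="us-east-1")  # the CUR API is served from us-east-1

# Define an hourly, resource-level report delivered to S3 as gzipped CSV.
cur.put_report_definition(
    ReportDefinition={
        "ReportName": "my-cur",                     # hypothetical report name
        "TimeUnit": "HOURLY",
        "Format": "textORcsv",
        "Compression": "GZIP",
        "AdditionalSchemaElements": ["RESOURCES"],  # include per-resource IDs
        "S3Bucket": "my-cur-bucket",                # hypothetical, pre-configured bucket
        "S3Prefix": "cur",
        "S3Region": "us-east-1",
        "AdditionalArtifacts": [],
        "RefreshClosedReports": True,
        "ReportVersioning": "OVERWRITE_REPORT",
    }
)
```

As the breakdown above suggests, the low-effort part ends here; steps 3 onward are where the PhDs come in.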

Let's do a small assignment

You are using Lambda in your production environment, and the cost is variable since Lambda is priced per consumed unit. Your realized Lambda costs have increased by 30%+ over the previous month. You would like to analyze this increase, determine whether it is in line with your business growth, and find a way to reduce the Lambda spend without impacting your application performance.

Think about how much effort this is going to be, and what your confidence level is for saving the cost without impacting your application performance.
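Even step one of the assignment, isolating Lambda line items and comparing months, takes real work. A minimal sketch, assuming two months of CUR v2.0 data already extracted locally (file names are hypothetical; column names follow the CUR 2.0 snake_case schema, where Lambda's product code is "AWSLambda"):

```python
import pandas as pd

# Hypothetical local extracts of two consecutive months of CUR v2.0 data.
cur = pd.concat([
    pd.read_parquet("cur2-2024-01.parquet"),
    pd.read_parquet("cur2-2024-02.parquet"),
])

# Keep only Lambda line items.
lam = cur[cur["line_item_product_code"] == "AWSLambda"].copy()
lam["month"] = pd.to_datetime(lam["line_item_usage_start_date"]).dt.to_period("M")

# Cost and usage per month, split by usage type (GB-seconds vs. requests).
by_month = lam.groupby(["month", "line_item_usage_type"])[
    ["line_item_unblended_cost", "line_item_usage_amount"]
].sum()
print(by_month)

# If cost grew faster than usage, investigate unit-price drivers (memory size,
# architecture, region); if usage grew in step, correlate with business metrics.
```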


Do send us your responses at save.with [at] zopt.ai


Or use the ready-made solution

Try zOpt.ai - you could solve the assignment above in 30 seconds.

You get detailed insights into your committed capacity, provisioned infrastructure, and consumed capacity at a granular level. Moreover, we identify optimization opportunities that don't impact application performance, and we offer automated remediation. All this with a few button clicks.


