
The importance of FinOps when adopting AI on your data platform

It seems like every popular show on commercial terrestrial TV has a competition to enter. Phone or text a number for the chance to win the latest car, holiday or cash prize. Of course, this isn’t new – I remember Saturday morning TV when I was growing up, and the excitement of picking up the phone to play the latest competition or speak to someone in the studio.

But there was always one familiar line chorused to all the trigger-happy kids desperate to dial their number… “Always ask the bill payer’s permission”.

I’d love to say I always did this, but the truth is I didn’t. Those phone number jingles were all too enticing and still occupy space in my head today.

In the late 80s, the idea of accessing something exciting by dialling a few numbers on a device that was readily accessible – with speed dial – was irresistible. But of course, it wasn’t free; someone was paying for my usage (sorry Dad!).

The bill would arrive, revealing the cost of accessing this service. Each month those costs varied, depending on my appetite to play. Without explicit permission or agreed boundaries set by the bill payer, the bills were impossible to forecast.

AI and Data Platforms

Fast forward to today. Many organisations have embraced the cloud for their data needs, building Enterprise Data Platforms that process massive datasets at scale that have the potential to deliver valuable insights.

With cloud service providers making machine learning and AI services highly accessible, adoption is spreading fast. Running AI scenarios against large datasets isn’t just for technologists anymore; business colleagues are now more aware of the benefits of tapping into this wealth of insight. Business areas want to experiment directly to uncover insights in the hope that they gain the ultimate prize.

But just like those childhood phone calls, the question remains:

Who is paying the bill, and how is permission granted?

With just a few presses of a button, business users can now spin up powerful models, query petabytes of data, and generate insights that once took weeks to uncover. But behind every AI experiment lies a meter ticking in the background: compute time, storage, data egress, and API calls – all silently accumulating cost.

In this new world, the bill payer isn’t a parent; it’s the enterprise. And without a clear framework for permission, forecasting, and accountability, the monthly cloud invoice can be as surprising as those itemised phone bills of the 80s.

The Cost of Unchecked AI

According to a 2024 report by the FinOps Foundation, organisations that lack cost governance around AI workloads see an average of 23% overspend on cloud services.

This isn’t due to malice or misuse; it’s the natural byproduct of innovation without boundaries. When every team has access to powerful tools, experimentation flourishes, but so does financial risk.

According to CloudZero, AI budgets are expected to rise by 36% in 2025, with average monthly spend increasing from $62,964 to $85,521.

As a result, Gartner predicts that by 2026, 75% of organisations will adopt a formal FinOps practice to manage cloud and AI costs, up from just 30% in 2022.

The message is clear: AI without FinOps is like handing your phone to your child and letting them call premium numbers.

FinOps: The Framework for Responsible Innovation

The FinOps framework provides the guardrails needed to balance innovation with fiscal responsibility. Its core phases – Inform, Optimise, and Operate – are especially critical in the context of AI:

  • Forecasting: Teams must anticipate the cost of training and running models.
  • Budgeting: Product teams need clear budgets for AI experimentation. This empowers them to innovate within constraints, rather than being blindsided by end-of-month invoices.
  • Governance: Tagging, access controls, and usage policies ensure that only authorised users can run high-cost workloads, and that every pound spent can be traced back to a business outcome.
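To make the governance point concrete, here is a minimal Python sketch of the kind of guardrail a platform team might place in front of ad-hoc AI workloads: reject any job that lacks cost-allocation tags or would blow the owning team’s remaining budget. The tag names, budget figures, and `approve_job` function are illustrative assumptions, not a real platform API:

```python
# Hypothetical guardrail: every AI job request must carry cost-allocation
# tags and fit within the owning team's remaining monthly budget.
from dataclasses import dataclass

REQUIRED_TAGS = {"team", "project", "environment"}  # illustrative tag policy


@dataclass
class JobRequest:
    tags: dict            # cost-allocation tags supplied by the requester
    estimated_cost_gbp: float  # forecast cost of the workload


def approve_job(request: JobRequest, remaining_budget_gbp: float) -> tuple[bool, str]:
    """Return (approved, reason) for a proposed AI workload."""
    missing = REQUIRED_TAGS - request.tags.keys()
    if missing:
        return False, f"Missing cost-allocation tags: {sorted(missing)}"
    if request.estimated_cost_gbp > remaining_budget_gbp:
        return False, "Estimated cost exceeds remaining team budget"
    return True, "Approved"
```

A check like this makes every pound traceable to a team and a business outcome before the meter even starts running.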

As J.R. Storment, Executive Director of the FinOps Foundation, puts it: “FinOps is not about stopping innovation, it’s about enabling it responsibly.”

The Role of Product Owners in the Platform Operating Model

In many modern organisations, the shift to a product and platform operating model means that product owners now hold the recurring budget.

This makes them the new bill payers – and the gatekeepers of responsible ad-hoc AI usage. Therefore:

  • Product owners must be financially literate, understanding not just what their AI features do, but what they cost.
  • Platform teams must provide visibility, offering dashboards, alerts, and recommendations to help product teams stay within budget.
  • Finance and engineering must collaborate, ensuring that forecasts are realistic, and that cost anomalies are caught early.
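As an illustration of catching cost anomalies early, a platform team might start with nothing more sophisticated than a trailing-average check over daily spend. The threshold and function below are illustrative assumptions, not a prescribed FinOps tool:

```python
# Hypothetical anomaly check: flag a day's spend that exceeds a
# configurable multiple of the trailing daily average.
from statistics import mean


def is_cost_anomaly(daily_spend: list[float], today: float, threshold: float = 1.5) -> bool:
    """Flag today's spend if it exceeds `threshold` x the trailing average."""
    if not daily_spend:
        return False  # no history yet, nothing to compare against
    return today > threshold * mean(daily_spend)
```

Wired into a dashboard or alert, even a crude check like this turns an end-of-month surprise into a same-day conversation between finance and engineering.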

This isn’t just about saving money; it’s about aligning spend with value. When product owners can see the ROI of their AI investments, they’re empowered to make better decisions, faster. This is an important factor in any operating model design.

Ultimately, the goal of FinOps isn’t to restrict access to AI, it’s to unlock its full potential. By embedding cost awareness into the culture, organisations can move from reactive cost-cutting to proactive value creation. 

Because in the age of AI, responsible innovation starts with asking one simple question: Have you asked the bill payer’s permission?
