(c) Chartered Accountants Australia and New Zealand. Contact Chartered Accountants Australia and New Zealand for permission to reproduce this article.

Is end-user computing the next big thing for financial modelling?

Finance functions of tomorrow will be powered by an array of tools and technologies, held together by cloud-based platforms.

Two emerging themes suggest how finance professionals will embrace and use new technologies to manipulate, model and analyse data in the future: an explosion of end-user computing (EUC) and the power of platforms.

At the time that most EUC/model risk policies were written, it was imagined that all EUC would eventually be systemised (the much-heralded death of Excel!), but there has always been the need for a “last mile” (or more) of spreadsheets to get to the final output.

An explosion of end-user computing

Tool options for analysts have never been greater, but they will also never be as few as they are today. Over the past few years, analysts – especially those just out of university – frustrated with their organisation’s main systems have increasingly been looking beyond Excel and the software provided on their laptops.

In our financial modelling team it goes like this:

Me: “Please develop this client solution in [new technology we’ve just started using].”

[a few hours later]

Smart new graduate: “I tried it in [new technology we’ve just started using] but it was too slow, so I built it in [free new software I’ve never heard of] which was much better.”

The main finance function infiltrators are R, DataRobot, KNIME, BigQuery, Python, Alteryx, Power Query, Power Pivot, Excel (obviously), Modano, Power BI, Quantrix, Qlik and Tableau, most of which are either free or relatively cheap. However, more traditional-looking cloud-based financial planning and analysis platforms (Anaplan, Adaptive Insights and Jedox) are also being used for discrete tactical purposes. That’s a lot of choice (much of it personal), and a lot of finance function fragmentation.

With increased competition, all of these tools are rapidly evolving, and users float from tool to tool to chase the latest features.

This combination of technology choice and tech-enabled smart young people creates an amazing opportunity. Being free of the IT department, analysts can do analysis faster than ever before.

But it also creates risk. Most of these tools are too new, or evolving too quickly, to have established best practices for design and risk. That means passing tools between people and teams is getting harder. The list of tech skills required by new finance recruits is getting longer, too, to match the increased number of technologies being relied on in the finance function. The possibility of vast tool graveyards is high.

The unexpected power of cloud platforms

I must admit to having been slightly caught out by the radical shift in thinking and structure that comes when you utilise platforms for your analysis and financial modelling. I believe CFOs and financial planning and analysis teams are still stumbling across the opportunity created by using a suite of tools all based on the same platform.

The key enabler of this shift was the introduction of the cloud, but what revolutionised it was the plethora of APIs (application programming interfaces) and connectors within the cloud.

I am going to avoid naming the platforms themselves, not only because the situation is evolving by the month, but also because it’s not about the software; it’s about how this new way of working changes how we think and solve problems.

It can be a little complicated to get your head around, so to simplify the concept I will talk about what all this means for the humble financial model.

The component parts of a financial model: A traditional financial model is self-contained – it takes its inputs, calculations and outputs with it everywhere it goes – which makes it easy to build and email.

But when I put my model onto a platform, strange things happen, and basic model design fundamentally changes.

Inputs: Inputs don’t live in my model. Instead of risky manual import processes, I dynamically query well-structured data sources and suck in the latest data. All data is held centrally, in a standalone data warehouse, as a single source of truth, and is available to all models.
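To make the idea concrete, here is a minimal sketch in Python of querying a central store at refresh time instead of pasting data in by hand. An in-memory SQLite database stands in for the cloud warehouse, and the table and column names (`actuals`, `period`, `revenue`) are purely illustrative:

```python
import sqlite3

# Stand-in for a central data warehouse; in practice this would be a
# cloud warehouse reached through a connector or API.
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE actuals (period TEXT, revenue REAL)")
warehouse.executemany(
    "INSERT INTO actuals VALUES (?, ?)",
    [("2023-Q1", 1200.0), ("2023-Q2", 1350.0)],
)

def latest_actuals(conn):
    """Dynamically pull the latest data each time the model refreshes."""
    rows = conn.execute(
        "SELECT period, revenue FROM actuals ORDER BY period"
    ).fetchall()
    return dict(rows)

inputs = latest_actuals(warehouse)
print(inputs)  # {'2023-Q1': 1200.0, '2023-Q2': 1350.0}
```

The point is the shape of the pattern, not the tools: the model holds a query, not a copy of the data, so every refresh sees the single source of truth.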

Calculations: Calculations are built in a modular fashion in discrete pieces. Each module is freely available to everyone, and can call on any other module. Free of data, they become more elegant and bespoke.
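A hypothetical sketch of that modular style, in Python: each calculation is a small, data-free function, and modules call other modules rather than duplicating logic (the function names and figures are invented for illustration):

```python
# Each module is a discrete, data-free calculation available to everyone.
def revenue(units_sold, price):
    return units_sold * price

def gross_margin(units_sold, price, unit_cost):
    # Modules freely call on other modules instead of duplicating logic.
    return revenue(units_sold, price) - units_sold * unit_cost

def margin_pct(units_sold, price, unit_cost):
    return gross_margin(units_sold, price, unit_cost) / revenue(units_sold, price)

print(margin_pct(100, 12.0, 9.0))  # 0.25
```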

Outputs: The raw outputs are held where we store the inputs – sitting side by side, ready for the next calculation model that needs them.
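Continuing the sketch, raw outputs can be written straight back into the same store, ready for the next model to query. Again, the in-memory SQLite database, table names and growth assumption are illustrative stand-ins:

```python
import sqlite3

# One central store holding both inputs and outputs, side by side.
store = sqlite3.connect(":memory:")
store.execute("CREATE TABLE model_inputs (period TEXT, revenue REAL)")
store.execute("CREATE TABLE model_outputs (period TEXT, forecast REAL)")
store.executemany(
    "INSERT INTO model_inputs VALUES (?, ?)",
    [("2023-Q1", 1200.0), ("2023-Q2", 1350.0)],
)

# The model reads its inputs, calculates, and writes raw outputs back
# to the same store for the next calculation model to pick up.
growth = 1.05  # illustrative assumption
rows = store.execute("SELECT period, revenue FROM model_inputs").fetchall()
for period, rev in rows:
    store.execute("INSERT INTO model_outputs VALUES (?, ?)", (period, rev * growth))

next_model_feed = store.execute(
    "SELECT period, forecast FROM model_outputs ORDER BY period"
).fetchall()
print(next_model_feed)  # [('2023-Q1', 1260.0), ('2023-Q2', 1417.5)]
```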

Presentation outputs (dashboards, etc): The presentation outputs are published. They are simple, streamlined and tailored to the individual user. Consumers can drill down into the calculations and underlying data but, in reality, if the dashboards and visualisations are telling the right story and answering the right questions, most people won’t bother.

How this sets up big opportunities

Because everything is shareable, there’s a single version, and because we no longer package up the model into a single file and email it, there’s no more model fragmentation.

Another advantage is that having the inputs and outputs for our forecast model alongside actuals in one place, and not spread across hundreds of different models, sets the stage for some very powerful analysis.

How we will introduce analytics (including machine learning and artificial intelligence) alongside financial modelling will be the subject of another article, but one use will be to refine the inputs/assumptions in the model. In the old world we never had enough data for this, but now that all the inputs and outputs, and actuals, are in the same place, we do.

So it’s not just the availability and power of new tools, used by smart, tech-enabled young people, that will bring innovative new solutions. It’s the combined benefit of plugging these tools into connected platforms. At that point, the value explodes. But so does the risk.


Managing the risk

As traditional model audits have become too narrow, or unnecessary, for non-transaction models, a new suite of model assurance services has been developed. Services like PwC’s Model Analyser review rely heavily on risk analysis of the components of a model to target and tailor review procedures that give just the right level of comfort. For some models and departments, these are fully automated risk scans across every model; for others, a deeper dive is required. But understanding the risk of each individual model is key.


How to approach the new world of financial modelling

How should finance departments approach this new world of finance modelling? Here are my top tips for the future.

  • Embrace new technology, train yourself and empower your analysts.
  • Encourage your team to think differently. Our implicit understanding of how models work is constraining our imagination.
  • Introduce best-practice standards for all of your modelling, whatever the technology. Critically, they need to be religiously consistent across the department.
  • Don’t write more policies. Consider practical steps to test, analyse and review your tools and models.
  • Take internal audit, risk and IT on the journey and leverage their support.

This article was originally published in the December issue of Acuity.