Coping with the Demands of Changing Regulation

Published: December 07, 2016

by Jayson Dunne, Director, Rubicon Risk Advisory Services

 

As a corporate treasurer or risk manager, are you ready for the increased information demands that changing regulations will likely place on your time?

The present dynamic regulatory framework is changing the way companies are required to measure and report risks in their annual integrated reports. Statements attesting to corporate viability are required to be more detailed than the ‘going concern’ opinions previously expressed by a company’s auditors. The risk management portion of the report has also expanded from a general statement about risk awareness and management to comprise a large portion of the integrated report. Ironically, the reporting on risk relating to financial market variables, despite being well understood mathematically, has been less detailed in integrated reports than, say, the environmental report.

King III and the soon-to-be-released King IV Report on Corporate Governance evince a transition from a tick-box approach to a more principle-based approach. The draft King IV report specifically mentions adopting an ‘apply and explain’ framework in implementing these principles, as opposed to mindless adoption as if they were rules. Substance over form is the desired approach.

In addition, recent changes to the Corporate Governance Code in the UK require all UK listed companies to make a viability statement which is separate from, and in addition to, the going concern opinion. This statement requires a company to declare its viability over a specific time horizon, explain which variables were considered in the assessment and how far forward it looked, and why the range of variables and the timeframe considered are believed to be appropriate. It is foreseeable that a similar requirement will be implemented in South Africa.

These shifts in approach require companies to have deeper internal conversations about risk, and to be more transparent about risk impacts in their integrated reports.

As the custodians of this information, corporate treasurers and risk managers will be most affected by these changes.

 

Does this increased regulatory burden simply add costs to your company, or does it provide an opportunity to enhance your company’s value?

Many companies view risk management functions as non-core and an imposed requirement. Added to this, a smaller company may have difficulty building the necessary redundancy into its risk unit, and could struggle to retain qualified staff. This can lead to risk being seen as a high-maintenance business unit which struggles to embed itself into the corporate culture.

It’s well understood that it’s beneficial to a company and its market value to have the best risk management processes possible. 

EY research [1] reports that “companies with more mature risk management practices generate three times the level of EBITDA as those with the least mature risk management practices”. This research also found that financial performance is highly correlated with the level of integration and coordination across risk, control and compliance functions.

The recent release of the SARB report into the failure of African Bank points to an almost complete absence of risk management as one of the main causes of the failure. This doesn’t necessarily prove the premise that increased risk management increases company value, but it demonstrates that, at the extreme, the lack of risk management can certainly lead to the complete destruction of company value.

“Risk Management leads to a lower cost of capital and higher expected cash flows and, hence, to a higher firm value”: Siva Moodley, The South African Treasurer 2012

The incremental value accrued to companies with ‘more mature’ risk management is almost certainly relative, implying that to maintain the value differential, a company needs to constantly monitor its risk processes to stay abreast of the latest trends and developments.

Companies that manage risk better, do better.

 

Note
1. EY, ‘The viability statement: Finding opportunities in the new regulatory challenge’.

 


How can companies thoroughly explore the full range of possible outcomes without drowning in the data?


Companies routinely explore scenarios to understand their exposure to market rates or prices; it’s a natural response to the perennial ‘what if’ question. However, scenario analysis quickly expands into too many dimensions for most people to apply usefully.

It’s common for many companies operating in Africa to have multiple foreign exchange, interest rate and inflation related risks, and possibly commodity exposure as well. Consider a company with four risk factors. If three scenarios (up, down, flat) are considered for each of these risk factors, the number of permutations is 81. If the risk factor set is expanded to six, the number of permutations balloons to 729 and sensible discussion about the inputs and results becomes near impossible.
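To make the arithmetic concrete, a two-line Python sketch (purely illustrative) reproduces those counts:

```python
# Each risk factor can follow one of three scenarios (up, down, flat),
# so n factors give 3 ** n combinations to discuss.
scenarios_per_factor = 3
for n_factors in (4, 6):
    print(f"{n_factors} risk factors -> {scenarios_per_factor ** n_factors} permutations")
# 4 risk factors -> 81 permutations
# 6 risk factors -> 729 permutations
```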

Imagine discussing scenario results and trying to remember that you are comparing, for example, the up, down, flat, down scenario with the down, down, up, flat scenario. This requires some impressive mental gymnastics and can understandably lead to a simplification of the process: “Let’s assume that this variable will stay flat”.

The availability of data in this situation is not the hurdle; the challenge is to translate the wealth of data into information that can be quickly understood and acted upon. 

Even the simplest scenario analysis becomes unwieldy very quickly as parameters are added.

 

How do you assimilate the information needed to make good decisions? 

The process of data mining can move a company’s board from simple parameterised risk management metrics, such as VaR, into a decision-making framework where risks are managed according to the company’s unique risk profile and appetite.

The transition becomes apparent as the company moves from statements such as: “The VaR of the price of our widgets is x, and therefore we have a risk to our revenue of y over the period in question”, to one that is directly relevant to the company: “The widget price volatility and likelihood of moves over the next three years mean that there is only an x% chance of failing to meet stated target y. The board is happy with this risk profile”.

The core input required for analysis of this nature is a detailed cash flow model of the company onto which a scenario-generating module can be added. The enhanced financial model is then run through a large number of scenarios. To get a statistically significant set of results, in excess of 3,000 scenarios are recommended. Market practitioners will routinely use 10,000 scenarios to evaluate financial market variables to ensure the integrity of the results.
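As a rough illustration of the mechanics only (the price process, cash flow function, volumes and target below are hypothetical placeholders, not the author’s model), the shape of such an analysis might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

def widget_price_paths(n_paths, n_years, start=100.0, vol=0.25):
    """Hypothetical price generator: a lognormal random walk, one step per year."""
    shocks = rng.normal(-0.5 * vol**2, vol, size=(n_paths, n_years))
    return start * np.exp(np.cumsum(shocks, axis=1))

def annual_cash_flow(prices, volume=1_000, fixed_costs=60_000):
    """Hypothetical cash flow model: widget revenue less fixed costs."""
    return prices * volume - fixed_costs

prices = widget_price_paths(n_paths=10_000, n_years=3)
cash_flows = annual_cash_flow(prices)

# Probability of failing to meet the stated cash flow target in any of the three years
target = 20_000
prob_miss = np.mean((cash_flows < target).any(axis=1))
print(f"Chance of missing the target in at least one year: {prob_miss:.1%}")
```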

Many tools that undertake these sorts of analyses rely on stochastic formulae, distribution or copula fitting, or correlation matrix calibrations. These are so sensitive that a marginal difference in calibration can lead to large differences in the output, and there is a risk that the calibration exercise ends up becoming an end in itself. Worse still is the possibility that the matrix is calibrated intuitively: “Everyone knows that the naira and the oil price are highly correlated; make that correlation coefficient higher”, effectively limiting the power of the process and nullifying the results.
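A generic illustration of that sensitivity (standard statistics, not drawn from the article): nudging an assumed correlation from 0.6 to 0.8 noticeably changes how often two risk factors hit their tails together.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

for rho in (0.6, 0.8):
    cov = [[1.0, rho], [rho, 1.0]]
    x, y = rng.multivariate_normal([0.0, 0.0], cov, size=n).T
    joint_tail = np.mean((x < -1.645) & (y < -1.645))  # both factors in their worst 5%
    print(f"correlation {rho}: both factors breach their 5% tail {joint_tail:.2%} of the time")
```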

George Box, the renowned 20th-century statistician, commented: “Essentially all models are wrong, but some of them are useful”. He went on to say that “the scientist cannot obtain a ‘correct’ one by excessive elaboration. On the contrary following William of Occam he should seek an economical description of natural phenomena. Just as the ability to devise simple but evocative models is the signature of the great scientist so overelaboration and overparameterisation is often the mark of mediocrity.”

Sometimes the simplest, most intuitive solutions are the best.


What do you do to make this information work for your company?


An ‘economical’ and intuitive solution to the complexity is a Monte Carlo simulator that uses historical data sets to create future price paths. This approach captures most innate correlations by taking contemporaneous samples from the data sets. It requires no calibration beyond ensuring that there are matching data sets for all input variables and that the data sets are large enough to capture the market through rising and falling cycles. For most market variables, paucity of data is not a restriction.
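A minimal sketch of what such a simulator might look like (an assumption about one possible implementation, not the author’s tool): whole rows of contemporaneous historical returns are resampled, so the observed co-movements between factors are carried into each simulated path.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder for real history: aligned monthly returns for, say, FX, rates,
# inflation and a commodity. Each row is one observed date.
hist_returns = rng.normal(0.0, 0.03, size=(240, 4))

def bootstrap_paths(hist_returns, n_paths, horizon):
    """Build future paths by resampling whole historical rows,
    preserving the contemporaneous relationships between risk factors."""
    n_obs, n_factors = hist_returns.shape
    idx = rng.integers(0, n_obs, size=(n_paths, horizon))
    sampled = hist_returns[idx]               # shape: (n_paths, horizon, n_factors)
    return np.cumprod(1.0 + sampled, axis=1)  # cumulative growth of each factor

paths = bootstrap_paths(hist_returns, n_paths=10_000, horizon=36)
print(paths.shape)  # (10000, 36, 4)
```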

Well-built Monte Carlo models can undertake a 10,000 path analysis in under 30 seconds, allowing decision-making and analysis to become a real-time discussion, not one spanning weeks and multiple meetings.

The thousands of paths created by the Monte Carlo model, with their embedded, historically observed inter-relationships, allow the company to make decisions across a universe of future outcomes that covers virtually all market possibilities. The board can make decisions that protect the company while exploiting opportunities, and management can focus on its core competency. This removes debate around what the future price of widgets will or could be, where often the loudest voice prevails, and transforms the discussion into one where decisions flow from the results of the analysis and the company’s stated risk appetite.

Decision-making becomes more robust and does not rely on the evolving widget price to validate the decision made, as the decision was not made with a particular price trajectory in mind, but was rather concerned with the viability of the company over all price paths.

If a company had hedged a percentage of the input cost of widgets for the next two years, and the price subsequently fell, shareholders, the financial press and often the board would characterise that decision as incorrect because the price moved in a way that suggests they should not have hedged. Making the decision based on outcomes and risk appetite makes it robust regardless of price moves, since it was not predicated on the widget price, but rather on the unacceptable likelihood of retrenchments, for example, should the widget price rise. Furthermore, the hedge percentage was not arbitrarily picked; it was the percentage that reduced the risk of that event to below the company’s chosen risk threshold.
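As a hypothetical sketch of that last step (the cost distribution, budget level and 5% threshold below are invented for illustration), one could scan hedge ratios and keep the lowest one that brings the breach probability under the board’s threshold:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated unhedged input cost per path (would come from the Monte Carlo model above)
simulated_cost = rng.lognormal(mean=np.log(100.0), sigma=0.3, size=10_000)

hedged_cost = 100.0   # cost locked in by the hedge
budget = 115.0        # cost level that would force, e.g., retrenchments
threshold = 0.05      # board's appetite: at most a 5% chance of breaching the budget

for hedge_ratio in np.arange(0.0, 1.0001, 0.05):
    blended = hedge_ratio * hedged_cost + (1.0 - hedge_ratio) * simulated_cost
    prob_breach = np.mean(blended > budget)
    if prob_breach <= threshold:
        print(f"Lowest acceptable hedge ratio: {hedge_ratio:.0%} "
              f"(breach probability {prob_breach:.1%})")
        break
```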

In summary, the expected changes in reporting standards and disclosures are creating greater demands on risk management. These demands can be met with modelling that moves the discussion away from the market price of widgets and towards a process where objective decisions are made that factor in both the complexity of the market in which a company operates and its unique risk-reward appetite.

Well executed scenario analysis can give you the best approximation of hindsight at the time of execution.  

 

 

Jayson Dunne
Director, Rubicon Risk Advisory Services

Jayson has a deep understanding of global financial markets with over 25 years’ experience spanning virtually all products and aspects of treasury at local and international banks. 

He has worked in project financing, trading, structuring, risk management and sales, and led the RMB Commodities in treasury business for a number of years. He is therefore well equipped to help boards transform their decision-making processes by changing the metrics used when making complex choices. The process becomes more robust and inclusive, allowing boards a level of hindsight in the present.

 

 
