The challenges of risk management big data implementation

18 December 2015
One interesting development to follow the financial crisis of 2008 has been the rise of big data technologies. As banks scrambled to react to their data and risk management shortcomings, big data looked to some like a technological saviour for fundamental problems.
However, big data is not a solution in itself; it's a process accelerator – as is the case with most technologies. Riskcare takes a look at some of the factors worth considering when planning a big data risk management solution.
TECHNOLOGISTS TO IMPLEMENT THE SOLUTION
There are generations of technologists across the globe who understand and can implement (with greater or lesser effectiveness) a classic stack: data processing with Java, SQL and Perl; RDBMS schemas, stars and snowflakes, for storage; and Excel as a reporting platform.
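For readers less familiar with that stack, below is a minimal sketch of the "star" storage pattern; the table and column names are hypothetical, and SQLite stands in for whatever RDBMS a bank actually runs. A central fact table of risk measures is keyed to small, descriptive dimension tables.

```python
import sqlite3

# Hypothetical star schema for risk results: a central fact table of measures
# surrounded by small, descriptive dimension tables.
ddl = """
CREATE TABLE dim_book  (book_id INTEGER PRIMARY KEY, desk TEXT, legal_entity TEXT);
CREATE TABLE dim_date  (date_id INTEGER PRIMARY KEY, business_date TEXT);
CREATE TABLE fact_risk (
    book_id INTEGER REFERENCES dim_book(book_id),
    date_id INTEGER REFERENCES dim_date(date_id),
    measure TEXT,   -- e.g. 'PV' or 'DV01'
    value   REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# A typical reporting query: aggregate one measure by desk for one business date.
rows = conn.execute("""
    SELECT b.desk, SUM(f.value)
    FROM fact_risk f
    JOIN dim_book b ON b.book_id = f.book_id
    JOIN dim_date d ON d.date_id = f.date_id
    WHERE d.business_date = ? AND f.measure = 'DV01'
    GROUP BY b.desk
""", ("2015-12-18",)).fetchall()
print(rows)
```

A "snowflake" simply normalises those dimension tables further; the reporting layer, traditionally Excel, then sits on top of queries like this.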
Decades of this stack have resulted in mature off- and near-shore centres to augment expensive hubs in London and New York. Institutional understanding exists for it and there is a fungibility of resources to roll on and off similar stack projects.
But none of this exists yet for big data platforms. Engaging consultants can accelerate implementation, but banks can’t institutionalise that wider operational acceptance of what will be a new core technology for their stack without permanent members of staff.
HIGH-QUALITY SOURCE DATA
I know this might seem an obvious thing to say, but technology doesn't fix data problems. It can illuminate those problems more effectively, but ultimately banks will have to go back to the source (hopefully there's just one) and fix them there. Any fix made downstream of the source is an immediate data quality divergence and a duplication of effort. Unless issues are fixed at source, banks are creating debt that they will at some point have to pay off.
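As a minimal sketch of what illuminating (rather than silently fixing) might look like, assume positions arrive as records from upstream feeds; the field names, rules and source names below are all hypothetical. The point is that breaks are attributed back to the owning source system for remediation, not patched in place:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Position:
    trade_id: str
    notional: Optional[float]
    currency: Optional[str]
    source_system: str   # the upstream system that owns this record

# Hypothetical quality rules: each returns a description of the break, or None.
RULES = [
    lambda p: "missing notional" if p.notional is None else None,
    lambda p: "missing currency" if not p.currency else None,
    lambda p: "negative notional" if p.notional is not None and p.notional < 0 else None,
]

def find_breaks(positions):
    """Report breaks against the owning source system instead of repairing
    values here, which would create a divergent downstream copy of the data."""
    breaks = []
    for p in positions:
        for rule in RULES:
            issue = rule(p)
            if issue:
                breaks.append((p.source_system, p.trade_id, issue))
    return breaks

feed = [
    Position("T1", 1_000_000.0, "USD", "source_a"),
    Position("T2", None, "EUR", "source_b"),
    Position("T3", -50_000.0, None, "source_a"),
]
for source, trade, issue in find_breaks(feed):
    print(f"{source}: trade {trade} -> {issue}")
```

Whatever form the checks take, the output should be a work list for the source system's owners, not a corrected copy of the data held downstream.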
But what if the upstream system isn't responsive enough to fix the issue and still get reports out? Ideally, the pressure on this non-functioning system would force investment and process change from the business, but pragmatically that doesn't always happen. Workarounds are probably inevitable, but as professional technologists we must identify and call out this creation of debt, because it is precisely this debt that will stop us maximising value from a new stack. I'm even a fan of allowing the workaround to be slightly painful for users at times, lest the source of the issue be forgotten.
VISUALISATION AND INVESTIGATION TOOLS
The sheer volume of data makes investigating issues and analysing results difficult enough on its own. On top of this, increasingly opaque risk measures can mean Excel is simply not fit for purpose. Big data tools in this space are immature, so the remaining options are for banks to build their own interface (which has been out of fashion in the middle/back office for quite some time) or to fit one of the previous generation's tools onto the new stack.
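To make the build-versus-buy trade-off concrete, the sketch below shows how thin a bespoke investigation layer can be, assuming aggregated results are already queryable from Python; the frame, column names and figures are invented for illustration and stand in for a query against the new platform:

```python
import pandas as pd

# Invented day-on-day risk results; in practice these would be pulled from
# the big data platform rather than typed into an in-memory frame.
results = pd.DataFrame({
    "desk":      ["Rates", "Rates", "Credit", "Credit"],
    "measure":   ["DV01",  "DV01",  "CS01",   "CS01"],
    "today":     [120_000, -45_000,  80_000,   15_000],
    "yesterday": [118_000, -44_000,  95_000,   14_500],
})

# Surface the largest day-on-day moves first: the kind of drill-down an
# analyst would otherwise perform in a spreadsheet.
results["move"] = results["today"] - results["yesterday"]
results["abs_move"] = results["move"].abs()
print(results.sort_values("abs_move", ascending=False)[["desk", "measure", "move"]])
```

Whether that layer is built in-house or bought and fitted to the new stack, the aim is the same: interactive investigation without round-tripping every question through a spreadsheet.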
The utopia of direct and dynamic user interaction with data is extraordinarily hard to achieve (although it's glorious when it can be). The inefficiency of the business talking to IT to make a system change, which then goes through change control, has to be one of the main reasons for the ongoing use of Excel. The more end-user computing solutions that exist, the less efficient the overall process – and this is itself one of the biggest barriers to realising value from IT programmes. Whatever approach organisations take, build or buy, it is key to make changing the system part of the daily expectation of using it – and to build the appropriate (lightweight) process and oversight to enable that.
A HOLISTIC PROCESS EFFICIENTLY UTILISING THE PLATFORM
It all comes down to process change; that is where efficiencies can really be achieved. While IT implementations are often there to automate repeatable tasks or provide incremental functionality, this generational change in technology requires a generational shift in process approach to maximise its benefits. The opportunity to cut data duplication and to consolidate and accelerate business functions rarely arises outside a technology change of this scale – and it is a legitimate chance for organisations to create step-change efficiencies.
Obviously this is not easy. We must continue to push as far up the hierarchy as possible to determine the underlying problems, rather than accepting fait accompli solutions. True partnership, in which business and IT sponsors share the goal of minimising cost to maximise margin, is our best hope of climbing out of the quagmire of legacy systems surrounding us. While the pressure of regulation (and previous less-than-perfect IT implementations) will limit the appetite for such changes, they are the only way to turn what is currently a damage limitation exercise into a genuine competitive advantage; the damage limitation exercise will only ever bring diminishing returns.