Banks and financial services firms have a real need for solutions that deliver performance at scale. New regulations like FRTB (the Fundamental Review of the Trading Book, part of the Basel IV framework) have greatly expanded the reporting requirements for the financial industry.
This increase in requirements, coupled with the complexity of the new rules, means that regulators now demand much finer granularity and more disciplined reporting. To comply, banks must modernize their analytics systems to ensure that the data sent to regulatory authorities is correct and relevant.
In an effort to address these new requirements, Citihub Consulting (a global financial markets IT advisory firm) has published a comparative analysis of GPU-accelerated database solutions, assessing their maturity and performance potential.
Dealing with big data challenges in risk and reporting
Citihub has found that in many firms, unified risk reporting is currently constrained because datasets are distributed across collections of asset-class-aligned systems that typically refresh only once per day. This distribution reduces the datasets' interactivity, making it difficult to process changes in near real-time and limiting responsiveness.
The challenge is not just the response time for these queries and reports, but also the lack of access to the huge quantities of data being collected. Many banks and financial services firms perform time-consuming re-aggregations and cube generation to make the data accessible in a timely fashion, but this severely limits the data's fidelity and granularity. Even so, users are often forced to wait minutes (or longer) for responses to these types of large-scale queries, according to Citihub.
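A minimal sketch of why pre-aggregation limits fidelity. The data here is entirely hypothetical (it is not from the Citihub report): once trade-level rows are rolled up into a desk-level cube, any question at a finer grain than the cube's dimensions can no longer be answered without going back to the raw data.

```python
# Hypothetical trade-level risk results (illustrative only).
raw_trades = [
    {"desk": "rates",  "trade_id": "T1", "pnl": 120.0},
    {"desk": "rates",  "trade_id": "T2", "pnl": -45.0},
    {"desk": "credit", "trade_id": "T3", "pnl": 80.0},
]

# Nightly "cube generation": roll trades up to desk level.
cube = {}
for row in raw_trades:
    cube[row["desk"]] = cube.get(row["desk"], 0.0) + row["pnl"]

# The cube answers desk-level queries instantly...
print(cube)  # {'rates': 75.0, 'credit': 80.0}

# ...but trade-level detail is gone from it. Finding which trade
# drove the loss requires a fresh scan of the raw data - exactly
# the kind of on-demand, fine-grained query the cube cannot serve.
worst = min(raw_trades, key=lambda r: r["pnl"])
print(worst["trade_id"])  # T2
```

The appeal of a database fast enough to query raw data directly is that this second, ad-hoc question costs no more than the first, and no nightly cube build is needed.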
How do GPU databases measure up for Big Data?
Citihub worked closely with a tier-1 investment bank to generate a risk-result dataset tailored specifically to FRTB reporting and analysis. They then put GPU databases to the test, aiming to evaluate query performance, scalability, and overall enterprise readiness.
SQream is hosting the report’s author, Tim Jennings, to discuss “GPU Accelerated Database Use Cases in Capital Markets”.
Join us for a webinar on Wednesday, July 18, 2018, from 3:30 PM to 4:30 PM BST (London time), or 10:30 AM to 11:30 AM Eastern Time, where we will discuss:
- Are GPU databases candidates to support live intra-day blotters?
- Can pre-calculated OLAP cubes be replaced by GPU databases serving live queries on inbound streaming data?
- Are GPU databases viable, performant alternatives to Big Data solutions?
- Are GPU databases mature enough for mission-critical solutions?