Equities VaR and Stress Test Implementation
In this case study, the customer wanted to add an accreditable market risk system to their existing risk-system topology to calculate and report the risk from their Equities books. The challenges of this implementation were:
Complex products and modelling
The business ranges from simple equity positions through to structures with complex hedges (including average rate/average strike options). Volatility surfaces need to be modelled in two ways: first, traditionally, as two-dimensional volatility matrices for historical purposes, and second, as a series of polynomials that represent the volatility skew at specific terms.
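The polynomial representation of a skew can be illustrated with a minimal sketch. This is not Kakapo Risk's actual API; it simply assumes one polynomial per term, in log-moneyness, with hypothetical coefficients:

```python
# Illustrative sketch (not Kakapo Risk's actual data model): a volatility
# skew at a fixed term represented as a polynomial in log-moneyness,
# complementing the traditional two-dimensional matrix representation.
import numpy as np

def skew_vol(coeffs, strike, forward):
    """Evaluate implied vol from polynomial coefficients at a strike.

    coeffs are ordered highest degree first, as numpy.polyval expects;
    the polynomial variable is log-moneyness ln(K/F).
    """
    x = np.log(strike / forward)
    return float(np.polyval(coeffs, x))

# Hypothetical 1-year skew: 20% base vol, slight downward skew, smile.
coeffs_1y = [0.5, -0.1, 0.20]  # 0.5*x^2 - 0.1*x + 0.20

atm = skew_vol(coeffs_1y, strike=100.0, forward=100.0)  # x = 0 at the money
print(round(atm, 4))  # 0.2
```

Storing coefficients rather than a dense strike grid keeps the historical dataset compact while still allowing a vol to be interpolated at any strike.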
Seamlessly integrating with the extant systems
The equities business is supported by a bespoke in-house system that supplies trade and equities market data, while other market data is supplied by the customer's main bespoke market data distribution system.
In-house technology requirements
The in-house technology department requires source-code-level control of the trade and market data loaders, as well as the data extractors, so that they can maintain them independently of the vendor. In-house technology needs to be able to coordinate VaR history periods with the legacy system and work within the existing market risk job control. Once the calculations are done, the risk vectors must be exported in a form compliant with the customer's own XML schema for risk vector aggregation.
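The shape of such an export can be sketched as follows. The element and attribute names here are hypothetical, since the real extractor must conform to the customer's own XML schema:

```python
# Sketch of exporting risk vectors as XML for downstream aggregation.
# "RiskVectors", "Vector", "book" and "factor" are invented names --
# the delivered extractor follows the customer's own schema instead.
import xml.etree.ElementTree as ET

def vectors_to_xml(book, vectors):
    """vectors: mapping of risk-factor name -> list of P&L values."""
    root = ET.Element("RiskVectors", attrib={"book": book})
    for factor, pnl in vectors.items():
        vec = ET.SubElement(root, "Vector", attrib={"factor": factor})
        vec.text = ",".join(f"{v:.2f}" for v in pnl)
    return ET.tostring(root, encoding="unicode")

xml_doc = vectors_to_xml("EQ-DESK-1", {"SPX": [-1.5, 2.25, 0.0]})
print(xml_doc)
```

Because the plug-ins were handed over in source form, the client can adjust this serialisation whenever their aggregation schema changes.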
Required system features and performance
The customer requires that users (non-technology staff) can repair trades, reload market data and history, re-run reports and re-publish output. Monte Carlo pricing is required for the average rate/average strike options even during the historical (550-day) VaR calculation, which is an obvious performance challenge. A number of stress tests are required. In some cases these are equity extensions to existing tests, requiring aggregation with other asset classes, while in other cases they are specific to the equities trading.
Kakapo worked closely with the client to ensure that the requirements they had gathered and formalised could and would be met. From this solid ground, Kakapo proposed designs for how the loaders and extractors should work and agreed them with the client.
A collaborative approach was undertaken with numerous drops of test files delivered by the client. As history was added, it became possible to generate indicative early figures to allow the likely contribution to the firm-wide risk profile to be socialised. The client was able to delay hardware deployment by viewing their data and progress on Kakapo's in-house servers via the internet.
To keep both teams in close contact, a collaboration and wiki tool was used, again hosted on Kakapo servers.
Once the import and export process plug-ins were complete, they were handed over with the system installers. Kakapo staff helped with the first install and then progressively handed over. At this point, the customer was left to their own devices as they moved to production and undertook the necessary discussions with the local regulator.
Kakapo Risk in concert with BoundaryRider (from Vector Risk) was easily able to meet the challenges presented by this implementation. The diagram shows the basic layout of the systems involved. It was a simple matter to integrate Kakapo Risk and BoundaryRider into the existing customer system topology.
Modelling the complex options
Kakapo Risk's extensible architecture was used to model the complex options. This included market data extensions for manipulating volatility surfaces represented both as polynomials and as more traditional two-dimensional matrices. In fact, prior to this project, Kakapo Risk had no specific support for equities; equities modelling was added easily and quickly.
Supporting secure data and results re-publishing
Secure data repair, re-run and re-publish are all delivered as standard Kakapo Risk features. The Trade Browser supports editing of trades on both the "trunk" and branches the user has created with the Operations Manager. Report re-runs are triggered by a simple, security-controlled PowerShell script run from an end user's machine. Re-publishing results is supported within the GUI, allowing the user to choose any version of the results to re-publish. Naturally, each of these steps is individually controlled by the RBAC scheme.
Ensuring risk vectors are commensurate
Coordinating Kakapo Risk to generate a set of risk vectors commensurate with those of the legacy system was easily done using the calendar support built into Kakapo Risk. Kakapo Risk's region parameters can be set when creating EOD events, and these were used to ensure each risk analysis used the correct dates. The whole process was simply subsumed into the customer's existing EOD jobstream by adding simple PowerShell script calls to the export processes.
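The underlying date coordination problem can be sketched simply: both systems must price against the same 550-day history window, honouring the same regional business-day calendar. This sketch uses an empty, hypothetical holiday set rather than real calendar data:

```python
# Sketch of history-window alignment: walk back from the EOD date
# collecting business days, skipping weekends and region holidays.
# The holiday set here is hypothetical placeholder data.
import datetime as dt

def history_dates(eod_date, days, holidays=frozenset()):
    """Return the most recent `days` business days, newest first."""
    dates, d = [], eod_date
    while len(dates) < days:
        if d.weekday() < 5 and d not in holidays:
            dates.append(d)
        d -= dt.timedelta(days=1)
    return dates

window = history_dates(dt.date(2024, 6, 28), 550)
print(window[0], window[-1], len(window))
```

If both the legacy system and Kakapo Risk derive their windows from the same calendar and EOD date, the resulting risk vectors line up date-for-date and can be aggregated safely.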
Monte Carlo with a VaR run
Despite the pricing of the average rate/average strike options being complex and requiring Monte Carlo simulation, BoundaryRider is able to perform these calculations for a fully recalculated and attributed VaR run in just a few minutes. While BoundaryRider does scale well across large clusters, in this case a minimal Microsoft HPC implementation of just one head socket and three worker sockets was required.
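The payoff that forces simulation here is path dependent: it pays on the average of the underlying over the fixing schedule, so no closed-form reprice is available for each of the 550 historical scenarios. A generic sketch of such a pricer (a GBM arithmetic-average-rate call with equally spaced fixings, not BoundaryRider's actual model) looks like this:

```python
# Minimal sketch of Monte Carlo pricing for an average-rate (Asian) call.
# Assumptions: geometric Brownian motion, arithmetic averaging over
# equally spaced fixings. This is a generic textbook pricer, not
# BoundaryRider's implementation.
import numpy as np

def average_rate_call_mc(spot, strike, rate, vol, expiry,
                         n_fixings, n_paths, seed=0):
    rng = np.random.default_rng(seed)
    dt = expiry / n_fixings
    # One normal draw per fixing per path.
    z = rng.standard_normal((n_paths, n_fixings))
    increments = (rate - 0.5 * vol**2) * dt + vol * np.sqrt(dt) * z
    paths = spot * np.exp(np.cumsum(increments, axis=1))
    averages = paths.mean(axis=1)                # arithmetic average rate
    payoffs = np.maximum(averages - strike, 0.0) # call on the average
    return float(np.exp(-rate * expiry) * payoffs.mean())

price = average_rate_call_mc(100, 100, 0.03, 0.25, 1.0,
                             n_fixings=12, n_paths=50_000)
print(round(price, 2))
```

Multiplying one such pricing by every trade and every historical scenario is what makes the cluster necessary; the paths are independent, so the work parallelises cleanly across HPC nodes.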
Stress Test support
Features of Kakapo Risk's market data assignment engine were used to support the required stress tests. A specific stress test engine was written for the client that allows the client to configure and expand the tests as required.
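Making the tests client-configurable comes down to treating shocks as data rather than code. The scenario names and shock shapes below are hypothetical, not the delivered engine's format:

```python
# Sketch of a configurable stress test: scenarios are plain data, so
# the client can add or change tests without code changes. Names and
# shock parameters here are invented for illustration.
SCENARIOS = {
    "EquityCrash": {"spot_mult": 0.80, "vol_add": 0.10},  # -20% spot, +10 vol pts
    "VolSpike":    {"spot_mult": 1.00, "vol_add": 0.15},
}

def apply_stress(market, scenario):
    """Return a shocked copy of the market data for one scenario."""
    shock = SCENARIOS[scenario]
    return {
        "spot": market["spot"] * shock["spot_mult"],
        "vol": market["vol"] + shock["vol_add"],
    }

stressed = apply_stress({"spot": 100.0, "vol": 0.20}, "EquityCrash")
print(stressed["spot"], round(stressed["vol"], 2))
```

Equity extensions to firm-wide tests reuse the shared scenario definitions so their results aggregate with other asset classes, while equities-only tests simply add entries to the scenario table.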
Customer maintained data import and export
Kakapo Risk Import and Export process plug-ins were written for the client and handed over as part of the implementation. The client can change these and redeploy as they see fit.