25 Nov 2016

By Marc Henrard


Over the last few years, the pricing of vanilla derivatives has become more complex than ever before. This is the result of widening basis spreads, in particular OIS vs. LIBOR, and the generalisation of Variation and Initial Margin. At the same time, the market has moved towards increased standardisation.

In this new framework it is more important than ever to have a flexible and generic approach to curve calibration, derivatives pricing and sensitivity computations. The first production release of our open source analytics library, Strata 1.0, was made available to the public on 15 July 2016. The second release, Strata 1.1, followed on 7 October 2016. One of the key strengths of Strata is its support for multi-curve pricing and risk calculations.

Over the coming months, I will publish a number of posts that detail some examples (and corresponding code) of how Strata can be used for OTC derivatives valuation and risk. This blog series is for quantitative analysts, risk managers, model validators and regulators who want to access tools that provide the required flexibility and precision in an easy-to-access environment.

Each of the blog posts will describe, in financial terms, the analysis that I have performed and the results obtained using Strata. I will make the code available for each example in our Strata examples project so that the readers can easily reproduce the analysis. If you have any questions or comments, please feel free to get in touch below or contact me directly at marc@opengamma.com.

In the first instalment, I will describe how to calibrate curves and compute PV and bucketed PV01 for swaps. The curve calibration will start from curve descriptions and data in simple CSV files and will cover the computation of ‘Jacobian’ matrices used later for sensitivity computations. The calibrated curves will be used to compute present values and bucketed PV01, also called bucketed delta or key rate duration. I will also comment on the efficiency obtained from the Algorithmic Differentiation used throughout the library. This instalment will be a little more technical than future ones as we will go through the code used to run the analysis. However, the step-by-step method should help non-technical users understand the process. Subsequent posts will focus more on the finance and risk management side.

In the forthcoming instalments, we will discuss:

  • Interpolation and risk
  • Foreign currency collateral valuation
  • Historical analysis: swap PV and hedges

Instalment 1: Curve calibration and bucketed PV01

Curve calibration

Curve calibration is a mechanism that requires a lot of fine-tuning and precision, and at the same time is normally based on standard instruments and conventions. One would like to have a lot of flexibility and at the same time be able to create standard curves very easily. Luckily, Strata has been built with those two, almost contradictory, goals in mind.

Let’s start with the standard part. Many indices, conventions, and calendars in the main markets are built in. These standard objects can be accessed easily by name, which is just a simple string. For example, if you want to access the calendar for good business days in London, knowing its code GBLO is enough; to access the GBP LIBOR 6M index and its convention, knowing its name GBP-LIBOR-6M is enough; to access the standard market conventions for the GBP IRS with LIBOR 6M floating index, knowing its name GBP-FIXED-6M-LIBOR-6M is enough.

If you have read the paragraph above, you know almost enough to be able to calibrate curves in Strata.

Before going to the calibration, let’s discuss the flexibility side. The lists of built-in objects described above can be extended easily by the user. If you don’t like the list of holidays predefined in GBLO – perhaps a royal wedding needs to be added, or you have your own source of data – you can create your own calendar and use it in the same way as the built-in ones. If you want to create a new GBP index – maybe a potential replacement for LIBOR – you can add this to the library without writing a single line of code, simply by adding a single CSV file.

But enough of the flexibility for now. Let’s look at the calibration of a standard multi-curve framework with standard instruments in GBP. The description of a multi-curve starts by specifying which curves you have and what each of them is used for. In our simple case, we would like to have two curves: one for discounting in GBP and for forwards related to SONIA, and the other one for forwards related to GBP LIBOR 6M. Suppose that we give our multi-curve framework the name GBP-DSCONOIS-L6IRS, and our curve names are GBP-DSCON-OIS and GBP-LIBOR6M-IRS respectively. How do we pass this information to Strata? We simply create a CSV file with exactly this information. The file is:


Group Name,Curve Type,Reference,Curve Name
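For illustration, using the column layout above and the names from the text, the group file could contain rows such as these (treat them as a sketch of the format rather than the exact file used for this post):

```csv
Group Name,Curve Type,Reference,Curve Name
GBP-DSCONOIS-L6IRS,discount,GBP,GBP-DSCON-OIS
GBP-DSCONOIS-L6IRS,forward,GBP-SONIA,GBP-DSCON-OIS
GBP-DSCONOIS-L6IRS,forward,GBP-LIBOR-6M,GBP-LIBOR6M-IRS
```

Note how the same curve, GBP-DSCON-OIS, serves both for discounting and for SONIA forwards, as described above.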

Next, we want to describe how the curve should be interpolated. Should the interpolation be on the zero rates or on the discount factors, and which interpolator should be used? We simply create a file with this information:

Curve Name,Value Type,Day Count,Interpolator,Left Extrapolator,Right Extrapolator
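As an illustrative sketch — the particular choices here are hypothetical, not necessarily those used for the results in this post — a settings file interpolating linearly on the zero rates could look like:

```csv
Curve Name,Value Type,Day Count,Interpolator,Left Extrapolator,Right Extrapolator
GBP-DSCON-OIS,Zero,Act/365F,Linear,Flat,Flat
GBP-LIBOR6M-IRS,Zero,Act/365F,Linear,Flat,Flat
```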

Finally, you want to describe each node of the curve in an easy way. This is where the built-in conventions will make your life very easy. A description of the curves could look like:

Curve Name,Label,Symbology,Ticker,Field Name,Type,Convention,Time,Spread

For each curve, there is a corresponding list of nodes. Each node has a name (for result display purposes) and a link to the data to be used. The data to be used is composed of a ‘Symbology’, a ‘Ticker’ and a ‘Field Name’. Here I have used ‘OG-Ticker’ (for OpenGamma, not ‘Olympic Games’), but these can be replaced with ‘Bloomberg-Ticker’ or ‘MyBankProprietaryName’ depending on the user’s preference. The ‘Ticker’ itself could be ‘BPSW10’ or again your own internal naming convention. For the ‘Field Name’ I have used ‘MarketValue’, but you could choose something like ‘Bid’, ‘Ask’ or ‘LastQuote’. Next, you specify the type of instrument. Here we restrict ourselves to DEP (term deposit), OIS (fixed vs. overnight swap), FIX (Ibor fixing), FRA (Forward Rate Agreement) and IRS (fixed vs. Ibor swaps). Other possibilities include IFU (Futures), BAS (Basis swaps), ONI (Overnight vs. Ibor swaps), and these will be discussed in future instalments. The remaining columns are the ‘convention’, the ‘time’ and an optional ‘spread’. The convention is one of the built-in or user-defined conventions; the type of convention depends on the specified instrument type. The ‘time’ is the tenor for the instruments used above. More complex times can be created to describe FRAs and futures.
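To make the columns concrete, here are two hypothetical node rows — the tickers are invented, while the conventions are standard built-in names of the kind discussed above:

```csv
Curve Name,Label,Symbology,Ticker,Field Name,Type,Convention,Time,Spread
GBP-DSCON-OIS,OIS-1Y,OG-Ticker,GBP-OIS-1Y,MarketValue,OIS,GBP-FIXED-1Y-SONIA-OIS,1Y,
GBP-LIBOR6M-IRS,IRS-10Y,OG-Ticker,GBP-IRS-10Y,MarketValue,IRS,GBP-FIXED-6M-LIBOR-6M,10Y,
```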

The last ingredient for curve calibration is the actual data. In this example, we calibrate the curve as of 1 August 2016.

Valuation Date,Symbology,Ticker,Field Name,Value
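For illustration, a quotes file could contain rows like the following — the tickers and values are hypothetical placeholders, not the market data used for the results in this post:

```csv
Valuation Date,Symbology,Ticker,Field Name,Value
2016-08-01,OG-Ticker,GBP-OIS-1Y,MarketValue,0.0025
2016-08-01,OG-Ticker,GBP-IRS-10Y,MarketValue,0.0080
```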

I believe that the file names speak for themselves.

We have done the hard work by collecting the data and deciding which nodes we want to use in our curves. We are now four lines of code away from having the curves.

First, load the data from the configuration files:

Map<CurveGroupName, CurveGroupDefinition> configs = RatesCalibrationCsvLoader.load(GROUP_RESOURCE, SETTINGS_RESOURCE, NODES_RESOURCE);

Second, load the data:

Map<QuoteId, Double> MAP_MQ = QuotesCsvLoader.load(VALUATION_DATE, ResourceLocator.of(QUOTES_FILE));

Third, collect the data in the right format:

ImmutableMarketData MARKET_QUOTES = ImmutableMarketData.builder(VALUATION_DATE).values(MAP_MQ).build();

And finally, calibrate the curves:

RatesProvider multicurve = CALIBRATOR.calibrate(configs.get(CONFIG_NAME), MARKET_QUOTES, REF_DATA);

Of the four lines of code, only the last one, where the actual calibration takes place, requires some explanation. I will not go through the theory of the multi-curve framework and its calibration; this is presented in other places. If I may, I would recommend Chapter 5 of my book [Henrard (2014)].

The only ingredient useful in this discussion is that, in the calibration procedure, we solve a root-finding problem to obtain the parameters of the interpolated curves such that, for each of the instruments describing the nodes,

PV(Instrument) = 0.

This root-finding algorithm is run for all the curves simultaneously (footnote 1). Requiring each PV to be 0 means that the calibrated curves exactly reproduce the market price of every benchmark instrument. In the curve calibration process, not only are the curves themselves calibrated and stored but so are the Jacobians or transition matrices. I will describe what those matrices are and why we care about them below.
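To make the PV(Instrument) = 0 condition concrete, here is a toy, single-curve sketch in Python — deliberately simplified and unrelated to Strata's implementation — that bootstraps discount factors so that each annual-pay par swap prices back to zero at its node:

```python
# Toy illustration of the PV(Instrument) = 0 calibration condition.
# Instruments: receive-fixed par swaps with annual payments, unit notional.

def swap_pv(par_rate, dfs):
    # fixed leg = rate * annuity (sum of discount factors),
    # floating leg = 1 - P(T); PV of receive-fixed = fixed - float
    return par_rate * sum(dfs) + dfs[-1] - 1.0

def calibrate(par_rates, tol=1e-12):
    dfs = []  # discount factors at the 1Y, 2Y, ... nodes
    for rate in par_rates:
        lo, hi = 0.0, 2.0      # bracket for the new node's discount factor
        while hi - lo > tol:   # bisection on PV = 0
            mid = 0.5 * (lo + hi)
            if swap_pv(rate, dfs + [mid]) > 0.0:
                hi = mid       # PV is increasing in the discount factor
            else:
                lo = mid
        dfs.append(0.5 * (lo + hi))
    return dfs
```

Strata instead solves for all curves and all nodes simultaneously with a multi-dimensional root-finder, but the repricing condition at each node is the same.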

Present value and PV01

The first application of the calibrated curve that we can think of is to compute the present value of a swap that we have traded. From the rate of the benchmark swaps (and the interpolator) we want to infer the value of a non-benchmark instrument.

For that, we need first to create a swap. Here again, our OpenGamma conventions come in handy. We take a standard convention, specify the trade date, the period to start, tenor, side, notional and coupon, and we are done:

ResolvedSwapTrade swap = GBP_FIXED_6M_LIBOR_6M.createTrade(
    tradeDate, Period.ofMonths(3), Tenor.TENOR_7Y, BuySell.BUY, notional, fixedRate, REF_DATA)
    .resolve(REF_DATA);

We want to compute the present value of this swap using the multi-curve we have calibrated above:

MultiCurrencyAmount pv = PRICER_SWAP.presentValue(swap, multicurve);

The output is a MultiCurrencyAmount object, designed to handle the cross-currency swaps that we will discuss in a later instalment.

Immediately after the present value, the next thing that a trader or a risk manager will want is the bucketed PV01, also called bucketed delta, rate sensitivities, or key rate duration. Before talking about the code needed to obtain this, we have to define what we mean by ‘bucketed PV01’ as there are three versions of it available in Strata.

The three different sensitivities available are: point sensitivity, parameter sensitivity, and market quote sensitivity. The point sensitivity is the sensitivity, in the sense of the mathematical partial derivative, of the present value with respect to each zero rate of the discounting curves and each forward rate of the forward curves. By each zero rate or forward rate, we mean that every single date on which a cash flow or a fixing takes place is represented in the sensitivity; the sensitivities are not only to the curve nodes but to each single date in the instrument. The main use of this type of information is to view the daily fixing sensitivities and spot dates with large fixing impacts.

An example of point sensitivities for a one-year GBP swap is displayed below. There are four cash flows (two on each leg), producing four zero rate sensitivities. There are also two forward Ibor rate sensitivities.

ZeroRateSensitivity{curveCurrency=GBP, yearFraction=0.5808219178082191, currency=GBP, sensitivity=71940.4415752365}
ZeroRateSensitivity{curveCurrency=GBP, yearFraction=1.084931506849315, currency=GBP, sensitivity=136526.2295146616}
IborRateSensitivity{observation=IborIndexObservation{index=GBP-LIBOR-6M, fixingDate=2016-09-01, effectiveDate=2016-09-01, maturityDate=2017-03-01, yearFraction=0.4958904109589041}, currency=GBP, sensitivity=4954388.900936099}
ZeroRateSensitivity{curveCurrency=GBP, yearFraction=0.5808219178082191, currency=GBP, sensitivity=-16317.31422092133}
IborRateSensitivity{observation=IborIndexObservation{index=GBP-LIBOR-6M, fixingDate=2017-03-01, effectiveDate=2017-03-01, maturityDate=2017-09-01, yearFraction=0.5041095890410959}, currency=GBP, sensitivity=5033542.805338534}
ZeroRateSensitivity{curveCurrency=GBP, yearFraction=1.084931506849315, currency=GBP, sensitivity=-23843.440872413255}

The parameter sensitivity is the sensitivity of the present value to the parameters used internally to represent the curves. The parameters are often the zero-coupon rates or discount factors at the nodes of an interpolated curve.

The market quote sensitivity, also called par rate sensitivity, is the sensitivity of the present value to the raw market quotes used in the curve calibration. Of the three sensitivities, this is the risk figure most used by traders and risk managers.

An example of market quote sensitivities for a swap starting 3 months forward and with a 7-year tenor is provided below.

Each of those sensitivities can be obtained easily in one line of code. The code that was used to produce the results shown above is:

PointSensitivities pts = PRICER_SWAP.presentValueSensitivity(swap, multicurve);
CurrencyParameterSensitivities ps = multicurve.parameterSensitivity(pts);
CurrencyParameterSensitivities mqs = MQC.sensitivity(ps, multicurve);

This is a good time to discuss the computation of Jacobian matrices in the curve calibration. These are the matrices used to transform a parameter sensitivity into a market quote sensitivity. They depend only on the curves and not on the instrument for which the sensitivity is computed. It thus makes sense to compute them only once, and to do so when the curves are calibrated. This is what is done in Strata: when the curves are calibrated, it is possible to request the computation of the Jacobian matrices. If requested, these matrices are computed and stored in the ‘metadata’ associated with the curve. They are then available for later use whenever they are required.
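Schematically, the transformation is just the chain rule. Here is a generic sketch — not the Strata API — where the hypothetical matrix J[i][j] holds the derivative of curve parameter j with respect to market quote i, computed once at calibration time:

```python
# Chain rule: mqs_i = sum_j (dPV / dp_j) * (dp_j / dq_i)

def market_quote_sensitivity(param_sens, jacobian):
    # param_sens[j] = dPV / dp_j; jacobian[i][j] = dp_j / dq_i
    return [sum(dp_dq * dpv_dp for dp_dq, dpv_dp in zip(row, param_sens))
            for row in jacobian]

# Hypothetical 2-node curve: bumping quote 1 moves parameter 1 directly and,
# through the calibration, also drags parameter 2 by half as much.
jac = [[1.0, 0.5],
       [0.0, 1.0]]
mqs = market_quote_sensitivity([100.0, 50.0], jac)  # -> [125.0, 50.0]
```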


As we are discussing the computation of sensitivities, we should add that all sensitivities in Strata are computed using Algorithmic Differentiation (AD). The theory of this computer science technique is described in Naumann (2012). For this blog it is sufficient to know that the technique produces derivatives by analytical computation, and that it is very efficient. Analytical computation means that it avoids numerical methods like finite differences, producing sensitivities that are more precise and free of the numerical instability that comes with bumping. The best way to discuss its efficiency is through a couple of examples.
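To illustrate the precision point, here is a minimal forward-mode AD sketch using dual numbers on a toy discounting example. This is a generic illustration of the technique, not Strata's internal implementation:

```python
import math

class Dual:
    """Number carrying a value and its derivative through arithmetic."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def exp(x):
    e = math.exp(x.val)
    return Dual(e, e * x.der)  # chain rule for the exponential

def pv(r, t=5.0, notional=1_000_000.0):
    # toy present value: a notional discounted at rate r for t years
    return notional * exp(-t * r)

ad = pv(Dual(0.02, 1.0)).der            # derivative carried along by AD
h = 1e-6                                # derivative by bumping the rate
fd = (pv(Dual(0.02 + h)).val - pv(Dual(0.02 - h)).val) / (2 * h)
exact = -5.0 * 1_000_000.0 * math.exp(-0.1)
```

The dual-number derivative agrees with the analytic one to machine precision, while the bumped estimate carries truncation and cancellation error — and each extra input would require an extra full valuation.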

We first calibrate a set of curves and compute the present value of 100 swaps. The elapsed wall-clock time for this is recorded. In the example above, there are 29 data inputs for the curve calibration. If we were to use the standard bump-and-recompute technique, we would redo this process 29 extra times with small changes in the inputs, requiring roughly 30 times the CPU time.

In my example, on standard desktop hardware and using only a single thread, the time taken for the multi-curve calibration and the computation of 100 PVs is 9.98 ms (footnote 2). Computing the market quote bucketed PV01 by finite difference would take 30×9.98ms = 299.4 ms (footnote 3). If we compute the (three versions of) sensitivities using AD, we have a computation time of 15.37 ms. This is roughly 20 times faster than a simple bump-and-recompute.

You might argue that the example with only 100 swaps is not realistic. Running the example instead with 2,000 swaps (tenors between 1Y and 20Y and 100 different coupons for each tenor) took 32.46 ms to calibrate the curves and calculate PV. Extending this to include sensitivities computed by AD took 162.88 ms, whereas obtaining the same values by finite difference would have taken 973.8 ms. In this example algorithmic differentiation is roughly 6 times faster than finite difference. We will show in forthcoming blogs that this improved performance is even better when there are more nodes on the curves.
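The speed-up figures quoted above follow directly from the measured times; a quick arithmetic check:

```python
# 29 bumped inputs plus the base run = 30 full reprices for bump-and-recompute
fd_100 = 30 * 9.98                 # ms: finite difference, 100-swap example
fd_2000 = 30 * 32.46               # ms: finite difference, 2,000-swap example
speedup_100 = fd_100 / 15.37       # AD took 15.37 ms  -> ~19.5x
speedup_2000 = fd_2000 / 162.88    # AD took 162.88 ms -> ~6.0x
```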

This means any trader, portfolio manager or risk manager could compute the risk of this portfolio in near real-time using only his or her desktop computer.

We will increase the complexity of the portfolio in a later instalment of the blog.


Strata offers a very easy API to calibrate interest rate curves, create trades and compute risk measures. As Algorithmic Differentiation is implemented in the code, the calibration and computation of sensitivities is extremely efficient.

That’s all for today.

The code used for this blog is available in the Strata repository: strata-examples in the package ‘com.opengamma.strata.examples.blog.stratamulticurve1’

The numbers can also be obtained in Excel using the StrataExcel add-in. Don’t hesitate to contact us for a demo.

In the next instalment of this blog series, we will discuss the impact of the curve interpolator on the curve smoothness and on the bucketed PV01.

How we can help

We have long experience in interest rate derivatives valuation, both from a financial engineering perspective, and from a technology perspective. We help banks, hedge funds, asset managers and clearing houses to quickly start or improve their risk management practice for swap trading.

Examples of engagements:

Client: Hedge fund
Issue: Specialized in futures trading, would like to extend the trading universe to IRS.
Solution: Create tools for their quant and risk managers to analyse pricing and hedging. Historical analysis of strategy P/L and hedging behavior.

Client: Clearing house
Issue: Clears single-currency swaps and would like to expand its clearing offering to cross-currency swaps.
Solution: Build Excel-based tools to price and risk manage cross-currency swaps. Analyse the relevant regulation for bilateral margin, which is a competitor to clearing, and compare with standard bilateral margin methodologies.

Client: Asset manager – pension fund
Issue: Trade assets in foreign currency and swap them to their base currency; need tools to value multi-currency swap books with collateral in base currency cash.
Solution: Create multi-currency swap valuation based on cross-currency swap between base currency and other currencies.

Client: Bank ALM
Issue: Swap book requiring multi-curve valuation, collateral management and VaR.
Solution: Comparison of methodologies, benchmarking of their VaR methodology against an alternative approach. Historical analysis of their full swap book, based on their existing trade booking system with a thin layer of OpenGamma software; the historical analysis runs on a quarterly basis.


  1. It is possible to split the large root-finding process into sub-processes to improve speed, but this is beyond the scope of this blog.
  2. The figure is obtained by running the process 100 times in a loop and dividing the total time by 100; otherwise the numbers are too small to be estimated in a meaningful way. The figures were obtained on a three-year-old laptop (MacBook Pro, 2.6 GHz Intel Core i7).
  3. This figure is underestimated, as the calibration process itself is already accelerated by AD. Without any AD, the computation time would be even larger: our estimate without AD in the calibration is 1308.6 ms for calibration, PV and sensitivities, roughly 8 times slower than with AD.


Henrard, M. (2014). Interest rate modelling in the Multi-curve Framework: Foundation, Evolution and Implementation. Palgrave.

Naumann, U. (2012). The Art of Differentiating Computer Programs. An introduction to Algorithmic Differentiation. SIAM.


The Author

Marc Henrard


Marc is Head of Quantitative Research at OpenGamma.



