The post What happens to my exchange traded derivatives (ETD) margin when there’s a big market move? appeared first on OpenGamma.
However, the impact of market moves on margin depends on the type of margin: Initial Margin or Variation Margin. Each is affected by large market moves differently.
Almost all ETD CCPs use the SPAN (or a SPAN-like) methodology for margin calculations. The one exception is Eurex, which has switched to a VaR-based methodology.
Variation Margin is all about getting profit and loss up to date. For some products it is paid in cash, whereas for others (in particular premium paid up front options) it is a contingent amount. If Contingent Variation Margin is a debit (i.e. the market has moved against you) then collateral must be provided to cover this theoretical loss, but if the margin is a credit then it can be used to offset any Initial Margin that may be required.
Initial Margin is a confusing name. For some types of trades it is just an “initial” amount of collateral that needs to be provided before you can trade, which is then held over the lifetime of the trade to cover the market risk. This is the description you tend to be provided if you Google “Initial Margin”.
However, this isn’t normally how Initial Margin is calculated. The purpose of Initial Margin is to ensure that the CCP has sufficient funds to close out any positions it may inherit following a default. On the assumption that the Variation Margin has been collected, the Initial Margin would need to cover any adverse market moves in the expected close out period. For Exchange Traded Derivatives (ETD), this is normally 2 days.
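As a rough illustration of that close-out logic (a generic sketch, not any CCP's actual methodology, and all numbers are hypothetical), Initial Margin can be thought of as a confidence multiplier times the daily volatility scaled to the close-out horizon:

```java
public class InitialMarginSketch {

    /** Illustrative IM: confidence multiplier x daily vol x sqrt(close-out days) x exposure. */
    static double initialMargin(double exposure, double dailyVol, int closeOutDays, double z) {
        return z * dailyVol * Math.sqrt(closeOutDays) * exposure;
    }

    public static void main(String[] args) {
        // Hypothetical numbers: 1m exposure, 1% daily vol, 2-day close-out, 99% one-sided z ~ 2.33
        System.out.printf("Illustrative 2-day IM: %.0f%n",
            initialMargin(1_000_000, 0.01, 2, 2.33)); // prints 32951
    }
}
```

Note that nothing in this formula depends on the current price level, only on the volatility estimate and the position size.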
The Initial Margin algorithms employed by all the CCPs are portfolio based, meaning that they look at all the positions and calculate a margin requirement for the total portfolio, taking into account offsets between different contracts and products. These algorithms are either scenario based (like SPAN) or VaR based (like Eurex Prisma).
Assume you just have a long position in futures for a single product. The CCP will calculate the potential Variation and Initial Margin at the end of day and make sure that there is sufficient collateral. If not, then additional cash will be requested.
The Variation Margin part of the calculation is easy. It will be the implied profit or loss based on the change in price from the previous close – if you are on the right side of the market it will be a credit, and if not it will be a debit.
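A minimal sketch of this Variation Margin calculation for a futures position (the prices and position are hypothetical; the USD 50 point value is the standard E-mini S&P multiplier):

```java
public class VariationMarginSketch {

    /** Daily VM for a futures position: price change x contract multiplier x signed lots. */
    static double variationMargin(double prevSettle, double settle, double multiplier, int lots) {
        return (settle - prevSettle) * multiplier * lots;
    }

    public static void main(String[] args) {
        // Hypothetical: long 100 E-mini S&P futures, price falls 110 points -> a debit
        System.out.println(variationMargin(2760.0, 2650.0, 50.0, 100)); // prints -550000.0
    }
}
```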
But what about the Initial Margin? Intuition would say that it is bound to increase because of the increase in volatility. However, if you haven’t changed your position, it is most likely to remain the same.
SPAN uses scenarios that are based on fixed price moves in the futures. So the calculation will be the same as the night before. Any potential increase in the scenario levels will need to be reviewed before it is applied and this process takes time. Usually the SPAN scenarios are only reviewed on a monthly basis, although an emergency review is possible.
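The core of the scanning idea can be sketched as follows – a deliberately simplified version of SPAN scanning risk, not the full algorithm, with hypothetical scenario fractions. Because the scenarios are fixed price moves, the result depends only on the scanning range and the position size, not on the current price level:

```java
public class SpanScanSketch {

    /** Worst loss over fixed up/down price-move scenarios (simplified scanning risk). */
    static double scanningRisk(double scanRangePerLot, int lots, double[] scenarioFractions) {
        double worst = 0.0;
        for (double f : scenarioFractions) {
            double pnl = f * scanRangePerLot * lots; // linear futures P&L under the move
            worst = Math.min(worst, pnl);
        }
        return -worst; // margin is the magnitude of the worst loss
    }

    public static void main(String[] args) {
        // Hypothetical scenario set: fractions of the scanning range, up and down
        double[] fractions = {-1.0, -2.0 / 3.0, -1.0 / 3.0, 0.0, 1.0 / 3.0, 2.0 / 3.0, 1.0};
        // Long 100 lots with a scanning range of GBP 3,230 per lot
        System.out.println(scanningRisk(3230.0, 100, fractions)); // prints 323000.0
    }
}
```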
If you happened to have a more complex portfolio which included offsets between different products, the margin still wouldn’t change, because the offsets given within the SPAN algorithm, and hence the savings, are based on the number of lots of each contract and not their relative values. Changing these ratios also depends on a review of the SPAN parameters.
Sufficient collateral will already have been posted to cover the Initial Margin as it hasn’t changed. So any intra-day margin call will be based on the change in Variation Margin.
The same isn’t quite true for options, as these are non-linear instruments. But as the parameters aren’t varying, any change in Initial Margin will depend on how much the delta of the options changes. If your portfolio contains options close to maturity, this could be a large swing.
The result will be slightly different if the Initial Margin algorithm is VaR based, but not much.
The intra-day price move could be included as a new scenario in the VaR calculation. As Filtered Historical VaR tends to be the chosen methodology, the increased volatility will be used to scale the scenarios to current volatility. In addition, if the scenarios used are relative rather than absolute, and the market move is down, you may actually find that your Initial Margin decreases, because it is now applying various percentage moves to a lower base price.
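A toy illustration of that last point: the same relative scenarios applied to a lower base price produce a smaller worst loss. This is a deliberately simplified stand-in for a filtered historical VaR engine, with made-up returns and a unit volatility scaling:

```java
public class FilteredHistVarSketch {

    /** Illustrative filtered historical VaR: relative returns, scaled to current
        volatility, are applied to the current base price; VaR is the worst loss. */
    static double varEstimate(double basePrice, double[] histReturns, double volScale,
            int lots, double multiplier) {
        double worst = 0.0;
        for (double r : histReturns) {
            double pnl = basePrice * r * volScale * multiplier * lots;
            worst = Math.min(worst, pnl);
        }
        return -worst;
    }

    public static void main(String[] args) {
        double[] returns = {-0.02, -0.01, 0.005, 0.015}; // hypothetical relative scenarios
        // Same scenarios, lower base price after a sell-off -> smaller VaR
        System.out.println(varEstimate(7400.0, returns, 1.0, 100, 10.0)); // prints 148000.0
        System.out.println(varEstimate(7218.5, returns, 1.0, 100, 10.0)); // prints 144370.0
    }
}
```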
On the assumption that the volatility in the market hasn’t encouraged you to trade and therefore change your portfolio, then the margin call you are receiving following a large price move is almost entirely down to Variation Margin.
The level of your Initial Margin is likely to be very similar, if not the same, as it was the day before.
You may even find that if you have been asked for additional funds intra-day, some of this may be returned at the end of the day. This is because markets often bounce back, so part of a large market move is recovered by the end of day. If the intra-day margin was determined at the peak of the market move, the Variation Margin at that point will show a greater loss than that calculated at the end of day – hence funds being returned.
5th February 2018 was a particularly volatile day on the US stock markets. The Dow ended the day more than 1000 points lower in its biggest points drop ever and the S&P 500 finished more than 4% down, amid fears over interest rate rises.
After big moves overnight in the Far East, on 6th February the volatility continued on the European stock markets, with the FTSE 100 ending 2.6% lower and the Dax shedding around 2.4%.
The examples below show the margin impact for the E-mini S&P contracts cleared by CME and FTSE 100 Index Futures cleared by ICE Clear Europe.
Assume a position of long 100 E-mini S&P March Futures. The SPAN Initial Margin and value has been calculated for end of day 2nd February, intra-day 5th February and end of day 6th February, using the SPAN parameter files published by CME.
If 100 long 2700 March Options are added to the portfolio, the results are as follows. It can be seen that the Initial Margin stays consistent. The intra-day parameter file shows the start of the downward move in the S&P.
Between end of day 2nd and end of day 5th the delta of the option went from 0.6583 to 0.33. This move can be clearly seen in the change in Initial Margin and Equity Value.
Now consider 100 Long March FTSE Index Futures. The SPAN Scanning Range for these contracts was £3,230 per lot for both 5th and 6th February. This is an implied move in the index of 323 points.
As the Scanning Range did not change, the Initial Margin stayed constant at £323,000.
The settlement prices were as follows:
This is a change in price of 181.5 points, which would result in an end of day Variation Margin of -£181,500. ICE would have requested around this amount of additional margin during the day when the FTSE hit these levels. This could then be used to cover the Variation Margin call at end of day, but no additional funds would be requested to cover Initial Margin.
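The quoted figures can be checked with the standard FTSE 100 future point value of £10 per index point per lot:

```java
public class FtseMarginCheck {

    /** SPAN Initial Margin for a futures-only position: scanning range x lots. */
    static double initialMargin(double scanRangePerLot, int lots) {
        return scanRangePerLot * lots;
    }

    /** Variation Margin: price change in points x point value x lots. */
    static double variationMargin(double priceChangePoints, double pointValue, int lots) {
        return priceChangePoints * pointValue * lots;
    }

    public static void main(String[] args) {
        double pointValue = 10.0; // GBP 10 per index point per lot (FTSE 100 future)
        System.out.printf("IM: %.0f, VM: %.0f%n",
            initialMargin(3230.0, 100),                // scanning range GBP 3,230 per lot
            variationMargin(-181.5, pointValue, 100)); // 181.5-point fall, long 100 lots
        // prints: IM: 323000, VM: -181500
    }
}
```

The £3,230 scanning range divided by the £10 point value also recovers the 323-point implied index move mentioned above.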
Contrary to common expectation, under the SPAN methodology Initial Margin is insensitive to market volatility on a day-to-day basis.
The post The critical steps to better clearing broker services at lower cost appeared first on OpenGamma.
Interested in finding out more ways to lower your clearing broker costs? Contact OpenGamma
The post Elephant in the (Swap Trading) Room: Exposing dealer trading costs for cleared and uncleared swaps appeared first on OpenGamma.
Are you a Cat 3, Cat 4 or exempted entity as defined under EMIR? If so, have you evaluated the commercial merits of voluntary clearing vs bilateral OTC?
Assessing the impact of these costs against today’s backdrop of ongoing regulatory, political and market infrastructure fragmentation is not straightforward. In this blog, we shed light on the complexities of these costs and the hidden benefits that arise from certain trading scenarios.
Our analysis highlights that when the costs of bilateral OTC are fully accounted for, cleared swaps offer significant capital and MVA savings for dealers, which should translate into tighter bid/offer spreads. More precisely, the bid/offer spread for the uncleared 10yr USD vanilla IR Swap held to maturity used in this analysis is approximately 2 bps greater than when it is cleared through a CCP with an established liquidity pool.
Our study looks at the trading costs for cleared and uncleared swaps from the counterparty perspective in various trading scenarios, highlighting to the buy-side the sell-side drivers of the all-in risk price. We quantify the direct cost to the dealer for both margin and capital, arriving at the true theoretical differential (that dealers could, and should, be charging) between cleared and uncleared swap execution spreads.
For OTC uncleared, two cost scenarios are examined. One where Variation Margin (VM) is paid (mandatory since March 2017) and one where VM as well as Initial Margin (IM) are paid (mandatory for largest players and to be extended to more participants in the coming years). For full info on BCBS IOSCO’s framework on margin requirements for non-centrally cleared derivatives click here.
As illustrated in the chart below, from the buy-side perspective the choice in terms of cost between cleared and uncleared swaps trading appears to be obvious – bilateral is cheaper than cleared – as bilateral-VM has no associated cost at all, bilateral-VM-IM carries an MVA cost, and cleared swaps attract both MVA and fee costs.
This chart shows the costs associated with a 10Y vanilla IRS trade from a buy-side perspective. The costs are expressed in terms of the swap PV01.
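One way to read "expressed in terms of the swap PV01": a running spread in basis points converts to an upfront amount by multiplying by the swap's PV01 (the value of one basis point per annum). The numbers below are hypothetical, purely to illustrate the conversion:

```java
public class SpreadCostSketch {

    /** Upfront cost of a running spread, using the swap's PV01 (value of 1bp per annum). */
    static double upfrontCost(double spreadBps, double pv01) {
        return spreadBps * pv01;
    }

    public static void main(String[] args) {
        // Hypothetical: a 10Y swap with PV01 of 90,000 per bp; a 2bp wider bid/offer
        System.out.printf("%.0f%n", upfrontCost(2.0, 90_000.0)); // prints 180000
    }
}
```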
However, this view is a simplistic assessment that ignores the array of dealer expenses that should, in theory, be passed on to clients in the form of adjusted bid/offer spreads.
If we carry out the analysis from the dealer perspective, we see a more complete picture. In the most extreme example, where the dealer incurs the highest possible costs (e.g. it cannot achieve internal portfolio netting when pricing a new client trade), bid/offer spreads would naturally widen as a direct consequence of the various component costs – and bilateral trading is still cheaper than cleared. The chart below illustrates this scenario:
This chart depicts the sizes and sources of costs associated with a 10Y vanilla IRS trade, from a broker perspective.
However, though insightful, this standalone basis does not reflect the norm for most trading scenarios today. When effective netting between different clients is taken into account by the broker, a very different story unfolds: bilateral trading costs become vastly higher than cleared. The cleared case results in almost no cost on the broker side, and costs on the clearing broker side are covered by the clearing fees. This scenario is illustrated in the chart below:
This chart shows from the broker perspective the cost of trading, when effective netting is taken into consideration.
When looked at from the buy-side perspective, the following is exposed:
This chart shows from the buy-side perspective, the hidden costs of trading cleared and uncleared swaps.
This analysis shows that clearing in comparison with uncleared swaps offers the potential of significantly more balance sheet and cost-efficient trading of OTC swaps for dealers and, in turn, for the buy-side.
To conclude, comparison of the costs of trading cleared and uncleared swaps is complex, requiring analysis of both direct and indirect costs. Our analysis shows that the indirect impact on the dealer margin and balance sheet costs can be significant for bilateral trades, creating a strong incentive for dealers to move more business into clearing.
Buy-side swap users that are able to quantify these impacts on an ongoing basis will ensure that they make optimal decisions as to when it becomes commercially appropriate to begin voluntary CCP clearing.
If you would like further insights into your costs of trading, contact us at healthcheck@opengamma.com
OpenGamma is the financial technology company at the heart of the derivatives market, delivering unique analytics that marry buy-side needs with sell-side constraints.
The power of the technology is based upon unrivalled insight into the supply and cost of balance sheet, allowing the buy-side not only to execute on their derivatives strategy, but to enhance their own performance and benchmark their costs against peers.
The post Strata and multi-curve: Interpolation and risk appeared first on OpenGamma.
An important point in any discussion of interest rate curve interpolation – before the question of how to interpolate – is the question of what to interpolate. I do not plan to discuss that issue in this blog, but you can find my rant on the subject in Section 4.1 of my book on the multi-curve framework (Henrard (2014)). In this blog, I have decided to interpolate on the zero rate (continuously compounded), which is probably the most widely used approach.
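For readers who want the mechanics: with continuously compounded zero rates, the discount factor is df(t) = exp(-r(t)·t), and interpolating on the zero rate means interpolating r between curve nodes. A minimal sketch, with hypothetical nodes:

```java
public class ZeroRateInterp {

    /** Linear interpolation of continuously compounded zero rates between two nodes. */
    static double zeroRate(double t, double t1, double r1, double t2, double r2) {
        double w = (t - t1) / (t2 - t1);
        return (1 - w) * r1 + w * r2;
    }

    /** Discount factor from a continuously compounded zero rate. */
    static double discountFactor(double t, double r) {
        return Math.exp(-r * t);
    }

    public static void main(String[] args) {
        // Hypothetical nodes: 1% at 1Y, 2% at 2Y; interpolate the 18M zero rate
        double r = zeroRate(1.5, 1.0, 0.01, 2.0, 0.02); // 1.5%
        System.out.println(discountFactor(1.5, r));
    }
}
```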
We start with a graphical representation of the impact of interpolation. For this I use the same data as in the previous blog, calibrating two curves in GBP to realistic market data from 1-Aug-2016. One curve is the OIS (SONIA) curve and the other is the forward LIBOR 6M curve. The nodes of the curves are those prescribed by the FRTB rules (see BCBS (2016)). Similar effects would be visible with roughly any selection of nodes; these were selected for convenience.
Throughout this blog we will use three types of interpolation schemes: linear, double quadratic and natural cubic spline. They were chosen to have different properties; I do not claim that they are the best for all (or even any) purposes. The graph of the forward SONIA rates, daily between the data date and 31 years later, is displayed in Figure 1. Each date is represented by one point.
Figure 1: Overnight forward rates with three different interpolators.
The forward rates for the Libor curves are provided in Figure 2.
Figure 2: Libor 6M forward rates with three different interpolators.
The graphs are like the familiar ones in any note on curve interpolation. Linear interpolation leads, even with very clean data like in this example, to forward rate profiles with a sawtooth effect. Some impacts are very pronounced; in the middle of the graph there is a drop of 40 bps in the forward rate from one day to the next. A similar drop is visible in the Libor graph, but this time over a period of 6 months (one tenor of the underlying Libor). There is also a large difference between the different interpolators. In the case of the overnight rates, the largest difference in forward rate is 39.29 bps, observed for the rate starting on 31-Jul-2046 (30 years forward) between the linear and the double quadratic schemes.
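The jump at nodes can be reproduced with a toy curve. Under linear interpolation of zero rates, the forward rate over [t1, t2] is (r(t2)·t2 − r(t1)·t1)/(t2 − t1), and the change in slope of r at a node produces a jump in the one-day forwards. A sketch with hypothetical nodes (not the FRTB curve from the text):

```java
public class ForwardJumpSketch {

    // Hypothetical zero-rate nodes (times in years, continuously compounded rates)
    static final double[] T = {1.0, 2.0, 5.0};
    static final double[] R = {0.010, 0.020, 0.022};

    /** Linearly interpolated zero rate; flat beyond the last node. */
    static double zero(double t) {
        for (int i = 0; i < T.length - 1; i++) {
            if (t <= T[i + 1]) {
                double w = (t - T[i]) / (T[i + 1] - T[i]);
                return (1 - w) * R[i] + w * R[i + 1];
            }
        }
        return R[R.length - 1];
    }

    /** Forward rate over [t1, t2] implied by the zero curve. */
    static double forward(double t1, double t2) {
        return (zero(t2) * t2 - zero(t1) * t1) / (t2 - t1);
    }

    public static void main(String[] args) {
        double d = 1.0 / 365.0;
        // One-day forwards just before and just after the 2Y node:
        // ~4.00% before vs ~2.13% after, a jump of almost 190 bps
        System.out.println(forward(2.0 - d, 2.0));
        System.out.println(forward(2.0, 2.0 + d));
    }
}
```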
The code used to produce the above graphs is available in the blog example Strata code [link]. The code to produce the curves is roughly 10 lines; the code to export the data to be graphed by Matlab is longer than the calibration code itself.
The interpolation mechanism has an impact not only on the forward rates, and thus on the present value, but also on the implied risk. We now look at this risk, using the bucketed PV01, or key rate duration, as the main risk indicator. For this we take one forward swap with a starting date in 6 months and a tenor of 9 years. The fixed rate is 2.5% (above the market rate). For that swap, the par rate bucketed PV01s are provided in Figures 3 to 5.
Figure 3: PV01 with linear interpolation scheme
Figure 4: PV01 with double quadratic interpolation scheme
Figure 5: PV01 with natural cubic spline interpolation scheme
The first point to notice is that the sum of the bucketed PV01 for both the OIS curve and the Libor curve are almost the same for all interpolation schemes. A parallel curve move results in similar change of PV and there is only a very small effect from the level and the curve shape.
The interesting part is obviously the differences. The double quadratic and natural cubic spline are non-local interpolators: the sensitivity to the cash flows in the 5Y to 10Y period extends beyond the 5Y and 10Y nodes. There is a sensitivity to the 3Y and 15Y points and, in the case of the natural cubic spline (NCS), even to the 20Y and 30Y points. The non-locality can lead to non-intuitive hedging results. In the NCS case, the hedging of the swap with a maturity of 9Y6M is done with swaps up to 10Y, but also with swaps of maturity 15Y, 20Y and 30Y. Note also the “wave” effect, where the hedging alternates between swaps in the opposite direction to the original one and swaps in the same direction.
The code to calibrate the three sets of curves and compute the related par market quote sensitivities – the variable mqs in the code – takes only around 10 lines:
/* Calibrate curves */
ImmutableRatesProvider[] multicurve = new ImmutableRatesProvider[3];
for (int loopconfig = 0; loopconfig < NB_SETTINGS; loopconfig++) {
  multicurve[loopconfig] = CALIBRATOR.calibrate(configs.get(loopconfig).get(CONFIG_NAME), MARKET_QUOTES, REF_DATA);
}
/* Compute PV and bucketed PV01 */
MultiCurrencyAmount[] pv = new MultiCurrencyAmount[NB_SETTINGS];
CurrencyParameterSensitivities[] mqs = new CurrencyParameterSensitivities[NB_SETTINGS];
for (int loopconfig = 0; loopconfig < NB_SETTINGS; loopconfig++) {
  pv[loopconfig] = PRICER_SWAP.presentValue(swap, multicurve[loopconfig]);
  PointSensitivities pts = PRICER_SWAP.presentValueSensitivity(swap, multicurve[loopconfig]);
  CurrencyParameterSensitivities ps = multicurve[loopconfig].parameterSensitivity(pts);
  mqs[loopconfig] = MQC.sensitivity(ps, multicurve[loopconfig]);
}
It is also interesting to look at a “dynamic” version of the above PV01 report. For that I will produce the same report for swaps with starting dates between the data date and two years later, all of them with a tenor of 8 years. The swap maturities will thus be between 8 years and 10 years. Let’s start with the linear interpolation scheme as it displays the profile that most people would probably consider intuitive (at least I do). That profile is displayed in Figure 6. For an 8Y spot starting swap (the left side of the graph), the sensitivity is mainly on the 5Y and 10Y IRS, with more on the 10Y. The main sensitivities are represented by the coloured curves; the other sensitivities (Libor and OIS) are represented by the thin dark grey lines. When we move to the 2Yx8Y swap along the X-axis, we see the sensitivity to the 10Y increase and the sensitivity to the 5Y decrease down to zero. The sensitivity to the 2Y increases in absolute value, but with the opposite sign to the 10Y. The total sensitivities of the swaps are fairly constant.
Figure 6: PV01 profile for linear interpolation scheme.
Figure 7: PV01 profile for double quadratic interpolation scheme.
Let’s jump immediately to the natural cubic spline profile displayed in Figure 8; Figure 7, for double quadratic, is in between the other two. For that profile, I will start the explanation from the right side, with the 2Yx8Y forward swap. The risk is very similar to the one depicted by the linear interpolation: the main sensitivities (2Y and 10Y) are on nodes, and the sensitivities in the two cases are similar. When we move away from the nodes – in our case with the starting date moving down from 2Y to the valuation date and the maturity from 10Y down to 8Y – a different profile appears. Sensitivities are no longer only to the nodes surrounding the swap dates. There are now significant sensitivities to the adjacent nodes (3Y and 15Y for example), with the opposite sign to the main sensitivities and a non-intuitive behaviour. For example, the sensitivity to the 15Y node increases when the maturity of the swap moves away from that node. A significant sensitivity appears on the 2Y when the start date of the swap moves away from the 2Y node, and it is largest in this profile for the spot starting swap.
Figure 8: PV01 profile for natural cubic spline interpolation scheme.
Obviously the sensitivity to different nodes will lead to recommended hedging using swaps with those tenors. The 8Y swap will be hedged with 2Y, 3Y, 5Y, 10Y and 15Y swaps – maybe not the most intuitive approach. It is also possible to easily compute the required notional to hedge each sensitivity; that will be described in a forthcoming instalment of this blog.
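The idea behind the hedge notionals is simple: for each node, the notional of the benchmark swap is the bucketed sensitivity divided by the PV01 per unit notional of that benchmark, with the opposite sign. A sketch with hypothetical numbers (not the Strata calculation):

```java
public class HedgeNotionalSketch {

    /** Hedge notional that offsets a bucketed sensitivity, given the benchmark swap's
        PV01 per unit notional at that node. */
    static double hedgeNotional(double bucketedPv01, double benchmarkPv01PerUnit) {
        return -bucketedPv01 / benchmarkPv01PerUnit;
    }

    public static void main(String[] args) {
        // Hypothetical: +4,500/bp sensitivity at the 10Y node; a 10Y par swap with
        // PV01 of 9.0e-4 per unit notional -> pay fixed on 5m of the 10Y benchmark
        System.out.printf("%.0f%n", hedgeNotional(4500.0, 9.0e-4)); // prints -5000000
    }
}
```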
The goal of this blog was not to provide a definitive answer on which interpolation scheme to use for curve calibration – I don’t think there is such a definitive answer – but to show that a personal analysis is important. The analysis can be done easily with the right tools. The code used to produce Figures 3 to 5 is available in the examples associated with this blog. The main part of the code consists of roughly 10 lines, for calibration and sensitivity computation. The other figures required a loop around that code to produce the sensitivities for roughly 500 swaps (2Y of business days).
In this instalment of the multi-curve blogs, we have provided some examples of the impact of interpolation on forward rates and risk computations. We also showed how easy it is to obtain those results with the Strata code.
Henrard, M. (2014). Interest rate modelling in the Multi-curve Framework: Foundation, Evolution and Implementation. Palgrave.
BCBS (2016). Minimum capital requirements for market risk. Basel Committee on Banking Supervision, January 2016.
The post Strata and multi-curve: Curve calibration and bucketed PV01 appeared first on OpenGamma.
Over the last few years, the pricing of vanilla derivatives has become more complex than ever before. This is the result of the increase in basis spreads, in particular OIS vs. LIBOR, and the generalisation of Variation and Initial Margin. At the same time, the market has moved to increased standardisation.
In this new framework it is more important than ever to have a flexible and generic approach to curve calibration, derivatives pricing and sensitivity computations. The first production release of our open source analytics library, Strata 1.0, was made available to the public on 15 July 2016. The second release, Strata 1.1, was released on 7 October 2016. One of the key strengths of Strata is its support for multi-curve pricing and risk calculations.
Over the coming months, I will publish a number of posts that detail some examples (and corresponding code) of how Strata can be used for OTC derivatives valuation and risk. This blog series is for quantitative analysts, risk managers, model validators and regulators who want to access tools that provide the required flexibility and precision in an easy-to-access environment.
Each of the blog posts will describe, in financial terms, the analysis that I have performed and the results obtained using Strata. I will make the code available for each example in our Strata examples project so that the readers can easily reproduce the analysis. If you have any questions or comments, please feel free to get in touch below or contact me directly at marc@opengamma.com.
In the first instalment, I will describe how to calibrate curves and compute PV and bucketed PV01 for swaps. The curve calibration will start from curve descriptions and data in simple CSV files and will cover the computation of ‘Jacobian’ matrices used later for sensitivity computations. The calibrated curves will be used to compute present values and bucketed PV01, also called bucketed delta or key rate duration. I will also comment on the efficiency obtained from the Algorithmic Differentiation used throughout the library. This instalment will be a little more technical than future ones as we will go through the code used to run the analysis. However, the step-by-step method should help non-technical users understand the process. Subsequent posts will focus more on the finance and risk management side.
Curve calibration is a mechanism that requires a lot of fine-tuning and precision, and at the same time is normally based on standard instruments and conventions. One would like to have a lot of flexibility and at the same time be able to create standard curves very easily. Luckily, Strata has been built with those two, almost contradictory, goals in mind.
Let’s start with the standard part. Many indices, conventions, and calendars in the main markets are built in. These standard objects can be accessed easily by name, which is just a simple string. For example, if you want to access the calendar for good business days in London, knowing its code GBLO is enough; to access the GBP LIBOR 6M index and its convention, knowing its name GBP-LIBOR-6M is enough; to access the standard market conventions for the GBP IRS with LIBOR 6M floating index, knowing its name GBP-FIXED-6M-LIBOR-6M is enough.
If you have read the paragraph above, you know almost enough to be able to calibrate curves in Strata.
Before going to the calibration, let’s discuss the flexibility side. The lists of built-in objects described above can be extended easily by the user. If you don’t like the list of holidays predefined in GBLO – perhaps a royal wedding needs to be added, or you have your own source of data – you can create your own calendar and use it in the same way as the built-in ones. If you want to create a new GBP index – maybe a potential replacement for LIBOR – you can add this to the library without writing a single line of code, simply by adding a single CSV file.
But enough of the flexibility for now. Let’s look at the calibration of a standard multi-curve framework with standard instruments in GBP. The description of a multi-curve starts by specifying which curves you have and what each of them are used for. In our simple case, we would like to have two curves: one for discounting in GBP and for forwards related to SONIA, and the other one for forwards related to GBP LIBOR 6M. Suppose that we give our multi-curve framework the name GBP-DSCONOIS-L6IRS, and our curve names are GBP-DSCON-OIS and GBP-LIBOR6M-IRS respectively. How do we pass this information to Strata? We simply create a CSV file with exactly this information. The file is:
Group Name,Curve Type,Reference,Curve Name
GBP-DSCONOIS-L6IRS,Discount,GBP,GBP-DSCON-OIS
GBP-DSCONOIS-L6IRS,Forward,GBP-SONIA,GBP-DSCON-OIS
GBP-DSCONOIS-L6IRS,Forward,GBP-LIBOR-6M,GBP-LIBOR6M-IRS
Next, we want to describe how the curve should be interpolated. Should it be on the zero-rates or on the discounting factors, and which interpolator should be used? We simply create a file with this information:
Curve Name,Value Type,Day Count,Interpolator,Left Extrapolator,Right Extrapolator
GBP-DSCON-OIS,Zero,Act/365F,Linear,Flat,Flat
GBP-LIBOR6M-IRS,Zero,Act/365F,Linear,Flat,Flat
Finally, you want to describe each node of the curve in an easy way. This is where the built-in conventions will make your life very easy. A description of the curves could look like:
Curve Name,Label,Symbology,Ticker,Field Name,Type,Convention,Time,Spread
,,,,,,,,
GBP-DSCON-OIS,GBP-ON,OG-Ticker,GBP-ON,MarketValue,DEP,GBP-ShortDeposit-T0,1D,
GBP-DSCON-OIS,GBP-TN,OG-Ticker,GBP-TN,MarketValue,DEP,GBP-ShortDeposit-T1,1D,
GBP-DSCON-OIS,GBP-OIS-1M,OG-Ticker,GBP-OIS-1M,MarketValue,OIS,GBP-FIXED-1Y-SONIA-OIS,1M,
GBP-DSCON-OIS,GBP-OIS-2M,OG-Ticker,GBP-OIS-2M,MarketValue,OIS,GBP-FIXED-1Y-SONIA-OIS,2M,
GBP-DSCON-OIS,GBP-OIS-3M,OG-Ticker,GBP-OIS-3M,MarketValue,OIS,GBP-FIXED-1Y-SONIA-OIS,3M,
etc...
GBP-LIBOR6M-IRS,GBP-FIX-L6M,OG-Ticker,GBP-FIX-L6M,MarketValue,FIX,GBP-LIBOR-6M,,
GBP-LIBOR6M-IRS,GBP-FRA-3Mx9M,OG-Ticker,GBP-FRA-3Mx9M,MarketValue,FRA,GBP-LIBOR-6M,3Mx9M,
GBP-LIBOR6M-IRS,GBP-IRS6M-1Y,OG-Ticker,GBP-IRS6M-1Y,MarketValue,IRS,GBP-FIXED-6M-LIBOR-6M,1Y,
GBP-LIBOR6M-IRS,GBP-IRS6M-2Y,OG-Ticker,GBP-IRS6M-2Y,MarketValue,IRS,GBP-FIXED-6M-LIBOR-6M,2Y,
GBP-LIBOR6M-IRS,GBP-IRS6M-3Y,OG-Ticker,GBP-IRS6M-3Y,MarketValue,IRS,GBP-FIXED-6M-LIBOR-6M,3Y,
etc...
For each curve, there is a corresponding list of nodes. Each node has a name (for result display purposes) and a link to the data to be used. The data to be used is composed of a ‘Symbology’, a ‘Ticker’ and a ‘Field Name’. Here I have used ‘OG-Ticker’ (for OpenGamma, not ‘Olympic Games’), but these can be replaced with ‘Bloomberg-Ticker’ or ‘MyBankProprietaryName’ dependent on the user’s preference. The ‘Ticker’ itself could be ‘BPSW10’ or again your own internal naming convention. For the ‘Field Name’ I have used ‘MarketValue’, but you could choose something like ‘Bid’, ‘Ask’ or ‘LastQuote’. Next, you specify the type of instrument. Here we restrict ourselves to DEP (term deposit), OIS (fixed vs. overnight swap), FIX (Ibor fixing), FRA (Forward Rate Agreement) and IRS (fixed vs. Ibor swaps). Other possibilities include IFU (Futures), BAS (Basis swaps), ONI (Overnight vs. Ibor swaps), and these will be discussed in future instalments. The last two columns are the ‘convention’ and the ‘time’. The convention is one of the built-in or user-defined conventions; the type of convention depends on the specified instrument type. The ‘time’ is the tenor for the instruments used above. More complex times can be created to describe FRAs and futures.
The last ingredient for curve calibration is the actual data. In this example, we calibrate the curve as of 1 August 2016.
Valuation Date,Symbology,Ticker,Field Name,Value
,,,,
2016-08-01,OG-Ticker,GBP-ON,MarketValue,0.0042
2016-08-01,OG-Ticker,GBP-TN,MarketValue,0.005
2016-08-01,OG-Ticker,GBP-OIS-1M,MarketValue,0.0023
2016-08-01,OG-Ticker,GBP-OIS-2M,MarketValue,0.0021
2016-08-01,OG-Ticker,GBP-OIS-3M,MarketValue,0.002
etc...
2016-08-01,OG-Ticker,GBP-FIX-L6M,MarketValue,0.0057969
2016-08-01,OG-Ticker,GBP-FRA-3Mx9M,MarketValue,0.0045
2016-08-01,OG-Ticker,GBP-IRS6M-1Y,MarketValue,0.0051
2016-08-01,OG-Ticker,GBP-IRS6M-2Y,MarketValue,0.0048
2016-08-01,OG-Ticker,GBP-IRS6M-3Y,MarketValue,0.0049
etc...
I believe that the file names speak for themselves.
We have done the hard work by collecting the data and deciding which nodes we want to use in our curves. We are now four lines of code away from having the curves.
First, load the data from the configuration files:
Map<CurveGroupName, CurveGroupDefinition> configs = RatesCalibrationCsvLoader.load(GROUP_RESOURCE, SETTINGS_RESOURCE, NODES_RESOURCE);
Second, load the data:
Map<QuoteId, Double> MAP_MQ = QuotesCsvLoader.load(VALUATION_DATE, ResourceLocator.of(QUOTES_FILE));
Third, collect the data in the right format:
ImmutableMarketData MARKET_QUOTES = ImmutableMarketData.builder(VALUATION_DATE).values(MAP_MQ).build();
And finally, calibrate the curves:
RatesProvider multicurve = CALIBRATOR.calibrate(configs.get(CONFIG_NAME), MARKET_QUOTES, REF_DATA);
Of the four lines of code, only the last one, where the actual calibration takes place, requires some explanation. I will not go through the theory of multi-curve and calibration; this is presented in other places. If I may, I would recommend Chapter 5 of my book [Henrard (2014)].
The only ingredient useful in this discussion is that in the calibration procedure, we solve a root-finding exercise to obtain parameters of the interpolated curves such that for all the instruments describing the nodes,
PV(Instrument) = 0.
This root-finding algorithm is run for all the curves simultaneously (footnote 1). For each PV to be 0, the parameters we have selected must be such that all the benchmark instruments reproduce exactly the market price using the calibrated curves. In the curve calibration process, not only are the curves themselves calibrated and stored, but so are the Jacobian or transition matrices. I will describe below what those matrices are and why we care about them.
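To see the root-finding idea in the simplest possible setting, consider a single-node curve calibrated to one term deposit: we solve for the zero rate r such that exp(-r·t)(1 + R·t) − 1 = 0, which has the closed form r = ln(1 + R·t)/t. The Newton iteration below is a toy illustration of the principle, not the Strata solver:

```java
public class CalibrationRootFindSketch {

    /** PV of a unit term deposit with simple rate R and maturity t, under zero rate r. */
    static double pv(double r, double R, double t) {
        return Math.exp(-r * t) * (1 + R * t) - 1.0;
    }

    /** Newton iteration on the zero rate so that pv(r) = 0. */
    static double calibrate(double R, double t) {
        double r = R; // initial guess: the quoted rate itself
        for (int i = 0; i < 20; i++) {
            double f = pv(r, R, t);
            double df = -t * Math.exp(-r * t) * (1 + R * t); // dPV/dr
            r -= f / df;
        }
        return r;
    }

    public static void main(String[] args) {
        double r = calibrate(0.0051, 1.0); // hypothetical 1Y deposit quoted at 0.51%
        System.out.println(r - Math.log(1.0051)); // difference from the closed form, ~0
    }
}
```

In the multi-curve case the same principle applies, except that the solve is multi-dimensional and all the curves are calibrated together.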
The first application of the calibrated curve that we can think of is to compute the present value of a swap that we have traded. From the rate of the benchmark swaps (and the interpolator) we want to infer the value of a non-benchmark instrument.
For that, we need first to create a swap. Here again, our OpenGamma conventions come in handy. We take a standard convention, specify the trade date, the period to start, tenor, side, notional and coupon, and we are done:
ResolvedSwapTrade swap = GBP_FIXED_6M_LIBOR_6M.createTrade(
VALUATION_DATE, SWAP_PERIOD_TO_START, Tenor.of(SWAP_TENOR), BuySell.BUY, SWAP_NOTIONAL, SWAP_COUPON, REF_DATA)
.resolve(REF_DATA);
We want to compute the present value of this swap using the multi-curve we have calibrated above:
MultiCurrencyAmount pv = PRICER_SWAP.presentValue(swap, multicurve);
The output is a MultiCurrencyAmount object to deal with the cross-currency swaps that we will discuss in a later instalment.
Immediately after the present value, the next thing that a trader or a risk manager will want is the bucketed PV01, also called bucketed delta, rate sensitivities, or key rate duration. Before talking about the code needed to obtain this, we have to define what we mean by ‘bucketed PV01’ as there are three versions of it available in Strata.
The three sensitivities available are: point sensitivity, parameter sensitivity, and market quote sensitivity. The point sensitivity is the sensitivity, in the sense of the mathematical partial derivative, of the present value with respect to each zero rate of the discounting curves and each forward rate of the forward curves. By each zero rate or forward rate, we mean that every date on which a cash flow or a fixing takes place is represented in the sensitivity; the sensitivities are not only to the curve nodes but to each individual date in the instrument. The main use of this type of information is to view the daily fixing sensitivities and see if there are dates with large fixing impacts.
An example of point sensitivities for a one year GBP swap is displayed below. There are four cash flows (two on each leg) producing four zero rate sensitivities. There are also two forward Ibor rate sensitivities.
ZeroRateSensitivity{curveCurrency=GBP, yearFraction=0.5808219178082191, currency=GBP, sensitivity=71940.4415752365}
ZeroRateSensitivity{curveCurrency=GBP, yearFraction=1.084931506849315, currency=GBP, sensitivity=136526.2295146616}
IborRateSensitivity{observation=IborIndexObservation{index=GBP-LIBOR-6M, fixingDate=2016-09-01, effectiveDate=2016-09-01, maturityDate=2017-03-01, yearFraction=0.4958904109589041}, currency=GBP, sensitivity=4954388.900936099}
ZeroRateSensitivity{curveCurrency=GBP, yearFraction=0.5808219178082191, currency=GBP, sensitivity=-16317.31422092133}
IborRateSensitivity{observation=IborIndexObservation{index=GBP-LIBOR-6M, fixingDate=2017-03-01, effectiveDate=2017-03-01, maturityDate=2017-09-01, yearFraction=0.5041095890410959}, currency=GBP, sensitivity=5033542.805338534}
ZeroRateSensitivity{curveCurrency=GBP, yearFraction=1.084931506849315, currency=GBP, sensitivity=-23843.440872413255}
The parameter sensitivity is the sensitivity of the present value to the parameters used internally to represent the curves. The parameters are often the zero-coupon rates or discount factors at the nodes of an interpolated curve.
The market quote sensitivity, also called par rate sensitivity, is the sensitivity of the present value to the raw market quotes used in the curve calibration. Of the three sensitivities, this is the risk figure most used by traders and risk managers.
An example of market quote sensitivities for a swap starting 3 months forward, with a 7-year tenor, is provided below.
Each of those sensitivities can be obtained easily in one line of code. The code that was used to produce the results shown above is:
PointSensitivities pts = PRICER_SWAP.presentValueSensitivity(swap, multicurve);
CurrencyParameterSensitivities ps = multicurve.parameterSensitivity(pts);
CurrencyParameterSensitivities mqs = MQC.sensitivity(ps, multicurve);
It is now a good time to discuss the computation of Jacobian matrices in the curve calibration. These are the matrices used to transform a parameter sensitivity into a market quote sensitivity. They depend only on the curves, not on the instrument for which the sensitivity is computed. It thus makes sense to compute them only once, when the curves are calibrated. This is what is done in Strata: when the curves are calibrated, it is possible to request the computation of the Jacobian matrices. If requested, these matrices are computed and stored in the 'metadata' associated with the curve, and are then available whenever they are required later.
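Schematically — this is the generic chain rule, not the actual Strata metadata classes — if J[i][j] = ∂p_i/∂q_j is the Jacobian mapping market quotes q to curve parameters p, then the market quote sensitivity is the parameter sensitivity multiplied through J:

```java
// Generic chain-rule illustration (not the Strata internals): converting a
// parameter sensitivity dPV/dp into a market quote sensitivity dPV/dq via
// the Jacobian J[i][j] = dp_i/dq_j:  dPV/dq_j = sum_i (dPV/dp_i) * J[i][j].
public class JacobianTransform {

    static double[] toMarketQuoteSensitivity(double[] paramSens, double[][] jacobian) {
        int nQuotes = jacobian[0].length;
        double[] mqs = new double[nQuotes];
        for (int j = 0; j < nQuotes; j++) {
            for (int i = 0; i < paramSens.length; i++) {
                mqs[j] += paramSens[i] * jacobian[i][j];
            }
        }
        return mqs;
    }

    public static void main(String[] args) {
        double[] ps = {100.0, 250.0};            // dPV/dp for a 2-parameter curve
        double[][] j = {{1.0, 0.0}, {0.5, 1.0}}; // dp/dq: 2nd node depends on both quotes
        double[] mqs = toMarketQuoteSensitivity(ps, j);
        System.out.println(java.util.Arrays.toString(mqs)); // [225.0, 250.0]
    }
}
```

Because J depends only on the calibrated curves, the same stored matrix can be reused to convert the parameter sensitivity of any trade priced against those curves.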
As we are discussing the computation of sensitivities, we should add that all sensitivities in Strata are computed using Algorithmic Differentiation (AD). The theory of this computer science technique is described in Naumann (2012). For this blog it is sufficient to know that the technique produces derivatives by analytical computation, and it is very efficient. Analytical computation means that it avoids numerical methods like finite difference, and produces sensitivities that are more precise while avoiding numerical instability. The best way to discuss its efficiency is through a couple of examples.
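The precision point can be seen on a toy function. The sketch below (plain Java, nothing to do with Strata's AD engine) compares the exact derivative of a discount factor P(r) = exp(-r·t) with a one-sided finite-difference bump:

```java
// Toy comparison of an exact (analytic) derivative with a one-sided
// finite-difference approximation, for P(r) = exp(-r * t).
public class AdVsBump {
    public static void main(String[] args) {
        double r = 0.02, t = 10.0, bump = 1e-4;
        double analytic = -t * Math.exp(-r * t); // exact dP/dr
        double fd = (Math.exp(-(r + bump) * t) - Math.exp(-r * t)) / bump;
        System.out.printf("analytic          = %.10f%n", analytic);
        System.out.printf("finite difference = %.10f%n", fd);
        // the one-sided bump carries a truncation error of order bump * t^2 / 2
        System.out.printf("bump error        = %.2e%n", Math.abs(fd - analytic));
    }
}
```

The analytic number is exact to machine precision, while the bumped number carries a truncation error that grows with the bump size — one reason AD-based sensitivities are both faster and more stable.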
We first calibrate a set of curves and compute the present value of 100 swaps, recording the elapsed wall-clock time. In the example above, there are 29 data inputs to the curve calibration. If we were to use the standard bump-and-recompute technique, we would redo this process 29 extra times with small changes in the inputs, requiring in total 30 times the CPU time.
In my example, on standard desktop hardware and using only a single thread, the time taken for the multi-curve calibration and the computation of 100 PVs is 9.98 ms (footnote 2). Computing the market quote bucketed PV01 by finite difference would take 30×9.98ms = 299.4 ms (footnote 3). If we compute the (three versions of) sensitivities using AD, we have a computation time of 15.37 ms. This is roughly 20 times faster than a simple bump-and-recompute.
You might argue that the example with only 100 swaps is not realistic. Running the example instead with 2,000 swaps (tenors between 1Y and 20Y and 100 different coupons for each tenor) took 32.46 ms to calibrate the curves and calculate PV. Extending this to include sensitivities computed by AD took 162.88 ms, whereas obtaining the same values by finite difference would have taken 973.8 ms. In this example algorithmic differentiation is roughly 6 times faster than finite difference. We will show in forthcoming blogs that this improved performance is even better when there are more nodes on the curves.
This means any trader, portfolio manager or risk manager could compute the risk of this portfolio in near real-time using only his or her desktop computer.
We will increase the complexity of the portfolio in a later instalment of the blog.
Strata offers a very easy API to calibrate interest rate curves, create trades and compute risk measures. As Algorithmic Differentiation is implemented in the code, the calibration and computation of sensitivities is extremely efficient.
That’s all for today.
The code used for this blog is available in the Strata repository: strata-examples in the package ‘com.opengamma.strata.examples.blog.stratamulticurve1’
The numbers can also be obtained in Excel using the StrataExcel add-in. Don’t hesitate to contact us for a demo.
In the next instalment of this blog series, we will discuss the impact of the curve interpolator on the curve smoothness and on the bucketed PV01.
We have long experience in interest rate derivatives valuation, both from a financial engineering perspective, and from a technology perspective. We help banks, hedge funds, asset managers and clearing houses to quickly start or improve their risk management practice for swap trading.
Examples of engagements:
Client: Hedge fund
Issue: Specialized in futures trading, would like to extend the trading universe to IRS.
Solution: Create tools for their quant and risk managers to analyse pricing and hedging. Historical analysis of strategy P/L and hedging behavior.
Client: Clearing house
Issue: Clears single-currency swaps and would like to expand the clearing offering to cross-currency swaps.
Solution: Build Excel-based tools to price and risk manage cross-currency swaps. Analyse the relevant regulation for bilateral margin, which is a competitor to clearing. Compare with standard bilateral margin methodologies.
Client: Asset manager – pension fund
Issue: Trade assets in foreign currency and swap them to their base currency; need tools to value multi-currency swap books with collateral in base currency cash.
Solution: Create multi-currency swap valuation based on cross-currency swap between base currency and other currencies.
Client: Bank ALM
Issue: Swap book requiring multi-curve valuation, collateral management and VaR.
Solution: Comparison of methodologies, benchmarking their VaR methodology against an alternative approach. Historical analysis of their full swap book, based on the existing trade booking system with a thin layer of OpenGamma software, running on a quarterly basis.
Henrard, M. (2014). Interest rate modelling in the Multi-curve Framework: Foundation, Evolution and Implementation. Palgrave.
Naumann, U. (2012). The Art of Differentiating Computer Programs. An introduction to Algorithmic Differentiation. SIAM.
The post Strata and multi-curve: Curve calibration and bucketed PV01 appeared first on OpenGamma.
The post CSA Changes: Do You Really Understand the Impact? appeared first on OpenGamma.
Negotiating each agreement can be lengthy and, with timing a real and harsh constraint, dealers are keen to have clients sign new standardised CSAs so trading can carry on as usual. But making the right decision is no easy task for the buy-side; it is vital they do so as it can impact the value of portfolios by millions of pounds sterling.
There are three options:
The first step in the process is to compare the value of portfolios with the economic impact that amending collateral terms in a CSA will have. You may be shocked.
In our analysis, we’ve assumed that a typical book is made up of the following types of trades:
We valued them under various CSA terms (detailed later), using trades across the three main currencies: GBP, USD, EUR. So what were the key takeaways?
Now let’s get into the meat of the problem.
To quantify the impact that CSA terms have on a typical portfolio, we have made some additional assumptions: that exposures are directional, with trades having significant notionals.
We quantified the impact of collateral eligibility on PV with two common CSA types: a single-currency cash CSA (CSA1) and a multi-currency cash CSA (CSA2).
Firstly, let’s take a look at a Vanilla Interest Rate Swap for example, a standard 100m 40Y USD Libor 3M IRS maturing in 30 years. The PVs for this swap under the various CSA terms described are given in Table 1. As you can see the terms of the CSA dictate the valuation of the swap:
| CSA Type | CSA1 Single Ccy | CSA2 Multi Ccy |
|----------|-----------------|----------------|
| Trade PV | £85m            | £75m           |
Table 1: Valuation of a 40Y USD LIBOR 3M IRS with 30Y remaining maturity under different CSA terms
If we assume that the discounting method used to date has been Libor discounting, the valuation differences relative to Libor discounting are:
| CSA / Valuation Type | Libor disc. | CSA1 Single Ccy | CSA2 Multi Ccy |
|----------------------|-------------|-----------------|----------------|
| Trade PV             | £80m        | £5m             | £-5m           |
Table 2: Valuation of a 40Y USD LIBOR 3M IRS with 30Y remaining maturity using Libor discounting and under different CSA terms
The difference in values between CSA1 and CSA2 is very interesting, as it quantifies the valuation impact of switching from a multi-currency cash CSA to a single-currency cash CSA (or vice-versa) – in our example, this difference is £10m, or 10% of the trade notional.
Why such a difference? It’s down to the fact that a multi-currency CSA will typically assume a ‘Cheapest-to-Deliver’ approach to collateral selection and therefore discounting. At current market levels, the trade cash flows are discounted using EUR EONIA, translated to the trade currency through cross-currency swaps.
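The order of magnitude is easy to reproduce with back-of-the-envelope numbers (the rates below are assumptions for illustration, not the market data behind the tables): discounting a single 100m cash flow due in 30 years at two collateral rates 40bp apart moves the PV by several million:

```java
// Back-of-the-envelope: PV impact of the discount curve choice on a single
// long-dated cash flow. Both rates are illustrative assumptions only.
public class DiscountingImpact {
    public static void main(String[] args) {
        double notional = 100_000_000.0; // 100m cash flow
        double t = 30.0;                 // due in 30 years
        double singleCcyRate = 0.0200;   // assumed single-currency OIS rate
        double ctdRate = 0.0240;         // assumed cheapest-to-deliver rate (incl. basis)
        double pvSingle = notional * Math.exp(-singleCcyRate * t);
        double pvCtd = notional * Math.exp(-ctdRate * t);
        System.out.printf("PV, single-ccy discounting: %.0f%n", pvSingle);
        System.out.printf("PV, CTD discounting:        %.0f%n", pvCtd);
        System.out.printf("difference:                 %.0f%n", pvSingle - pvCtd);
    }
}
```

Even this crude single-cash-flow sketch produces a difference of roughly 6m on a 100m flow, which is why the choice of eligible collateral (and hence of discount curve) matters so much on long-dated trades.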
Can the same be said for other OTC products? To answer this question, we turn to cross-currency swaps. Changing the CSA terms for a £100m 30Y GBP / USD cross-currency swap with 20 years remaining maturity yielded the following results:
| CSA / Valuation Type | CSA1  | CSA2 CTD disc. | Valuation (no XCcy basis) |
|----------------------|-------|----------------|---------------------------|
| Trade PV             | £-49m | £-42m          | £-56m                     |
Table 3: Valuation of a 30Y GBP/USD CCS with 20Y remaining maturity under different CSA terms and valuation approaches
CSA1 and CSA2 valuations factor in the cross-currency basis – the impact of switching from a multi-currency cash CSA to a single-currency cash CSA (or vice-versa) is £7m, or 7% of the trade notional.
Under the valuation approach without the cross-currency basis, the OIS discounting for each of the legs has to be done independently, resulting in a significant PV difference (7% of notional). This is equivalent to a significant spread on the trade: 35 bps in this example.
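The spread equivalence quoted above can be sketched with a crude conversion (assumption: a flat, undiscounted annuity over the remaining life; a real conversion would use the discounted annuity of the trade):

```java
// Crude conversion of an upfront PV difference into an equivalent running
// spread, assuming a flat undiscounted annuity over the remaining maturity.
public class SpreadEquivalent {
    public static void main(String[] args) {
        double pvDiffPctNotional = 0.07; // 7% of notional, as in the example
        double remainingYears = 20.0;    // 20Y remaining maturity
        double annuity = remainingYears; // ignores discounting for simplicity
        double spreadBps = pvDiffPctNotional / annuity * 10_000.0;
        System.out.printf("equivalent running spread ~ %.0f bps%n", spreadBps); // ~35 bps
    }
}
```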
So to answer the previous question, representative products found in a typical LDI portfolio are affected by changing the terms in a CSA. In particular, they are all significantly affected by switching from a multi-currency CSA to a single-currency CSA.
Correctly valuing cross-currency instruments is also critical: the reflection of the cross-currency basis can add up to a difference in mark-to-market in the tens of millions.
You may be sitting there thinking, “Well my CSAs are clean: mostly cash and some with gilts, so none of this applies to me.” Unfortunately, you would be wrong. As we showed earlier, differences in discounting between a single- and multi-curve framework have a significant impact on the valuation of the portfolio. This applies for future trades as well as current trades. Correctly valuing these instruments by factoring in the impact of collateral terms has the following benefits for asset managers:
OpenGamma’s independent valuation service is based on our award-winning analytics and can help value your portfolio based on client mandates and CSA terms, by assigning valuation methodologies to each fund and counterparty.
For more information about our solutions, please contact Maxime Jeanniard du Dot at max@opengamma.com.
The post Gain visibility to verify your Initial Margin requirement appeared first on OpenGamma.
In the context of corporate governance and fiduciary responsibility, treasury managers feel obligated to have robust control around independent verification of the Initial Margin (IM) call on their cleared OTC derivatives portfolio, as they bear potential credit risk to their clearing brokers.
We have been working with a UK retail bank to provide this increasingly critical function. Its team is now in production using OpenGamma’s margin calculation capabilities within our partner CloudMargin’s collateral management workflow solution. Best of all, the client was able to access this new functionality through a simple configuration option in CloudMargin, rather than the often onerous task of new software deployment or updates.
With full visibility into the Initial Margin requirements on their OTC derivatives portfolio, the bank is now able to validate every Clearing Broker IM call against the actual Clearing House IM requirement – ensuring that costs resulting from clearing mandates are precisely understood.
Using the cloud-based solution offered by OpenGamma and CloudMargin, we have helped this bank increase control and transparency, meaning they never have to blindly agree on a margin call again.
To find out more about how we can help you analyse and assess your margin requirements, please contact us for a demo.
The post The upcoming threat to the buy-side of CSA repapering appeared first on OpenGamma.
Some dealers switched to discounting using the interest rate paid on the assets collateralising the trades. By using their more advanced multi-curve valuation models, they could more accurately price trades enabling them to identify opportunities to rebalance their exposures.
Why is this history relevant right now? With the VM regulation deadline of March 2017 looming, the OTC derivative market is about to go through a widespread CSA repapering process. Some estimates suggest that around 200,000 CSAs will need to be amended in order to comply with the bilateral margin rules set by BCBS/IOSCO, with many individual buy-side firms having hundreds of CSAs in place.
Amending CSA terms on an existing agreement can have a significant impact on the value of a portfolio (just as voluntary clearing at a clearing house would do). Consequently, this triggers the need to agree on a potential payment between the dealer and client or vice versa for every individual CSA.
This all comes at a time when dealers are facing pressure on their ROEs due to regulatory changes such as leverage ratio and NSFR. They are increasingly keen for the buy-side to exchange cash collateral in the currency of the underlying exposures, on a daily basis. A key driver for the dealers is that only daily-exchanged cash-settled VM can provide them with capital relief.
On the other hand, many funds would like to keep their ability to post other assets. Pension funds, for instance, tend to be fully allocated and do not hold cash reserves within their funds. Contraction of repo liquidity (also impacted by regulation) has closed down the ability for buy-side firms to raise cash using the securities held in their funds. Many buy-side firms are worried that moving to single-currency cash CSAs might not be such a good idea. They are now faced with having to make some key strategic decisions with regards to the potential asset-allocation implications of needing to hold more cash in their funds to meet margin requirements in periods of market stress.
Many buy-side firms will be entering these discussions with their dealers without the ability to independently price the impact of the CSA changes on their book. It is therefore impossible to know the value of the CSA optionality that they may be giving up.
OpenGamma’s award-winning derivatives analytics are currently being used by our buy-side customers to allow them to independently price their derivative positions, for example using different CSA assumptions. We are seeing an increased level of interest in our solutions as firms prepare for their upcoming negotiations.
For more information about our solutions, please contact Maxime Jeanniard du Dot at max@opengamma.com.
The post Strata Wins Big at JavaOne 2016 appeared first on OpenGamma.
Award presented by Georges Saab, Vice President, Software Development, Java Platform Group at Oracle (Centre) and Sharat Chander, Director, Java Product Management & Developer Relations (Left)
It was great to be back at JavaOne after a three year gap, with the award as a real bonus. I also gave a talk on code generating mutable and immutable beans, with slides available. There was lots of talk about Java 9, modularity and future changes to the language for local variable type inference and data classes (potentially removing the need for code generators).
Now I’m back in the UK, it is time to finish up the loose ends and release v1.1 of Strata, with lots of enhancements. That way we can keep the award-winning momentum going for everyone who needs a transparent, high quality, easy-to-use library for valuation and market risk.
The post EMIR category 2 – OpenGamma provides analysis for optimal margin outcome appeared first on OpenGamma.
Some of your clients may be firms unprepared for the EMIR Category 2 clearing deadline (coming into full effect on 21st December 2016) and will presumably have an urgent short-term need.
These firms will need to understand their margin requirement by 21st December, and ideally they will want to ‘cherry-pick’ trades for voluntary clearing from their pre-existing bilateral back-book for an optimal margin outcome.
The challenge will be in performing the necessary analysis in a timely and cost-efficient manner, given the fast approaching deadline and differing portfolio data formats. While you will have access to the cleared portion of the client’s portfolio in the clearing house format, it is expected that the bilateral back-book will only be available in the client’s own internal formats.
This implies significant manual effort to first map, then run, and finally present the analysis in the format the client expects.
OpenGamma has built an analytical service specifically to help buy-side firms assess the margin impact of EMIR Category 2. We can turn around margin analysis on a client portfolio typically in less than 24 hours, including mapping their custom portfolio formats. This will ensure that your client gets the value-add analysis that will differentiate the service you offer in comparison to other clearing brokers.
For more information or a demo please contact:
Alp Oergen
Email: alp@opengamma.com
Direct: +44 20 3725 3384 | Mobile: +44 7532 142 410