Regarding my background, I have worked as a High Frequency Market Maker on the NSE (National Stock Exchange of India), MCX (Multi Commodity Exchange) and the CME (Chicago Mercantile Exchange), mainly in Currency/Commodities F&O, and have built a lot of strategies and algorithms around them. Feel free to connect with me on LinkedIn: My LinkedIn Profile. For further queries you can reach me at email@example.com.
Bonds in general are more complicated and have more moving parts than stocks/FX/commodities, as they behave like derivatives themselves with rates as the underlying (they derive their value from the underlying benchmark curve, in addition to their credit riskiness and, of course, their terms of issuance). Therefore, it's important to understand the basic risk measures before moving on to the more complicated ones. I won't describe the simple measures like duration, convexity, and interest rate/prepayment risk here; instead, I strongly recommend that anybody with no Fixed Income background familiarize themselves with those basic concepts before moving on to the more complicated details I'm going to talk about in this post.
Before even proceeding with the idea of algorithmic trading in Fixed Income and its derivatives, as we could easily do with time-series analysis of the mean-reverting behavior of relative spreads in FX/stocks/commodities derivatives, it's important to first understand the not-so-simple spread measures of Fixed Income instruments relative to their benchmark curve; only then can we start doing time-series analysis on them. The usual spread measure, as we all know, is the Z-spread or Zero-Volatility Spread, which in general is a measure of the risk over the benchmark, since it lends itself to being a measure of the excess return over the risk-free counterpart. It has certain flaws, but we can ignore those for now. To me, it's simply a static parallel spread over a benchmark security with similar characteristics.
So what's wrong with the Z-spread? Like modified duration, it doesn't account for the variability of the cash flows of the derivatives! So how do we measure the effective OAS (option-adjusted spread)? Interest rate models.
A lot of Fixed Income fund managers do not understand this very important measure of credit spread, for several reasons, some of which are:
1. It's not a straightforward calculation.
2. It's opinionated and can vary by up to a few hundred basis points between different models and methods of calculation. The major concern is that people rely too much on the accuracy of the OAS to six decimal places. However, all that matters is a fair and reasonable estimate, and the order of magnitude of the risk this metric says the instrument is exposed to.
So let's talk about these interest rate models and their applications in assessing Fixed Income instruments and their derivatives. In general, I would describe them as probabilistic (need I say realistic) models of the evolution of the interest rate, upon which the future unknown cash flows of the instrument can be assigned a probability space. Once we have a probability measure over the various outcomes, we can calculate the expected excess spread over the benchmark, which is nothing but the model-implied OAS. It's not like saying the short-term rates will absolutely go up or down and the rest of the yield curve will follow suit; it's more like they will go up with a certain probability, say 0.7, and down with probability 0.3 (so 0.7 + 0.3 = 1.0). So it's not only important but necessary to have an unbiased view of the underlying benchmark to even think of constructing a model. Again, I would like to re-emphasize that a model doesn't tell you the future; it only gives you the expected value. Needless to say, the people who blame models in times of crisis are the same people who don't understand them well enough to use them, yet unfortunately depend too much on them.
The model-implied OAS can be calculated using either a lattice method or Monte Carlo simulation. First I will walk through the lattice approach; in further posts, I will talk more about simulations and their pros/cons. The two no-arbitrage stochastic short-rate models that I'm going to start (and end) with are very popular industry standards, namely BDT (Black-Derman-Toy) and BK (Black-Karasinski), both of which are lognormal. Once the lattice is constructed, it can be used to recursively value any kind of interest rate contingent claim such as embedded options, interest rate options, interest rate futures, vanilla/bermudan swaps, swaptions etc., some of which are similar to european/american/bermudan options, while others are more complicated than their equity/FX/commodity counterparts and cannot be evaluated with a closed-form solution.
A very significant advantage of OAS is that this measure applies to a wide array of asset classes and so an entire portfolio’s aggregated OAS can be compared to any benchmark index’s OAS.
I will do the model validation for both, and also highlight the general reasons to choose one over the other in my case.
Black Derman Toy Model:
$$d\ln r_t = \left[\theta_t + \frac{\sigma'_t}{\sigma_t}\,\ln r_t\right]dt + \sigma_t\,dW_t$$
$r_t$ = the instantaneous short rate at time $t$
$\theta_t$ = expected drift in period $(t, t+\Delta t)$
$\sigma_t$ = short rate volatility or local volatility
$W_t$ = a standard Brownian motion under a risk-neutral probability measure
I won't go into the details of explaining it, since there is more deserving literature on it.
Since I'm doing a simple one-factor model with constant volatility, $\sigma'_t/\sigma_t = 0$. Therefore, the discrete-time representation of the above can be approximated as:
$$\ln r_{t+\Delta t} = \ln r_t + \theta_t\,\Delta t \pm \sigma\sqrt{\Delta t}$$
where $\Delta t$ = 1 time step.
Some important points to note here:
1. The time step can be chosen based on the frequency of the cash flows. More granularity in the discretization wouldn't help much unless rate interpolation is vital to the valuation process.
2. With no prior information, I assume the probabilities of rates going up and down to be the same: 0.5 each.
3. Solve for the drift at every step under the no-arbitrage and recombination conditions, and generate the rate lattice from the given term structure.
4. Use a linear interpolation mechanism to get the instantaneous short rate (or the forward rate between future times t1 and t2).
So we are all set to calculate the OAS.
Term Structure: an array of rates and maturities from 1 month up to 30 years.
Maturity (years) —— Rate
1/12 —— 0.25%
3/12 —— 0.29%
6/12 —— 0.38%
1.0 —— 0.55%
2.0 —— 0.76%
3.0 —— 0.91%
5.0 —— 1.22%
7.0 —— 1.51%
10.0 —— 1.71%
20.0 —— 2.14%
30.0 —— 2.55%
Volatility = 15% (flat volatility term structure)
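For reference, here is the term structure above encoded in Python, with the linear interpolation from point 4 of the earlier list (flat extrapolation beyond the endpoints is my own simplification):

```python
import numpy as np

# Term structure from the table above: maturities in years, zero rates as decimals
maturities = np.array([1/12, 3/12, 6/12, 1.0, 2.0, 3.0, 5.0, 7.0, 10.0, 20.0, 30.0])
rates = np.array([0.25, 0.29, 0.38, 0.55, 0.76, 0.91,
                  1.22, 1.51, 1.71, 2.14, 2.55]) / 100.0

def zero_rate(t):
    """Linearly interpolated zero rate at time t (flat beyond the endpoints)."""
    return float(np.interp(t, maturities, rates))

def discount_factor(t, comp_freq=2):
    """Discount factor with semi-annual compounding."""
    return (1.0 + zero_rate(t) / comp_freq) ** (-comp_freq * t)
```

For example, `zero_rate(4.0)` lands halfway between the 3-year (0.91%) and 5-year (1.22%) rates.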
Start with a simple bond: a 10-year zero-coupon bond, semi-annual compounding, with an initial price of 61.027 (5% yield).
Balanced Tree: (the tree is meant to be interpreted as a triangle: in the example below, 0.38 is the root, 0.644995 the left node and 0.797415 the right node; in the third row, 0.851222 is the re-combination node.)
0.688518 0.851222 1.05238
0.760528 0.940249 1.16244 1.43714
0.711589 0.879746 1.08764 1.34466 1.66242
0.709424 0.877069 1.08433 1.34057 1.65736 2.04901
0.705541 0.872268 1.0784 1.33323 1.64829 2.0378 2.51935
Unfortunately, I can't reproduce the entire tree here; the rows above show only a slice of it.
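The lattice construction can be sketched in Python. This is my own minimal rendition of the no-arbitrage calibration with constant volatility and p = 0.5, using Arrow-Debreu state prices and continuous compounding; it is not the code that produced the tree above, so the node values will differ slightly:

```python
import math

def bdt_tree(zero_rates, dt, sigma):
    """
    Calibrate a constant-volatility BDT short-rate lattice to a zero curve.
    zero_rates[k] = continuously compounded zero rate for maturity (k+1)*dt.
    Returns tree[n][i] = short rate at step n in state i (i = number of up-moves).
    Adjacent states differ by the lognormal spacing exp(2*sigma*sqrt(dt)),
    which makes the tree recombine.
    """
    tree = []
    Q = [1.0]  # Arrow-Debreu state prices: value today of $1 paid in state i
    for n in range(len(zero_rates)):
        # No-arbitrage condition: the tree must reprice the (n+1)*dt zero
        target = math.exp(-zero_rates[n] * (n + 1) * dt)
        spacing = math.exp(2.0 * sigma * math.sqrt(dt))

        def mispricing(r_lo):
            return sum(Q[i] * math.exp(-r_lo * spacing**i * dt)
                       for i in range(n + 1)) - target

        # Solve for the lowest rate at this step by bisection
        lo, hi = 1e-10, 1.0
        for _ in range(80):
            mid = 0.5 * (lo + hi)
            if mispricing(mid) > 0.0:   # model price too high => rate too low
                lo = mid
            else:
                hi = mid
        r_lo = 0.5 * (lo + hi)
        rates = [r_lo * spacing**i for i in range(n + 1)]
        tree.append(rates)

        # Forward induction: roll the state prices one step ahead (p = 0.5)
        nxt = [0.0] * (n + 2)
        for i, r in enumerate(rates):
            df = math.exp(-r * dt)
            nxt[i] += 0.5 * Q[i] * df
            nxt[i + 1] += 0.5 * Q[i] * df
        Q = nxt
    return tree
```

With a flat 5% input curve, the root rate calibrates to 5% exactly, which is a quick sanity check on the drift-fitting step.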
Since the price at maturity is 100.0, we recursively discount the cash flows through backward induction and solve for the spread 'x' over the benchmark that converges to the given price. The spread 'x' at convergence is the OAS.
Here it converges to the price : 61.027
Through backward induction from the redemption price, we calculate the OAS as 355.189 bps .
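The backward-induction-plus-root-search just described can be sketched as follows. This is my own minimal Python version for a zero-coupon bond, assuming continuous compounding and a constant spread added at every node:

```python
import math

def lattice_zero_price(tree, dt, spread, face=100.0):
    """Backward induction: discount the expected value at each node using the
    node's short rate plus a constant spread. tree[n][i] = short rate at
    step n, state i, with up/down probabilities of 0.5 each."""
    values = [face] * (len(tree) + 1)        # redemption value at maturity
    for n in range(len(tree) - 1, -1, -1):
        values = [(0.5 * values[i] + 0.5 * values[i + 1])
                  * math.exp(-(tree[n][i] + spread) * dt)
                  for i in range(n + 1)]
    return values[0]

def solve_oas(tree, dt, market_price, face=100.0):
    """Bisect on the spread 'x' until the lattice price converges to the
    observed market price; the converged spread is the OAS."""
    lo, hi = -0.05, 0.25
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if lattice_zero_price(tree, dt, mid, face) > market_price:
            lo = mid   # model price too high => spread too low
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Run against the calibrated rate tree and the observed price of 61.027, a search like this is what produces the quoted OAS.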
It's easily noticeable that the rates on the left-most path are continuously falling while those on the right-most path are continuously rising. This is not very realistic.
For model validation purposes, let's move forward and calculate the effective duration. To do that, we bump the rate curve up and down by, say, 10 basis points and calculate the prices p1 and p2.
Effective duration, as we know, is defined as
$$D_{eff} = \frac{P_{down} - P_{up}}{2\,P_0\,\Delta y}$$
where $P_{down}$ and $P_{up}$ are the prices with the curve bumped down and up by $\Delta y$, and $P_0$ is the base price.
Using Python's Matplotlib library, it's easy to plot the original and the parallel-bumped yield curves.
Note: the red curve is the original; it's bumped up (blue) and down (green) to get the price sensitivity w.r.t. yield.
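The bump-and-plot is a few lines of Matplotlib (the data is the term structure from above, in percent; the headless backend and output filename are my own choices):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")   # headless backend; drop this line for interactive use
import matplotlib.pyplot as plt

maturities = [1/12, 3/12, 6/12, 1, 2, 3, 5, 7, 10, 20, 30]
rates = np.array([0.25, 0.29, 0.38, 0.55, 0.76, 0.91,
                  1.22, 1.51, 1.71, 2.14, 2.55])
bump = 0.10             # 10 bp parallel shift, in percent

plt.plot(maturities, rates, "r-", label="original")
plt.plot(maturities, rates + bump, "b-", label="bumped up")
plt.plot(maturities, rates - bump, "g-", label="bumped down")
plt.xlabel("Maturity (years)")
plt.ylabel("Zero rate (%)")
plt.legend()
plt.savefig("yield_curves.png")
```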
Price with the yield curve bumped up by 10 bps: 60.5175
Price with the yield curve bumped down by 10 bps: 61.5297
Calculated Effective Duration: 8.29316 years.
Calculated Effective Convexity: 74.1095
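As a quick arithmetic check, plugging the reported prices into the duration formula with a 10 bp bump reproduces the figure above:

```python
p0, p_up, p_dn = 61.027, 60.5175, 61.5297   # base / bumped-up / bumped-down prices
dy = 0.0010                                  # 10 bp parallel bump

eff_duration = (p_dn - p_up) / (2.0 * p0 * dy)
# Effective convexity uses (p_up + p_dn - 2*p0) / (p0 * dy**2); its numerator
# is a tiny difference, so full-precision prices (not the 4-decimal figures
# shown here) are needed to reproduce the quoted value.

print(round(eff_duration, 2))   # -> 8.29
```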
Since the expected effective duration should be close to 10.0 years, this isn't quite right, which leads us to try the Black-Karasinski model, which adds a mean-reversion component to simulate more realistic movement of the short rate.
Black-Karasinski Model:
$$d\ln r_t = \left[\theta_t + a\,(\ln b - \ln r_t)\right]dt + \sigma_t\,dW_t$$
$r_t$ = the instantaneous short rate at time $t$
$\theta_t$ = expected drift in period $(t, t+\Delta t)$
$\sigma_t$ = short rate volatility or local volatility
$W_t$ = a standard Brownian motion under a risk-neutral probability measure
$b$ = mean-reversion level, acting as a soft cap/floor on the rate evolution
$a$ = speed of mean reversion
Again, the time-discretized version for approximation would be:
$$\ln r_{t+\Delta t} = \ln r_t + \left[\theta_t + a\,(\ln b - \ln r_t)\right]\Delta t \pm \sigma\sqrt{\Delta t}$$
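One discrete step of this dynamic might look like the sketch below; note the drift form (θ plus a mean-reversion pull toward a level b) is my reading of the symbols listed above, not the author's exact parameterization:

```python
import math

def bk_step(ln_r, theta, a, ln_b, sigma, dt, up):
    """One discrete Black-Karasinski step in log-rate space: the drift pulls
    ln(r) toward the reversion level ln(b) at speed 'a', plus the usual
    +/- sigma*sqrt(dt) binomial shock. With a = 0 this reduces to the
    constant-volatility BDT step."""
    drift = (theta + a * (ln_b - ln_r)) * dt
    shock = sigma * math.sqrt(dt) * (1.0 if up else -1.0)
    return ln_r + drift + shock
```

The mean-reversion term is what tames the runaway left/right edges seen in the BDT tree: high rates get dragged down toward b, low rates get pulled up.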
The balanced tree generated by my C++ code looks something like this:
0.506482 0.626169 0.77414
0.584735 0.722914 0.893746 1.10495
0.675082 0.834612 1.03184 1.27568 1.57713
0.709661 0.877362 1.08469 1.34102 1.65791 2.0497
0.706319 0.87323 1.07958 1.3347 1.65011 2.04004 2.52213
0.684033 0.845678 1.04552 1.29259 1.59804 1.97568 2.44255 3.01975
0.654734 0.809455 1.00074 1.23722 1.52959 1.89105 2.33793 2.89041 3.57344
0.620599 0.767254 0.948564 1.17272 1.44985 1.79246 2.21604 2.73972 3.38714 4.18756
And the corresponding Price Tree and fitted OAS:
The OAS calculated here is 359.135 bps, as opposed to the 355.189 bps calculated through the BDT model. Since this model produces more rate sensitivity, the effective duration comes out slightly higher: 10.2225 (~10.0) years, which is reasonably acceptable for a zero-coupon bond maturing in 10 years.
So far, so good. Let's now make this a 5% coupon bond, semi-annual frequency, priced at par (yield = 5%) and do the calculation similarly. You would expect the OAS to stay fairly close but the effective duration to fall. Let's check it out.
The OAS, as expected, comes out to be 366.011 bps, which is reasonably close to the 359.135 bps for the zero coupon, since both have the same yield (5.0%) but this one has periodic cash flows that can be reinvested. The effective duration comes out to be 8.47687 years (< 10.0 years): due to its periodic cash flows, it is relatively less risky than its zero-coupon counterpart.
Finally, let's assess the OAS values for a bullet vs. a callable bond.
Let’s say we have a 5% coupon, 10 year bond, priced at 115.0,
and we get the following measures:
Effective Duration: 8.77167
which is reasonably fair.
What if it's continuously callable at par after 5 years? Because of this optionality it will be slightly cheaper; let's say it's priced at 113.5. What happens to the OAS and the effective duration now?
Effective Duration: 4.56597
Look at the low-rate region of the model-implied interest rate evolution (the top right of the tree). From an American option point of view, it's the efficient exercise region, and for a Mortgage Backed Security (we will get into this later), it's the prepayment region. The prices on these nodes were greater than the call price (par, or 100.0), so I set them explicitly to 100.0. This reduces the expected excess spread, relative to the bullet bond, due to the optionality.
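The call rule just described (cap node values at the call price in the exercise region) slots into backward induction as sketched below, with my own simplified conventions: continuous compounding, a per-step coupon, and calls allowed from a given step onward:

```python
import math

def callable_price(tree, dt, spread, coupon, face=100.0,
                   call_price=100.0, first_call_step=None):
    """Backward induction for a (continuously) callable coupon bond: at every
    callable node the issuer redeems at call_price whenever the continuation
    value exceeds it, so the node value is capped at call_price.
    coupon = payment per step (e.g. 2.5 for a 5% semi-annual coupon on 100)."""
    values = [face] * (len(tree) + 1)            # redemption at maturity
    for n in range(len(tree) - 1, -1, -1):
        nxt = []
        for i in range(n + 1):
            v = (0.5 * (values[i] + coupon) + 0.5 * (values[i + 1] + coupon)) \
                * math.exp(-(tree[n][i] + spread) * dt)
            if first_call_step is not None and n >= first_call_step:
                v = min(v, call_price)           # issuer calls in-the-money nodes
            nxt.append(v)
        values = nxt
    return values[0]
```

Re-solving the spread against the callable bond's (lower) market price with this capped induction is what turns the static spread into an option-adjusted one.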
As can be seen, though the callable bond appears to be a good deal since it's cheaper than its bullet counterpart, the expected excess spread is lower by almost 75 bps. This is compensated to some extent by the lower (almost halved) duration risk, but the reinvestment rate upon call would be much lower, marking a sizable loss over the bullet bond's full duration of 8.77167 years.
The significance of effective duration is that it probabilistically captures the embedded optionality as the price moves into the efficient exercise region, gradually shifting the duration, which can reduce hedging costs in the long run.
Another point not to miss: as we move the price up, more and more nodes in the lattice come into the efficient exercise region of the option; the OAS will reduce further, and so will the effective duration.
While these calculations make intuitive sense, these short-rate models have various shortcomings. We assumed a flat volatility term structure, which is not always the case; the local volatility might itself be stochastic, and to capture that effect we need a two-factor model. Also, these models only capture the evolution of the short rate and ignore the rest of the yield curve; there can certainly be more factors affecting the term structure.
So, it's clearly evident that a good grasp of these measures is necessary before applying algorithmic trading strategies to them. In further posts, we will focus on a few more similar measures and the strategies around them.
Hope you enjoyed the post ! Appreciations are welcome and criticisms are even more welcome !!!
1. Opinions expressed are solely my own and do not express the views or opinions of any of my employers.
2. The information from the Site is based on financial models, and trading signals are generated mathematically. All of the calculations, signals, timing systems, and forecasts are the result of backtesting, and are therefore merely hypothetical. Trading signals or forecasts used to produce our results were derived from equations which were developed through hypothetical reasoning based on a variety of factors. Theoretical buy and sell methods were tested against the past to prove the profitability of those methods in the past. Performance generated through back testing has many and possibly serious limitations. We do not claim that the historical performance, signals or forecasts will be indicative of future results. There will be substantial and possibly extreme differences between historical performance and future performance. Past performance is no guarantee of future performance. There is no guarantee that out-of-sample performance will match that of prior in-sample performance. The website does not claim or warrant that its timing systems, signals, forecasts, opinions or analyses are consistent, logical or free from hindsight or other bias or that the data used to generate signals in the backtests was available to investors on the dates for which theoretical signals were generated.