Mar 5

## VaR Historical Simulation

Following Value at Risk xls and VaR backtesting, here is a third post on using historical simulation (HS) for Value at Risk calculation. One known shortcoming of historical simulation is that the result depends heavily on the choice of sample window length: the VaR estimate barely moves for long stretches, then changes suddenly. Despite this weakness, HS is still popular for obvious reasons: it is easy to implement and requires no distributional assumption, which is especially appealing when the distribution is difficult to estimate. Several methods have been proposed to improve HS's performance; here are two with good results that I personally use.

1. *The Best of Both Worlds: A Hybrid Approach to Calculating Value at Risk* by Jacob Boudoukh, Matthew Richardson and Robert F. Whitelaw. "Hybrid" means the approach combines RiskMetrics' parametric method with historical simulation. The basic idea: since allocating larger weights to recent data and smaller weights to older data in an exponentially weighted moving average (EWMA) volatility estimate improves the backtesting performance of the parametric method, why not apply the same principle to historical simulation? Makes sense, right? So it estimates the VaR of a portfolio by applying exponentially declining weights to past returns and then finding the appropriate percentile of this time-weighted empirical distribution. The results on page 11 of the paper show an improvement over both vanilla historical simulation and the EWMA parametric method, nice.
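The weighting scheme above can be sketched in a few lines. This is a minimal sketch, not the authors' code; the function name `hybrid_var` and the decay factor 0.98 are my own choices (the paper experiments with several values of the decay parameter):

```python
import numpy as np

def hybrid_var(returns, confidence=0.99, lam=0.98):
    """BRW hybrid VaR sketch: apply exponentially declining weights to
    past returns, then read off the percentile of the weighted
    empirical distribution. `returns` is ordered oldest -> newest."""
    r = np.asarray(returns, dtype=float)
    n = len(r)
    ages = np.arange(n - 1, -1, -1)        # age 0 = most recent return
    w = (1 - lam) * lam ** ages            # recent data gets more weight
    w /= w.sum()                           # normalise so weights sum to 1
    order = np.argsort(r)                  # sort returns, worst losses first
    cum_w = np.cumsum(w[order])
    # first sorted return whose cumulative weight reaches 1 - confidence
    idx = np.searchsorted(cum_w, 1.0 - confidence)
    return -r[order][idx]                  # report VaR as a positive loss
```

With `lam` close to 1 the weights flatten out and the estimate approaches plain historical simulation; smaller `lam` makes the VaR react faster to recent losses.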

2. *Incorporating Volatility Updating into the Historical Simulation Method for Value at Risk* by John Hull and Alan White. The idea is to "adjust" each past return by the ratio of current volatility to the volatility prevailing when that return was observed, and then run historical simulation on the adjusted returns. Their argument: suppose today's volatility is 20% while past volatility was, say, 30%; then the past returns, used directly, obviously exaggerate the current market situation. They even compare their performance against the hybrid approach above; the comparison appears on page 17 of their paper.


Results are promising, aren't they? A few lines of code are enough for the adjustment.
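Here is what those few lines might look like. This is a sketch under my own assumptions: the volatility at each date is estimated with an EWMA recursion using the RiskMetrics decay of 0.94, and the function names (`vol_adjusted_returns`, `hs_var`) are hypothetical, not from the paper:

```python
import numpy as np

def vol_adjusted_returns(returns, lam=0.94):
    """Hull-White volatility updating sketch: scale each historical
    return by sigma_today / sigma_t, where sigma_t is an EWMA
    volatility estimate at the time the return was observed."""
    r = np.asarray(returns, dtype=float)
    var = np.empty_like(r)
    var[0] = np.var(r[:20])                # seed the EWMA with sample variance
    for t in range(1, len(r)):
        var[t] = lam * var[t - 1] + (1 - lam) * r[t - 1] ** 2
    # today's volatility, one step beyond the last observation
    sigma_today = np.sqrt(lam * var[-1] + (1 - lam) * r[-1] ** 2)
    return r * sigma_today / np.sqrt(var)

def hs_var(adjusted, confidence=0.99):
    # plain historical-simulation VaR on the volatility-adjusted returns
    return -np.percentile(adjusted, 100.0 * (1.0 - confidence))
```

If volatility has fallen since the high-volatility part of the sample, the adjusted returns are scaled down, so the VaR no longer overstates current risk.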




Simon

2011/04/07 01:07

I have read about those methods in Carol Alexander's book. Simple improvements with good results. Would anyone like to share code for it? I am only able to do it manually ;).

J

2011/10/20 04:14

Is there an update to the XLS for this?
