# Quantitative Finance Collector is a blog on quantitative finance analysis and financial engineering methods in mathematical finance, focusing on derivative pricing, quantitative trading and quantitative risk management. Random thoughts on financial markets and personal stuff are posted at the personal sub-blog.

Nov
13

I have written a working paper on CDS (credit default swap) implied stock volatility and found some interesting results. I am posting it here in case someone is interested.

Both CDS and out-of-the-money put options can protect investors against downside risk, so the two are related while not mutually replaceable. This study provides a straightforward link between corporate CDS and equity options by inferring stock volatility from the CDS spread, which enables a direct analogy with the volatility implied by option prices. I find that CDS-inferred volatility (CIV) and option-implied volatility (OIV) are complementary, each containing information not captured by the other. CIV dominates OIV in forecasting a stock's future realized volatility. Moreover, a trading strategy based on the mean-reverting CIV-OIV spread generates significant risk-adjusted returns. These findings complement the existing empirical evidence on cross-market analysis.
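The mean-reversion idea behind the last finding can be sketched as a simple z-score rule on the CIV-OIV spread. This is my own illustrative sketch, not the paper's exact strategy; the window length and entry threshold are hypothetical:

```python
import numpy as np

def spread_signal(civ, oiv, window=60, entry=1.5):
    """Trading signal on the CIV-OIV spread: -1 (bet the spread narrows)
    when it is unusually wide versus its rolling history, +1 when unusually
    narrow, 0 otherwise."""
    spread = np.asarray(civ, dtype=float) - np.asarray(oiv, dtype=float)
    signals = np.zeros_like(spread)
    for t in range(window, len(spread)):
        hist = spread[t - window:t]
        z = (spread[t] - hist.mean()) / hist.std()  # rolling z-score
        if z > entry:
            signals[t] = -1.0   # spread expected to revert down
        elif z < -entry:
            signals[t] = 1.0    # spread expected to revert up
    return signals
```

In practice the entry/exit thresholds, the rolling window, and transaction costs would all matter; the point here is only the mechanics of trading the spread's mean reversion.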

Jan
31

A paper published in the Journal of Portfolio Management, 2013, Vol. 39, No. 2: pp. 28-40, by Alexandre Hocquard, Sunny Ng, and Nicolas Papageorgiou.

The idea is simple and easy to implement, and it performs well based on the authors' results.

Journal paper, Working paper.

Since Lehman Brothers collapsed in 2008, tail-risk hedging has become an increasingly important concern for investors. Traditional approaches, such as purchasing options or variance swaps as insurance, are often expensive, illiquid, and result in a substantial drag on performance. A more prudent, cost-effective way to maintain a constant risk exposure is to actively manage portfolio exposure according to the prevailing volatility level within underlying assets. The authors implement a robust methodology based on Dybvig’s payoff distribution model to target a constant level of volatility and normalize monthly returns. This approach to portfolio and risk management can help investors obtain their desired risk exposures over both short and longer time frames, reduce exposure to tail risk, and in general increase portfolios’ risk-adjusted performance.
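A crude flavour of constant-risk exposure management (not Dybvig's payoff distribution model itself, which is more involved) is to scale exposure inversely to trailing realized volatility. A minimal sketch with illustrative parameters:

```python
import numpy as np

def vol_target_weights(returns, target_vol=0.10, window=21,
                       max_leverage=2.0, periods=252):
    """Scale exposure each day so that realized portfolio volatility stays
    near target_vol: weight_t = target_vol / realized_vol_t, capped at
    max_leverage. Entries before the first full window are NaN."""
    r = np.asarray(returns, dtype=float)
    weights = np.full(len(r), np.nan)
    for t in range(window, len(r)):
        realized = r[t - window:t].std() * np.sqrt(periods)  # annualized
        if realized > 0:
            weights[t] = min(target_vol / realized, max_leverage)
        else:
            weights[t] = max_leverage
    return weights
```

Exposure falls when trailing volatility rises, which is the basic mechanism by which such strategies trim tail risk; the authors' approach targets the whole payoff distribution rather than this simple volatility ratio.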


Oct
11

A paper forthcoming in The Econometrics Journal by Qu and Perron, worth reading carefully.

Journal paper, Working paper in PDF

This paper proposes a framework for the modeling, inference and forecasting of volatility in the presence of level shifts of unknown timing, magnitude and frequency. First, we consider a stochastic volatility model comprising both a level shift and a short-memory component, with the former modeled as a compound binomial process and the latter as an AR(1). Next, we adopt a Bayesian approach for inference and develop algorithms to obtain posterior distributions of the parameters and the two latent components. Then, we apply the model to daily S&P 500 and NASDAQ returns over the period 1980.1–2010.12. The results show that although the occurrence of a level shift is rare, about once every two years, this component clearly contributes most to the variation in the volatility. The half-life of a typical shock from the AR(1) component is short, on average between 9 and 15 days. Interestingly, isolating the level shift component from the overall volatility reveals a stronger relationship between volatility and business cycle movements. Although the paper focuses on daily index returns, the methods developed can potentially be used to study the low frequency variation in realized volatility or the volatility of other financial or macroeconomic variables.
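The two-component model is easy to simulate: log volatility is a level-shift process (a compound binomial: rare jumps of random size) plus a stationary AR(1). A sketch with illustrative parameter values of my own choosing; note that phi = 0.95 implies an AR(1) half-life of log(0.5)/log(0.95) ≈ 13.5 days, inside the 9-15 day range reported in the paper:

```python
import numpy as np

def simulate_level_shift_sv(n=2520, p=0.002, shift_scale=0.5,
                            phi=0.95, ar_scale=0.1, base=-9.2, seed=0):
    """Simulate log volatility h_t = mu_t + c_t, where mu_t is a compound
    binomial level-shift process (a shift occurs with probability p each day)
    and c_t is a stationary AR(1) short-memory component. Returns the daily
    return series r_t = exp(h_t / 2) * eps_t and the latent h_t path."""
    rng = np.random.default_rng(seed)
    shifts = rng.binomial(1, p, n) * rng.normal(0.0, shift_scale, n)
    mu = base + np.cumsum(shifts)            # level-shift component
    c = np.zeros(n)
    for t in range(1, n):                    # AR(1) short-memory component
        c[t] = phi * c[t - 1] + rng.normal(0.0, ar_scale)
    h = mu + c
    returns = np.exp(h / 2) * rng.normal(0.0, 1.0, n)
    return returns, h
```

With p = 0.002 a shift occurs roughly once every 500 trading days, i.e. about once every two years, matching the rarity the authors report; the Bayesian inference machinery for recovering the two latent components from data is, of course, the hard part of the paper.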


Sep
29

A great paper by Álvaro Cartea and Dimitrios Karyampas, published in Applied Mathematical Finance, Volume 19, Number 6, December 2012, pp. 535-552.

Journal paper, Working paper

We test the performance of different volatility estimators that have recently been proposed in the literature and have been designed to deal with problems arising when ultra high-frequency data are employed: microstructure noise and price discontinuities. Our goal is to provide an extensive simulation analysis for different levels of noise and frequency of jumps to compare the performance of the proposed volatility estimators. We conclude that the maximum likelihood estimator filter (MLE-F), a two-step parametric volatility estimator proposed by Cartea and Karyampas (2011a; The relationship between the volatility returns and the number of jumps in financial markets, SSRN eLibrary, Working Paper Series, SSRN), outperforms most of the well-known high-frequency volatility estimators when different assumptions about the path properties of stock dynamics are used.
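The core problem is easy to reproduce in simulation: with i.i.d. microstructure noise, tick-by-tick realized variance is biased upward by roughly 2n times the noise variance, while sparser sampling shrinks the bias at the cost of throwing data away. A sketch of that baseline trade-off (not the MLE-F estimator itself, just the naive estimators it is benchmarked against), with illustrative parameter values:

```python
import numpy as np

def realized_variance(prices, step=1):
    """Realized variance: sum of squared log returns, sampling every
    `step` ticks."""
    log_p = np.log(np.asarray(prices, dtype=float)[::step])
    return np.sum(np.diff(log_p) ** 2)

# One simulated trading day: an efficient log price observed with i.i.d. noise.
rng = np.random.default_rng(1)
n, sigma_annual, noise_std = 23400, 0.20, 1e-4      # ~1 obs/second, illustrative
true_var = sigma_annual ** 2 / 252                  # daily integrated variance
efficient = np.cumsum(rng.normal(0.0, np.sqrt(true_var / n), n))
observed = 100.0 * np.exp(efficient + rng.normal(0.0, noise_std, n))

rv_tick = realized_variance(observed)               # bias ~ 2 * n * noise_var
rv_sparse = realized_variance(observed, step=300)   # ~5-minute sampling
```

With these numbers the tick-level estimate is inflated several-fold by the noise term (2·n·noise_var ≈ 4.7e-4 versus a true daily variance of ≈ 1.6e-4), which is exactly the trade-off that the estimators surveyed in the paper are designed to resolve; jumps add a second, separate complication.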


Jul
31

A nice paper written by Han and Zhang (2012) in The Econometrics Journal.

Article, or working paper.

We investigate a new non-stationary non-parametric volatility model, in which the conditional variance of time series is modelled as a non-parametric function of an integrated or near-integrated covariate. Importantly, the model can generate the long memory property in volatility and allow the unconditional variance of time series to be time-varying. These properties cannot be derived from most existing non-parametric or semi-parametric volatility models. We show that the kernel estimate of the model is consistent and its asymptotic distribution is mixed normal. For an empirical application of the model, we study the daily S&P 500 index return volatility using the VIX index as the covariate. It is shown that our model performs reasonably well both in within-sample and out-of-sample forecasts.
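The kernel estimate in question is essentially a Nadaraya-Watson regression of squared returns on the covariate. A minimal sketch, with a Gaussian kernel and hypothetical inputs standing in for the S&P 500 returns and the VIX:

```python
import numpy as np

def nw_variance(x_grid, x, r2, h=0.5):
    """Nadaraya-Watson estimate of E[r_t^2 | x_t = x0] at each grid point x0:
    a non-parametric conditional-variance curve over the covariate x."""
    est = np.empty(len(x_grid))
    for i, x0 in enumerate(x_grid):
        w = np.exp(-0.5 * ((x - x0) / h) ** 2)  # Gaussian kernel weights
        est[i] = np.sum(w * r2) / np.sum(w)     # locally weighted average
    return est
```

In the paper's application x would be the (near-integrated) VIX level and r2 the squared index returns; here both are placeholders, and in practice the bandwidth h needs proper data-driven selection, with the non-stationarity of the covariate driving the mixed-normal asymptotics the authors derive.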
