The MFA annual conference provides a forum for finance academics and practitioners to share scholarly work and current practice, with the aim of bettering the profession. Below I select several papers, with download links, that are of interest to me; it is by no means a list of the conference's best papers.

paper

paper

paper

paper

paper

paper

paper

Tags - conference

An adequate risk-adjusted return performance measure for selecting investment funds is crucial for financial analysts and investors. The Sharpe ratio, which adjusts the return of a fund by its standard deviation, has become a standard measure (Sharpe, 1966). Nevertheless, practitioners often question this measure, mainly because it is invalid when the distribution of fund returns departs from normality (Kao, 2002; Amin and Kat, 2003; Gregoriou and Gueyie, 2003; Cavenaile et al., 2011; Di Cesare et al., 2014). Several new measures have been proposed and investigated to overcome this limitation of the Sharpe ratio. However, Eling (2008) finds that the choice of performance measure is not critical to mutual fund evaluation, and Eling and Schuhmacher (2007) compare the Sharpe ratio with 12 other measures for hedge funds and conclude that the Sharpe ratio and the other measures generate virtually identical rank orderings, despite significant deviations from normality. Similar evaluations include Eling and Faust (2010) on funds in emerging markets, Auer and Schuhmacher (2013) on hedge funds, and Auer (2015) on commodity investments.

This paper proves that several widely used performance measures are monotonic if the distribution of asset returns belongs to an LS family, a family of univariate probability distributions parametrized by a location and a non-negative scale parameter that is commonly applied in finance (Levy and Duchin, 2004). Our proof certifies the empirical findings of other studies on the indifference of performance-measure choice when valuing a fund. Using monthly mutual fund return data from 1997 to 2005 and Monte Carlo simulations, we show that these measures generate virtually the same rank ordering. The paper thus contributes to both academia and industry by clarifying this phenomenon.

For example, the figure below plots the correlations and confidence intervals based on 2,000 simulations for each sample size. For simplicity, we show results only for the Sharpe (ρ1), Sharpe-Omega (ρ2), and Sortino (ρ3) ratios. Consistent with the previous finding, the rank correlations among these performance measures are roughly equal and approach one as the sample size increases.
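To illustrate the kind of exercise behind this figure, here is a minimal Python sketch of my own (not the paper's code): it simulates fat-tailed fund returns, computes the Sharpe, Sharpe-Omega, and Sortino ratios, and checks their rank correlation. All parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def sharpe(r):
    return r.mean() / r.std(ddof=1)

def sortino(r, target=0.0):
    downside = np.minimum(r - target, 0.0)
    return (r.mean() - target) / np.sqrt((downside ** 2).mean())

def sharpe_omega(r, target=0.0):
    # Sharpe-Omega: excess return over the expected shortfall below the target
    return (r.mean() - target) / np.maximum(target - r, 0.0).mean()

def rank_corr(a, b):
    # Spearman rank correlation via Pearson correlation of ranks
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

# 200 hypothetical funds, 120 monthly returns each, with fat tails
n_funds, n_obs = 200, 120
funds = rng.normal(0.006, 0.03, (n_funds, n_obs)) \
      + 0.01 * rng.standard_t(5, (n_funds, n_obs))

sh = np.array([sharpe(f) for f in funds])
so = np.array([sortino(f) for f in funds])
om = np.array([sharpe_omega(f) for f in funds])

print(rank_corr(sh, om), rank_corr(sh, so))  # both close to one
```

Even though the simulated returns are clearly non-normal, the three measures rank the funds almost identically, which is the phenomenon the paper formalizes.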

Tags - sharpe-ratio , mutual-fund , performance

Quotation

Both CDS and out-of-the-money put options can protect investors against downside risk, so they are related while not being mutually replaceable. This study provides a straightforward link between corporate CDS and equity options by inferring stock volatility from the CDS spread and, thus, enables a direct analogy with the implied volatility from option prices. I find that CDS-inferred volatility (CIV) and option-implied volatility (OIV) are complementary, each containing some information not captured by the other. CIV dominates OIV in forecasting the stock's future realized volatility. Moreover, a trading strategy based on the CIV-OIV mean-reverting spread generates significant risk-adjusted returns. These findings complement existing empirical evidence on cross-market analysis.

Click to download

Tags - cds , volatility

Quotation

In the current literature, the analytical tractability of discrete time option pricing models is guaranteed only for rather specific types of models and pricing kernels. We propose a very general and fully analytical option pricing framework, encompassing a wide class of discrete time models featuring multiple-component structure in both volatility and leverage, and a flexible pricing kernel with multiple risk premia. Although the proposed framework is general enough to include either GARCH-type volatility, Realized Volatility or a combination of the two, in this paper we focus on realized volatility option pricing models by extending the Heterogeneous Autoregressive Gamma (HARG) model of Corsi et al. (2012) to incorporate heterogeneous leverage structures with multiple components, while preserving closed-form solutions for option prices. Applying our analytically tractable asymmetric HARG model to a large sample of S&P 500 index options, we demonstrate its superior ability to price out-of-the-money options compared to existing benchmarks.

http://www.sciencedirect.com/science/article/pii/S0304407615000615

Quotation

Volatility clustering, long-range dependence, and non-Gaussian scaling are stylized facts of financial assets dynamics. They are ignored in the Black & Scholes framework, but have a relevant impact on the pricing of options written on financial assets. Using a recent model for market dynamics which adequately captures the above stylized facts, we derive closed form equations for option pricing, obtaining the Black & Scholes as a special case. By applying our pricing equations to a major equity index option dataset, we show that inclusion of stylized features in financial modeling moves derivative prices about 30% closer to the market values without the need of calibrating models parameters on available derivative prices.

http://www.sciencedirect.com/science/article/pii/S0304407615000585

Quotation

We analyze the high-frequency dynamics of S&P 500 equity-index option prices by constructing an assortment of implied volatility measures. This allows us to infer the underlying fine structure behind the innovations in the latent state variables driving the evolution of the volatility surface. In particular, we focus attention on implied volatilities covering a wide range of moneyness (strike/underlying stock price), which load differentially on the different latent state variables. We conduct a similar analysis for high-frequency observations on the VIX volatility index as well as on futures written on it. We find that the innovations over small time scales in the risk-neutral intensity of the negative jumps in the S&P 500 index, which is the dominant component of the short-maturity out-of-the-money put implied volatility dynamics, are best described via non-Gaussian shocks, i.e., jumps. On the other hand, the innovations over small time scales of the diffusive volatility, which is the dominant component in the short-maturity at-the-money option implied volatility dynamics, are best modeled as Gaussian with occasional jumps.

http://www.sciencedirect.com/science/article/pii/S0304407615000627

Quotation

The paper proposes a general asymmetric multifactor Wishart stochastic volatility (AMWSV) diffusion process which accommodates leverage, feedback effects and multifactor for the covariance process. The paper gives the closed-form solution for the conditional and unconditional Laplace transform of the AMWSV models. The paper also suggests estimating the AMWSV model by the generalized method of moments using information not only of stock prices but also of realized volatilities and co-volatilities. The empirical results for the bivariate data of the NASDAQ 100 and S&P 500 indices show that the general AMWSV model is preferred among several nested models.

http://www.sciencedirect.com/science/article/pii/S0304407615000548

Quotation

We introduce a tractable class of multi-factor price processes with regime-switching stochastic volatility and jumps, which flexibly adapt to changing market conditions and permit fast option pricing. A small set of structural parameters, whose dimension is invariant to the number of factors, fully specifies the joint dynamics of the underlying asset and options implied volatility surface. We develop a novel particle filter for efficiently extracting the latent state from joint S&P 500 returns and options data. The model outperforms standard benchmarks in- and out-of-sample, and remains robust even in the wake of seemingly large discontinuities such as the recent financial crisis.

http://www.sciencedirect.com/science/article/pii/S0304407615000597

Quotation

Assume that St is a stock price process and Bt is a bond price process with a constant continuously compounded risk-free interest rate, where both are defined on an appropriate probability space P. Let yt=log(St/St−1). yt can be generally decomposed into a conditional mean plus a noise with volatility components, but the discounted St is not a martingale under P. Under a general framework, we obtain a risk-neutralized measure Q under which the discounted St is a martingale in this paper. Using this measure, we show how to derive the risk neutralized price for the derivatives. Special examples, such as NGARCH, EGARCH and GJR pricing models, are given. Simulation study reveals that these pricing models can capture the “volatility skew” of implied volatilities in the European option. A small application highlights the importance of our model-based pricing procedure.

http://www.sciencedirect.com/science/article/pii/S030440761500055X

Tags - option

Quotation

Using the Chinese stock market data from 1997 to 2013, this paper examines the “Sell in May and Go Away” puzzle first identified by Bouman and Jacobsen (2002). We find strong existence of the Sell in May effect, robust to different regression assumptions, industries, and after controlling for the January or February effect. However, part of the puzzle is subsumed by the seasonal affective disorder effect. We then construct a trading strategy based on this puzzle, and find that it outperforms the buy-and-hold strategy and could resist the market downside risk during large recession periods.

As the abstract suggests, we aim to examine whether the sell-in-May phenomenon documented in developed countries also occurs in China and, if so, whether there is any special reason that explains it. The question has implications for international investors, as MSCI plans to add Chinese A shares to its emerging-markets index from May 2015, and as China's recent stock-market reform permits Hong Kong investors to trade designated stocks on the Shanghai exchange directly. One would expect investing in China to provide diversification benefits.

The “Sell in May and Go Away” puzzle means that stocks earn higher returns in the November-April period than in the May-October period. In this paper we first run a regression with a dummy variable that equals 0 when date t falls in the May-October period and 1 otherwise. We find that the dummy variable is highly significant, is not driven by a specific industry, and cannot be explained by the well-known January or February effect, nor by time-varying risk. Nevertheless, time-varying risk aversion, proxied by the SAD (seasonal affective disorder) effect of Kamstra et al. (2003), subsumes part of the Sell in May effect.

We then test whether the phenomenon generates any economic benefit. We construct a trading strategy that buys the Chinese stock market at the beginning of November and sells it at the end of April of the next year; from the beginning of May to the end of October, the capital sits in a bank earning a risk-free floating deposit rate. Our benchmark is a buy-and-hold strategy. This simple trading strategy outperforms the buy-and-hold strategy and protects investors from dramatic losses during large recessions, as shown in the table and figure below.
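The mechanics of such a backtest can be sketched in a few lines of Python. This is a rough illustration of my own, not the paper's data or code: the return process and the deposit rate are made-up placeholders.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Placeholder monthly market returns; the paper uses Chinese stock market data
dates = pd.date_range("1997-01-31", "2013-12-31", freq="M")
mkt = pd.Series(rng.normal(0.01, 0.08, len(dates)), index=dates)
deposit = 0.003  # stand-in for the monthly floating deposit rate

# Hold the market November-April, park the capital in the bank May-October
in_market = ~mkt.index.month.isin(range(5, 11))
strat = pd.Series(np.where(in_market, mkt, deposit), index=dates)

def annualized(r):
    return (1 + r).prod() ** (12 / len(r)) - 1

print(annualized(strat), annualized(mkt))
```

With real data, one would replace `mkt` with index returns and `deposit` with the actual deposit-rate series before comparing the two annualized figures.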

| Metric | Sell in May strategy | Buy-and-hold strategy |
| --- | --- | --- |
| Return | 13.03% | 7.50% |
| Sharpe ratio | 0.6002 | 0.2199 |
| Maximum drawdown | 27.00% | 69.30% |
| Downside deviation | 2.98% | 5.34% |
| Historical VaR (95%) | 6.86% | 11.20% |
| Leland's alpha | 8.69% | |

The short paper is at http://www.sciencedirect.com/science/article/pii/S1544612314000579

Quotation

We propose a new definition of skill as a general cognitive ability to either pick stocks or time the market at different times. We find evidence for stock picking in booms and for market timing in recessions. Moreover, the same fund managers that pick stocks well in expansions also time the market well in recessions. These fund managers significantly outperform other funds and passive benchmarks. Our results suggest a new measure of managerial ability that gives more weight to a fund’s market timing in recessions and to a fund’s stock picking in booms. The measure displays far more persistence than either market timing or stock picking alone and can predict fund performance.

Paper.

Tags - mutual-fund , skill

Quotation

We propose a model of portfolio selection that adjusts an investor's portfolio allocation in accordance with changing market liquidity environments and market conditions. We find that market liquidity provides a useful “leading indicator” in dynamic asset allocation. Specifically, market liquidity risk premium cycles anticipate economic and market cycles. Investors can therefore act to avoid markets with low liquidity premiums, waiting to extract liquidity risk premiums when the likelihood of extracting a liquidity premium improves. The result, meaningfully enhanced portfolio performance through economic and market cycles, is robust to transaction costs and alternate specifications.

In essence, this article examines a portfolio strategy that buys stocks and sells bonds when the market is less liquid, thereby earning a higher liquidity premium. The strategy outperforms a benchmark with equal weights on stocks and bonds, generating a higher Sharpe ratio and a positive alpha.

Journal paper Working paper

Tags - liquidity , portfolio , allocation

Quotation

We propose that fund performance can be predicted by its R2, obtained from a regression of its returns on a multifactor benchmark model. Lower R2 indicates greater selectivity, and it significantly predicts better performance. Stock funds sorted into lowest-quintile lagged R2 and highest-quintile lagged alpha produce significant annual alpha of 3.8%. Across funds, R2 is positively associated with fund size and negatively associated with its expenses and manager's tenure.
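The R² in question is simply the coefficient of determination from a regression of fund returns on benchmark factors. Below is a single-fund sketch of that computation; the three-factor setup, the betas, and the noise levels are all illustrative assumptions of mine, not the paper's benchmark model.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 120  # monthly observations

# Hypothetical factor returns (e.g. market, size, value) and one fund
factors = rng.normal(0.005, 0.03, size=(T, 3))
beta = np.array([1.0, 0.2, -0.1])
idiosyncratic = rng.normal(0, 0.02, T)  # active bets lower the R2
fund = factors @ beta + idiosyncratic

# R2 from an OLS regression of fund returns on the factors
X = np.column_stack([np.ones(T), factors])
coef, *_ = np.linalg.lstsq(X, fund, rcond=None)
resid = fund - X @ coef
r2 = 1 - resid.var() / fund.var()
print(r2)  # well below 1 because of the idiosyncratic (active) component
```

In the paper's logic, sorting funds on this lagged R² (lower R² meaning greater selectivity) is what predicts subsequent performance.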

Journal paper, Working paper.

Tags - mutual-fund , prediction

Quotation

Since Lehman Brothers collapsed in 2008, tail-risk hedging has become an increasingly important concern for investors. Traditional approaches, such as purchasing options or variance swaps as insurance, are often expensive, illiquid, and result in a substantial drag on performance. A more prudent, cost-effective way to maintain a constant risk exposure is to actively manage portfolio exposure according to the prevailing volatility level within underlying assets. The authors implement a robust methodology based on Dybvig’s payoff distribution model to target a constant level of volatility and normalize monthly returns. This approach to portfolio and risk management can help investors obtain their desired risk exposures over both short and longer time frames, reduce exposure to tail risk, and in general increase portfolios’ risk-adjusted performance.

The idea is simple, easy to implement, and performs well according to the authors' results.
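The gist of targeting a constant volatility level can be sketched as follows. This is a naive illustration of the general idea, not the authors' Dybvig-based methodology; the return process, target level, window, and leverage cap are all assumptions of mine.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily returns with a volatility regime shift halfway through
ret = np.concatenate([rng.normal(0, 0.01, 250), rng.normal(0, 0.03, 250)])

target_vol = 0.15 / np.sqrt(252)  # 15% annual target, in daily units
window = 21                        # trailing one-month window

# Scale exposure inversely to trailing realized volatility
scaled = np.empty_like(ret)
scaled[:window] = ret[:window]
for t in range(window, len(ret)):
    realized = ret[t - window:t].std(ddof=1)
    leverage = min(target_vol / realized, 2.0)  # cap leverage at 2x
    scaled[t] = leverage * ret[t]

print(ret[window:].std(), scaled[window:].std())  # scaled series is nearer the target
```

The scaled series holds its realized volatility close to the target through the regime change, which is exactly the constant risk exposure the abstract describes.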

Journal paper, Working paper.

Tags - volatility , tail , risk , portfolio

Quotation

Portfolio optimization problems involving value at risk (VaR) are often computationally intractable and require complete information about the return distribution of the portfolio constituents, which is rarely available in practice. These difficulties are compounded when the portfolio contains derivatives. We develop two tractable conservative approximations for the VaR of a derivative portfolio by evaluating the worst-case VaR over all return distributions of the derivative underliers with given first- and second-order moments. The derivative returns are modelled as convex piecewise linear or—by using a delta–gamma approximation—as (possibly nonconvex) quadratic functions of the returns of the derivative underliers. These models lead to new worst-case polyhedral VaR (WPVaR) and worst-case quadratic VaR (WQVaR) approximations, respectively. WPVaR serves as a VaR approximation for portfolios containing long positions in European options expiring at the end of the investment horizon, whereas WQVaR is suitable for portfolios containing long and/or short positions in European and/or exotic options expiring beyond the investment horizon. We prove that—unlike VaR that may discourage diversification—WPVaR and WQVaR are in fact coherent risk measures. We also reveal connections to robust portfolio optimization.

Journal, Working paper in PDF.

Tags - var , nonlinear , risk

"How to Combine Long and Short Return Histories Efficiently" is a good paper forthcoming in the Financial Analysts Journal by Sébastien Page, introduced below.

Quotation

A common challenge in portfolio risk analysis is that certain assets have shorter return histories than others. Unfortunately, many standard portfolio risk analysis techniques—including historical tail risk measurement, regime-dependent risk analysis, and bootstrapping simulations—require full return histories for all assets or risk factors. The author presents easy instructions on how to efficiently combine data for investments whose histories differ in length and offers a new model to better account for non-normal distributions.

An important feature of this paper is that, instead of assuming the uncertainty around the backfilled returns is normally distributed, the model samples empirical residuals from the short sample. Evidence shows this method is efficient. The author also provides Matlab code in the Appendix to play around with.
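A stripped-down version of the backfilling idea, as I read it, looks like the following; the variable names, history lengths, and return processes are my own placeholders, not the author's code.

```python
import numpy as np

rng = np.random.default_rng(3)

T_long, T_short = 240, 120  # months of history
long_hist = rng.normal(0.005, 0.04, T_long)  # full-history series
# Short-history asset, observed only over the recent overlap
short = 0.8 * long_hist[-T_short:] + rng.normal(0, 0.02, T_short)

# Fit the overlap: short = a + b * long + e
X = np.column_stack([np.ones(T_short), long_hist[-T_short:]])
(a, b), *_ = np.linalg.lstsq(X, short, rcond=None)
resid = short - X @ np.array([a, b])

# Backfill the missing early period; instead of Gaussian noise,
# resample empirical residuals to preserve any non-normality
missing = long_hist[:T_long - T_short]
backfilled = a + b * missing + rng.choice(resid, size=len(missing), replace=True)

full = np.concatenate([backfilled, short])
print(len(full))  # 240
```

The resampling step is what distinguishes this from the usual normal-noise backfill: the backfilled period inherits the empirical shape of the short sample's residuals.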

Paper

Tags - missing , imputation , mle , em , distribution

Below are three sets of frequently asked questions (FAQs) relating to counterparty credit risk, including the default counterparty credit risk charge, the credit valuation adjustment (CVA) capital charge, and asset value correlations. More sets may be forthcoming; stay tuned.

First set

Second set

Third set

Tags - risk , counterparty , basel , credit , cva

Quotation

Going beyond the simple bid–ask spread overlay for a particular Value at Risk, the author introduces an innovative framework that integrates liquidity risk, funding risk, and market risk. He overlaid a whole distribution of liquidity uncertainty on future market risk scenarios and allowed the liquidity uncertainty to vary from one scenario to another, depending on the liquidation or funding policy implemented. The result is one easy-to-interpret, easy-to-implement formula for the total liquidity-plus-market-risk profit and loss distribution.

Journal paper, Working paper

Tags - liquidity , var , risk

Quotation

This paper proposes a framework for the modeling, inference and forecasting of volatility in the presence of level shifts of unknown timing, magnitude and frequency. First, we consider a stochastic volatility model comprising both a level shift and a short-memory component, with the former modeled as a compound binomial process and the latter as an AR(1). Next, we adopt a Bayesian approach for inference and develop algorithms to obtain posterior distributions of the parameters and the two latent components. Then, we apply the model to daily S&P 500 and NASDAQ returns over the period 1980.1–2010.12. The results show that although the occurrence of a level shift is rare, about once every two years, this component clearly contributes most to the variation in the volatility. The half-life of a typical shock from the AR(1) component is short, on average between 9 and 15 days. Interestingly, isolating the level shift component from the overall volatility reveals a stronger relationship between volatility and business cycle movements. Although the paper focuses on daily index returns, the methods developed can potentially be used to study the low frequency variation in realized volatility or the volatility of other financial or macroeconomic variables.

Journal paper, Working paper in PDF

Tags - stochastic , volatility

Quotation

We test the performance of different volatility estimators that have recently been proposed in the literature and have been designed to deal with problems arising when ultra high-frequency data are employed: microstructure noise and price discontinuities. Our goal is to provide an extensive simulation analysis for different levels of noise and frequency of jumps to compare the performance of the proposed volatility estimators. We conclude that the maximum likelihood estimator filter (MLE-F), a two-step parametric volatility estimator proposed by Cartea and Karyampas (2011a; The relationship between the volatility returns and the number of jumps in financial markets, SSRN eLibrary, Working Paper Series, SSRN), outperforms most of the well-known high-frequency volatility estimators when different assumptions about the path properties of stock dynamics are used.

Journal paper, Working paper

Tags - volatility

A paper "

Quotation

This paper proposes a simple model for incorporating wrong-way and right-way risk into CVA (credit value adjustment) calculations. These are the calculations, involving Monte Carlo simulation, made by a dealer to determine the reduction in the value of its derivatives portfolio because of the possibility of a counterparty default. The model assumes a relationship between the hazard rate of the counterparty and variables whose values can be generated as part of the Monte Carlo simulation. Numerical results for portfolios of 25 instruments dependent on five underlying market variables are presented. The paper finds that wrong-way and right-way risk have a significant effect on the Greek letters of CVA as well as on CVA itself. It also finds that the percentage effect depends on the collateral arrangements.

Article, Working paper.

Tags - cva , default , credit , crisis , portfolio

Quotation

We propose a new method for measuring the quality of banks' credit portfolios. This method makes use of information embedded in bank share prices by exploiting differences in their sensitivity to credit default swap spreads of borrowers of varying quality. The method allows us to derive a credit risk indicator (CRI). This indicator represents the perceived share of high-risk exposures in a bank's portfolio and can be used as a risk weight for computing regulatory capital requirements. We estimate CRIs for the 150 largest U.S. bank holding companies. We find that their CRIs are able to forecast bank failures and share price performances during the crisis of 2007–2009, even after controlling for a variety of traditional asset quality and general risk proxies.

Article, Working paper

Tags - crisis , cds , credit , risk , bank

Quotation

We investigate a new non-stationary non-parametric volatility model, in which the conditional variance of time series is modelled as a non-parametric function of an integrated or near-integrated covariate. Importantly, the model can generate the long memory property in volatility and allow the unconditional variance of time series to be time-varying. These properties cannot be derived from most existing non-parametric or semi-parametric volatility models. We show that the kernel estimate of the model is consistent and its asymptotic distribution is mixed normal. For an empirical application of the model, we study the daily S&P 500 index return volatility using the VIX index as the covariate. It is shown that our model performs reasonably well both in within-sample and out-of-sample forecasts.

article, or working paper.

Tags - non-parametric , volatility , vix

Tags - forecast , portfolio , strategy , r , correlation , volatility , skew

Tags - liquidity , risk , var , option , return , jpmorgan

Tags - return , forecast , volatility , big-data

An accurate estimation of the VIX is obviously important given its special role as the market's fear gauge, and there is an extensive literature trying to provide one; among the candidates, mean-reverting models are especially popular. The authors compare eight different mean-reverting models, each with a different mean-reversion speed or diffusion term; the models are summarized in their Table 2.1.

Using VIX index values between 1990 and 2009, the authors estimate the parameters of the eight models by the generalized method of moments (GMM) and calculate the root mean squared error (RMSE), where equations (1) and (2) in the paper are two measures of the error term.
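For intuition, the simplest member of the mean-reverting family is an Ornstein-Uhlenbeck-type process. The sketch below simulates one discretized path as a stand-in for the VIX and computes a one-step-ahead RMSE; the parameter values are arbitrary assumptions, not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulate a mean-reverting (Ornstein-Uhlenbeck) path as a VIX stand-in
kappa, theta, sigma, dt = 5.0, 20.0, 4.0, 1.0 / 252
n = 2000
v = np.empty(n)
v[0] = theta
for t in range(1, n):
    v[t] = v[t - 1] + kappa * (theta - v[t - 1]) * dt \
         + sigma * np.sqrt(dt) * rng.normal()

# One-step-ahead forecast implied by the discretized model, and its RMSE
forecast = v[:-1] + kappa * (theta - v[:-1]) * dt
rmse = np.sqrt(np.mean((v[1:] - forecast) ** 2))
print(rmse)  # close to sigma * sqrt(dt)
```

The eight models in the paper differ precisely in how the drift and diffusion terms above are specified, and the RMSE comparison ranks their one-step forecasts.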

Another big contribution of this study is that the authors derive a closed-form solution for a European call option under Model 7, whose option pricing performance outperforms the other candidates as well.

What's nice of the

Tags - volatility , stochastic , vix , option

Tags - trading , strategy , var , correlation , machine , cds , optimization

Quotation

Numerous issues have arisen over the past few decades relating to the implied volatility smile in the options market; however, the extant literature reveals that relatively little effort has thus far been placed into comparing the various implied volatility models, essentially as a result of the lack of any theoretical foundation on which to base such comparative analysis. In this study, we use a comprehensive options database and employ methods of combining the various hypothesis tests to compare the different implied volatility models. To the best of our knowledge, this is the first study of its kind to address this issue using combination tests. **Our empirical results reveal that the linear piecewise model is the most appropriate model for capturing the implied volatility smile**, with additional robustness checks confirming the validity of this finding.

Read the paper at http://onlinelibrary.wiley.com/doi/10.1002/fut.20549/abstract.

Tags - volatility , parametric

Quotation

We develop a new approach to approximating asset prices in the context of continuous-time models. For any pricing model that lacks a closed-form solution, we provide a solution, which relies on the approximation of the intractable model through a known, "auxiliary" one. We derive an expression for the difference between the true (but unknown) price and the auxiliary one, which we approximate in closed-form, and use to create increasingly improved refinements to the initial mispricing induced by the auxiliary model. The approach is intuitive, simple to implement and leads to fast and extremely accurate approximations. We illustrate this method in a variety of contexts, including option pricing with stochastic volatility, volatility contracts and the term-structure of interest rates.

A working paper is available at http://w4.stern.nyu.edu/volatility/docs/Kristensen.pdf

Tags - option , black scholes , no-arbitrage

In the paper

1. If stocks appear undervalued relative to corporate bonds, go long stocks.

2. If stocks appear overvalued relative to corporate bonds, exit stock positions and buy short-term Treasuries.

The back-test of the strategy captures 65% of upside equity moves on a monthly basis while taking only 21% of the downside.
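A capture-ratio computation of this sort can be sketched as follows. Note that the valuation signal here is a purely random placeholder; the paper derives it from the relative valuation of stocks versus corporate bonds, so these numbers only illustrate the mechanics, not the result.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 240  # months

# Hypothetical inputs: asset returns and a valuation signal
stock_ret = rng.normal(0.007, 0.045, n)
tbill_ret = np.full(n, 0.002)
signal = rng.normal(0, 1, n)  # placeholder; >0 means stocks look undervalued

# Rule: hold stocks when undervalued, otherwise short-term Treasuries
position = np.where(signal > 0, stock_ret, tbill_ret)

# Share of up-market and down-market moves captured by the strategy
upside = position[stock_ret > 0].sum() / stock_ret[stock_ret > 0].sum()
downside = position[stock_ret < 0].sum() / stock_ret[stock_ret < 0].sum()
print(upside, downside)
```

With an informative signal in place of random noise, the upside capture would exceed the downside capture, which is the 65%-versus-21% asymmetry the paper reports.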

A comparison of this strategy with buy-and-hold is summarized below.

For details, please refer to the paper.

Tags - allocation , strategy

Contents include:

Financial markets, prices and risk

Univariate volatility modeling

Multivariate volatility models

Risk measures

Implementing risk forecasts

Analytical value-at-risk for options and bonds

Simulation methods for VaR for options and bonds

Backtesting and stress testing

Extreme value theory

Endogenous risk

You can download the Matlab and R code at http://www.financialriskforecasting.com/book-code. I would recommend the book “

Tags - risk , forecast

Generally, range-based estimators assume that the price process follows a geometric Brownian motion. The authors start from two upward-biased volatility estimates under a zero-drift assumption (O, C, H, and L denote the logs of the opening, closing, highest, and lowest prices, respectively).

All three estimators above are calculated assuming that stock trading is continuous; in practice it is not, and discrete trading is therefore expected to cause a downward bias. To remove the bias, correction procedures have been developed.

So far the estimators mentioned above use daily data only. With the availability of intraday data, and hence more information captured, Martens and van Dijk (2007) and Christensen and Podolskij (2007) combine the concepts of range-based and realized volatility. Specifically, a typical realized range is defined as

Instead of the scaling factor 0.3607,

with lambda being the second moment of a standard Brownian motion over a unit interval, which can be simulated.

Finally, the authors compare all of these estimators using 25 German stocks, with the two-scales realized volatility of Zhang, Mykland, and Ait-Sahalia (2005) as a benchmark. They show that all estimators based on daily ranges are far superior to the classical estimator, that the realized range obtained from intraday ranges performs better in terms of both bias and efficiency, and that the bias-correction procedure developed by Christensen and Podolskij (2007) consistently outperforms all other alternatives.
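The best-known zero-drift range estimators in this literature are Parkinson (1980), which uses the 1/(4 ln 2) = 0.3607 scaling mentioned above, and Garman-Klass (1980). The sketch below checks both on simulated continuous-trading paths, the setting in which they are unbiased; the path parameters are illustrative assumptions of mine, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate driftless intraday log-price paths (the zero-drift GBM setting)
n_days, n_intra = 500, 390
true_sigma = 0.02  # daily volatility of the simulated paths
steps = rng.normal(0.0, true_sigma / np.sqrt(n_intra), (n_days, n_intra))
logp = np.concatenate([np.zeros((n_days, 1)), np.cumsum(steps, axis=1)], axis=1)

O, C = logp[:, 0], logp[:, -1]              # log open and close
H, L = logp.max(axis=1), logp.min(axis=1)   # log high and low

# Parkinson (1980) range estimator: scaling factor 1 / (4 ln 2) = 0.3607
park_var = np.mean((H - L) ** 2) / (4.0 * np.log(2.0))

# Garman-Klass (1980): adds open-to-close information for extra efficiency
gk_var = np.mean(0.5 * (H - L) ** 2 - (2.0 * np.log(2.0) - 1.0) * (C - O) ** 2)

print(np.sqrt(park_var), np.sqrt(gk_var))  # both near true_sigma = 0.02
```

Lowering `n_intra` makes the discrete-trading downward bias visible, which is exactly what the correction procedures discussed above are designed to remove.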

PS: all of the equations are from the paper

Tags - volatility

The book is a compilation of 25 essays written by very distinguished individuals who have had successful careers in the quantitative finance industry. They work in various areas, including market microstructure, derivatives pricing, risk management, and equity portfolio management.

For the most part, you will need more than a basic knowledge of finance to truly grasp the book. However, if you are a mathematician looking to make a career change, it could provide you with some motivation.

The essays let readers into the lives of the authors, who go into great detail explaining how they became involved in the quantitative finance industry. Most people will find it quite surprising that many of the contributors to this book previously worked in physics or math.

But, due to the end of the Cold War and a subsequent reduction in funding in those areas, they were forced to find employment elsewhere. This is encouraging for anybody who has ever lost a job and thinks their world is coming to an end. It just goes to show that losing a job might be the best thing that ever happens to you.

The writers lead you down many different and interesting paths while letting you know how they got their start in the industry. Believe it or not, most of the time it was because of luck, knowing somebody, or being in the right place at the right time. Each writer also discusses their individual area of expertise and their main achievements within it.

Many of the writers have PhDs, and write like they have PhDs. In other words, they try to impress readers by letting them know how smart they are, writing over their heads and constantly name-dropping.

Look, readers already know how smart you are, or you would not be in the book. If the writers were really as smart as they attempt to appear, they would have known to write in a style that most people could understand, instead of writing like intellectuals for other intellectuals.

The editors of the book, Richard R. Lindsey and Barry Schachter, could have, and should have, done a much better job reviewing and fixing the problems with the book. First, there are numerous typos, grammar errors, and misspellings in the book.

Second, it would not have been that hard to rewrite the original authors' material so that it would have had a much wider appeal. More than likely, the editors did not understand what most of the writers of the essays were talking about either, which made adjusting it almost impossible.

For the reasons mentioned above, we can only rate “How I Became a Quant: Insights from 25 of Wall Street's Elite” three stars out of five. You should consider acquiring the book only if you are presently a mathematician looking to make a career change, a university student studying in this area, or a person already employed within the quantitative finance industry looking for some inspiration or a means to advance within your profession.

Tags - quant , wall-street

So, you will soon be graduating and looking for your first position in quantitative finance. At this time, you are a little nervous, since you have never interviewed for such well-paying jobs before, and you are wondering if your interviewing skills are up to par.

First, all new graduates feel exactly the same way you do, regardless of the field in which they are seeking employment. Second, you should be more than just a little scared, because more than likely your interviewing skills are not just bad, they are terrible.

If for no other reason than the above two statements, you should strongly consider obtaining “

What we really like about this book, and what cannot be overstated, is that it contains over 200 real-world interview questions with ANSWERS that you can and will be asked in quantitative finance interviews.

The following is an example of a question that is not related to quantitative finance but is asked in most interviews for high-level positions. This question does not appear in the book, but it will show you the importance of being prepared, and how to turn a negative into a positive.

Interviewer: What do you consider is your WORST working quality?

Interviewee: I tend to be a perfectionist, and I want to do everything to the best of my abilities at all times. Because of this, many nights I will bring home extra work with me just so I can be certain I have not missed anything, which upsets my family, since I am not spending time with them.

What have you accomplished by being ready for this almost always asked question? Instead of saying something bad about yourself, which you never want to do, you have turned the tables on the interviewer and reinforced your commitment to the job and your strong, dedicated working habits.

When you get done reading the book, you should take time to study both the questions and the answers. Then practice the answers while having your friends ask you the interview questions, and then let them critique you.

If you do that, then when the big day finally arrives, your principal problem will NOT be answering the questions in an interview for a quantitative finance job; it will be NOT smiling as you repeat the same statements you have made time and again.

We rate “A Practical Guide To Quantitative Finance Interviews” five stars out of five. It is perfect for anybody who is just graduating, has not obtained the position they desire, or feels that their interviewing skills could use a little improvement.

Tags - interview , quant , job

Any portfolio return r can be decomposed into

define the downside and upside partial moments as follows

our objective is to optimize the Omega ratio below, a well-known performance measure, subject to additional constraints such as long-short weights.
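As a hedged illustration of the measure itself (the paper's exact optimization formulation and constraints are in the original), the Omega ratio at a threshold is simply the ratio of the first-order upside to downside partial moments of returns:

```python
import numpy as np

def omega_ratio(returns, threshold=0.0):
    """Omega = first-order upside partial moment / downside partial moment."""
    r = np.asarray(returns, dtype=float)
    upside = np.mean(np.maximum(r - threshold, 0.0))    # gains above threshold
    downside = np.mean(np.maximum(threshold - r, 0.0))  # losses below threshold
    return upside / downside

# toy check: a 10% gain against a 5% loss at a zero threshold
print(omega_ratio([0.10, -0.05]))  # -> 2.0
```

A ratio above 1 means expected gains over the threshold outweigh expected shortfalls below it.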

The authors apply this method to their data and conclude that the Omega function selected well-performing portfolios in terms of final wealth. These portfolios, however, exhibited higher volatility when compared with the naive mean-variance method. The Omega portfolios also exhibited a favorable asymmetry in returns and generally thinner tails than the mean-variance portfolios.

For details, please refer to the original paper, downloadable at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1464798.

Tags - portfolio , omega , optimization

This book was written specifically for the “disadvantaged candidate” who is seeking a job in quantitative finance. If you are planning on graduating from a top university such as Harvard, Princeton, or Stanford, you should not have too hard of a time finding an excellent position in this field; and the book will be of little benefit.

However, if you meet any or all of the criteria mentioned below, you should certainly acquire this book and put it to good use.

1)

2)

4) Very few if any recruiters visit your university looking to hire students in this industry.

5) You need a special visa or a work permit to be legally employed in the country in which you are seeking work.

6) You have less than two years of work experience in this industry.

As you can readily see from the above list, most people, other than the lucky few selected to attend a highly regarded university, can and do benefit from this book.

This book is kind of a diary of what the author, James Lin, had to go through to get a job in quantitative finance. If he could do it, then there is no reason you cannot successfully get the job you desire as well.

At the present time this book is available only through Kindle, or through a special application that you can download from Amazon that allows you to read it on a PC.

The book itself covers most basic job search techniques, which are available in a ton of other places. But what makes it so useful is that it teaches you to think outside the box and use other methods that more than likely you would never have considered yourself.

The book is very easy to read, understand, and most importantly of all, implement what is taught inside of it. There is not too much wasted space, filler, or knowledge that you will not find useful contained within its covers. The final chapter of the book teaches you how to get certified in C++ for very little money, which of course will later assist you in your job search.

Our rating for “DEMYSTIFYING THE JOB SEARCH PROCESS IN QUANTITATIVE FINANCE: a practical guide for entry-level quants” is five stars out of five stars. This is an extremely competitive industry, where just getting your foot in the door is often the difference between a lifetime of success or failure. If you meet any of the criteria mentioned above for the people that this publication would help, then it is a MUST have.

Tags - quant , job

This book is certainly not for the novice who is new to the quantitative finance arena. It is for the professional who possesses exceptional mathematical skills and really needs to understand everything there is to know about this industry at the highest possible level.

The people who will find it most useful are individuals whose work is concentrated on fixed income or derivatives. Others who certainly should read this book are those looking for their first job in this discipline, or professionals already in it who want to refresh and enhance their knowledge.

It is written in an unusual format: it first asks a question and then answers it, and this configuration is repeated throughout the book. The book provides both a long and a short answer to each question. Following each answer, the book also provides references for you to review further if you require more detailed information about that particular topic.

The following are a few of the mathematical areas discussed in the book: Ito's lemma, the Black-Scholes model, maximum likelihood estimation, and the Greeks.

If you are looking for information on prevalent probability distributions and how they are utilized in finance, you might find the following sections of the book appealing: common contracts, ten different ways to derive Black-Scholes, and brainteasers.

The book is centered on 60 FAQs, which are exceptionally well thought out, and provide a great deal of insight that most specialists in this matter will find useful. It is very practical and relevant for what is presently taking place in the derivatives industry.

For those of you that are first starting out in this industry, it should not be the first book you read. Instead, you might want to initially look into "Stochastic Calculus for Finance II: Continuous-Time Models" or "Options, Futures, and Other Derivatives and DerivaGem CD Package" and come back to this book after you have completed them.

The book “Frequently Asked Questions in Quantitative Finance” is very highly regarded by virtually everybody that has had an opportunity to read it. Our review also rates it five stars out of five stars. We consider it a must read, and keep on the shelf for all professionals in this industry that want to be able to perform their jobs at the highest levels.

Tags - quant

Quotation

The usage pattern is based on an offline phase to calibrate and generate model libraries. Valuation and simulation algorithms are planned offline with portfolio specific optimizations. The interactive user-driven phase includes a coherent global market simulation taking a few minutes and a real time data exploration phase with response time below 10 seconds.

Data exploration includes 3-dimensional risk visualization of portfolio loss distributions and sensitivities. It also includes risk resolution capability for outliers from the global portfolio level down to the single instrument level and hedge ratio optimization. The network bottleneck is bypassed by using heterogeneous boards with acceleration. The memory bottleneck is avoided at the algorithmic level by adapting the mathematical framework to revolve around a handful of compute-bound algorithms.

A working paper is available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1844711.

Tags - counterparty , risk

This book is more for the person who is just beginning their career in quantitative trading, as opposed to the old-time experienced professionals. It is not too technical, therefore most people should find it is quite easy to read and understand.

In it you will learn some of the following: MACD and price oscillators, channel breakouts, the dual moving average crossover, the relative strength index, stochastics, volatility breakouts, and momentum trading.

Many of these approaches were heavily used in the industry twenty years ago and may not be viable options today. That being said, understanding the fundamentals of any discipline is extremely important, and learning a little bit of history never hurt anybody.

If you are using a software package like TradeStation, the book will teach you methods you can utilize to develop your own computer code and trading system. However, the book itself does not supply any code. In the book you will learn how to start with a simple, straightforward concept, which you can later use to create a tool based on your own individual philosophies and personality.

There are few, if any, mathematical examples discussed in the book. Instead, the author attempts to supply you with techniques or theories that you can utilize to cultivate your own models. Probably the most important strategy you will learn in the book is “Money Management” skills, which are taught based on the writer's past experience in this industry.

The final review of “Quantitative Trading Strategies: Harnessing the Power of Quantitative Techniques to Create a Winning Trading Program” is neither positive nor negative, since the value you receive from it will largely depend on what stage of your career you are presently at. If you are a hedge fund manager who has been doing quantitative trading for many years now, you will probably not get too much from it. If you are just starting to take an interest in this subject, you should probably acquire the book, since it will teach you a great deal of the background information you will need to advance yourself in this industry.

Tags - trading , strategy

This is an exceptional book for newcomers to quantitative trading, because it is very easy to read and understand. It allows you to pick up the information required to start trading and, much more importantly, to make money doing it using the methods taught in this book.

There is one word of warning, though: you need to think like a mathematician to utilize its techniques to the fullest. While the math skills needed to implement what is taught in the book are not too advanced, evaluating the data requires a more systematic approach than most beginners possess when first starting out in this field.

It is therefore recommended that, if you do decide to use the systems you learn in this book, you take your time and do not invest actual funds until you have practiced your back-testing and assessment abilities extensively.

A few of the concepts you will learn in the book are alpha (an active trading strategy) and beta (a buy-and-hold approach). You will also be taught high-level risk management skills that you can use with either style of investing mentioned above.

You will learn how to calculate

What this book is not going to provide you is the meat and potatoes of quantitative trading that so many are looking for. Instead, you will get an overview of the entire field with a great deal of discussion on managing your portfolio.

If you are a long-time serious quantitative trader, you might want to pass on this book. However, as with almost all books, there will be tidbits of information you do not presently know that you could learn from it.

There is another person whose skill set this book matches perfectly: the mathematician who has never invested before. Since the book was written by a mathematician, it takes the skills he already possessed and teaches people with the same skills how to make money investing with them.

However, that being said, this is not the only book that you should read and study, if you are truly interested in mastering the art of quantitative trading. The book itself takes many of the most recognizable trading and investing concepts in this industry, and breaks them down into their most basic components.

In conclusion, “Inside the Black Box: The Simple Truth About Quantitative Trading” does have its detractors, and the book is controversial. Those who should definitely obtain a copy are mathematicians who have never invested previously, novices to quantitative trading, and experts in the field who would find the book useful even if they were only able to learn one helpful piece of information. Those who should not buy the book are specialists in this area who think they know more about the subject than anybody else, including the author.

Tags - trading

This book is for the highly sophisticated and enduring investor. It is extremely detailed, as well as complicated, making it difficult reading for anybody that is not truly interested in learning more about intermarket analysis.

In it you will be taught why understanding the relationships between various countries' economies is so important, and the roles these associations play in the financial markets. You will come to appreciate why these connections are the key to decoding both the intermediate and long-term trends that play out over time.

You will learn about the four major market subdivisions based on the theory of the business cycle, and how the economy has gone through boom and bust periods over the past centuries. The author will then teach you how to incorporate that knowledge with other economic factors to determine what stage the business cycle is presently in, and how this affects the overall economy.

Once you understand that, you will have a much better idea where to place both your cash and equity investments for the short term as well as the long term. You will come to appreciate that although the stock market is always evolving and changing, there are nonetheless trends that have occurred time and again throughout its history that are not only reliable but also trustworthy.

Even though this book was written in 1991, the information revealed in it has become even more useful due to the ever-increasing interdependence of the markets since it was published. In fact, many of the theories and concepts first introduced when the book was initially released have proven extremely useful for any earnest stock market investor.

If you are interested in being able to determine which way the market will be heading in the future, this book will certainly point you along the way. It will help equity investors, commodities traders, and Forex traders, as well as those who participate in the futures markets. You will learn how to identify signs that signal turning points in the markets, which can and will help you recognize buying opportunities.

Intermarket Technical Analysis: Trading Strategies for the Global Stock, Bond, Commodity, and Currency Markets is considered one of, if not the, best books ever written on this subject, and it is a must read for all individuals who sincerely want to learn more about it. In addition, you might want to consider also obtaining one of John Murphy's other books, titled Technical Analysis of the Financial Markets: A Comprehensive Guide to Trading Methods and Applications, to complement what you will learn in this book.

Tags - strategy , trading

The authors first review a few popular option pricing models, such as the Black-Scholes model, GARCH option pricing, and stochastic volatility models. They then argue there are two possible sources of model misspecification. One is omitted state variables, or factors: for instance, should we consider the volatility smile? Should we include jumps in our pricing equation? The other source is the functional form of the process for the state variables, including the specification of the risk premiums associated with them; this misspecification may be especially prone to error, or in other words, easily leads to model risk. A square-root process or simple mean reversion? Or a combination of the two, as some of the literature suggests?

In order to identify the necessary number of factors, the authors then use a nonparametric approach with state variables approximated by

By applying this methodology to S&P 500 index options, the authors find that two factors are largely sufficient. Their results suggest that for S&P 500 options, adding jumps to the one-factor model with jump intensity and jump sizes is not enough, and extending the volatility process to more than two dimensions is of little use either. Therefore, a promising direction for modeling options is to improve the specification of the two-factor model.

Below is a residual analysis graph captured from the paper,

where M0 denotes no additional state variable, and M1, M2, and M3 denote one, two, and three state variables, respectively. For this example, two state variables clearly perform very well, better than one, and the extra gain from three variables is very small.
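To make the residual comparison concrete, here is a hedged numpy sketch (an ordinary linear-PCA reconstruction, not the paper's nonparametric estimator; `pca_residual_norm` and the simulated data are illustrative): on data generated by two latent factors, the residual drops sharply at two components and barely improves at three.

```python
import numpy as np

def pca_residual_norm(X, k):
    """Relative reconstruction residual when keeping k principal components."""
    Xc = X - X.mean(axis=0)                       # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    Xk = (U[:, :k] * s[:k]) @ Vt[:k]              # rank-k reconstruction
    return np.linalg.norm(Xc - Xk) / np.linalg.norm(Xc)

rng = np.random.default_rng(0)
F = rng.standard_normal((500, 2))                           # two latent factors
X = F @ rng.standard_normal((2, 10)) + 0.05 * rng.standard_normal((500, 10))
for k in (1, 2, 3):
    print(k, round(pca_residual_norm(X, k), 3))
```

The residual at k = 2 is driven almost entirely by the 0.05 noise level, mirroring how the paper's M2 residuals are already close to the model's floor.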

Should you be interested in nonlinear principal component analysis, I shared a Matlab toolbox in the post Nonlinear PCA toolbox; enjoy.

Tags - pca , option

This book covers subjects that would be considered basic to intermediate by most people in this industry. The book itself is more of an instruction manual of how to get started and become profitable in this field. While it does discuss some mathematical formulas, they are not too difficult to follow or implement.

The book goes over many of Dr. Chan's past experiences as a trader and lets you know what he has learned from them. He tells you what to do, as well as what not to do, based on his successes and failures.

The book itself does not provide a strict guideline for you to follow with your investments. So if you are looking for an approach that says do A, B, C, and D and you will start making money, this is not the book for you. It provides more of a philosophical approach to investing that you must think about deeply to fully appreciate.

It does however discuss different investing strategies that you can investigate further on your own that are centered on Dr. Chan’s expertise, which is, long and short equity strategies. You will learn how to research and accumulate the proper data, how to select the appropriate approach to investing that matches your personality and goals, as well as back-testing, and choosing a good trading platform.

It does not go into as much detail as many of the more experienced investors would like, but it does supply you with some excellent resources that you can investigate on your own if you are seeking this kind of highly advanced knowledge.

If you are interested in building, or improving your home automated trading system, Dr. Chan will point you in the right direction without creating too many unnecessary distractions for you. Maybe the best part of the entire book is his letting you know some of the major mistakes that he made in his investing career, and how you can avoid these costly blunders, without actually losing any of your own money if you follow his advice.

If you are an extremely high level profitable investor, you might not get too much out of the “Quantitative Trading: How to Build Your Own Algorithmic Trading Business”, but you should read it anyway for its intrinsic value. For everybody else that is interested in this industry, it is a must read.

Tags - trading , strategy

You can download the ISDA CDS Standard Model source code at ISDA. Should you like to dig further, here is a list of CDS papers I personally find useful for understanding the pricing methodology:

Longstaff, Mithal et al. (2005) assume the premium is paid continuously and set the values of the premium leg and protection leg equal to each other.
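Under the simplifying assumptions of a constant hazard rate and a flat risk-free rate (a hedged sketch of the premium-leg-equals-protection-leg idea, not the paper's full model), equating the two legs gives the familiar par spread s = (1 − R)·λ:

```python
import numpy as np

def cds_par_spread(hazard, recovery, r, maturity, n=100_000):
    """Par spread s solving: s * annuity = (1 - R) * hazard * annuity."""
    dt = maturity / n
    t = (np.arange(n) + 0.5) * dt             # midpoint time grid
    risky_df = np.exp(-(r + hazard) * t)      # discount factor * survival prob
    annuity = risky_df.sum() * dt             # PV of 1 unit of continuous premium
    protection = (1.0 - recovery) * hazard * annuity  # PV of the protection leg
    return protection / annuity               # reduces to (1 - R) * hazard

spread = cds_par_spread(hazard=0.02, recovery=0.4, r=0.03, maturity=5.0)
print(round(spread * 1e4, 1))  # par spread in basis points -> 120.0
```

With a 2% hazard rate and 40% recovery, the annuity cancels and the spread is 120 bps regardless of the discount rate, which is exactly why this continuous-premium setup is so tractable.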

Pan and Singleton (2008) apply a reduced-form model to Mexico, Turkey, and Korea sovereign CDS, and show that a single-factor model with the default spread following a lognormal process captures most of the variation in the term structures of spreads.

Nashikkar, Subrahmanyam et al. (2011) assume the default process is constant and calculate the CDS par yield in a reduced-form framework.

Ren-Raw Chen (2008) assumes risk-free rates and default rates are correlated and solves the CDS pricing model explicitly using a reduced-form approach.

Hai Lin (2011) values corporate bonds and CDS simultaneously using a reduced-form model; for the CDS part, the authors assume there are both default and non-default components and solve the model by assuming the two are independent.

Jankowitsch, Pullirsch et al. (2008) attribute the difference between corporate bond yields and CDS premiums to one covenant of CDS, the cheapest-to-deliver option, and model the covenant by relating it to the recovery rate. Their empirical analysis does not support a liquidity premium.

Carr and Wu (2010) propose a dynamically consistent framework that allows joint valuation and estimation of stock options and credit default swaps written on the same reference company. By assuming the stock price follows a jump-diffusion process with stochastic volatility, and that the instantaneous default rate and variance rate follow a bivariate continuous process, the authors solve the reduced-form model analytically.

Brigo and Alfonsi (2005) introduce a two-dimensional correlated square-root diffusion (SSRD) model for the interest-rate and default processes, then price CDS with Monte Carlo simulation.

Zhang (2008) uses a three-factor model comprising interest rates, a firm-specific distress variable, and the hazard rate. The author links the hazard rate to interest rates by assuming the former is a function of the latter, then solves the model analytically and applies it to Argentine sovereign CDS.

Merton (1974), Black and Cox (1976), RiskMetrics (2002)

Zhong, Cao et al. (2010) argue CDS are similar to out-of-the-money put options in that both offer low-cost and effective protection against downside risk. They then show that put option-implied volatility is an important determinant of CDS spreads.

Bedendo, Cathcart et al. (2009) use an extended version of RiskMetrics (2002) to find that the gap between the model CDS premium and the market premium is time varying and widens substantially in times of financial turbulence. The authors note that CDS liquidity shows a significant impact on the gap, and should therefore be included when pricing CDS contracts.

Bongaerts, de Jong et al. (2011) imply that the equilibrium expected returns on hedge assets can be decomposed into several components: priced exposure to non-hedge asset returns, hedging demand effects, an expected illiquidity component, a liquidity risk premium, and hedge transaction costs.

Bedendo, M., L. Cathcart, et al. (2009). "Market and Model Credit Default Swap Spreads: Mind the Gap!" European Financial Management.

Black, F. and J. C. Cox (1976). "Valuing Corporate Securities - Some Effects of Bond Indenture Provisions." Journal of Finance 31(2): 351-367.

Bongaerts, D., F. de Jong, et al. (2011). "Derivative Pricing with Liquidity Risk: Theory and Evidence from the Credit Default Swap Market." Journal of Finance 66(1): 203-240.

Brigo, D. and A. Alfonsi (2005). "Credit default swap calibration and derivatives pricing with the SSRD stochastic intensity model." Finance and Stochastics 9(1): 29-42.

Carr, P. and L. Wu (2010). "Stock Options and Credit Default Swaps: A Joint Framework for Valuation and Estimation." Journal of Financial Econometrics 8(4): 409-449.

Hai Lin, S. L., and Chunchi Wu (2011). "Dissecting Corporate Bond and CDS Spreads." The Journal of Fixed Income 20(3).

Jankowitsch, R., R. Pullirsch, et al. (2008). "The delivery option in credit default swaps." Journal of Banking & Finance 32(7): 1269-1285.

Longstaff, F. A., S. Mithal, et al. (2005). "Corporate yield spreads: Default risk or liquidity? New evidence from the credit default swap market." Journal of Finance 60(5): 2213-2253.

Merton, R. C. (1974). "Pricing of Corporate Debt - Risk Structure of Interest Rates." Journal of Finance 29(2): 449-470.

Nashikkar, A., M. G. Subrahmanyam, et al. (2011). "Liquidity and Arbitrage in the Market for Credit Risk." Journal of Financial and Quantitative Analysis FirstView: 1-58.

Pan, J. and K. J. Singleton (2008). "Default and Recovery Implicit in the Term Structure of Sovereign CDS Spreads." The Journal of Finance 63(5): 2345-2384.

Ren-Raw Chen, X. C., Frank J. Fabozzi and Bo Liu (2008). "An Explicit, Multi-Factor Credit Default Swap Pricing Model with Correlated Factors." Journal of Financial and Quantitative Analysis 43.

RiskMetrics (2002). "CreditGrades™ Technical Document."

Zhang, F. X. (2008). "Market Expectations and Default Risk Premium in Credit Default Swap Prices: A Study of Argentine Default." Journal of Fixed Income 18(1).

Zhong, Z. D., C. Cao, et al. (2010). "The information content of option-implied volatility for credit default swap valuation." Journal of Financial Markets 13(3): 321-343.

Tags - cds

Quotation

The main contribution of the paper is the back-testing and comparison of market-neutral PCA- and ETF- based strategies over the broad universe of U.S. equities. Back-testing shows that, after accounting for transaction costs, PCA-based strategies have an average annual Sharpe ratio of 1.44 over the period 1997 to 2007, with much stronger performance prior to 2003: during 2003-2007, the average Sharpe ratio of PCA-based strategies was only 0.9. On the other hand, strategies based on ETFs achieved a Sharpe ratio of 1.1 from 1997 to 2007, but experience a similar degradation of performance after 2002. We introduce a method to take into account daily trading volume information in the signals (using "trading time'' as opposed to calendar time), and observe significant improvements in performance in the case of ETF-based signals. ETF strategies which use volume information achieve a Sharpe ratio of 1.51 from 2003 to 2007.

Tags - arbitrage , strategy

The authors also find mid-quote and micro-price data are between 40 and 60 times less noisy than trade data (as measured by the microstructure noise variance), leading to an efficiency gain for realized variance estimation of around 50%. Between the mid-quote and micro-price, the former is weakly preferred.

The new micro-price sampling method used for realized variance estimation is straightforward: a linear function of bid and ask prices and volumes. It is definitely worth a trial given the large improvement.
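As a rough sketch of the idea (the paper's exact linear combination may differ; this is the common size-weighted mid-price form), such a micro-price can be computed as:

```python
def micro_price(bid, ask, bid_size, ask_size):
    """Size-weighted mid-price: a hedged sketch of a linear bid/ask/volume rule.
    Heavier depth on one side tilts the price toward the opposite quote."""
    return (bid * ask_size + ask * bid_size) / (bid_size + ask_size)

# heavy bid depth (300 vs 100) pushes the micro-price toward the ask
print(round(micro_price(99.98, 100.02, bid_size=300, ask_size=100), 2))  # -> 100.01
```

The intuition: abundant bid-side depth signals buying pressure, so the "fair" price sits above the plain mid-quote, which reduces bid-ask bounce noise in realized variance estimates.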

Tags - variance , volatility

where the combination parameter delta lies between 0 and 1. Our purpose is to calculate the parameter delta by minimizing the loss function given by

Luckily, a closed-form solution exists; please refer to the original paper for details.
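For concreteness, here is a hedged sketch of such a combination rule. The 1/N benchmark, the sample covariance, and the variance-based loss are illustrative assumptions (the paper's loss function and closed-form delta are in the original); a simple grid search stands in for the closed form:

```python
import numpy as np

def combine(w_a, w_b, delta):
    """Convex combination of two candidate portfolios, delta in [0, 1]."""
    return delta * np.asarray(w_a) + (1.0 - delta) * np.asarray(w_b)

def best_delta(w_a, w_b, Sigma, grid=101):
    """Grid-search stand-in for the closed-form delta: minimize portfolio
    variance w' Sigma w as a simple illustrative loss."""
    deltas = np.linspace(0.0, 1.0, grid)
    losses = [combine(w_a, w_b, d) @ Sigma @ combine(w_a, w_b, d) for d in deltas]
    return deltas[int(np.argmin(losses))]

Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])   # toy covariance matrix
w_naive = np.array([0.5, 0.5])                   # 1/N benchmark (assumption)
w_mv = np.array([0.8, 0.2])                      # some estimated MV weights
print(round(float(best_delta(w_mv, w_naive, Sigma)), 2))  # -> 0.76
```

In this toy case the loss is quadratic in delta, so the grid search lands next to the analytical minimizer, which is the same mechanism that makes the paper's closed-form delta possible.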

Tags - portfolio , strategy , markowitz

here ASD(s)_t is the standard deviation of returns from t+1 to t+s, and r is the return. Compared with a simple GARCH model, ARLS allows a different coefficient beta for each forecasting horizon s (keep in mind that in GARCH, beta is the same across horizons), and is thus more flexible and overcomes the shortcomings of the GARCH-type models mentioned above.
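A hedged sketch of the horizon-specific-beta idea (the paper's exact regressors differ; the lagged absolute return here is an illustrative stand-in): fit a separate OLS line per horizon s, so each horizon gets its own (alpha_s, beta_s) rather than one shared GARCH beta.

```python
import numpy as np

def arls_fit(returns, horizons=(5, 10, 21)):
    """Per-horizon OLS: ASD(s)_t ~ alpha_s + beta_s * |r_t|.
    |r_t| is an illustrative regressor, not the paper's exact choice."""
    r = np.asarray(returns, dtype=float)
    params = {}
    for s in horizons:
        # realized standard deviation of returns from t+1 to t+s
        y = np.array([r[t + 1:t + 1 + s].std() for t in range(len(r) - s - 1)])
        x = np.abs(r[:len(y)])                   # aligned lagged predictor
        X = np.column_stack([np.ones_like(x), x])
        alpha, beta = np.linalg.lstsq(X, y, rcond=None)[0]
        params[s] = (alpha, beta)                # beta free to differ per horizon
    return params

rng = np.random.default_rng(42)
fits = arls_fit(rng.standard_normal(600) * 0.01)
for s, (a, b) in sorted(fits.items()):
    print(s, round(a, 4), round(b, 4))
```

On real return data the fitted betas typically shrink as the horizon grows, which is exactly the horizon dependence a single-beta GARCH cannot express.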

Tags - volatility , forecast