Quantitative Finance Collector

Quantitative Finance Collector is a blog on quantitative finance analysis and financial engineering methods in mathematical finance, focusing on derivative pricing, quantitative trading and quantitative risk management. Random thoughts on financial markets and personal stuff are posted on the personal sub-blog.

Sep 17
Quantivity introduced an interesting article a few days ago, Combined Regression and Ranking, that may be of interest to some of you.

The basic idea of Combined Regression and Ranking is to optimize regression and ranking objectives simultaneously. In regression-based methods we generally try to keep the predicted values as close as possible to the true target values, for example by minimizing the mean squared error (MSE). However, besides accurately producing values, in some circumstances we are also interested in a model that predicts the ranking well. A good example from the paper: consider a dataset in which nearly all observations have target value y = 0 and only a small fraction have y = 1. A model predicting 0 for all cases does well by the regression criterion, yet it fails to return a useful and meaningful ranking.

The objective function for combined regression and ranking is given by

\min_{w} \; \alpha \, L_{\text{regression}}(w) + (1 - \alpha) \, L_{\text{rank}}(w)
where the first part is for regression loss, the second part is for pairwise ranking loss, and the parameter alpha intuitively trades off between optimizing regression loss and optimizing pairwise loss.
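
To make the idea concrete, below is a minimal Python sketch of such a combined objective; the squared-error regression loss, the logistic pairwise ranking loss and the plain gradient-descent loop are my own choices for illustration, not the exact formulation or solver used in the paper.

import numpy as np

def fit_crr(X, y, alpha=0.5, lr=0.01, n_iter=2000):
    """Gradient descent on: alpha * MSE + (1 - alpha) * logistic pairwise ranking loss."""
    n, d = X.shape
    # all index pairs (i, j) with y[i] > y[j]; the model should rank i above j
    pairs = np.array([(i, j) for i in range(n) for j in range(n) if y[i] > y[j]])
    i, j = pairs[:, 0], pairs[:, 1]
    w = np.zeros(d)
    for _ in range(n_iter):
        pred = X @ w
        grad_reg = 2.0 / n * X.T @ (pred - y)              # gradient of the MSE term
        margins = pred[i] - pred[j]
        s = 1.0 / (1.0 + np.exp(margins))                  # sigmoid(-margin)
        grad_rank = -(X[i] - X[j]).T @ s / len(pairs)      # gradient of the ranking term
        w -= lr * (alpha * grad_reg + (1 - alpha) * grad_rank)
    return w

# toy data: mostly y = 0 with a few y = 1, as in the paper's motivating example
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X[:, 0] > 1.5).astype(float)
w = fit_crr(X, y, alpha=0.3)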
Sep 22
I recently read a paper comparing the performance of different models for predicting stock returns. At the end, the authors rank the models by their out-of-sample symmetric mean absolute percentage error (SMAPE). Somewhat surprisingly to me, the winning four models turned out to be:
1: Gaussian process regression;
2: Neural network;
3: Multiple regression model;
4: A very simple model based on a simple moving average.

What interests me is that Gaussian process regression is ranked as the best model by the authors, as stated: "A variety of supervised learning techniques have been attempted to predict future stock returns, both for potential monetary gain and because it is an interesting research problem. We use regression to capture changes in stock price prediction as a function of a covariance (kernel or Gram) matrix. For our aims it is natural to think of the price of a stock as being some function over time. Generally, a Gaussian process can be considered to be defining a distribution over functions, with the inference step occurring directly in the space of functions. Thus, by using Gaussian process regression to extend a function beyond known price data, we can predict whether stocks will rise or fall the next day, and by how much."

Should you be interested, here is a book, Gaussian Processes for Machine Learning, that can be freely downloaded; the accompanying Matlab package is also available at the website.
http://www.gaussianprocess.org/gpml/
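
As a quick illustration of the idea (not the authors' actual setup), here is a minimal Gaussian process regression sketch in Python using scikit-learn rather than the GPML Matlab package; the RBF-plus-noise kernel and the synthetic random-walk price series are assumptions for demonstration only.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# toy example: treat price as a function of time and extrapolate one step ahead
t = np.arange(100).reshape(-1, 1).astype(float)                   # time index
price = np.cumsum(np.random.default_rng(1).normal(size=100)) + 100.0

kernel = RBF(length_scale=10.0) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(t, price)

# predictive mean and standard deviation for the next day
mean, std = gpr.predict(np.array([[100.0]]), return_std=True)
print(f"next-day forecast: {mean[0]:.2f} +/- {std[0]:.2f}")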
Feb 17
glmlab is a free MATLAB toolbox for analysing generalized linear models. It can fit all types of generalized linear models, including (among others):
multiple regression;
log-linear models;
logistic regression; and
weighted regression.

glmlab includes the following error distributions:
normal (Gaussian);
gamma;
inverse Gaussian;
Poisson; and
binomial.
You can also specify your own error distributions with just a little bit of MATLAB programming.
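
glmlab itself is a MATLAB toolbox; purely as an illustration of the same kind of fit, here is a short Python sketch with statsmodels estimating a Poisson (log-linear) model, one of the model types listed above. The made-up data and the model choice are my own assumptions.

import numpy as np
import statsmodels.api as sm

# toy Poisson regression with a log link
rng = np.random.default_rng(2)
x = rng.uniform(0, 2, size=200)
y = rng.poisson(np.exp(0.5 + 1.2 * x))                 # true coefficients: 0.5, 1.2

X = sm.add_constant(x)                                 # intercept + predictor
model = sm.GLM(y, X, family=sm.families.Poisson())     # Poisson family, default log link
result = model.fit()
print(result.summary())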
Oct 7
Excel provides a handy function called LINEST that allows the user to run OLS regressions in a very quick and simple fashion. Unfortunately, the function fails if some values are missing in the data.

Here is a small program that addresses this shortcoming. After installing this add-in, you can simply write LINESTNA(...) instead of LINEST(...) and the problem with missing values is gone.

The program first extracts the rows that do not contain any missing values, and then calls Excel's LINEST to perform the estimation with the cleaned data. The data have to be organized column-wise.

http://www.wwz.unibas.ch/ds/abt/wirtschaftstheorie/personen/yvan/software/#c6714
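
The add-in itself is written for Excel, but the underlying idea is easy to sketch elsewhere. Below is a small Python illustration of the same approach: drop the rows containing missing values, then run an ordinary OLS fit on the cleaned data. The function name linest_na and the toy data are hypothetical, for demonstration only.

import numpy as np

def linest_na(y, X):
    """OLS that ignores rows containing missing values, mimicking the add-in's idea."""
    y = np.asarray(y, dtype=float)
    X = np.asarray(X, dtype=float)
    keep = ~np.isnan(y) & ~np.isnan(X).any(axis=1)               # rows with no missing values
    X_clean = np.column_stack([np.ones(keep.sum()), X[keep]])    # add an intercept column
    coef, *_ = np.linalg.lstsq(X_clean, y[keep], rcond=None)
    return coef                                                  # [intercept, slope_1, ...]

# usage with one missing observation
y = [1.0, 2.1, np.nan, 3.9, 5.2]
X = [[1.0], [2.0], [3.0], [4.0], [5.0]]
print(linest_na(y, X))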
Jul 29
Quantile regression is a statistical technique intended to estimate, and conduct inference about, conditional quantile functions. Just as classical linear regression methods based on minimizing sums of squared residuals enable one to estimate models for conditional mean functions, quantile regression methods offer a mechanism for estimating models for the conditional median function, and the full range of other conditional quantile functions. By supplementing the estimation of conditional mean functions with techniques for estimating an entire family of conditional quantile functions, quantile regression is capable of providing a more complete statistical analysis of the stochastic relationships among random variables.
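
As a small illustration (my own example, not part of the description above), here is a Python sketch using statsmodels' quantreg to fit a few conditional quantile functions on synthetic heteroskedastic data; the chosen quantiles 0.1, 0.5 and 0.9 are arbitrary.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# synthetic data with heteroskedastic noise, so the conditional quantiles fan out
rng = np.random.default_rng(3)
x = rng.uniform(0, 10, size=500)
y = 1.0 + 0.5 * x + rng.normal(scale=0.2 + 0.3 * x)
df = pd.DataFrame({"x": x, "y": y})

model = smf.quantreg("y ~ x", df)
for q in (0.1, 0.5, 0.9):
    fit = model.fit(q=q)                  # minimizes the check (pinball) loss at quantile q
    print(q, fit.params["Intercept"], fit.params["x"])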
