Inside Volatility Filtering

Secrets of the Skew
 
 
Wiley (publisher) • 2nd edition • published July 27, 2015 • 320 pages
 
E-book | ePUB with Adobe DRM
978-1-118-94398-4 (ISBN)
 
A new, more accurate take on the classical approach to volatility evaluation
Inside Volatility Filtering presents a new approach to volatility estimation, using financial econometrics based on a more accurate estimation of the hidden state. Based on the idea of "filtering", this book lays out a two-step framework involving a Chapman-Kolmogorov prior distribution followed by a Bayesian posterior distribution to develop a robust estimation based on all available information. This new second edition includes guidance toward basing estimations on historic option prices instead of stocks, as well as Wiener Chaos Expansions and other spectral approaches. The author's statistical trading strategy has been expanded with more in-depth discussion, and the companion website offers new topical insight, additional models, and extra charts that delve into the profitability of applied model calibration. You'll find a more precise approach to classical time-series and financial-econometrics evaluation, with expert advice on turning data into profit.
Financial markets do not always behave according to a normal bell curve. Skewness creates uncertainty and surprises, and tarnishes trading performance, but it's not going away. This book shows traders how to work with skewness: how to predict it, estimate its impact, and determine whether the data is presenting a warning to stay away or an opportunity for profit.
* Base volatility estimations on more accurate data
* Integrate past observation with Bayesian probability
* Exploit posterior distribution of the hidden state for optimal estimation
* Boost trade profitability by utilizing "skewness" opportunities
Wall Street is constantly searching for volatility assessment methods that will make their models more accurate, but precise handling of skewness is the key to true accuracy. Inside Volatility Filtering shows you a better way to approach non-normal distributions for more accurate volatility estimation.
2nd edition
  • English
  • New York, USA
John Wiley & Sons
  • 22.60 MB
978-1-118-94398-4 (ISBN-13)
1118943988 (ISBN-10)
ALIREZA JAVAHERI is the head of Equities Quantitative Research Americas at JP Morgan and an adjunct professor of Mathematical Finance at the Courant Institute of New York University, as well as Baruch College. He has worked in the field of derivatives quantitative research since 1994 in a variety of investment banks, including Goldman Sachs and Citigroup.
Foreword ix
Acknowledgments (Second Edition) xi
Acknowledgments (First Edition) xiii
Introduction (Second Edition) xv
Introduction (First Edition) xvii
Summary xvii
Contributions and Further Research xxiii
Data and Programs xxiv
CHAPTER 1 The Volatility Problem 1
Introduction 1
The Stock Market 2
The Stock Price Process 2
Historic Volatility 3
The Derivatives Market 5
The Black-Scholes Approach 5
The Cox Ross Rubinstein Approach 7
Jump Diffusion and Level-Dependent Volatility 8
Jump Diffusion 8
Level-Dependent Volatility 11
Local Volatility 14
The Dupire Approach 14
The Derman Kani Approach 17
Stability Issues 18
Calibration Frequency 19
Stochastic Volatility 21
Stochastic Volatility Processes 21
GARCH and Diffusion Limits 22
The Pricing PDE under Stochastic Volatility 26
The Market Price of Volatility Risk 26
The Two-Factor PDE 27
The Generalized Fourier Transform 28
The Transform Technique 28
Special Cases 30
The Mixing Solution 32
The Romano Touzi Approach 32
A One-Factor Monte-Carlo Technique 34
The Long-Term Asymptotic Case 35
The Deterministic Case 35
The Stochastic Case 37
A Series Expansion on Volatility-of-Volatility 39
Local Volatility Stochastic Volatility Models 42
Stochastic Implied Volatility 43
Joint SPX and VIX Dynamics 45
Pure-Jump Models 47
Variance Gamma 47
Variance Gamma with Stochastic Arrival 51
Variance Gamma with Gamma Arrival Rate 53
CHAPTER 2 The Inference Problem 55
Introduction 55
Using Option Prices 58
Conjugate Gradient (Fletcher-Reeves-Polak-Ribiere) Method 59
Levenberg-Marquardt (LM) Method 59
Direction Set (Powell) Method 61
Numeric Tests 62
The Distribution of the Errors 65
Using Stock Prices 65
The Likelihood Function 65
Filtering 69
The Simple and Extended Kalman Filters 72
The Unscented Kalman Filter 74
Kushner's Nonlinear Filter 77
Parameter Learning 80
Parameter Estimation via MLE 95
Diagnostics 108
Particle Filtering 111
Comparing Heston with Other Models 133
The Performance of the Inference Tools 141
The Bayesian Approach 158
Using the Characteristic Function 172
Introducing Jumps 174
Pure-Jump Models 184
Recapitulation 201
Model Identification 201
Convergence Issues and Solutions 202
CHAPTER 3 The Consistency Problem 203
Introduction 203
The Consistency Test 206
The Setting 206
The Cross-Sectional Results 206
Time-Series Results 209
Financial Interpretation 210
The "Peso" Theory 214
Background 214
Numeric Results 215
Trading Strategies 216
Skewness Trades 216
Kurtosis Trades 217
Directional Risks 217
An Exact Replication 219
The Mirror Trades 220
An Example of the Skewness Trade 220
Multiple Trades 225
High Volatility-of-Volatility and High Correlation 225
Non-Gaussian Case 230
VGSA 232
A Word of Caution 236
Foreign Exchange, Fixed Income, and Other Markets 237
Foreign Exchange 237
Fixed Income 238
CHAPTER 4 The Quality Problem 241
Introduction 241
An Exact Solution? 241
Nonlinear Filtering 242
Stochastic PDE 243
Wiener Chaos Expansion 244
First-Order WCE 247
Simulations 248
Second-Order WCE 251
Quality of Observations 251
Historic Spot Prices 252
Historic Option Prices 252
Conclusion 262
Bibliography 263
Index 279

Introduction (First Edition)


Summary


This book focuses on developing methodologies for estimating Stochastic Volatility (SV) parameters from the stock-price time series under a Classical framework. The text contains three chapters and is structured as follows:

In the first chapter, we shall introduce and discuss various parametric SV models. This chapter represents a brief survey of the existing literature on the subject of nondeterministic volatility.

We start with the concept of the log-normal distribution and historic volatility. We will then introduce the Black-Scholes [40] framework. We shall also mention alternative interpretations as suggested by Cox and Rubinstein [71]. We shall state how these models are unable to explain the negative skewness and the leptokurticity commonly observed in the stock markets. Also, the famous implied-volatility smile would not exist under these assumptions.
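As a concrete illustration of the historic-volatility estimate mentioned above, here is a minimal sketch, assuming daily closing prices and the usual 252-trading-day annualization convention:

```python
import numpy as np

def historic_volatility(prices, trading_days=252):
    """Annualized historic volatility: the sample standard deviation of
    daily log-returns, scaled by the square root of the number of
    trading days per year (252 is the usual convention)."""
    prices = np.asarray(prices, dtype=float)
    log_returns = np.diff(np.log(prices))
    return np.sqrt(trading_days) * log_returns.std(ddof=1)
```

Under the log-normal assumption this estimator recovers a constant diffusion coefficient; the smile and fat tails discussed next are precisely the empirical situations in which that constancy fails.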

At this point we consider the notion of level-dependent volatility as advanced by researchers such as Cox and Ross [69, 70] as well as Bensoussan, Crouhy, and Galai [34]. Either an artificial expression of the instantaneous variance will be used, as is the case for Constant Elasticity of Variance (CEV) models, or an implicit expression will be deduced from a Firm model similar to Merton's [199], for instance.

We will also bring up the subject of Poisson jumps [200], which provide the distributions with negative skewness and larger kurtosis. These jump-diffusion models offer a link between the volatility smile and credit phenomena.

We then discuss the idea of Local Volatility [38] and its link to the instantaneous unobservable volatility. Work by researchers such as Dupire [94] and Derman and Kani [79] will be cited. We shall also describe the limitations of this idea due to an ill-posed inversion phenomenon, as revealed by Avellaneda [17] and others.

Unlike Non-Parametric Local Volatility models, Parametric Stochastic Volatility (SV) models [147] define a specific stochastic differential equation for the unobservable instantaneous variance. We therefore will introduce the notion of two-factor Stochastic Volatility and its link to one-factor Generalized Auto-Regressive Conditionally Heteroskedastic (GARCH) processes [42]. The SV model class is the one we shall focus on. Studies by scholars such as Engle [99], Nelson [204], and Heston [141] will be discussed at this juncture. We will briefly mention related works on Stochastic Implied Volatility by Schonbucher [224], as well as Uncertain Volatility by Avellaneda [18].
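The defining feature of this model class, a specific stochastic differential equation for the unobservable instantaneous variance, can be made concrete with Heston's [141] square-root dynamics dv = κ(θ − v)dt + ξ√v dW. A minimal sketch using an Euler "full truncation" discretization (the discretization scheme and the parameter values in the usage below are illustrative assumptions, not the book's calibration):

```python
import numpy as np

def simulate_heston_variance(v0, kappa, theta, xi, dt, n_steps, rng):
    """Euler ('full truncation') simulation of the Heston variance process
    dv = kappa*(theta - v) dt + xi*sqrt(v) dW. Negative excursions are
    truncated to zero inside the drift and diffusion terms."""
    v = np.empty(n_steps + 1)
    v[0] = v0
    sqrt_dt = np.sqrt(dt)
    for t in range(n_steps):
        v_pos = max(v[t], 0.0)  # truncate so the square root stays real
        v[t + 1] = (v[t] + kappa * (theta - v_pos) * dt
                    + xi * np.sqrt(v_pos) * sqrt_dt * rng.standard_normal())
    return v
```

Over a long horizon the simulated path mean reverts to the long-run level θ, which is the defining behavior of the mean-reverting variance factor.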

Having introduced SV, we then discuss the two-factor Partial Differential Equations (PDE) and the incompleteness of the markets when only cash and the underlying asset are used for hedging.

We then will examine Option Pricing techniques such as Inversion of the Fourier transform, Mixing Monte-Carlo, as well as a few asymptotic pricing techniques, as explained for instance by Lewis [185].

At this point we shall tackle the subject of pure-jump models such as Madan's Variance Gamma [192] or its variants VG with Stochastic Arrivals (VGSA) [52]. The latter adds to the traditional VG a way to introduce the volatility clustering (persistence) phenomenon. We will mention the distribution of the stock market as well as various option pricing techniques under these models. The inversion of the characteristic function is clearly the method of choice for option pricing in this context.
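The characteristic function central to that method of choice has, for the VG process, the well-known closed form φ(u) = (1 − iuθν + σ²νu²/2)^(−t/ν) (Madan, Carr, and Chang). A sketch of its evaluation, which is the building block for transform-based pricing (the parameter values in the test are illustrative):

```python
import numpy as np

def vg_characteristic_function(u, t, sigma, nu, theta):
    """Characteristic function of the Variance Gamma process X_t:
    E[exp(i*u*X_t)] = (1 - i*u*theta*nu + 0.5*sigma**2*nu*u**2)**(-t/nu)."""
    u = np.asarray(u, dtype=complex)
    return (1.0 - 1j * u * theta * nu + 0.5 * sigma**2 * nu * u**2) ** (-t / nu)
```

Basic sanity checks follow from first principles: φ(0) = 1, |φ(u)| ≤ 1 for real u, and the derivative at zero recovers the mean E[X_t] = θt.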

In the second chapter we will tackle the notion of Inference (or Parameter-Estimation) for Parametric SV models. We shall first briefly analyze the Cross-Sectional Inference and will then focus on the Time-Series Inference.

We start with a concise description of cross-sectional estimation of SV parameters in a risk-neutral framework. A Least Squares Estimation (LSE) algorithm will be discussed. The Direction-Set optimization algorithm [214] will also be introduced at this point. The fact that this optimization algorithm does not use the gradient of the input-function is important, since we shall later deal with functions that contain jumps and are not necessarily differentiable everywhere.
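To illustrate the gradient-free idea, here is a stripped-down direction-set sketch: it cycles through the coordinate directions with a golden-section line search. Powell's full method additionally replaces directions with the net displacement of each cycle, which is omitted here; the fixed search bracket is an arbitrary assumption. Note the objective in the usage example contains an absolute-value kink, which a gradient method would struggle with.

```python
import numpy as np

def golden_section(g, lo, hi, tol=1e-8):
    """Golden-section search for the minimizer of a unimodal g on [lo, hi].
    No derivatives are used, so g may contain kinks."""
    ratio = (np.sqrt(5.0) - 1.0) / 2.0
    while hi - lo > tol:
        c = hi - ratio * (hi - lo)
        d = lo + ratio * (hi - lo)
        if g(c) < g(d):
            hi = d
        else:
            lo = c
    return 0.5 * (lo + hi)

def direction_set_minimize(f, x0, n_cycles=5, bracket=5.0):
    """Minimize f by repeated line searches along the coordinate directions:
    the skeleton of a direction-set method. The search interval
    [-bracket, bracket] is an arbitrary, problem-dependent assumption."""
    x = np.asarray(x0, dtype=float)
    directions = np.eye(x.size)
    for _ in range(n_cycles):
        for d in directions:
            step = golden_section(lambda s: f(x + s * d), -bracket, bracket)
            x = x + step * d
    return x
```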

We then discuss the parameter inference from a Time-Series of the underlying asset in the real world. We shall do this in a Classical (Non-Bayesian) [252] framework and in particular we will estimate the parameters via a Maximum Likelihood Estimation (MLE) [134] methodology. We shall explain the idea of MLE, its link to the Kullback-Leibler [105] distance, as well as the calculation of the Likelihood function for a two-factor SV model.

We will see that unlike GARCH models, SV models do not admit an analytic (integrated) likelihood function. This is why we will need to introduce the concept of Filtering [136].

The idea behind Filtering is to obtain the best possible estimation of a hidden state given all the available information up to that point. This estimation is done in an iterative manner in two stages: The first step is a Time Update where the prior distribution of the hidden state, at a given point in time, is determined from all the past information via a Chapman-Kolmogorov equation. The second step would then involve a Measurement Update where this prior distribution is used together with the conditional likelihood of the newest observation in order to compute the posterior distribution of the hidden state. The Bayes rule is used for this purpose. Once the posterior distribution is determined, it could be exploited for the optimal estimation of the hidden state.
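The two-stage recursion can be made concrete on a discretized (grid) hidden state, where the Chapman-Kolmogorov equation becomes a matrix product and the Bayes rule a pointwise reweighting. The model-specific inputs `trans` and `obs_lik` below are placeholders the caller supplies; the AR(1)-style model in the test is purely illustrative:

```python
import numpy as np

def grid_filter(y, grid, trans, obs_lik, prior):
    """Generic two-stage Bayesian filter on a discretized hidden state.
    trans[i, j] = p(x_t = grid[j] | x_{t-1} = grid[i]) and
    obs_lik(y, grid) returns p(y | x) evaluated on the grid; both are
    supplied by the caller. Returns the posterior mean at each step."""
    post = prior / prior.sum()
    means = []
    for obs in y:
        # time update: Chapman-Kolmogorov equation as a matrix product
        pri = post @ trans
        # measurement update: Bayes rule with the newest observation
        post = pri * obs_lik(obs, grid)
        post = post / post.sum()
        means.append(post @ grid)  # optimal (posterior-mean) estimate
    return np.array(means)
```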

We shall start with the Gaussian case where the first two moments characterize the entire distribution. For the Gaussian-Linear case, the optimal Kalman Filter (KF) [136] is introduced. Its nonlinear extension, the Extended KF (EKF), is described next. A more suitable version of KF for strongly nonlinear cases, the Unscented KF (UKF) [174], is also analyzed. In particular we will see how this filter is related to Kushner's Nonlinear Filter (NLF) [181, 182].
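A scalar sketch of the Gaussian-Linear case, which also accumulates the log-likelihood maximized in the MLE methodology discussed earlier (the AR(1) state-space form is illustrative, not the book's SV model):

```python
import numpy as np

def kalman_filter(y, a, q, r, m0=0.0, p0=1.0):
    """Scalar Kalman filter for the linear-Gaussian model
        x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)
        y_t = x_t + v_t,          v_t ~ N(0, r).
    Returns the filtered means and the accumulated log-likelihood."""
    m, p, loglik, means = m0, p0, 0.0, []
    for obs in y:
        # time update: prior mean and variance given past observations
        m_prior, p_prior = a * m, a * a * p + q
        # measurement update: combine prior with the newest observation
        s = p_prior + r                  # innovation variance
        k = p_prior / s                  # Kalman gain
        innov = obs - m_prior
        m = m_prior + k * innov
        p = (1.0 - k) * p_prior
        loglik += -0.5 * (np.log(2.0 * np.pi * s) + innov ** 2 / s)
        means.append(m)
    return np.array(means), loglik
```

Because the first two moments characterize the whole Gaussian distribution, this recursion carries only a mean and a variance; the filtered estimates are strictly better (in mean-square) than the raw observations.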

EKF uses a first-order Taylor approximation of the nonlinear transition and observation functions in order to bring us back into a simple KF framework. On the other hand, UKF uses the true nonlinear functions without any approximation. It supposes, however, that the Gaussianity of the distribution is preserved through these functions. UKF determines the first two moments via integrals that are computed on a few appropriately chosen "sigma points." NLF does the same via a Gauss-Hermite quadrature. However, NLF often introduces an extra centering step, which will avoid poor performance due to an insufficient intersection between the prior distribution and the conditional likelihood.
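The sigma-point construction behind the UKF can be sketched with the scaled unscented transform (standard Julier-Uhlmann weights; a scalar-valued f and the default scaling parameters are simplifying assumptions for illustration):

```python
import numpy as np

def unscented_transform(f, m, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Scaled unscented transform: propagate N(m, P) through a nonlinear,
    scalar-valued f using 2n+1 sigma points instead of linearization.
    Returns the implied mean and variance of f(x)."""
    m = np.atleast_1d(np.asarray(m, dtype=float))
    P = np.atleast_2d(np.asarray(P, dtype=float))
    n = m.size
    lam = alpha ** 2 * (n + kappa) - n
    root = np.linalg.cholesky((n + lam) * P)          # matrix square root
    sigma = np.vstack([m, m + root.T, m - root.T])    # (2n+1, n) sigma points
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))  # mean weights
    wc = wm.copy()                                    # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
    fx = np.array([f(s) for s in sigma])
    mean = wm @ fx
    var = wc @ (fx - mean) ** 2
    return mean, var
```

For a quadratic f and a Gaussian input, the transform reproduces the true mean and variance exactly, which is precisely the sense in which it improves on a first-order Taylor linearization.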

As we shall observe, in addition to their use in the MLE approach, the Filters above could be applied to a direct estimation of the parameters via a Joint Filter (JF) [140]. The JF would simply involve the estimation of the parameters together with the hidden state via a dimension augmentation. In other words, one would treat the parameters as hidden states. After choosing initial conditions and applying the filter to an observation data set, one would then disregard a number of initial points and take the average over the remaining estimations. This initial rejected period is known as the "burn-in" period.
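A sketch of the dimension-augmentation idea: an Extended Kalman Filter that appends an AR(1) coefficient to the hidden state, filters jointly, and averages the parameter path after discarding a burn-in period. The model and all numerical values are illustrative assumptions, not the book's SV setting:

```python
import numpy as np

def joint_ekf_ar1(y, q=0.05, r=0.05, q_a=1e-6, burn_in=500):
    """Joint (dimension-augmented) EKF for the illustrative model
        x_t = a * x_{t-1} + w_t,   y_t = x_t + v_t,
    where the parameter a is carried as a second, nearly constant
    hidden state. Returns the path of a-estimates and their average
    over the post-burn-in period."""
    z = np.array([0.0, 0.5])   # augmented state (x, a); a guessed at 0.5
    P = np.diag([1.0, 0.5])
    Q = np.diag([q, q_a])      # tiny q_a lets the parameter drift slowly
    a_path = []
    for obs in y:
        x, a = z
        # time update with the Jacobian of f(x, a) = (a*x, a)
        z = np.array([a * x, a])
        F = np.array([[a, x], [0.0, 1.0]])
        P = F @ P @ F.T + Q
        # measurement update: the observation reads off the x component
        s = P[0, 0] + r
        K = P[:, 0] / s
        z = z + K * (obs - z[0])
        P = P - np.outer(K, P[0, :])
        a_path.append(z[1])
    a_path = np.array(a_path)
    return a_path, a_path[burn_in:].mean()
```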

We will test various representations or State Space Models of the Stochastic Volatility models such as Heston's [141]. The concept of Observability [215] will be introduced in this context. We will see that the parameter estimation is not always accurate given a limited amount of daily data.

Before a closer analysis of the performance of these estimation methods, we shall introduce simulation-based Particle Filters (PF) [84, 128], which can be applied to non-Gaussian distributions. In a PF algorithm, the Importance Sampling technique is applied to the distribution. Points are simulated via a chosen proposal distribution and the resulting weights, proportional to the conditional likelihood, are computed. Since the variance of these weights tends to increase over time and cause the algorithm to diverge, the simulated points go through a variance reduction technique commonly referred to as Resampling [15]. During this stage, points with too small a weight are disregarded, and points with large weights are replicated. This technique could cause a Sample Impoverishment, which can be corrected via a Metropolis-Hastings Accept/Reject test. Work by researchers such as Doucet [84], Smith, and Gordon [128] is cited and used in this context.
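A sketch of a bootstrap PF for an illustrative linear-Gaussian model (chosen so the result can be sanity-checked): the prior transition serves as the proposal, the weights are conditional likelihoods, and systematic resampling fires when the effective sample size degenerates:

```python
import numpy as np

def bootstrap_particle_filter(y, a, q, r, n_particles=1000, seed=0):
    """Bootstrap particle filter for the illustrative model
        x_t = a * x_{t-1} + w_t,   y_t = x_t + v_t  (Gaussian noises).
    Proposal = prior (transition) density; weights = conditional
    likelihoods; systematic resampling when the effective sample size
    falls below half the particle count."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)
    w = np.full(n_particles, 1.0 / n_particles)
    means = []
    for obs in y:
        # simulate from the proposal: here, the prior transition density
        x = a * x + np.sqrt(q) * rng.standard_normal(n_particles)
        # weight by the conditional likelihood of the newest observation
        w = w * np.exp(-0.5 * (obs - x) ** 2 / r)
        w = w / w.sum()
        means.append(w @ x)
        # resample (variance reduction) if the weights have degenerated
        if 1.0 / np.sum(w ** 2) < n_particles / 2:
            cum = np.cumsum(w)
            cum[-1] = 1.0                 # guard against float round-off
            u = (rng.random() + np.arange(n_particles)) / n_particles
            x = x[np.searchsorted(cum, u)]
            w = np.full(n_particles, 1.0 / n_particles)
    return np.array(means)
```

Systematic resampling disregards low-weight points and replicates high-weight ones while keeping the stratified spacing of the uniforms, which lowers resampling noise compared with multinomial draws.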

Needless to say, the choice of the proposal distribution can be fundamental to the success of the PF algorithm. The most natural choice would be to take a proposal distribution equal to the prior distribution of the hidden state. Even if this makes the computations simpler, the danger would be a non-alignment between the prior and the conditional likelihood, as we previously mentioned. To avoid this, other proposal distributions that take the observation into account should be considered. The Extended PF (EPF) and the Unscented PF (UPF) [240] do precisely this by adding an extra Gaussian Filtering step to the process. Other techniques such as the Auxiliary PF (APF) have been developed by Pitt and Shephard [213].

Interestingly, we will see that PF brings only marginal improvement over the traditional KFs when applied to daily data. However, for a larger time-step, where the nonlinearity is stronger, the PF does help more.

At this point we also compare the Heston model to other SV models such as the "3/2" model [185] using real market data, and we will see that the...



Download (available immediately)
€71.99 incl. 19% VAT
Download / single-user license
ePUB with Adobe DRM
