Statistical Signal Processing in Engineering

 
 
Standards Information Network (publisher)
  • Published 15 November 2017
  • 608 pages
 
E-book | PDF with Adobe DRM | System requirements
978-1-119-29395-8 (ISBN)
 
A problem-solving approach to statistical signal processing for practicing engineers, technicians, and graduate students
This book takes a pragmatic approach to solving the common problems engineers and technicians encounter when processing signals. Drawing on the author's extensive theoretical and practical experience in the field, it serves as a quick-reference manual of field-tested solutions. At the same time, it delineates the basic concepts and applied mathematics underlying each solution, so that readers can go deeper into the theory, understand each solution's limitations and potential pitfalls, and tailor it to their specific engineering application.
Uniquely, Statistical Signal Processing in Engineering can also function as a textbook for engineering graduates and post-graduates. Dr. Spagnolini, who has a quarter-century of experience teaching graduate-level courses in digital and statistical signal processing methods, provides a detailed axiomatic presentation of the conceptual and mathematical foundations of statistical signal processing that will challenge students' analytical skills and motivate them to develop new applications on their own, or to better understand the motivation underlying existing solutions.
Throughout the book, real-world examples demonstrate how powerful a tool statistical signal processing is across a wide range of practical applications.
* Takes an interdisciplinary approach, integrating basic concepts and tools for statistical signal processing
* Informed by its author's vast experience as both a practitioner and teacher
* Offers a hands-on approach to solving problems in statistical signal processing
* Covers a broad range of applications, including communication systems, machine learning, wavefield and array processing, remote sensing, image filtering and distributed computations
* Features numerous real-world examples from a wide range of applications showing the mathematical concepts involved in practice
* Includes MATLAB code for many of the experiments in the book
Statistical Signal Processing in Engineering is an indispensable working resource for electrical engineers, especially those working in the information and communication technology (ICT) industry. It is also an ideal text for engineering students in general, as well as for applied mathematics post-graduates and advanced undergraduates in electrical engineering, applied statistics, and pure mathematics who are studying statistical signal processing.
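To give a concrete flavor of the kind of problem the book treats (and that its MATLAB experiments explore numerically), here is a minimal illustrative sketch, not taken from the book, of least-squares estimation in a linear model x = H·θ + w with additive Gaussian noise, one of the recurring models in the estimation chapters. The sketch is written in Python for self-containedness; all names and values in it are hypothetical.

```python
import numpy as np

# Illustrative sketch (not from the book): least-squares estimation of theta
# in the linear model x = H @ theta + w with additive white Gaussian noise.
rng = np.random.default_rng(0)

N, p = 100, 2                        # number of observations and of parameters
theta_true = np.array([1.5, -0.7])   # unknown parameters (ground truth, hypothetical)
H = rng.standard_normal((N, p))      # known observation (regressor) matrix
w = 0.1 * rng.standard_normal(N)     # additive white Gaussian noise
x = H @ theta_true + w               # observed data

# LS estimate theta_hat = (H^T H)^{-1} H^T x, computed via a stable solver;
# for white Gaussian noise this coincides with the ML and BLUE estimates.
theta_hat, *_ = np.linalg.lstsq(H, x, rcond=None)
print("estimate:", theta_hat, "true:", theta_true)
```

For white Gaussian noise, this least-squares estimate coincides with the maximum likelihood and BLUE estimators discussed in the estimation theory chapters.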
1st edition
  • English
  • Newark
  • United Kingdom
John Wiley & Sons Inc
  • For professional and research use
  • 17.66 MB
978-1-119-29395-8 (9781119293958)
1119293952
UMBERTO SPAGNOLINI is Professor of Signal Processing and Telecommunications at Politecnico di Milano, Italy. His research focuses on statistical signal processing, communication systems, and advanced signal processing topics for remote sensing and wireless communication systems. He is a Senior Member of the IEEE, is active in editorial roles for IEEE journals and conferences, and has authored 300 patents and papers in peer-reviewed journals and conferences.
  • Intro
  • Title Page
  • Copyright Page
  • Contents
  • List of Figures
  • List of Tables
  • Preface
  • List of Abbreviations
  • How to Use the Book
  • About the Companion Website
  • Prerequisites
  • Why are there so many matrixes in this book?
  • Chapter 1 Manipulations on Matrixes
  • 1.1 Matrix Properties
  • 1.1.1 Elementary Operations
  • 1.2 Eigen-Decompositions
  • 1.3 Eigenvectors in Everyday Life
  • 1.3.1 Conversations in a Noisy Restaurant
  • 1.3.2 Power Control in a Cellular Communication System
  • 1.3.3 Price Equilibrium in the Economy
  • 1.4 Derivative Rules
  • 1.4.1 Derivative with respect to x ∈ R^n
  • 1.4.2 Derivative with respect to x ∈ C^n
  • 1.4.3 Derivative with respect to the Matrix X ∈ R^(m×n)
  • 1.5 Quadratic Forms
  • 1.6 Diagonalization of a Quadratic Form
  • 1.7 Rayleigh Quotient
  • 1.8 Basics of Optimization
  • 1.8.1 Quadratic Function with Simple Linear Constraint (M=1)
  • 1.8.2 Quadratic Function with Multiple Linear Constraints
  • Appendix A: Arithmetic vs. Geometric Mean
  • Chapter 2 Linear Algebraic Systems
  • 2.1 Problem Definition and Vector Spaces
  • 2.1.1 Vector Spaces in Tomographic Radiometric Inversion
  • 2.2 Rotations
  • 2.3 Projection Matrixes and Data-Filtering
  • 2.3.1 Projections and Commercial FM Radio
  • 2.4 Singular Value Decomposition (SVD) and Subspaces
  • 2.4.1 How to Choose the Rank of A?
  • 2.5 QR and Cholesky Factorization
  • 2.6 Power Method for Leading Eigenvectors
  • 2.7 Least Squares Solution of Overdetermined Linear Equations
  • 2.8 Efficient Implementation of the LS Solution
  • 2.9 Iterative Methods
  • Chapter 3 Random Variables in Brief
  • 3.1 Probability Density Function (pdf), Moments, and Other Useful Properties
  • 3.2 Convexity and Jensen Inequality
  • 3.3 Uncorrelatedness and Statistical Independence
  • 3.4 Real-Valued Gaussian Random Variables
  • 3.5 Conditional pdf for Real-Valued Gaussian Random Variables
  • 3.6 Conditional pdf in Additive Noise Model
  • 3.7 Complex Gaussian Random Variables
  • 3.7.1 Single Complex Gaussian Random Variable
  • 3.7.2 Circular Complex Gaussian Random Variable
  • 3.7.3 Multivariate Complex Gaussian Random Variables
  • 3.8 Sum of Square of Gaussians: Chi-Square
  • 3.9 Order Statistics for N rvs
  • Chapter 4 Random Processes and Linear Systems
  • 4.1 Moment Characterizations and Stationarity
  • 4.2 Random Processes and Linear Systems
  • 4.3 Complex-Valued Random Processes
  • 4.4 Pole-Zero and Rational Spectra (Discrete-Time)
  • 4.4.1 Stability of LTI Systems
  • 4.4.2 Rational PSD
  • 4.4.3 Paley-Wiener Theorem
  • 4.5 Gaussian Random Process (Discrete-Time)
  • 4.6 Measuring Moments in Stochastic Processes
  • Appendix A: Transforms for Continuous-Time Signals
  • Appendix B: Transforms for Discrete-Time Signals
  • Chapter 5 Models and Applications
  • 5.1 Linear Regression Model
  • 5.2 Linear Filtering Model
  • 5.2.1 Block-Wise Circular Convolution
  • 5.2.2 Discrete Fourier Transform and Circular Convolution Matrixes
  • 5.2.3 Identification and Deconvolution
  • 5.3 MIMO systems and Interference Models
  • 5.3.1 DSL System
  • 5.3.2 MIMO in Wireless Communication
  • 5.4 Sinusoidal Signal
  • 5.5 Irregular Sampling and Interpolation
  • 5.5.1 Sampling With Jitter
  • 5.6 Wavefield Sensing System
  • Chapter 6 Estimation Theory
  • 6.1 Historical Notes
  • 6.2 Non-Bayesian vs. Bayesian
  • 6.3 Performance Metrics and Bounds
  • 6.3.1 Bias
  • 6.3.2 Mean Square Error (MSE)
  • 6.3.3 Performance Bounds
  • 6.4 Statistics and Sufficient Statistics
  • 6.5 MVU and BLU Estimators
  • 6.6 BLUE for Linear Models
  • 6.7 Example: BLUE of the Mean Value of Gaussian rvs
  • Chapter 7 Parameter Estimation
  • 7.1 Maximum Likelihood Estimation (MLE)
  • 7.2 MLE for Gaussian Model x ~ N(µ(θ), C(θ))
  • 7.2.1 Additive Noise Model x = s(θ) + w with w ~ N(0, Cw)
  • 7.2.2 Additive Noise Model x = H(θ)·a + w with w ~ N(0, Cw)
  • 7.2.3 Additive Noise Model with Multiple Observations x = s(θ) + w with w ~ N(0, Cw), Cw Known
  • 7.2.3.1 Linear Model x = H·θ + w
  • 7.2.3.2 Model = · +
  • 7.2.3.3 Model = ( ) · +
  • 7.2.4 Model x ~ N(0, C(θ))
  • 7.2.5 Additive Noise Model with Multiple Observations x = s(θ) + w with w ~ N(0, Cw), Cw Unknown
  • 7.3 Other Noise Models
  • 7.4 MLE and Nuisance Parameters
  • 7.5 MLE for Continuous-Time Signals
  • 7.5.1 Example: Amplitude Estimation
  • 7.5.2 MLE for Correlated Noise Sw(f)
  • 7.6 MLE for Circular Complex Gaussian
  • 7.7 Estimation in Phase/Frequency Modulations
  • 7.7.1 MLE Phase Estimation
  • 7.7.2 Phase Locked Loops
  • 7.8 Least Squares (LS) Estimation
  • 7.8.1 Weighted LS with W = diag{c1,c2,...,cN}
  • 7.8.2 LS Estimation and Linear Models
  • 7.8.3 Under or Over-Parameterizing?
  • 7.8.4 Constrained LS Estimation
  • 7.9 Robust Estimation
  • Chapter 8 Cramér-Rao Bound
  • 8.1 Cramér-Rao Bound and Fisher Information Matrix
  • 8.1.1 CRB for Scalar Problem (P=1)
  • 8.1.2 CRB and Local Curvature of Log-Likelihood
  • 8.1.3 CRB for Multiple Parameters (p>1)
  • 8.2 Interpretation of CRB and Remarks
  • 8.2.1 Variance of Each Parameter
  • 8.2.2 Compactness of the Estimates
  • 8.2.3 FIM for Known Parameters
  • 8.2.4 Approximation of the Inverse of FIM
  • 8.2.5 Estimation Decoupled From FIM
  • 8.2.6 CRB and Nuisance Parameters
  • 8.2.7 CRB for Non-Gaussian rv and Gaussian Bound
  • 8.3 CRB and Variable Transformations
  • 8.4 FIM for Gaussian Parametric Model x ~ N(µ(θ), C(θ))
  • 8.4.1 FIM for x = s(θ) + w with w ~ N(0, Cw)
  • 8.4.2 FIM for Continuous-Time Signals in Additive White Gaussian Noise
  • 8.4.3 FIM for Circular Complex Model
  • Appendix A: Proof of CRB
  • Appendix B: FIM for Gaussian Model
  • Appendix C: Some Derivatives for MLE and CRB Computations
  • Chapter 9 MLE and CRB for Some Selected Cases
  • 9.1 Linear Regressions
  • 9.2 Frequency Estimation x[n] = ao cos(ω0 n + φo) + w[n]
  • 9.3 Estimation of Complex Sinusoid
  • 9.3.1 Proper, Improper, and Non-Circular Signals
  • 9.4 Time of Delay Estimation
  • 9.5 Estimation of Max for Uniform pdf
  • 9.6 Estimation of Occurrence Probability for Binary pdf
  • 9.7 How to Optimize Histograms?
  • 9.8 Logistic Regression
  • Chapter 10 Numerical Analysis and Montecarlo Simulations
  • 10.1 System Identification and Channel Estimation
  • 10.1.1 Matlab Code and Results
  • 10.2 Frequency Estimation
  • 10.2.1 Variable (Coarse/Fine) Sampling
  • 10.2.2 Local Parabolic Regression
  • 10.2.3 Matlab Code and Results
  • 10.3 Time of Delay Estimation
  • 10.3.1 Granularity of Sampling in ToD Estimation
  • 10.3.2 Matlab Code and Results
  • 10.4 Doppler-Radar System by Frequency Estimation
  • 10.4.1 EM Method
  • 10.4.2 Matlab Code and Results
  • Chapter 11 Bayesian Estimation
  • 11.1 Additive Linear Model with Gaussian Noise
  • 11.1.1 Gaussian A-Priori: θ ~ N(θ̄, σ_θ²)
  • 11.1.2 Non-Gaussian A-Priori
  • 11.1.3 Binary Signals: MMSE vs. MAP Estimators
  • 11.1.4 Example: Impulse Noise Mitigation
  • 11.2 Bayesian Estimation in Gaussian Settings
  • 11.2.1 MMSE Estimator
  • 11.2.2 MMSE Estimator for Linear Models
  • 11.3 LMMSE Estimation and Orthogonality
  • 11.4 Bayesian CRB
  • 11.5 Mixing Bayesian and Non-Bayesian
  • 11.5.1 Linear Model with Mixed Random/Deterministic Parameters
  • 11.5.2 Hybrid CRB
  • 11.6 Expectation-Maximization (EM)
  • 11.6.1 EM of the Sum of Signals in Gaussian Noise
  • 11.6.2 EM Method for the Time of Delay Estimation of Multiple Waveforms
  • 11.6.3 Remarks
  • Chapter 12 Optimal Filtering
  • 12.1 Wiener Filter
  • 12.2 MMSE Deconvolution (or Equalization)
  • 12.3 Linear Prediction
  • 12.3.1 Yule-Walker Equations
  • 12.4 LS Linear Prediction
  • 12.5 Linear Prediction and AR Processes
  • 12.6 Levinson Recursion and Lattice Predictors
  • Chapter 13 Bayesian Tracking and Kalman Filter
  • 13.1 Bayesian Tracking of State in Dynamic Systems
  • 13.1.1 Evolution of the A-Posteriori pdf
  • 13.2 Kalman Filter (KF)
  • 13.2.1 KF Equations
  • 13.2.2 Remarks
  • 13.3 Identification of Time-Varying Filters in Wireless Communication
  • 13.4 Extended Kalman Filter (EKF) for Non-Linear Dynamic Systems
  • 13.5 Position Tracking by Multi-Lateration
  • 13.5.1 Positioning and Noise
  • 13.5.2 Example of Position Tracking
  • 13.6 Non-Gaussian Pdf and Particle Filters
  • Chapter 14 Spectral Analysis
  • 14.1 Periodogram
  • 14.1.1 Bias of the Periodogram
  • 14.1.2 Variance of the Periodogram
  • 14.1.3 Filterbank Interpretation
  • 14.1.4 Pdf of the Periodogram (White Gaussian Process)
  • 14.1.5 Bias and Resolution
  • 14.1.6 Variance Reduction and WOSA
  • 14.1.7 Numerical Example: Bandlimited Process and (Small) Sinusoid
  • 14.2 Parametric Spectral Analysis
  • 14.2.1 MLE and CRB
  • 14.2.2 General Model for AR, MA, ARMA Spectral Analysis
  • 14.3 AR Spectral Analysis
  • 14.3.1 MLE and CRB
  • 14.3.2 A Good Reason to Avoid Over-Parametrization in AR
  • 14.3.3 Cramér-Rao Bound of Poles in AR Spectral Analysis
  • 14.3.4 Example: Frequency Estimation by AR Spectral Analysis
  • 14.4 MA Spectral Analysis
  • 14.5 ARMA Spectral Analysis
  • 14.5.1 Cramér-Rao Bound for ARMA Spectral Analysis
  • Appendix A: Which Sample Estimate of the Autocorrelation to Use?
  • Appendix B: Eigenvectors and Eigenvalues of Correlation Matrix
  • Appendix C: Property of Monic Polynomial
  • Appendix D: Variance of Pole in AR(1)
  • Chapter 15 Adaptive Filtering
  • 15.1 Adaptive Interference Cancellation
  • 15.2 Adaptive Equalization in Communication Systems
  • 15.2.1 Wireless Communication Systems in Brief
  • 15.2.2 Adaptive Equalization
  • 15.3 Steepest Descent MSE Minimization
  • 15.3.1 Convergence Analysis and Step-Size µ
  • 15.3.2 An Intuitive View of Convergence Conditions
  • 15.4 From Iterative to Adaptive Filters
  • 15.5 LMS Algorithm and Stochastic Gradient
  • 15.6 Convergence Analysis of LMS Algorithm
  • 15.6.1 Convergence in the Mean
  • 15.6.2 Convergence in the Mean Square
  • 15.6.3 Excess MSE
  • 15.7 Learning Curve of LMS
  • 15.7.1 Optimization of the Step-Size µ
  • 15.8 NLMS Updating and Non-Stationarity
  • 15.9 Numerical Example: Adaptive Identification
  • 15.10 RLS Algorithm
  • 15.10.1 Convergence Analysis
  • 15.10.2 Learning Curve of RLS
  • 15.11 Exponentially-Weighted RLS
  • 15.12 LMS vs. RLS
  • Appendix A: Convergence in Mean Square
  • Chapter 16 Line Spectrum Analysis
  • 16.1 Model Definition
  • 16.1.1 Deterministic Signals s(t)
  • 16.1.2 Random Signals s(t)
  • 16.1.3 Properties of Structured Covariance
  • 16.2 Maximum Likelihood and Cramér-Rao Bounds
  • 16.2.1 Conditional ML
  • 16.2.2 Cramér-Rao Bound for Conditional Model
  • 16.2.3 Unconditional ML
  • 16.2.4 Cramér-Rao Bound for Unconditional Model
  • 16.2.5 Conditional vs. Unconditional Model & Bounds
  • 16.3 High-Resolution Methods
  • 16.3.1 Iterative Quadratic ML (IQML)
  • 16.3.2 Prony Method
  • 16.3.3 MUSIC
  • 16.3.4 ESPRIT
  • 16.3.5 Model Order
  • Chapter 17 Equalization in Communication Engineering
  • 17.1 Linear Equalization
  • 17.1.1 Zero Forcing (ZF) Equalizer
  • 17.1.2 Minimum Mean Square Error (MMSE) Equalizer
  • 17.1.3 Finite-Length/Finite-Block Equalizer
  • 17.2 Non-Linear Equalization
  • 17.2.1 ZF-DFE
  • 17.2.2 MMSE-DFE
  • 17.2.3 Finite-Length MMSE-DFE
  • 17.2.4 Asymptotic Performance for Infinite-Length Equalizers
  • 17.3 MIMO Linear Equalization
  • 17.3.1 ZF MIMO Equalization
  • 17.3.2 MMSE MIMO Equalization
  • 17.4 MIMO-DFE Equalization
  • 17.4.1 Cholesky Factorization and Min/Max Phase Decomposition
  • 17.4.2 MIMO-DFE
  • Chapter 18 2D Signals and Physical Filters
  • 18.1 2D Sinusoids
  • 18.1.1 Moiré Pattern
  • 18.2 2D Filtering
  • 18.2.1 2D Random Fields
  • 18.2.2 Wiener Filtering
  • 18.2.3 Image Acquisition and Restoration
  • 18.3 Diffusion Filtering
  • 18.3.1 Evolution vs. Time: Fourier Method
  • 18.3.2 Extrapolation of the Density
  • 18.3.3 Effect of π/4 Phase-Shift
  • 18.4 Laplace Equation and Exponential Filtering
  • 18.5 Wavefield Propagation
  • 18.5.1 Propagation/Backpropagation
  • 18.5.2 Wavefield Extrapolation and Focusing
  • 18.5.3 Exploding Reflector Model
  • 18.5.4 Wavefield Extrapolation
  • 18.5.5 Wavefield Focusing (or Migration)
  • Appendix A: Properties of 2D Signals
  • Appendix B: Properties of 2D Fourier Transform
  • Appendix C: Finite Difference Method for PDE-Diffusion
  • Chapter 19 Array Processing
  • 19.1 Narrowband Model
  • 19.1.1 Multiple DoAs and Multiple Sources
  • 19.1.2 Sensor Spacing Design
  • 19.1.3 Spatial Resolution and Array Aperture
  • 19.2 Beamforming and Signal Estimation
  • 19.2.1 Conventional Beamforming
  • 19.2.2 Capon Beamforming (MVDR)
  • 19.2.3 Multiple-Constraint Beamforming
  • 19.2.4 Max-SNR Beamforming
  • 19.3 DoA Estimation
  • 19.3.1 ML Estimation and CRB
  • 19.3.2 Beamforming and Root-MVDR
  • Chapter 20 Multichannel Time of Delay Estimation
  • 20.1 Model Definition for ToD
  • 20.2 High Resolution Method for ToD (L=1)
  • 20.2.1 ToD in the Fourier Transformed Domain
  • 20.2.2 CRB and Resolution
  • 20.3 Difference of ToD (DToD) Estimation
  • 20.3.1 Correlation Method for DToD
  • 20.3.2 Generalized Correlation Method
  • 20.4 Numerical Performance Analysis of DToD
  • 20.5 Wavefront Estimation: Non-Parametric Method (L=1)
  • 20.5.1 Wavefront Estimation in Remote Sensing and Geophysics
  • 20.5.2 Narrowband Waveforms and 2D Phase Unwrapping
  • 20.5.3 2D Phase Unwrapping in Regular Grid Spacing
  • 20.6 Parametric ToD Estimation and Wideband Beamforming
  • 20.6.1 Delay and Sum Beamforming
  • 20.6.2 Wideband Beamforming After Fourier Transform
  • Appendix A: Properties of the Sample Correlations
  • Appendix B: How to Delay a Discrete-Time Signal?
  • Appendix C: Wavefront Estimation for 2D Arrays
  • Chapter 21 Tomography
  • 21.1 X-ray Tomography
  • 21.1.1 Discrete Model
  • 21.1.2 Maximum Likelihood
  • 21.1.3 Emission Tomography
  • 21.2 Algebraic Reconstruction Tomography (ART)
  • 21.3 Reconstruction From Projections: Fourier Method
  • 21.3.1 Backprojection Algorithm
  • 21.3.2 How Many Projections to Use?
  • 21.4 Traveltime Tomography
  • 21.5 Internet (Network) Tomography
  • 21.5.1 Latency Tomography
  • 21.5.2 Packet-Loss Tomography
  • Chapter 22 Cooperative Estimation
  • 22.1 Consensus and Cooperation
  • 22.1.1 Vox Populi: The Wisdom of Crowds
  • 22.1.2 Cooperative Estimation as Simple Information Consensus
  • 22.1.3 Weighted Cooperative Estimation (p=1)
  • 22.1.4 Distributed MLE (p=1)
  • 22.2 Distributed Estimation for Arbitrary Linear Models (p>1)
  • 22.2.1 Centralized MLE
  • 22.2.2 Distributed Weighted LS
  • 22.2.3 Distributed MLE
  • 22.2.4 Distributed Estimation for Under-Determined Systems
  • 22.2.5 Stochastic Regressor Model
  • 22.2.6 Cooperative Estimation in the Internet of Things (IoT)
  • 22.2.7 Example: Iterative Distributed Estimation
  • 22.3 Distributed Synchronization
  • 22.3.1 Synchrony-States for Analog and Discrete-Time Clocks
  • 22.3.2 Coupled Clocks
  • 22.3.3 Internet Synchronization and the Network Time Protocol (NTP)
  • Appendix A: Basics of Undirected Graphs
  • Chapter 23 Classification and Clustering
  • 23.1 Historical Notes
  • 23.2 Classification
  • 23.2.1 Binary Detection Theory
  • 23.2.2 Binary Classification of Gaussian Distributions
  • 23.3 Classification of Signals in Additive Gaussian Noise
  • 23.3.1 Detection of Known Signal
  • 23.3.2 Classification of Multiple Signals
  • 23.3.3 Generalized Likelihood Ratio Test (GLRT)
  • 23.3.4 Detection of Random Signals
  • 23.4 Bayesian Classification
  • 23.4.1 To Classify or Not to Classify?
  • 23.4.2 Bayes Risk
  • 23.5 Pattern Recognition and Machine Learning
  • 23.5.1 Linear Discriminant
  • 23.5.2 Least Squares Classification
  • 23.5.3 Support Vectors Principle
  • 23.6 Clustering
  • 23.6.1 K-Means Clustering
  • 23.6.2 EM Clustering
  • References
  • Index
  • EULA

File format: PDF
Copy protection: Adobe DRM (Digital Rights Management)

System requirements:

Computer (Windows; macOS; Linux): Install the free Adobe Digital Editions software before downloading (see the e-book help).

Tablet/smartphone (Android; iOS): Install the free Adobe Digital Editions app before downloading (see the e-book help).

E-book readers: Bookeen, Kobo, Pocketbook, Sony, Tolino, and many others (not Kindle)

The PDF format displays each book page identically on every device, which makes it well suited to the complex layouts used in textbooks and technical references (figures, tables, columns, footnotes). On the small displays of e-readers or smartphones, however, PDFs can be cumbersome because they require a lot of scrolling. Adobe DRM is a "hard" form of copy protection: if the necessary requirements are not met, you will not be able to open the e-book, so prepare your reading device before downloading.

Further information can be found in our e-book help.


Download (available immediately)

€95.99
incl. 19% VAT
Download / single license
PDF with Adobe DRM
see system requirements
Order e-book
