Part 1 Approximation theory:
There exists a neural network that does not make avoidable mistakes, A.R. Gallant and H. White
Multilayer feedforward networks are universal approximators, K. Hornik et al.
Universal approximation using feedforward networks with non-sigmoid hidden layer activation functions, M. Stinchcombe and H. White
Approximating and learning unknown mappings using multilayer feedforward networks with bounded weights, M. Stinchcombe and H. White
Universal approximation of an unknown mapping and its derivatives, K. Hornik et al.

Part 2 Learning and statistics:
Neural network learning and statistics, H. White
Learning in artificial neural networks, H. White
Some asymptotic results for learning in single hidden layer feedforward networks, H. White
Connectionist nonparametric regression, H. White
Nonparametric estimation of conditional quantiles using neural networks
On learning the derivatives of an unknown mapping with multilayer feedforward networks, A.R. Gallant and H. White
Consequences and detection of misspecified nonlinear regression models, H. White
Maximum likelihood estimation of misspecified models, H. White
Some results for sieve estimation with dependent observations, H. White and J. Wooldridge