Weightless neural tools - toward cognitive macrostructures, I. Aleksander;
an estimation theoretic basis for the design of sorting and classification networks, R.W. Brockett;
a self-organizing ARTMAP neural architecture for supervised learning and pattern recognition, G.A. Carpenter et al;
hybrid neural network architectures - equilibrium systems that pay attention, L.N. Cooper;
neural networks for internal representation of movements in primates and robots, R. Eckmiller et al;
recognition and segmentation of characters in handwriting with selective attention, K. Fukushima et al;
adaptive acquisition of language, A.L. Gorin et al;
what connectionist models learn - learning and representation in connectionist networks, S.J. Hanson and D.J. Burr;
early vision, focal attention and neural nets, B. Julesz;
toward hierarchical matched filtering, R. Hecht-Nielsen;
some variations on training of recurrent networks, G.M. Kuhn and N.P. Herzberg;
generalized perceptron networks with nonlinear discriminant functions, S.Y. Kung et al;
neural tree networks, A. Sankar and R. Mammone;
capabilities and training of feedforward nets, E.D. Sontag;
a fast learning algorithm for multilayer neural networks based on projection methods, S.J. Yeh and H. Stark.