This book introduces the reader to the transformative techniques involved in deep learning. A range of methodologies is addressed, including:

* Basic classification and regression with perceptrons (see the short sketch following this description)
* Training algorithms, such as backpropagation and stochastic gradient descent, and the parallelization of training
* Multi-layer perceptrons for learning from descriptors and de-noising data
* Recurrent neural networks for learning from sequences
* Convolutional neural networks for learning from images
* Bayesian optimization for tuning deep learning architectures

Each of these areas has direct application to physical science research, and by the end of the book the reader should feel comfortable enough to select the methodology best suited to their situation, and be able to implement and interpret the outcome of a deep learning model.

The book is designed to teach researchers to think in new ways, providing them with new avenues to attack problems and avoid roadblocks within their research. This is achieved through the inclusion of case-study-like problems at the end of each chapter, which give the reader a chance to practice what they have just learnt in a close-to-real-world setting, with example 'solutions' provided through an online resource.
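The sketch below is not taken from the book; it is a minimal, illustrative example of the first topic in the list above - binary classification with a single perceptron trained by the classic perceptron learning rule - using a toy dataset and arbitrary hyperparameters.

```python
# Minimal perceptron sketch (illustrative only; toy data and hyperparameters).
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: two Gaussian clusters in 2D, labelled 0 and 1.
X = np.vstack([rng.normal(-1.0, 0.5, size=(50, 2)),
               rng.normal(+1.0, 0.5, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate

for epoch in range(20):
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        error = yi - pred          # 0 if correct, +/-1 if misclassified
        w += lr * error * xi       # perceptron update rule
        b += lr * error

accuracy = np.mean(((X @ w + b) > 0).astype(int) == y)
print(f"training accuracy: {accuracy:.2f}")
```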
Dimensions
Height: 244 mm
Width: 170 mm
Thickness: 18 mm
ISBN-13: 978-1-119-40833-8 (9781119408338)
Dr Edward O. Pyzer-Knapp is the worldwide lead for AI Enriched Modelling and Simulation at IBM Research. He previously obtained his PhD from the University of Cambridge, using state-of-the-art computational techniques to accelerate materials design, before moving to Harvard, where he was in charge of the day-to-day running of the Harvard Clean Energy Project - a collaboration with IBM which combined massive distributed computing, quantum-mechanical simulations, and machine learning to accelerate discovery of the next generation of organic photovoltaic materials. He is also Visiting Professor of Industrially Applied AI at the University of Liverpool, and Editor-in-Chief of Applied AI Letters, a journal with a focus on real-world application and validation of AI.
Dr Matt Benatan received his PhD in Audio-Visual Speech Processing from the University of Leeds, after which he went on to pursue a career in AI research within industry. His work to date has involved the research and development of AI techniques for a broad variety of domains, from applications in audio processing through to materials discovery. His research interests include Computer Vision, Signal Processing, Bayesian Optimization, and Scalable Bayesian Inference.
Authors
University of Cambridge
University of Leeds