Scaling up Machine Learning

Parallel and Distributed Approaches

Categories: Nonfiction, Computers, Advanced Computing, Engineering, Computer Vision, Artificial Intelligence, General Computing
Editors: Ron Bekkerman, Mikhail Bilenko, John Langford
ISBN: 9781139635578
Publisher: Cambridge University Press
Publication: December 30, 2011
Imprint: Cambridge University Press
Language: English

This book presents an integrated collection of representative approaches for scaling up machine learning and data mining methods on parallel and distributed computing platforms. Demand for parallelizing learning algorithms is highly task-specific: in some settings it is driven by enormous dataset sizes, in others by model complexity or by real-time performance requirements. Making task-appropriate algorithm and platform choices for large-scale machine learning requires understanding the benefits, trade-offs, and constraints of the available options. The solutions presented in the book span a range of parallelization platforms (FPGAs, GPUs, multi-core systems, and commodity clusters), concurrent programming frameworks (CUDA, MPI, MapReduce, and DryadLINQ), and learning settings (supervised, unsupervised, semi-supervised, and online learning). Extensive coverage of the parallelization of boosted trees, SVMs, spectral clustering, belief propagation, and other popular learning algorithms, together with deep dives into several applications, makes the book equally useful for researchers, students, and practitioners.
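
As a taste of the kind of data parallelism the book surveys, here is a minimal sketch (not taken from the book) of MapReduce-style gradient averaging for a linear model in Python: each worker computes the gradient of a squared loss on its own data shard (the map step), and the per-shard gradients are averaged into a single update (the reduce step). The linear model, the synthetic shards, and the use of multiprocessing.Pool are illustrative assumptions, not the book's code.

import numpy as np
from multiprocessing import Pool

def shard_gradient(args):
    """Map step: gradient of the squared loss on one data shard."""
    w, X, y = args
    residual = X @ w - y
    return X.T @ residual / len(y)

def parallel_gradient_step(w, shards, lr=0.1, workers=4):
    """Reduce step: average per-shard gradients and take one descent step."""
    # A real system would keep workers alive and ship data once; this sketch
    # spawns a fresh pool per step for simplicity.
    with Pool(workers) as pool:
        grads = pool.map(shard_gradient, [(w, X, y) for X, y in shards])
    return w - lr * np.mean(grads, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Four synthetic shards standing in for data distributed across workers.
    shards = []
    for _ in range(4):
        X = rng.normal(size=(1000, 2))
        shards.append((X, X @ true_w + 0.01 * rng.normal(size=1000)))
    w = np.zeros(2)
    for _ in range(100):
        w = parallel_gradient_step(w, shards)
    print(w)  # converges toward [2.0, -1.0]

The same map/reduce split carries over to the frameworks the book covers: the per-shard gradient becomes a mapper's or an MPI rank's local computation, and the averaging becomes a reducer or an allreduce.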

More books from Cambridge University Press

Brands, Competition Law and IP
European Economic and Social Constitutionalism after the Treaty of Lisbon
The Cambridge Companion to Postcolonial Travel Writing
Venice
Medieval Affect, Feeling, and Emotion
Plotinus' Legacy
Genocide and the Europeans
Technology and the Diva
Citizenship and Antisemitism in French Colonial Algeria, 1870–1962
Poetry and Paternity in Renaissance England
Shakespeare, Popularity and the Public Sphere
British Plant Communities: Volume 4, Aquatic Communities, Swamps and Tall-Herb Fens
Motives in Children's Development
Making Prussians, Raising Germans
The Social Life of Hagiography in the Merovingian Kingdom