Tree-based Machine Learning Algorithms: Decision Trees, Random Forests, and Boosting
by Clinton Sheppard


Get a hands-on introduction to building and using decision trees and random forests. Tree-based machine learning algorithms are used to categorize data based on known outcomes in order to facilitate predicting outcomes in new situations. You will learn not only how to use decision trees and random forests for classification and regression, and some of their respective limitations, but also how the algorithms that build them work. Each chapter introduces a new data concern and then walks you through modifying the code, building the engine just in time. Along the way you will gain experience making decision trees and random forests work for you. This book uses Python, an easy-to-read programming language, as a medium for teaching you how these algorithms work, but it isn't about teaching you Python, or about using pre-built machine learning libraries specific to Python. It is about teaching you how some of the algorithms inside those kinds of libraries work and why we might use them, and it gives you hands-on experience that you can take back to your favorite programming environment.

Table of Contents:

  - A brief introduction to decision trees
  - Chapter 1: Branching - uses a greedy algorithm to build a decision tree from data that can be partitioned on a single attribute.
  - Chapter 2: Multiple Branches - examines several ways to partition data in order to generate multi-level decision trees.
  - Chapter 3: Continuous Attributes - adds the ability to partition numeric attributes using greater-than comparisons.
  - Chapter 4: Pruning - explores ways of reducing the amount of error encoded in the tree.
  - Chapter 5: Random Forests - introduces ensemble learning and feature engineering.
  - Chapter 6: Regression Trees - investigates numeric predictions, like age, price, and miles per gallon.
  - Chapter 7: Boosting - adjusts the voting power of the randomly selected decision trees in the random forest in order to improve its ability to predict outcomes.
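
To give a flavor of the greedy tree building covered in the early chapters, here is a minimal sketch, not the book's own code: it searches for the single attribute-value split with the largest information gain, assuming entropy as the purity measure (the book may use a different one), with a toy weather data set invented for illustration.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def best_split(rows, labels):
    """Greedily pick the (attribute index, value) pair whose
    equals/not-equals partition yields the largest information gain."""
    base = entropy(labels)
    best_gain, best = 0.0, None
    for attr in range(len(rows[0])):
        for value in set(row[attr] for row in rows):
            left = [l for row, l in zip(rows, labels) if row[attr] == value]
            right = [l for row, l in zip(rows, labels) if row[attr] != value]
            if not left or not right:
                continue  # split does not actually partition the data
            weighted = (len(left) * entropy(left)
                        + len(right) * entropy(right)) / len(labels)
            gain = base - weighted
            if gain > best_gain:
                best_gain, best = gain, (attr, value)
    return best, best_gain

# Hypothetical data: (outlook, temperature) -> activity
rows = [('sunny', 'hot'), ('sunny', 'mild'), ('rainy', 'mild'), ('rainy', 'hot')]
labels = ['stay in', 'go out', 'stay in', 'stay in']
split, gain = best_split(rows, labels)
```

A full tree builder would recurse on each side of the chosen split; the book develops that machinery chapter by chapter.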

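The boosting chapter's core idea, giving more accurate trees more say in the ensemble's vote, can be sketched in a few lines. This is a hedged illustration, not the book's implementation; the classifier predictions and weights below are invented for the example.

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Combine classifier predictions by summing each classifier's
    voting weight behind its predicted label, then return the winner."""
    tally = defaultdict(float)
    for label, weight in zip(predictions, weights):
        tally[label] += weight
    return max(tally, key=tally.get)

# Three trees disagree; boosting has given the historically more
# accurate first tree enough weight to outvote the other two.
result = weighted_vote(['spam', 'ham', 'ham'], [2.0, 0.9, 0.9])  # -> 'spam'
```

With equal weights this reduces to the plain majority vote of a random forest; boosting's contribution is choosing the weights from each tree's measured accuracy.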