ID3 Decision Tree Learning Algorithm
ID3(Examples, Target, Attributes)
- Create a root node
- If all Examples have the same Target value, give the root that label
- Else if Attributes is empty, label the root with the most common Target value among Examples
- Else begin
- Calculate the information gain for each attribute, using the average-entropy formula
- Select the attribute, A, with the lowest average entropy (highest information gain) and make this the attribute tested at the root
- For each possible value, v, of this attribute
- Add a new branch below the root, corresponding to A = v
- Let Examples(v) be those examples with A = v
- If Examples(v) is empty, make the new branch a leaf node labelled with the most common Target value among Examples
- Else let the new branch be the tree created by ID3(Examples(v), Target, Attributes - {A})
end
- Return root
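The steps above can be sketched in Python. This is a minimal illustration, not a definitive implementation: it assumes each example is a dict of attribute values, represents the returned tree as nested dicts (a leaf is just a label), and passes Attributes as a dict mapping each attribute to its full set of possible values so that the empty-Examples(v) branch case can actually occur.

```python
from collections import Counter
import math

def entropy(examples, target):
    """Entropy of the Target attribute over a list of example dicts."""
    counts = Counter(ex[target] for ex in examples)
    total = len(examples)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, attribute, target):
    """Reduction in entropy from partitioning the examples on attribute
    (entropy of the whole set minus the weighted average entropy of the parts)."""
    total = len(examples)
    remainder = 0.0
    for v in {ex[attribute] for ex in examples}:
        subset = [ex for ex in examples if ex[attribute] == v]
        remainder += (len(subset) / total) * entropy(subset, target)
    return entropy(examples, target) - remainder

def id3(examples, target, attributes):
    """attributes: dict mapping attribute name -> list of its possible values.
    Returns a leaf label, or a dict {attribute: {value: subtree}}."""
    labels = [ex[target] for ex in examples]
    # All examples share one Target value: the root becomes a leaf with that label.
    if len(set(labels)) == 1:
        return labels[0]
    # No attributes left to test: label with the most common Target value.
    if not attributes:
        return Counter(labels).most_common(1)[0][0]
    # Select the attribute with the highest information gain
    # (equivalently, the lowest average entropy after splitting).
    best = max(attributes, key=lambda a: information_gain(examples, a, target))
    majority = Counter(labels).most_common(1)[0][0]
    rest = {a: vs for a, vs in attributes.items() if a != best}
    tree = {best: {}}
    for v in attributes[best]:
        subset = [ex for ex in examples if ex[best] == v]
        if not subset:
            # Empty branch: leaf labelled with the majority value of Examples.
            tree[best][v] = majority
        else:
            # Recurse with Attributes - {A}.
            tree[best][v] = id3(subset, target, rest)
    return tree
```

For example, on a tiny fragment of the familiar PlayTennis-style data (hypothetical values, chosen only to exercise the code), `id3` splits on `Outlook` and returns `{"Outlook": {"Sunny": "No", "Overcast": "Yes", "Rain": "Yes"}}`.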
This page was borrowed from here and modified slightly to correspond more precisely to Table 3.1, p. 56 of the text Machine Learning by Tom Mitchell.