2. Benefits of Feature Subset Selection
• Fewer dimensions, which mitigates the curse of dimensionality
• Improved model and classifier performance
• Simpler models and reduced risk of overfitting
• Faster training times
4. Feature Selection Methods
Wrapper
Uses a search algorithm to explore the space of possible feature subsets and evaluates each subset by training and running a model on it
Risk of overfitting to the chosen model
Computationally expensive
Embedded
Selection is embedded in, and specific to, a particular model
Filter
Similar to wrappers in the search approach, but each subset is scored with a simpler filter measure rather than by running a model
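The wrapper idea above can be sketched as a search over feature subsets, each scored by a model-evaluation callback. The sketch below uses an exhaustive search and a toy `toy_evaluate` function; both names are hypothetical stand-ins for the expensive train-and-validate step a real wrapper would run.

```python
from itertools import combinations

def wrapper_select(features, evaluate, max_size=None):
    """Exhaustively search feature subsets, scoring each with `evaluate`.

    `evaluate` stands in for training and validating a model on the
    subset; in practice this call is the expensive part of a wrapper.
    """
    best_subset, best_score = (), float("-inf")
    max_size = max_size or len(features)
    for k in range(1, max_size + 1):
        for subset in combinations(features, k):
            score = evaluate(subset)
            if score > best_score:
                best_subset, best_score = subset, score
    return best_subset, best_score

# Toy stand-in for a validation score: features "a" and "c" are assumed
# informative, and larger subsets are penalised slightly.
def toy_evaluate(subset):
    return sum(f in ("a", "c") for f in subset) - 0.1 * len(subset)

best, score = wrapper_select(["a", "b", "c", "d"], toy_evaluate)
```

The exhaustive loop makes the computational cost explicit: the number of subsets grows exponentially with the number of features, which is why practical wrappers use greedy searches instead.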
15. • Forward selection methods: these methods
start with one or a few features selected
according to a method-specific selection
criterion. More features are iteratively added
until a stopping criterion is met.
• Backward elimination methods: methods of
this type start with all features and iteratively
remove one feature, or a group of features,
at a time until a stopping criterion is met.
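The forward-selection loop above can be sketched greedily: start empty, repeatedly add the single feature that most improves the score, and stop when no addition helps. The `evaluate` callback and `toy_evaluate` scorer are hypothetical stand-ins for a real validation score.

```python
def forward_select(features, evaluate, min_gain=1e-9):
    """Greedy forward selection.

    Starts with no features; each iteration adds the feature whose
    inclusion gives the best `evaluate` score. Stops when the best
    candidate no longer improves the score (the stopping criterion).
    """
    selected = []
    score = evaluate(selected)
    remaining = list(features)
    while remaining:
        best_score, best_f = max(
            (evaluate(selected + [f]), f) for f in remaining
        )
        if best_score <= score + min_gain:
            break  # stopping criterion: no feature improves the score
        selected.append(best_f)
        remaining.remove(best_f)
        score = best_score
    return selected, score

# Toy stand-in for a validation score: "a" and "c" are assumed
# informative, with a small penalty per selected feature.
def toy_evaluate(subset):
    return sum(f in ("a", "c") for f in subset) - 0.1 * len(subset)

selected, score = forward_select(["a", "b", "c", "d"], toy_evaluate)
```

Backward elimination is the mirror image: start from the full feature list and greedily remove the feature whose deletion hurts the score least.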
16. Relief
• Evaluates the worth of an attribute by
repeatedly sampling an instance and comparing
the value of the given attribute to its values
on the nearest instance of the same class (the
nearest hit) and of a different class (the nearest
miss). Can operate on both discrete and
continuous class data.
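The sampling scheme above can be sketched for the simple two-class, continuous-feature case: each sampled instance pulls an attribute's weight up by its difference to the nearest miss and down by its difference to the nearest hit. This is a minimal sketch of the idea, not a full Relief implementation (no handling of missing values or multi-class data); the toy dataset at the end is invented for illustration.

```python
import random

def relief(X, y, n_samples=None, seed=0):
    """Minimal Relief sketch for two classes and continuous features.

    An attribute's weight rises when it separates an instance from its
    nearest miss (different class) and falls when it separates the
    instance from its nearest hit (same class).
    """
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    # Normalise attribute differences by each feature's value range.
    ranges = [
        (max(r[j] for r in X) - min(r[j] for r in X)) or 1.0
        for j in range(d)
    ]
    diff = lambda a, b, j: abs(a[j] - b[j]) / ranges[j]
    dist = lambda a, b: sum(diff(a, b, j) for j in range(d))
    w = [0.0] * d
    m = n_samples or n
    for _ in range(m):
        i = rng.randrange(n)
        hits = [k for k in range(n) if k != i and y[k] == y[i]]
        misses = [k for k in range(n) if y[k] != y[i]]
        h = min(hits, key=lambda k: dist(X[i], X[k]))
        mi = min(misses, key=lambda k: dist(X[i], X[k]))
        for j in range(d):
            w[j] += (diff(X[i], X[mi], j) - diff(X[i], X[h], j)) / m
    return w

# Toy data: feature 0 separates the classes; feature 1 is noise,
# so Relief should weight feature 0 higher.
X = [[0.0, 0.3], [0.1, 0.9], [1.0, 0.2], [0.9, 0.8]]
y = [0, 0, 1, 1]
w = relief(X, y, seed=1)
```

Note that a redundant copy of feature 0 would receive the same high weight, which is exactly the limitation the next slide describes.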
17. Relief
• Relief does not help with redundant features:
if most of the given features are relevant to
the concept, it will select most of them,
even though only a fraction are necessary
to describe the concept.