\section{Methods}

The experiments of this paper generated numerous {\em solo-methods}, found the ones that perform best,
then combined those into a set of {\em multi-methods}.
\subsection{Solo-Methods}
Each {\em solo-method} is a pair of {\em pre-processor} and {\em learner}.
This section describes those pairs.

\subsubsection{Learners}

Our selection of learners comes from Shepperd 2005\footnote{does it?}.

{\em Instance-based learners:} 1NN\footnote{Ekrem- need better terminology. need to group abe0 with 1nn. need to distinguish these from neural net}
ABE0

{\em Iterative dichotomization:} 
CART-Off
CART-On

{\em One-layered neural net:}

NNet

{\em Regression methods:}
PCR
PLSR
SLReg
SWReg
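ABE0 is usually described as the baseline analogy-based estimator: min-max normalize all features, measure similarity with Euclidean distance, and return the mean effort of the $k$ nearest analogies (with $k=1$ this coincides with 1NN estimation). A minimal Python sketch, assuming that standard configuration; the function name and data layout are illustrative, not part of any particular implementation:

```python
def abe0_estimate(train, efforts, query, k=1):
    """Baseline analogy-based estimation (ABE0): min-max normalize
    features to 0..1, use Euclidean distance, and return the mean
    effort of the k nearest training projects."""
    # Column-wise min/max taken from the training projects.
    cols = list(zip(*train))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]

    def norm(row):
        # Map each value from its min..max range onto 0..1.
        return [(v - l) / (h - l) if h > l else 0.0
                for v, l, h in zip(row, lo, hi)]

    q = norm(query)
    dists = [(sum((a - b) ** 2 for a, b in zip(norm(r), q)) ** 0.5, e)
             for r, e in zip(train, efforts)]
    nearest = sorted(dists)[:k]
    return sum(e for _, e in nearest) / k
```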

\subsubsection{Pre-Processors}

\footnote{Why these pre-processors? Where in the lit are they recommended?}

{\em Simple numeric techniques:} 
log, norm
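These two pre-processors admit one-line definitions: {\em log} replaces each numeric with its logarithm and {\em norm} rescales each numeric column to 0..1. A sketch under those assumptions (guards for degenerate inputs are illustrative):

```python
import math

def log_pre(values):
    """'log' pre-processor: replace each numeric with its natural log.
    Assumes strictly positive inputs."""
    return [math.log(v) for v in values]

def norm_pre(values):
    """'norm' pre-processor: min-max normalize a column to 0..1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]
```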

{\em Other:}
none

{\em Feature synthesis:} PCA

{\em Feature selection:} SFS
SWReg
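Sequential forward selection (SFS) grows the feature subset greedily: starting from the empty set, repeatedly add whichever remaining feature most improves some evaluation score, and stop when no single addition helps. A minimal sketch; the `evaluate` callback (lower error is better) is an assumed interface, not the paper's actual harness:

```python
def sfs(features, evaluate):
    """Sequential forward selection: greedily add the feature that
    most reduces evaluate(subset); stop when nothing improves."""
    selected, best_err = [], float("inf")
    remaining = list(features)
    while remaining:
        # Score every one-feature extension of the current subset.
        err, f = min((evaluate(selected + [f]), f) for f in remaining)
        if err >= best_err:
            break  # no single feature improves the score
        best_err = err
        selected.append(f)
        remaining.remove(f)
    return selected
```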

{\em Discretization:}
freq3bin
freq5bin
width3bin
width5bin
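The naming convention here is equal-frequency versus equal-width binning into 3 or 5 bins: {\em widthNbin} splits the min..max range into $N$ intervals of equal width, while {\em freqNbin} sorts the values and cuts them into $N$ bins holding roughly equal counts. A sketch of both, with illustrative function names:

```python
def width_bins(values, n):
    """Equal-width discretization (widthNbin): split min..max
    into n bins of identical width."""
    lo, hi = min(values), max(values)
    w = (hi - lo) / n or 1  # guard against a constant column
    return [min(int((v - lo) / w), n - 1) for v in values]

def freq_bins(values, n):
    """Equal-frequency discretization (freqNbin): sort, then cut
    into n bins holding roughly the same number of values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    bins = [0] * len(values)
    per = len(values) / n
    for rank, i in enumerate(order):
        bins[i] = min(int(rank / per), n - 1)
    return bins
```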

\subsection{Pruning}
Not all of the above combinations are valid. For example, ABE0 normalizes all numerics from their min..max range to 0..1. Hence:
\bi
\item
The combination
{\em norm-ABE0} is redundant;
\item
The combination {\em none-ABE0} is misleading, since ABE0 always applies normalization.
\ei
The following methods were therefore pruned:
\bi
\item XXXX
\ei
In all, this left YYY solo methods\footnote{Ekrem- please regenerate the charts with only these methods}.
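Generating the solo-methods amounts to taking the cross product of pre-processors and learners and discarding the invalid pairs. A sketch of that bookkeeping; the pruned set below contains only the two pairs ruled out above, since the full pruned list is still a placeholder in the text:

```python
from itertools import product

preprocessors = ["none", "norm", "log", "PCA", "SFS", "SWReg",
                 "freq3bin", "freq5bin", "width3bin", "width5bin"]
learners = ["ABE0", "1NN", "CART-On", "CART-Off", "NNet",
            "PCR", "PLSR", "SLReg", "SWReg"]

# Only the pairs ruled out above; the complete list is still TBD.
pruned = {("norm", "ABE0"), ("none", "ABE0")}

solo_methods = [(p, l) for p, l in product(preprocessors, learners)
                if (p, l) not in pruned]
```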
