# Changeset 1200 for trunk/yat/classifier

Timestamp:
Mar 5, 2008, 3:30:58 AM
Message:

working on #75

Location:
trunk/yat/classifier
Files:
3 edited


## trunk/yat/classifier/NBC.h  r1184

```diff
@@
 ///
-/// \brief Train the %classifier using training data and targets.
+/// \brief Train the NBC using training data and targets.
 ///
 /// For each class mean and variance are estimated for each
@@
 ///
-/// \brief Train the %classifier using weighted training data and
+/// \brief Train the NBC using weighted training data and
 /// targets.
@@
-/// \f$ P_j = \frac{1}{Z} \exp\left(-N\frac{\sum
-/// {w_i(x_i-\mu_i)^2}/(2\sigma_i^2)}{\sum w_i}\right)\f$, where
-/// \f$ \mu_i \f$ and \f$ \sigma_i^2 \f$ are the estimated mean and
-/// variance, respectively. Z is chosen such that total probability
-/// equals unity, \f$ \sum P_j = 1 \f$.
+/// \f$ P_j = \frac{1}{Z} \exp\left(-N\frac{\sum
+/// {w_i(x_i-\mu_i)^2}/(2\sigma_i^2)}{\sum w_i}\right)
+/// \prod_i\frac{1}{\sqrt{2\pi\sigma_i^2}}\f$, where \f$ \mu_i \f$
+/// and \f$ \sigma_i^2 \f$ are the estimated mean and variance,
+/// respectively. Z is chosen such that total probability equals
+/// unity, \f$ \sum P_j = 1 \f$.
 ///
 /// \note If parameters could not be estimated during training, due
```

## trunk/yat/classifier/SVM.cc  r1177

```diff
@@
 }

-/*
-  double SVM::predict(const DataLookup1D& x) const
-  {
-    return margin_*(y+bias_);
-  }
-*/

 int SVM::target(size_t i) const
```

## trunk/yat/classifier/SVM.h  r1175

```diff
@@
 class KernelLookup;

-///
-/// @brief Support Vector Machine
-///
-///
-/// Class for SVM using Keerthi's second modification of Platt's
-/// Sequential Minimal Optimization. The SVM uses all data given for
-/// training. If validation or testing is wanted this should be
-/// taken care of outside (in the kernel).
-///
+/**
+   \brief Support Vector Machine
+*/
 class SVM
 {
@@
   /**
-     Copy constructor.
+     \brief Copy constructor.
   */
   SVM(const SVM&);
@@
   ///
-  /// Destructor
+  /// \brief Destructor
   ///
   virtual ~SVM();
@@
-  ///
-  /// Same as copy constructor.
-  ///
+  /**
+     \brief Create an untrained copy of SVM.
+
+     \returns A dynamically allocated SVM, which has to be deleted
+     by the caller to avoid memory leaks.
+  */
```
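The weighted expression in the NBC training documentation above can be sketched in a few lines. The helper `nbc_posterior` below and its argument layout are hypothetical illustrations (not part of the yat API): it evaluates \f$ P_j \f$ for one sample from precomputed per-class means and variances, normalizing so that the probabilities sum to one.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch (not the yat API) of the weighted NBC posterior:
//   P_j = (1/Z) exp(-N * sum_i w_i (x_i - mu_ji)^2 / (2 sigma2_ji) / sum_i w_i)
//         * prod_i 1 / sqrt(2 pi sigma2_ji),
// with Z chosen such that sum_j P_j = 1.
std::vector<double>
nbc_posterior(const std::vector<double>& x,                   // sample
              const std::vector<double>& w,                   // weights
              const std::vector<std::vector<double> >& mu,    // [class][feature]
              const std::vector<std::vector<double> >& sigma2)// [class][feature]
{
  const double pi = 3.14159265358979323846;
  const std::size_t n = x.size();
  double sum_w = 0.0;
  for (std::size_t i = 0; i < n; ++i)
    sum_w += w[i];
  std::vector<double> p(mu.size());
  double z = 0.0;
  for (std::size_t j = 0; j < mu.size(); ++j) {
    double s = 0.0;        // sum_i w_i (x_i - mu_ji)^2 / (2 sigma2_ji)
    double log_prod = 0.0; // log of prod_i 1/sqrt(2 pi sigma2_ji)
    for (std::size_t i = 0; i < n; ++i) {
      const double d = x[i] - mu[j][i];
      s += w[i] * d * d / (2.0 * sigma2[j][i]);
      log_prod -= 0.5 * std::log(2.0 * pi * sigma2[j][i]);
    }
    p[j] = std::exp(-static_cast<double>(n) * s / sum_w + log_prod);
    z += p[j];
  }
  for (std::size_t j = 0; j < p.size(); ++j)
    p[j] /= z; // normalize: sum_j P_j = 1
  return p;
}
```

With unit weights this reduces to the unweighted Gaussian naive Bayes posterior, which is why the unweighted `train` can share the same expression.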
```diff
@@
   SVM* make_classifier(void) const;

   ///
-  /// @return \f$ \alpha \f$
+  /// @return alpha parameters
   ///
   const utility::Vector& alpha(void) const;
@@
   /// large C means the training will be focused on getting samples
   /// correctly classified, with risk for overfitting and poor
-  /// generalisation. A too small C will result in a training in which
-  /// misclassifications are not penalized. C is weighted with
-  /// respect to the size, so \f$ n_+C_+ = n_-C_- \f$, meaning a
-  /// misclassificaion of the smaller group is penalized
+  /// generalisation. A too small C will result in a training, in
+  /// which misclassifications are not penalized. C is weighted with
+  /// respect to the size such that \f$ n_+C_+ = n_-C_- \f$, meaning
+  /// a misclassificaion of the smaller group is penalized
   /// harder. This balance is equivalent to the one occuring for
-  /// regression with regularisation, or ANN-training with a
+  /// %regression with regularisation, or ANN-training with a
   /// weight-decay term. Default is C set to infinity.
@@
      + bias \f$, where \f$ t \f$ is the target.

-     @return output
+     @return output of training samples
   */
   const theplu::yat::utility::Vector& output(void) const;
@@
      is calculated as the output times the margin, i.e., geometric
      distance from decision hyperplane: \f$ \frac{\sum \alpha_j
-     t_j K_{ij} + bias}{w} \f$ The output has 2 rows. The first row
+     t_j K_{ij} + bias}{|w|} \f$ The output has 2 rows. The first row
      is for binary target true, and the second is for binary target
      false. The second row is superfluous as it is the first row

   void predict(const KernelLookup& input, utility::Matrix& predict) const;
@@
-  /*
-  ///
-  /// @return output times margin (i.e. geometric distance from
-  ///
-  double predict(const DataLookupWeighted1D& input) const;
-  */
@@
      decreased.

+     Class for SVM using Keerthi's second modification of Platt's
+     Sequential Minimal Optimization. The SVM uses all data given
+     for training.
+
-     \throw if maximal number of epoch is reach.
+     \throw std::runtime_error if maximal number of epoch is reach.
```
```diff
   */
   void train(const KernelLookup& kernel, const Target& target);
```