Changeset 1188 for trunk/yat/classifier/KNN.h
Timestamp: Feb 29, 2008, 11:14:04 AM
File: trunk/yat/classifier/KNN.h (1 edited)

trunk/yat/classifier/KNN.h
Diff r1164 → r1188:

  namespace classifier {

-   ///
-   /// @brief Class for Nearest Neigbor Classification.
-   ///
-   /// The template argument Distance should be a class modelling
-   /// the concept \ref concept_distance.
-   /// The template argument NeigborWeighting should be a class modelling
-   /// the concept \ref concept_neighbor_weighting.
-   ///
+   /**
+      @brief Nearest Neighbor Classifier
+
+      A sample is predicted based on the classes of its k nearest
+      neighbors among the training data samples. KNN supports using
+      different measures, for example, Euclidean distance, to define
+      distance between samples. KNN also supports using different ways to
+      weight the votes of the k nearest neighbors. For example, using a
+      uniform vote a test sample gets a vote for each class which is the
+      number of nearest neighbors belonging to the class.
+
+      The template argument Distance should be a class modelling the
+      concept \ref concept_distance. The template argument
+      NeighborWeighting should be a class modelling the concept \ref
+      concept_neighbor_weighting.
+   */
    template <typename Distance, typename NeighborWeighting=KNN_Uniform>
    class KNN : public SupervisedClassifier
    {

    public:
-     ///
-     /// @brief Constructor
-     ///
+     /**
+        @brief Default constructor.
+
+        The number of nearest neighbors (k) is set to 3. Distance and
+        NeighborWeighting are initialized using their default
+        constructors.
+     */
      KNN(void);


-     ///
-     /// @brief Constructor
-     ///
+     /**
+        @brief Constructor using an initialized distance measure.
+
+        The number of nearest neighbors (k) is set to 3. This constructor
+        should be used if Distance has parameters and the user wants
+        to specify the parameters by initializing Distance prior to
+        constructing the KNN.
+     */
      KNN(const Distance&);


-     ///
-     /// @brief Destructor
-     ///
+     /**
+        Destructor
+     */
      virtual ~KNN();


-     ///
-     /// Default number of neighbors (k) is set to 3.
-     ///
-     /// @return the number of neighbors
-     ///
+     /**
+        \brief Get the number of nearest neighbors.
+        \return The number of neighbors.
+     */
      u_int k() const;

-     ///
-     /// @brief sets the number of neighbors, k.
-     ///
+     /**
+        \brief Set the number of nearest neighbors.
+
+        Sets the number of neighbors to \a k_in.
+     */
      void k(u_int k_in);

      KNN<Distance,NeighborWeighting>* make_classifier(void) const;

-     ///
-     /// Train the classifier using training data and target.
-     ///
-     /// If the number of training samples set is smaller than \a k_in,
-     /// k is set to the number of training samples.
-     ///
-     void train(const MatrixLookup&, const Target&);
-
-     ///
-     /// Train the classifier using weighted training data and target.
-     ///
-     void train(const MatrixLookupWeighted&, const Target&);
-
-
-     ///
-     /// For each sample, calculate the number of neighbors for each
-     /// class.
-     ///
-     void predict(const MatrixLookup&, utility::Matrix&) const;
-
-     ///
-     /// For each sample, calculate the number of neighbors for each
-     /// class.
-     ///
-     void predict(const MatrixLookupWeighted&, utility::Matrix&) const;
+     /**
+        @brief Make predictions for unweighted test data.
+
+        Predictions are calculated and returned in \a results. For
+        each sample in \a data, \a results contains the weighted number
+        of nearest neighbors which belong to each class. Numbers of
+        nearest neighbors are weighted according to
+        NeighborWeighting. If a class has no training samples NaN's are
+        returned for this class in \a results.
+     */
+     void predict(const MatrixLookup& data, utility::Matrix& results) const;
+
+     /**
+        @brief Make predictions for weighted test data.
+
+        Predictions are calculated and returned in \a results. For
+        each sample in \a data, \a results contains the weighted
+        number of nearest neighbors which belong to each class as in
+        predict(const MatrixLookup& data, utility::Matrix& results).
+        If a test and training sample pair has no variables with
+        nonzero weights in common, there are no variables which can
+        be used to calculate the distance between the two samples. In
+        this case the distance between the two is set to infinity.
+     */
+     void predict(const MatrixLookupWeighted& data, utility::Matrix& results) const;
+
+
+     /**
+        @brief Train the KNN using unweighted training data with known
+        targets.
+
+        For KNN there is no actual training; the entire training data
+        set is stored with targets. KNN only stores references to \a data
+        and \a targets as copying these would make the %classifier
+        slow. If the number of training samples set is smaller than k,
+        k is set to the number of training samples.
+
+        \note If \a data or \a targets go out of scope or are
+        deleted, the KNN becomes invalid and further use is undefined
+        unless it is trained again.
+     */
+     void train(const MatrixLookup& data, const Target& targets);
+
+     /**
+        \brief Train the KNN using weighted training data with known targets.
+
+        See train(const MatrixLookup& data, const Target& targets) for
+        additional information.
+     */
+     void train(const MatrixLookupWeighted& data, const Target& targets);

    private:

      const MatrixLookup* data_ml_;
      const MatrixLookupWeighted* data_mlw_;