Group: nz.ac.waikato.cms.weka

ordinalClassClassifier

nz.ac.waikato.cms.weka » ordinalClassClassifier

Meta classifier that allows standard classification algorithms to be applied to ordinal class problems. For more information see: Eibe Frank, Mark Hall: A Simple Approach to Ordinal Classification. In: 12th European Conference on Machine Learning, 145-156, 2001. Robert E. Schapire, Peter Stone, David A. McAllester, Michael L. Littman, Janos A. Csirik: Modeling Auction Price Uncertainty Using Boosting-based Conditional Density Estimation. In: Machine Learning, Proceedings of the Nineteenth International Conference (ICML 2002), 546-553, 2002.

Updated: 2017-12-06 09:22
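
As a rough sketch of typical usage from the Weka Java API (assuming this package is installed and the nominal class values are listed in their natural order; the file name and the choice of J48 as base learner are placeholders):

```java
import weka.classifiers.meta.OrdinalClassClassifier;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class OrdinalExample {
    public static void main(String[] args) throws Exception {
        // Load a dataset whose nominal class values appear in their ordinal order
        Instances data = DataSource.read("grades.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        OrdinalClassClassifier occ = new OrdinalClassClassifier();
        occ.setClassifier(new J48());   // any standard classifier can be plugged in
        occ.buildClassifier(data);

        System.out.println(occ);
    }
}
```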

classificationViaClustering

nz.ac.waikato.cms.weka » classificationViaClustering

A simple meta-classifier that uses a clusterer for classification. For cluster algorithms that use a fixed number of clusters, like SimpleKMeans, the user has to make sure that the number of clusters to generate is the same as the number of class labels in the dataset in order to obtain a useful model. Note: at prediction time, a missing value is returned if no cluster is found for the instance. The code is based on the 'clusters to classes' functionality of the weka.clusterers.ClusterEvaluation class by Mark Hall.

Updated: 2017-11-27 07:24
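
A minimal sketch of the matching-clusters-to-classes setup described above, assuming SimpleKMeans as the underlying clusterer; the setClusterer setter name and the dataset path are assumptions, not confirmed API details:

```java
import weka.classifiers.meta.ClassificationViaClustering;
import weka.clusterers.SimpleKMeans;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class ClusteringAsClassifierExample {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("iris.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        SimpleKMeans km = new SimpleKMeans();
        km.setNumClusters(data.numClasses());             // one cluster per class label

        ClassificationViaClustering cvc = new ClassificationViaClustering();
        cvc.setClusterer(km);                              // assumed setter for the clusterer
        cvc.buildClassifier(data);
        System.out.println(cvc);
    }
}
```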

probabilityCalibrationTrees

nz.ac.waikato.cms.weka » probabilityCalibrationTrees

Provides probability calibration trees (PCTs) for local calibration of class probability estimates. To achieve calibration of a base learner, the PCT class must be used as the meta learner in the CascadeGeneralization class, which is also included in this package. The classifier to be calibrated must be used as the base learner in the CascadeGeneralization class. The CascadeGeneralization class can also be used independently to perform CascadeGeneralization for ensemble learning. The code for PCTs is largely the same as the LMT code for growing logistic model trees. For more details, see the ACML paper on probability calibration trees.

Updated: 2017-11-27 07:03

functionalTrees

nz.ac.waikato.cms.weka » functionalTrees

Functional trees (decision trees with oblique splits and functions at the leaves)

Updated: 2017-07-18 05:52
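
A short, hedged example of evaluating a functional tree by cross-validation, assuming the package's classifier class is weka.classifiers.trees.FT; the dataset path is a placeholder:

```java
import weka.classifiers.Evaluation;
import weka.classifiers.trees.FT;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;
import java.util.Random;

public class FTExample {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("data.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        FT ft = new FT();                                 // functional tree with default settings
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(ft, data, 10, new Random(1));
        System.out.println(eval.toSummaryString());
    }
}
```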

gridSearch

nz.ac.waikato.cms.weka » gridSearch

Performs a grid search of parameter pairs for a classifier (Y-axis, default is LinearRegression with the "Ridge" parameter) and the PLSFilter (X-axis, "# of Components") and chooses the best pair found for the actual prediction. The initial grid is worked on with 2-fold CV to determine the values of the parameter pairs for the selected type of evaluation (e.g., accuracy). The best point in the grid is then taken and a 10-fold CV is performed with the adjacent parameter pairs. If a better pair is found, then this will act as the new center and another 10-fold CV will be performed (a kind of hill-climbing). This process is repeated until no better pair is found or the best pair is on the border of the grid. In case the best pair is on the border, one can let GridSearch automatically extend the grid.

Updated: 2017-03-22 16:20
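
A minimal sketch using the default grid described above (LinearRegression's ridge on the Y-axis against the PLSFilter's number of components on the X-axis); the dataset path is a placeholder and is assumed to have a numeric class for these defaults:

```java
import weka.classifiers.meta.GridSearch;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class GridSearchExample {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("regression.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        // With default settings this searches LinearRegression's ridge
        // against the PLSFilter's number of components, as described above.
        GridSearch gs = new GridSearch();
        gs.buildClassifier(data);

        System.out.println(gs);   // reports the best parameter pair that was found
    }
}
```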

latentSemanticAnalysis

nz.ac.waikato.cms.weka » latentSemanticAnalysis

Performs latent semantic analysis and transformation of the data. Use in conjunction with a Ranker search. A low-rank approximation of the full data is found by specifying the number of singular values to use. The dataset may be transformed to give the relation of either the attributes or the instances (default) to the concept space created by the transformation.

Updated: 2017-03-21 13:43
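
A hedged sketch of pairing the evaluator with a Ranker search through Weka's attribute selection API, assuming the evaluator class is weka.attributeSelection.LatentSemanticAnalysis; the dataset path is a placeholder:

```java
import weka.attributeSelection.AttributeSelection;
import weka.attributeSelection.LatentSemanticAnalysis;
import weka.attributeSelection.Ranker;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class LSAExample {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("documents.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        AttributeSelection selection = new AttributeSelection();
        selection.setEvaluator(new LatentSemanticAnalysis());  // SVD-based transformation
        selection.setSearch(new Ranker());                      // used with a Ranker search
        selection.SelectAttributes(data);

        Instances transformed = selection.reduceDimensionality(data);
        System.out.println(transformed.numAttributes() + " attributes after transformation");
    }
}
```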

ensemblesOfNestedDichotomies

nz.ac.waikato.cms.weka » ensemblesOfNestedDichotomies

A meta classifier for handling multi-class datasets with 2-class classifiers by building an ensemble of nested dichotomies. For more information, see: Lin Dong, Eibe Frank, Stefan Kramer: Ensembles of Balanced Nested Dichotomies for Multi-class Problems. In: PKDD, 84-95, 2005. Eibe Frank, Stefan Kramer: Ensembles of nested dichotomies for multi-class problems. In: Twenty-first International Conference on Machine Learning, 2004.

Updated: 2017-02-21 14:58
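
A rough sketch of building such an ensemble, assuming the classes weka.classifiers.meta.END and weka.classifiers.meta.nestedDichotomies.ClassBalancedND from this package; J48, the iteration count, and the dataset path are placeholders:

```java
import weka.classifiers.meta.END;
import weka.classifiers.meta.nestedDichotomies.ClassBalancedND;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class NestedDichotomiesExample {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("multiclass.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        ClassBalancedND nd = new ClassBalancedND();
        nd.setClassifier(new J48());          // 2-class base learner used at each dichotomy

        END ensemble = new END();              // ensemble of randomly generated dichotomies
        ensemble.setClassifier(nd);
        ensemble.setNumIterations(10);         // number of nested dichotomies in the ensemble
        ensemble.buildClassifier(data);
        System.out.println(ensemble);
    }
}
```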

multiInstanceLearning

nz.ac.waikato.cms.weka » multiInstanceLearning

A collection of multi-instance learning classifiers. Includes the Citation KNN method, several variants of the diverse density method, support vector machines for multi-instance learning, simple wrappers for applying standard propositional learners to multi-instance data, decision tree and rule learners, and some other methods.

Updated: 2017-02-21 14:57
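
A minimal sketch using one of the simple wrappers mentioned above (assumed to be weka.classifiers.mi.MIWrapper) to apply a propositional learner to multi-instance data; the dataset path is a placeholder and must already be in Weka's multi-instance (bag) format:

```java
import weka.classifiers.mi.MIWrapper;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class MIExample {
    public static void main(String[] args) throws Exception {
        // Multi-instance data: bag-id attribute, relational bag attribute, class attribute
        Instances data = DataSource.read("musk1.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        MIWrapper wrapper = new MIWrapper();   // applies a propositional learner to MI bags
        wrapper.setClassifier(new J48());
        wrapper.buildClassifier(data);
        System.out.println(wrapper);
    }
}
```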

stackingC

nz.ac.waikato.cms.weka » stackingC

Implements StackingC (a more efficient version of stacking). For more information, see: A.K. Seewald: How to Make Stacking Better and Faster While Also Taking Care of an Unknown Weakness. In: Nineteenth International Conference on Machine Learning, 554-561, 2002. Note: requires the meta classifier to be a numeric prediction scheme.

Updated: 2016-12-02 08:45
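
A short sketch of typical usage, assuming the class weka.classifiers.meta.StackingC and its inherited setClassifiers setter; the base learners and dataset path are placeholders, and the default meta learner is left in place since it must be a numeric prediction scheme:

```java
import weka.classifiers.Classifier;
import weka.classifiers.bayes.NaiveBayes;
import weka.classifiers.lazy.IBk;
import weka.classifiers.meta.StackingC;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class StackingCExample {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("data.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        StackingC stack = new StackingC();   // default meta learner is a numeric prediction scheme
        stack.setClassifiers(new Classifier[] { new J48(), new NaiveBayes(), new IBk() });
        stack.buildClassifier(data);
        System.out.println(stack);
    }
}
```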

percentageErrorMetrics

nz.ac.waikato.cms.weka » percentageErrorMetrics

Provides root mean square percentage error and mean absolute percentage error for evaluating regression schemes.

Updated: 2016-11-29 10:03
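
As a plain illustration of what these two metrics compute (not this package's API), assuming the actual target values are non-zero; all values below are placeholders:

```java
public class PercentageErrors {
    public static void main(String[] args) {
        double[] actual    = { 100, 250, 80 };   // placeholder target values
        double[] predicted = { 110, 240, 90 };   // placeholder predictions

        double sumAbsPct = 0, sumSqPct = 0;
        for (int i = 0; i < actual.length; i++) {
            double pct = (predicted[i] - actual[i]) / actual[i];    // relative error
            sumAbsPct += Math.abs(pct);
            sumSqPct  += pct * pct;
        }
        double mape  = 100.0 * sumAbsPct / actual.length;            // mean absolute percentage error
        double rmspe = 100.0 * Math.sqrt(sumSqPct / actual.length);  // root mean square percentage error
        System.out.println("MAPE = " + mape + "%, RMSPE = " + rmspe + "%");
    }
}
```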

logarithmicErrorMetrics

nz.ac.waikato.cms.weka » logarithmicErrorMetrics

Provides root mean square logarithmic error and mean absolute logarithmic error for evaluating regression schemes.

Updated: 2016-11-29 09:56

LibSVM

nz.ac.waikato.cms.weka » LibSVM

A wrapper class for the libsvm tools (the libsvm classes, typically the jar file, need to be in the classpath to use this classifier). LibSVM runs faster than SMO since it uses LibSVM to build the SVM classifier. LibSVM allows users to experiment with one-class SVM, regressing SVM, and nu-SVM supported by the LibSVM tool. LibSVM reports many useful statistics about the LibSVM classifier (e.g., confusion matrix, precision, recall, ROC score, etc.).

Updated: 2016-11-21 06:26
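
A minimal sketch of using the wrapper with its defaults; it assumes both this package and the libsvm jar are on the classpath, and the dataset path is a placeholder:

```java
import weka.classifiers.functions.LibSVM;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class LibSVMExample {
    public static void main(String[] args) throws Exception {
        // Requires both this wrapper package and the libsvm jar on the classpath
        Instances data = DataSource.read("data.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        LibSVM svm = new LibSVM();   // default settings (C-SVC with an RBF kernel)
        svm.buildClassifier(data);
        System.out.println(svm);
    }
}
```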

kfKettle

nz.ac.waikato.cms.weka » kfKettle

Knowledge Flow step that provides an entry point for data coming from the Kettle ETL tool.

Updated: 2016-11-02 09:40

multiLayerPerceptrons

nz.ac.waikato.cms.weka » multiLayerPerceptrons

This package currently contains classes for training multilayer perceptrons with one hidden layer, where the number of hidden units is user specified. MLPClassifier can be used for classification problems and MLPRegressor is the corresponding class for numeric prediction tasks. The former has as many output units as there are classes, the latter only one output unit. Both minimise a penalised squared error with a quadratic penalty on the (non-bias) weights, i.e., they implement "weight decay", where this penalised error is averaged over all training instances. The size of the penalty can be determined by the user by modifying the "ridge" parameter to control overfitting. The sum of squared weights is multiplied by this parameter before it is added to the squared error. Both classes use BFGS optimisation to fit the weights.

Updated: 2016-10-31 16:23
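
A hedged sketch of training MLPClassifier with a user-chosen weight-decay penalty; the setRidge setter name, the ridge value, and the dataset path are assumptions/placeholders:

```java
import weka.classifiers.functions.MLPClassifier;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class MLPExample {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("data.arff");   // placeholder path
        data.setClassIndex(data.numAttributes() - 1);

        MLPClassifier mlp = new MLPClassifier();
        mlp.setRidge(0.01);   // weight-decay penalty on the non-bias weights (assumed setter name)
        mlp.buildClassifier(data);
        System.out.println(mlp);
    }
}
```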

streamingUnivariateStats

nz.ac.waikato.cms.weka » streamingUnivariateStats

This package provides a Knowledge Flow step to compute summary statistics incrementally.

Updated: 2016-09-26 17:00