Group : nz.ac.waikato.cms.weka

citationKNN

nz.ac.waikato.cms.weka » citationKNN

Modified version of the Citation kNN multi-instance classifier. For more information see: Jun Wang, Jean-Daniel Zucker: Solving the Multiple-Instance Problem: A Lazy Learning Approach. In: 17th International Conference on Machine Learning, 1119-1125, 2000.

CLOPE

nz.ac.waikato.cms.weka » CLOPE

Class implementing the CLOPE clustering algorithm for transactional data. For more information see: Yiling Yang, Xudong Guan, Jinyuan You: CLOPE: a fast and effective clustering algorithm for transactional data. In: Proceedings of the eighth ACM SIGKDD international conference on Knowledge discovery and data mining, 682-687, 2002.

decorate

nz.ac.waikato.cms.weka » decorate

DECORATE is a meta-learner for building diverse ensembles of classifiers by using specially constructed artificial training examples. Comprehensive experiments have demonstrated that this technique is consistently more accurate than the base classifier, Bagging and Random Forests. Decorate also obtains higher accuracy than Boosting on small training sets, and achieves comparable performance on larger training sets. For more details see: P. Melville, R. J. Mooney: Constructing Diverse Classifier Ensembles Using Artificial Training Examples. In: Eighteenth International Joint Conference on Artificial Intelligence, 505-510, 2003; P. Melville, R. J. Mooney (2004). Creating Diversity in Ensembles Using Artificial Data. Information Fusion: Special Issue on Diversity in Multiclassifier Systems.
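
As a rough, hedged sketch of how such a classifier package is typically driven once it is on the classpath, the following Java snippet cross-validates the Decorate meta-learner through the standard Weka API; the class name weka.classifiers.meta.Decorate and the data file iris.arff are assumptions, not taken from this listing.

    import java.util.Random;
    import weka.classifiers.Evaluation;
    import weka.classifiers.meta.Decorate;   // assumed class name for this package's main learner
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class DecorateDemo {
        public static void main(String[] args) throws Exception {
            // Hypothetical ARFF file; any data set with a nominal class will do.
            Instances data = DataSource.read("iris.arff");
            data.setClassIndex(data.numAttributes() - 1);

            // Defaults are kept for the base learner and the ensemble size.
            Decorate decorate = new Decorate();
            Evaluation eval = new Evaluation(data);
            eval.crossValidateModel(decorate, data, 10, new Random(1));
            System.out.println(eval.toSummaryString());
        }
    }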

DMNBtext

nz.ac.waikato.cms.weka » DMNBtext

Class for building and using a Discriminative Multinomial Naive Bayes classifier. For more information see: Jiang Su, Harry Zhang, Charles X. Ling, Stan Matwin: Discriminative Parameter Learning for Bayesian Networks. In: ICML 2008, 2008.

dualPerturbAndCombine

nz.ac.waikato.cms.weka » dualPerturbAndCombine

Class for building and using classification and regression trees based on the closed-form dual perturb and combine algorithm described in Pierre Geurts, Louis Wehenkel: Closed-form dual perturb and combine for tree-based models. In: Proceedings of the 22nd International Conference on Machine Learning, 233-240, 2005.

EMImputation

nz.ac.waikato.cms.weka » EMImputation

Replaces missing numeric values using Expectation Maximization with a multivariate normal model. Described in "Schafer, J.L. Analysis of Incomplete Multivariate Data, New York: Chapman and Hall, 1997."
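
A minimal sketch of running an imputation filter like this through Weka's generic Filter API; the package path weka.filters.unsupervised.attribute.EMImputation and the input file name are assumptions.

    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;
    import weka.filters.Filter;
    import weka.filters.unsupervised.attribute.EMImputation;   // assumed package path

    public class ImputationDemo {
        public static void main(String[] args) throws Exception {
            // Hypothetical ARFF file containing missing numeric values.
            Instances raw = DataSource.read("data-with-gaps.arff");

            EMImputation em = new EMImputation();
            em.setInputFormat(raw);                      // standard Weka filter handshake
            Instances completed = Filter.useFilter(raw, em);

            System.out.println("Missing numeric values replaced; "
                    + completed.numInstances() + " instances retained.");
        }
    }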

ensembleLibrary

nz.ac.waikato.cms.weka » ensembleLibrary

Manages a library of ensemble classifiers.

fastCorrBasedFS

nz.ac.waikato.cms.weka » fastCorrBasedFS

Feature selection method based on a correlation measure together with relevance and redundancy analysis. Use in conjunction with an attribute set evaluator (SymmetricalUncertAttributeSetEval). For more information see: Lei Yu, Huan Liu: Feature Selection for High-Dimensional Data: A Fast Correlation-Based Filter Solution. In: Proceedings of the Twentieth International Conference on Machine Learning, 856-863, 2003.
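
In Weka terms, "use in conjunction with an attribute set evaluator" means pairing a search method with an evaluator inside an AttributeSelection run. The sketch below assumes this package exposes FCBFSearch and SymmetricalUncertAttributeSetEval in the weka.attributeSelection package; the data file is hypothetical.

    import weka.attributeSelection.AttributeSelection;
    import weka.attributeSelection.FCBFSearch;                        // assumed search class
    import weka.attributeSelection.SymmetricalUncertAttributeSetEval; // assumed evaluator class
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class FcbfDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("train.arff");  // hypothetical data set
            data.setClassIndex(data.numAttributes() - 1);

            AttributeSelection selector = new AttributeSelection();
            selector.setEvaluator(new SymmetricalUncertAttributeSetEval());
            selector.setSearch(new FCBFSearch());
            selector.SelectAttributes(data);   // note the capitalised method name in Weka's API

            System.out.println(selector.toResultsString());
        }
    }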

filteredAttributeSelection

nz.ac.waikato.cms.weka » filteredAttributeSelection

This package provides two meta attribute selection evaluators that can apply an arbitrary filter to the input data before executing the actual attribute selection scheme. One filters data and then passes it to an attribute evaluator (FilteredAttributeEval), and the other filters data and then passes it to a subset evaluator (FilteredSubsetEval).
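
A short sketch of plugging the first of these meta evaluators into a standard ranking run. FilteredAttributeEval is named above, but the setters for choosing the wrapped filter and base evaluator are not assumed here, so defaults are left in place; the data file is hypothetical.

    import weka.attributeSelection.AttributeSelection;
    import weka.attributeSelection.FilteredAttributeEval;  // meta evaluator named above
    import weka.attributeSelection.Ranker;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class FilteredEvalDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("train.arff");   // hypothetical data set
            data.setClassIndex(data.numAttributes() - 1);

            // Defaults are used for the wrapped filter and base evaluator.
            AttributeSelection selector = new AttributeSelection();
            selector.setEvaluator(new FilteredAttributeEval());
            selector.setSearch(new Ranker());   // per-attribute evaluators pair with Ranker
            selector.SelectAttributes(data);

            System.out.println(selector.toResultsString());
        }
    }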

fuzzyLaticeReasoning

nz.ac.waikato.cms.weka » fuzzyLaticeReasoning

The Fuzzy Lattice Reasoning Classifier uses the notion of Fuzzy Lattices for creating a Reasoning Environment. The current version can be used for classification using numeric predictors. For more information see: I. N. Athanasiadis, V. G. Kaburlasos, P. A. Mitkas, V. Petridis: Applying Machine Learning Techniques on Air Quality Data for Real-Time Decision Support. In: 1st Intl. NAISO Symposium on Information Technologies in Environmental Engineering (ITEE-2003), Gdansk, Poland, 2003; V. G. Kaburlasos, I. N. Athanasiadis, P. A. Mitkas, V. Petridis (2003). Fuzzy Lattice Reasoning (FLR) Classifier and its Application on Improved Estimation of Ambient Ozone Concentration.

generalizedSequentialPatterns

nz.ac.waikato.cms.weka » generalizedSequentialPatterns

Class implementing a GSP algorithm for discovering sequential patterns in a sequential data set. The attribute identifying the distinct data sequences contained in the set can be determined by the respective option. Furthermore, the set of output results can be restricted by specifying one or more attributes that have to be contained in each element/itemset of a sequence. For further information see: Ramakrishnan Srikant, Rakesh Agrawal (1996). Mining Sequential Patterns: Generalizations and Performance Improvements.
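
A hedged sketch of invoking the associator from Java, assuming the main class is weka.associations.GeneralizedSequentialPatterns; the option setters for the sequence-ID attribute and minimum support are not assumed, so defaults are kept, and the input file is hypothetical.

    import weka.associations.GeneralizedSequentialPatterns;  // assumed class name
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class GspDemo {
        public static void main(String[] args) throws Exception {
            // Hypothetical data set in which one nominal attribute identifies
            // the data sequence each transaction belongs to, as described above.
            Instances transactions = DataSource.read("sequences.arff");

            GeneralizedSequentialPatterns gsp = new GeneralizedSequentialPatterns();
            gsp.buildAssociations(transactions);

            System.out.println(gsp);  // toString() lists the discovered frequent sequences
        }
    }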

grading

nz.ac.waikato.cms.weka » grading

Implements Grading. The base classifiers are "graded". For more information, see A.K. Seewald, J. Fuernkranz: An Evaluation of Grading Classifiers. In: Advances in Intelligent Data Analysis: 4th International Conference, Berlin/Heidelberg/New York/Tokyo, 115-124, 2001.

hiddenNaiveBayes

nz.ac.waikato.cms.weka » hiddenNaiveBayes

Constructs a Hidden Naive Bayes classification model with high classification accuracy and AUC. For more information refer to: H. Zhang, L. Jiang, J. Su: Hidden Naive Bayes. In: Twentieth National Conference on Artificial Intelligence, 919-924, 2005.

hyperPipes

nz.ac.waikato.cms.weka » hyperPipes

Class implementing a HyperPipe classifier. For each category a HyperPipe is constructed that contains all points of that category (essentially records the attribute bounds observed for each category). Test instances are classified according to the category that "most contains the instance". Does not handle numeric class, or missing values in test cases. Extremely simple algorithm, but has the advantage of being extremely fast, and works quite well when you have "smegloads" of attributes.
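
A brief sketch of building the classifier and inspecting its per-class membership scores, assuming the class path weka.classifiers.misc.HyperPipes used in older Weka releases and a hypothetical training file.

    import weka.classifiers.misc.HyperPipes;  // assumed class/package path
    import weka.core.Instance;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class HyperPipesDemo {
        public static void main(String[] args) throws Exception {
            Instances data = DataSource.read("train.arff");  // hypothetical nominal-class data set
            data.setClassIndex(data.numAttributes() - 1);

            HyperPipes hp = new HyperPipes();
            hp.buildClassifier(data);   // records the per-class attribute bounds ("HyperPipes")

            Instance first = data.instance(0);
            double[] dist = hp.distributionForInstance(first);
            System.out.println("Class membership scores: " + java.util.Arrays.toString(dist));
        }
    }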

isotonicRegression

nz.ac.waikato.cms.weka » isotonicRegression

Learns an isotonic regression model. Picks the attribute that results in the lowest squared error. Missing values are not allowed. Can only deal with numeric attributes. Considers the monotonically increasing case as well as the monotonically decreasing case.
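
A small sketch of training the model on an all-numeric regression data set (per the constraints above: numeric attributes only, no missing values), assuming the class path weka.classifiers.functions.IsotonicRegression and a hypothetical input file.

    import weka.classifiers.functions.IsotonicRegression;  // assumed class/package path
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class IsotonicDemo {
        public static void main(String[] args) throws Exception {
            // Hypothetical all-numeric regression data set with no missing values.
            Instances data = DataSource.read("cpu.arff");
            data.setClassIndex(data.numAttributes() - 1);

            IsotonicRegression iso = new IsotonicRegression();
            // Picks the single attribute with the lowest squared error,
            // trying both the increasing and the decreasing fit.
            iso.buildClassifier(data);

            double prediction = iso.classifyInstance(data.instance(0));
            System.out.println("Predicted class value for first instance: " + prediction);
        }
    }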
