Resource search results
-
0 downloads:
Boosting is a meta-learning approach that aims at combining an ensemble of weak classifiers to form a strong classifier. Adaptive Boosting (AdaBoost) implements this idea as a greedy search for a linear combination of classifiers by overweighting the examples that are misclassified at each iteration.
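As an illustration of that linear combination (a sketch in Python, not code from this package): the strong classifier is a weighted vote over the weak classifiers, with the weights chosen greedily during training.

import numpy as np

def strong_classify(X, weak_classifiers, alphas):
    """Weighted vote: sign(sum_t alpha_t * h_t(x)).
    weak_classifiers: callables mapping an (n, d) array to labels in {-1, +1};
    alphas: the per-classifier weights found by the greedy boosting search."""
    scores = np.zeros(len(X))
    for h, alpha in zip(weak_classifiers, alphas):
        scores += alpha * h(X)
    return np.sign(scores)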
-
-
0 downloads:
AdaBoost (Adaptive Boosting) is a well-known meta machine learning algorithm proposed by Yoav Freund and Robert Schapire. In this project there are two main files,
1. ADABOOST_tr.m
2. ADABOOST_te.m
to train and test a user-coded learning algorithm.
-
-
1 download:
A classic AdaBoost implementation, well suited to beginners: a single file with extensive, detailed comments and easily understandable code.
The function consists of two parts, a simple weak classifier and a boosting part:
the weak classifier tries to find the best threshold in one of the data dimensions to separate the data into the two classes -1 and +1.
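A minimal Python sketch of that kind of threshold-based weak classifier (an illustration under assumed names such as find_best_stump, not the MATLAB code from this package): it scans every data dimension and candidate threshold and keeps the stump with the lowest weighted error.

import numpy as np

def find_best_stump(X, y, w):
    """Search for the decision stump (feature, threshold, polarity) with the
    lowest weighted error. X: (n, d) data, y: labels in {-1, +1},
    w: per-example weights that sum to 1."""
    X, y, w = np.asarray(X), np.asarray(y), np.asarray(w)
    best = {"error": np.inf}
    for j in range(X.shape[1]):                      # each data dimension
        for thresh in np.unique(X[:, j]):            # each candidate threshold
            for polarity in (1, -1):                 # which side becomes class +1
                pred = np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)
                error = w[pred != y].sum()           # weighted misclassification
                if error < best["error"]:
                    best = {"error": error, "feature": j,
                            "threshold": thresh, "polarity": polarity}
    return best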
-
-
0 downloads:
An implementation of the AdaBoost.M1 boosting algorithm applied to text classification. ICTCLAS is used for Chinese word segmentation, and Naive Bayes is the weak classifier. Program parameters are supplied through a configuration file.
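A rough scikit-learn stand-in for this setup (hedged: the original project uses ICTCLAS segmentation, its own AdaBoost.M1 code, and a config file; the toy corpus, whitespace tokenization, and sklearn's AdaBoost below are assumptions):

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.ensemble import AdaBoostClassifier
from sklearn.pipeline import make_pipeline

# Toy corpus standing in for the segmented documents; real input would be
# ICTCLAS-segmented Chinese text joined by spaces.
docs = ["good phone great battery", "bad screen poor battery",
        "great camera good price", "poor build bad value"]
labels = [1, 0, 1, 0]

# Naive Bayes as the weak classifier, boosted with AdaBoost (sklearn implements
# SAMME, a close multi-class relative of AdaBoost.M1).
model = make_pipeline(
    CountVectorizer(),
    AdaBoostClassifier(MultinomialNB(), n_estimators=25),
)
model.fit(docs, labels)
print(model.predict(["good battery great value"]))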
-
-
0 downloads:
This is a classic AdaBoost implementation, in a single file with simple, understandable code.
The function consists of two parts, a simple weak classifier and a boosting part:
the weak classifier tries to find the best threshold in one of the data dimensions to separate the data into the two classes -1 and +1.
The boosting part iterates for the requested number of classifiers; at each step it changes the weights of the misclassified examples. This produces a cascade of "weak classifiers" that behaves like a "strong classifier".
-
-
1 download:
The AdaBoost meta-algorithm is one of the most popular boosting ensemble methods. Put plainly, it is an ensemble method that trains classifiers sequentially and combines them at the end by a weighted sum.
The concrete procedure is: every training example is assigned the same weight, with the weights normalized. After the first classifier has been trained,
its weight (the alpha value) is computed and the weight of every training example is updated; then the second classifier is trained in the same way, and so on,
until the error rate reaches 0 or the specified number of training rounds is reached. The final predicted label is the alpha-weighted sum of the individual classifiers' outputs, passed through sign().
As can be seen, the training process is sequential.
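A compact Python sketch of that sequential procedure (an illustration, not this package's code), using scikit-learn's depth-1 decision tree as the weak classifier:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, n_rounds=20):
    """Sequential AdaBoost training; y must contain labels in {-1, +1}."""
    X, y = np.asarray(X), np.asarray(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                 # equal, normalized example weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)    # train this round's weak classifier
        pred = stump.predict(X)
        err = w[pred != y].sum()            # weighted error rate
        if err == 0:                        # stop early once the error rate is 0
            learners.append(stump)
            alphas.append(1.0)
            break
        alpha = 0.5 * np.log((1 - err) / err)   # this classifier's weight
        w *= np.exp(-alpha * y * pred)          # upweight misclassified examples
        w /= w.sum()                            # keep the weights normalized
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas

# Final label for new data:
# np.sign(sum(a * h.predict(X_new) for h, a in zip(learners, alphas)))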
-
-
0 downloads:
Currently four boosting techniques are supported for this classifier: Discrete AdaBoost, Real AdaBoost, Gentle AdaBoost, and LogitBoost.
-
-
1 download:
Boosting classifier source code written in MATLAB, used for pedestrian recognition in image processing.
-
-
0 downloads:
The project is to determine how much a particular factor influences the helpfulness of a review. We extracted features such as polarity, rating, average word length, and helpfulness ratio from the collected Amazon data. We used a gradient boosting classifier.
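A hedged scikit-learn sketch of such a model; the synthetic feature columns and the helpfulness label below are placeholders for the features extracted from the collected Amazon reviews.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic stand-in for the extracted review features.
rng = np.random.default_rng(0)
n = 200
polarity = rng.uniform(-1, 1, n)          # sentiment polarity score
rating = rng.integers(1, 6, n)            # star rating, 1-5
avg_word_len = rng.normal(4.5, 0.8, n)    # average word length
X = np.column_stack([polarity, rating, avg_word_len])
# Hypothetical label: was the review voted helpful?
y = (polarity + 0.3 * rating + rng.normal(0, 0.5, n) > 1.0).astype(int)

clf = GradientBoostingClassifier(n_estimators=100, max_depth=3)
clf.fit(X, y)

# Relative influence of each factor on predicted helpfulness.
for name, imp in zip(["polarity", "rating", "avg_word_len"],
                     clf.feature_importances_):
    print(f"{name}: {imp:.3f}")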
-