Resource list
F2
- Program for Question 2 of Problem E of the 11th National Graduate Mathematical Contest in Modeling; it can be used to solve optimization problems.
Cluster_K-means
- The basic process of the k-medoids algorithm is: first, arbitrarily select one representative object for each cluster; each remaining object is assigned to the cluster of its nearest representative object (the distance here is not necessarily Euclidean; it may, for example, be Manhattan distance). Representative objects are then repeatedly replaced by non-representative objects to improve the clustering quality, which is measured by a cost function. When a medoid is replaced by a non-medoid, all points other than the remaining medoids are reassigned.
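The swap-based process described above can be sketched in a few lines of pure Python. This is only an illustrative k-medoids sketch under my own naming and parameter choices, not the code in this package:

```python
import random

def k_medoids(points, k, dist, iters=100, seed=0):
    """PAM-style k-medoids: keep swapping medoids with non-medoids
    while the total assignment distance (the cost function) improves."""
    rng = random.Random(seed)
    medoids = rng.sample(range(len(points)), k)

    def cost(meds):
        # Every point is assigned to its nearest medoid.
        return sum(min(dist(points[i], points[m]) for m in meds)
                   for i in range(len(points)))

    best = cost(medoids)
    for _ in range(iters):
        improved = False
        for mi in range(len(medoids)):
            for o in range(len(points)):
                if o in medoids:
                    continue
                trial = medoids[:mi] + [o] + medoids[mi + 1:]
                c = cost(trial)
                if c < best:          # accept the swap only if cost drops
                    medoids, best = trial, c
                    improved = True
        if not improved:
            break
    return medoids, best

# Manhattan distance, one of the non-Euclidean choices mentioned above.
manhattan = lambda a, b: sum(abs(x - y) for x, y in zip(a, b))
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
meds, total = k_medoids(pts, 2, manhattan)
```

On this toy data the swaps settle on one medoid per blob, with a total Manhattan cost of 4.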
Cluster_DBSCAN
- DBSCAN (Density-Based Spatial Clustering of Applications with Noise) is a density-based spatial clustering algorithm. It partitions regions of sufficient density into clusters and can discover clusters of arbitrary shape in a spatial database containing noise, defining a cluster as a maximal set of density-connected points. The algorithm builds on the density-based notion of clustering: the number of objects (points or other spatial objects) contained within a given region of the clustering space must be no less than a given threshold. Notable advantages of DBSCAN are its fast clustering and its effective handling of noise points.
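The density-connectivity idea above can be shown with a minimal sketch (not this package's code; `eps`, `min_pts`, and the toy data are my own choices): a cluster is grown from any core point, i.e. one whose eps-neighbourhood holds at least min_pts points.

```python
def dbscan(points, eps, min_pts):
    """Label each point with a cluster id; -1 marks noise."""
    n = len(points)
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    neighbours = [[j for j in range(n) if dist(points[i], points[j]) <= eps]
                  for i in range(n)]
    labels = [None] * n
    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        if len(neighbours[i]) < min_pts:
            labels[i] = -1               # tentatively noise
            continue
        cluster += 1                     # i is a core point: grow a cluster
        labels[i] = cluster
        queue = list(neighbours[i])
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster      # border point, previously noise
            if labels[j] is not None:
                continue
            labels[j] = cluster
            if len(neighbours[j]) >= min_pts:
                queue.extend(neighbours[j])  # j is also core: keep expanding
    return labels

pts = [(0, 0), (0, 1), (1, 0), (1, 1),
       (5, 5), (5, 6), (6, 5), (6, 6), (100, 100)]
labels = dbscan(pts, eps=1.5, min_pts=3)
```

The two dense squares become clusters 0 and 1; the isolated point is labelled -1 as noise.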
Text1
- Implementation of a genetic algorithm. It is the most basic genetic algorithm, without any improvements, and is suitable for beginners; the code is short and easy to understand.
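For readers who want a feel for what such an unimproved genetic algorithm looks like, here is a hedged sketch of my own (not the code in this package) on the standard OneMax toy problem, using roulette selection, one-point crossover, and bit-flip mutation:

```python
import random

def genetic_algorithm(n_bits=20, pop_size=40, generations=60,
                      p_cross=0.9, p_mut=0.02, seed=1):
    """Plain GA maximizing OneMax: fitness = number of 1 bits."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(ind) for ind in pop]
        total = sum(scores) or 1

        def select():                     # roulette-wheel selection
            r = rng.uniform(0, total)
            acc = 0
            for ind, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return ind
            return pop[-1]

        nxt = []
        while len(nxt) < pop_size:
            a, b = select()[:], select()[:]
            if rng.random() < p_cross:    # one-point crossover
                cut = rng.randrange(1, n_bits)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                for i in range(n_bits):
                    if rng.random() < p_mut:
                        child[i] ^= 1     # bit-flip mutation
                nxt.append(child)
        pop = nxt[:pop_size]
    best = max(pop, key=fitness)
    return best, fitness(best)

best, score = genetic_algorithm()
```

After 60 generations the best individual is close to the all-ones optimum of 20.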
GA_code
- Genetic algorithm code in object-oriented C++. It compiles and consists of a header file, an algorithm-engine file, and a main-function file; it is well suited to beginners with a background in object-oriented programming.
a2
- C++ implementation of the ant colony algorithm. It is a fairly basic ant colony algorithm, well suited to beginners, with comments included.
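The basic ant system can be sketched on a tiny TSP instance. This is an illustrative Python sketch under my own parameter names (`alpha`, `beta`, `rho`, `q`), not a translation of the C++ code above: ants build tours guided by pheromone (tau^alpha) and inverse distance (eta^beta); pheromone evaporates by rho and is reinforced along each tour.

```python
import random

def ant_colony_tsp(dist, n_ants=10, iters=50, alpha=1.0, beta=2.0,
                   rho=0.5, q=1.0, seed=0):
    """Basic ant system for the TSP on a distance matrix."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                # Probability of edge (i, j) ~ tau^alpha * (1/d)^beta.
                weights = [(j, tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta)
                           for j in unvisited]
                r = rng.uniform(0, sum(w for _, w in weights))
                acc = 0.0
                for j, w in weights:
                    acc += w
                    if acc >= r:
                        break
                tour.append(j)
                unvisited.remove(j)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporate, then deposit q/length on every edge of every tour.
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1 - rho)
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += q / length
                tau[j][i] += q / length
    return best_tour, best_len

# Four cities on a unit square: the optimal tour is the perimeter, length 4.
s = 2 ** 0.5
dist = [[0, 1, s, 1], [1, 0, 1, s], [s, 1, 0, 1], [1, s, 1, 0]]
tour, length = ant_colony_tsp(dist)
```

On this instance the colony quickly finds the perimeter tour of length 4.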
DBN_C
- Deep belief network toolbox written in C, including a restricted Boltzmann machine toolbox.
DBN-cpp
- Deep belief network toolbox written in C++, including a restricted Boltzmann machine implementation.
DBN-python
- Deep belief network toolbox written in Python, including a restricted Boltzmann machine implementation.
DBN-java
- Deep belief network toolbox written in Java, including a restricted Boltzmann machine implementation.
rbf_pid_code
- RBF neural network adaptive control implemented in the MATLAB environment.
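The core building block of such schemes is an RBF network whose output weights are adapted online with a gradient (LMS-style) rule. The Python sketch below shows only that weight-adaptation idea on a function-approximation toy problem; it is my own illustrative code, not the MATLAB control program above, and the centers, width, and learning rate are arbitrary choices:

```python
import math

def rbf_online_fit(samples, centers, width=1.0, lr=0.2, epochs=200):
    """RBF network with fixed Gaussian centers; only the output
    weights are adapted online from the prediction error."""
    w = [0.0] * len(centers)
    phi = lambda x: [math.exp(-((x - c) ** 2) / (2 * width ** 2))
                     for c in centers]
    for _ in range(epochs):
        for x, y in samples:
            h = phi(x)
            err = y - sum(wi * hi for wi, hi in zip(w, h))
            for i in range(len(w)):
                w[i] += lr * err * h[i]   # LMS weight update
    return lambda x: sum(wi * hi for wi, hi in zip(w, phi(x)))

# Illustration: adapt the weights to track sin(x) on [0, 3.1].
samples = [(x / 10, math.sin(x / 10)) for x in range(0, 32)]
centers = [0.0, 0.8, 1.6, 2.4, 3.2]
f = rbf_online_fit(samples, centers)
```

After the online updates converge, `f` approximates the target closely on the sampled interval.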
plot_isotonic_regression
- Isotonic regression finds a non-decreasing approximation of a function while minimizing the mean squared error on the training data. The benefit of such a model is that it does not assume any particular form (such as linearity) for the target function.
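The standard way to compute this fit is the pool-adjacent-violators algorithm (PAVA). The sketch below is a minimal stand-alone PAVA in Python, not the plotting example referenced above:

```python
def isotonic_regression(y):
    """Pool Adjacent Violators: the non-decreasing sequence
    minimizing squared error to y."""
    blocks = []                      # each block: [mean, weight]
    for v in y:
        blocks.append([v, 1])
        # Merge backwards while the order constraint is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            blocks.append([(m1 * w1 + m2 * w2) / (w1 + w2), w1 + w2])
    out = []
    for m, w in blocks:
        out.extend([m] * w)          # expand each pooled block
    return out

fit = isotonic_regression([1, 3, 2, 4, 3, 5])
```

Violating neighbours are pooled into their average, so `[1, 3, 2, 4, 3, 5]` becomes `[1, 2.5, 2.5, 3.5, 3.5, 5]`.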