Resource search results
entropy
- Shannon, Tsallis, escort Tsallis, and Rényi entropy, and their relative entropies
entropy
- The functions include extensive Shannon entropy and nonextensive Tsallis, escort Tsallis, and Rényi entropy. Function names starting with K_q_ indicate relative entropies. Usage is the same for all seven functions.
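A minimal sketch of three of the entropies named above (Shannon, nonextensive Tsallis, Rényi) on a discrete probability vector; the function names here are illustrative, not the package's K_q_ API:

```python
import math

def shannon(p):
    # Shannon entropy in nats: H = -sum_i p_i * ln(p_i)
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis(p, q):
    # Nonextensive Tsallis entropy: S_q = (1 - sum_i p_i^q) / (q - 1)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def renyi(p, q):
    # Renyi entropy: H_q = ln(sum_i p_i^q) / (1 - q)
    return math.log(sum(pi ** q for pi in p)) / (1.0 - q)

p = [0.5, 0.25, 0.25]
print(shannon(p))
print(tsallis(p, 1.0001))  # both limits approach Shannon as q -> 1
print(renyi(p, 1.0001))
```

Both Tsallis and Rényi entropy recover Shannon entropy in the limit q → 1, which is a quick sanity check for any implementation.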
waveletentropy
- An entropy spectrum based on the wavelet transform, used to extract the periodic components of a time series. Compared with the wavelet spectrum, the entropy spectrum has certain advantages in extracting periodic components.
slm.tar
- An n-gram calculator built with the Cambridge SLM toolkit; supports training and compression of 1- to 3-gram models. The compression algorithm uses a relative-entropy pruning strategy.
image_fusion_proforma_evalu_quality
- Image fusion evaluation criteria collected from the web, 13 performance metrics in total: average gradient, edge intensity, information entropy, gray-level mean, standard deviation (MSE), root-mean-square error, peak signal-to-noise ratio (PSNR), spatial frequency (SF), image sharpness, mutual information (MI), structural similarity (SSIM), cross entropy, and relative standard deviation. Feel free to discuss.
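As one concrete example of the metrics listed above, PSNR between a fused image and a reference can be sketched as follows (pure Python, grayscale images as flat pixel lists; a real implementation would use NumPy arrays):

```python
import math

def psnr(img_a, img_b, max_val=255.0):
    # Peak signal-to-noise ratio between two equally sized grayscale
    # images given as flat pixel lists: 10 * log10(MAX^2 / MSE).
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

print(psnr([52, 55, 61, 59], [54, 55, 60, 59]))
```

Higher PSNR indicates the fused image is closer to the reference; identical images give infinite PSNR because the MSE is zero.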
yingxiangpingjia
- Computes the mean, standard deviation, information entropy, correlation coefficient, and relative deviation of an image.
xiangduishang
- Discusses the application of relative-entropy threshold segmentation in image processing, with emphasis on its use in images of stored-grain pests.
Lecture3
- What entropy, mutual information, and relative entropy are
Fergus-Perona
- We present a method to learn and recognize object class models from unlabeled and unsegmented cluttered scenes in a scale-invariant manner. Objects are modeled as flexible constellations of parts. A probabilistic representation is used for all aspects of the model.
Matlab-20120611
- End-point detection. Introduction; frequency-domain features: basic spectral entropy, spectral entropy with normalization, relative spectral entropy, mean, and delta features.
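The basic spectral entropy mentioned above can be sketched by normalizing a frame's power spectrum into a probability distribution and taking its Shannon entropy; the relative variant then compares against a baseline (e.g. silence) spectrum. A minimal sketch, assuming the power spectrum is already computed:

```python
import math

def spectral_entropy(power_spectrum):
    # Normalize the power spectrum to a probability distribution,
    # then take its Shannon entropy (in nats). Flat, noise-like
    # frames score high; tonal/voiced frames score low.
    total = sum(power_spectrum)
    probs = [p / total for p in power_spectrum]
    return -sum(p * math.log(p) for p in probs if p > 0)

print(spectral_entropy([1.0, 1.0, 1.0, 1.0]))  # ln(4): maximally flat
print(spectral_entropy([9.0, 0.0, 0.0, 0.0]))  # 0.0: all energy in one bin
```

For end-point detection, frames whose spectral entropy drops below a threshold relative to the background are flagged as speech.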
APEN
- Sample entropy is a measure of time-series complexity, proposed at the end of the last century by several nonlinear dynamics researchers. Sample entropy has better relative consistency than approximate entropy.
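A straightforward O(N²) sketch of one common variant of sample entropy, SampEn(m, r) = -ln(A/B), using Chebyshev distance and excluding self-matches; this is an illustration, not the APEN package's code:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    # B counts pairs of length-m templates within tolerance r
    # (Chebyshev distance); A counts the same for length m+1.
    # SampEn = -ln(A / B); self-matches (i == j) are excluded.
    n = len(x)

    def count_matches(length):
        templates = [x[i:i + length] for i in range(n - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                dist = max(abs(a - b) for a, b in zip(templates[i], templates[j]))
                if dist <= r:
                    count += 1
        return count

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A perfectly periodic signal has low sample entropy:
print(sample_entropy([1, 2] * 5, m=2, r=0.5))
```

Regular, predictable series yield values near zero; irregular series yield larger values, which is why sample entropy is used as a complexity measure.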
Interval-Type-2-Relative-Entropy
- Interval type-2 fuzzy clustering algorithm; presents ideas and methods for improving the FCM algorithm using interval type-2 fuzzy entropy.
chapter01
- Explains the basic principle of maximum entropy; suitable for beginners learning the relative-entropy principle.
相对熵
- Computes relative entropy and verifies the triangle inequality.
kl
- Computes KL divergence. Information entropy is the uncertainty of a random variable or of a whole system: the larger the entropy, the greater the uncertainty. Relative entropy measures the difference between two positive-valued functions or probability distributions. Cross entropy measures the effort needed to eliminate a system's uncertainty when using a strategy based on a non-true distribution, given the true distribution. Relative entropy = cross entropy - information entropy.
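The identity stated above (relative entropy = cross entropy - information entropy) can be checked directly; a minimal sketch with base-2 logarithms, assuming q is strictly positive wherever p is:

```python
import math

def entropy(p):
    # Information entropy H(p) = -sum p_i * log2(p_i)
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # Cross entropy H(p, q) = -sum p_i * log2(q_i)
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

def kl_divergence(p, q):
    # Relative entropy D_KL(p || q) = sum p_i * log2(p_i / q_i)
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
q = [0.25, 0.25, 0.5]
# The identity D_KL(p || q) = H(p, q) - H(p) should hold exactly:
print(kl_divergence(p, q) - (cross_entropy(p, q) - entropy(p)))
```

KL divergence is zero when p = q and strictly positive otherwise, which matches the interpretation of cross entropy as the cost of coding p with a code optimized for q.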
kl
- Information entropy is the uncertainty of a random variable or of a whole system: the larger the entropy, the greater the uncertainty. Relative entropy measures the difference between two positive-valued functions or probability distributions. Cross entropy measures the effort needed to eliminate a system's uncertainty when using a strategy based on a non-true distribution, given the true distribution. Relative entropy = cross entropy - information entropy.
Homework_Dataset
- Information entropy is the uncertainty of a random variable or of a whole system: the larger the entropy, the greater the uncertainty. Relative entropy measures the difference between two positive-valued functions or probability distributions. Cross entropy measures the effort needed to eliminate a system's uncertainty when using a strategy based on a non-true distribution, given the true distribution.
Class_4_Code
- Information entropy is the uncertainty of a random variable or of a whole system: the larger the entropy, the greater the uncertainty. Relative entropy measures the difference between two positive-valued functions or probability distributions. Cross entropy measures the effort needed to eliminate a system's uncertainty when using a strategy based on a non-true distribution, given the true distribution. Relative entropy = cross entropy - information entropy.
Class_5_Code
- Information entropy is the uncertainty of a random variable or of a whole system: the larger the entropy, the greater the uncertainty. Relative entropy measures the difference between two positive-valued functions or probability distributions. Cross entropy measures the effort needed to eliminate a system's uncertainty when using a strategy based on a non-true distribution, given the true distribution. Relative entropy = cross entropy - information entropy.
chapter01
- Conditional entropy, entropy, joint entropy, mutual information, and relative entropy: basic functions implemented in MATLAB.