Search Resource List
jieba
- jieba segmentation software: an open-source Chinese word segmentation library for Python. It ships with usage examples and is simple to use.
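A minimal sketch of basic jieba usage, assuming the package is installed (pip install jieba); the sample sentence is illustrative only.

import jieba

text = "结巴分词是一个优秀的中文分词工具"

# Accurate mode: the most precise segmentation of the sentence.
print(jieba.lcut(text))

# Full mode: every word that can be formed from the sentence.
print(jieba.lcut(text, cut_all=True))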
jieba-0.37
- This works as a module-style tool that handles data processing and analysis well, with fairly accurate results.
jieba for Python
- How jieba word segmentation is implemented and used in Python.
jieba
- Splits a sentence into small independent words to extract information, matches the words against a data dictionary to obtain useful key information, and uses the result to intelligently filter questions or answer them.
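One reading of this entry is keyword extraction followed by dictionary matching. A minimal sketch using jieba's built-in TF-IDF keyword extraction; the sample sentence and the data_dictionary set are hypothetical, not part of the original resource.

import jieba.analyse

sentence = "请根据题目描述计算三角形的面积"

# Extract the top-ranked keywords from the sentence (TF-IDF based).
keywords = jieba.analyse.extract_tags(sentence, topK=5)

# Hypothetical domain dictionary used to keep only the useful key terms.
data_dictionary = {"三角形", "面积", "周长"}
useful = [w for w in keywords if w in data_dictionary]
print(useful)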
jieba-jieba3k
- A MATLAB toolkit for jieba word segmentation, used in many Chinese word segmentation pattern recognition programs; reusing the existing function toolkit improves productivity. Installation instructions are included.
kmeans
- Segments Chinese text with jieba, converts the segmentation results into word vectors with word2vec, and clusters the Chinese texts with k-means.
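A minimal sketch of the jieba + word2vec + k-means pipeline this entry describes, assuming gensim and scikit-learn are available; the tiny corpus, vector size, and cluster count are illustrative.

import jieba
import numpy as np
from gensim.models import Word2Vec
from sklearn.cluster import KMeans

docs = ["今天天气很好", "明天也是晴天", "股票市场大幅下跌", "投资者情绪低落"]

# Step 1: segment each document with jieba.
tokenized = [jieba.lcut(d) for d in docs]

# Step 2: train word vectors on the segmented corpus.
w2v = Word2Vec(sentences=tokenized, vector_size=100, window=5, min_count=1)

# Step 3: represent each document as the mean of its word vectors, then cluster.
doc_vecs = np.array([np.mean([w2v.wv[w] for w in doc], axis=0) for doc in tokenized])
labels = KMeans(n_clusters=2, n_init=10).fit_predict(doc_vecs)
print(labels)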
jieba segmentation .NET source code
- This project is a .NET source implementation of the jieba word segmentation component; the built library is usable and segmentation quality is reasonably good.
chatbot
- Chat robot. Strictly speaking, an "open-domain generative dialogue model based on deep learning". Framework: Keras (a high-level wrapper over TensorFlow). Approach: LSTM (long short-term memory network), a variant of the mainstream RNN (recurrent neural network), combined with seq2seq (sequence-to-sequence model) plus an attention mechanism. Word segmentation tool: jieba; UI: Tkinter. Trained on the "Qingyun" corpus (100,000+ chitchat dialogues). Runtime environment: Python 3.6 or later, TensorFlow, pandas, numpy, jieba.
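A minimal sketch of the LSTM encoder-decoder (seq2seq) skeleton behind such a chatbot. The attention mechanism and the Qingyun training data are omitted; the vocabulary size, sequence length, and layer dimensions are illustrative assumptions, not the resource's actual settings.

import jieba
from tensorflow.keras.layers import Input, Embedding, LSTM, Dense
from tensorflow.keras.models import Model

VOCAB_SIZE, EMB_DIM, HIDDEN, MAX_LEN = 5000, 128, 256, 20

# jieba splits raw dialogue text into tokens before they are mapped to integer ids.
tokens = jieba.lcut("今天天气怎么样")

# Encoder: reads the question sequence and summarizes it into its final LSTM state.
enc_in = Input(shape=(MAX_LEN,))
enc_emb = Embedding(VOCAB_SIZE, EMB_DIM)(enc_in)
_, state_h, state_c = LSTM(HIDDEN, return_state=True)(enc_emb)

# Decoder: generates the answer sequence conditioned on the encoder state.
dec_in = Input(shape=(MAX_LEN,))
dec_emb = Embedding(VOCAB_SIZE, EMB_DIM)(dec_in)
dec_out, _, _ = LSTM(HIDDEN, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
logits = Dense(VOCAB_SIZE, activation="softmax")(dec_out)

model = Model([enc_in, dec_in], logits)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()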