Search Resource List
Xiaodingdong Word Segmentation Module
- The word segmentation module from the Xiaodingdong (小叮咚) search engine project.
Baidu Word Segmentation Dictionary
- Said to be a Chinese word segmentation dictionary that Baidu used in the past; hopefully it is of some help.
Word Segmentation Module
- A very useful word segmentation module, of reference value to anyone studying search engines.
Chinese Word Segmentation Library CipSegSDK v1.03
- Source code for Chinese word segmentation from 东大, mainly used for preprocessing Chinese text in search engines.
Simple Chinese Word Segmentation in Delphi v1.1
- A simple Chinese word segmenter implemented in Delphi.
Paoding Word Segmentation Tool
- A popular Java Chinese word segmentation program; a tokenization sketch follows below.
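Paoding is normally consumed as a Lucene-compatible analyzer. The sketch below shows only the generic way any Lucene analyzer is driven through the TokenStream API; Lucene's bundled SmartChineseAnalyzer stands in for Paoding's analyzer class, which would slot into the same spot. This is an assumed usage pattern, not Paoding's documented API.

    import java.io.IOException;
    import org.apache.lucene.analysis.Analyzer;
    import org.apache.lucene.analysis.TokenStream;
    import org.apache.lucene.analysis.cn.smart.SmartChineseAnalyzer;
    import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

    public class TokenizeDemo {
        public static void main(String[] args) throws IOException {
            // SmartChineseAnalyzer stands in here; a Paoding analyzer
            // instance would be driven the same way.
            try (Analyzer analyzer = new SmartChineseAnalyzer()) {
                TokenStream ts = analyzer.tokenStream("body", "一个流行的java分词程序");
                CharTermAttribute term = ts.addAttribute(CharTermAttribute.class);
                ts.reset();                  // required before incrementToken()
                while (ts.incrementToken()) {
                    System.out.println(term.toString());
                }
                ts.end();
                ts.close();
            }
        }
    }

Each printed line is one segmented token, which is what a search engine feeds into its inverted index.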
Web Page Information Extraction and Word Segmentation
- This program extracts information from existing web pages and segments the text, writing the results to a file named res.txt. It is preliminary work toward building a search engine.
Chinesewordsegmentationalgorithm
- A Chinese word segmentation algorithm: like Kingsoft PowerWord, it automatically segments the words in a sentence when the mouse hovers over it.
Codes_and_Application
- A word segmentation tool from the Chinese Academy of Sciences, apparently intended for Chinese text; quite efficient.
css
- Part of a Chinese word segmentation system written in Visual C++.
include
- The INCLUDE module of a Chinese word segmentation system written in Visual C++.
utils
- The UTILS module of a Chinese word segmentation system written in Visual C++.
NLuke0.12
- A web-based search and word segmentation tool that extends Lucene.
SentenceSplit
- A Chinese word segmentation component for small search engines, with a built-in dictionary.
src
- A simple search engine built with Lucene that supports Chinese word segmentation; see the index-and-search sketch below.
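As a hedged sketch of the index-then-search flow such a project follows (assuming Lucene 8.x/9.x with the lucene-queryparser module; class and field names here are illustrative): build an in-memory index with an analyzer, then query it. For real Chinese segmentation the StandardAnalyzer would be swapped for a Chinese analyzer such as SmartChineseAnalyzer.

    import org.apache.lucene.analysis.standard.StandardAnalyzer;
    import org.apache.lucene.document.Document;
    import org.apache.lucene.document.Field;
    import org.apache.lucene.document.TextField;
    import org.apache.lucene.index.DirectoryReader;
    import org.apache.lucene.index.IndexWriter;
    import org.apache.lucene.index.IndexWriterConfig;
    import org.apache.lucene.queryparser.classic.QueryParser;
    import org.apache.lucene.search.IndexSearcher;
    import org.apache.lucene.search.ScoreDoc;
    import org.apache.lucene.store.ByteBuffersDirectory;
    import org.apache.lucene.store.Directory;

    public class MiniSearch {
        public static void main(String[] args) throws Exception {
            StandardAnalyzer analyzer = new StandardAnalyzer();
            Directory dir = new ByteBuffersDirectory();   // in-memory index

            // Index one document with a single analyzed text field.
            try (IndexWriter writer = new IndexWriter(dir, new IndexWriterConfig(analyzer))) {
                Document doc = new Document();
                doc.add(new TextField("body", "Lucene builds an inverted index for search", Field.Store.YES));
                writer.addDocument(doc);
            }

            // Parse a query against the same field and print the hits.
            try (DirectoryReader reader = DirectoryReader.open(dir)) {
                IndexSearcher searcher = new IndexSearcher(reader);
                QueryParser parser = new QueryParser("body", analyzer);
                for (ScoreDoc hit : searcher.search(parser.parse("inverted"), 10).scoreDocs) {
                    System.out.println(searcher.doc(hit.doc).get("body"));
                }
            }
        }
    }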
SearchEngine
- A custom web search engine developed in C# with Lucene.Net. The project implements word segmentation and fuzzy indexing, combining them with Lucene.Net's core internals to realize the search mechanism.
UseHLSSplit(Fix)
- Chinese word segmentation in Delphi, calling the 海量 (HyLanda) intelligent segmentation library; fixes the bugs in another version found online.
111
- A Chinese word segmentation lexicon and segmentation dictionary.
fenci
- A simple dictionary-based word segmentation program. Lucene ships plenty of analyzers, but sometimes you do not need complex features, just straightforward segmentation against a specified dictionary. The code is simple and makes a good learning reference; a minimal sketch of the idea follows below.
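The simplest form of dictionary-based segmentation is forward maximum matching: at each position, take the longest dictionary word that matches, falling back to a single character. A minimal self-contained sketch of that idea, not the fenci code itself (all names hypothetical):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Set;

    public class FmmSegmenter {
        private final Set<String> dict;   // the user-specified dictionary
        private final int maxLen;         // longest word length in the dictionary

        public FmmSegmenter(Set<String> dict, int maxLen) {
            this.dict = dict;
            this.maxLen = maxLen;
        }

        public List<String> segment(String text) {
            List<String> out = new ArrayList<>();
            int i = 0;
            while (i < text.length()) {
                int end = Math.min(i + maxLen, text.length());
                String word = null;
                for (int j = end; j > i; j--) {            // try longest match first
                    String cand = text.substring(i, j);
                    if (dict.contains(cand)) { word = cand; break; }
                }
                if (word == null) word = text.substring(i, i + 1);  // unknown: emit one char
                out.add(word);
                i += word.length();
            }
            return out;
        }
    }

For example, new FmmSegmenter(Set.of("中文", "分词"), 4).segment("中文分词程序") yields [中文, 分词, 程, 序]: the two dictionary words are matched greedily and the out-of-vocabulary characters fall through one at a time.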
33753115ktdictseg_v1.0.01
- A highly efficient Chinese word segmentation algorithm that cuts words by searching a dictionary trie, and provides functions for extending the lexicon. A trie-lookup sketch follows below.
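KTDictSeg itself is a separate codebase; below is only a generic sketch (names hypothetical) of the dictionary-trie lookup the description refers to. Words are inserted into a trie of characters, and a segmenter repeatedly asks for the longest dictionary word starting at the current position, so each lookup costs at most the length of the longest word rather than a scan of the whole lexicon.

    import java.util.HashMap;
    import java.util.Map;

    public class DictTrie {
        private static final class Node {
            final Map<Character, Node> next = new HashMap<>();
            boolean isWord;               // true if the path to this node spells a word
        }

        private final Node root = new Node();

        public void insert(String word) {
            Node n = root;
            for (char c : word.toCharArray()) {
                n = n.next.computeIfAbsent(c, k -> new Node());
            }
            n.isWord = true;
        }

        // Length of the longest dictionary word starting at text[start]; 0 if none.
        public int longestMatch(String text, int start) {
            Node n = root;
            int best = 0;
            for (int i = start; i < text.length(); i++) {
                n = n.next.get(text.charAt(i));
                if (n == null) break;      // no dictionary word continues this prefix
                if (n.isWord) best = i - start + 1;
            }
            return best;
        }
    }

Extending the lexicon is then just further insert() calls, which matches the "functions for extending the dictionary" the entry mentions.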