Path path = Paths.get(new File(MyEnTokenizeStopStemUtil.class.getClassLoader().getResource("dicts/jieba.dict").getPath()).getAbsolutePath());
WordDictionary.getInstance().loadUserDict(path);
Why does loading the dictionary this way work fine on Windows, but fail on Linux with "load user dict failure"?
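A likely cause: on Linux the application is often run from a packaged JAR, and `getResource(...).getPath()` then returns a JAR URL path such as `file:/opt/app/app.jar!/dicts/jieba.dict`, which is not a real filesystem path, so `new File(...)` points at a nonexistent file and jieba reports the load failure. A portable workaround is to read the resource as a stream and copy it to a temporary file, then pass that real path to `loadUserDict`. The sketch below illustrates the pattern; a `ByteArrayInputStream` stands in for `MyEnTokenizeStopStemUtil.class.getClassLoader().getResourceAsStream("dicts/jieba.dict")` so the snippet runs standalone, and the actual jieba call is shown as a comment.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class DictLoaderSketch {
    public static void main(String[] args) throws Exception {
        // Stand-in for:
        //   InputStream in = MyEnTokenizeStopStemUtil.class.getClassLoader()
        //       .getResourceAsStream("dicts/jieba.dict");
        // getResourceAsStream works both from a directory on disk and from
        // inside a JAR, unlike getResource(...).getPath().
        try (InputStream in = new ByteArrayInputStream(
                "词条 3 n\n".getBytes(StandardCharsets.UTF_8))) {
            // Copy the stream to a temp file so we have a genuine
            // filesystem path on any OS.
            Path tmp = Files.createTempFile("jieba", ".dict");
            tmp.toFile().deleteOnExit();
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            // In the real code, hand the temp path to jieba:
            // WordDictionary.getInstance().loadUserDict(tmp);
            System.out.println(Files.size(tmp) > 0);
        }
    }
}
```

This avoids any assumption about whether resources live on disk or inside an archive, which is the usual difference between a Windows IDE run and a Linux deployment.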