XJIPC OpenIR > Multilingual Information Technology Laboratory
Uyghur Named Entity Identification Based on Maximum Entropy Model
Yong Yang; Chun Xu; Awudan Gulimire
2011
Conference: 3rd International Conference on Information Technology and Computer Science
Pages: 327-330
Conference dates: JUL 23-24, 2011
Conference location: Huangshi, Peoples R China
Place of publication: THREE PARK AVENUE, NEW YORK, NY 10016-5990 USA
Published in: 3RD INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY AND COMPUTER SCIENCE (ITCS 2011), PROCEEDINGS
Abstract

Named entity identification is a fundamental task in natural language processing. This paper proposes an identification method for Uyghur based on maximum entropy models, which can make full use of diverse and arbitrary language features. Feature sets for Chinese and English generally incorporate only part of speech, word form, and similar information. Drawing on the characteristics of Uyghur, this paper segments Uyghur words and adds word stem and word affix information as features to the maximum entropy model. Experimental results show that named entity identification precision, recall, and F-score are all significantly improved by the proposed method.
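The abstract's core idea — a maximum entropy (multinomial logistic regression) classifier whose feature set is extended with stem and affix information — can be sketched as below. This is a minimal illustration, not the authors' implementation: the crude suffix-stripping in `features` is a hypothetical stand-in for a real Uyghur morphological segmenter, and the toy tag set and training loop are assumptions for demonstration only.

```python
import math
from collections import defaultdict

def features(words, i):
    """Binary feature strings for position i. The stem/suffix split here is a
    naive placeholder for the morphological segmentation the paper relies on."""
    w = words[i]
    feats = [
        f"word={w}",
        f"suffix={w[-2:]}",                       # affix feature (assumed form)
        f"stem={w[:-2] if len(w) > 3 else w}",    # stem feature (assumed form)
        f"prev={words[i - 1] if i > 0 else '<s>'}",
    ]
    return feats

def train_maxent(data, labels, epochs=50, lr=0.5):
    """Maximum entropy model: weights per (label, feature) pair, fit by plain
    gradient ascent on the conditional log-likelihood."""
    w = defaultdict(float)
    for _ in range(epochs):
        for feats, gold in data:
            # p(label | feats) ∝ exp(sum of weights of active features)
            scores = {y: sum(w[(y, f)] for f in feats) for y in labels}
            m = max(scores.values())
            exps = {y: math.exp(s - m) for y, s in scores.items()}
            z = sum(exps.values())
            for y in labels:
                grad = (1.0 if y == gold else 0.0) - exps[y] / z
                for f in feats:
                    w[(y, f)] += lr * grad
    return w

def predict(w, feats, labels):
    return max(labels, key=lambda y: sum(w[(y, f)] for f in feats))

# Toy corpus (hypothetical tokens and tags, for illustration only).
sents = [(["alim", "keldi"], ["B-PER", "O"]),
         (["urumchi", "shehiri"], ["B-LOC", "O"])]
labels = ["B-PER", "B-LOC", "O"]
data = [(features(ws, i), tags[i]) for ws, tags in sents for i in range(len(ws))]
weights = train_maxent(data, labels)
```

In this framing, adding stem and affix features simply means emitting extra feature strings per token; the maximum entropy model accommodates them without any change to training, which is the flexibility the abstract emphasizes.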

Organizers: Huazhong Univ Sci & Technol, Natl Tech Univ Ukraine, Harbin Inst Technol, Wuhan Univ, Huangshi Inst Technol, Res Assoc Modern Educ & Comp Sci
Document type: Conference paper
Identifier: http://ir.xjipc.cas.cn/handle/365002/2307
Collection: Multilingual Information Technology Laboratory
Affiliation: Chinese Acad Sci, Xinjiang Tech Inst Phys & Chem, Beijing 100864, Peoples R China
Recommended citation
GB/T 7714
Yong Yang, Chun Xu, Awudan Gulimire. Uyghur named entity identification based on maximum entropy model[C]. THREE PARK AVENUE, NEW YORK, NY 10016-5990 USA: 3RD INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY AND COMPUTER SCIENCE (ITCS 2011), PROCEEDINGS, 2011: 327-330.
Files in this item
No files are associated with this item.

Unless otherwise noted, all content in this system is protected by copyright, with all rights reserved.