Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/68511
Title: Automatic segmentation of Chinese chunks using a neural network
Other Titles: 基于神经元网络的汉语组块自动划分
Authors: Wang, R
Chi, Z 
Keywords: Chunk analysis
Neural networks
Chinese information processing
Issue Date: 2004
Source: 计算机工程 (Computer engineering), 2004, v. 30, no. 20, p. 133-135
Journal: 计算机工程 (Computer engineering) 
Abstract: This paper introduces a method for the automatic segmentation of Chinese chunks based on a three-layer neural network. The input is the segmentation status of each character in a sentence, both on its own and in combination with its preceding and following characters; the output is the segmentation result for each character. For a newly input Chinese sentence, the method does not perform word segmentation, which distinguishes it from other chunk-analysis approaches. Experiments show that the method is both feasible and effective.
This paper presents a method for the automatic segmentation of Chinese chunks based on a three-layer neural network. The corpus has been processed with Chinese word segmentation and with phrase identification and tagging. In the neural network model, the input is the segmentation status of every character and of its combinations with neighboring characters in a Chinese sentence; the output is the segmentation result for every character in the sentence. Preliminary results show that the method is feasible and effective.
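The abstract describes a three-layer (input / hidden / output) feedforward network that maps per-character features to a segmentation label. A minimal sketch of that kind of model is below; the 6-dimensional feature encoding, layer sizes, learning rate, and synthetic labels are all illustrative assumptions, as the record does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ThreeLayerNet:
    """Hypothetical 3-layer feedforward net: per-character features in,
    chunk-boundary probability out. Not the paper's actual architecture."""

    def __init__(self, n_in=6, n_hidden=8, n_out=1, lr=1.0):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)       # hidden activations
        self.y = sigmoid(self.h @ self.W2 + self.b2)  # boundary probability
        return self.y

    def train_step(self, X, t):
        n = len(X)
        y = self.forward(X)
        # Backpropagation for mean squared error with sigmoid units.
        dy = (y - t) * y * (1.0 - y) / n
        dh = (dy @ self.W2.T) * self.h * (1.0 - self.h)
        self.W2 -= self.lr * self.h.T @ dy
        self.b2 -= self.lr * dy.sum(axis=0)
        self.W1 -= self.lr * X.T @ dh
        self.b1 -= self.lr * dh.sum(axis=0)

# Toy training set: binary feature vectors standing in for the "segmentation
# status of a character and its neighbor combinations"; label 1 marks a
# synthetic chunk boundary (here, simply: more than 3 features active).
X = rng.integers(0, 2, (40, 6)).astype(float)
t = (X.sum(axis=1, keepdims=True) > 3).astype(float)

net = ThreeLayerNet()
for _ in range(3000):
    net.train_step(X, t)

pred = (net.forward(X) > 0.5).astype(float)
accuracy = (pred == t).mean()
```

In such a setup, one output unit per character yields a binary in-chunk/boundary decision; the paper's actual feature encoding and output scheme may differ.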
URI: http://hdl.handle.net/10397/68511
ISSN: 1000-3428
Rights: © 2004 China Academic Journal Electronic Publishing House. It is to be used strictly for educational and research purposes.
Appears in Collections:Journal/Magazine Article

Files in This Item:
r20598.pdf (136.99 kB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.