Weakly Supervised Natural Language Learning Without Redundant Views

Vincent Ng and Claire Cardie.
Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics (HLT-NAACL), 2003.


Abstract

We investigate single-view algorithms as an alternative to multi-view algorithms for weakly supervised learning for natural language processing tasks without a natural feature split. In particular, we apply co-training, self-training, and EM to one such task and find that both self-training and FS-EM, a new variation of EM that incorporates feature selection, outperform co-training and are comparatively less sensitive to parameter changes.
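Self-training, one of the single-view algorithms compared in the abstract, can be summarized as: train a classifier on the labeled data, label the unlabeled examples, move the most confidently labeled ones into the training set, and repeat. The sketch below is an illustration of that generic loop only, not the paper's implementation; the nearest-centroid learner, the margin-based confidence score, and the toy 2-D data are all assumptions made for the example.

```python
# Minimal self-training sketch (illustrative, not the authors' system).
# Learner: nearest-centroid classifier; confidence: distance margin
# between the two nearest class centroids.

def train(points):
    # Compute one centroid per class from (x, y, label) triples.
    sums = {}
    for x, y, label in points:
        sx, sy, n = sums.get(label, (0.0, 0.0, 0))
        sums[label] = (sx + x, sy + y, n + 1)
    return {l: (sx / n, sy / n) for l, (sx, sy, n) in sums.items()}

def predict(cents, x, y):
    # Return (label, confidence): label of nearest centroid, and the
    # margin between the nearest and second-nearest centroid distances.
    dists = sorted((((x - cx) ** 2 + (y - cy) ** 2) ** 0.5, l)
                   for l, (cx, cy) in cents.items())
    return dists[0][1], dists[1][0] - dists[0][0]

def self_train(labeled, unlabeled, rounds=5, per_round=2):
    labeled, unlabeled = list(labeled), list(unlabeled)
    for _ in range(rounds):
        if not unlabeled:
            break
        cents = train(labeled)
        # Score every unlabeled point; promote the most confident ones.
        scored = sorted(((predict(cents, x, y), (x, y))
                         for x, y in unlabeled),
                        key=lambda t: -t[0][1])
        for (label, _), (x, y) in scored[:per_round]:
            labeled.append((x, y, label))
            unlabeled.remove((x, y))
    return train(labeled)

# Toy data: two seed points, four unlabeled points near the two classes.
labeled = [(0.0, 0.0, "a"), (4.0, 4.0, "b")]
unlabeled = [(0.5, 0.2), (0.1, 0.6), (3.8, 4.1), (4.2, 3.7)]
cents = self_train(labeled, unlabeled)
print(predict(cents, 0.3, 0.3)[0])  # prints "a"
```

The paper's FS-EM variant differs in that it runs EM over the unlabeled data and adds a feature-selection step, rather than hard-labeling a few high-confidence examples per round.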