[Most-ai-contest] Cross-View Training

Chiao-Wei Hsu cwhsu at iis.sinica.edu.tw
Tue Oct 5 15:55:39 CST 2021


Dear all, 

You can check Cross-View Training in the following papers. 

1. For NER: Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning [ https://arxiv.org/pdf/2105.03654.pdf ] (Alibaba) <= This one applies to transformer-based models; I think it is quite helpful and provides an easy way to do cross-view training on BERT-like models. 
2. For sequence labeling (e.g., NER): Semi-Supervised Sequence Modeling with Cross-View Training [ https://aclanthology.org/D18-1217.pdf ] (Clark, Luong, Le, and Manning) 
3. A survey on multi-view learning: [ https://arxiv.org/abs/1304.5634 ] 
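For intuition, here is a minimal NumPy sketch of the cross-view consistency objective in paper 2: auxiliary prediction modules that see only a restricted view of the input are trained, on unlabeled data, to match the prediction of the primary module that sees the full input. The logits and module names below are made up for illustration, not taken from the paper's code.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q):
    """KL(p || q) between two discrete distributions."""
    return float(np.sum(p * np.log(p / q)))

# Hypothetical per-token logits over 3 NER labels for one unlabeled token.
logits_full = np.array([2.0, 0.5, 0.1])  # primary module: full bidirectional view
logits_aux = np.array([1.2, 0.8, 0.3])   # auxiliary module: restricted view
                                         # (e.g., forward-only context)

p_primary = softmax(logits_full)   # treated as a fixed teacher target
p_auxiliary = softmax(logits_aux)  # student; its parameters receive gradients

# Unsupervised CVT loss: push the restricted view to agree with the full view.
# In training, gradients flow only into the auxiliary module's parameters.
cvt_loss = kl_divergence(p_primary, p_auxiliary)
```

On labeled examples the primary module is trained with the usual supervised loss, and the two losses are alternated or summed across batches.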

FYI, 

Chiao-Wei 
