[Most-ai-contest] Cross-View Training
許喬為
cwhsu at iis.sinica.edu.tw
Tue Oct 5 16:11:21 CST 2021
For Cross-Lingual Machine Translation, Multi-View Cross-Lingual Structured Prediction with Minimum Supervision (ACL 2021): http://faculty.sist.shanghaitech.edu.cn/faculty/tukw/acl21mv.pdf
From: "許喬為" <cwhsu於iis.sinica.edu.tw>
To: "Most-ai-contest" <Most-ai-contest於iis.sinica.edu.tw>
Sent: Tuesday, October 5, 2021 3:55:39 PM
Subject: Cross-View Training
Dear all,
You can read about Cross-View Training in the following papers.
1. For NER, Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning: https://arxiv.org/pdf/2105.03654.pdf (Alibaba) <= This applies to transformer-based models, which I think is quite helpful and provides an easy way to do cross-view training on BERT-like models (see the sketch after this list).
2. For sequence labeling, e.g., NER, Semi-Supervised Sequence Modeling with Cross-View Training (EMNLP 2018; Kevin Clark, Minh-Thang Luong, Christopher D. Manning, Quoc V. Le): https://aclanthology.org/D18-1217.pdf
3. A Survey on Multi-view Learning: https://arxiv.org/abs/1304.5634
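Since papers 1 and 2 both come down to letting a full-view prediction supervise restricted-view predictions on unlabeled data, here is a minimal PyTorch-style sketch of that consistency loss. The function name, tensor shapes, and number of views are my own illustration, not code from either paper:

    # Minimal sketch of a cross-view consistency loss, assuming PyTorch.
    # Names and shapes are illustrative, not taken from the papers.
    import torch
    import torch.nn.functional as F

    def cross_view_loss(full_view_logits, partial_view_logits_list):
        # full_view_logits: [batch, seq_len, n_tags] from the module that
        # sees the full input; it acts as the teacher, so gradients are
        # stopped with detach().
        target = F.softmax(full_view_logits, dim=-1).detach()
        loss = full_view_logits.new_zeros(())
        for logits in partial_view_logits_list:
            # Each auxiliary module sees only a restricted view of the
            # input (e.g., forward-only context, or no external context).
            log_probs = F.log_softmax(logits, dim=-1)
            # KL(primary || auxiliary) pulls each restricted view toward
            # the full-view distribution on unlabeled sentences.
            loss = loss + F.kl_div(log_probs, target, reduction="batchmean")
        return loss

    # Toy usage: batch of 2 sentences, 5 tokens, 7 NER tags, 2 restricted views.
    full = torch.randn(2, 5, 7)
    partials = [torch.randn(2, 5, 7) for _ in range(2)]
    print(cross_view_loss(full, partials))

In training this term would be combined with the usual cross-entropy on labeled data; if I remember right, the Alibaba paper's cooperative learning uses a similar KL term between predictions with and without the retrieved external context.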
FYI,
Chiao-Wei