[Most-ai-contest] A Short Guide for Adding New Features/Embeddings to BERT's Attention Layers
張光瑜
simonc at iis.sinica.edu.tw
Wed Apr 1 17:48:28 CST 2020
Dear all,
To help you implement models that add new features/embeddings to BERT's attention layers, we have created a short guide (guide_bert_add_feature_to_attention.pptx).
You can also refer to the code of the single-span-multi-hop module (run_ssmh.py), which uses many kinds of features. A small sketch of the general idea is included below.
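The following is a minimal sketch (not code from the attached guide or run_ssmh.py) of one common way to inject an extra feature embedding into a BERT-style self-attention layer: a learned per-head bias, looked up from a feature id for each (query token, key token) pair, added to the attention scores before the softmax. All names here (FeatureBiasedSelfAttention, num_feature_ids, pair_feature_ids) are illustrative assumptions; the attached guide may take a different approach.

    import math
    import torch
    import torch.nn as nn

    class FeatureBiasedSelfAttention(nn.Module):
        def __init__(self, hidden_size=768, num_heads=12, num_feature_ids=4):
            super().__init__()
            self.num_heads = num_heads
            self.head_dim = hidden_size // num_heads
            self.query = nn.Linear(hidden_size, hidden_size)
            self.key = nn.Linear(hidden_size, hidden_size)
            self.value = nn.Linear(hidden_size, hidden_size)
            # One learned scalar bias per (feature id, head), added to the
            # raw attention scores before softmax.
            self.feature_bias = nn.Embedding(num_feature_ids, num_heads)

        def _split_heads(self, x):
            b, s, _ = x.size()
            return x.view(b, s, self.num_heads, self.head_dim).transpose(1, 2)

        def forward(self, hidden_states, pair_feature_ids):
            # hidden_states: (batch, seq, hidden)
            # pair_feature_ids: (batch, seq, seq) integer feature id for each
            # (query token, key token) pair, e.g. "same sentence" vs. "other".
            q = self._split_heads(self.query(hidden_states))
            k = self._split_heads(self.key(hidden_states))
            v = self._split_heads(self.value(hidden_states))
            scores = torch.matmul(q, k.transpose(-1, -2)) / math.sqrt(self.head_dim)
            # (batch, seq, seq, heads) -> (batch, heads, seq, seq)
            bias = self.feature_bias(pair_feature_ids).permute(0, 3, 1, 2)
            probs = torch.softmax(scores + bias, dim=-1)
            context = torch.matmul(probs, v)               # (b, h, s, d)
            b, h, s, d = context.size()
            return context.transpose(1, 2).reshape(b, s, h * d)

    # Example usage with random inputs:
    attn = FeatureBiasedSelfAttention()
    x = torch.randn(2, 16, 768)
    feat = torch.randint(0, 4, (2, 16, 16))
    out = attn(x, feat)   # shape (2, 16, 768)

Adding the feature as a score bias (rather than concatenating it to the token embeddings) keeps the hidden size and the pretrained weights unchanged, which is usually the least invasive way to extend BERT's attention.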
Feel free to ask me if you have any problems.
Regards,
張光瑜
-------------- next part --------------
A non-text attachment was scrubbed...
Name: run_ssmh.py
Type: text/x-python
Size: 44839 bytes
Desc: not available
URL: <http://www.iis.sinica.edu.tw/pipermail/most-ai-contest/attachments/20200401/a1f9c14f/attachment-0001.py>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: guide_bert_add_feature_to_attention.pptx
Type: application/vnd.openxmlformats-officedocument.presentationml.presentation
Size: 48817 bytes
Desc: not available
URL: <http://www.iis.sinica.edu.tw/pipermail/most-ai-contest/attachments/20200401/a1f9c14f/attachment-0001.bin>