Notes on Knowledge Graphs and Language Pretraining

  1. ERNIE: Enhanced Language Representation with Informative Entities. (ACL 2019). Wikipedia serves as the text-corpus input and WikiData as the knowledge-graph input; a lower-level model encodes the text, while a higher-level model integrates the knowledge information.
  2. COMET: Commonsense Transformers for Automatic Knowledge Graph Construction. (ACL 2019)
  3. KnowBERT: Knowledge Enhanced Contextual Word Representations. (EMNLP 2019)
  4. WKLM: Pretrained Encyclopedia: Weakly Supervised Knowledge-Pretrained Language Model. (ICLR 2020). A weakly supervised approach: given text linked to WikiData, some entity mentions are replaced, and during training the model predicts whether each mention was replaced; the loss is cross-entropy (a minimal sketch of this objective follows the list).
  5. K-Adapter: Infusing Knowledge into Pre-Trained Models with Adapters. (2020)
  6. KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation. (TACL 2020)
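To make the WKLM objective concrete, here is a minimal PyTorch sketch of replacement detection: the input is assumed to already contain entity mentions that may have been swapped for same-type distractors, and a linear head over each mention's representation is trained with cross-entropy to predict original vs. replaced. The encoder, dimensions, and all names here are illustrative assumptions, not the paper's released implementation.

```python
import torch
import torch.nn as nn

class WKLMSketch(nn.Module):
    """Predicts, for one entity mention per sequence, whether the
    mention text was replaced by a same-type distractor entity."""
    def __init__(self, vocab_size=30522, hidden=256, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        block = nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(block, num_layers=layers)
        self.classifier = nn.Linear(hidden, 2)  # {0: original, 1: replaced}
        self.loss_fn = nn.CrossEntropyLoss()    # the cross-entropy loss noted above

    def forward(self, input_ids, mention_pos, labels):
        h = self.encoder(self.embed(input_ids))       # (B, T, H)
        batch_idx = torch.arange(input_ids.size(0))
        mention_h = h[batch_idx, mention_pos]          # (B, H): state at each mention
        logits = self.classifier(mention_h)
        return self.loss_fn(logits, labels)

# Toy usage: two sequences, mention start positions 3 and 7.
model = WKLMSketch()
ids = torch.randint(0, 30522, (2, 16))
loss = model(ids, torch.tensor([3, 7]), torch.tensor([0, 1]))
```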

ERNIE: Enhanced Language Representation with Informative Entities

From a Tsinghua team: it learns pretrained representations by combining a KG with BERT, using a Transformer as the model. The T-Encoder takes ordinary tokens as input, while the K-Encoder layers take both tokens and entity embeddings (learned over WikiData with TransE) and fuse the two.
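The rough shape of this two-level stacking can be sketched in PyTorch as below: a token-only T-Encoder followed by K-Encoder layers whose fusion step mixes aligned entity embeddings into the token states. The dimensions, the per-token entity alignment, and the simplified fusion (the paper also runs self-attention over the entity sequence) are assumptions for illustration, not the released implementation.

```python
import torch
import torch.nn as nn

class KEncoderLayer(nn.Module):
    """One K-Encoder block: self-attention over tokens, then a fusion
    step that mixes each token with its aligned entity embedding."""
    def __init__(self, hidden=256, ent_dim=100):
        super().__init__()
        self.tok_attn = nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True)
        self.tok_proj = nn.Linear(hidden, hidden)
        self.ent_proj = nn.Linear(ent_dim, hidden)
        self.tok_out = nn.Linear(hidden, hidden)
        self.ent_out = nn.Linear(hidden, ent_dim)

    def forward(self, tok, ent, ent_mask):
        tok = self.tok_attn(tok)                               # (B, T, H)
        # ent_mask (B, T, 1) zeroes positions with no aligned entity.
        fused = torch.tanh(self.tok_proj(tok) + self.ent_proj(ent) * ent_mask)
        return self.tok_out(fused), self.ent_out(fused)

class ERNIESketch(nn.Module):
    def __init__(self, vocab=30522, hidden=256, ent_dim=100,
                 t_layers=4, k_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        block = nn.TransformerEncoderLayer(hidden, nhead=4, batch_first=True)
        self.t_encoder = nn.TransformerEncoder(block, num_layers=t_layers)
        self.k_encoder = nn.ModuleList(
            KEncoderLayer(hidden, ent_dim) for _ in range(k_layers))

    def forward(self, input_ids, ent_emb, ent_mask):
        tok = self.t_encoder(self.embed(input_ids))   # T-Encoder: tokens only
        for layer in self.k_encoder:                  # K-Encoder: tokens + entities
            tok, ent_emb = layer(tok, ent_emb, ent_mask)
        return tok

# Toy usage: entity embeddings aligned per token; mask marks mention positions.
model = ERNIESketch()
ids = torch.randint(0, 30522, (2, 16))
ents = torch.randn(2, 16, 100)
mask = torch.zeros(2, 16, 1); mask[:, 3] = 1.0
out = model(ids, ents, mask)                          # (2, 16, 256)
```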
