By modeling context information, ELMo and BERT have improved the state of the art in word representation and demonstrated their effectiveness on the named entity recognition task. In this paper, in addition to such context modeling, we propose to encode prior knowledge of entities from an external knowledge base into the representation, and introduce a Knowledge-Graph Augmented Word Representation (KAWR) for named entity recognition. KAWR provides a knowledge-aware representation for words by 1) encoding entity information from a pre-trained KG embedding model with a new recurrent unit (GERU), and 2) strengthening context modeling from the knowledge perspective by providing a relation attention scheme based on...
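The abstract above describes gating pre-trained KG entity embeddings into the word representation (the GERU unit), but it does not give GERU's equations. The snippet below is therefore only a minimal illustrative sketch of such gated fusion; the function name `knowledge_gated_fusion` and the parameters `W_g` and `U_g` are hypothetical, not KAWR's actual formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def knowledge_gated_fusion(word_vec, entity_vec, W_g, U_g):
    """Blend a contextual word vector with a pre-trained KG entity
    embedding via an elementwise sigmoid gate: the gate decides, per
    dimension, how much entity knowledge enters the representation."""
    g = sigmoid(W_g @ word_vec + U_g @ entity_vec)   # gate in (0, 1)
    return g * entity_vec + (1.0 - g) * word_vec     # convex combination

# Toy usage with random vectors and small random gate weights.
rng = np.random.default_rng(0)
d = 8
w = rng.standard_normal(d)            # contextual word representation
e = rng.standard_normal(d)            # KG entity embedding for that word
W_g = 0.1 * rng.standard_normal((d, d))
U_g = 0.1 * rng.standard_normal((d, d))
h = knowledge_gated_fusion(w, e, W_g, U_g)
print(h.shape)  # (8,)
```

Because the gate is a sigmoid, each output dimension is a convex combination of the word and entity values, so the fused vector always lies between the two inputs elementwise.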
An enormous amount of digital information is expressed as natural-language (NL) text that is not eas...
Previous knowledge graph embedding approaches usually map entities to representations and utilize sc...
Human communication is inevitably grounded in the real world. Existing work on natural language proc...
Recent years have witnessed the emergence of novel models fo...
Representation learning (RL) of knowledge graphs aims to project both entities and relations into a ...
Knowledge Graphs (KGs) have become increasingly popular in recent years. However, as knowledge c...
Knowledge Graphs (KGs) such as Freebase and YAGO have been widely adopted in a variety of NLP tasks....
Most of the existing knowledge graph embedding models are supervised methods that rely largely on ...
Representation learning of knowledge graphs has its own specialty, indicating the difference between...
Knowledge bases, and their representations in the form of knowledge graphs (KGs), are naturally inco...
Most state-of-the-art approaches for named-entity recognition (NER) use semi-supervised information ...
Nowadays, Knowledge Graphs (KGs) have become invaluable for various applications such as named entit...
Pre-trained language representation models, such as BERT, capture a general language representation ...