This dataset is a dump of Wikidata in JSON format, produced on January 4, 2016 by the Wikimedia Foundation. Wikidata historical dumps are not preserved by the Wikimedia Foundation. Thus, this dump is distributed here in order to make our experiments repeatable over time.
RDF dump of Wikidata produced with WDumper. Entity count: 0, statement count: 0, tripl...
Slides for the final version of a presentation given several times in February 2016, during a tour o...
We maintain a wiki comparison dataset (which we used to call a wiki segmentation dataset) to show a ...
Wikidata dump retrieved from https://dumps.wikimedia.org/wikidatawiki/entities/latest-all.json.bz2 o...
A copy of a dump which was available from Wikimedia: https://dumps.wikimedia.org/wikidatawiki/entiti...
The Wikidata knowledge base provides a public infrastructure for creating and syndicating machine-re...
This is a collection of pre-processed wikidata jsons which were used in the creation of CSQA dataset...
Wikipedia plain text data obtained from Wikipedia dumps with WikiExtractor in February 2018. The ...
The Wikidata knowledge base provides a public infrastructure for machine-readable metadata about com...
Wikidata is the newest project of the Wikimedia Foundation (WMF), the non-profit U.S.-based foundati...
RDF dump of Wikidata produced with WDumper. Artworks with a Joconde ID (Joconde is a French database ...
This dataset contains the Wikidata of Feb 23, 2015, codified in four alternative schemes for our wo...
WDumper is a third-party tool that enables users to create custom dumps of Wikidata. Here we create some t...
This is the so-called "truthy" dump of Wikidata from on or about August 28, 2021, shared for usage i...