This dataset consists of the complete revision history of every instance of the 100 most important classes in Wikidata. It contains 9.3 million classes and around 450 million revisions made to those classes. The dataset was exported from a MongoDB database. After decompressing the files, the resulting JSON files can be imported into MongoDB with the following commands:

    mongoimport --db=db_name --collection=wd_entities --file=wd_entities.json
    mongoimport --db=db_name --collection=wd_revisions --file=wd_revisions.json

Replace db_name with the name of the database into which the data will be imported. Documents in the wd_entities collection have the following schema: id: Internal id of the entity used by Wikidata (e.g. 8195238). en...
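As a quick sanity check after the import, the following is a minimal sketch, assuming a MongoDB server on localhost and the same db_name used in the commands above; the id field and the example value 8195238 come from the schema description, while everything else is illustrative:

```python
from pymongo import MongoClient

# Connect to a local MongoDB server; adjust the URI and the database
# name ("db_name") to match the mongoimport commands above.
client = MongoClient("mongodb://localhost:27017/")
db = client["db_name"]

# Verify that both collections were populated by the import.
print("entities: ", db.wd_entities.count_documents({}))
print("revisions:", db.wd_revisions.count_documents({}))

# Fetch one entity by its internal Wikidata id (the `id` field from the
# schema; 8195238 is the example value given in the description).
print(db.wd_entities.find_one({"id": 8195238}))
```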
This dataset represents structured metadata and contextual information about references adde...
This is a collection of pre-processed Wikidata JSONs that were used in the creation of the CSQA dataset...
The Wikidata knowledge base provides a public infrastructure for machine-readable metadata about com...
This dataset includes the historical versions of all individual references per article in the Englis...
WDumper is a third-party tool that enables users to create custom dumps of Wikidata. Here we create some t...
Wikidata is the newest project of the Wikimedia Foundation (WMF), the non-profit U.S.-based foundati...
Wikis are popular tools commonly used to support distributed collaborative work. Wikis can be seen as...
Wikipedia is written in the wikitext markup language. When serving content, the MediaW...
These datasets are used to support the results of the paper "Datasets for Creating and Querying Pers...
This version of Widoco creates an automated changelog section for your ontology. For all classes, pro...
This dataset contains about 600,000 Recent-Changes fetched from the Wikipedia API. All ent...
We present an open-source toolkit which allows one (i) to reconstruct past states of Wikipedia, and (ii)...
In its 7 years of existence, Wikidata has accumulated an edit history of milli...
Files in this dataset have been produced during Flexibility experiments of Wikidata subsetting pract...
WD50K dataset: A hyper-relational dataset derived from Wikidata statements. The dataset is constru...