#wdqs — Public Fediverse posts
Live and recent posts from across the Fediverse tagged #wdqs, aggregated by home.social.
-
Just learned from @vrandecic's presentation (https://videolectures.net/videos/iswc2025_nara_vrandecic_wikipedia_future) that there is a #wikidata #qlever interface mimicking the #WDQS GUI. So here you've got native support for visualisations etc. -> https://wikidata-query-gui.scholia.wiki/
The interface offers drop-down autocompletion but no implicit PREFIXes, so you have to add those yourself. Not a perfect replacement for WDQS, but close.
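For illustration, a sketch of the common prefixes that WDQS declares implicitly and that you would need to spell out on an interface without implicit prefixes (the query itself is just a placeholder example):

```sparql
# Prefixes WDQS declares implicitly; add them by hand elsewhere.
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX p: <http://www.wikidata.org/prop/>
PREFIX ps: <http://www.wikidata.org/prop/statement/>
PREFIX pq: <http://www.wikidata.org/prop/qualifier/>
PREFIX wikibase: <http://wikiba.se/ontology#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

# Placeholder query: ten humans with their English labels.
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q5 ;
        rdfs:label ?itemLabel .
  FILTER(LANG(?itemLabel) = "en")
}
LIMIT 10
```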
-
Done mapping all 25 #barangays of Minalabac, Camarines Sur, #Philippines 🇵🇭 in #OpenStreetMap, creating/updating their #Wikidata items, and linking the two with each other.
Wanna play around? Here is the Overpass Turbo query: https://overpass-turbo.eu/s/2mza
And here is the Wikidata Query Service (#WDQS) query: https://w.wiki/K6ne
Previously: https://en.osm.town/@seav/114750346738105697
-
That's a list I've been looking for for some time: https://www.mediawiki.org/wiki/Wikibase/Indexing/RDF_Dump_Format#WDQS_data_differences
It shows the differences between the RDF source of a Wikidata item and the way it is actually stored in the RDF used by the #Wikidata Query Service #WDQS (and in #qlever as well, apparently).
I'd stumbled over the fact that "?item a wikibase:Item" doesn't return any results. The link above explains why.
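For illustration, a sketch of one common workaround (not the only option): in WDQS the rdf:type triples from the dump are dropped, but every item carries a wikibase:sitelinks count, so you can match on that instead.

```sparql
PREFIX wikibase: <http://wikiba.se/ontology#>

# "?item a wikibase:Item" returns nothing in WDQS because the
# rdf:type triples present in the dump are stripped at load time.
# Workaround: match items via their sitelink count instead.
SELECT ?item WHERE {
  ?item wikibase:sitelinks ?n .
}
LIMIT 10
```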
-
Maybe late to the party, but I recently learned that the #Wikidata query service (#WDQS) now technically enforces descriptive user agents, as mandated by the Wikimedia User-Agent Policy: https://foundation.wikimedia.org/wiki/Policy:Wikimedia_Foundation_User-Agent_Policy
In various places I've read that including an e-mail address in the user agent is required by the Wikimedia User-Agent Policy. But according to the policy, a project URL or similar would also be fine, right?
-
📢 #ABECTO version 3.1.5 has been released:
🔗 https://github.com/fusion-jena/abecto/releases/tag/v3.1.5
ABECTO is an #OpenSource #CLI tool that compares #RDF graphs to spot errors 🪲 and assess completeness 📊, intended for use in #CICD pipelines.
Versions 3.1.3, 3.1.4, & 3.1.5 add rudimentary HTTP rate-limit handling, some bug fixes, and (most importantly) a descriptive HTTP user agent to comply with the #Wikidata query service (#WDQS) user-agent policy 🛂.
-
OK, it now seems that #qlever has mapped the data namespace (data:Q42 a schema:Dataset) onto the wd namespace (wd:Q42 a wikibase:Item), which makes it more compatible with #wikidata. I still find it a bit confusing, as it differs from the RDF source. If you want to get, say, the number of sitelinks of an entity, you must now use "wd:Q42 wikibase:sitelinks ?n" in your #SPARQL, both in QLever and in #WDQS. Previously, IIRC, you had to write "?dataset schema:about wd:Q42 ; wikibase:sitelinks ?n".
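A minimal sketch of the current form, which should run on both endpoints under the mapping described above:

```sparql
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wikibase: <http://wikiba.se/ontology#>

# Sitelink count now lives directly on the entity IRI,
# both in WDQS and in QLever's Wikidata endpoint.
SELECT ?n WHERE {
  wd:Q42 wikibase:sitelinks ?n .
}
```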
-
Tonight (20:00–21:00) in the Free Knowledge Habitat: Wikidata Live Querying! We'll come up with interesting SPARQL queries for Wikidata together ^^ (come in numbers, because on my own I won't have that many ideas :blobfoxwinkmlem:) https://pretalx.wikimedia.de/39c3-2025/talk/VTELBK/
-
if you want to know more about Wikidata, Andrew McAllister and I are doing an Intro to SPARQL and Wikidata Query Service at 8:30pm today, also in the Free Knowledge Habitat: https://pretalx.wikimedia.de/39c3-2025/talk/RNWUH8/
-
Yesterday in the #SPARQL workshop the question came up why one should still use the #WDQS service at all instead of going straight to the more performant #wikidata endpoint from #qlever. For me, the current reasons are:
- Autocomplete works better in #wdqs, in the sense that it can be triggered more deliberately
- more options for visualizing results
- readily usable code snippets
- more up-to-date data
Often I write my queries in WDQS and then run them in QLever.
-
I'll try and ask here as well: Does anybody know whether there's an overview of where #WDQS differs from standard #SPARQL? I mean, e.g., the mapping of the data namespace onto wdt at the RDF level, the different handling of dates (now() - ?date is possible in WDQS), etc. With alternatives such as QLever becoming more popular, it would be useful to know what works only in WDQS and what doesn't.
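For illustration, a hedged sketch of the date-arithmetic extension mentioned above (on WDQS, subtracting dates yields a number of days; this is a Blazegraph extension, not standard SPARQL, and other engines may reject it):

```sparql
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

# Days elapsed since each item's inception date (P571).
# NOW() - ?date works on WDQS but is not standard SPARQL.
SELECT ?item ?days WHERE {
  ?item wdt:P571 ?date .
  BIND(NOW() - ?date AS ?days)
}
LIMIT 10
```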
-
It took me some time to make sense of the paragraph on how to remove statements using #quickstatements here: https://www.wikidata.org/wiki/Help:QuickStatements#Removing_statements
Turns out it's actually not too difficult to generate QS-compatible output directly in #WDQS. Here's a template.
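For illustration, a hedged sketch of such a template (not the exact query from the post; the P31/Q5 selection is a placeholder). QuickStatements removes a statement when the command line starts with "-":

```sparql
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

# Emit QuickStatements removal commands of the form "-Qid<TAB>Pid<TAB>Qid".
SELECT ?command WHERE {
  ?item wdt:P31 wd:Q5 .                       # placeholder selection
  BIND(CONCAT("-", STRAFTER(STR(?item), "/entity/"),
              "\tP31\tQ5") AS ?command)
}
LIMIT 10
```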
-
Done mapping all 15 #barangays of Bien Unido, Bohol, #Philippines 🇵🇭 in #OpenStreetMap, updating their #Wikidata items, and linking the two with each other.
Wanna play around? Here is the Overpass Turbo query: https://overpass-turbo.eu/s/26MA
And here is the Wikidata Query Service (#WDQS) query: https://w.wiki/EZse
Previously: https://en.osm.town/@seav/114607515064120827
-
Done mapping all 10 #barangays of Hadji Muhtamad, Basilan, #Philippines 🇵🇭 in #OpenStreetMap, creating their #Wikidata items, and linking the two with each other.
Wanna play around? Here is the Overpass Turbo query: https://overpass-turbo.eu/s/25fA
And here is the Wikidata Query Service (#WDQS) query: https://w.wiki/ELwH
Previously: https://en.osm.town/@seav/114438447195551150
-
@thibaultmol @nemobis the #WDQS database was recently split, but this is not a sustainable solution because both parts will themselves keep growing. I'm not actually sure whether the split is even working as intended.
https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/WDQS_graph_split
-
The Wikidata Query Service is being split!
In the future, scholarly articles and general content will be queried separately! General-content queries will continue to use the original Query Service: https://query.wikidata.org/
Scholarly-article queries will use the Scholarly Query Service: https://query-scholarly.wikidata.org/
This is currently in testing; for details, see: https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/WDQS_graph_split
-
Now at the latest, with the #loc also in the crosshairs of Trumpism, it is probably worth considering which LOC services/projects you would like to still have available in the future. Here are the things #wikidata knows about:
(Alternative via #qlever, in case #wdqs runs into a timeout: https://qlever.cs.uni-freiburg.de/wikidata/eKWkMF)
-
I got curious to see what’s the longest named-after (P138) chain recorded in @wikidata and I found the following non-fictional list of 8 items using #WDQS:
1. USS Indianapolis
2. Indianapolis
3. Indiana
4. Indiana Territory
5. American Indians
6. Indians
7. India
8. Indus River
There are likely longer chains, but they are not yet recorded in #Wikidata.
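For illustration, a hedged sketch of how such a chain can be queried (my reconstruction, not necessarily the poster's query): chain the "named after" (P138) property at a fixed depth, then increase the depth until the query returns no results.

```sparql
PREFIX wdt: <http://www.wikidata.org/prop/direct/>

# A chain of "named after" (P138) hops with a fixed depth of 3.
# To find the longest recorded chain, keep increasing the depth
# until no results come back.
SELECT ?a ?b ?c ?d WHERE {
  ?a wdt:P138 ?b .
  ?b wdt:P138 ?c .
  ?c wdt:P138 ?d .
}
LIMIT 10
```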
-
For my upcoming « Mémento SPARQL pour Wikidata », I had to make a small annotated visual of the query interface https://commons.wikimedia.org/wiki/File:WDQS_Interface_fr.jpg #Wikidata #SPARQL #WDQS
-
Wikidata nerds: how do I also capture the geocoordinates when they are "hidden" in a qualifier on a property? In this concrete example, P626 is hidden behind P159 (headquarters location)... Sample item: https://www.wikidata.org/wiki/Q3777245 #wikidata #wdqs
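For illustration, a hedged sketch of the usual qualifier pattern: go through the statement node instead of the truthy wdt: shortcut. (Note: the coordinate-location property on Wikidata is P625; the P626 in the post looks like a typo.)

```sparql
PREFIX wd: <http://www.wikidata.org/entity/>
PREFIX p: <http://www.wikidata.org/prop/>
PREFIX ps: <http://www.wikidata.org/prop/statement/>
PREFIX pq: <http://www.wikidata.org/prop/qualifier/>

# Coordinates stored as a qualifier on the headquarters-location
# statement (P159), reachable via the statement node.
SELECT ?hq ?coord WHERE {
  wd:Q3777245 p:P159 ?stmt .
  ?stmt ps:P159 ?hq ;
        pq:P625 ?coord .   # P625 = coordinate location, as a qualifier
}
```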
-
For today’s #MappyMondays, I’m trying something I have never done before: constructing WKT literals using the #Wikidata Query Service (#WDQS) in order to draw linestrings on a map!
So last week I added all of the routes flown by Air New Zealand and Air Chathams into Wikidata and now I get to see these airlines’ route maps using WDQS.
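For illustration, a hedged sketch of the WKT-construction trick (my reconstruction, not the poster's query; the IATA codes and route modeling are placeholder assumptions): splice two Point coordinates into a LINESTRING literal, which the WDQS map view can then render.

```sparql
PREFIX wdt: <http://www.wikidata.org/prop/direct/>
PREFIX geo: <http://www.opengis.net/ont/geosparql#>

# Draw a line between two airports by splicing their P625 point
# coordinates into a WKT LINESTRING literal.
SELECT ?line WHERE {
  ?from wdt:P238 "AKL" ;     # airports selected by IATA code (P238);
        wdt:P625 ?a .        # these codes are placeholder examples
  ?to   wdt:P238 "WLG" ;
        wdt:P625 ?b .
  BIND(STRDT(CONCAT("LINESTRING(",
         STRBEFORE(STRAFTER(STR(?a), "Point("), ")"), ", ",
         STRBEFORE(STRAFTER(STR(?b), "Point("), ")"), ")"),
       geo:wktLiteral) AS ?line)
}
```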
(Maps are based on #OpenStreetMap; © OSM contributors.)
1/2
-
The Wikimedia Search Platform / Query Service Team holds an open office hour/meeting the first Wednesday of each month.
Details for the next one on May 3: