home.social

#wdqs — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #wdqs, aggregated by home.social.

  1. Just learned from @vrandecic's presentation (videolectures.net/videos/iswc2) that there is a #wikidata #qlever interface mimicking the #WDQS GUI. So here you've got native support for visualisations etc. -> wikidata-query-gui.scholia.wik

    The interface has the drop-down autocompletion but no implicit PREFIXes, so you must add these on your own. Not a perfect replacement for WDQS, but close.
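
    For reference, a minimal sketch of what declaring the prefixes yourself looks like (WDQS injects these automatically; on the QLever GUI you spell them out):

    ```sparql
    # Prefixes that WDQS adds implicitly but the QLever GUI does not:
    PREFIX wd:  <http://www.wikidata.org/entity/>
    PREFIX wdt: <http://www.wikidata.org/prop/direct/>

    SELECT ?item WHERE {
      ?item wdt:P31 wd:Q5 .   # instances of human (Q5)
    } LIMIT 5
    ```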

  2. Done mapping all 25 #barangays of Minalabac, Camarines Sur, #Philippines 🇵🇭 in #OpenStreetMap, creating/updating their #Wikidata items, and linking the two with each other.

    Wanna play around? Here is the Overpass Turbo query: overpass-turbo.eu/s/2mza

    And here is the Wikidata Query Service (#WDQS) query: w.wiki/K6ne

    Previously: en.osm.town/@seav/114750346738

    #LinkedOpenData #OpenData #Mapstodon #gischat

  3. That's a list I've been looking for for some time: mediawiki.org/wiki/Wikibase/In

    It shows the differences between the RDF source of a Wikidata item and the way it's stored in the RDF actually used in #Wikidata Query Service #WDQS (and in #qlever as well, apparently).

    I'd stumbled over the fact that "?item a wikibase:Item" doesn't return any results. The link above explains why.
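
    A sketch of a workaround, assuming you only need to match items: since the dump carries no `wikibase:Item` typing, you can key off a count predicate that items carry, such as the sitelink count:

    ```sparql
    # "?item a wikibase:Item" finds nothing in the WDQS dump.
    # Items can instead be matched via wikibase:sitelinks,
    # which is attached directly to the entity:
    SELECT ?item ?n WHERE {
      ?item wikibase:sitelinks ?n .
    } LIMIT 5
    ```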

  4. Maybe late to the party, but recently I learned that the #Wikidata query service (#WDQS) now technically enforces descriptive user-agents, as mandated by the Wikimedia User-Agent Policy: foundation.wikimedia.org/wiki/

    In several places I've read that including an e-mail address in the user-agent is required by the Wikimedia User-Agent Policy. But according to the policy, a project URL or similar would also be fine, right?

    @wikidata

  5. 📢 #ABECTO version 3.1.5 has been released:
    🔗 github.com/fusion-jena/abecto/

    ABECTO is an #OpenSource #CLI tool that compares #RDF graphs to spot errors 🪲 and assess completeness 📊, intended for use in #CICD pipelines.

    Versions 3.1.3, 3.1.4, and 3.1.5 add rudimentary HTTP rate-limit handling, some bug fixes, and (most importantly) a descriptive HTTP user-agent to comply with the #Wikidata query service (#WDQS) user-agent policy 🛂.

  6. OK, it now seems as if #qlever mapped the data namespace (data:Q42 a schema:Dataset) onto the wd namespace (wd:Q42 a wikibase:Item), which makes it more compatible with #wikidata. I still find it a bit confusing, as it differs from the RDF source. If you want to get, say, the number of sitelinks of an item, you must now use "wd:Q42 wikibase:sitelinks ?n" in your #SPARQL, both in qlever and in #WDQS. Previously, IIRC, you had to do "?dataset schema:about wd:Q42 . ?dataset wikibase:sitelinks ?n".
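
    Side by side, the two patterns described above (the newer form as I understand the current mapping; the older one from memory):

    ```sparql
    # Current mapping: sitelink count directly on the entity
    # (works on both QLever and WDQS):
    SELECT ?n WHERE { wd:Q42 wikibase:sitelinks ?n . }

    # Older QLever mapping: the count hung off the data: dataset node:
    # SELECT ?n WHERE { ?dataset schema:about wd:Q42 ;
    #                            wikibase:sitelinks ?n . }
    ```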

  7. Tonight (20:00–21:00) in the Free Knowledge Habitat: Wikidata Live Querying! We'll come up with interesting SPARQL queries for Wikidata together ^^ (come in large numbers, because on my own I won't have that many ideas :blobfoxwinkmlem:) pretalx.wikimedia.de/39c3-2025

    #39c3 #Wikimedia #Wikidata #WDQS #SPARQL

  8. if you want to know more about Wikidata, Andrew McAllister and I are doing an Intro to SPARQL and Wikidata Query Service at 8:30pm today, also in the Free Knowledge Habitat: pretalx.wikimedia.de/39c3-2025

    #39c3 #Wikimedia #Wikidata #WDQS

  9. In yesterday's #SPARQL workshop the question came up why one should still use the #WDQS service at all, instead of going straight to the more performant #wikidata endpoint from #qlever. For me, the current reasons are:

    - autocomplete works better in #wdqs, in that you can trigger it more deliberately
    - more options for visualizing results
    - readily usable code snippets
    - more up-to-date data

    I often write my queries in WDQS and then run them in QLever.

  10. I'll try asking here as well: does anybody know if there's an overview of where #WDQS differs from standard #SPARQL? I mean e.g. the mapping of the data namespace onto wdt at the RDF level, different handling of dates (now() - ?date is possible in WDQS), etc. With alternatives such as QLever becoming more popular, it'd be useful to know what works only in WDQS and what doesn't.

    #wikidata
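
    One concrete divergence, as a sketch: Blazegraph (the engine behind WDQS) lets you subtract `xsd:dateTime` values directly, which standard SPARQL 1.1 does not define:

    ```sparql
    # WDQS/Blazegraph extension: NOW() - ?date yields a number of days.
    # Standard SPARQL 1.1 has no "-" operator for dateTime values,
    # so this is likely to fail on a strictly standard engine.
    SELECT ?person ?birth ((NOW() - ?birth) AS ?days) WHERE {
      ?person wdt:P31 wd:Q5 ;        # human
              wdt:P569 ?birth .      # P569 = date of birth
    } LIMIT 5
    ```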

  11. It took me some time to make sense of the paragraph on how to remove statements using #quickstatements here: wikidata.org/wiki/Help:QuickSt

    Turns out it's actually not too difficult to generate QS-compatible output directly in the #wdqs. Here's a template.

    #wikidata
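
    A sketch of what such a template can look like (the exact separator depends on the QuickStatements input mode; the property and value here are only illustrative):

    ```sparql
    # Build QuickStatements "remove statement" commands of the shape
    # -Qitem|Pprop|Qvalue directly in the result column:
    SELECT (CONCAT("-",
                   STRAFTER(STR(?item), "entity/"), "|P31|",
                   STRAFTER(STR(?value), "entity/")) AS ?command) WHERE {
      ?item wdt:P31 ?value .
      VALUES ?value { wd:Q4167836 }   # illustrative target value
    } LIMIT 10
    ```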

  12. Done mapping all 15 #barangays of Bien Unido, Bohol, #Philippines 🇵🇭 in #OpenStreetMap, updating their #Wikidata items, and linking the two with each other.

    Wanna play around? Here is the Overpass Turbo query: overpass-turbo.eu/s/26MA

    And here is the Wikidata Query Service (#WDQS) query: w.wiki/EZse

    Previously: en.osm.town/@seav/114607515064

    #LinkedOpenData #OpenData #Mapstodon #gischat

  13. Done mapping all 10 #barangays of Hadji Muhtamad, Basilan, #Philippines 🇵🇭 in #OpenStreetMap, creating their #Wikidata items, and linking the two with each other.

    Wanna play around? Here is the Overpass Turbo query: overpass-turbo.eu/s/25fA

    And here is the Wikidata Query Service (#WDQS) query: w.wiki/ELwH

    Previously: en.osm.town/@seav/114438447195

    #LinkedOpenData #OpenData #Mapstodon #gischat

  14. @thibaultmol @nemobis the #WDQS database was recently split, but this is not a sustainable solution because the two parts will eventually grow too big as well. I'm not actually sure whether the split is even working as intended.

    wikidata.org/wiki/Wikidata:SPA

    #Wikidata #SPARQL #RDF

  15. The Wikidata Query Service is being split!
    In the future, scholarly articles and general content will be queried separately!

    General content will continue to use the original Query Service: query.wikidata.org/
    Scholarly-article queries will use the Scholarly Query Service: query-scholarly.wikidata.org/

    This is currently in testing; for details see: wikidata.org/wiki/Wikidata:SPA

    #Wikidata #維基資料 #維基數據
    #QueryService #WDQS

  16. Now at the latest, with the #loc also in the crosshairs of Trumpism, it's probably worth thinking about which LOC services/projects you'd like to still have available in the future. Here are the things #wikidata knows about:

    w.wiki/E3yD

    (Alternative via #qlever, in case #wdqs runs into a timeout: qlever.cs.uni-freiburg.de/wiki)

  17. I got curious to see what the longest named-after (P138) chain recorded in @wikidata is, and I found the following non-fictional chain of 8 items using #WDQS:

    1. USS Indianapolis
    2. Indianapolis
    3. Indiana
    4. Indiana Territory
    5. American Indians
    6. Indians
    7. India
    8. Indus River

    There are likely longer chains, but they are not yet recorded in #Wikidata.

    #etymology
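
    A sketch of how such a chain can be walked with a property path (the start item here is a placeholder; ranking chains by length needs post-processing outside SPARQL):

    ```sparql
    # Follow one or more "named after" (P138) hops from a start item.
    SELECT ?step ?stepLabel WHERE {
      wd:Q42 wdt:P138+ ?step .   # replace wd:Q42 with your start item
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    ```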

  18. For my upcoming "Mémento SPARQL pour Wikidata" (a SPARQL cheat sheet for Wikidata), I had to make a small visual annotating the query interface: commons.wikimedia.org/wiki/Fil #Wikidata #SPARQL #WDQS

  19. Wikidata nerds: how do I also capture the geocoordinates when they are "hidden" in a qualifier of a property? In this concrete example, P625 is hidden behind P159 (headquarters location)... Example item: wikidata.org/wiki/Q3777245 #wikidata #wdqs
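
    A sketch of the qualifier pattern for this case, using the statement/qualifier prefixes `p:`/`pq:` and the coordinate-location property P625:

    ```sparql
    # Coordinates (P625) attached as a qualifier of headquarters location (P159):
    SELECT ?item ?coord WHERE {
      VALUES ?item { wd:Q3777245 }
      ?item p:P159 ?statement .     # full statement node, not the direct value
      ?statement pq:P625 ?coord .   # qualifier: coordinate location
    }
    ```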

  20. For today’s #MappyMondays, I’m trying something I have never done before: constructing WKT literals using the #Wikidata Query Service (#WDQS) in order to draw linestrings on a map!

    So last week I added all of the routes flown by Air New Zealand and Air Chathams into Wikidata, and now I get to see these airlines’ route maps using WDQS.

    (Maps are based on #OpenStreetMap; © OSM contributors.)

    1/2

    #AVGeek #OpenData #NewZealand #aviation
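
    One way such WKT construction can be sketched (a hypothetical pattern, assuming the GeoSPARQL `geof:latitude`/`geof:longitude` convenience functions and the predefined `geo:` prefix on WDQS; bind ?from and ?to to your airport items):

    ```sparql
    # Concatenate two coordinate values (P625) into a WKT LINESTRING
    # literal that the WDQS map view can render. WKT order is lon lat.
    SELECT ?line WHERE {
      ?from wdt:P625 ?c1 .
      ?to   wdt:P625 ?c2 .
      BIND(STRDT(CONCAT("LINESTRING(",
             STR(geof:longitude(?c1)), " ", STR(geof:latitude(?c1)), ", ",
             STR(geof:longitude(?c2)), " ", STR(geof:latitude(?c2)), ")"),
           geo:wktLiteral) AS ?line)
    } LIMIT 1
    ```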

  21. The Wikimedia Search Platform / Query Service Team holds an open office hour/meeting the first Wednesday of each month.

    Details for the next one on May 3:

    lists.wikimedia.org/hyperkitty

    #Wikimedia #MediaWiki #search #Wikidata #WDQS