home.social

#nwb — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #nwb, aggregated by home.social.

  1. Generated a quick plot to see usage trends for Open Source Brain v2 (#OSBv2): an integrated #research platform for #neuroscience that indexes multiple model and data sources (#DANDI, #ModelDB, #Biomodels, #Github) and provides compute resources on the #cloud in "workspaces". It also includes specialist applications: #NWBExplorer for working with data in the NeuroData Without Borders (#NWB) format; #NetPyNE-UI for biophysically detailed #ComputationalModelling and a #JupyterLab environment.

  6. Hello from the IC train from Hannover to Amsterdam Central. So far the #Bahn has been reliable; I hope it stays that way until Osnabrück. And may the #NWB towards Wildeshausen follow suit.

  7. So now we're sitting in the #NWB, the last #Zug of our journey. By the way, this is also the first time our tickets have been checked on the trip from #Bielefeld to #Köln and back.

  8. Missed the #NWB #RS2, of course. Next time I'll inevitably have to take the #Auto again, pointlessly burn 20 litres of #Benzin, and produce #Feinstaub and #Mikroplastik.

    @Bundesregierung we need the #Verkehrswende!

  9. Check out these two upcoming neuroscience events hosted by DataJoint:

    1. Set up data management & analysis for neurophysiology studies using DataJoint Elements & other open-source tools
    neurodatawithoutborders.github

    2. Further NWB's software ecosystem, including the data standard, core NWB software, and community tools.
    try.datajoint.com/sciops-summi

    #neuroscience #neuroinformatics #NWB #NeurodataWithoutBorders #DataJoint #OpenScience #FAIR #OpenNeuro

  10. Applications open for Neurodata Without Borders' #NeuroDataReHack 2024!

    Get trained on:
    - the DANDI Archive's open neurophysiology datasets
    - maximizing the archive & NWB
    - using the above to incorporate existing data into your workflows

    Apply by Mar 1: bit.ly/NWBNeuroDataReHack2024

    #neuroscience #neuroinformatics #OpenData #OpenScience #neuroinformagical #NWB #NeurodataWithoutBorders #DANDI #DANDIArchive

  14. Open for registration: #NeurodataWithoutBorders' Open Neurodata Showcase
    Jun 26
    Virtual

    Check out this virtual poster session for data contributors & those interested in reusing existing neurophysiology data, hosted on Gather!

    ☕️ Engage in discussions
    🔍 Explore Dandisets
    🎤 Showcase your project

    Anyone interested in using open neurophysiology datasets is welcome!

    Register here: neurodatawithoutborders.github

    #neuroscience #neuroinformatics #neuroinformagical #OpenData #OpenScience #NWB

  21. Continuing with the #NWB; this time it's 5 minutes of #verspatung (delay), completely fine given the weather.

  22. Hey neuro folks,

    In case this is useful to anyone, we have made a Python package to convert raw data from SpikeGadgets' Trodes system to NWB files: github.com/LorenFrankLab/trode

    This will directly read the .rec and associated files and convert them to the NWB format (no intermediate files). It will also check the NWB file at the end for compatibility with upload to the DANDI archive.

    Thanks to Samuel Bray and Ryan Ly for doing a lot of the work on this.

    #Spikegadgets #NWB

  23. @tdverstynen
    @neuralreckoning
    So e.g. I'm working on an independent implementation of #NWB right now that can break it out of HDF5 files and make the storage arbitrary while preserving interop. One of the major challenges of NWB (IMO) is that the HDF5 format makes it very difficult to index across metadata: you can't just grab some small piece of metadata across a big range of datasets, because you need to download all of them in their entirety. It's also quite difficult to manipulate individual pieces during the multiple stages of an experiment. So, e.g., you might want to collect data at various points from different instruments, later add in some analysis, link to code, etc. This is technically possible, but it requires lugging around a big file to different machines or setting up some server architecture and custom write tooling.

    Still, it does a number of things really excellently: standard naming conventions, strict schematized typing and array specifications, intra- (and soon extra-) dataset references, etc. So there are all these tricky balances.
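    The indexing pain described above can be sketched outside HDF5: if each file's metadata is extracted once into a small flat index, one field can be queried across many datasets without downloading any full file. A minimal stdlib-only sketch; the nested dicts standing in for NWB metadata are invented for illustration:

    ```python
    # Sketch: build a flat "sidecar" metadata index across many datasets so
    # a single field can be queried without loading each full file. The
    # nested dicts below are hypothetical stand-ins for NWB-style metadata.

    def flatten(meta, prefix=""):
        """Flatten nested metadata into dotted-path keys."""
        flat = {}
        for key, value in meta.items():
            path = f"{prefix}.{key}" if prefix else key
            if isinstance(value, dict):
                flat.update(flatten(value, path))
            else:
                flat[path] = value
        return flat

    # One small record per dataset -- the index, built once per file.
    datasets = {
        "dandiset_000001": {"session": {"species": "mouse", "duration_s": 1800}},
        "dandiset_000002": {"session": {"species": "rat", "duration_s": 3600}},
    }
    index = {name: flatten(meta) for name, meta in datasets.items()}

    # Cross-dataset query: grab one field everywhere, no full downloads.
    species = {name: flat["session.species"] for name, flat in index.items()}
    ```

    The same idea scales to a database or triple store; the point is only that the small metadata lives apart from the bulk arrays.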

    So I'm almost finished with my first independent implementation of a standard, and I want to write up the process because it was surprisingly challenging and I learned a lot about how to write them.

    I was purposefully experimenting with different methods of translation (e.g. adapter classes vs. pure functions in a build pipeline, recursive functions vs. flattening everything), so the code isn't as sleek as it could be. I had planned on this beforehand, but two major things I learned were a) not just isolating special cases, but making specific means to organize them and make them visible, and b) isolating different layers of the standard (e.g. the schema language is separate from the models, which are separate from I/O) and not backpropagating special cases between layers.

    This is also my first project that's fully in the "new style" of Python that's basically a typed language with validating classes, and it makes you write differently but uniformly, for the better: it's almost self-testing, because if all the classes validate in an end-to-end test then you know that shit is working as intended. Forcing yourself to deal with errors immediately is the way.

    Lots more to say, but anyway we're like two days of work away from a fully independent translation of #NWB to #LinkML that uses @pydantic models + #Dask for arrays. Schema extensions are now no-code: just write the schema (in NWB schema language or LinkML) and poof, you can use it. Hoping this makes it way easier for tools to integrate with NWB, and my next step will be to put them in a SQL database and triple store so we can, y'know, more easily share and grab smaller pieces of them and index across lots of datasets.

    Then, uh, we'll bridge our data archives + notebooks with the fedi for a new kind of scholarly communication....
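    The "validating classes" style the post describes (the poster uses pydantic) can be sketched with stdlib dataclasses alone; the TimeSeries fields here are a simplified, hypothetical subset, not the real NWB type:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class TimeSeries:
        """Hypothetical slice of an NWB-style TimeSeries: validation runs
        at construction time, so bad data fails immediately rather than
        surfacing later in I/O."""
        name: str
        data: list = field(default_factory=list)
        rate: float = 1.0

        def __post_init__(self):
            if not self.name:
                raise ValueError("name must be non-empty")
            if self.rate <= 0:
                raise ValueError("rate must be positive")

    # A valid object constructs cleanly...
    ts = TimeSeries(name="ecephys", data=[0.1, 0.2], rate=30000.0)

    # ...while an invalid one fails right away ("almost self-testing").
    try:
        TimeSeries(name="", rate=-1.0)
        caught = False
    except ValueError:
        caught = True
    ```

    If every class in an end-to-end run validates like this, a passing run is itself a weak test suite, which is the "deal with errors immediately" point above.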

  25. Train!!

    via!!

    #Bremen!

    Main station!

    Automated!

    Announcement!

    Today!

    Somehow.

    With.

    strange!

    Emphasis!

    And!

    Alotof.

    Whitespace!

    #bahn #rs2 #regio #öpnv #nwb #nordwestbahn

  26. I think I just did #NWB in SQL. The hierarchical structure makes for an absolute shitload of tables, but the relationships are straightforward
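    A minimal sketch of that "many tables, simple relationships" shape, using stdlib sqlite3; the table and column names are invented for illustration, not the poster's actual schema:

    ```python
    import sqlite3

    # Each level of the NWB hierarchy becomes its own table, linked by
    # foreign keys: lots of tables, but plain parent/child relationships.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE nwb_file (id INTEGER PRIMARY KEY, identifier TEXT);
    CREATE TABLE electrode_group (
        id INTEGER PRIMARY KEY,
        file_id INTEGER REFERENCES nwb_file(id),
        name TEXT
    );
    CREATE TABLE electrode (
        id INTEGER PRIMARY KEY,
        group_id INTEGER REFERENCES electrode_group(id),
        impedance REAL
    );
    """)
    con.execute("INSERT INTO nwb_file VALUES (1, 'sess-001')")
    con.execute("INSERT INTO electrode_group VALUES (1, 1, 'shank0')")
    con.executemany("INSERT INTO electrode VALUES (?, 1, ?)",
                    [(1, 1.2e6), (2, 0.9e6)])

    # Walking the hierarchy is just a join.
    rows = con.execute("""
        SELECT f.identifier, g.name, e.impedance
        FROM electrode e
        JOIN electrode_group g ON e.group_id = g.id
        JOIN nwb_file f ON g.file_id = f.id
    """).fetchall()
    ```

    The payoff over one big HDF5 file is that queries like this can span many files' worth of rows at once.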

  27. One of the major sources of complexity in #NWB is actually a pretty interesting replay of blank nodes in #RDF / #SemanticWeb tech.

    So you've got a complex type of thing where your thing not only has properties but those properties are themselves other things with their own properties. Concretely, say I've got some electrophysiological recording: that's a timeseries, yes, but it also has metadata like the electrode group that collected it. That electrode group has multiple electrodes, and each has its own properties like impedance, position, etc.

    Neuroscientists would probably model this as a bigass nested untyped anonymous blob; one might call this the "cognitive style of #MATLAB structs." So that's where the format seems to have started...
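    The electrode-group example above, sketched both ways; the fields are a hypothetical subset of the real NWB types:

    ```python
    from dataclasses import dataclass

    # Typed version: each "thing whose properties are themselves things"
    # gets its own class instead of living in an anonymous dict.

    @dataclass
    class Electrode:
        impedance: float   # ohms
        position: tuple    # (x, y, z)

    @dataclass
    class ElectrodeGroup:
        name: str
        electrodes: list   # list of Electrode

    @dataclass
    class Recording:
        data: list               # the timeseries samples
        group: ElectrodeGroup    # metadata is itself a typed object

    rec = Recording(
        data=[0.0, 0.1],
        group=ElectrodeGroup(
            name="shank0",
            electrodes=[Electrode(1.2e6, (0.0, 0.0, 0.0))],
        ),
    )

    # The untyped "MATLAB-struct style" equivalent: same shape, but no
    # declared names or types for anything to check against.
    blob = {"data": [0.0, 0.1],
            "group": {"name": "shank0",
                      "electrodes": [{"impedance": 1.2e6,
                                      "position": (0.0, 0.0, 0.0)}]}}
    ```

    The typed nesting is what schematized formats (and RDF's non-blank nodes) buy you: every inner thing has a declared shape rather than being anonymous.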

  28. I had a lovely conversation with the #NWB devs yesterday. I messaged them cryptically like "hey can we talk about the history of this thing and the design decisions and constraints" and they were refreshingly candid and willing to engage with such a vague and open ended question. Formats and standards are always 100 times as complicated as they appear, and this one is this particularly heady mixture of neuroscientists riffing on something, computer scientists coming in later to be like "OK nice but let's make that work" and about a million rounds of historical baggage and iteration.

    I freaking love finding people who care about what they do enough to hold it at arm's length for a second to evaluate its history and the constraints it navigates. Good people, I like 'em.

    alright, I think I have a full semantic translation of #NWB; next for function, then aesthetics.

    there are a lot of implementations of links and references: two in the schema language, and then a number of classes. I think unifying those should iron out the last parts that are HDF5-specific. Then it'll be time to write triple store and SQLite backends as an example...

    and then we mirror DANDI.

  30. #NWB schema language translated to #LinkML ... check

    so now translating the rest of it should just be writing a few mappings