636 results for “localjoost”

  1. If you clone a #LensStudio project targeted at Snap Inc. #Spectacles and then open it, you might be greeted by an error that can be confusing if you are pretty new to the environment. Here's why, and how you can simply get past it.

    localjoost.github.io/Why-a-fre

  2. Part 6, the (provisional?) conclusion of my series "#LensStudio for the confused #Unity developer", describes how to add (spatial) sounds to a Lens, and teaches you how to prevent the app from crashing when too many sounds are played simultaneously.
    localjoost.github.io/Lens-Stud

  3. In part 5 of my series "#LensStudio for the confused #Unity developer", we implement the three actual buttons on the hand menu to re-create the grid, drop cubes on the floor, and move them back to their original positions. The last one also teaches you how to move and rotate using a lerp without blocking the main thread, keeping the display smooth. localjoost.github.io/Lens-Stud
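The non-blocking lerp mentioned above can be sketched roughly like this: instead of looping until the move finishes, you advance the interpolation a little each frame. This is a minimal TypeScript sketch with made-up names (`makeMover`, `onUpdate`); Lens Studio's real update-event and transform APIs work differently, but the idea is the same.

```typescript
// Minimal sketch: advance an interpolation each frame instead of
// blocking until the movement completes.
type Vec3 = { x: number; y: number; z: number };

// Linear interpolation between two positions, t in [0, 1].
function lerp(a: Vec3, b: Vec3, t: number): Vec3 {
  return {
    x: a.x + (b.x - a.x) * t,
    y: a.y + (b.y - a.y) * t,
    z: a.z + (b.z - a.z) * t,
  };
}

// Returns a per-frame step function: each call advances the elapsed
// time by deltaTime (seconds) and yields the current position.
function makeMover(from: Vec3, to: Vec3, durationSec: number) {
  let elapsed = 0;
  return function onUpdate(deltaTime: number): Vec3 {
    elapsed = Math.min(elapsed + deltaTime, durationSec);
    return lerp(from, to, elapsed / durationSec);
  };
}
```

In an engine, `onUpdate` would be called from the frame-update event with that frame's delta time, so the main thread keeps rendering while the object glides to its target.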

  4. In part 4 of my series "#LensStudio for the confused #Unity developer", I show how you can implement a #HoloLens/#MRTK style 'hand menu' in your Lens, allowing you to control various aspects of it. localjoost.github.io/Lens-Stud

  5. In part 3 of my series "#LensStudio for the confused #Unity developer", I show code to smash cubes with your hands, and also explain how you can make them bounce off the spatial map (and thus off real objects). Includes occlusion as well.

    localjoost.github.io/Lens-Stud

  6. In part 2 of my series "#LensStudio for the confused #Unity developer", I explain how to set up a project, how to create a prefab equivalent, how to write a Behaviour equivalent, and show things like getting a reference to the main camera, vector math, and more.

    localjoost.github.io/Lens-Stud

  7. If it's worth doing something, it's worth overdoing. My second blog for today explains the basics of Snap Inc.'s #LensStudio for developing for #Spectacles 2024 - for the confused Unity developer, aka myself about 4 weeks ago.
    localjoost.github.io/Starting-

  8. #SnapSpectacles 2024: I expected Google Glass-like AR, but it's a full-on waveguide see-through #MixedReality device in normal glasses! Here's my take after a month of toying and developing. localjoost.github.io/Snap-Spec

  9. I wanted to see what it took to set up a proper #MixedReality application that ticks all the boxes (viewing and interacting with reality, hand tracking, spatial sound, etc.) on a #Quest3, using the latest version of #MRTK3. Because, well, you need to be cross-platform these days if you and your apps want to be relevant in the MR space. It took some tinkering, but I got it to work, and I am glad to see I will be able to carry over my MR investments to different platforms.

    localjoost.github.io/CubeBounc

  10. Two weeks ago I released my #MRTK3 based app Walk the World for #HoloLens2, a week later the exact same app for #MagicLeap2. I promised to blog how I so easily and quickly got that done, and here it is: my 'recipe' for getting MRTK3-based HoloLens 2 apps to run on Magic Leap 2. localjoost.github.io/Making-an #openxr #MixedReality

  15. When I ported Walk the World to Magic Leap 2, I found out their adoption of OpenXR made porting quite a bit harder. After failing twice, I did not follow the documentation but wrote a little .NET 8.0 tool to compare two Unity manifest files, and succeeded! localjoost.github.io/A-little- #magicleap #unity #openxr
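The core of such a comparison can be illustrated with a minimal sketch. The author's actual tool was .NET 8.0 and presumably parses real manifest XML; this TypeScript version, with a hypothetical `diffEntries` function, only shows the underlying set-difference idea on flat entry lists.

```typescript
// Sketch: given entries (e.g. permissions) extracted from two
// manifests, report which entries appear in only one of them.
function diffEntries(left: string[], right: string[]) {
  const l = new Set(left);
  const r = new Set(right);
  return {
    onlyInLeft: left.filter((e) => !r.has(e)),   // present only in first manifest
    onlyInRight: right.filter((e) => !l.has(e)), // present only in second manifest
  };
}
```

Run against a working project's manifest and a failing one, a diff like this points straight at the missing entries instead of leaving you to eyeball two XML files.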

  16. LocalJoost (@localjoost@mstdn.social): "Spent a weekend porting the #MRTK3 version of Walk the World to #MagicLeap2. In the end, not so difficult, but the documentation leaves some things to be desired. If you have a device, download the package at schaikweb.net/WalktheWorld/ML2. Feedback and suggestions welcome" #crossplat

    Jun 16, 2024, 04:11 PM

    mstdn.social/@localjoost/11262

  17. A geek is a geek even on his birthday, and to celebrate that I blogged once again about a handy #realitycollective service: one that sends telemetry to an Azure Application Insights resource, using the official Microsoft SDK. You can log events and exceptions, but you can also let the service automatically log anything that passes through the internal Unity application log.

    localjoost.github.io/Logging-M

  18. Sometimes you need to know whether the user is actually wearing the device while your app is running. I wrote a little #RealityCollective #ServiceFramework Service to make that easily detectable. #MixedReality #HoloLens2

    localjoost.github.io/Detecting

  23. After I wrote about how you could get the hand *position* while doing an air tap with #MRTK3, two developers simultaneously asked me how you could get the end of the hand *ray*. Noblesse oblige, so I wrote it up: localjoost.github.io/Getting-t

  24. #MRTK3 has a subsystem implementation for speech commands that is supported by #HoloLens2. #MagicLeap2 supports voice commands as well; if only someone had taken the trouble of making a KeywordRecognitionSubsystem implementation for it, it would have had MRTK3 support for speech commands too.

    Well, guess what - someone just did. Me. So now you can have voice commands on *both* devices using the same API, and you only need to flip a checkbox in the MRTK3 settings.

    localjoost.github.io/An-MRTK3-