home.social

#magicleap2 — Public Fediverse posts

Live and recent posts from across the Fediverse tagged #magicleap2, aggregated by home.social.

  1. Two weeks ago I released my #MRTK3-based app Walk the World for #HoloLens2, and a week later the exact same app for #MagicLeap2. I promised to blog about how I got that done so easily and quickly, and here it is: my 'recipe' for getting MRTK3-based HoloLens 2 apps to run on Magic Leap 2. localjoost.github.io/Making-an #openxr #MixedReality

  6. LocalJoost (@localjoost@mstdn.social): "Spent a weekend porting the #MRTK3 version of Walk the World to #MagicLeap2. In the end, not so difficult, but the documentation leaves some things to be desired. If you have a device, download the package at schaikweb.net/WalktheWorld/ML2. Feedback and suggestions welcome" #crossplat

    Jun 16, 2024, 04:11 PM

    mstdn.social/@localjoost/11262

  8. #MRTK3 has a subsystem implementation for speech commands that is supported by #HoloLens2. #MagicLeap2 supports voice commands as well; if only someone had taken the trouble to make a KeywordRecognitionSubsystem implementation for it, it would have had MRTK3 support for speech commands too.

    Well, guess what - someone just did. Me. So now you can have voice commands on *both* devices using the same API, and you only need to flip a checkbox in the MRTK3 settings.

    localjoost.github.io/An-MRTK3-
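
    The cross-platform API the post refers to can be used from any MonoBehaviour roughly as sketched below. This is a minimal sketch assuming the MRTK3 Core package is installed; the namespaces shown are from recent MRTK3 releases (earlier ones used the Microsoft.MixedReality.Toolkit prefix), and the keyword string is just an example.

```csharp
// Minimal sketch: registering a voice command through MRTK3's
// KeywordRecognitionSubsystem. The same call works on both HoloLens 2
// and (with the implementation described above) Magic Leap 2.
using MixedReality.Toolkit;
using MixedReality.Toolkit.Subsystems;
using UnityEngine;

public class VoiceCommandSample : MonoBehaviour
{
    private KeywordRecognitionSubsystem keywordSubsystem;

    private void Start()
    {
        // Find the first running keyword recognition subsystem, if any;
        // which concrete implementation runs depends on the platform.
        keywordSubsystem =
            XRSubsystemHelpers.GetFirstRunningSubsystem<KeywordRecognitionSubsystem>();

        if (keywordSubsystem != null)
        {
            // The returned UnityEvent fires when the keyword is recognized.
            keywordSubsystem
                .CreateOrGetEventForKeyword("toggle menu")
                .AddListener(OnToggleMenu);
        }
    }

    private void OnToggleMenu() => Debug.Log("'toggle menu' recognized");
}
```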

  13. When debugging problems after the fact, you can always use UnityPlayer.log. On #HoloLens2, that is. On other devices - for instance #MagicLeap2 - that's not available. So I wrote a little #ServiceFramework service to dump #Unity log output to a file for an app running on a device. While I was at it, I added filtering on keywords and log type as well.

    localjoost.github.io/A-cross-p
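
    The core mechanism behind such a service can be sketched as below. The post implements it as a Reality Collective ServiceFramework service; a plain MonoBehaviour is used here only to keep the example self-contained, and the file name and filter fields are illustrative assumptions, not the blog's actual code.

```csharp
// Sketch: capture Unity log messages via Application.logMessageReceived
// and write them to a file in persistent storage, with simple
// keyword/severity filtering.
using System.IO;
using UnityEngine;

public class FileLogDumper : MonoBehaviour
{
    [SerializeField] private LogType minimumLogType = LogType.Warning;
    [SerializeField] private string keywordFilter = ""; // empty = no filter

    private StreamWriter writer;

    private void OnEnable()
    {
        // Application.persistentDataPath is writable on both HoloLens 2 and
        // Magic Leap 2, unlike UnityPlayer.log, which only some platforms have.
        var path = Path.Combine(Application.persistentDataPath, "applog.txt");
        writer = new StreamWriter(path, append: true) { AutoFlush = true };
        Application.logMessageReceived += OnLogMessage;
    }

    private void OnDisable()
    {
        Application.logMessageReceived -= OnLogMessage;
        writer?.Dispose();
    }

    private void OnLogMessage(string condition, string stackTrace, LogType type)
    {
        // LogType values: Error(0), Assert(1), Warning(2), Log(3), Exception(4),
        // so "at least as severe" means a numerically smaller value, with
        // Exception as the odd one out.
        if (type > minimumLogType && type != LogType.Exception) return;
        if (!string.IsNullOrEmpty(keywordFilter) && !condition.Contains(keywordFilter)) return;

        writer.WriteLine($"{System.DateTime.Now:HH:mm:ss} [{type}] {condition}");
    }
}
```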

  18. For my second blog post of the day ;) one of my more infamous and far-fetched experiments combines #MachineLearning with a Spatial Map to recognize and locate objects in 3D space, using a #YoloV8 #computervision model. This worked pretty well on #HoloLens2. I got it to work on #MagicLeap2 as well, but it was quite a rocky road. The #RealityCollective #ServiceFramework to the rescue :). Explanation, sample, and full code at: localjoost.github.io/Running-a
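
    The post doesn't spell out how a 2D detection ends up in 3D space, but the usual trick is raycasting from the detection's image position against the spatial mesh. A hedged sketch under that assumption follows; the layer name and class name are illustrative, not the blog's actual code.

```csharp
// Sketch: project the center of a 2D object detection onto the spatial
// map to obtain a 3D position. Assumes the spatial mesh colliders live
// on a dedicated layer (the layer name here is an assumption).
using UnityEngine;

public static class DetectionLocator
{
    private static readonly int SpatialMeshMask = LayerMask.GetMask("SpatialMesh");

    // Returns the world-space point where a ray through the detection's
    // screen position first hits the spatial map, or null on a miss.
    public static Vector3? Locate(Camera camera, Vector2 screenPosition)
    {
        var ray = camera.ScreenPointToRay(screenPosition);
        if (Physics.Raycast(ray, out var hit, 10f, SpatialMeshMask))
        {
            return hit.point;
        }
        return null;
    }
}
```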

  23. I wanted to see if I could run one of my more far-fetched #MRTK3 #HoloLens2 samples on #MagicLeap2. Step one was getting a Spatial Map. That was dead easy on HoloLens 2, but less so on Magic Leap 2. I managed to make it work, but there were more challenges than I had anticipated. I blogged my findings to help others avoid the pitfalls I ran into.

    localjoost.github.io/Using-a-S
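
    On both devices the Spatial Map surfaces through Unity's AR Foundation meshing. A minimal sketch of consuming mesh updates via ARMeshManager is below; how the manager and the per-platform meshing subsystem are set up is exactly where the pitfalls the post describes come in, so treat this as the easy half only.

```csharp
// Sketch: react to spatial-map mesh updates via AR Foundation's
// ARMeshManager. The component itself must be configured per platform,
// which is what the linked post covers.
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SpatialMapWatcher : MonoBehaviour
{
    [SerializeField] private ARMeshManager meshManager;

    private void OnEnable() => meshManager.meshesChanged += OnMeshesChanged;
    private void OnDisable() => meshManager.meshesChanged -= OnMeshesChanged;

    private void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        // added/updated/removed are lists of MeshFilter instances,
        // each representing a piece of the spatial map.
        Debug.Log($"Spatial map: +{args.added.Count} ~{args.updated.Count} -{args.removed.Count}");
    }
}
```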

  28. To my surprise there is no Store for #MagicLeap2. So how do you alert users to new versions and help them update? I made a little #ServiceFramework Service and some #MRTK3 UI to do just that. localjoost.github.io/A-Service
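
    The update-check idea behind such a service can be sketched as follows: fetch a version string from a web endpoint and compare it with the installed app's version. The URL and file name below are placeholders, not the post's actual endpoint, and the post wraps this in a ServiceFramework service with MRTK3 UI on top rather than a bare MonoBehaviour.

```csharp
// Sketch: fetch the latest published version as plain text and compare
// it to Application.version (the version set in Player Settings).
using System;
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class UpdateChecker : MonoBehaviour
{
    // Placeholder endpoint returning e.g. "2.1.0" as plain text.
    [SerializeField] private string versionUrl = "https://example.com/app/version.txt";

    private IEnumerator Start()
    {
        using var request = UnityWebRequest.Get(versionUrl);
        yield return request.SendWebRequest();

        if (request.result != UnityWebRequest.Result.Success) yield break;

        var latest = Version.Parse(request.downloadHandler.text.Trim());
        var installed = Version.Parse(Application.version);

        if (latest > installed)
        {
            // In the real app this would show UI with a download link.
            Debug.Log($"Update available: {installed} -> {latest}");
        }
    }
}
```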

  29. After getting my app running with #MRTK3 on #HoloLens2, I of course wanted to try running it on #MagicLeap2 as well. It turned out the Magic Leap MRTK package wasn't quite ready for that, so I hacked around a bit until it was. Or almost entirely was. And blogged about my findings, of course. localjoost.github.io/Using-MRT

  33. I just finished the first full version of HoloATC, based upon #MRTK3, for #MagicLeap2. They have no store, so I am giving a download link. Alternatively, you can scan the QR code in this post with the built-in QR code app

    schaikweb.net/ML2/HoloATC_Magi

    Cross-platform XR development FTW!