Magic Leap 2 Developer Forums - Latest posts https://forum.magicleap.cloud Latest posts Recurring native crashes in OpenXR perception runtime across multiple devices and apps I sent over bug reports for LeapBrush and our app. We first implemented the OpenXR features and spaces/spatial anchors in September 2024, and we have been noticing crashes since roughly then. While trying to fix this, we found that downgrading the OS to 1.10.0 was more stable. We will gather some more data with 1.12.1, but we were seeing crashes on 1.12.0.

Thanks for taking a look.

]]>
https://forum.magicleap.cloud/t/recurring-native-crashes-in-openxr-perception-runtime-across-multiple-devices-and-apps/6998#post_3 Fri, 20 Mar 2026 23:13:04 +0000 forum.magicleap.cloud-post-17259
Recurring native crashes in OpenXR perception runtime across multiple devices and apps Hi Steve,

Thank you for reaching out to us with this very well-documented and thorough report of the issues. I really appreciate the time and effort you put into this. We're going to look into this and see what we can do. Do you know approximately when these crashes started to occur? Was it after the latest OS or Unity update, or before? You're using 6000.0.45f1, so I would think it goes back to April '25 at least, but the OS build is 1.10.0 from Oct '24, so this would likely have been going on for a while if it was related to the OS. Did you experiment with B3E.251015.01 to see if it replicates the issue on the latest Nov '25 OS release, 1.12.1?

Could you send the full bug report to [email protected]? It should allow you to upload the whole thing and then I can transfer it from there and we can take a look at it.

]]>
https://forum.magicleap.cloud/t/recurring-native-crashes-in-openxr-perception-runtime-across-multiple-devices-and-apps/6998#post_2 Fri, 20 Mar 2026 22:25:31 +0000 forum.magicleap.cloud-post-17258
Recurring native crashes in OpenXR perception runtime across multiple devices and apps Hello,

We are experiencing frequent, seemingly random native crashes on Magic Leap 2 devices across our fleet, all running OS build B3E.240905.12-R.038. These crashes occur on multiple different devices and persist across Unity editor and package version upgrades.

Our Unity app has shown these issues when built with both Unity 6000.0.45f1 and 6000.3.11f1. We also use OpenXR features related to hand tracking, localization, meshing, and spatial anchors; the exact list is shown in the attached screenshot.

The app is crashing with SIGILL and SIGSEGV signals originating from within Magic Leap's native libraries — primarily libml_perception_session.so and libunity.so. We are also seeing identical crash patterns in the Leap Brush sample app (com.magicleap.leapbrush), which leads us to believe the root cause is deeper than our application code. Attached are a tombstone from a bug report that captures the crash in the LeapBrush app, as well as the log file.

bugreport-demophon_aosp-B3E.240905.12-R.038-2026-03-13-13-34-04.txt (8.6 MB)

leapbrush tombstone_09.txt (1.0 MB)

Here are the descriptions of the crashes we see with our app:

  1. SIGILL in libml_perception_session.so during pose queries

The most common crash. The stack trace shows xrLocateSpace → Session::GetPose → ml_pose_combine_fn hitting an illegal instruction. We have captured multiple identical instances of this crash back-to-back on the same device (see the attached tombstone_08 and tombstone_09).

  2. SIGSEGV in magicflinger during head tracking

We captured a crash in the system compositor (magicflinger) where HeadTracking::RePredictPose in libml_perception_head_tracking.so segfaulted during CompositorBackend::UpdatePerceptionState. This crash caused a full device restart.

  3. SIGILL in xrSyncActions during input update

Captured via Sentry — a SIGILL crash in the OpenXR input provider during xrSyncActions, called from Unity’s XRInputSubsystem::Update in the normal player loop.

  4. Massive XR runtime stalls (3–20 seconds)

Using the Unity Profiler, we captured two instances of the OpenXR runtime stalling on the main thread:

  • EarlyUpdate.XRUpdate blocked for 3,400 ms
  • FrameEvents.XRBeginFrame blocked for 20,000 ms

In both cases, there was nothing underneath these calls in the profiler — Unity is waiting on a single native call into the OpenXR runtime that simply does not return for seconds at a time. These stalls cause display loss on the headset.

  5. Spatial anchor crashes

We also captured a SIGABRT from an uncaught ml::perception::Exception in SpatialAnchor::UpdateTrackedAnchors().

What We Have Tried:

  • Upgrading Unity editor versions
  • Upgrading ML2 Unity packages
  • Removing spatial anchor usage from our localization system
  • Profiling to identify any application-side cause

From our application (8 crashes on a single device, Feb 11 – Mar 4):

  • tombstone_00 — SIGILL in worker thread (Mar 4)
  • tombstone_02 — SIGABRT on UnityMain (Feb 11)
  • tombstone_03 — SIGSEGV in FMOD mixer thread (Feb 17)
  • tombstone_05 — SIGABRT from SpatialAnchor::UpdateTrackedAnchors() (Feb 20)
  • tombstone_06 — SIGSEGV in worker thread (Feb 24)
  • tombstone_07 — SIGSEGV null pointer, kernel-generated (Feb 25)
  • tombstone_08 — SIGILL in xrLocateSpace / libml_perception_session.so (Mar 3)
  • tombstone_09 — SIGILL in xrLocateSpace, identical to tombstone_08 (Mar 3)
  • bugreport-demophon_aosp-B3E.240905.12-R.038-2026-03-04-13-44-34.txt — Full system log from this device

Entire bugreport zips were too large to upload here, but I can send them if they would be helpful. Any guidance as to how to increase the stability of our app on the headset and to mitigate the multi-second stalls in xrBeginFrame and xrSyncActions would be greatly appreciated. Thanks.

bugreport-demophon_aosp-B3E.240905.12-R.038-2026-03-04-13-44-34.txt (9.2 MB)

tombstone_00.txt (1.1 MB)

tombstone_03.txt (1.2 MB)

tombstone_09.txt (1.2 MB)

tombstone_08.txt (1.2 MB)

tombstone_07.txt (1.1 MB)

tombstone_06.txt (1.3 MB)

tombstone_05.txt (1.3 MB)

tombstone_02.txt (2.1 MB)
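For anyone triaging a pile of tombstones like this, a small script can bucket them by signal and first faulting library. This is only a sketch: it assumes the standard Android tombstone header ("signal N (SIGXXX) ...") and backtrace frames of the form "#NN pc <addr>  /path/lib.so (...)"; adjust the regexes if your files differ.

```python
import re
from pathlib import Path

# Bucket Android tombstones by signal and first faulting library.
# Assumes standard tombstone formatting; adjust the regexes if needed.
SIGNAL_RE = re.compile(r"signal \d+ \((SIG\w+)\)")
FRAME_RE = re.compile(r"#\d+ pc \S+\s+(\S+\.so)")

def classify(text):
    """Return (signal, library) for one tombstone's contents."""
    sig = SIGNAL_RE.search(text)
    lib = FRAME_RE.search(text)
    return (sig.group(1) if sig else "UNKNOWN",
            Path(lib.group(1)).name if lib else "unknown")

def bucket(paths):
    """Count tombstones per (signal, library) pair."""
    counts = {}
    for p in paths:
        key = classify(Path(p).read_text(errors="replace"))
        counts[key] = counts.get(key, 0) + 1
    return counts
```

Running `bucket` over the tombstone files makes repeats like the two identical xrLocateSpace SIGILLs stand out immediately.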

]]>
https://forum.magicleap.cloud/t/recurring-native-crashes-in-openxr-perception-runtime-across-multiple-devices-and-apps/6998#post_1 Thu, 19 Mar 2026 17:31:39 +0000 forum.magicleap.cloud-post-17257
Localization without space Hi aw01,

I’m going to have to look into this more but could you share with me some images or videos of the example test space you’re using? I’d like to see how you have it set up and see where the drifting happens if possible. Would you be able to share that?

Thank you,

]]>
https://forum.magicleap.cloud/t/localization-without-space/6956#post_5 Fri, 13 Mar 2026 21:53:54 +0000 forum.magicleap.cloud-post-17254
Localization without space Hey jspire,

We have the problem that we want to localize within a space that contains multiple zones, where each zone looks identical to the others.

We need to display objects (in our case, squares) at specific positions, and these objects must remain fixed at exactly the points where we placed them, even as we move around inside the zone. This works initially, but because the zones look similar, the ML loses its localization and re-localizes to a different zone. When that happens, it adopts a different starting point and draws the objects relative to that new starting point.

To prevent this, we want to support localization using markers (e.g., ArUco tags, large numbers, or large characters) or differences such as different colors on surfaces for each zone.
For this reason, we want to understand whether the ML uses the world camera and visual information from the environment for localization. More specifically: can the ML detect different colors or symbols and use these visual differences to determine the correct position inside a space?

We also tried creating a separate space for each zone, but even then the ML localized to the wrong zone while running the application.

Another approach we tested was skipping room scanning entirely and placing anchors only via markers.
However, even with this method, we observed drifting.
This raises the question of how the ML establishes its coordinate system and what methods it uses to determine distance and position relative to the markers. For example:
– Does a marker need to be visible at all times?
– Can we build a consistent coordinate system without a spatial scan by aligning the coordinate origin to a marker?

Ideally, we would have three markers visible to the camera. Based on the initial view, we would draw the object positions, and whenever enough markers are visible, we could sync.
For this to work, it would be important to know how to set the coordinate system’s origin to a specific marker so that our entire system remains consistent around a fixed point instead of around the ML’s internal starting point.

From the HoloLens, we know that it was able to determine its position relative to three infrared markers. Therefore, we are wondering whether something similar is possible with the ML.

Thank you for your help.

]]>
https://forum.magicleap.cloud/t/localization-without-space/6956#post_4 Fri, 13 Mar 2026 07:31:44 +0000 forum.magicleap.cloud-post-17253
Localization without space Hi aw01,

I’ll look into this and see what I can find. Is there a specific use case you’re working on that you could tell me more about or is this more of an academic question about the workings of the technology?

Thank you,

]]>
https://forum.magicleap.cloud/t/localization-without-space/6956#post_3 Thu, 12 Mar 2026 23:56:15 +0000 forum.magicleap.cloud-post-17252
Localization without space Hi, I just wanted to ask if there are any updates on my question. Since it’s been around 30 days now and I still haven’t found a solution, even partial answers would be great. :slightly_smiling_face:

]]>
https://forum.magicleap.cloud/t/localization-without-space/6956#post_2 Thu, 12 Mar 2026 11:19:26 +0000 forum.magicleap.cloud-post-17248
[CMake] Module path contamination when including Magic Leap SDK Hi,

I was having difficulties locating some third-party libraries with CMake. It took me some time to find that the problem arose because I had included the Magic Leap SDK in my project.

Long story short: the Magic Leap SDK's root find-package module file, ENV{MLSDK}/cmake/FindMagicLeap.cmake, incorrectly overwrites the CMAKE_MODULE_PATH variable. Indeed, line 105 (for Magic Leap SDK v1.12.0) reads as follows:

set(CMAKE_MODULE_PATH "${FOUND_MLSDK}/cmake/")

The CMAKE_MODULE_PATH variable is a list of paths, not a single path, and it's set by the consumer project. Unconditionally setting it, as the Magic Leap SDK does, clears whatever the consumer project has set, which is a problem IMHO, but opinions may vary :grin:

The fix is easy: add the Magic Leap SDK's path to the CMAKE_MODULE_PATH variable rather than setting it. Line 105 thus becomes:

list(APPEND CMAKE_MODULE_PATH "${FOUND_MLSDK}/cmake/")

If you think of another way to fix this, please let me know. At the moment, in my consumer project, I'm saving the value of the CMAKE_MODULE_PATH variable before calling find_package(MagicLeap) and then restoring the saved value.

]]>
https://forum.magicleap.cloud/t/cmake-module-path-contamination-when-including-magic-leap-sdk/6988#post_1 Thu, 05 Mar 2026 09:20:42 +0000 forum.magicleap.cloud-post-17243
Marker Tracking Timestamps Thank you Corey! Please notify me if there are any updates on this. We are currently working on rebuilding ArUco tracking from scratch with an OpenCV Android build.

]]>
https://forum.magicleap.cloud/t/marker-tracking-timestamps/6987#post_3 Wed, 04 Mar 2026 18:04:27 +0000 forum.magicleap.cloud-post-17241
ML Spectator App (in android phone) fail to connect with Spectator Unity App in ML2 Hey @ahmad27 ,

Just so I’m understanding correctly, you made a Unity project that is trying to use the ML Spectator feature and are now having issues connecting the spectator phone app to your Unity app?

Best,
Corey

]]>
https://forum.magicleap.cloud/t/ml-spectator-app-in-android-phone-fail-to-connect-with-spectator-unity-app-in-ml2/6983#post_2 Wed, 04 Mar 2026 17:12:20 +0000 forum.magicleap.cloud-post-17240
Marker Tracking Timestamps @harperrhett ,

Thank you for the great feedback!
I’ll relay this to the proper team and see what we can do.

Best,
Corey

]]>
https://forum.magicleap.cloud/t/marker-tracking-timestamps/6987#post_2 Wed, 04 Mar 2026 17:11:08 +0000 forum.magicleap.cloud-post-17239
Unity WebRTC: Encoder Queue Full @nathan ,

I’ve been looking into this a bit more for you.
You can try using an older WebRTC package, if that's viable for your project, to see if you can force the encoder to be hardware-driven.

Also, just out of curiosity, could you try an older version of Unity, like 2022.3 LTS to see if you get the same behavior?

If not, that’s totally fine, just trying to pin down the root cause here.

Best,
Corey

]]>
https://forum.magicleap.cloud/t/unity-webrtc-encoder-queue-full/6977#post_6 Wed, 04 Mar 2026 17:08:48 +0000 forum.magicleap.cloud-post-17238
Marker Tracking Timestamps It would be really fantastic if the MarkerData class were to have a field like UTCTimestamp that contains a DateTime of when the picture was taken that was used to calculate the pose. This would be huge for my team. We are looking into computer vision alternatives just so we can generate that feature ourselves.

]]>
https://forum.magicleap.cloud/t/marker-tracking-timestamps/6987#post_1 Tue, 03 Mar 2026 21:08:24 +0000 forum.magicleap.cloud-post-17237
ML Spectator App (in android phone) fail to connect with Spectator Unity App in ML2 I have developed and run the Spectator Unity App and deployed it to ML2 device.

At the same time, I have run the ML Spectator app on my Android phone. Both the phone and the ML2 are connected to the same wifi. In the ML Spectator app on my phone, I can see the IP address of the ML_Spectator Unity app, but when I try to connect to that IP, it fails.

]]>
https://forum.magicleap.cloud/t/ml-spectator-app-in-android-phone-fail-to-connect-with-spectator-unity-app-in-ml2/6983#post_1 Fri, 27 Feb 2026 02:13:41 +0000 forum.magicleap.cloud-post-17232
Unity WebRTC: Encoder Queue Full I just got VP8 to work, I have no idea what I did earlier this week when it didn’t. Currently the feed is very laggy though, probably getting ~10 fps.

Has anyone managed to get the hardware-accelerated H264 encoder to work with Unity WebRTC?

]]>
https://forum.magicleap.cloud/t/unity-webrtc-encoder-queue-full/6977#post_5 Thu, 26 Feb 2026 15:47:15 +0000 forum.magicleap.cloud-post-17228
Unity WebRTC: Encoder Queue Full They seemed to have removed that call in 2022. com.unity.webrtc/CHANGELOG.md at main · Unity-Technologies/com.unity.webrtc · GitHub

Do you recommend using an older WebRTC package?

]]>
https://forum.magicleap.cloud/t/unity-webrtc-encoder-queue-full/6977#post_4 Thu, 26 Feb 2026 13:19:53 +0000 forum.magicleap.cloud-post-17227
Unity WebRTC: Encoder Queue Full Hey @nathan ,

Correct, it should not be defaulting to the mesa encoder; to me, that seems like the main issue here.
This may be coming from the Unity WebRTC Package.

If possible, please try and force Unity WebRTC to initialize with the hardware encoder:

WebRTC.Initialize(EncoderType.Hardware);

Please let me know if this helps at all!

Best,
Corey

]]>
https://forum.magicleap.cloud/t/unity-webrtc-encoder-queue-full/6977#post_3 Wed, 25 Feb 2026 19:46:52 +0000 forum.magicleap.cloud-post-17224
Unity WebRTC: Encoder Queue Full Found your WebRTC repo. You guys are aiming for:
private const int WIDTH = 1440;
private const int HEIGHT = 1080;

webCameraName = "Camera 2";

We are currently aiming for lower than that and using the same camera (which, to my knowledge, is the only MR camera):

Debug ThirdEye-HAL Setting up recording session Width 960 Height 720 Format 35 usage 515 flags 0 Type 3 FPS 30

But the Buffer is allocating a much bigger chunk:
Debug MLMRCamera-Buffers init BufferItemConsumer setting width 2048 height 1536 buffer count 6 format 35

Not sure if that's intended or not; I'm guessing you just downscale and always allocate the max? I'm not usually the person who works on our calls, but the others are busy, and this is a bit of a new area for me.

I’ve tried VP8 and that seems to fill up even faster than the H264.

Also, from the following log line, I don't think the device should be defaulting to the mesa encoder:
Info org.webrtc.Logging HardwareVideoEncoder: initEncode name: OMX.mesa.video_encoder.avc type: H264 width: 960 height: 720 framerate_fps: 60 bitrate_kbps: 283 surface mode: false

]]>
https://forum.magicleap.cloud/t/unity-webrtc-encoder-queue-full/6977#post_2 Tue, 24 Feb 2026 17:57:32 +0000 forum.magicleap.cloud-post-17223
MagicLeap Hub 3 stream over Ethernet not working Thank you for your response.

I’ve tried these diligently, and these don’t work.

]]>
https://forum.magicleap.cloud/t/magicleap-hub-3-stream-over-ethernet-not-working/6970#post_3 Tue, 24 Feb 2026 15:51:54 +0000 forum.magicleap.cloud-post-17222
Unity WebRTC: Encoder Queue Full Give us as much detail as possible regarding the issue you’re experiencing.

Unity Editor version: 6000.0.62f1
ML2 OS version: Version 1.12.1
Build B3E.251015.01
Android API Level 29
MLSDK version: 2.6.0-pre.R15

Error messages from logs:

We had put a pause on our Magic Leap Unity build for the past year. We recently got back into it and are making sure the build is up to date with our new features. We are currently having issues with the calls, specifically the video, which stops sending frames after about 5-10 seconds.

Our calls using Unity WebRTC used to work very well. We did update a lot of things when booting the project back up: Unity 6, WebRTC 3.0.0 and of course the ML OS and SDK.

Looking through the Android logs, I'm seeing a new system that I don't remember seeing back in the day (MLMRCamera3Hal). I'm getting constant chatter from this system for every frame:

2026-02-23 18:43:32.796 3360 3523 Warn Camera3-Device No cvip timestamp available
2026-02-23 18:43:32.796 3294 9174 Info MLMRCamera3Hal ENTRY : processCaptureRequest
2026-02-23 18:43:32.796 3294 9174 Info MLMRCamera3Hal EXIT : processCaptureRequest

From our WebRTC stats, I consistently get around ~300 encoded frames before the encoding queue fills up:

WebRTC stats: {"t":"2026-02-23T19:29:17.285559Z","lp":"n1","rp":"nl","a":{"snd":{"lvl":0.0,"b":24754},"rcv":{"lvl":0.0,"b":31414}},"v":{"snd":{"enc":291,"df":290,"kf":31,"b":677510}}}
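As an aside, those stats blobs are machine-readable JSON, so the send-side video counters can be watched over time with a few lines. The short keys below come straight from the logged blob; my reading of them (e.g. "df" as dropped frames) is an assumption, not documented behavior.

```python
import json

# Pull the send-side video counters out of a stats line like the one above.
# Key meanings (e.g. whether "df" counts dropped frames) are assumptions.
line = '{"t":"2026-02-23T19:29:17.285559Z","v":{"snd":{"enc":291,"df":290,"kf":31,"b":677510}}}'
snd = json.loads(line)["v"]["snd"]
print(f"enc={snd['enc']} df={snd['df']} kf={snd['kf']} bytes={snd['b']}")
```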

Then the new log spam begins:

2026-02-23 19:29:17.450 3360 3533 Warn Camera3-Device No cvip timestamp available
2026-02-23 19:29:17.450 3294 3294 Info MLMRCamera3Hal ENTRY : processCaptureRequest
2026-02-23 19:29:17.450 3294 3294 Info MLMRCamera3Hal EXIT : processCaptureRequest
2026-02-23 19:29:17.456 23070 4963 Error org.webrtc.Logging HardwareVideoEncoder: Dropped frame, encoder queue full
2026-02-23 19:29:17.472 23070 4963 Error org.webrtc.Logging HardwareVideoEncoder: Dropped frame, encoder queue full

Some initial Encoder logs if it helps:

org.webrtc.Logging HardwareVideoEncoder: Format: {color-format=21, i-frame-interval=3600, mime=video/avc, width=960, bitrate-mode=2, bitrate=283000, frame-rate=60.0, height=720}

And right after, it updates the format; unsure if this is expected:

org.webrtc.Logging HardwareVideoEncoder: updateInputFormat format: {color-transfer=0, color-format=21, slice-height=720, image-data=java.nio.HeapByteBuffer[pos=0 lim=104 cap=104], mime=video/raw, width=960, stride=960, color-range=0, color-standard=0, height=720} stride: 960 sliceHeight: 720 isSemiPlanar: true frameSizeBytes: 1036800
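For what it's worth, the logged frameSizeBytes is exactly what a semi-planar YUV 4:2:0 buffer (isSemiPlanar: true, 12 bits per pixel) of 960×720 should occupy, so the updated format itself looks self-consistent:

```python
# frameSizeBytes from the updateInputFormat log matches semi-planar
# YUV 4:2:0: a full-resolution luma plane plus a half-size chroma plane.
width, height = 960, 720
luma = width * height         # Y plane
chroma = width * height // 2  # interleaved UV plane at quarter resolution
frame_size = luma + chroma
print(frame_size)  # 1036800, matching the log
```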

I looked at your forums and heard mention of some Unity WebRTC examples, but going through your examples I don't see anything relating to WebRTC.

Curious if there is a new way of accessing the camera that I should be using on the Magic Leap. We have the same Android WebRTC stack working on the Argo and RealWear, but I'm currently struggling with this one.

Any info would be greatly appreciated!

]]>
https://forum.magicleap.cloud/t/unity-webrtc-encoder-queue-full/6977#post_1 Mon, 23 Feb 2026 19:43:32 +0000 forum.magicleap.cloud-post-17221
MagicLeap Hub 3 stream over Ethernet not working Hey @mbalazs1020 ,

From what I know, this may be an Android-related issue.
Here are a few things you can try to see if the issue resolves itself:

    • Unplug the ethernet
    • Turn Wi-Fi off manually on the device
    • Plug the ethernet back in
    • Reboot the device
    • Try streaming again

If that doesn't work:

    • Keep Wi-Fi enabled
    • Do NOT connect to a network
    • Plug in the ethernet
    • Try streaming again

Please let me know if any of these worked or not.

Best,
Corey

]]>
https://forum.magicleap.cloud/t/magicleap-hub-3-stream-over-ethernet-not-working/6970#post_2 Wed, 18 Feb 2026 17:59:10 +0000 forum.magicleap.cloud-post-17216
Accessing Real RGB Camera in Unity on Magic Leap 2 Hey @jose.rodrigues ,

Yes, the MLCamera API is deprecated. You can still use it but you may run into issues.

The current recommended way to capture and visualize the cameras is via the Pixel Sensor API.
Here is some documentation with an example script for exactly what you’re looking for:

Also, you can always take a look at the Magic Leap Examples Unity Project from the MLHub. You can find fully built scenes for various features, including accessing and visualizing a camera stream that you can use as reference.
Just in case you still need to download and try the MLHub, here is the download page for it to get you started!

https://ml2-developer.magicleap.com/downloads

Please let me know if you have any more questions! Good luck with your project.

Best,
Corey

]]>
https://forum.magicleap.cloud/t/accessing-real-rgb-camera-in-unity-on-magic-leap-2/6971#post_2 Wed, 18 Feb 2026 17:52:59 +0000 forum.magicleap.cloud-post-17215
Accessing Real RGB Camera in Unity on Magic Leap 2 Hi everyone,
I'm new to this area.

I’m trying to access the real RGB camera of Magic Leap 2 through Unity. I want to:

  1. Display the camera feed in a RawImage (or similar UI panel).

  2. Capture a photo and save it to disk.

However, whenever I try to use MLCamera, it always says the API is deprecated. I’m not sure how to proceed using the current Unity SDK.

I’ve seen references to MagicLeap.Android.AndroidCamera, but I’m not sure how to implement it correctly for continuous preview and taking pictures in Unity.

Does anyone have a working approach or example for:

  • Showing the real camera feed in a Unity UI (RawImage or 3D quad)

  • Capturing a still image and saving it

  • Using the current, supported APIs (not deprecated)

Any guidance or example scripts would be really appreciated.

Thanks!

]]>
https://forum.magicleap.cloud/t/accessing-real-rgb-camera-in-unity-on-magic-leap-2/6971#post_1 Wed, 18 Feb 2026 17:45:13 +0000 forum.magicleap.cloud-post-17213
MagicLeap Hub 3 stream over Ethernet not working Hi!

We are trying to connect the Magic Leap to ethernet, as you can see in the image, by using an ethernet- and charging-compatible USB-C adapter between the Magic Leap and the router. (We are doing this to avoid Wi-Fi, because we plan to use this in a high-EMC environment where Wi-Fi doesn't work reliably.)

When we do this, the device stream in the Magic Leap Hub can't open; it says "Stream error". All other functions work; the only problem is the stream not starting. The Magic Leap Hub is connected, and device files and apps are visible. Our own app can use and send sensor data and frames with this setup. The device charges as well, with its original charger connected through the USB hub.

Note that with this setup, the MAC and IP addresses are actually not the Magic Leap's, but the USB hub's instead. Maybe this induces some rarely seen bug that prevents the stream from opening.

I’m asking for help to get this stream going somehow.

Thanks.

]]>
https://forum.magicleap.cloud/t/magicleap-hub-3-stream-over-ethernet-not-working/6970#post_1 Wed, 18 Feb 2026 17:45:10 +0000 forum.magicleap.cloud-post-17212
Wifi list showing no SSIDs at all while at a large convention. Hey @dryden ,

This sounds like overload to me. The device might be scanning for networks, but there are so many that the scan hits a timeout or something similar. The ML2 may not be filtering as many networks as a mobile phone does.

You could try manually adding a network if possible instead of scanning.

Another option may be to limit the Wi-Fi scan to 5 GHz or 2.4 GHz if possible on the device. This may not be possible, but I will try to confirm on my own device once I have the time.

Sorry you encountered this issue! Please let us know if you found a solution or if you need any further assistance.

Best,
Corey

]]>
https://forum.magicleap.cloud/t/wifi-list-showing-no-ssids-at-all-while-at-a-large-convention/6962#post_2 Wed, 18 Feb 2026 17:43:44 +0000 forum.magicleap.cloud-post-17211
MR recording shows black background in Cesium scene Hey @gregkoutsikos ,

I think what's going on here is that Cesium is overriding the camera configuration at runtime or during initialization. Cesium's own camera is managed, and certain properties are set at runtime. That's why setting the property in the Inspector doesn't work: Cesium overrides it later.

Try to ensure that the camera background is set to Solid Color and the background color is (0,0,0,0) at runtime, via a script that sets these properties after Cesium has applied its configuration. You can try doing this in LateUpdate().

If that ends up working for you, you can try to find a better/alternate solution that prevents Cesium from setting these properties at runtime, like creating your own camera setup but using XR Rig as your base camera instead of Cesium’s default Camera.

Best,
Corey

]]>
https://forum.magicleap.cloud/t/mr-recording-shows-black-background-in-cesium-scene/6967#post_2 Wed, 18 Feb 2026 17:36:18 +0000 forum.magicleap.cloud-post-17210
MR recording shows black background in Cesium scene Give us as much detail as possible regarding the issue you’re experiencing:

Unity Editor version: 2022.3.21f1
ML2 OS version: 1.5 (using OpenXR)
MLSDK version: 2.5.0

I’m using the Unity ML Examples 1.5 screen capture script to record Mixed Reality (MR) videos in my Unity application on Magic Leap 2.

Recording works correctly in most scenes. However, in one scene that uses Cesium for Unity, the MR portion of the recording (1440×1080 area) shows a black background instead of the real world. The Cesium map renders correctly, but the background behind it is solid black in the recorded video.

Important details:

  • Cesium uses its own camera.

  • I attempted to set the camera background alpha to 0 in the Inspector.

  • After building and deploying, the alpha value resets to 255.

  • If I use the ML2 Video Recorder tool and manually reduce the opacity below 0, the recording works correctly and the background shows the real world as expected.

My question:

Is there a way to programmatically set the recording camera’s opacity (alpha) below 0 at runtime so the MR capture includes the real-world background correctly?

Or is there a specific configuration required when using Cesium’s camera with Magic Leap’s MR recording pipeline?

Any guidance would be greatly appreciated.

]]>
https://forum.magicleap.cloud/t/mr-recording-shows-black-background-in-cesium-scene/6967#post_1 Mon, 16 Feb 2026 15:01:13 +0000 forum.magicleap.cloud-post-17206
Wifi list showing no SSIDs at all while at a large convention. We currently have 12 Magic Leap 2 devices at a large convention right now, and we're running into issues where the headsets are unable to see any wireless networks at all.

We were able to confirm that the devices were working fine outside of the convention, so I'm leaning towards the headsets not being able to handle the huge number of networks around them.

We’re seeing 650+ networks in our area alone across both 2.4 and 5ghz.

Has anyone else seen this while running OS 1.12.1? We have previously run in similar conditions on older OS versions with no issues.

]]>
https://forum.magicleap.cloud/t/wifi-list-showing-no-ssids-at-all-while-at-a-large-convention/6962#post_1 Thu, 12 Feb 2026 19:25:15 +0000 forum.magicleap.cloud-post-17201
Eye image capture rate You are correct that the eye tracking camera can capture images at up to 30hz, there is no way to go above that rate. For downloading the images, I recommend testing your USB C cable to make sure that it supports high speed data transfer.

]]>
https://forum.magicleap.cloud/t/eye-image-capture-rate/6892#post_2 Wed, 11 Feb 2026 19:15:20 +0000 forum.magicleap.cloud-post-17191
Problem with Estimating Depth of IR Reflective Spheres Hey @4vladsurdu

This question falls a little outside the scope of the forums, as we do not typically provide custom code reviews, but I will try to point you in the right direction here.

Just glancing over some of your script, this line jumped out at me a bit:

xy = input_pt/self.resolution - 0.5

This assumes that the Magic Leap intrinsic values are in normalized space (0-1), but the values are actually given in pixel space, which is a much wider range. There may be more issues like that scattered throughout the code, so give it a good look over and ensure that you are running calculations in the correct space.
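To illustrate the pixel-space convention (a sketch; the intrinsic values below are placeholders, not real ML2 calibration): with pinhole intrinsics in pixel units, you normalize a pixel by subtracting the principal point and dividing by the focal length, rather than dividing by the resolution and subtracting 0.5.

```python
# Back-project a pixel to a camera-frame ray using pinhole intrinsics
# expressed in PIXEL units (fx, fy, cx, cy), not normalized 0-1 values.
# The numbers below are placeholders, not real Magic Leap calibration.
def pixel_to_ray(u, v, fx, fy, cx, cy):
    x = (u - cx) / fx
    y = (v - cy) / fy
    return (x, y, 1.0)

# A pixel at the principal point lies on the optical axis:
print(pixel_to_ray(272.0, 272.0, fx=365.0, fy=365.0, cx=272.0, cy=272.0))
```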

If you have any more questions please let me know! Good luck to you with your project!

Best,
Corey

]]>
https://forum.magicleap.cloud/t/problem-with-estimating-depth-of-ir-reflective-spheres/6923#post_2 Wed, 11 Feb 2026 19:09:59 +0000 forum.magicleap.cloud-post-17190
Localization without space Hello,
I have a question regarding localization without a scanned Space.
If I do not use a Space to localize to a specific environment, how does the ML2 localize itself relative to objects? Do you use the camera, LiDAR, gyroscope sensors, or a combination of them?

For example, I scanned an ArUco marker without being localized. What exactly does the ML do so that the objects generated from the ArUco marker remain in their position even after I remove the marker or move away from it?

From the code I understand that ML creates its own coordinate system. But what is the starting point (origin) of this system, and is it possible to move the coordinate system’s origin (0,0,0) to the scanned marker?

My last question is: Is it possible to use infrared markers to help the ML orient itself in Spaces that look almost identical, so that it does not switch between them?
More generally, could infrared markers help the headset improve its spatial orientation?

Greetings, and thanks for the help :slight_smile:

]]>
https://forum.magicleap.cloud/t/localization-without-space/6956#post_1 Wed, 11 Feb 2026 18:57:29 +0000 forum.magicleap.cloud-post-17189
Hub 3 data download from ML2 sometimes takes extremely long over the stock cable — best practices? Hey @lzy ,

This is a common issue with the way ADB pulls many small files from a device.
For example, transferring one 300 MB file is much faster than transferring a 300 MB folder of many files (hundreds or thousands). The amount of data may be the same, but performance is much different when dealing with one file versus many files at once.

One suggestion would be to take the many small image files and put them inside a single container like a .zip, .tar, or another format. You would then need to handle decompression/extraction, but the transfer time should be much shorter.
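The packing step can be sketched like this (a host-side illustration using Python's tarfile module; on the device itself, `adb shell tar` can do the same job if `tar` is available in the device shell):

```python
import tarfile
from pathlib import Path

# Pack a folder of many small files (e.g. saved camera frames) into one
# .tar.gz so a single large transfer replaces thousands of small ones.
def pack_folder(src, archive):
    """Tar-gzip every file under src; return the number of files packed."""
    count = 0
    with tarfile.open(archive, "w:gz") as tar:
        for p in sorted(Path(src).rglob("*")):
            if p.is_file():
                tar.add(p, arcname=str(p.relative_to(src)))
                count += 1
    return count
```

You would then pull the single archive (one `adb pull` of one file) and extract it on the host.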

Let me know if you have any other issues or if you find a solution that works for you!

Best,
Corey

]]>
https://forum.magicleap.cloud/t/hub-3-data-download-from-ml2-sometimes-takes-extremely-long-over-the-stock-cable-best-practices/6925#post_2 Wed, 11 Feb 2026 18:30:31 +0000 forum.magicleap.cloud-post-17188
How to override Magic Leap 2 head tracking with external OptiTrack position data? So using OptiTrack’s external tracking to support the internal tracking of the ML2 is not possible? Would be great to use it for large spaces we’re working in.

]]>
https://forum.magicleap.cloud/t/how-to-override-magic-leap-2-head-tracking-with-external-optitrack-position-data/6898#post_4 Wed, 04 Feb 2026 08:15:19 +0000 forum.magicleap.cloud-post-17162
Hologram Drift Issue When Tracking Object with Retroreflective Markers using Depth Camera (Raw data) Hi Alessandro,

We ran into an issue with the GitHub project: it seems to depend heavily on an “ml2irtrackingplugin”, which my co-coder can’t find anywhere in the project. Would it be possible for you to embed it in the repo and, if possible, include the source so I can compile it myself?

My teammate opened an issue on the GitHub repo with that question and a few others, but I wanted to add it here in case the notification for that issue did not go through.

Thanks again!

Graeme

]]>
https://forum.magicleap.cloud/t/hologram-drift-issue-when-tracking-object-with-retroreflective-markers-using-depth-camera-raw-data/5618#post_20 Tue, 03 Feb 2026 21:42:48 +0000 forum.magicleap.cloud-post-17160
How to override Magic Leap 2 head tracking with external OptiTrack position data? You cannot disable or override ML2’s head tracking. The device will show a “head pose lost” notification if the cameras do not have sufficient feature points to track the space.

You can use the Pixel Sensor API to obtain images from the world cameras, however they will continue to be used for tracking.

]]>
https://forum.magicleap.cloud/t/how-to-override-magic-leap-2-head-tracking-with-external-optitrack-position-data/6898#post_2 Tue, 03 Feb 2026 15:43:43 +0000 forum.magicleap.cloud-post-17152
Hub 3 data download from ML2 sometimes takes extremely long over the stock cable — best practices? Hi Magic Leap team,

I’m using Magic Leap Hub 3 (v3.0) to download recorded data from a Magic Leap 2 headset over USB-C (stock/original cable). The data is generated by our own application and consists of saved image files (e.g., camera frames) stored on the headset. Download sometimes proceeds normally, but occasionally becomes very slow.

  1. Magic Leap Hub: 3.0
  2. ML2 OS version: 1.12.0 (API Level 29)
  3. Computer OS: Windows 10
  4. Connection: direct USB-C port
  5. Dataset: ~300 MB, possibly consisting of many small image files

Are there known issues or best practices for transferring many small files (images) from ML2 via Hub 3 over USB-C?

Is there a supported faster method for bulk transfer of app data?

Thank you!

Jenny

]]>
https://forum.magicleap.cloud/t/hub-3-data-download-from-ml2-sometimes-takes-extremely-long-over-the-stock-cable-best-practices/6925#post_1 Mon, 02 Feb 2026 20:27:54 +0000 forum.magicleap.cloud-post-17151
Magic Leap 2 Withdrawal from EU market Hi there, does this news imply that ML2 will enter end-of-life soon?

Noah Schiffman.

]]>
https://forum.magicleap.cloud/t/magic-leap-2-withdrawal-from-eu-market/6727#post_4 Mon, 02 Feb 2026 20:27:23 +0000 forum.magicleap.cloud-post-17149
Magic Leap 2 No Longer for Sale in Japan? Hi there @cfest - Does this mean that ML2 will reach EoL sometime soon?

]]>
https://forum.magicleap.cloud/t/magic-leap-2-no-longer-for-sale-in-japan/6888#post_4 Mon, 02 Feb 2026 20:27:16 +0000 forum.magicleap.cloud-post-17148
Problem with Estimating Depth of IR Reflective Spheres

ML2 OS version: 1.12.1

I am trying to track tools with IR reflective spheres attached to them, using the raw and processed depth images streamed to a Python client. Specifically, I use the raw depth image to locate the spheres in 2D, then use the processed depth frame to estimate their 3D positions. The tracking only seems to work at one specific depth; at other depths, the detected sphere layout looks like a scaled-up or scaled-down version of my tool. I am using the following code to undistort the sphere centroids obtained from the raw depth image and to compute the ray that should point at each sphere's position in the depth camera's local space, inspired by this post: Processing the depth frames - Unity Development - Magic Leap 2 Developer Forums.

def undistort(self, input_pt):
    # Normalize pixel coordinates so the center of the image sits near (0, 0)
    xy = input_pt / self.resolution - np.double(0.5)
    r2 = np.sum(xy * xy)
    r4 = r2 * r2
    r6 = r4 * r2
    # Radial distortion
    xy_rd = xy * (1 + (self.d.k1 * r2) + (self.d.k2 * r4) + (self.d.k3 * r6))
    # Tangential distortion
    xtd = (2 * self.d.p1 * xy[0] * xy[1]) + (self.d.p2 * (r2 + (2 * xy[0] * xy[0])))
    ytd = (2 * self.d.p2 * xy[0] * xy[1]) + (self.d.p1 * (r2 + (2 * xy[1] * xy[1])))
    xy_rd[0] += xtd
    xy_rd[1] += ytd
    xy_rd += np.double(0.5)
    return (xy_rd * self.resolution - self.center) / self.focal

# Getting centroids from image moments
u = M["m10"] / M["m00"]
v = M["m01"] / M["m00"]

depth = raw_frame.depth[int(v), int(u)] * 1000  # convert to mm

uv = np.array([u, v])
unit_vec = self.undistort(uv)

ir_tool_centers.extend([unit_vec[0], unit_vec[1], depth])

# spheres_xyz is a reformatted version of ir_tool_centers
xyz = spheres_xyz[i, :].copy()   # [x_ray, y_ray, depth]
xyz[2] += cur_radius             # z' = depth + radius
temp_vec = np.array([xyz[0], xyz[1], 1])
spheres_xyz[i, :] = temp_vec / np.linalg.norm(temp_vec) * xyz[2]

Is this code correct? If not, what should I do to increase the robustness of the tracking?

]]>
https://forum.magicleap.cloud/t/problem-with-estimating-depth-of-ir-reflective-spheres/6923#post_1 Mon, 02 Feb 2026 20:09:07 +0000 forum.magicleap.cloud-post-17147
Controller Disconnecting Controller standby was and is off; the disconnect still occurs in nearly every session of testing my app.

]]>
https://forum.magicleap.cloud/t/controller-disconnecting/6842#post_4 Sun, 01 Feb 2026 22:32:40 +0000 forum.magicleap.cloud-post-17142
Asset Bundle Loading Issue @cfeist thank you for getting back to me. I hope this will be resolved in the future. The ironic thing is that the outdated HoloLens 2 has no problem with the new versions. I hope Unity is aware of this issue. I'm looking to update my Unity version to the latest again; maybe it will suddenly be fixed in version 6.4, who knows.

]]>
https://forum.magicleap.cloud/t/asset-bundle-loading-issue/6874#post_6 Thu, 29 Jan 2026 18:33:12 +0000 forum.magicleap.cloud-post-17127
How to override Magic Leap 2 head tracking with external OptiTrack position data? Hi everyone,

I’m trying to use an OptiTrack motion capture system to override the Magic Leap 2’s internal head tracking. The goal is to have the ML2 camera position/rotation controlled entirely by OptiTrack when tracking data is available, and fall back to ML2’s onboard tracking when OptiTrack is unavailable.

Setup

  • Middleware PC: Unity app receives OptiTrack rigid body data via NatNet (Unicast), then sends position/rotation as JSON over UDP to the ML2
  • Magic Leap 2: Unity app receives UDP packets and attempts to set the camera position
  • Data transmission works: UDP packets arrive correctly, I can see the position values in the logs

The Problem:

No matter what I try, the ML2’s internal head tracking keeps interfering. When I move my head, virtual objects wobble/shake because both tracking systems seem to fight for control. When OptiTrack is not sending data, the view is completely frozen (no tracking at all).

What I’ve Tried

  1. Disabling TrackedPoseDriver component on Main Camera at runtime
    • Result: Still wobbles
  2. Setting camera.transform.position directly in various callbacks:
    • Update() - wobbles
    • LateUpdate() - wobbles
    • Application.onBeforeRender with [BeforeRenderOrder(10000)] - wobbles
    • Camera.onPreCull - only ML2 tracking works, OptiTrack ignored
  3. Moving XR Origin (ML Rig) instead of camera
    • Calculated offset to compensate for local camera position
    • Result: Still wobbles
  4. Changing TrackedPoseDriver Update Type to “Update” only (not “Before Render”)
    • Result: Still wobbles
  5. Stopping XRInputSubsystem completely:
     var inputSubsystem = XRGeneralSettings.Instance.Manager
         .activeLoader.GetLoadedSubsystem<XRInputSubsystem>();
     inputSubsystem.Stop();
     • Result: Same behavior

Hierarchy

ML Rig (XR Origin)
└── Camera Offset
    └── Main Camera (TrackedPoseDriver: Update and Before Render)

Virtual objects are in world space (not parented to camera).

Questions:

How can I completely disable or override the ML2’s head tracking to use external position data from OptiTrack?

Is there a specific API or approach that allows manual control of the camera’s world position without the XR system overriding it?

Any help would be greatly appreciated!

Unity Editor version: 2022.3.67f2
ML2 OS version: 1.12.0
Unity SDK version: 2.6.0
Host OS: Windows

]]>
https://forum.magicleap.cloud/t/how-to-override-magic-leap-2-head-tracking-with-external-optitrack-position-data/6898#post_1 Wed, 28 Jan 2026 14:19:49 +0000 forum.magicleap.cloud-post-17120
Asset Bundle Loading Issue @brs_kes
I understand that downgrading your project is not an ideal solution; apologies for that!

But just to clarify, this issue is caused by changes made to the Unity Engine in version 6 and above. That’s why downgrading your project to a pre-Unity 6 version helped resolve the issue.

Unfortunately this isn’t something that Magic Leap can solve alone, since it’s a Unity Engine issue rather than something specific to Magic Leap.

Please let me know if you need any further assistance and best of luck to you with your project!

Best,
Corey

]]>
https://forum.magicleap.cloud/t/asset-bundle-loading-issue/6874#post_5 Tue, 27 Jan 2026 18:20:56 +0000 forum.magicleap.cloud-post-17114
Magic Leap 2 No Longer for Sale in Japan? Hey @huytn,

As part of our evolving journey and in response to changing market dynamics, we are focusing on deeper technology partnerships to create the next generation of AR solutions and discontinuing sales of Magic Leap 2 globally in March 2026. Please contact your reseller for specific order cutoff dates.

We are deeply grateful for your partnership and the impact we’ve made together with Magic Leap 2.

We are not announcing further product plans at this time.

]]>
https://forum.magicleap.cloud/t/magic-leap-2-no-longer-for-sale-in-japan/6888#post_3 Tue, 27 Jan 2026 18:17:51 +0000 forum.magicleap.cloud-post-17113
Asset Bundle Loading Issue Well, it’s not really a solution; I’d call it a total workaround. I had to downgrade my project, which was a big change, and rolled back to 2022.3.61. I’m hoping the problem will be solved properly, because I also had to downgrade other things like the Burst compiler, AR Foundation, etc. I hope the Magic Leap team is aware of this problem and that it will be solved in the future.

]]>
https://forum.magicleap.cloud/t/asset-bundle-loading-issue/6874#post_4 Tue, 27 Jan 2026 14:50:53 +0000 forum.magicleap.cloud-post-17112
Eye image capture rate Dear team,

Two quick questions about eye-image capture and data export on ML2.

1. Is 30 Hz the maximum eye-camera image capture rate, or is there a way to increase it further?

2. Downloading large numbers of eye images from the Hub is slow. Are there features that can accelerate this? Is it possible to export the eye captures as a video?

Thank you very much!

Zipai

]]>
https://forum.magicleap.cloud/t/eye-image-capture-rate/6892#post_1 Tue, 27 Jan 2026 00:57:51 +0000 forum.magicleap.cloud-post-17110
Magic Leap 2 No Longer for Sale in Japan? The sale of ML2 was also stopped in the EU last month: Magic Leap 2 Withdrawal from EU market

]]>
https://forum.magicleap.cloud/t/magic-leap-2-no-longer-for-sale-in-japan/6888#post_2 Mon, 26 Jan 2026 09:52:43 +0000 forum.magicleap.cloud-post-17106
Magic Leap 2 No Longer for Sale in Japan? Hi everyone,

We recently received information from our hardware distributor that Magic Leap 2 devices are no longer available for sale in Japan due to a halt in production.

We’ve also heard similar rumors elsewhere, so we wanted to ask the community (or Magic Leap team directly) for clarification. Is this a temporary situation, or has production and sales of Magic Leap 2 been permanently discontinued?

Any official confirmation or additional insight would be greatly appreciated.

Thank you!

]]>
https://forum.magicleap.cloud/t/magic-leap-2-no-longer-for-sale-in-japan/6888#post_1 Mon, 26 Jan 2026 02:19:16 +0000 forum.magicleap.cloud-post-17105
Controller Disconnecting I will try the standby setting when I get back to this.

It seems to be caused by a performance overload. The app I am working with can cause extreme lag; it's not a complete app and this is unintended, of course, but at some point the lag causes the controller to disconnect. Controller tracking does not show in the app or the home menu, though I can still get home via the wrist shortcut.

It's almost like the overload causes some tracking or controller-connection thread, or something along those lines, to stall or crash.

Sometimes it comes back by simply waiting (too long), but this is very rare and not consistent; at times I've waited a literal five minutes to see if waiting is viable. It's not.

Restarting the controller fixes it 9 times out of 10.

That last 1 in 10 requires a full system restart.

]]>
https://forum.magicleap.cloud/t/controller-disconnecting/6842#post_3 Sat, 24 Jan 2026 04:18:13 +0000 forum.magicleap.cloud-post-17098
Unity Marker Tracking Delayed When I explicitly set ArucoLength, the positional prediction is less accurate than with automatic length estimation. Any idea why that might be? ArucoLength is measured in meters, right? And it is measured from the marker's left-most black edge to its right-most black edge?

]]>
https://forum.magicleap.cloud/t/unity-marker-tracking-delayed/6806#post_3 Fri, 23 Jan 2026 19:05:39 +0000 forum.magicleap.cloud-post-17096