AR System on Raspberry Pi
Developing on an existing XR headset (e.g., Meta Quest) is usually straightforward: the manufacturer provides convenient APIs that developers can pick up and use, and software standards like OpenXR offer the same convenience. While convenient, these frameworks also limit what can be done with an XR device. The restrictions are even harsher on AR and MR devices, since giving developers direct access to camera streams is considered risky. To unlock the full potential of AR/MR capabilities, I built the conceptual prototype below on a Raspberry Pi 5.
The device consists of the basic components of an AR system: cameras, an IMU, a screen, and a portable power supply built on lithium batteries. These parts are held together by a simple 3D-printed housing plus a few screws and nuts.
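An AR device needs a head-pose estimate from the IMU, and a common starting point is a complementary filter that blends the fast-but-drifting gyroscope with the slow-but-stable gravity reading from the accelerometer. The sketch below is purely illustrative (the function name, the simulated sensor values, and the 0.98 blend factor are my assumptions, not anything from this project's actual code):

```python
import math

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One filter step: integrate the gyro rate, then nudge the
    result toward the accelerometer's gravity-derived pitch.

    alpha near 1 trusts the gyro short-term; the (1 - alpha)
    accelerometer term corrects long-term drift.
    """
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Toy stream: device held still, tilted 10 degrees; the estimate
# starts at 0 and converges toward the accelerometer's answer.
pitch = 0.0
for _ in range(500):
    accel_pitch = 10.0   # degrees, implied by the gravity vector
    gyro_rate = 0.0      # deg/s, stationary device
    pitch = complementary_filter(pitch, gyro_rate, accel_pitch, dt=0.01)
print(round(pitch, 1))   # converges to ~10.0
```

On real hardware the inputs would come from an I2C IMU driver at a fixed sample rate; without hardware synchronization to the cameras (see below), the timestamps of these samples have to be aligned in software.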
Of course, building a device from the ground up means losing access to some features that come ready-made in commercial XR headsets, such as low glass-to-glass-latency camera streams or hardware synchronization between the cameras and the IMU. Nevertheless, running on a full Linux system, the device offers maximal development freedom and allows flexible customization of the camera sensors, the 2D/3D rendering pipelines, and the image composition.
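The image-composition step mentioned above amounts to alpha-blending a rendered RGBA layer over each camera frame before it reaches the screen. Here is a minimal NumPy sketch of that step with toy data; on the Pi the frames would typically come from the libcamera/Picamera2 stack, and the function name and shapes are my own illustrative choices:

```python
import numpy as np

def composite(camera_frame, overlay_rgba):
    """Alpha-blend a rendered RGBA overlay onto an RGB camera frame.

    out = overlay * alpha + camera * (1 - alpha), per pixel.
    """
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = (overlay_rgba[..., :3].astype(np.float32) * alpha
               + camera_frame.astype(np.float32) * (1.0 - alpha))
    return blended.astype(np.uint8)

# Toy data: a uniform gray "camera frame" and a ~50%-transparent red overlay.
frame = np.full((4, 4, 3), 100, dtype=np.uint8)
overlay = np.zeros((4, 4, 4), dtype=np.uint8)
overlay[..., 0] = 255   # red channel
overlay[..., 3] = 128   # ~50% alpha
out = composite(frame, overlay)
```

Doing this composition yourself, rather than through a headset vendor's compositor, is exactly the kind of flexibility the Linux-based design buys: the blend can be moved to the GPU, reordered, or replaced entirely.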