Sunrise over the lunar south pole, rendered in real-time in your browser.
Over the weekend, I built a new synthetic image generator for training Tycho Orbital's autonomous optical navigation models. This replaces our previous THREE.js setup with a custom SDF-based fragment shader (WebGPU).
Key upgrades:
- Custom SDF-based fragment shader with raymarching for physically accurate shadows
- Uses NASA's latest 2025 global lunar color texture
- Runs in real time (>60 fps)
- Sub-meter surface displacement precision from 16-bit DEMs
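The shadow approach in the list above boils down to sphere tracing: march a ray through the signed distance field, then cast a secondary ray from each hit point toward the sun to test for occlusion. Here is a minimal, illustrative sketch in Python rather than the actual WebGPU shader; the scene (a single sphere SDF) and all function names (`sphere_sdf`, `raymarch`, `in_shadow`) are hypothetical stand-ins for the real lunar heightfield.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 0.0), radius=1.0):
    # Signed distance from point p to a sphere (negative inside).
    dx, dy, dz = (p[i] - center[i] for i in range(3))
    return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

def raymarch(origin, direction, sdf, max_steps=128, eps=1e-4, max_dist=100.0):
    # Sphere tracing: the SDF value is a safe step size, so advance
    # the ray by it until we converge on a surface or escape the scene.
    t = 0.0
    for _ in range(max_steps):
        p = tuple(origin[i] + t * direction[i] for i in range(3))
        d = sdf(p)
        if d < eps:
            return t      # hit at distance t
        t += d
        if t > max_dist:
            break
    return None               # miss

def in_shadow(point, sun_dir, sdf, bias=1e-3):
    # Offset slightly along the shadow ray to avoid self-intersection,
    # then march toward the sun; any hit means the point is shadowed.
    origin = tuple(point[i] + bias * sun_dir[i] for i in range(3))
    return raymarch(origin, sun_dir, sdf) is not None
```

In the real shader the same logic runs per fragment against the DEM-displaced lunar surface, which is what makes the terminator shadows physically consistent with the terrain rather than baked or approximated.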
This approach is a significant step up in visual quality, and we plan to share a new training dataset on Hugging Face in the next couple of weeks.
Question for lunar mission teams: Would real-time, high-fidelity lunar rendering (with support for importing custom DEMs for regions of interest, simulating orbits, tracking surface and orbital assets, etc.) be useful for your synthetic data generation, mission planning, or situational awareness workflows?
Curious to hear how teams are approaching these problems.