Jan 10, 2026 · 6 min read · In review

Simulation-driven weather for aligned LiDAR + RGB

A technical summary of how we drive weather effects from the simulation so every sensor sees the same state.

[Figure: LiDAR render in low-visibility conditions]

Simulation-driven weather so sensors stay aligned

Weather is a simulation problem before it is a sensor problem. If RGB and LiDAR are generated from different weather states, alignment breaks immediately. We keep a single simulation-owned weather state and advance it on a fixed timestep so every sensor samples the same conditions.
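A minimal sketch of what a single simulation-owned weather state advanced on a fixed timestep could look like. All names here (`WeatherState`, `step_weather`, `DT`) and the toy fog dynamics are illustrative assumptions, not the actual simulator API:

```python
from dataclasses import dataclass

DT = 1.0 / 30.0  # assumed fixed timestep in seconds


@dataclass
class WeatherState:
    """One weather state per sim step; sensors read it, never mutate it."""
    fog_density: float   # normalized [0, 1]
    precip_rate: float   # mm/h
    sim_step: int        # the step this state belongs to


def step_weather(state: WeatherState) -> WeatherState:
    """Advance the single weather state by exactly one fixed timestep.

    Every sensor rendered at step N+1 samples the returned state, so
    RGB, labels, depth, and LiDAR all see identical conditions.
    """
    # Placeholder dynamics: fog relaxes toward a target value. A real
    # controller would drive this from the scenario definition.
    target_fog = 0.3
    fog = state.fog_density + (target_fog - state.fog_density) * 0.1
    return WeatherState(fog, state.precip_rate, state.sim_step + 1)
```

Because the state is a value tied to a step index, "which weather did this frame see?" has exactly one answer.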

Design goals for synthetic weather

  • Single-step truth: RGB, labels, depth, and LiDAR sample the same sim step.
  • Deterministic: weather state is seeded by sim step to avoid drift.
  • Frame-budget aware: weather updates stay within a fixed timestep.
  • Controllable: weather parameters can be swept or toggled without reauthoring assets.
  • Dataset-ready: we can generate stable sequences at scale.
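The determinism goal can be sketched as deriving a per-step RNG from a run seed plus the sim step, so replaying a run reproduces identical weather noise. The helper name `weather_rng` and the hashing scheme are assumptions for illustration:

```python
import hashlib
import random


def weather_rng(run_seed: int, sim_step: int) -> random.Random:
    """Derive a deterministic RNG stream for one sim step.

    Hashing (run_seed, sim_step) means re-rendering step N later,
    e.g. for a different sensor, yields bit-identical weather noise,
    so sensors cannot drift apart across render passes.
    """
    key = f"{run_seed}:{sim_step}".encode()
    seed = int.from_bytes(hashlib.sha256(key).digest()[:8], "big")
    return random.Random(seed)


# Same step: identical samples. Different steps: independent streams.
a = weather_rng(42, 100).random()
b = weather_rng(42, 100).random()
assert a == b
assert weather_rng(42, 100).random() != weather_rng(42, 101).random()
```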

Pipeline summary

  • Weather controller owns the state (fog density, precipitation rate/type, wind, visibility).
  • Per-step updates in the simulation core so state advances deterministically.
  • Precipitation systems (rain, snow, hail) driven by the same state across sensors.
  • Materials + VFX bindings so the whole scene responds to a single parameter set.
  • Sensor sampling uses the same step + weather seed for RGB, labels, and LiDAR.
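The pipeline above could be sketched as a controller that is the single owner of the state: only `tick()` mutates it, and every sensor reads the same frozen snapshot. The class, its fields, and the uniform parameter ranges are hypothetical, not the production implementation:

```python
import random


class WeatherController:
    """Single owner of the weather state (hypothetical sketch).

    Sensors never roll their own weather: they call sample(), which
    returns the snapshot for the current step, so RGB, labels, and
    LiDAR agree by construction.
    """

    def __init__(self, run_seed: int):
        self.run_seed = run_seed
        self.step = 0
        self.state = {"fog_density": 0.0, "precip_rate": 0.0}

    def tick(self) -> None:
        # Seed per step so replaying the run reproduces the same state.
        rng = random.Random(f"{self.run_seed}:{self.step}")
        self.state = {
            "fog_density": rng.uniform(0.0, 0.6),   # normalized
            "precip_rate": rng.uniform(0.0, 20.0),  # mm/h
        }
        self.step += 1

    def sample(self) -> dict:
        # Every sensor calls this with no arguments: same step, same state.
        return dict(self.state)


ctrl = WeatherController(run_seed=7)
ctrl.tick()
rgb_view = ctrl.sample()
lidar_view = ctrl.sample()
assert rgb_view == lidar_view  # aligned by construction
```

Making `sample()` argument-free is the point of the design: a sensor cannot accidentally ask for a different step's weather.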

What we validate

  • Frame-to-frame toggles without jitter or label drift.
  • Low-visibility corner cases (dense fog, heavy precipitation).
  • Coverage for rare classes under weather stress.
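One of the checks above, frame-to-frame stability without drift, can be sketched as a replay-determinism test: rendering the same run twice must produce byte-identical weather sequences. `fog_at` and `check_replay_determinism` are stand-in names, assuming the per-step seeding described earlier:

```python
import random


def fog_at(run_seed: int, step: int) -> float:
    """Stand-in for sampling fog density at a given sim step."""
    return random.Random(f"{run_seed}:{step}").uniform(0.0, 1.0)


def check_replay_determinism(run_seed: int, steps: int) -> bool:
    """Replaying the same run twice must give identical weather;
    any mismatch would show up as label drift between render passes."""
    first = [fog_at(run_seed, s) for s in range(steps)]
    second = [fog_at(run_seed, s) for s in range(steps)]
    return first == second


assert check_replay_determinism(run_seed=3, steps=100)
```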

The full technical breakdown is on the way.

© 2025–2026 SiRLab. All rights reserved.