Novel LiDAR View Synthesis

3 papers with code • 0 benchmarks • 0 datasets

Synthesize a novel LiDAR point-cloud frame at an arbitrary sensor pose, given source point clouds and their corresponding LiDAR sensor poses. For dynamic scenes, the task also covers synthesis across both space and time, i.e. novel spatial and temporal views.
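As a point of reference, the simplest (non-learned) baseline for this task is geometric reprojection: transform the source points into the novel sensor frame via the two poses, then rasterize them into a range image using the sensor's spherical projection model. This is a minimal sketch with assumed parameters (a 32-beam sensor with a +15°/−25° vertical field of view); it is not the method of any of the listed papers, which instead learn a neural field so that occlusions and missing returns are handled properly.

```python
import numpy as np

def transform_points(points, src_pose, tgt_pose):
    """Move (N, 3) points from the source sensor frame to the target sensor frame.

    Both poses are 4x4 sensor-to-world matrices.
    """
    world = (src_pose[:3, :3] @ points.T).T + src_pose[:3, 3]
    tgt_inv = np.linalg.inv(tgt_pose)
    return (tgt_inv[:3, :3] @ world.T).T + tgt_inv[:3, 3]

def to_range_image(points, h=32, w=1024,
                   fov_up=np.deg2rad(15.0), fov_down=np.deg2rad(-25.0)):
    """Spherically project (N, 3) points into an (h, w) range image.

    Rows index elevation (laser beams), columns index azimuth.
    Sensor geometry here is an assumption, not tied to a specific dataset.
    """
    x, y, z = points.T
    r = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(y, x)                       # azimuth in [-pi, pi]
    pitch = np.arcsin(z / np.maximum(r, 1e-9))   # elevation angle
    u = ((1.0 - (yaw / np.pi + 1.0) / 2.0) * w).astype(int) % w
    v = ((fov_up - pitch) / (fov_up - fov_down) * h).clip(0, h - 1).astype(int)
    img = np.zeros((h, w))
    order = np.argsort(-r)        # draw far points first ...
    img[v[order], u[order]] = r[order]  # ... so the nearest return wins per pixel
    return img

# Example: reproject one source frame to a novel pose 1 m ahead along x.
pts = np.array([[10.0, 0.0, 0.0], [0.0, 5.0, -1.0]])
src_pose = np.eye(4)
tgt_pose = np.eye(4); tgt_pose[0, 3] = 1.0
novel_pts = transform_points(pts, src_pose, tgt_pose)
range_img = to_range_image(novel_pts)
```

The reason this baseline falls short, and why the papers below model the scene with neural fields, is that reprojection cannot fill pixels that were occluded or unsampled in the source scans, so the synthesized range image is sparse and hole-ridden at poses far from the sources.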

Most implemented papers

LiDAR-NeRF: Novel LiDAR View Synthesis via Neural Radiance Fields

tangtaogo/lidar-nerf 20 Apr 2023

We address this challenge by formulating, to the best of our knowledge, the first differentiable end-to-end LiDAR rendering framework, LiDAR-NeRF, leveraging a neural radiance field (NeRF) to facilitate the joint learning of geometry and the attributes of 3D points.

PC-NeRF: Parent-Child Neural Radiance Fields Using Sparse LiDAR Frames in Autonomous Driving Environments

biter0088/pc-nerf 14 Feb 2024

Through extensive experiments, PC-NeRF is shown to achieve high-precision novel LiDAR view synthesis and 3D reconstruction in large-scale scenes.

LiDAR4D: Dynamic Neural Fields for Novel Space-time View LiDAR Synthesis

ispc-lab/lidar4d 3 Apr 2024

In light of this, we propose LiDAR4D, a differentiable LiDAR-only framework for novel space-time LiDAR view synthesis.