
Detachable Novel Views Synthesis of Dynamic Scenes Using Distribution-Driven Neural Radiance Fields


DetRF

PyTorch implementation of the paper DetRF: Detachable Novel Views Synthesis of Dynamic Scenes Using Distribution-Driven Neural Radiance Fields (AAAI 2025).

Novel View Synthesis

Rendering Results

Comparison of novel view synthesis

Disentangled Results

Comparison of the static background disentangled from the entire scene

The Entire Scene

The Decoupled Background

Getting Started

1. Setup & Dependencies

The code was developed with Python == 3.8.8, PyTorch == 1.11.0, and CUDA == 11.3. The dependencies include (a sample install command follows the list):

  • scikit-image
  • opencv
  • imageio
  • cupy
  • kornia
  • configargparse
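
A minimal way to install these is via pip; the exact wheel names are assumptions (in particular, the OpenCV and CuPy packages should match your system and CUDA version):

pip install scikit-image opencv-python imageio cupy-cuda113 kornia configargparse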

Then download the NVIDIA Dynamic Scenes and Urban Driving datasets. The whole file structure should be:

DetRF
├── configs
├── logs
├── models
├── data
│   ├── NVIDIA
│   ├── URBAN
│   └── others
...

2. Train

python train.py --config configs/config_Handcart.txt 
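
If you want to train every available scene, a simple shell loop over the configs directory works; the existence of config files other than config_Handcart.txt is an assumption based on the example above:

# Hypothetical convenience loop: train each scene config in turn
for cfg in configs/config_*.txt; do
    python train.py --config "$cfg"
done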

3. Evaluation

The evaluation on the NVIDIA dataset focuses on synthesis across different viewpoints, while the evaluation on the Urban Driving dataset aims to interpolate time intervals (frames).

Evaluation on NVIDIA Dynamic Scenes

python evaluation_NV.py --config configs/config_Balloon1.txt 

Evaluation on Urban Driving Scenes

python evaluation_urban.py --config configs/config_Handcart.txt 
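
To evaluate several scenes of one dataset in a row, you can loop over the corresponding configs; scene names other than Balloon1 and Handcart are assumptions, so substitute whichever configs you actually trained:

# Hypothetical batch evaluation on NVIDIA Dynamic Scenes configs (scene list is an assumption)
for scene in Balloon1 Balloon2 Umbrella; do
    python evaluation_NV.py --config configs/config_${scene}.txt
done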

4. Novel view synthesis

fixed time and view interpolation:

python view_render.py --config configs/config_Handcart.txt --fixed_time --target_idx 15

time interpolation and fixed view:

python view_render.py --config configs/config_Handcart.txt --fixed_view --target_idx 15

time interpolation and view interpolation:

python view_render.py --config configs/config_Handcart.txt --no_fixed --target_idx 15
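
To sweep the rendering over several target frames, the command can be wrapped in a loop; the index values below are placeholders:

# Hypothetical sweep over target frame indices under a fixed time (values are placeholders)
for idx in 5 10 15 20; do
    python view_render.py --config configs/config_Handcart.txt --fixed_time --target_idx "$idx"
done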

5. Create other datasets

Use COLMAP to acquire camera poses and intrinsics. Then download the scripts for the flow and depth estimation models, RAFT and MiDaS; the pre-trained weights have already been added to the directory.

Pose transformation

python save_poses_nerf.py --data_path "/xxx/dense"  # data_path is the path to the COLMAP estimation results.

Depth estimation

python run_midas.py --data_path "/xxx/dense" --resize_height 272

Flow estimation

python run_flows_video.py --model models/raft-things.pth --data_path /xxx/dense
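
Putting the three preprocessing steps together, a minimal script for a new scene could look like this; DATA is a placeholder for your COLMAP dense output directory, and the step order simply follows the commands above:

# Hypothetical end-to-end preprocessing for a new scene
DATA=/xxx/dense                                    # COLMAP dense reconstruction folder
python save_poses_nerf.py --data_path "$DATA"      # convert COLMAP poses for NeRF training
python run_midas.py --data_path "$DATA" --resize_height 272                     # monocular depth with MiDaS
python run_flows_video.py --model models/raft-things.pth --data_path "$DATA"    # optical flow with RAFT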

Acknowledgements

The code is built upon:

Thanks for their great work.
