This combines point cloud rendering (each point is splatted to a single pixel) with CNN image processing (a deferred-rendering step that inpaints the holes between the projected points).
*Figure: live inference using a very tiny CNN (a.k.a. the Vanilla decoder), with only 100k points.*
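A minimal sketch of this two-stage idea, assuming a PyTorch-style pipeline (the names `splat_points` and `VanillaDecoder` below are hypothetical illustrations, not the actual `pixr` API): points are splatted to single pixels, then a tiny CNN inpaints the missing pixels.

```python
import torch


def splat_points(points_xy, features, height, width):
    """Splat each point to a single pixel (nearest-pixel, no depth test here).

    points_xy: [p, 2] pixel coordinates, features: [p, d] per-point vectors.
    Returns a sparse [d, H, W] image, with holes wherever no point lands.
    """
    image = torch.zeros(features.shape[1], height, width)
    x = points_xy[:, 0].long().clamp(0, width - 1)
    y = points_xy[:, 1].long().clamp(0, height - 1)
    image[:, y, x] = features.t()  # last point written to a pixel wins
    return image


class VanillaDecoder(torch.nn.Module):
    """A very tiny CNN that inpaints the holes between projected points."""

    def __init__(self, d_in=3, d_out=3):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Conv2d(d_in, 16, 3, padding=1),
            torch.nn.ReLU(),
            torch.nn.Conv2d(16, d_out, 3, padding=1),
        )

    def forward(self, sparse_images):  # [N, C, H, W]
        return self.net(sparse_images)
```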
### Local install of `pixr`
```bash
git clone https://github.com/balthazarneveu/per-pixel-point-rendering.git
cd per-pixel-point-rendering
pip install -r requirements.txt
pip install -e .
```
Run the demo on a pre-trained scene:

```bash
python scripts/novel_views_interactive.py -e 55 -t pretrained_scene
```
Scene data is stored in the `__data` folder. To synthesize views of a scene yourself, install BlenderProc and run the full-render study (e.g. 4 views of the `material_balls` scene in orbit mode):

```bash
pip install blenderproc
python studies/full_render.py -s material_balls -n 4 -m orbit
```
Each scene is configured in the rendering script, e.g.:

```python
if args.scene == "material_balls":
    config = {
        "distance": 4.,
        "altitude": 0.,
        "background_map": "__world_maps/city.exr"
    }
```
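Supporting another scene presumably means extending this configuration block; a hypothetical sketch, with `my_scene` and its values made up for illustration (`distance` and `altitude` presumably control the orbiting camera):

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("-s", "--scene", default="material_balls")
args = parser.parse_args()

if args.scene == "material_balls":
    config = {
        "distance": 4.,
        "altitude": 0.,
        "background_map": "__world_maps/city.exr"
    }
elif args.scene == "my_scene":  # hypothetical extra scene
    config = {
        "distance": 6.,   # camera further away
        "altitude": 0.5,  # slightly elevated orbit
        "background_map": "__world_maps/city.exr"
    }
```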
Code organization:

- rendering library
- synthesis / rasterizer
- learning
After defining an experiment in the `studies` folder, launch training:

```bash
python scripts/optimize_point_based_neural_renderer.py -e 70
```
Tensor shape conventions:

- images: `[N, C, H, W]`
- point clouds: `[M, p, d]`
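For illustration, a minimal sketch of these conventions (the sizes below are arbitrary, and the reading of `M`, `p`, `d` as clouds / points / feature dimension is my assumption):

```python
import torch

# Images: [N, C, H, W] = batch, channels, height, width (PyTorch convention).
images = torch.zeros(4, 3, 256, 256)

# Point clouds: [M, p, d] = assumed to be M clouds of p points,
# each carrying a d-dimensional vector (e.g. xyz, or pseudo-colors).
point_clouds = torch.zeros(1, 100_000, 3)
```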
*Figures: fuzzy depth test (varying $\alpha$ on a scene with two very close triangles), normal culling, and multiscale splatting.*
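As I read the figure, the fuzzy depth test keeps every candidate point whose depth lies within a relative tolerance $\alpha$ of the closest point in that pixel, rather than only the single closest one. A minimal per-pixel sketch of that idea (hypothetical, not the actual rasterizer code):

```python
import torch


def fuzzy_depth_test(depths, alpha=0.05):
    """Return a mask of the candidate points kept for one pixel.

    depths: [k] positive depths of all points falling into the same pixel.
    alpha = 0 reduces to the hard "closest point" test; a larger alpha lets
    points slightly behind the closest one contribute too, which matters
    when two surfaces are very close (as in the figure above).
    """
    z_min = depths.min()
    return depths <= z_min * (1. + alpha)


depths = torch.tensor([1.00, 1.02, 1.50])
print(fuzzy_depth_test(depths, alpha=0.05))  # tensor([ True,  True, False])
```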
To each point of the point cloud, we associate a color vector (later this vector will have a larger dimension: pseudo-colors instead of plain RGB).
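Concretely, this amounts to storing one learnable feature vector per point; a minimal sketch, assuming the per-point features are optimized jointly with the decoder weights:

```python
import torch

p, d = 100_000, 8  # d = 3 would be plain RGB; d > 3 gives pseudo-colors
point_features = torch.nn.Parameter(torch.zeros(p, d))

# Optimized jointly with the CNN decoder weights, e.g.:
decoder = torch.nn.Conv2d(d, 3, 3, padding=1)  # stand-in for the tiny CNN
optimizer = torch.optim.Adam([point_features, *decoder.parameters()], lr=1e-3)
```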
*Figures: rendered colored point cloud (novel view synthesis), next to the groundtruth shaded images used to pick each point's color so that the final rendering is faithful.*
*Figures: closest-point depth test vs. fuzzy depth test.*
To reproduce this demo:

```bash
python studies/interactive_projections.py -n 200000 -s test_aliasing
```

Note that sampling the point cloud from the mesh triangles can take some time.
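That sampling step is presumably the standard area-weighted barycentric scheme: pick triangles proportionally to their area, then draw uniform barycentric coordinates. A minimal sketch of that technique (not necessarily the exact code used in this repo; `sample_points_on_mesh` is a hypothetical helper):

```python
import torch


def sample_points_on_mesh(vertices, triangles, n_points):
    """Sample n_points uniformly on a triangle mesh.

    vertices: [v, 3] float tensor, triangles: [t, 3] long tensor of indices.
    """
    a = vertices[triangles[:, 0]]
    b = vertices[triangles[:, 1]]
    c = vertices[triangles[:, 2]]
    # Triangle areas -> sample triangles proportionally to their area.
    areas = 0.5 * torch.linalg.cross(b - a, c - a).norm(dim=1)
    tri_idx = torch.multinomial(areas, n_points, replacement=True)
    # Uniform barycentric coordinates via the square-root trick.
    u, v = torch.rand(n_points), torch.rand(n_points)
    su = u.sqrt()
    w0, w1, w2 = 1 - su, su * (1 - v), su * v
    return (w0[:, None] * a[tri_idx]
            + w1[:, None] * b[tri_idx]
            + w2[:, None] * c[tri_idx])
```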