Closed AlchemistRS closed 2 years ago
Oh good catch! That is a Python2 -> Python3 conversion I was missing. (And I didn't have a test for that.)
If I disable Allow Geo Archives on the ROP it works, but in GPU mode it runs out of memory for some reason. It's a simple 3-sphere + area-light scene.
Is there a way of stopping the render in case of an error without using the Task Manager to end the task?
BTW: Here is my config.
And the packages JSON placed in "Documents\houdini18.5\packages":
{
    "env": [
        { "HOUDINI_PATH": "$HOUDINI_PACKAGE_PATH/../PBRT-v4" },
        { "PATH": "$HOUDINI_PACKAGE_PATH/../PBRT-v4/bin/" }
    ]
}
If you want to declare an env variable you can do something like this.
{ "PBRT_V4" : "$HOUDINI_PACKAGE_PATH/../PBRT-v4" }
Then for example use it like this.
"${PBRT_V4}/bin"
Then you can use $PBRT_V4 inside Houdini.
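Putting the two pieces together, a complete packages file using the env variable might look like the sketch below (the PBRT_V4 name and paths are just the ones from the snippets above; adjust them to your install):

```json
{
    "env": [
        { "PBRT_V4": "$HOUDINI_PACKAGE_PATH/../PBRT-v4" },
        { "HOUDINI_PATH": "${PBRT_V4}" },
        { "PATH": "${PBRT_V4}/bin" }
    ]
}
```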
I pushed a fix for the Ply failing to write in Python 3.
For killing renders that are running I tend to use the Render Scheduler pane in Houdini.
Regarding running out of memory with 3 spheres and an area light, are they primitive spheres? Or polygons? If polygons how many prims/points?
I'll do some tests on Windows later today to make sure everything is working (since I primarily work on Linux).
Also if you have an example hip file I'd be happy to take a look as well.
Yes, the Render Scheduler. Thanks.
I believe it ran out of memory because of me being dumb: I kept hitting the render button thinking it was going to update the viewport. As it happened, I had the Task Manager open while testing and realized there were a whole bunch of pbrt.exe instances running. Sorry, user error. LOL
Thanks again.
This is the error I have now.
soho/python3.7\PBRTgeo.py", line 581, in save_ply
    data_pool[offset::num_elements] = tmp[0::3]
ValueError: attempt to assign array of size 77124 to extended slice of size 12874
If I increase Geo. File Threshold from 10000 to 100000, there are no errors; it renders and outputs a .pbrt, but no .ply files or geometry folder are created.
I just pushed a fix for this. I was stashing some point / prim counts before further processing the geometry which changed the counts.
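For context, the ValueError above is Python's extended-slice rule: an assignment to a stepped slice must supply exactly as many elements as the slice selects, which is why stale point/prim counts break the write. A minimal reproduction (the sizes here are made up, not the exporter's):

```python
import array

# Interleaved pool with 3 floats per point and 4 points -> 12 slots
num_elements, num_points = 3, 4
data_pool = array.array("f", [0.0] * num_elements * num_points)

# The stepped slice [0::3] selects only 4 of the 12 slots...
tmp = array.array("f", [1.0] * 12)
try:
    data_pool[0::num_elements] = tmp  # ...but tmp has 12 elements
except ValueError as exc:
    print(exc)

# Matching sizes works: tmp[0::3] also has exactly 4 elements
data_pool[0::num_elements] = tmp[0::3]
```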
Normally when exporting we generate the polygon shapes directly into the pbrt file. However, for large geometries this is slow, and ASCII text isn't the best format for it. So the Geo. File Threshold is an option to start creating Ply files if your geometry has more than 10,000 polygons. Picking 10,000 by default was somewhat arbitrary, a balance between huge .pbrt files vs. having a bunch of individual files sitting around.
So if you always want inline geometry and never want ply files created, you can just set the threshold really high.
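As a rough sketch of that decision (the function and constant names here are illustrative, not the exporter's actual API):

```python
# Hypothetical sketch of the Geo. File Threshold behaviour described above.
DEFAULT_GEO_FILE_THRESHOLD = 10000

def choose_export_mode(num_polygons, threshold=DEFAULT_GEO_FILE_THRESHOLD):
    """Inline small meshes into the .pbrt text; spill big ones to binary .ply."""
    return "ply" if num_polygons > threshold else "inline"

print(choose_export_mode(100))                     # small mesh -> inline
print(choose_export_mode(25708))                   # big mesh -> ply
print(choose_export_mode(25708, threshold=10**9))  # huge threshold -> always inline
```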
Working, thank you very much. I use inline geos.
pbrt-v4 + Cuda 11.4 + OptiX 7.3 + Houdini 18.5 Py3
Receiving this error if I try to render or save a pbrt scene using either the pbrt or pbrtarchive ROPs.
Python:
soho/python3.7\PBRTgeo.py", line 573, in save_ply
    f_handle.write("\r\n".join(header))
TypeError: a bytes-like object is required, not 'str'
Houdini Console: Error: Missing end to AttributeBegin. Error: Fatal errors during scene construction.
It does manage to write the .pbrt file. Data looks to be ok inside the file. It also writes out an empty .ply file.
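The TypeError comes from writing a str header into a file opened in binary mode; the usual fix is to encode the header rather than switch the file to text mode, since text mode would corrupt the binary vertex data that follows the header. A sketch of the likely fix, using io.BytesIO in place of the real file:

```python
import io

header = ["ply", "format binary_little_endian 1.0", "end_header"]

f_handle = io.BytesIO()  # stands in for open(path, "wb")
try:
    f_handle.write("\r\n".join(header))  # str into a binary stream fails
except TypeError as exc:
    print(exc)

# Encoding the header keeps the stream binary for the data that follows
f_handle.write("\r\n".join(header).encode("ascii"))
print(f_handle.getvalue()[:3])
```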
If I change line 543 from
    with open(path, "wb") as f_handle:
to
    with open(path, "w") as f_handle:
It writes out a .ply file whose header reads like so.
ply
format binary_little_endian 1.0
element vertex 12874
property float x
property float y
property float z
property float nx
property float ny
property float nz
property float u
property float v
element face 25708
property list int int vertex_indices
end_header
The errors change to this.
Python:
soho/python3.7\PBRTgeo.py", line 576, in save_ply
    data_pool.frombytes("\x00" * 4 * num_elements * self.num_points)
TypeError: a bytes-like object is required, not 'str'
Houdini Console: Error: Missing end to AttributeBegin. Error: Fatal errors during scene construction.
The .pbrt file looks ok.
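This second TypeError is the same str-vs-bytes issue in a different spot: in Python 3, array.frombytes only accepts a bytes-like object, so the zero padding has to be a bytes literal. A minimal sketch (the sizes here are invented, not the exporter's):

```python
import array

num_elements, num_points = 3, 4
data_pool = array.array("f")

# "\x00" * n is a str in Python 3; frombytes rejects it
try:
    data_pool.frombytes("\x00" * 4 * num_elements * num_points)
except TypeError as exc:
    print(exc)

# b"\x00" * n is bytes: 48 zero bytes -> 12 zeroed 4-byte floats
data_pool.frombytes(b"\x00" * 4 * num_elements * num_points)
print(len(data_pool))
```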