Closed Rognva1dr closed 3 years ago
My qLib_package_linux.json reads as follows:
{
    "recommends": "houdini_version >= '17.5.321'",
    "env": [
        { "QLIB": "/mnt/FRAMESTORE/common/dev/pipeline/houdini/qLib" },
        { "QOTL": "$QLIB/otls" },
        { "HOUDINI_OTLSCAN_PATH": "$QOTL/base:$QOTL/future:$QOTL/experimental:$HOUDINI_OTLSCAN_PATH" }
    ],
    "path": "$QLIB"
}
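As an aside, Houdini package files must be strict JSON, so trailing commas (which some editors happily accept) can make a package fail to parse. A quick sanity check, using only Python's built-in json module (a generic sketch, not part of qLib):

```python
import json

def check_package(path):
    """Try to parse a Houdini package file as strict JSON.

    Returns None if the file parses cleanly, or the parser's
    error message (with line/column info) if it does not.
    """
    try:
        with open(path) as f:
            json.load(f)
        return None
    except json.JSONDecodeError as e:
        return str(e)
```

Running this on a package file with a trailing comma reports the offending line and column, which is much easier to act on than Houdini silently ignoring the package.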
I checked the HDA files in qLib and found that camera_plane_ql causes this error: its Help page contains an illegal character on line 13. After I removed that line, the error went away.
The content of the problematic line is: [Image:opdef:Sop/qLib::camera_plane_ql::1?camera_plane_ql_sop_main.png]
Of course, even if this problem is not fixed, it doesn't affect the use of the node; it only breaks this node's Help page. :D
Thanks @ReimuSG - good to know it's not major.
Just saw a new production build, 18.5.696 - going to see if the message appears in this new build...
Yep - same output error in latest production build 18.5.696...
@ReimuSG which one is the illegal character on that line?
Thanks for tracking it down, though - I haven't had much free time recently, but I'll look into this as soon as I can.
@johnnyquest This problem appears on line 13 of the camera_plane_ql HDA Help page; its content is:
[Image:opdef:Sop/qLib::camera_plane_ql::1?camera_plane_ql_sop_main.png]
I tried removing the offending character, but that didn't seem to solve the problem... so I deleted the whole line.
@ReimuSG aaah, apparently it should be like this --
[Image:opdef:/qLib::Sop/camera_plane_ql::2?camera_plane_ql_sop_main.png]
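For context on why the old form breaks the help indexer: judging from the `_parse_opdef` frame in the traceback reported in this thread, the indexer splits the node-type portion of the opdef path on "/" and expects exactly a table name and a type name. A minimal reproduction of that failure mode (the function here is illustrative, not actual qLib or Houdini code):

```python
def parse_opdef_nodetype(nodetype):
    # The help indexer expects "Table/typename", e.g. "Sop/camera_plane_ql".
    # Unpacking the split into two names fails if there is no "/".
    table, name = nodetype.split("/")
    return table, name

# Well-formed: the corrected opdef path yields a "Table/typename" segment.
assert parse_opdef_nodetype("Sop/camera_plane_ql") == ("Sop", "camera_plane_ql")

# Malformed: a segment without "/" splits into a single element,
# so unpacking into two names raises the ValueError from the traceback.
try:
    parse_opdef_nodetype("camera_plane_ql")
except ValueError as e:
    print(e)  # not enough values to unpack (expected 2, got 1)
```

So the fix is purely in the opdef path's ordering: the "Sop/" table prefix has to sit on the node-type segment, not before the namespace.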
But it's just the camera plane object nodes right? I don't see anything in the frustum obj node. On it
Yep - this problem is only on camera_plane_ql & camera_plane_ql_2.
As for the frustum obj node, I don't see any errors...
oops, github closed the ticket automatically -- I just put up a new rolling release, @ReimuSG @Rognva1dr please test it and if it works I'll close the ticket again :) (v0.2.199)
Nice going! this problem has been solved, thanks for your timely maintenance :D @johnnyquest
cool cool, no worries, happy to help. closing this one then
Thank you @johnnyquest and @ReimuSG - I can confirm that worked perfectly.
No more errors - cheers!
Hi
I just installed the latest production build 18.5.672 py3 - and when reading the qLib package .json file, I get the following errors in my shell:
Exception in thread Thread-2:
Traceback (most recent call last):
  File "/opt/hfs18.5.672/python/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/opt/hfs18.5.672/houdini/python3.7libs/houdinihelp/api.py", line 589, in run
    indexer.update_with(w, pages, overlay=True)
  File "/opt/hfs18.5.672/houdini/python3.7libs/bookish/search.py", line 486, in update_with
    writer, pages, update_paths, needs_delete=changed
  File "/opt/hfs18.5.672/houdini/python3.7libs/bookish/search.py", line 516, in index_paths_with
    for doc in self.documents(pages, path):
  File "/opt/hfs18.5.672/houdini/python3.7libs/bookish/search.py", line 417, in documents
    jsondata = pages.json(path, postprocess=False)
  File "/opt/hfs18.5.672/houdini/python3.7libs/bookish/wiki/wikipages.py", line 607, in json
    self._pre_pipeline.apply(jsondata, wcontext)
  File "/opt/hfs18.5.672/houdini/python3.7libs/bookish/wiki/pipeline.py", line 118, in apply
    v.apply(block, context)
  File "/opt/hfs18.5.672/houdini/python3.7libs/houdinihelp/hpages.py", line 324, in apply
    self.apply(subblock, context)
  File "/opt/hfs18.5.672/houdini/python3.7libs/houdinihelp/hpages.py", line 320, in apply
    self.text(block["text"], context)
  File "/opt/hfs18.5.672/houdini/python3.7libs/houdinihelp/hpages.py", line 340, in text
    value = self._parse_opdef(context["path"], value)
  File "/opt/hfs18.5.672/houdini/python3.7libs/houdinihelp/hpages.py", line 284, in _parse_opdef
    table, nodetype = nodetype.split("/")
ValueError: not enough values to unpack (expected 2, got 1)
Not sure if this is qLib or Houdini - would you know?
Thank you for the help