luisi-at opened 5 years ago
It works fine on my local machine (macOS 10.14.1, Intel Iris Graphics 650), but I get the same error for tests 20 to 25.
For example, in jjd06_logs_2019-02-08_15-51-10/log/test_20.log:

```
Loaded world with w=64, h=64
Stepping by dt=0.1 for n=1
Loaded world with w=64, h=64
Stepping by dt=0.1 for n=1
Found 1 platforms
Platform 0 : NVIDIA Corporation
Choosing platform 0
Found 1 devices
Device 0 : GeForce GTX 970
Choosing device 0
Exception : LoadWorld : File does not start with HPCEHeatWorldV0.
```
I had this issue on V5. The specification is vague about how the packed array should be created, and that turned out to be the cause of the error for me:

```c++
kernel.setArg(1, packed);
```

This is incorrect; the argument should instead be the buffer:

```c++
kernel.setArg(1, buffProperties);
```

with `packed` only passed in as part of the buffer write:

```c++
queue.enqueueWriteBuffer(buffProperties, CL_TRUE, 0, cbBuffer, &packed[0]);
```
This is because the array you pass to the OpenCL kernel needs to be declared as an OpenCL buffer (`cl::Buffer`). Obviously your argument index is likely to differ from mine above.
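Putting those pieces together, a minimal sketch of the host-side sequence might look like the following. The names `buffProperties`, `cbBuffer`, and argument index 1 match the snippets above; the context, queue, and kernel setup are assumed to exist already, so this is an illustration rather than the exact coursework code:

```cpp
#include <cstdint>
#include <vector>
#define CL_HPP_TARGET_OPENCL_VERSION 120
#define CL_HPP_MINIMUM_OPENCL_VERSION 120
#include <CL/cl2.hpp>  // the Khronos C++ bindings (older code uses CL/cl.hpp)

// Sketch only: assumes `context`, `queue`, and `kernel` are already created,
// and that `packed` holds the encoded cell properties.
void bindProperties(cl::Context &context, cl::CommandQueue &queue,
                    cl::Kernel &kernel, const std::vector<uint32_t> &packed)
{
    size_t cbBuffer = packed.size() * sizeof(uint32_t);

    // The kernel argument must be a cl::Buffer, not the host array itself.
    cl::Buffer buffProperties(context, CL_MEM_READ_ONLY, cbBuffer);

    // The host array is only used to fill the buffer...
    queue.enqueueWriteBuffer(buffProperties, CL_TRUE, 0, cbBuffer, &packed[0]);

    // ...and the buffer is what gets bound to the kernel.
    kernel.setArg(1, buffProperties);
}
```

Passing the raw host array compiles because `setArg` is templated, but the kernel then receives garbage instead of a device pointer, which is why the failure shows up at runtime rather than at build time.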
This also occurs when no output is piped to bin/render_world, likely because a segfault is happening somewhere in the code. In my case it was traced to a poorly bracketed condition:

```c++
(!((world.properties[index] & hpce::Cell_Fixed) || (world.properties[index] & hpce::Cell_Insulator)))
```

Check that the array is being indexed correctly during encoding, and that the kernel is indexing it correctly as well.
Ah, the old buffer vs what's being buffered problem. It's good that this kind of confusion is being addressed now so that these things become second nature for CW5 & 6!
System: macOS 10.14
GPU: AMD Radeon Pro 460
Running the v5 task code, I keep hitting an issue when trying to produce any output, be it binary, text, or .bmp. The code compiles successfully, but upon running it I get the following exception:
Has anyone else suffered from the same issue and how did you rectify it?
Thanks.