The setInputSwizzle method of the Java KtxBasisParams class receives an array of Java char values. (The same applies to KtxAstcParams; I'm focusing on one case here.) In the JNI layer, this swizzling information is supposed to be transferred into the char inputSwizzle[4]; array of the ktxBasisParams structure.
This is done in this line, by casting the input value to a jbyteArray.
But ... the input is not a jbyteArray, it's a jcharArray.
The point is: the size of char in Java is 16 bits, while the size of char in C/C++ is 8 bits (usually; strictly, the standard only guarantees at least 8).
This means that the native inputSwizzle array receives "garbage" data. In the best case, the first two 16-bit Java char values are interpreted as four 8-bit C/C++ char values. But in fact, when compiling in Debug mode, calling the setInputSwizzle function triggers a debug assertion ("String subscript out of range", in xstring).
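To make the size mismatch concrete, here is a small standalone sketch (plain Java, not the actual JNI code): narrowing each Java char to one byte yields the intended "rgba", whereas viewing the raw 16-bit char data as bytes, which is effectively what the jbyteArray cast does, yields interleaved zero high bytes. The little-endian ByteBuffer here only emulates the typical native memory layout.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class SwizzleBytesDemo
{
    public static void main(String[] args)
    {
        char[] swizzle = { 'r', 'g', 'b', 'a' };

        // Correct conversion: narrow each 16-bit Java char to one 8-bit byte
        byte[] narrowed = new byte[swizzle.length];
        for (int i = 0; i < swizzle.length; i++)
        {
            narrowed[i] = (byte) swizzle[i];
        }
        System.out.println(new String(narrowed)); // rgba

        // What reinterpreting the char data as bytes yields instead:
        // the raw UTF-16 code units, with a zero high byte per character
        // (little-endian order emulates a typical x86 layout)
        ByteBuffer bb = ByteBuffer.allocate(swizzle.length * 2)
            .order(ByteOrder.LITTLE_ENDIAN);
        bb.asCharBuffer().put(swizzle);
        byte[] raw = bb.array();

        // The first four bytes are 'r', 0, 'g', 0 - not "rgba"
        System.out.println(raw[0] + " " + raw[1] + " " + raw[2] + " " + raw[3]);
    }
}
```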
The following is a small test case: It generates an image that contains a red, green, blue, and white "stripe", compresses it, and writes it out.
package de.javagl.jktx;

import java.io.IOException;
import java.nio.file.Path;
import java.nio.file.Paths;

import org.khronos.ktx.KtxBasisParams;
import org.khronos.ktx.KtxCreateStorage;
import org.khronos.ktx.KtxErrorCode;
import org.khronos.ktx.KtxTexture2;
import org.khronos.ktx.KtxTextureCreateInfo;
import org.khronos.ktx.VkFormat;

public class KtxSwizzleTest
{
    static
    {
        // The ktx-jni library must be on the java.library.path
        // (System.loadLibrary does not accept a path with separators)
        System.loadLibrary("ktx-jni");
    }

    private static int sizeX = 64;
    private static int sizeY = 64;

    public static void main(String[] args) throws IOException
    {
        Path output = Paths.get("./data/output_swizzle.ktx");

        KtxTextureCreateInfo info = new KtxTextureCreateInfo();
        info.setBaseWidth(sizeX);
        info.setBaseHeight(sizeY);
        info.setVkFormat(VkFormat.VK_FORMAT_R8G8B8A8_SRGB);
        KtxTexture2 t = KtxTexture2.create(info, KtxCreateStorage.ALLOC);

        byte[] rgba = createRgba();
        t.setImageFromMemory(0, 0, 0, rgba);

        KtxBasisParams p = new KtxBasisParams();
        p.setVerbose(true);
        p.setUastc(false);
        p.setInputSwizzle(new char[]{ 'r', 'g', 'b', 'a' });
        //p.setInputSwizzle(new char[]{ 'b', 'r', 'g', 'a' });

        int rc = t.compressBasisEx(p);
        if (rc != KtxErrorCode.SUCCESS)
        {
            throw new RuntimeException("basis error " + rc);
        }
        t.writeToNamedFile(output.toAbsolutePath().toString());

        System.out.println("Done, destroying");
        t.destroy();
    }

    // Create the RGBA pixels for an image that contains
    // 16 rows of red pixels
    // 16 rows of green pixels
    // 16 rows of blue pixels
    // 16 rows of white pixels
    private static byte[] createRgba()
    {
        byte[] rgba = new byte[sizeX * sizeY * 4];
        fillRows(rgba, 0, 16, 255, 0, 0, 255); // Red
        fillRows(rgba, 16, 32, 0, 255, 0, 255); // Green
        fillRows(rgba, 32, 48, 0, 0, 255, 255); // Blue
        fillRows(rgba, 48, 64, 255, 255, 255, 255); // White
        return rgba;
    }

    private static void fillRows(byte[] rgba, int min, int max,
        int r, int g, int b, int a)
    {
        for (int y = min; y < max; y++)
        {
            for (int x = 0; x < sizeX; x++)
            {
                int index = (y * sizeX) + x;
                rgba[index * 4 + 0] = (byte) r;
                rgba[index * 4 + 1] = (byte) g;
                rgba[index * 4 + 2] = (byte) b;
                rgba[index * 4 + 3] = (byte) a;
            }
        }
    }
}
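For reference, here is my reading of what an input swizzle is supposed to do: each entry of the swizzle selects the source channel that feeds the corresponding destination channel. The applySwizzle helper below is hypothetical, only for illustration, and not part of the libktx API:

```java
public class SwizzleSemanticsDemo
{
    // Hypothetical helper illustrating the assumed swizzle semantics:
    // entry c of the swizzle names the source channel that is written
    // into destination channel c, for each RGBA pixel.
    static byte[] applySwizzle(byte[] rgba, char[] swizzle)
    {
        String channels = "rgba";
        byte[] result = new byte[rgba.length];
        for (int p = 0; p < rgba.length; p += 4)
        {
            for (int c = 0; c < 4; c++)
            {
                result[p + c] = rgba[p + channels.indexOf(swizzle[c])];
            }
        }
        return result;
    }

    public static void main(String[] args)
    {
        byte[] red = { (byte) 255, 0, 0, (byte) 255 };

        // The identity swizzle "rgba" must leave the pixel unchanged
        byte[] same = applySwizzle(red, new char[]{ 'r', 'g', 'b', 'a' });

        // The swizzle "brga" moves the red value into the green channel
        byte[] moved = applySwizzle(red, new char[]{ 'b', 'r', 'g', 'a' });

        System.out.println(java.util.Arrays.toString(same));
        System.out.println(java.util.Arrays.toString(moved));
    }
}
```

This is why the identity swizzle in the test case above should not change the output image at all, and why a changed output indicates a bug.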
The crucial line is the one where setInputSwizzle is called (the two variants there can be commented in or out for the test).
Without calling setInputSwizzle, everything is fine (showing the PNG versions of the resulting KTX files here):
When calling setInputSwizzle(new char[]{ 'r', 'g', 'b', 'a' }); (which should not change anything!), the output is this: