BinomialLLC / basis_universal

Basis Universal GPU Texture Codec

Encoder frontend fails only for ETC1S encoding #300

Open roy-t opened 2 years ago

roy-t commented 2 years ago

I'm trying to integrate basisu as a static library. To do so I've created the class listed below. The strange thing is that everything seems to work when I use UASTC encoding, but when I use ETC1S encoding the compressor fails with cECFailedFrontEnd.

The source file is a 1024x1024 TGA file with an alpha channel. When I build basisu as a command line tool and try to encode it, I encounter no problems.

I guess this means I'm doing something wrong, but I can't figure it out. The basis set-up in my code looks exactly the same as in the README.

I'm using the latest version in this repository at this time (commit 1f4b564ae0b8ec1e67f21f058e53112d0405a25a).

#include <basisu_enc.h>
#include <basisu_comp.h>
// ... snip
using namespace basisu;

void NativeEncoder::Init()
{
    basisu_encoder_init();
}

const uint8_vec NativeEncoder::Encode(const std::string &filename)
{
    image source_image;
    if (!load_image(filename, source_image)) { throw std::exception(); }

    basis_compressor_params params;
    params.m_source_images.push_back(source_image);
    params.m_perceptual = false;
    params.m_mip_srgb = false;

    params.m_write_output_basis_files = true;
    params.m_out_filename = "test.basis";

    params.m_uastc = false; // If I put true here everything works fine!
    params.m_rdo_uastc_multithreading = false;
    params.m_multithreading = false;
    params.m_debug = true;
    params.m_status_output = true;
    params.m_compute_stats = true;

    job_pool jpool(1);
    params.m_pJob_pool = &jpool;

    basis_compressor compressor;

    if (!compressor.init(params)) { throw std::exception(); }

    auto result = compressor.process();
    if (result != basisu::basis_compressor::cECSuccess)
    {
        // with params.m_uastc = false, this results in result == cECFailedFrontEnd
        throw std::exception();
    }

    return compressor.get_output_basis_file();
}

void NativeEncoder::Deinit()
{
    basisu::basisu_encoder_deinit();
}
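
For reference, here is roughly how the wrapper is driven (just a sketch; I'm assuming Init/Encode/Deinit are plain non-static members as shown and that the caller keeps the returned buffer around):

NativeEncoder encoder;
encoder.Init(); // one-time basisu_encoder_init()

// Encode a source image; the returned uint8_vec holds the .basis file data.
const uint8_vec basis_data = encoder.Encode("texture.tga"); // hypothetical input path

// ... write basis_data to disk or hand it to the transcoder ...

encoder.Deinit();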
roy-t commented 2 years ago

After comparing the values in my basis_compressor_params with those generated by the command line tool, I figured it out.

By default, m_quality_level is set to -1. In the command line tool there is an extra check to make sure the quality level ends up with a reasonable value:

        if (m_comp_params.m_quality_level != -1)
        {
            m_comp_params.m_max_endpoint_clusters = 0;
            m_comp_params.m_max_selector_clusters = 0;
        }
        else if ((!m_comp_params.m_max_endpoint_clusters) || (!m_comp_params.m_max_selector_clusters))
        {
            m_comp_params.m_max_endpoint_clusters = 0;
            m_comp_params.m_max_selector_clusters = 0;

            m_comp_params.m_quality_level = 128;
        }

This check should probably be moved into basisu_comp.cpp to make sure m_quality_level always has a reasonable value. I think it should be placed at basisu_comp.cpp:1045 (the start of the bool basis_compressor::process_frontend() method).
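
Roughly what I mean (just a sketch, reusing the field names from the command line snippet above and assuming the compressor keeps its copy of the params in m_params):

        // Sketch: same guard as in the command line tool, moved to the start of
        // basis_compressor::process_frontend() so library users get it too.
        if (m_params.m_quality_level != -1)
        {
            m_params.m_max_endpoint_clusters = 0;
            m_params.m_max_selector_clusters = 0;
        }
        else if ((!m_params.m_max_endpoint_clusters) || (!m_params.m_max_selector_clusters))
        {
            m_params.m_max_endpoint_clusters = 0;
            m_params.m_max_selector_clusters = 0;

            m_params.m_quality_level = 128;
        }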

I'll try to create an MR to fix this.

This also explains why it only happens in ETC1S mode, as the -q / m_quality_level setting only applies to ETC1S.
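
Until that change lands, the workaround on the caller side is simply to set the quality level explicitly before compressor.init() (one line against the Encode() method above; 128 just mirrors the command line tool's fallback):

    // Workaround: give the ETC1S frontend an explicit quality level instead of
    // relying on the -1 default.
    params.m_quality_level = 128;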

richgel999 commented 2 years ago

Sorry for the confusion - the encoder was first written as part of the command line tool, then over time it got separated out. I will integrate your PR either tonight or tomorrow.