mlc-ai / mlc-llm

Universal LLM Deployment Engine with ML Compilation
https://llm.mlc.ai/
Apache License 2.0

[Bug] TypeError: GemmaConfig.__init__() missing 1 required positional argument: 'hidden_activation' #2611

Closed: haohenggang closed this issue 4 months ago

haohenggang commented 4 months ago

🐛 Bug

Traceback (most recent call last):
  File "/home/pwb/miniforge3/envs/mlc-chat-venv/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/pwb/miniforge3/envs/mlc-chat-venv/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/home/hhg/mlc-llm/python/mlc_llm/__main__.py", line 64, in <module>
    main()
  File "/home/hhg/mlc-llm/python/mlc_llm/__main__.py", line 53, in main
    cli.main(sys.argv[2:])
  File "/home/hhg/mlc-llm/python/mlc_llm/cli/package.py", line 64, in main
    package(
  File "/home/hhg/mlc-llm/python/mlc_llm/interface/package.py", line 351, in package
    model_lib_path_for_prepare_libs = build_model_library(
  File "/home/hhg/mlc-llm/python/mlc_llm/interface/package.py", line 92, in build_model_library
    jit.jit(
  File "/home/hhg/mlc-llm/python/mlc_llm/interface/jit.py", line 129, in jit
    "model_config": _get_model_config(),
  File "/home/hhg/mlc-llm/python/mlc_llm/interface/jit.py", line 96, in _get_model_config
    return MODELS[model_type].config.from_dict(model_config).asdict()
  File "/home/hhg/mlc-llm/python/mlc_llm/support/config.py", line 51, in from_dict
    return cls(**fields, kwargs=kwargs)  # type: ignore[call-arg]
TypeError: GemmaConfig.__init__() missing 1 required positional argument: 'hidden_activation'
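For context on the failure mode: `from_dict` forwards the known keys of the on-disk config JSON as keyword arguments to the dataclass constructor, so a config file generated before a new required field existed triggers exactly this `TypeError`. A minimal sketch of the pattern (hypothetical class and helper, not the actual mlc_llm code):

```python
from dataclasses import dataclass
from typing import Any, Dict

@dataclass
class GemmaConfigSketch:
    """Hypothetical stand-in for mlc_llm's GemmaConfig."""
    hidden_size: int
    hidden_activation: str  # newly added field with no default -> required

    @classmethod
    def from_dict(cls, source: Dict[str, Any]) -> "GemmaConfigSketch":
        # Pass every key that matches a declared field as a keyword
        # argument, mirroring the cls(**fields, ...) call in the traceback.
        fields = {k: v for k, v in source.items() if k in cls.__dataclass_fields__}
        return cls(**fields)

# A config dict written before 'hidden_activation' existed lacks that key,
# so construction fails with the TypeError from the report above.
try:
    GemmaConfigSketch.from_dict({"hidden_size": 2048})
except TypeError as err:
    print(err)  # ... missing 1 required positional argument: 'hidden_activation'
```

One typical remedy is to give such a field a default value or backfill it when loading; the maintainer's reply links the actual fix.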


MasterJH5574 commented 4 months ago

Hey @haohenggang thanks for reporting. We fixed this issue in https://github.com/mlc-ai/mlc-llm/pull/2614. Could you update to the latest main branch and try again?

AkulRT commented 4 months ago

@MasterJH5574 ,

I tried compiling Gemma for chat again after the update, and it gives the following error:

[151/153] Building CXX object CMakeFiles/tvm4j_runtime_pac...vm/native/src/main/native/org_apache_tvm_native_c_api.cc.o
FAILED: CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o
C:\Users\akula\AppData\Local\Android\Sdk\ndk\27.0.11902837\toolchains\llvm\prebuilt\windows-x86_64\bin\clang++.exe --target=aarch64-none-linux-android24 --sysroot=C:/Users/akula/AppData/Local/Android/Sdk/ndk/27.0.11902837/toolchains/llvm/prebuilt/windows-x86_64/sysroot -DTVM4J_ANDROID -DTVM_LOG_CUSTOMIZE=1 -DTVM_RELAX_VM_ENABLE_PROFILER=0 -DTVM_SOURCE_DIR=C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm -Dtvm4j_runtime_packed_EXPORTS -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/MLCChat/build/jni_header -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/dlpack/include -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/dmlc-core/include -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/OpenCL-Headers -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/3rdparty/picojson -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/include -IC:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tokenizers-cpp/include -g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security "-O3" -O3 -DNDEBUG -fPIC -MD -MT CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o -MF CMakeFiles\tvm4j_runtime_packed.dir\368b657d31d5c8d946e3ffa48aa52ef0\Internship\mlc-llm\3rdparty\tvm\jvm\native\src\main\native\org_apache_tvm_native_c_api.cc.o.d -o CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o -c C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc
In file included from C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc:25:
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:15:10: fatal error: 'C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm /src/runtime/c_runtime_api.cc' file not found
   15 | #include CONCAT(TVM_SOURCE_DIR,/src/runtime/c_runtime_api.cc)
      |          ^~~~~~~~~~~~
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:12:24: note: expanded from macro 'CONCAT'
   12 | #define CONCAT(n1, n2) STRINGIFY_MACRO(EXPAND(n1) EXPAND(n2))
      |                        ^~~~~~~~~~
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:9:28: note: expanded from macro 'STRINGIFY_MACRO'
    9 | #define STRINGIFY_MACRO(x) STR(x)
      |                            ^~
C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/tvm_runtime.h:10:16: note: expanded from macro 'STR'
   10 | #define STR(x) #x
      |                ^~

<scratch space>:329:1: note: expanded from here
  329 | "C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm /src/runtime/c_runtime_api.cc"
      | ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
ninja: build stopped: subcommand failed.

Traceback (most recent call last):
  File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 120, in <module>
    main(parsed.mlc_llm_source_dir)
  File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 103, in main
    run_cmake_build()
  File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 66, in run_cmake_build
    subprocess.run(cmd, check=True, env=os.environ)
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['cmake', '--build', '.', '--target', 'tvm4j_runtime_packed', '--config', 'release', '-j16']' returned non-zero exit status 1.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Scripts\mlc_llm.exe\__main__.py", line 7, in <module>
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\__main__.py", line 53, in main
    cli.main(sys.argv[2:])
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\cli\package.py", line 64, in main
    package(
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 361, in package
    build_android_binding(mlc_llm_source_dir, output)
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 275, in build_android_binding
    subprocess.run([sys.executable, mlc4j_path / "prepare_libs.py"], check=True, env=os.environ)
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['C:\\Users\\akula\\miniconda3\\envs\\mlc-chat-venv\\python.exe', WindowsPath('C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/prepare_libs.py')]' returned non-zero exit status 1.
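As an aside, the outermost `subprocess.CalledProcessError` in both tracebacks is just `subprocess.run(..., check=True)` propagating the non-zero exit of the underlying cmake/ninja build; the real diagnosis is always the compiler or linker message further up the log. A tiny illustration of that wrapper behavior:

```python
import subprocess
import sys

# subprocess.run with check=True raises CalledProcessError whenever the
# child process exits non-zero, wrapping the command and its return code.
try:
    subprocess.run([sys.executable, "-c", "raise SystemExit(1)"], check=True)
except subprocess.CalledProcessError as err:
    print(err.returncode)  # 1
```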
MasterJH5574 commented 4 months ago

Thank you @AkulRT for reporting. I see, it seems that those macros might not work directly on Windows. We will dig into this; please give us some time to find a solution. Meanwhile, a quick workaround you can apply locally is to replace these includes with absolute paths: https://github.com/mlc-ai/mlc-llm/blob/0575b9244886b711e6a9809560d0dabb426edaea/android/mlc4j/src/cpp/tvm_runtime.h#L15-L44

For example, use

- #include CONCAT(TVM_SOURCE_DIR,/src/runtime/c_runtime_api.cc)
+ #include "C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/src/runtime/c_runtime_api.cc"

to include c_runtime_api.cc. And do the same for the other includes in the file.
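The error message above is telling: the include path clang fails to find contains a stray space (`...3rdparty/tvm /src/runtime/c_runtime_api.cc`). That space comes from stringifying two expanded macro arguments, since the preprocessor's `#` operator joins a token sequence with single spaces. A rough simulation in Python (the helper name is made up, and whether the space alone explains the Windows failure is exactly what was being investigated):

```python
# Simulates the stringification chain in tvm_runtime.h:
#   #define STR(x) #x
#   #define STRINGIFY_MACRO(x) STR(x)
#   #define CONCAT(n1, n2) STRINGIFY_MACRO(EXPAND(n1) EXPAND(n2))
# '#' stringifies the expanded tokens `n1 n2` with a single space between
# them, so the resulting quoted include path contains " /src/...".
def concat_include(n1: str, n2: str) -> str:
    return f'"{n1} {n2}"'

print(concat_include("SOME_TVM_SOURCE_DIR", "/src/runtime/c_runtime_api.cc"))
# "SOME_TVM_SOURCE_DIR /src/runtime/c_runtime_api.cc"
```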

MasterJH5574 commented 4 months ago

@AkulRT Ah wait, the absolute path may not work either. So instead we can do

#include "../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc"
AkulRT commented 4 months ago

Doing so gave the following error:

C:\WINDOWS\system32\cmd.exe /C "cd . && C:\Users\akula\AppData\Local\Android\Sdk\ndk\27.0.11902837\toolchains\llvm\prebuilt\windows-x86_64\bin\clang++.exe --target=aarch64-none-linux-android24 --sysroot=C:/Users/akula/AppData/Local/Android/Sdk/ndk/27.0.11902837/toolchains/llvm/prebuilt/windows-x86_64/sysroot -fPIC -g -DANDROID -fdata-sections -ffunction-sections -funwind-tables -fstack-protector-strong -no-canonical-prefixes -D_FORTIFY_SOURCE=2 -Wformat -Werror=format-security "-O3" -O3 -DNDEBUG -static-libstdc++ -Wl,--build-id=sha1 -Wl,--no-rosegment -Wl,--no-undefined-version -Wl,--fatal-warnings -Wl,--no-undefined -Qunused-arguments -Wl,--gc-sections -shared -Wl,-soname,libtvm4j_runtime_packed.so -o libtvm4j_runtime_packed.so CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o mlc_llm/tokenizers/libtokenizers_cpp.a -llog -Wl,--whole-archive mlc_llm/libmlc_llm.a lib/libmodel_android.a -Wl,--no-whole-archive mlc_llm/tokenizers/aarch64-linux-android/release/libtokenizers_c.a mlc_llm/tokenizers/sentencepiece/src/libsentencepiece.a -pthread -latomic -lm && cd ."

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetFunction(tvm::runtime::String const&, bool)

referenced by packed_func.h:2136 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/3rdparty/tvm/include/tvm/runtime/packed_func.h:2136)
              model.cc.o:(mlc::llm::ModelMetadata::FromModule(tvm::runtime::Module, picojson::object_with_ordered_keys const&)) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc:647 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/engine.cc:647)
              engine.cc.o:(mlc::llm::serve::EngineImpl::CreateDiscoSession(std::__ndk1::vector<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>, std::__ndk1::allocator<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>>> const&, std::__ndk1::vector<picojson::object_with_ordered_keys, std::__ndk1::allocator<picojson::object_with_ordered_keys>> const&, DLDevice)::'lambda'(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&)::operator()(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&) const) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc:650 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/engine.cc:650)
              engine.cc.o:(mlc::llm::serve::EngineImpl::CreateDiscoSession(std::__ndk1::vector<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>, std::__ndk1::allocator<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>>> const&, std::__ndk1::vector<picojson::object_with_ordered_keys, std::__ndk1::allocator<picojson::object_with_ordered_keys>> const&, DLDevice)::'lambda'(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&)::operator()(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&) const) in archive mlc_llm/libmlc_llm.a
referenced 21 more times

ld.lld: error: undefined symbol: tvm::runtime::Module::LoadFromFile(tvm::runtime::String const&, tvm::runtime::String const&)

referenced by engine.cc:646 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/engine.cc:646)
              engine.cc.o:(mlc::llm::serve::EngineImpl::CreateDiscoSession(std::__ndk1::vector<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>, std::__ndk1::allocator<std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>>>> const&, std::__ndk1::vector<picojson::object_with_ordered_keys, std::__ndk1::allocator<picojson::object_with_ordered_keys>> const&, DLDevice)::'lambda'(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&)::operator()(std::__ndk1::basic_string<char, std::__ndk1::char_traits<char>, std::__ndk1::allocator<char>> const&, picojson::object_with_ordered_keys const&) const) in archive mlc_llm/libmlc_llm.a
referenced by function_table.cc:120 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/cpp/serve/function_table.cc:120)
              function_table.cc.o:(mlc::llm::serve::FunctionTable::Init(tvm::runtime::String, DLDevice, picojson::object_with_ordered_keys, tvm::runtime::Optional, int)) in archive mlc_llm/libmlc_llm.a
referenced by c_runtime_api.cc:489 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc:489)
              CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(TVMModLoadFromFile)

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::SaveToFile(tvm::runtime::String const&, tvm::runtime::String const&)

referenced by json_ffi_engine.cc
              json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
              engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
              threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 5 more times

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::SaveToBinary(dmlc::Stream*)

referenced by json_ffi_engine.cc
              json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
              engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
              threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 5 more times

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetSource(tvm::runtime::String const&)

referenced by json_ffi_engine.cc
              json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
              engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
              threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 6 more times

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetFormat()

referenced by json_ffi_engine.cc
              json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
              engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
              threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 7 more times

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::ImplementsFunction(tvm::runtime::String const&, bool)

referenced by json_ffi_engine.cc
              json_ffi_engine.cc.o:(vtable for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
              engine.cc.o:(vtable for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
              threaded_engine.cc.o:(vtable for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 7 more times

ld.lld: error: undefined symbol: typeinfo for tvm::runtime::ModuleNode

referenced by json_ffi_engine.cc
              json_ffi_engine.cc.o:(typeinfo for mlc::llm::json_ffi::JSONFFIEngineImpl) in archive mlc_llm/libmlc_llm.a
referenced by engine.cc
              engine.cc.o:(typeinfo for mlc::llm::serve::EngineModule) in archive mlc_llm/libmlc_llm.a
referenced by threaded_engine.cc
              threaded_engine.cc.o:(typeinfo for mlc::llm::serve::ThreadedEngineModule) in archive mlc_llm/libmlc_llm.a
referenced 5 more times

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::Import(tvm::runtime::Module)

referenced by c_runtime_api.cc:499 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc:499)
              CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(TVMModImport)

ld.lld: error: undefined symbol: tvm::runtime::ModuleNode::GetFuncFromEnv(tvm::runtime::String const&)

referenced by c_runtime_api.cc:524 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/src/cpp/../../../../3rdparty/tvm/src/runtime/c_runtime_api.cc:524)
              CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(TVMBackendGetFuncFromEnv)

ld.lld: error: undefined symbol: vtable for tvm::runtime::ModuleNode

referenced by module.h:145 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/include/tvm/runtime/module.h:145)
              CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(tvm::runtime::ModuleNode::~ModuleNode())
referenced by module.h:145 (C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/../../3rdparty/tvm/include/tvm/runtime/module.h:145)
              CMakeFiles/tvm4j_runtime_packed.dir/368b657d31d5c8d946e3ffa48aa52ef0/Internship/mlc-llm/3rdparty/tvm/jvm/native/src/main/native/org_apache_tvm_native_c_api.cc.o:(tvm::runtime::ModuleNode::~ModuleNode())
the vtable symbol may be undefined because the class is missing its key function (see https://lld.llvm.org/missingkeyfunction)
clang++: error: linker command failed with exit code 1 (use -v to see invocation)
ninja: build stopped: subcommand failed.

Traceback (most recent call last):
  File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 120, in <module>
    main(parsed.mlc_llm_source_dir)
  File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 103, in main
    run_cmake_build()
  File "C:\Users\akula\Desktop\Akul\School_Stuff\Internship\mlc-llm\android\mlc4j\prepare_libs.py", line 66, in run_cmake_build
    subprocess.run(cmd, check=True, env=os.environ)
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['cmake', '--build', '.', '--target', 'tvm4j_runtime_packed', '--config', 'release', '-j16']' returned non-zero exit status 1.
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Scripts\mlc_llm.exe\__main__.py", line 7, in <module>
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\__main__.py", line 53, in main
    cli.main(sys.argv[2:])
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\cli\package.py", line 64, in main
    package(
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 361, in package
    build_android_binding(mlc_llm_source_dir, output)
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\site-packages\mlc_llm\interface\package.py", line 275, in build_android_binding
    subprocess.run([sys.executable, mlc4j_path / "prepare_libs.py"], check=True, env=os.environ)
  File "C:\Users\akula\miniconda3\envs\mlc-chat-venv\Lib\subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['C:\\Users\\akula\\miniconda3\\envs\\mlc-chat-venv\\python.exe', WindowsPath('C:/Users/akula/Desktop/Akul/School_Stuff/Internship/mlc-llm/android/mlc4j/prepare_libs.py')]' returned non-zero exit status 1.

I think I will try using an older wheel and see if that allows compiling. Thank you @MasterJH5574 for your prompt response, I really appreciate it!

MasterJH5574 commented 4 months ago

@AkulRT Thanks for the swift response. Yeah, you can try checking out an older commit prior to https://github.com/mlc-ai/mlc-llm/commit/fbb6a48fa606fd5eba9a8a5e085da2692c433273 (since that commit introduced the issue). Meanwhile we will look into it.

MasterJH5574 commented 4 months ago

@AkulRT We just merged a fix in https://github.com/mlc-ai/mlc-llm/pull/2616 and you can check out the latest main branch to try it.

Since the fix for the original Gemma config problem has been confirmed, I'm going to close this issue now. You are more than welcome to open new issues for further problems :-)

AkulRT commented 4 months ago

Managed to get the chat app compiled. Thank you for your help, @MasterJH5574!