Closed · DearFishi closed this issue 6 months ago
Thank you @DearFishi for reporting! It looks like a bug. Would you mind sending a fix for this after confirming the fix can work for longer output?
Thanks for your review, I'll do it.
@MasterJH5574
Hello, I would like to ask whether there are any plans to support Medusa for speculative decoding in the serve engine in the future?
Thank you @jpf888 for bringing this up. Yeah, Medusa-mode speculative decoding is on the longer-term roadmap, but as of now we do not have plans to work on it very soon. You are more than welcome to contribute to the project :-)
Given that the original issue has been resolved, I am going to close this issue. You can create a new feature-request issue for Medusa-mode speculative decoding.
🐛 Bug
The output of speculative decoding is inconsistent with the output of a single model
Speculative decoding for Llama-2-7b-chat-hf-q0f32, with Llama-2-7b-chat-hf-q4f16 as the ssm:
Prompt 0: What is the meaning of life?
Output 0: What is the purpose of life? What is the meaning of existence? These are some of the most fundamental questions that have puzzled philosophers, the

Single Llama-2-7b-chat-hf-q0f32:
Prompt 0: What is the meaning of life?
Output 0: What is the purpose of life? What is the meaning of existence? These are questions that have puzzled philosophers, theologians, scientists
To Reproduce
Steps to reproduce the behavior: Use the script from https://github.com/mlc-ai/mlc-llm/blob/main/tests/python/serve/test_serve_engine_spec.py
Expected behavior
The output of speculative decoding should be consistent with the output of the single model.
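To illustrate why this expectation holds, here is a toy numpy sketch of greedy speculative decoding (not MLC's actual implementation; `target_logits` and `draft_logits` are hypothetical stand-ins for the q0f32 target and q4f16 draft models). In a correct implementation every emitted token comes from the target model, so a weaker or quantized draft model can only affect speed, never the output:

```python
import numpy as np

VOCAB = 50

def target_logits(ctx):
    # Toy stand-in for the big model (e.g. q0f32): deterministic logits per context.
    rng = np.random.default_rng(abs(hash(tuple(ctx))) % (2**32))
    return rng.normal(size=VOCAB)

def draft_logits(ctx):
    # Toy stand-in for the small speculative model (e.g. q4f16):
    # correlated with the target but perturbed, like quantization error.
    rng = np.random.default_rng((abs(hash(tuple(ctx))) + 1) % (2**32))
    return target_logits(ctx) + 0.5 * rng.normal(size=VOCAB)

def greedy_decode(ctx, n):
    # Baseline: greedy decoding with the target model alone.
    out = list(ctx)
    for _ in range(n):
        out.append(int(np.argmax(target_logits(out))))
    return out

def speculative_decode(ctx, n, k=4):
    out = list(ctx)
    while len(out) - len(ctx) < n:
        # The draft model proposes up to k tokens greedily.
        prop, p = [], list(out)
        for _ in range(k):
            t = int(np.argmax(draft_logits(p)))
            prop.append(t)
            p.append(t)
        # The target model verifies: every emitted token is the target's
        # greedy choice, so a draft mismatch costs speed but never
        # changes the output.
        for t in prop:
            if len(out) - len(ctx) >= n:
                break
            tgt = int(np.argmax(target_logits(out)))
            out.append(tgt)
            if tgt != t:  # first mismatch: discard the rest of the draft
                break
    return out

prompt = [3, 1, 4]
assert speculative_decode(prompt, 24) == greedy_decode(prompt, 24)
```

Under greedy sampling the two decode paths are token-for-token identical by construction, which is why the divergence in the outputs above points at a bug in the verification step rather than expected behavior.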
Environment
Output of `python -c "import tvm; print('\n'.join(f'{k}: {v}' for k, v in tvm.support.libinfo().items()))"` (applicable if you compile models):

USE_NVTX: OFF USE_GTEST: AUTO SUMMARIZE: OFF TVM_DEBUG_WITH_ABI_CHANGE: OFF USE_IOS_RPC: OFF USE_MSC: OFF USE_ETHOSU: OFF CUDA_VERSION: 12.3 USE_LIBBACKTRACE: AUTO DLPACK_PATH: 3rdparty/dlpack/include USE_TENSORRT_CODEGEN: OFF USE_THRUST: OFF USE_TARGET_ONNX: OFF USE_AOT_EXECUTOR: ON BUILD_DUMMY_LIBTVM: OFF USE_CUDNN: ON USE_TENSORRT_RUNTIME: OFF USE_ARM_COMPUTE_LIB_GRAPH_EXECUTOR: OFF USE_CCACHE: AUTO USE_ARM_COMPUTE_LIB: OFF USE_CPP_RTVM: OFF USE_OPENCL_GTEST: /path/to/opencl/gtest USE_MKL: OFF USE_PT_TVMDSOOP: OFF MLIR_VERSION: NOT-FOUND USE_CLML: OFF USE_STACKVM_RUNTIME: OFF USE_GRAPH_EXECUTOR_CUDA_GRAPH: OFF ROCM_PATH: /opt/rocm USE_DNNL: OFF USE_VITIS_AI: OFF USE_MLIR: OFF USE_RCCL: OFF USE_LLVM: llvm-config --ignore-libllvm --link-static USE_VERILATOR: OFF USE_TF_TVMDSOOP: OFF USE_THREADS: ON USE_MSVC_MT: OFF BACKTRACE_ON_SEGFAULT: OFF USE_GRAPH_EXECUTOR: ON USE_NCCL: OFF USE_ROCBLAS: OFF GIT_COMMIT_HASH: 7a8520581e4a70024de05fa9e803b5d2899796f6 USE_VULKAN: OFF USE_RUST_EXT: OFF USE_CUTLASS: OFF USE_CPP_RPC: OFF USE_HEXAGON: OFF USE_CUSTOM_LOGGING: OFF USE_UMA: OFF USE_FALLBACK_STL_MAP: OFF USE_SORT: ON USE_RTTI: ON GIT_COMMIT_TIME: 2024-04-17 15:07:41 -0700 USE_HEXAGON_SDK: /path/to/sdk USE_BLAS: none USE_ETHOSN: OFF USE_LIBTORCH: OFF USE_RANDOM: ON USE_CUDA: ON USE_COREML: OFF USE_AMX: OFF BUILD_STATIC_RUNTIME: OFF USE_CMSISNN: OFF USE_KHRONOS_SPIRV: OFF USE_CLML_GRAPH_EXECUTOR: OFF USE_TFLITE: OFF USE_HEXAGON_GTEST: /path/to/hexagon/gtest PICOJSON_PATH: 3rdparty/picojson USE_OPENCL_ENABLE_HOST_PTR: OFF INSTALL_DEV: OFF USE_PROFILER: ON USE_NNPACK: OFF LLVM_VERSION: 18.1.3 USE_MRVL: OFF USE_OPENCL: OFF COMPILER_RT_PATH: 3rdparty/compiler-rt RANG_PATH: 3rdparty/rang/include USE_SPIRV_KHR_INTEGER_DOT_PRODUCT: OFF USE_OPENMP: none USE_BNNS: OFF USE_FLASHINFER: OFF USE_CUBLAS: ON USE_METAL: OFF USE_MICRO_STANDALONE_RUNTIME: OFF USE_HEXAGON_EXTERNAL_LIBS: OFF USE_ALTERNATIVE_LINKER: AUTO USE_BYODT_POSIT: OFF USE_HEXAGON_RPC: OFF USE_MICRO: OFF DMLC_PATH: 3rdparty/dmlc-core/include INDEX_DEFAULT_I64: ON USE_RELAY_DEBUG: OFF USE_RPC: ON USE_TENSORFLOW_PATH: none TVM_CLML_VERSION: USE_MIOPEN: OFF USE_ROCM: OFF USE_PAPI: OFF USE_CURAND: OFF TVM_CXX_COMPILER_PATH: /usr/bin/c++ HIDE_PRIVATE_SYMBOLS: ON

Additional context
If I change the code on lines 133-134 in https://github.com/mlc-ai/mlc-llm/blob/main/cpp/serve/engine_actions/batch_verify.cc, the result will be correct, like:
Prompt 0: What is the meaning of life?
Output 0: What is the purpose of life? What is the meaning of existence? These are questions that have puzzled philosophers, theologians, scientists, and every
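For context on what the verify step has to guarantee, here is a minimal sketch of the standard speculative-sampling acceptance rule (not the actual code in batch_verify.cc; the probability vectors and names are illustrative). When implemented with the target model's own probabilities, the emitted token is an exact sample from the target distribution; if the verify step uses mismatched or stale probabilities, that guarantee breaks, which would produce exactly this kind of output divergence:

```python
import numpy as np

def verify_token(p_target, p_draft, proposed, rng):
    # Accept the drafted token with probability min(1, p/q); on rejection,
    # resample from the residual distribution max(0, p - q), renormalized.
    # This makes the emitted token an exact sample from p_target.
    if rng.random() < min(1.0, p_target[proposed] / p_draft[proposed]):
        return proposed
    residual = np.maximum(p_target - p_draft, 0.0)
    residual /= residual.sum()
    return int(rng.choice(len(p_target), p=residual))

# Sanity check: the emitted frequencies should match p_target, not p_draft.
rng = np.random.default_rng(0)
p_target = np.array([0.7, 0.2, 0.1])
p_draft = np.array([0.5, 0.3, 0.2])  # e.g. distorted by q4f16 quantization
counts = np.zeros(3)
for _ in range(20000):
    proposed = int(rng.choice(3, p=p_draft))
    counts[verify_token(p_target, p_draft, proposed, rng)] += 1
print(counts / counts.sum())  # close to [0.7, 0.2, 0.1]
```

The acceptance test consumes one probability value per proposed token, so an off-by-one in how the target logits are indexed against the draft tokens (one plausible reading of the lines 133-134 fix above) silently skews the emitted distribution without crashing.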