vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Bug]: VLLM 0.5.3.post1 [rank0]: RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) #6732

Open. jueming0312 opened this issue 1 month ago

jueming0312 commented 1 month ago

Your current environment

The output of `python collect_env.py`
Collecting environment information...
PyTorch version: 2.3.1+cu121
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A

OS: Ubuntu 22.04.3 LTS (x86_64)
GCC version: (Ubuntu 11.4.0-1ubuntu1~22.04) 11.4.0
Clang version: Could not collect
CMake version: version 3.30.1
Libc version: glibc-2.35

Python version: 3.10.12 (main, Mar 22 2024, 16:50:05) [GCC 11.4.0] (64-bit runtime)
Python platform: Linux-5.15.0-1053-nvidia-x86_64-with-glibc2.35
Is CUDA available: True
CUDA runtime version: 12.2.140
CUDA_MODULE_LOADING set to: LAZY
GPU models and configuration:
GPU 0: NVIDIA H100 80GB HBM3
GPU 1: NVIDIA H100 80GB HBM3
GPU 2: NVIDIA H100 80GB HBM3
GPU 3: NVIDIA H100 80GB HBM3
GPU 4: NVIDIA H100 80GB HBM3
GPU 5: NVIDIA H100 80GB HBM3
GPU 6: NVIDIA H100 80GB HBM3
GPU 7: NVIDIA H100 80GB HBM3

Nvidia driver version: 550.90.07
cuDNN version: Probably one of the following:
/usr/lib/x86_64-linux-gnu/libcudnn.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_adv_infer.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_adv_train.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_cnn_infer.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_cnn_train.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_ops_infer.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_ops_train.so.8.9.6
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Architecture:                       x86_64
CPU op-mode(s):                     32-bit, 64-bit
Address sizes:                      52 bits physical, 57 bits virtual
Byte Order:                         Little Endian
CPU(s):                             224
On-line CPU(s) list:                0-223
Vendor ID:                          GenuineIntel
Model name:                         Intel(R) Xeon(R) Platinum 8480C
CPU family:                         6
Model:                              143
Thread(s) per core:                 2
Core(s) per socket:                 56
Socket(s):                          2
Stepping:                           8
CPU max MHz:                        3800.0000
CPU min MHz:                        800.0000
BogoMIPS:                           4000.00
Flags:                              fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb cat_l3 cat_l2 cdp_l3 invpcid_single intel_ppin cdp_l2 ssbd mba ibrs ibpb stibp ibrs_enhanced tpr_shadow vnmi flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid cqm rdt_a avx512f avx512dq rdseed adx smap avx512ifma clflushopt clwb intel_pt avx512cd sha_ni avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local split_lock_detect avx_vnni avx512_bf16 wbnoinvd dtherm ida arat pln pts hwp hwp_act_window hwp_epp hwp_pkg_req avx512vbmi umip pku ospke waitpkg avx512_vbmi2 gfni vaes vpclmulqdq avx512_vnni avx512_bitalg tme avx512_vpopcntdq la57 rdpid bus_lock_detect cldemote movdiri movdir64b enqcmd fsrm md_clear serialize tsxldtrk pconfig arch_lbr amx_bf16 avx512_fp16 amx_tile amx_int8 flush_l1d arch_capabilities
Virtualization:                     VT-x
L1d cache:                          5.3 MiB (112 instances)
L1i cache:                          3.5 MiB (112 instances)
L2 cache:                           224 MiB (112 instances)
L3 cache:                           210 MiB (2 instances)
NUMA node(s):                       2
NUMA node0 CPU(s):                  0-55,112-167
NUMA node1 CPU(s):                  56-111,168-223
Vulnerability Gather data sampling: Not affected
Vulnerability Itlb multihit:        Not affected
Vulnerability L1tf:                 Not affected
Vulnerability Mds:                  Not affected
Vulnerability Meltdown:             Not affected
Vulnerability Mmio stale data:      Not affected
Vulnerability Retbleed:             Not affected
Vulnerability Spec rstack overflow: Not affected
Vulnerability Spec store bypass:    Mitigation; Speculative Store Bypass disabled via prctl and seccomp
Vulnerability Spectre v1:           Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:           Mitigation; Enhanced IBRS, IBPB conditional, RSB filling, PBRSB-eIBRS SW sequence
Vulnerability Srbds:                Not affected
Vulnerability Tsx async abort:      Not affected

Versions of relevant libraries:
[pip3] numpy==1.26.4
[pip3] nvidia-nccl-cu12==2.20.5
[pip3] torch==2.3.1
[pip3] torchvision==0.18.1
[pip3] transformers==4.43.1
[pip3] triton==2.3.1
[conda] Could not collect
ROCM Version: Could not collect
Neuron SDK Version: N/A
vLLM Version: 0.5.3.post1
vLLM Build Flags:
CUDA Archs: Not Set; ROCm: Disabled; Neuron: Disabled
GPU Topology:
        GPU0    GPU1    GPU2    GPU3    GPU4    GPU5    GPU6    GPU7    NIC0    NIC1    NIC2    NIC3    NIC4    NIC5    NIC6    NIC7    NIC8    NIC9    NIC10   NIC11   CPU Affinity    NUMA Affinity   GPU NUMA ID
GPU0     X  NV18    NV18    NV18    NV18    NV18    NV18    NV18    PXB NODE    NODE    NODE    NODE    NODE    SYS SYS SYS SYS SYS SYS 0-55,112-167    0       N/A
GPU1    NV18     X  NV18    NV18    NV18    NV18    NV18    NV18    NODE    NODE    NODE    PXB NODE    NODE    SYS SYS SYS SYS SYS SYS 0-55,112-167    0       N/A
GPU2    NV18    NV18     X  NV18    NV18    NV18    NV18    NV18    NODE    NODE    NODE    NODE    PXB NODE    SYS SYS SYS SYS SYS SYS 0-55,112-167    0       N/A
GPU3    NV18    NV18    NV18     X  NV18    NV18    NV18    NV18    NODE    NODE    NODE    NODE    NODE    PXB SYS SYS SYS SYS SYS SYS 0-55,112-167    0       N/A
GPU4    NV18    NV18    NV18    NV18     X  NV18    NV18    NV18    SYS SYS SYS SYS SYS SYS PXB NODE    NODE    NODE    NODE    NODE    56-111,168-223  1       N/A
GPU5    NV18    NV18    NV18    NV18    NV18     X  NV18    NV18    SYS SYS SYS SYS SYS SYS NODE    NODE    NODE    PXB NODE    NODE    56-111,168-223  1       N/A
GPU6    NV18    NV18    NV18    NV18    NV18    NV18     X  NV18    SYS SYS SYS SYS SYS SYS NODE    NODE    NODE    NODE    PXB NODE    56-111,168-223  1       N/A
GPU7    NV18    NV18    NV18    NV18    NV18    NV18    NV18     X  SYS SYS SYS SYS SYS SYS NODE    NODE    NODE    NODE    NODE    PXB 56-111,168-223  1       N/A
NIC0    PXB NODE    NODE    NODE    SYS SYS SYS SYS  X  NODE    NODE    NODE    NODE    NODE    SYS SYS SYS SYS SYS SYS
NIC1    NODE    NODE    NODE    NODE    SYS SYS SYS SYS NODE     X  PIX NODE    NODE    NODE    SYS SYS SYS SYS SYS SYS
NIC2    NODE    NODE    NODE    NODE    SYS SYS SYS SYS NODE    PIX  X  NODE    NODE    NODE    SYS SYS SYS SYS SYS SYS
NIC3    NODE    PXB NODE    NODE    SYS SYS SYS SYS NODE    NODE    NODE     X  NODE    NODE    SYS SYS SYS SYS SYS SYS
NIC4    NODE    NODE    PXB NODE    SYS SYS SYS SYS NODE    NODE    NODE    NODE     X  NODE    SYS SYS SYS SYS SYS SYS
NIC5    NODE    NODE    NODE    PXB SYS SYS SYS SYS NODE    NODE    NODE    NODE    NODE     X  SYS SYS SYS SYS SYS SYS
NIC6    SYS SYS SYS SYS PXB NODE    NODE    NODE    SYS SYS SYS SYS SYS SYS  X  NODE    NODE    NODE    NODE    NODE
NIC7    SYS SYS SYS SYS NODE    NODE    NODE    NODE    SYS SYS SYS SYS SYS SYS NODE     X  PIX NODE    NODE    NODE
NIC8    SYS SYS SYS SYS NODE    NODE    NODE    NODE    SYS SYS SYS SYS SYS SYS NODE    PIX  X  NODE    NODE    NODE
NIC9    SYS SYS SYS SYS NODE    PXB NODE    NODE    SYS SYS SYS SYS SYS SYS NODE    NODE    NODE     X  NODE    NODE
NIC10   SYS SYS SYS SYS NODE    NODE    PXB NODE    SYS SYS SYS SYS SYS SYS NODE    NODE    NODE    NODE     X  NODE
NIC11   SYS SYS SYS SYS NODE    NODE    NODE    PXB SYS SYS SYS SYS SYS SYS NODE    NODE    NODE    NODE    NODE     X

Legend:

  X    = Self
  SYS  = Connection traversing PCIe as well as the SMP interconnect between NUMA nodes (e.g., QPI/UPI)
  NODE = Connection traversing PCIe as well as the interconnect between PCIe Host Bridges within a NUMA node
  PHB  = Connection traversing PCIe as well as a PCIe Host Bridge (typically the CPU)
  PXB  = Connection traversing multiple PCIe bridges (without traversing the PCIe Host Bridge)
  PIX  = Connection traversing at most a single PCIe bridge
  NV#  = Connection traversing a bonded set of # NVLinks

NIC Legend:

  NIC0: mlx5_0
  NIC1: mlx5_1
  NIC2: mlx5_2
  NIC3: mlx5_3
  NIC4: mlx5_4
  NIC5: mlx5_5
  NIC6: mlx5_6
  NIC7: mlx5_7
  NIC8: mlx5_8
  NIC9: mlx5_9
  NIC10: mlx5_10
  NIC11: mlx5_11

🐛 Describe the bug

root@dbe9716daf8891-r5r5v:/workspace# vllm serve /models/Meta-Llama-3.1-405B-Instruct-FP8 --tensor-parallel-size 8 INFO 07-24 08:25:24 api_server.py:219] vLLM API server version 0.5.3.post1 INFO 07-24 08:25:24 api_server.py:220] args: Namespace(model_tag='/models/Meta-Llama-3.1-405B-Instruct-FP8', host=None, port=8000, uvicorn_log_level='info', allow_credentials=False, allowed_origins=[''], allowed_methods=[''], allowed_headers=[''], api_key=None, lora_modules=None, prompt_adapters=None, chat_template=None, response_role='assistant', ssl_keyfile=None, ssl_certfile=None, ssl_ca_certs=None, ssl_cert_reqs=0, root_path=None, middleware=[], model='/models/Meta-Llama-3.1-405B-Instruct-FP8', tokenizer=None, skip_tokenizer_init=False, revision=None, code_revision=None, tokenizer_revision=None, tokenizer_mode='auto', trust_remote_code=False, download_dir=None, load_format='auto', dtype='auto', kv_cache_dtype='auto', quantization_param_path=None, max_model_len=None, guided_decoding_backend='outlines', distributed_executor_backend=None, worker_use_ray=False, pipeline_parallel_size=1, tensor_parallel_size=8, max_parallel_loading_workers=None, ray_workers_use_nsight=False, block_size=16, enable_prefix_caching=False, disable_sliding_window=False, use_v2_block_manager=False, num_lookahead_slots=0, seed=0, swap_space=4, cpu_offload_gb=0, gpu_memory_utilization=0.9, num_gpu_blocks_override=None, max_num_batched_tokens=None, max_num_seqs=256, max_logprobs=20, disable_log_stats=False, quantization=None, rope_scaling=None, rope_theta=None, enforce_eager=False, max_context_len_to_capture=None, max_seq_len_to_capture=8192, disable_custom_all_reduce=False, tokenizer_pool_size=0, tokenizer_pool_type='ray', tokenizer_pool_extra_config=None, enable_lora=False, max_loras=1, max_lora_rank=16, lora_extra_vocab_size=256, lora_dtype='auto', long_lora_scaling_factors=None, max_cpu_loras=None, fully_sharded_loras=False, enable_prompt_adapter=False, max_prompt_adapters=1, max_prompt_adapter_token=0, device='auto', scheduler_delay_factor=0.0, enable_chunked_prefill=None, speculative_model=None, num_speculative_tokens=None, speculative_draft_tensor_parallel_size=None, speculative_max_model_len=None, speculative_disable_by_batch_size=None, ngram_prompt_lookup_max=None, ngram_prompt_lookup_min=None, spec_decoding_acceptance_method='rejection_sampler', typical_acceptance_sampler_posterior_threshold=None, typical_acceptance_sampler_posterior_alpha=None, disable_logprobs_during_spec_decoding=None, model_loader_extra_config=None, ignore_patterns=[], preemption_mode=None, served_model_name=None, qlora_adapter_name_or_path=None, otlp_traces_endpoint=None, engine_use_ray=False, disable_log_requests=False, max_log_len=None, dispatch_function=<function serve at 0x7fb401d4a710>) INFO 07-24 08:25:25 config.py:715] Defaulting to use mp for distributed inference WARNING 07-24 08:25:25 arg_utils.py:762] Chunked prefill is enabled by default for models with max_model_len > 32K. Currently, chunked prefill might not work with some features or models. If you encounter any issues, please disable chunked prefill by setting --enable-chunked-prefill=False. INFO 07-24 08:25:25 config.py:806] Chunked prefill is enabled with max_num_batched_tokens=512. 
INFO 07-24 08:25:25 llm_engine.py:176] Initializing an LLM engine (v0.5.3.post1) with config: model='/models/Meta-Llama-3.1-405B-Instruct-FP8', speculative_config=None, tokenizer='/models/Meta-Llama-3.1-405B-Instruct-FP8', skip_tokenizer_init=False, tokenizer_mode=auto, revision=None, rope_scaling=None, rope_theta=None, tokenizer_revision=None, trust_remote_code=False, dtype=torch.bfloat16, max_seq_len=131072, download_dir=None, load_format=LoadFormat.AUTO, tensor_parallel_size=8, pipeline_parallel_size=1, disable_custom_all_reduce=False, quantization=fbgemm_fp8, enforce_eager=False, kv_cache_dtype=auto, quantization_param_path=None, device_config=cuda, decoding_config=DecodingConfig(guided_decoding_backend='outlines'), observability_config=ObservabilityConfig(otlp_traces_endpoint=None), seed=0, served_model_name=/models/Meta-Llama-3.1-405B-Instruct-FP8, use_v2_block_manager=False, enable_prefix_caching=False) INFO 07-24 08:25:25 custom_cache_manager.py:17] Setting Triton cache manager to: vllm.triton_utils.custom_cache_manager:CustomCacheManager (VllmWorkerProcess pid=758) INFO 07-24 08:25:27 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=757) INFO 07-24 08:25:27 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=760) INFO 07-24 08:25:27 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=761) INFO 07-24 08:25:27 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=759) INFO 07-24 08:25:27 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=763) INFO 07-24 08:25:27 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=762) INFO 07-24 08:25:27 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=757) INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=757) INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=758) INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=758) INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=759) INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=760) INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=761) INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=763) INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=761) INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=762) INFO 07-24 08:25:45 utils.py:784] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=760) INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=763) INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=762) INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=759) INFO 07-24 08:25:45 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last): (VllmWorkerProcess 
pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] output = executor(args, kwargs) (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] init_worker_distributed_environment(self.parallel_config, self.rank, (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] ensure_model_parallel_initialized(parallel_config.tensor_parallel_size, (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] initialize_model_parallel(tensor_model_parallel_size, (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] _TP = init_model_parallel_group(group_ranks, (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] return GroupCoordinator( (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in init (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.pynccl_comm = PyNcclCommunicator( (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in init (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.comm: ncclComm_t = self.nccl.ncclCommInitRank( (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm), (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] raise RuntimeError(f"NCCL error: {error_str}") (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last): (VllmWorkerProcess 
pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process (VllmWorkerProcess pid=761) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] output = executor(*args, *kwargs) (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] init_worker_distributed_environment(self.parallel_config, self.rank, (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] ensure_model_parallel_initialized(parallel_config.tensor_parallel_size, (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] initialize_model_parallel(tensor_model_parallel_size, (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] _TP = init_model_parallel_group(group_ranks, (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] return GroupCoordinator( (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in init (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.pynccl_comm = PyNcclCommunicator( (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in init (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.comm: ncclComm_t = self.nccl.ncclCommInitRank( (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm), (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] raise RuntimeError(f"NCCL error: {error_str}") (VllmWorkerProcess pid=763) ERROR 
07-24 08:26:01 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) (VllmWorkerProcess pid=763) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last): (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] output = executor(args, kwargs) (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] init_worker_distributed_environment(self.parallel_config, self.rank, (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] ensure_model_parallel_initialized(parallel_config.tensor_parallel_size, (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] initialize_model_parallel(tensor_model_parallel_size, (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] _TP = init_model_parallel_group(group_ranks, (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] return GroupCoordinator( (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in init (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.pynccl_comm = PyNcclCommunicator( (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in init (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.comm: ncclComm_t = self.nccl.ncclCommInitRank( (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm), (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File 
"/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] raise RuntimeError(f"NCCL error: {error_str}") (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) (VllmWorkerProcess pid=760) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last): (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] output = executor(*args, kwargs) (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] init_worker_distributed_environment(self.parallel_config, self.rank, (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] ensure_model_parallel_initialized(parallel_config.tensor_parallel_size, (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] initialize_model_parallel(tensor_model_parallel_size, (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] _TP = init_model_parallel_group(group_ranks, (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] return GroupCoordinator( (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in init (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.pynccl_comm = PyNcclCommunicator( (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in init (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.comm: ncclComm_t = self.nccl.ncclCommInitRank( (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in 
ncclCommInitRank (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last): (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm), (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last): (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] raise RuntimeError(f"NCCL error: {error_str}") (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) (VllmWorkerProcess pid=762) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] output = executor(*args, *kwargs) (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] init_worker_distributed_environment(self.parallel_config, self.rank, (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] ensure_model_parallel_initialized(parallel_config.tensor_parallel_size, (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] initialize_model_parallel(tensor_model_parallel_size, (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] _TP = init_model_parallel_group(group_ranks, (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] return GroupCoordinator( (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in init (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.pynccl_comm = PyNcclCommunicator( 
(VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in init (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.comm: ncclComm_t = self.nccl.ncclCommInitRank( (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm), (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] raise RuntimeError(f"NCCL error: {error_str}") (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) (VllmWorkerProcess pid=759) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last): (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] output = executor(args, kwargs) (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] init_worker_distributed_environment(self.parallel_config, self.rank, (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] ensure_model_parallel_initialized(parallel_config.tensor_parallel_size, (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] initialize_model_parallel(tensor_model_parallel_size, (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] _TP = init_model_parallel_group(group_ranks, (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] return GroupCoordinator( (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 
multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in init (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.pynccl_comm = PyNcclCommunicator( (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in init (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.comm: ncclComm_t = self.nccl.ncclCommInitRank( (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm), (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] raise RuntimeError(f"NCCL error: {error_str}") (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) (VllmWorkerProcess pid=758) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] output = executor(*args, **kwargs) (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] init_worker_distributed_environment(self.parallel_config, self.rank, (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] ensure_model_parallel_initialized(parallel_config.tensor_parallel_size, (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] initialize_model_parallel(tensor_model_parallel_size, (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] _TP = init_model_parallel_group(group_ranks, (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] return GroupCoordinator( (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 
multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in __init__ (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.pynccl_comm = PyNcclCommunicator( (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in __init__ (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.comm: ncclComm_t = self.nccl.ncclCommInitRank( (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm), (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] raise RuntimeError(f"NCCL error: {error_str}") (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details) (VllmWorkerProcess pid=757) ERROR 07-24 08:26:01 multiproc_worker_utils.py:226]

rank0: Traceback (most recent call last):
rank0:   File "/usr/local/bin/vllm", line 8, in <module>
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/scripts.py", line 148, in main
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/scripts.py", line 28, in serve
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/entrypoints/openai/api_server.py", line 231, in run_server
rank0:     if llm_engine is not None else AsyncLLMEngine.from_engine_args(
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/engine/async_llm_engine.py", line 466, in from_engine_args
rank0:     engine = cls(
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/engine/async_llm_engine.py", line 380, in __init__
rank0:     self.engine = self._init_engine(*args, **kwargs)
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/engine/async_llm_engine.py", line 547, in _init_engine
rank0:     return engine_class(*args, **kwargs)
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/engine/llm_engine.py", line 251, in __init__
rank0:     self.model_executor = executor_class(
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_gpu_executor.py", line 201, in __init__
rank0:     super().__init__(*args, **kwargs)
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/executor/distributed_gpu_executor.py", line 25, in __init__
rank0:     super().__init__(*args, **kwargs)
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/executor/executor_base.py", line 47, in __init__
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_gpu_executor.py", line 123, in _init_executor
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/executor/multiproc_gpu_executor.py", line 178, in _run_workers
rank0:     driver_worker_output = driver_worker_method(*args, **kwargs)
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 132, in init_device
rank0:     init_worker_distributed_environment(self.parallel_config, self.rank,
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/worker/worker.py", line 346, in init_worker_distributed_environment
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 923, in ensure_model_parallel_initialized
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 889, in initialize_model_parallel
rank0:     _TP = init_model_parallel_group(group_ranks,
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 732, in init_model_parallel_group
rank0:     return GroupCoordinator(
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/parallel_state.py", line 176, in __init__
rank0:     self.pynccl_comm = PyNcclCommunicator(
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in __init__
rank0:     self.comm: ncclComm_t = self.nccl.ncclCommInitRank(
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank
rank0:   File "/usr/local/lib/python3.10/dist-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK
rank0:     raise RuntimeError(f"NCCL error: {error_str}")
rank0: RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details)
Segmentation fault (core dumped)
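If it helps triage, the same init path can be exercised without the API server. Below is a minimal offline sketch (model path and tensor-parallel size copied from the serve command above; purely illustrative) that should go through the same GroupCoordinator -> PyNcclCommunicator -> ncclCommInitRank sequence while the engine is being constructed:

```python
# Offline reproduction sketch (assumptions: same model path and TP size as the
# serve command above). Constructing LLM() initializes tensor parallelism and
# therefore calls ncclCommInitRank, where the error above is raised.
from vllm import LLM

llm = LLM(
    model="/models/Meta-Llama-3.1-405B-Instruct-FP8",
    tensor_parallel_size=8,
)
print(llm.generate("Hello"))
```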

Semihal commented 1 month ago

I get the same error :(

kilakila-heart commented 1 month ago

I get the same error :(

yuzisun commented 1 month ago

We got the same error with the 405B model and did not find an answer on the known-issues page.

youkaichao commented 1 month ago

Please at least follow https://docs.vllm.ai/en/latest/getting_started/debugging.html to gather and report more information.
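In particular, it is worth first checking that NCCL works at all on the node, outside of vLLM. A rough sketch of such a check (file name and launch command are just examples, not the exact script from the guide), launched with `NCCL_DEBUG=INFO torchrun --nproc-per-node=8 nccl_check.py` so the underlying CUDA error is surfaced:

```python
# nccl_check.py -- standalone NCCL all_reduce sanity check (illustrative sketch).
# torchrun starts one process per GPU and sets LOCAL_RANK for each of them.
import os

import torch
import torch.distributed as dist

dist.init_process_group(backend="nccl")
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

# All-reduce a small tensor; every rank should end up with world_size in each element.
data = torch.ones(128, device=f"cuda:{local_rank}")
dist.all_reduce(data, op=dist.ReduceOp.SUM)
torch.cuda.synchronize()
assert data.mean().item() == dist.get_world_size(), "NCCL all_reduce returned a wrong result"
print(f"rank {dist.get_rank()}: NCCL all_reduce OK")
```

If this already fails with the same "unhandled cuda error", the problem is likely in the driver/NCCL/container setup rather than in vLLM itself.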

xiejibing commented 3 weeks ago

I got the same error, and here is the detailed log:


INFO 08-19 19:00:46 api_server.py:339] vLLM API server version 0.5.4 INFO 08-19 19:00:46 api_server.py:340] args: Namespace(model_tag='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', host=None, port=8000, uvicorn_log_level='info', allow_credentials=False, allowed_origins=[''], allowed_methods=[''], allowed_headers=[''], api_key=None, lora_modules=None, prompt_adapters=None, chat_template=None, response_role='assistant', ssl_keyfile=None, ssl_certfile=None, ssl_ca_certs=None, ssl_cert_reqs=0, root_path=None, middleware=[], return_tokens_as_token_ids=False, disable_frontend_multiprocessing=False, model='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', tokenizer=None, skip_tokenizer_init=False, revision=None, code_revision=None, tokenizer_revision=None, tokenizer_mode='auto', trust_remote_code=False, download_dir=None, load_format='auto', dtype='auto', kv_cache_dtype='auto', quantization_param_path=None, max_model_len=None, guided_decoding_backend='outlines', distributed_executor_backend=None, worker_use_ray=False, pipeline_parallel_size=1, tensor_parallel_size=8, max_parallel_loading_workers=None, ray_workers_use_nsight=False, block_size=16, enable_prefix_caching=False, disable_sliding_window=False, use_v2_block_manager=False, num_lookahead_slots=0, seed=0, swap_space=4, cpu_offload_gb=0, gpu_memory_utilization=0.9, num_gpu_blocks_override=None, max_num_batched_tokens=None, max_num_seqs=256, max_logprobs=20, disable_log_stats=False, quantization=None, rope_scaling=None, rope_theta=None, enforce_eager=False, max_context_len_to_capture=None, max_seq_len_to_capture=8192, disable_custom_all_reduce=False, tokenizer_pool_size=0, tokenizer_pool_type='ray', tokenizer_pool_extra_config=None, enable_lora=False, max_loras=1, max_lora_rank=16, lora_extra_vocab_size=256, lora_dtype='auto', long_lora_scaling_factors=None, max_cpu_loras=None, fully_sharded_loras=False, enable_prompt_adapter=False, max_prompt_adapters=1, max_prompt_adapter_token=0, device='auto', scheduler_delay_factor=0.0, enable_chunked_prefill=None, speculative_model=None, num_speculative_tokens=None, speculative_draft_tensor_parallel_size=None, speculative_max_model_len=None, speculative_disable_by_batch_size=None, ngram_prompt_lookup_max=None, ngram_prompt_lookup_min=None, spec_decoding_acceptance_method='rejection_sampler', typical_acceptance_sampler_posterior_threshold=None, typical_acceptance_sampler_posterior_alpha=None, disable_logprobs_during_spec_decoding=None, model_loader_extra_config=None, ignore_patterns=[], preemption_mode=None, served_model_name=None, qlora_adapter_name_or_path=None, otlp_traces_endpoint=None, engine_use_ray=False, disable_log_requests=False, max_log_len=None, dispatch_function=<function serve at 0x7fcdadf16e60>) WARNING 08-19 19:00:46 config.py:1454] Casting torch.bfloat16 to torch.float16. INFO 08-19 19:00:46 config.py:729] Defaulting to use mp for distributed inference WARNING 08-19 19:00:46 arg_utils.py:766] Chunked prefill is enabled by default for models with max_model_len > 32K. Currently, chunked prefill might not work with some features or models. If you encounter any issues, please disable chunked prefill by setting --enable-chunked-prefill=False. INFO 08-19 19:00:46 config.py:820] Chunked prefill is enabled with max_num_batched_tokens=512. 
INFO 08-19 19:00:46 llm_engine.py:174] Initializing an LLM engine (v0.5.4) with config: model='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', speculative_config=None, tokenizer='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', skip_tokenizer_init=False, tokenizer_mode=auto, revision=None, rope_scaling=None, rope_theta=None, tokenizer_revision=None, trust_remote_code=False, dtype=torch.bfloat16, max_seq_len=131072, download_dir=None, load_format=LoadFormat.AUTO, tensor_parallel_size=8, pipeline_parallel_size=1, disable_custom_all_reduce=False, quantization=fbgemm_fp8, enforce_eager=False, kv_cache_dtype=auto, quantization_param_path=None, device_config=cuda, decoding_config=DecodingConfig(guided_decoding_backend='outlines'), observability_config=ObservabilityConfig(otlp_traces_endpoint=None), seed=0, served_model_name=/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723, use_v2_block_manager=False, enable_prefix_caching=False) WARNING 08-19 19:00:47 multiproc_gpu_executor.py:59] Reducing Torch parallelism from 112 threads to 1 to avoid unnecessary CPU contention. Set OMP_NUM_THREADS in the external environment to tune this value as needed. INFO 08-19 19:00:47 custom_cache_manager.py:17] Setting Triton cache manager to: vllm.triton_utils.custom_cache_manager:CustomCacheManager (VllmWorkerProcess pid=413) INFO 08-19 19:00:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=414) INFO 08-19 19:00:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=416) INFO 08-19 19:00:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=412) INFO 08-19 19:00:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=415) INFO 08-19 19:00:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=417) INFO 08-19 19:00:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=411) INFO 08-19 19:00:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=412) INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=413) INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=412) INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=414) INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=413) INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=417) INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=416) INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=414) INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=417) INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=416) INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=411) INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=411) INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=415) INFO 08-19 19:00:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=415) INFO 08-19 19:00:49 pynccl.py:63] vLLM is using nccl==2.20.5 
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last):
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 132, in init_device
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     init_worker_distributed_environment(self.parallel_config, self.rank,
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 348, in init_worker_distributed_environment
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     ensure_model_parallel_initialized(parallel_config.tensor_parallel_size,
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 965, in ensure_model_parallel_initialized
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     initialize_model_parallel(tensor_model_parallel_size,
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 931, in initialize_model_parallel
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     _TP = init_model_parallel_group(group_ranks,
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 773, in init_model_parallel_group
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     return GroupCoordinator(
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 154, in __init__
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     self.pynccl_comm = PyNcclCommunicator(
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in __init__
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     self.comm: ncclComm_t = self.nccl.ncclCommInitRank(
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm),
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226]     raise RuntimeError(f"NCCL error: {error_str}")
(VllmWorkerProcess pid=416) ERROR 08-19 19:00:55 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details)

(VllmWorkerProcess pid=415) and (VllmWorkerProcess pid=411) report the identical traceback and RuntimeError. The server is then interrupted with Ctrl-C:

^CTraceback (most recent call last):
  File "/opt/conda/bin/vllm", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/vllm/scripts.py", line 149, in main
    args.dispatch_function(args)
  File "/opt/conda/lib/python3.10/site-packages/vllm/scripts.py", line 30, in serve
    asyncio.run(run_server(args))
  File "/opt/conda/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/opt/conda/lib/python3.10/asyncio/base_events.py", line 636, in run_until_complete
    self.run_forever()
  File "/opt/conda/lib/python3.10/asyncio/base_events.py", line 603, in run_forever
    self._run_once()
  File "/opt/conda/lib/python3.10/asyncio/base_events.py", line 1871, in _run_once
    event_list = self._selector.select(timeout)
  File "/opt/conda/lib/python3.10/selectors.py", line 469, in select
    fd_event_list = self._selector.poll(timeout, max_ev)
KeyboardInterrupt
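For context, every worker fails in the same place: vLLM's ctypes-based NCCL wrapper turns a non-zero `ncclCommInitRank` return code into the `RuntimeError` shown. Below is a minimal sketch of that check pattern; only `ncclCommInitRank`/`ncclGetErrorString`, the `libnccl.so.2` library, and the raised message come from the traceback and log, while the `nccl_check` helper and loading details are illustrative assumptions, not vLLM's actual implementation.

```python
# Sketch only: approximates the NCCL_CHECK / ncclCommInitRank path seen in the
# traceback (pynccl_wrapper.py). Names other than the NCCL C symbols are assumed.
import ctypes

nccl = ctypes.CDLL("libnccl.so.2")  # same library the log reports ("Found nccl from library libnccl.so.2")
nccl.ncclGetErrorString.restype = ctypes.c_char_p

def nccl_check(result: int) -> None:
    """Raise if an NCCL call did not return ncclSuccess (0)."""
    if result != 0:
        error_str = nccl.ncclGetErrorString(result).decode()
        # ncclUnhandledCudaError surfaces here as "NCCL error: unhandled cuda error"
        raise RuntimeError(f"NCCL error: {error_str}")

# e.g. nccl_check(nccl.ncclCommInitRank(ctypes.byref(comm), nranks, unique_id, rank))
```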

I have no name!@h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:/mnt$ ^C
I have no name!@h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:/mnt$ ^C
I have no name!@h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:/mnt$ ps aux
USER       PID %CPU %MEM    VSZ   RSS TTY   STAT START   TIME COMMAND
103771       1  0.0  0.0   5656  3056 ?     Ss   18:54   0:00 /bin/bash -c -- while true; do sleep 300; done
103771     128  0.0  0.0   6364  3976 pts/0 Ss   18:58   0:00 bash
103771     209  0.0  0.0   4284   588 ?     S    18:59   0:00 sleep 300
103771     679  0.0  0.0   7704  3192 pts/0 R+   19:03   0:00 ps aux
I have no name!@h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:/mnt$ export NCCL_DEBUG=TRACE
I have no name!@h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:/mnt$ vllm serve /mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723 --tensor-parallel-size 8
There was a problem when trying to write in your cache folder (/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
INFO 08-19 19:03:46 api_server.py:339] vLLM API server version 0.5.4
INFO 08-19 19:03:46 api_server.py:340] args: Namespace(model_tag='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', host=None, port=8000, uvicorn_log_level='info', allow_credentials=False, allowed_origins=['*'], allowed_methods=['*'], allowed_headers=['*'], api_key=None, lora_modules=None, prompt_adapters=None, chat_template=None, response_role='assistant', ssl_keyfile=None, ssl_certfile=None, ssl_ca_certs=None, ssl_cert_reqs=0, root_path=None, middleware=[], return_tokens_as_token_ids=False, disable_frontend_multiprocessing=False, model='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', tokenizer=None, skip_tokenizer_init=False, revision=None, code_revision=None, tokenizer_revision=None, tokenizer_mode='auto', trust_remote_code=False, download_dir=None, load_format='auto', dtype='auto', kv_cache_dtype='auto', quantization_param_path=None, max_model_len=None, guided_decoding_backend='outlines', distributed_executor_backend=None, worker_use_ray=False, pipeline_parallel_size=1, tensor_parallel_size=8, max_parallel_loading_workers=None, ray_workers_use_nsight=False, block_size=16, enable_prefix_caching=False, disable_sliding_window=False, use_v2_block_manager=False, num_lookahead_slots=0, seed=0, swap_space=4, cpu_offload_gb=0, gpu_memory_utilization=0.9, num_gpu_blocks_override=None, max_num_batched_tokens=None, max_num_seqs=256, max_logprobs=20, disable_log_stats=False, quantization=None, rope_scaling=None, rope_theta=None, enforce_eager=False, max_context_len_to_capture=None, max_seq_len_to_capture=8192, disable_custom_all_reduce=False, tokenizer_pool_size=0, tokenizer_pool_type='ray', tokenizer_pool_extra_config=None, enable_lora=False, max_loras=1, max_lora_rank=16, lora_extra_vocab_size=256, lora_dtype='auto', long_lora_scaling_factors=None, max_cpu_loras=None, fully_sharded_loras=False, enable_prompt_adapter=False, max_prompt_adapters=1, max_prompt_adapter_token=0, device='auto', scheduler_delay_factor=0.0, enable_chunked_prefill=None, speculative_model=None, num_speculative_tokens=None, speculative_draft_tensor_parallel_size=None, speculative_max_model_len=None, speculative_disable_by_batch_size=None, ngram_prompt_lookup_max=None, ngram_prompt_lookup_min=None, spec_decoding_acceptance_method='rejection_sampler', typical_acceptance_sampler_posterior_threshold=None, typical_acceptance_sampler_posterior_alpha=None, disable_logprobs_during_spec_decoding=None, model_loader_extra_config=None, ignore_patterns=[], preemption_mode=None, served_model_name=None, qlora_adapter_name_or_path=None, otlp_traces_endpoint=None, engine_use_ray=False, disable_log_requests=False, max_log_len=None, dispatch_function=<function serve at 0x7fee0fddee60>)
WARNING 08-19 19:03:46 config.py:1454] Casting torch.bfloat16 to torch.float16.
INFO 08-19 19:03:46 config.py:729] Defaulting to use mp for distributed inference
WARNING 08-19 19:03:46 arg_utils.py:766] Chunked prefill is enabled by default for models with max_model_len > 32K. Currently, chunked prefill might not work with some features or models. If you encounter any issues, please disable chunked prefill by setting --enable-chunked-prefill=False.
INFO 08-19 19:03:46 config.py:820] Chunked prefill is enabled with max_num_batched_tokens=512.
INFO 08-19 19:03:46 llm_engine.py:174] Initializing an LLM engine (v0.5.4) with config: model='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', speculative_config=None, tokenizer='/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723', skip_tokenizer_init=False, tokenizer_mode=auto, revision=None, rope_scaling=None, rope_theta=None, tokenizer_revision=None, trust_remote_code=False, dtype=torch.bfloat16, max_seq_len=131072, download_dir=None, load_format=LoadFormat.AUTO, tensor_parallel_size=8, pipeline_parallel_size=1, disable_custom_all_reduce=False, quantization=fbgemm_fp8, enforce_eager=False, kv_cache_dtype=auto, quantization_param_path=None, device_config=cuda, decoding_config=DecodingConfig(guided_decoding_backend='outlines'), observability_config=ObservabilityConfig(otlp_traces_endpoint=None), seed=0, served_model_name=/mnt/llm-models/models/llm-demo-project/Llama-3_1-405B-FP8/240723, use_v2_block_manager=False, enable_prefix_caching=False)
WARNING 08-19 19:03:46 multiproc_gpu_executor.py:59] Reducing Torch parallelism from 112 threads to 1 to avoid unnecessary CPU contention. Set OMP_NUM_THREADS in the external environment to tune this value as needed.
INFO 08-19 19:03:46 custom_cache_manager.py:17] Setting Triton cache manager to: vllm.triton_utils.custom_cache_manager:CustomCacheManager (VllmWorkerProcess pid=841) INFO 08-19 19:03:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=842) INFO 08-19 19:03:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=843) INFO 08-19 19:03:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=844) INFO 08-19 19:03:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=839) INFO 08-19 19:03:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=840) INFO 08-19 19:03:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks (VllmWorkerProcess pid=845) INFO 08-19 19:03:47 multiproc_worker_utils.py:215] Worker ready; awaiting tasks INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=841) INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=841) INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation (VllmWorkerProcess pid=839) INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=840) INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=842) INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=839) INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=840) INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=843) INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=844) INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=845) INFO 08-19 19:03:49 utils.py:841] Found nccl from library libnccl.so.2 (VllmWorkerProcess pid=842) INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=844) INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=843) INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 (VllmWorkerProcess pid=845) INFO 08-19 19:03:49 pynccl.py:63] vLLM is using nccl==2.20.5 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO cudaDriverVersion 12020 NCCL version 2.20.5+cuda12.4 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO cudaDriverVersion 12020 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO cudaDriverVersion 12020 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO cudaDriverVersion 12020 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO NET/Plugin : 
dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO cudaDriverVersion 12020 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO cudaDriverVersion 12020 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO cudaDriverVersion 12020 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO cudaDriverVersion 12020 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> 
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Failed to open libibverbs.so[.1] h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0> h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Using non-device net plugin version 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Using network Socket h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO comm 0xb1324d0 rank 4 nranks 8 cudaDev 4 nvmlDev 4 busId 9a000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO comm 0xb132b70 rank 5 nranks 8 cudaDev 5 nvmlDev 5 busId ab000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO comm 0xb1310a0 rank 1 nranks 8 cudaDev 1 nvmlDev 1 busId 2a000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO comm 0xb132850 rank 2 nranks 8 cudaDev 2 nvmlDev 2 busId 3a000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO comm 0xb132960 rank 3 nranks 8 cudaDev 3 nvmlDev 3 busId 5d000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO comm 0xb139c40 rank 0 nranks 8 cudaDev 0 nvmlDev 0 busId 18000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO comm 0xb134160 rank 6 nranks 
8 cudaDev 6 nvmlDev 6 busId ba000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO comm 0xb134030 rank 7 nranks 8 cudaDev 7 nvmlDev 7 busId db000 commId 0x2405afa45ee5c660 - Init START h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Setting affinity for GPU 5 to fffffff0,00000000,00000000,0000ffff,fff00000,00000000,00000000 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO NVLS multicast support is available on dev 5 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Setting affinity for GPU 4 to 0f,ffffff00,00000000,00000000,000fffff,ff000000,00000000 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO NVLS multicast support is available on dev 4 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Setting affinity for GPU 1 to ff,fffff000,00000000,00000000,00ffffff,f0000000 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Setting affinity for GPU 0 to 0fff,ffff0000,00000000,00000000,0fffffff h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO NVLS multicast support is available on dev 1 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO NVLS multicast support is available on dev 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO NCCL_CUMEM_ENABLE set by environment to 0. 
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Setting affinity for GPU 3 to 0fff,ffff0000,00000000,00000000,0fffffff h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO NVLS multicast support is available on dev 3 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Setting affinity for GPU 7 to 0f,ffffff00,00000000,00000000,000fffff,ff000000,00000000 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO NVLS multicast support is available on dev 7 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Setting affinity for GPU 6 to fffffff0,00000000,00000000,0000ffff,fff00000,00000000,00000000 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO NVLS multicast support is available on dev 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Setting affinity for GPU 2 to ff,fffff000,00000000,00000000,00ffffff,f0000000 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO NVLS multicast support is available on dev 2 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO comm 0xb1310a0 rank 1 nRanks 8 nNodes 1 localRanks 8 localRank 1 MNNVL 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO comm 0xb134160 rank 6 nRanks 8 nNodes 1 localRanks 8 localRank 6 MNNVL 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO comm 0xb132b70 rank 5 nRanks 8 nNodes 1 localRanks 8 localRank 5 MNNVL 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO comm 0xb139c40 rank 0 nRanks 8 nNodes 1 localRanks 8 localRank 0 MNNVL 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 00/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO comm 0xb132850 rank 2 nRanks 8 nNodes 1 localRanks 8 localRank 2 MNNVL 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 01/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Trees [0] 6/-1/-1->5->7 [1] 6/-1/-1->5->7 [2] 6/-1/-1->5->7 [3] 6/-1/-1->5->7 [4] 6/-1/-1->5->7 [5] 6/-1/-1->5->7 [6] 6/-1/-1->5->7 [7] 6/-1/-1->5->7 [8] 6/-1/-1->5->7 [9] 6/-1/-1->5->7 [10] 6/-1/-1->5->7 [11] 6/-1/-1->5->7 [12] 6/-1/-1->5->7 [13] 6/-1/-1->5->7 [14] 6/-1/-1->5->7 [15] 6/-1/-1->5->7 [16] 6/-1/-1->5->7 [17] 6/-1/-1->5->7 [18] 6/-1/-1->5->7 [19] 6/-1/-1->5->7 [20] 6/-1/-1->5->7 [21] 6/-1/-1->5->7 [22] 6/-1/-1->5->7 [23] 6/-1/-1->5->7 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Trees [0] -1/-1/-1->6->5 [1] -1/-1/-1->6->5 [2] -1/-1/-1->6->5 [3] -1/-1/-1->6->5 [4] -1/-1/-1->6->5 [5] -1/-1/-1->6->5 [6] -1/-1/-1->6->5 [7] -1/-1/-1->6->5 [8] -1/-1/-1->6->5 [9] -1/-1/-1->6->5 [10] -1/-1/-1->6->5 [11] -1/-1/-1->6->5 [12] -1/-1/-1->6->5 [13] -1/-1/-1->6->5 [14] -1/-1/-1->6->5 [15] -1/-1/-1->6->5 [16] -1/-1/-1->6->5 [17] -1/-1/-1->6->5 [18] -1/-1/-1->6->5 [19] -1/-1/-1->6->5 [20] -1/-1/-1->6->5 [21] -1/-1/-1->6->5 [22] -1/-1/-1->6->5 [23] -1/-1/-1->6->5 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 02/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO P2P Chunksize set to 524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO comm 0xb132960 rank 3 nRanks 8 nNodes 1 localRanks 8 localRank 3 MNNVL 0 
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO P2P Chunksize set to 524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 03/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Trees [0] 2/-1/-1->1->3 [1] 2/-1/-1->1->3 [2] 2/-1/-1->1->3 [3] 2/-1/-1->1->3 [4] 2/-1/-1->1->3 [5] 2/-1/-1->1->3 [6] 2/-1/-1->1->3 [7] 2/-1/-1->1->3 [8] 2/-1/-1->1->3 [9] 2/-1/-1->1->3 [10] 2/-1/-1->1->3 [11] 2/-1/-1->1->3 [12] 2/-1/-1->1->3 [13] 2/-1/-1->1->3 [14] 2/-1/-1->1->3 [15] 2/-1/-1->1->3 [16] 2/-1/-1->1->3 [17] 2/-1/-1->1->3 [18] 2/-1/-1->1->3 [19] 2/-1/-1->1->3 [20] 2/-1/-1->1->3 [21] 2/-1/-1->1->3 [22] 2/-1/-1->1->3 [23] 2/-1/-1->1->3 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO comm 0xb134030 rank 7 nRanks 8 nNodes 1 localRanks 8 localRank 7 MNNVL 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 04/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Trees [0] 4/-1/-1->2->1 [1] 4/-1/-1->2->1 [2] 4/-1/-1->2->1 [3] 4/-1/-1->2->1 [4] 4/-1/-1->2->1 [5] 4/-1/-1->2->1 [6] 4/-1/-1->2->1 [7] 4/-1/-1->2->1 [8] 4/-1/-1->2->1 [9] 4/-1/-1->2->1 [10] 4/-1/-1->2->1 [11] 4/-1/-1->2->1 [12] 4/-1/-1->2->1 [13] 4/-1/-1->2->1 [14] 4/-1/-1->2->1 [15] 4/-1/-1->2->1 [16] 4/-1/-1->2->1 [17] 4/-1/-1->2->1 [18] 4/-1/-1->2->1 [19] 4/-1/-1->2->1 [20] 4/-1/-1->2->1 [21] 4/-1/-1->2->1 [22] 4/-1/-1->2->1 [23] 4/-1/-1->2->1 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO comm 0xb1324d0 rank 4 nRanks 8 nNodes 1 localRanks 8 localRank 4 MNNVL 0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO P2P Chunksize set to 524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 05/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO P2P Chunksize set to 524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Trees [0] 1/-1/-1->3->0 [1] 1/-1/-1->3->0 [2] 1/-1/-1->3->0 [3] 1/-1/-1->3->0 [4] 1/-1/-1->3->0 [5] 1/-1/-1->3->0 [6] 1/-1/-1->3->0 [7] 1/-1/-1->3->0 [8] 1/-1/-1->3->0 [9] 1/-1/-1->3->0 [10] 1/-1/-1->3->0 [11] 1/-1/-1->3->0 [12] 1/-1/-1->3->0 [13] 1/-1/-1->3->0 [14] 1/-1/-1->3->0 [15] 1/-1/-1->3->0 [16] 1/-1/-1->3->0 [17] 1/-1/-1->3->0 [18] 1/-1/-1->3->0 [19] 1/-1/-1->3->0 [20] 1/-1/-1->3->0 [21] 1/-1/-1->3->0 [22] 1/-1/-1->3->0 [23] 1/-1/-1->3->0 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 06/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO P2P Chunksize set to 524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 07/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Trees [0] 5/-1/-1->7->4 [1] 5/-1/-1->7->4 [2] 5/-1/-1->7->4 [3] 5/-1/-1->7->4 [4] 5/-1/-1->7->4 [5] 5/-1/-1->7->4 [6] 5/-1/-1->7->4 [7] 5/-1/-1->7->4 [8] 5/-1/-1->7->4 [9] 5/-1/-1->7->4 [10] 5/-1/-1->7->4 [11] 5/-1/-1->7->4 [12] 5/-1/-1->7->4 [13] 5/-1/-1->7->4 [14] 5/-1/-1->7->4 [15] 5/-1/-1->7->4 [16] 5/-1/-1->7->4 [17] 5/-1/-1->7->4 [18] 5/-1/-1->7->4 [19] 5/-1/-1->7->4 [20] 5/-1/-1->7->4 [21] 5/-1/-1->7->4 [22] 5/-1/-1->7->4 [23] 5/-1/-1->7->4 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 08/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO P2P Chunksize set to 
524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 09/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Trees [0] 7/-1/-1->4->2 [1] 7/-1/-1->4->2 [2] 7/-1/-1->4->2 [3] 7/-1/-1->4->2 [4] 7/-1/-1->4->2 [5] 7/-1/-1->4->2 [6] 7/-1/-1->4->2 [7] 7/-1/-1->4->2 [8] 7/-1/-1->4->2 [9] 7/-1/-1->4->2 [10] 7/-1/-1->4->2 [11] 7/-1/-1->4->2 [12] 7/-1/-1->4->2 [13] 7/-1/-1->4->2 [14] 7/-1/-1->4->2 [15] 7/-1/-1->4->2 [16] 7/-1/-1->4->2 [17] 7/-1/-1->4->2 [18] 7/-1/-1->4->2 [19] 7/-1/-1->4->2 [20] 7/-1/-1->4->2 [21] 7/-1/-1->4->2 [22] 7/-1/-1->4->2 [23] 7/-1/-1->4->2 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 10/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO P2P Chunksize set to 524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 11/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 12/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 13/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 14/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 15/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 16/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 17/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 18/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 19/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 20/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 21/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 22/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 23/24 : 0 3 1 2 4 7 5 6 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Trees [0] 3/-1/-1->0->-1 [1] 3/-1/-1->0->-1 [2] 3/-1/-1->0->-1 [3] 3/-1/-1->0->-1 [4] 3/-1/-1->0->-1 [5] 3/-1/-1->0->-1 [6] 3/-1/-1->0->-1 [7] 3/-1/-1->0->-1 [8] 3/-1/-1->0->-1 [9] 3/-1/-1->0->-1 [10] 3/-1/-1->0->-1 [11] 3/-1/-1->0->-1 [12] 3/-1/-1->0->-1 [13] 3/-1/-1->0->-1 [14] 3/-1/-1->0->-1 [15] 3/-1/-1->0->-1 [16] 3/-1/-1->0->-1 [17] 3/-1/-1->0->-1 [18] 3/-1/-1->0->-1 [19] 3/-1/-1->0->-1 [20] 3/-1/-1->0->-1 [21] 3/-1/-1->0->-1 [22] 3/-1/-1->0->-1 [23] 3/-1/-1->0->-1 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO P2P Chunksize set to 524288 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 00/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 01/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 02/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 03/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 00/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 
04/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 05/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 01/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 06/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 02/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 07/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 03/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 08/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 04/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 09/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 10/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 05/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 11/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 06/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 12/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 07/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 13/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 08/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 14/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 09/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 15/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 10/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 16/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 11/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 17/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 12/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 18/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 13/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 19/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 14/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 20/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL 
INFO Channel 15/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 21/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 16/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 22/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 17/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 23/0 : 5[5] -> 6[6] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 18/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 00/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 19/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 01/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 20/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 02/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 21/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 03/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 22/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 04/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 23/0 : 1[1] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 05/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 06/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 00/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 07/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 01/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 08/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 02/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 09/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 03/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 10/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 04/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 11/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 05/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 12/0 : 6[6] -> 0[0] via P2P/IPC 
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 06/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 13/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 07/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 14/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 08/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 15/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 09/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 16/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 10/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 17/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 11/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 18/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 12/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 19/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 13/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 20/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 14/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 21/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 15/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 22/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 16/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 23/0 : 6[6] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 17/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 00/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 18/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 01/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 19/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 20/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 02/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 03/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 21/0 : 2[2] -> 4[4] 
via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 04/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 22/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 05/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 23/0 : 2[2] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 06/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 00/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 07/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 01/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 02/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 03/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 08/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 04/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 09/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 05/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 10/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 06/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 11/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 07/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 12/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 08/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 13/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 14/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 15/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 09/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 16/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 10/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 17/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 11/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 18/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 12/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 19/0 : 
0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 13/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 20/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 14/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 21/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 15/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 22/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 16/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Channel 23/0 : 0[0] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 17/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 00/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 18/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 01/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 19/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 02/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 20/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 03/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 21/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 04/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 05/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 22/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 06/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 23/0 : 4[4] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 07/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 00/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 08/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 01/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 09/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 02/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 10/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 03/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO 
Channel 11/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 04/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 12/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 05/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 13/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 06/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 14/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 07/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 08/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 15/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 09/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 16/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 10/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 11/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 17/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 18/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 19/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 20/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 12/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 21/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 13/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 22/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 14/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 23/0 : 3[3] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 15/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 16/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 17/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 18/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 19/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 20/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 21/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] 
NCCL INFO Channel 22/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 23/0 : 7[7] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 00/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 00/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Connected all rings h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 00/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 01/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 02/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 01/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 01/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 03/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 02/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 02/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 04/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 03/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 03/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 05/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 04/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 04/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 06/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 05/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 05/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 07/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 06/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 06/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 08/0 : 6[6] -> 
5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 07/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 07/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 09/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 08/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 08/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 10/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 09/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 09/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 11/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 10/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 10/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 12/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 11/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 11/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 13/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 12/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 12/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 14/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 13/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 13/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 15/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 14/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 14/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 16/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 15/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 15/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 17/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 16/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 16/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 18/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 17/0 
: 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 17/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 19/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 18/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 18/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 20/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 19/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 19/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 21/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 20/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 20/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 22/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 21/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 21/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Channel 23/0 : 6[6] -> 5[5] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 22/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 22/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Channel 23/0 : 1[1] -> 3[3] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Channel 23/0 : 5[5] -> 7[7] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 00/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 01/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 02/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 03/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 04/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 05/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 00/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 06/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 01/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 07/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 02/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 08/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO 
Channel 03/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 09/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 04/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 10/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 05/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 11/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 06/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 12/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 07/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 13/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 08/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 14/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 09/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 15/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 10/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 16/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 11/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 17/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 12/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 18/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 13/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 19/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 14/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 20/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 21/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 22/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Channel 23/0 : 3[3] -> 0[0] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 15/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 16/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 17/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 18/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] 
NCCL INFO Channel 19/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 20/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 21/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 22/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Channel 23/0 : 7[7] -> 4[4] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 00/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 01/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 02/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 03/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 04/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 05/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 06/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 07/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 08/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 09/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 10/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 11/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 12/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 13/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 14/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 15/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 16/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 17/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 18/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 19/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 20/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 21/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 22/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Channel 23/0 : 4[4] -> 2[2] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 00/0 : 2[2] 
-> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO NVLS comm 0xb134160 headRank 7 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO NVLS comm 0xb132b70 headRank 6 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 01/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 02/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 03/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 04/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 05/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 06/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 07/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 08/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 09/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 10/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 11/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 12/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 13/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 14/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 15/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 16/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 17/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 18/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 19/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 20/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 21/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 22/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Channel 23/0 : 2[2] -> 1[1] via P2P/IPC h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO NVLS comm 0xb134030 headRank 5 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO NVLS comm 0xb1324d0 
headRank 4 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO Connected all trees h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO NVLS comm 0xb132850 headRank 3 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO NVLS comm 0xb1310a0 headRank 2 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO NVLS comm 0xb139c40 headRank 0 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104 h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO NVLS comm 0xb132960 headRank 1 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO init.cc:1746 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:843:843 [5] NCCL INFO init.cc:1784 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:844:844 [6] NCCL INFO init.cc:1784 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:842:842 [4] NCCL INFO init.cc:1784 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:840:840 [2] NCCL INFO init.cc:1784 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:774:774 [0] NCCL INFO init.cc:1784 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:845:845 [7] NCCL INFO init.cc:1784 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:841:841 [3] NCCL INFO init.cc:1784 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:839:839 [1] NCCL INFO init.cc:1784 -> 1
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last):
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 132, in init_device
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     init_worker_distributed_environment(self.parallel_config, self.rank,
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 348, in init_worker_distributed_environment
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     ensure_model_parallel_initialized(parallel_config.tensor_parallel_size,
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 965, in ensure_model_parallel_initialized
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     initialize_model_parallel(tensor_model_parallel_size,
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 931, in initialize_model_parallel
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     _TP = init_model_parallel_group(group_ranks,
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 773, in init_model_parallel_group
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     return GroupCoordinator(
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 154, in __init__
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.pynccl_comm = PyNcclCommunicator(
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in __init__
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.comm: ncclComm_t = self.nccl.ncclCommInitRank(
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm),
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     raise RuntimeError(f"NCCL error: {error_str}")
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details)
(VllmWorkerProcess pid=843) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last):
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 132, in init_device
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     init_worker_distributed_environment(self.parallel_config, self.rank,
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 348, in init_worker_distributed_environment
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     ensure_model_parallel_initialized(parallel_config.tensor_parallel_size,
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 965, in ensure_model_parallel_initialized
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     initialize_model_parallel(tensor_model_parallel_size,
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 931, in initialize_model_parallel
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     _TP = init_model_parallel_group(group_ranks,
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 773, in init_model_parallel_group
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     return GroupCoordinator(
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 154, in __init__
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.pynccl_comm = PyNcclCommunicator(
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in __init__
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.comm: ncclComm_t = self.nccl.ncclCommInitRank(
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm),
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     raise RuntimeError(f"NCCL error: {error_str}")
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details)
(VllmWorkerProcess pid=844) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method init_device: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details), Traceback (most recent call last):
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 132, in init_device
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     init_worker_distributed_environment(self.parallel_config, self.rank,
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 348, in init_worker_distributed_environment
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     ensure_model_parallel_initialized(parallel_config.tensor_parallel_size,
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 965, in ensure_model_parallel_initialized
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     initialize_model_parallel(tensor_model_parallel_size,
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 931, in initialize_model_parallel
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     _TP = init_model_parallel_group(group_ranks,
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 773, in init_model_parallel_group
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     return GroupCoordinator(
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/parallel_state.py", line 154, in __init__
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.pynccl_comm = PyNcclCommunicator(
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl.py", line 89, in __init__
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.comm: ncclComm_t = self.nccl.ncclCommInitRank(
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 244, in ncclCommInitRank
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     self.NCCL_CHECK(self._funcs["ncclCommInitRank"](ctypes.byref(comm),
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/distributed/device_communicators/pynccl_wrapper.py", line 223, in NCCL_CHECK
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]     raise RuntimeError(f"NCCL error: {error_str}")
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226] RuntimeError: NCCL error: unhandled cuda error (run with NCCL_DEBUG=INFO for details)
(VllmWorkerProcess pid=839) ERROR 08-19 19:03:55 multiproc_worker_utils.py:226]
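The WARN lines above come from NCCL's NVLS (NVLink SHARP) setup in `transport/nvls.cc`, and the failure then surfaces from `ncclCommInitRank` inside vLLM's `PyNcclCommunicator`. As a diagnostic sketch only (not a fix confirmed in this thread), NVLS can be switched off with the standard `NCCL_NVLS_ENABLE` environment variable to check whether initialization succeeds without multicast:

```python
# Diagnostic sketch only -- not a fix confirmed in this issue.
# NCCL_NVLS_ENABLE is a standard NCCL environment variable; 0 disables NVLink SHARP
# (NVLS) so NCCL falls back to plain NVLink P2P. Set it before vLLM starts so the
# spawned worker processes inherit it.
import os
os.environ["NCCL_NVLS_ENABLE"] = "0"

from vllm import LLM

# "<model-path>" is a placeholder; tensor_parallel_size=8 matches the 8 H100s above.
llm = LLM(model="<model-path>", tensor_parallel_size=8)
print(llm.generate("Hello")[0].outputs[0].text)
```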

xiejibing commented 3 weeks ago
Here is the log after running `NCCL_DEBUG=TRACE torchrun --nproc-per-node=8 test.py`:

vLLM: 0.5.4
CUDA: 12.2
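
For context, `test.py` here is just a minimal NCCL sanity check along the lines of the sketch below (an assumption about its contents; the exact script may differ). It only creates the process group and runs one all-reduce across the 8 ranks started by torchrun:

```python
# test.py -- minimal NCCL sanity check (a sketch; the exact script may differ).
# torchrun --nproc-per-node=8 sets RANK, WORLD_SIZE and LOCAL_RANK for each process.
import os
import torch
import torch.distributed as dist

dist.init_process_group(backend="nccl")   # env:// rendezvous provided by torchrun
local_rank = int(os.environ["LOCAL_RANK"])
torch.cuda.set_device(local_rank)

x = torch.ones(1, device=f"cuda:{local_rank}")
dist.all_reduce(x)                        # NCCL communicator is created here; expect 8.0 on every rank
print(f"rank {dist.get_rank()}: {x.item()}")
dist.destroy_process_group()
```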

-----
W0819 19:28:50.060000 140406652626752 torch/distributed/run.py:779] 
W0819 19:28:50.060000 140406652626752 torch/distributed/run.py:779] *****************************************
W0819 19:28:50.060000 140406652626752 torch/distributed/run.py:779] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
W0819 19:28:50.060000 140406652626752 torch/distributed/run.py:779] *****************************************
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2336 [0] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2336 [0] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2336 [0] NCCL INFO cudaDriverVersion 12020
NCCL version 2.20.5+cuda12.4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2337 [1] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2338 [2] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2337 [1] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2338 [2] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2337 [1] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2338 [2] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2341 [5] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2343 [7] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2341 [5] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2343 [7] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2341 [5] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2343 [7] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2342 [6] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2339 [3] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2342 [6] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2339 [3] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2340 [4] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2340 [4] NCCL INFO Bootstrap : Using eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2342 [6] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2339 [3] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2340 [4] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO NET/Socket : Using [0]eth0:10.177.137.15<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO comm 0x7c429b0 rank 2 nranks 8 cudaDev 2 nvmlDev 2 busId 3a000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO comm 0x7d01420 rank 3 nranks 8 cudaDev 3 nvmlDev 3 busId 5d000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO comm 0x87952f0 rank 6 nranks 8 cudaDev 6 nvmlDev 6 busId ba000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO comm 0x7687540 rank 7 nranks 8 cudaDev 7 nvmlDev 7 busId db000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO comm 0x7e3e850 rank 1 nranks 8 cudaDev 1 nvmlDev 1 busId 2a000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO comm 0x8d3a240 rank 4 nranks 8 cudaDev 4 nvmlDev 4 busId 9a000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO comm 0x7a20aa0 rank 0 nranks 8 cudaDev 0 nvmlDev 0 busId 18000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO comm 0x8df0a00 rank 5 nranks 8 cudaDev 5 nvmlDev 5 busId ab000 commId 0x50e06a01eda5a139 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Setting affinity for GPU 3 to 0fff,ffff0000,00000000,00000000,0fffffff
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO NVLS multicast support is available on dev 3
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Setting affinity for GPU 6 to fffffff0,00000000,00000000,0000ffff,fff00000,00000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO NVLS multicast support is available on dev 6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Setting affinity for GPU 4 to 0f,ffffff00,00000000,00000000,000fffff,ff000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO NVLS multicast support is available on dev 4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Setting affinity for GPU 7 to 0f,ffffff00,00000000,00000000,000fffff,ff000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO NVLS multicast support is available on dev 7
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Setting affinity for GPU 2 to ff,fffff000,00000000,00000000,00ffffff,f0000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO NVLS multicast support is available on dev 2
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Setting affinity for GPU 0 to 0fff,ffff0000,00000000,00000000,0fffffff
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO NVLS multicast support is available on dev 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Setting affinity for GPU 1 to ff,fffff000,00000000,00000000,00ffffff,f0000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO NVLS multicast support is available on dev 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Setting affinity for GPU 5 to fffffff0,00000000,00000000,0000ffff,fff00000,00000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO NVLS multicast support is available on dev 5
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO comm 0x87952f0 rank 6 nRanks 8 nNodes 1 localRanks 8 localRank 6 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO comm 0x8d3a240 rank 4 nRanks 8 nNodes 1 localRanks 8 localRank 4 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO comm 0x7d01420 rank 3 nRanks 8 nNodes 1 localRanks 8 localRank 3 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO comm 0x7e3e850 rank 1 nRanks 8 nNodes 1 localRanks 8 localRank 1 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Trees [0] -1/-1/-1->6->5 [1] -1/-1/-1->6->5 [2] -1/-1/-1->6->5 [3] -1/-1/-1->6->5 [4] -1/-1/-1->6->5 [5] -1/-1/-1->6->5 [6] -1/-1/-1->6->5 [7] -1/-1/-1->6->5 [8] -1/-1/-1->6->5 [9] -1/-1/-1->6->5 [10] -1/-1/-1->6->5 [11] -1/-1/-1->6->5 [12] -1/-1/-1->6->5 [13] -1/-1/-1->6->5 [14] -1/-1/-1->6->5 [15] -1/-1/-1->6->5 [16] -1/-1/-1->6->5 [17] -1/-1/-1->6->5 [18] -1/-1/-1->6->5 [19] -1/-1/-1->6->5 [20] -1/-1/-1->6->5 [21] -1/-1/-1->6->5 [22] -1/-1/-1->6->5 [23] -1/-1/-1->6->5
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO comm 0x8df0a00 rank 5 nRanks 8 nNodes 1 localRanks 8 localRank 5 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Trees [0] 7/-1/-1->4->2 [1] 7/-1/-1->4->2 [2] 7/-1/-1->4->2 [3] 7/-1/-1->4->2 [4] 7/-1/-1->4->2 [5] 7/-1/-1->4->2 [6] 7/-1/-1->4->2 [7] 7/-1/-1->4->2 [8] 7/-1/-1->4->2 [9] 7/-1/-1->4->2 [10] 7/-1/-1->4->2 [11] 7/-1/-1->4->2 [12] 7/-1/-1->4->2 [13] 7/-1/-1->4->2 [14] 7/-1/-1->4->2 [15] 7/-1/-1->4->2 [16] 7/-1/-1->4->2 [17] 7/-1/-1->4->2 [18] 7/-1/-1->4->2 [19] 7/-1/-1->4->2 [20] 7/-1/-1->4->2 [21] 7/-1/-1->4->2 [22] 7/-1/-1->4->2 [23] 7/-1/-1->4->2
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Trees [0] 1/-1/-1->3->0 [1] 1/-1/-1->3->0 [2] 1/-1/-1->3->0 [3] 1/-1/-1->3->0 [4] 1/-1/-1->3->0 [5] 1/-1/-1->3->0 [6] 1/-1/-1->3->0 [7] 1/-1/-1->3->0 [8] 1/-1/-1->3->0 [9] 1/-1/-1->3->0 [10] 1/-1/-1->3->0 [11] 1/-1/-1->3->0 [12] 1/-1/-1->3->0 [13] 1/-1/-1->3->0 [14] 1/-1/-1->3->0 [15] 1/-1/-1->3->0 [16] 1/-1/-1->3->0 [17] 1/-1/-1->3->0 [18] 1/-1/-1->3->0 [19] 1/-1/-1->3->0 [20] 1/-1/-1->3->0 [21] 1/-1/-1->3->0 [22] 1/-1/-1->3->0 [23] 1/-1/-1->3->0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Trees [0] 6/-1/-1->5->7 [1] 6/-1/-1->5->7 [2] 6/-1/-1->5->7 [3] 6/-1/-1->5->7 [4] 6/-1/-1->5->7 [5] 6/-1/-1->5->7 [6] 6/-1/-1->5->7 [7] 6/-1/-1->5->7 [8] 6/-1/-1->5->7 [9] 6/-1/-1->5->7 [10] 6/-1/-1->5->7 [11] 6/-1/-1->5->7 [12] 6/-1/-1->5->7 [13] 6/-1/-1->5->7 [14] 6/-1/-1->5->7 [15] 6/-1/-1->5->7 [16] 6/-1/-1->5->7 [17] 6/-1/-1->5->7 [18] 6/-1/-1->5->7 [19] 6/-1/-1->5->7 [20] 6/-1/-1->5->7 [21] 6/-1/-1->5->7 [22] 6/-1/-1->5->7 [23] 6/-1/-1->5->7
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Trees [0] 2/-1/-1->1->3 [1] 2/-1/-1->1->3 [2] 2/-1/-1->1->3 [3] 2/-1/-1->1->3 [4] 2/-1/-1->1->3 [5] 2/-1/-1->1->3 [6] 2/-1/-1->1->3 [7] 2/-1/-1->1->3 [8] 2/-1/-1->1->3 [9] 2/-1/-1->1->3 [10] 2/-1/-1->1->3 [11] 2/-1/-1->1->3 [12] 2/-1/-1->1->3 [13] 2/-1/-1->1->3 [14] 2/-1/-1->1->3 [15] 2/-1/-1->1->3 [16] 2/-1/-1->1->3 [17] 2/-1/-1->1->3 [18] 2/-1/-1->1->3 [19] 2/-1/-1->1->3 [20] 2/-1/-1->1->3 [21] 2/-1/-1->1->3 [22] 2/-1/-1->1->3 [23] 2/-1/-1->1->3
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO comm 0x7c429b0 rank 2 nRanks 8 nNodes 1 localRanks 8 localRank 2 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO comm 0x7687540 rank 7 nRanks 8 nNodes 1 localRanks 8 localRank 7 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO comm 0x7a20aa0 rank 0 nRanks 8 nNodes 1 localRanks 8 localRank 0 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 00/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 01/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Trees [0] 5/-1/-1->7->4 [1] 5/-1/-1->7->4 [2] 5/-1/-1->7->4 [3] 5/-1/-1->7->4 [4] 5/-1/-1->7->4 [5] 5/-1/-1->7->4 [6] 5/-1/-1->7->4 [7] 5/-1/-1->7->4 [8] 5/-1/-1->7->4 [9] 5/-1/-1->7->4 [10] 5/-1/-1->7->4 [11] 5/-1/-1->7->4 [12] 5/-1/-1->7->4 [13] 5/-1/-1->7->4 [14] 5/-1/-1->7->4 [15] 5/-1/-1->7->4 [16] 5/-1/-1->7->4 [17] 5/-1/-1->7->4 [18] 5/-1/-1->7->4 [19] 5/-1/-1->7->4 [20] 5/-1/-1->7->4 [21] 5/-1/-1->7->4 [22] 5/-1/-1->7->4 [23] 5/-1/-1->7->4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Trees [0] 4/-1/-1->2->1 [1] 4/-1/-1->2->1 [2] 4/-1/-1->2->1 [3] 4/-1/-1->2->1 [4] 4/-1/-1->2->1 [5] 4/-1/-1->2->1 [6] 4/-1/-1->2->1 [7] 4/-1/-1->2->1 [8] 4/-1/-1->2->1 [9] 4/-1/-1->2->1 [10] 4/-1/-1->2->1 [11] 4/-1/-1->2->1 [12] 4/-1/-1->2->1 [13] 4/-1/-1->2->1 [14] 4/-1/-1->2->1 [15] 4/-1/-1->2->1 [16] 4/-1/-1->2->1 [17] 4/-1/-1->2->1 [18] 4/-1/-1->2->1 [19] 4/-1/-1->2->1 [20] 4/-1/-1->2->1 [21] 4/-1/-1->2->1 [22] 4/-1/-1->2->1 [23] 4/-1/-1->2->1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 02/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 03/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 04/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 05/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 06/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 07/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 08/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 09/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 10/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 11/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 12/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 13/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 14/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 15/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 16/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 17/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 18/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 19/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 20/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 21/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 22/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 23/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Trees [0] 3/-1/-1->0->-1 [1] 3/-1/-1->0->-1 [2] 3/-1/-1->0->-1 [3] 3/-1/-1->0->-1 [4] 3/-1/-1->0->-1 [5] 3/-1/-1->0->-1 [6] 3/-1/-1->0->-1 [7] 3/-1/-1->0->-1 [8] 3/-1/-1->0->-1 [9] 3/-1/-1->0->-1 [10] 3/-1/-1->0->-1 [11] 3/-1/-1->0->-1 [12] 3/-1/-1->0->-1 [13] 3/-1/-1->0->-1 [14] 3/-1/-1->0->-1 [15] 3/-1/-1->0->-1 [16] 3/-1/-1->0->-1 [17] 3/-1/-1->0->-1 [18] 3/-1/-1->0->-1 [19] 3/-1/-1->0->-1 [20] 3/-1/-1->0->-1 [21] 3/-1/-1->0->-1 [22] 3/-1/-1->0->-1 [23] 3/-1/-1->0->-1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 00/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 01/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 02/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 03/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 04/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 05/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 00/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 06/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 01/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 07/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 02/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 08/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 03/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 04/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 09/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 05/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 10/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 06/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 11/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 07/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 12/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 08/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 13/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 09/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 14/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 10/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 15/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 11/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 16/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 12/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 17/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 13/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 18/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 14/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 19/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 15/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 20/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 16/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 21/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 17/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 22/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 18/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 23/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 19/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 20/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 00/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 21/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 01/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 22/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 02/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 23/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 03/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 00/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 04/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 01/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 05/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 02/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 06/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 03/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 07/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 04/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 08/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 05/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 09/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 06/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 10/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 07/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 11/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 08/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 12/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 09/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 13/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 10/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 14/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 11/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 15/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 12/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 16/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 13/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 17/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 14/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 18/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 15/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 19/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 16/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 20/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 17/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 21/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 18/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 22/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 19/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 23/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 20/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 00/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 21/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 01/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 22/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 02/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 23/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 03/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 00/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 04/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 01/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 05/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 02/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 06/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 03/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 07/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 04/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 08/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 05/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 09/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 06/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 10/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 07/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 11/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 08/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 12/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 09/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 13/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 10/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 14/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 11/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 15/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 12/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 16/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 13/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 17/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 14/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 18/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 15/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 19/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 16/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 17/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 20/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 18/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 21/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 19/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 22/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 20/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Channel 23/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 21/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 00/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 22/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 01/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 23/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 02/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 03/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 00/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 04/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 01/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 05/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 02/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 06/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 03/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 07/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 08/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 09/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 10/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 11/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 12/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 04/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 13/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 05/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 14/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 06/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 15/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 07/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 16/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 08/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 17/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 09/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 10/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 18/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 11/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 12/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 19/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 13/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 20/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 14/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 21/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 15/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 16/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 17/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 18/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 19/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 20/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 22/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 21/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 23/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 22/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 23/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 00/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 00/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 00/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 01/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 01/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 01/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 02/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 02/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 02/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 03/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 03/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 03/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 04/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 04/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 04/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 05/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 05/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 05/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 06/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 06/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 06/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 07/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 07/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 07/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 08/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 08/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 08/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 09/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 09/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 09/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 10/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 10/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 10/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 11/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 11/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 11/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 12/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 12/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 12/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 13/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 13/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 13/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 14/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 14/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 14/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 15/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 15/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 15/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 16/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 16/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 16/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 17/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 17/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 17/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 18/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 18/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 18/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 19/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 19/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 19/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 20/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 20/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 20/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 21/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 21/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 21/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 22/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 22/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 22/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Channel 23/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Channel 23/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Channel 23/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 00/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 00/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 01/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 01/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 02/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 02/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 03/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 03/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 04/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 04/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 05/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 05/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 06/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 06/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 07/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 07/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 08/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 08/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 09/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 09/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 10/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 10/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 11/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 11/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 12/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 12/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 13/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 13/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 14/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 14/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 15/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 15/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 16/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 16/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 17/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 17/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 18/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 18/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 19/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 19/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 20/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 20/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 21/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 21/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 22/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 22/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Channel 23/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Channel 23/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 00/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 01/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 02/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 03/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 04/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 05/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 06/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 07/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 08/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 09/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 10/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 11/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 12/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 13/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 14/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 15/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 16/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 17/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 18/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 19/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 20/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 21/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 22/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Channel 23/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 00/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 01/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 02/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 03/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 04/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 05/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 06/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 07/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 08/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 09/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 10/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 11/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 12/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 13/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 14/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 15/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 16/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 17/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 18/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 19/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 20/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 21/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 22/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Channel 23/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO NVLS comm 0x87952f0 headRank 7 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO NVLS comm 0x8df0a00 headRank 6 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO NVLS comm 0x7d01420 headRank 1 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO NVLS comm 0x7a20aa0 headRank 0 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO NVLS comm 0x8d3a240 headRank 4 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO NVLS comm 0x7e3e850 headRank 2 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO NVLS comm 0x7c429b0 headRank 3 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO NVLS comm 0x7687540 headRank 5 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO init.cc:1236 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2384 [4] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2379 [2] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2383 [3] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2380 [5] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2378 [1] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2377 [0] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2382 [6] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2381 [7] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2340 [4] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2340 [4] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2338 [2] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2338 [2] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2342 [6] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2341 [5] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2342 [6] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2341 [5] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2343 [7] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2336 [0] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2337 [1] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2343 [7] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2336 [0] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2337 [1] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2339 [3] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2339 [3] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2395 [4] NCCL INFO [Service thread] Connection closed by localRank 4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2391 [6] NCCL INFO [Service thread] Connection closed by localRank 6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2397 [1] NCCL INFO [Service thread] Connection closed by localRank 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2400 [2] NCCL INFO [Service thread] Connection closed by localRank 2
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2399 [3] NCCL INFO [Service thread] Connection closed by localRank 3
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2405 [0] NCCL INFO [Service thread] Connection closed by localRank 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2403 [7] NCCL INFO [Service thread] Connection closed by localRank 7
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2392 [5] NCCL INFO [Service thread] Connection closed by localRank 5
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2342:2342 [6] NCCL INFO comm 0x87952f0 rank 6 nranks 8 cudaDev 6 busId ba000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2336:2336 [0] NCCL INFO comm 0x7a20aa0 rank 0 nranks 8 cudaDev 0 busId 18000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2343:2343 [7] NCCL INFO comm 0x7687540 rank 7 nranks 8 cudaDev 7 busId db000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2341:2341 [5] NCCL INFO comm 0x8df0a00 rank 5 nranks 8 cudaDev 5 busId ab000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2339:2339 [3] NCCL INFO comm 0x7d01420 rank 3 nranks 8 cudaDev 3 busId 5d000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2337:2337 [1] NCCL INFO comm 0x7e3e850 rank 1 nranks 8 cudaDev 1 busId 2a000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2338:2338 [2] NCCL INFO comm 0x7c429b0 rank 2 nranks 8 cudaDev 2 busId 3a000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd:2340:2340 [4] NCCL INFO comm 0x8d3a240 rank 4 nranks 8 cudaDev 4 busId 9a000 - Abort COMPLETE
[rank6]: Traceback (most recent call last):
[rank6]:   File "/tmp/test.py", line 8, in <module>
[rank6]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank6]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank6]:     return func(*args, **kwargs)
[rank6]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank6]:     work = group.allreduce([tensor], opts)
[rank6]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank6]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank6]: Last error:
[rank6]: Cuda failure 1 'invalid argument'
[rank5]: Traceback (most recent call last):
[rank5]:   File "/tmp/test.py", line 8, in <module>
[rank5]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank5]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank5]:     return func(*args, **kwargs)
[rank5]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank5]:     work = group.allreduce([tensor], opts)
[rank5]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank5]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank5]: Last error:
[rank5]: Cuda failure 1 'invalid argument'
[rank1]: Traceback (most recent call last):
[rank1]:   File "/tmp/test.py", line 8, in <module>
[rank1]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank1]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank1]:     return func(*args, **kwargs)
[rank1]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank1]:     work = group.allreduce([tensor], opts)
[rank1]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank1]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank1]: Last error:
[rank1]: Cuda failure 1 'invalid argument'
[rank0]: Traceback (most recent call last):
[rank0]:   File "/tmp/test.py", line 8, in <module>
[rank0]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank0]:     return func(*args, **kwargs)
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank0]:     work = group.allreduce([tensor], opts)
[rank0]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank0]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank0]: Last error:
[rank0]: Cuda failure 1 'invalid argument'
[rank4]: Traceback (most recent call last):
[rank4]:   File "/tmp/test.py", line 8, in <module>
[rank4]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank4]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank4]:     return func(*args, **kwargs)
[rank4]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank4]:     work = group.allreduce([tensor], opts)
[rank4]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank4]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank4]: Last error:
[rank4]: Cuda failure 1 'invalid argument'
[rank2]: Traceback (most recent call last):
[rank2]:   File "/tmp/test.py", line 8, in <module>
[rank2]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank2]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank2]:     return func(*args, **kwargs)
[rank2]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank2]:     work = group.allreduce([tensor], opts)
[rank2]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank2]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank2]: Last error:
[rank2]: Cuda failure 1 'invalid argument'
[rank7]: Traceback (most recent call last):
[rank7]:   File "/tmp/test.py", line 8, in <module>
[rank7]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank7]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank7]:     return func(*args, **kwargs)
[rank7]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank7]:     work = group.allreduce([tensor], opts)
[rank7]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank7]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank7]: Last error:
[rank7]: Cuda failure 1 'invalid argument'
[rank3]: Traceback (most recent call last):
[rank3]:   File "/tmp/test.py", line 8, in <module>
[rank3]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank3]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank3]:     return func(*args, **kwargs)
[rank3]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank3]:     work = group.allreduce([tensor], opts)
[rank3]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank3]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank3]: Last error:
[rank3]: Cuda failure 1 'invalid argument'
[rank0]:[W819 19:29:00.828498674 ProcessGroupNCCL.cpp:1168] Warning: WARNING: process group has NOT been destroyed before we destruct ProcessGroupNCCL. On normal program exit, the application should call destroy_process_group to ensure that any pending NCCL operations have finished in this process. In rare cases this process can exit before this point and block the progress of another member of the process group. This constraint has always been present,  but this warning has only been added since PyTorch 2.4 (function operator())
W0819 19:29:00.785000 140406652626752 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 2336 closing signal SIGTERM
W0819 19:29:00.786000 140406652626752 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 2337 closing signal SIGTERM
W0819 19:29:00.786000 140406652626752 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 2338 closing signal SIGTERM
W0819 19:29:00.786000 140406652626752 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 2339 closing signal SIGTERM
W0819 19:29:00.786000 140406652626752 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 2340 closing signal SIGTERM
W0819 19:29:00.786000 140406652626752 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 2343 closing signal SIGTERM
E0819 19:29:01.095000 140406652626752 torch/distributed/elastic/multiprocessing/api.py:833] failed (exitcode: 1) local_rank: 5 (pid: 2341) of binary: /opt/conda/bin/python
Traceback (most recent call last):
  File "/opt/conda/bin/torchrun", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 348, in wrapper
    return f(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/run.py", line 901, in main
    run(args)
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/run.py", line 892, in run
    elastic_launch(
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 133, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 264, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError: 
============================================================
test.py FAILED
------------------------------------------------------------
Failures:
[1]:
  time      : 2024-08-19_19:29:00
  host      : h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd
  rank      : 6 (local_rank: 6)
  exitcode  : 1 (pid: 2342)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2024-08-19_19:29:00
  host      : h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-gn6nd
  rank      : 5 (local_rank: 5)
  exitcode  : 1 (pid: 2341)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
youkaichao commented 3 weeks ago

@xiejibing can you also report your environment, using https://github.com/vllm-project/vllm/blob/main/collect_env.py ?

Since it fails at line 8 when you run NCCL_DEBUG=TRACE torchrun --nproc-per-node=8 test.py, chances are your hardware or driver has a problem.
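
For anyone hitting the same error: the tracebacks above show that /tmp/test.py simply performs an all_reduce across all ranks. A minimal sketch of such a sanity-check script, consistent with the traceback (the tensor size, the assertion, and the print are assumptions, not necessarily the exact script):

import torch
import torch.distributed as dist

# torchrun sets RANK / WORLD_SIZE / LOCAL_RANK; NCCL is the backend under test.
dist.init_process_group(backend="nccl")
local_rank = dist.get_rank() % torch.cuda.device_count()
torch.cuda.set_device(local_rank)

data = torch.ones(128, device="cuda")
dist.all_reduce(data, op=dist.ReduceOp.SUM)   # the call that fails in the logs above
torch.cuda.synchronize()

assert data.mean().item() == dist.get_world_size()
print(f"rank {dist.get_rank()}: all_reduce sanity check passed")

Running it with different --nproc-per-node values (as is done later in this thread) is a quick way to separate driver/hardware problems from anything vLLM-specific.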

xiejibing commented 3 weeks ago

@youkaichao Here is the env report. Thanks!

Collecting environment information...
There was a problem when trying to write in your cache folder (/.cache/huggingface/hub). You should set the environment variable TRANSFORMERS_CACHE to a writable directory.
PyTorch version: 2.4.0+cu121
Is debug build: False
CUDA used to build PyTorch: 12.1
ROCM used to build PyTorch: N/A

OS: Ubuntu 20.04.6 LTS (x86_64)
GCC version: (Ubuntu 9.4.0-1ubuntu1~20.04.2) 9.4.0
Clang version: Could not collect
CMake version: version 3.30.2
Libc version: glibc-2.31

Python version: 3.10.12 (main, Jul  5 2023, 19:22:19) [GCC 11.2.0] (64-bit runtime)
Python platform: Linux-5.15.0-26-generic-x86_64-with-glibc2.31
Is CUDA available: True
CUDA runtime version: 12.2.140
CUDA_MODULE_LOADING set to: LAZY
GPU models and configuration: Could not collect
Nvidia driver version: Could not collect
cuDNN version: Probably one of the following:
/usr/lib/x86_64-linux-gnu/libcudnn.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_adv_infer.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_adv_train.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_cnn_infer.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_cnn_train.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_ops_infer.so.8.9.6
/usr/lib/x86_64-linux-gnu/libcudnn_ops_train.so.8.9.6
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Architecture:                    x86_64
CPU op-mode(s):                  32-bit, 64-bit
Byte Order:                      Little Endian
Address sizes:                   52 bits physical, 57 bits virtual
CPU(s):                          224
On-line CPU(s) list:             0-223
Thread(s) per core:              2
Core(s) per socket:              56
Socket(s):                       2
NUMA node(s):                    2
Vendor ID:                       GenuineIntel
CPU family:                      6
Model:                           143
Model name:                      Intel(R) Xeon(R) Platinum 8480+
Stepping:                        8
Frequency boost:                 enabled
CPU MHz:                         2001.000
CPU max MHz:                     2001.0000
CPU min MHz:                     800.0000
BogoMIPS:                        4000.00
Virtualization:                  VT-x
L1d cache:                       5.3 MiB
L1i cache:                       3.5 MiB
L2 cache:                        224 MiB
L3 cache:                        210 MiB
NUMA node0 CPU(s):               0-55,112-167
NUMA node1 CPU(s):               56-111,168-223
Vulnerability Itlb multihit:     Not affected
Vulnerability L1tf:              Not affected
Vulnerability Mds:               Not affected
Vulnerability Meltdown:          Not affected
Vulnerability Spec store bypass: Mitigation; Speculative Store Bypass disabled via prctl and seccomp
Vulnerability Spectre v1:        Mitigation; usercopy/swapgs barriers and __user pointer sanitization
Vulnerability Spectre v2:        Mitigation; Enhanced IBRS, IBPB conditional, RSB filling
Vulnerability Srbds:             Not affected
Vulnerability Tsx async abort:   Not affected
Flags:                           fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss ht tm pbe syscall nx pdpe1gb rdtscp lm constant_tsc art arch_perfmon pebs bts rep_good nopl xtopology nonstop_tsc cpuid aperfmperf tsc_known_freq pni pclmulqdq dtes64 monitor ds_cpl vmx smx est tm2 ssse3 sdbg fma cx16 xtpr pdcm pcid dca sse4_1 sse4_2 x2apic movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand lahf_lm abm 3dnowprefetch cpuid_fault epb cat_l3 cat_l2 cdp_l3 invpcid_single intel_ppin cdp_l2 ssbd mba ibrs ibpb stibp ibrs_enhanced tpr_shadow vnmi flexpriority ept vpid ept_ad fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid cqm rdt_a avx512f avx512dq rdseed adx smap avx512ifma clflushopt clwb intel_pt avx512cd sha_ni avx512bw avx512vl xsaveopt xsavec xgetbv1 xsaves cqm_llc cqm_occup_llc cqm_mbm_total cqm_mbm_local split_lock_detect avx_vnni avx512_bf16 wbnoinvd dtherm ida arat pln pts hwp hwp_act_window hwp_epp hwp_pkg_req avx512vbmi umip pku ospke waitpkg avx512_vbmi2 gfni vaes vpclmulqdq avx512_vnni avx512_bitalg tme avx512_vpopcntdq la57 rdpid bus_lock_detect cldemote movdiri movdir64b enqcmd fsrm md_clear serialize tsxldtrk pconfig arch_lbr amx_bf16 avx512_fp16 amx_tile amx_int8 flush_l1d arch_capabilities

Versions of relevant libraries:
[pip3] numpy==1.26.4
[pip3] nvidia-nccl-cu12==2.20.5
[pip3] pyzmq==26.0.3
[pip3] torch==2.4.0
[pip3] torchvision==0.19.0
[pip3] transformers==4.44.0
[pip3] triton==3.0.0
[conda] numpy                     1.26.4                   pypi_0    pypi
[conda] nvidia-nccl-cu12          2.20.5                   pypi_0    pypi
[conda] pyzmq                     26.0.3                   pypi_0    pypi
[conda] torch                     2.4.0                    pypi_0    pypi
[conda] torchvision               0.19.0                   pypi_0    pypi
[conda] transformers              4.44.0                   pypi_0    pypi
[conda] triton                    3.0.0                    pypi_0    pypi
ROCM Version: Could not collect
Neuron SDK Version: N/A
vLLM Version: 0.5.4@4db5176d9758b720b05460c50ace3c01026eb158
vLLM Build Flags:
CUDA Archs: Not Set; ROCm: Disabled; Neuron: Disabled
GPU Topology:
Could not collect
youkaichao commented 3 weeks ago

GPU Topology: Could not collect

Your environment looks very strange: it does not have GPU info or topology info.

xiejibing commented 3 weeks ago

Thanks. I will check the hardware.

xiejibing commented 3 weeks ago

@youkaichao Here is the GPU info:

ROCM Version: Could not collect
Neuron SDK Version: N/A
vLLM Version: N/A
vLLM Build Flags:
CUDA Archs: Not Set; ROCm: Disabled; Neuron: Disabled
GPU Topology:
GPU0    GPU1    GPU2    GPU3    GPU4    GPU5    GPU6    GPU7    NIC0    NIC1    CPU Affinity    NUMA Affinity   GPU NUMA ID
GPU0     X  NV18    NV18    NV18    NV18    NV18    NV18    NV18    SYS SYS 0-27,112-139    0       N/A
GPU1    NV18     X  NV18    NV18    NV18    NV18    NV18    NV18    NODE    NODE    28-55,140-167   1       N/A
GPU2    NV18    NV18     X  NV18    NV18    NV18    NV18    NV18    NODE    NODE    28-55,140-167   1       N/A
GPU3    NV18    NV18    NV18     X  NV18    NV18    NV18    NV18    SYS SYS 0-27,112-139    0       N/A
GPU4    NV18    NV18    NV18    NV18     X  NV18    NV18    NV18    SYS SYS 56-83,168-195   2       N/A
GPU5    NV18    NV18    NV18    NV18    NV18     X  NV18    NV18    SYS SYS 84-111,196-223  3       N/A
GPU6    NV18    NV18    NV18    NV18    NV18    NV18     X  NV18    SYS SYS 84-111,196-223  3       N/A
GPU7    NV18    NV18    NV18    NV18    NV18    NV18    NV18     X  SYS SYS 56-83,168-195   2       N/A
NIC0    SYS NODE    NODE    SYS SYS SYS SYS SYS  X  PIX             
NIC1    SYS NODE    NODE    SYS SYS SYS SYS SYS PIX  X              

Legend:

  X    = Self
  SYS  = Connection traversing PCIe as well as the SMP interconnect between NUMA nodes (e.g., QPI/UPI)
  NODE = Connection traversing PCIe as well as the interconnect between PCIe Host Bridges within a NUMA node
  PHB  = Connection traversing PCIe as well as a PCIe Host Bridge (typically the CPU)
  PXB  = Connection traversing multiple PCIe bridges (without traversing the PCIe Host Bridge)
  PIX  = Connection traversing at most a single PCIe bridge
  NV#  = Connection traversing a bonded set of # NVLinks

NIC Legend:

  NIC0: mlx5_0
  NIC1: mlx5_1
youkaichao commented 3 weeks ago

Generally speaking, if you cannot run the test script successfully, it means your environment is broken, and you have to talk to your admin to fix either the hardware or the driver. I cannot provide any further help.

xiejibing commented 3 weeks ago

OK. Thanks.

xiejibing commented 3 weeks ago

@youkaichao I have some new findings. When I run NCCL_DEBUG=TRACE torchrun --nproc-per-node=2 test.py, the check passes. When I run NCCL_DEBUG=TRACE torchrun --nproc-per-node=8 test.py, the check fails. The only difference is the nproc-per-node number, and my pod has 8 H100s.

Tue Aug 20 00:02:06 2024       
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.104.05             Driver Version: 535.104.05   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA H100 80GB HBM3          Off | 00000000:18:00.0 Off |                    0 |
| N/A   30C    P0              72W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   1  NVIDIA H100 80GB HBM3          Off | 00000000:2A:00.0 Off |                    0 |
| N/A   31C    P0              71W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   2  NVIDIA H100 80GB HBM3          Off | 00000000:3A:00.0 Off |                    0 |
| N/A   32C    P0              71W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   3  NVIDIA H100 80GB HBM3          Off | 00000000:5D:00.0 Off |                    0 |
| N/A   30C    P0              67W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   4  NVIDIA H100 80GB HBM3          Off | 00000000:9A:00.0 Off |                    0 |
| N/A   31C    P0              73W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   5  NVIDIA H100 80GB HBM3          Off | 00000000:AB:00.0 Off |                    0 |
| N/A   32C    P0              70W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   6  NVIDIA H100 80GB HBM3          Off | 00000000:BA:00.0 Off |                    0 |
| N/A   31C    P0              72W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+
|   7  NVIDIA H100 80GB HBM3          Off | 00000000:DB:00.0 Off |                    0 |
| N/A   30C    P0              70W / 700W |      2MiB / 81559MiB |      0%      Default |
|                                         |                      |             Disabled |
+-----------------------------------------+----------------------+----------------------+

+---------------------------------------------------------------------------------------+
| Processes:                                                                            |
|  GPU   GI   CI        PID   Type   Process name                            GPU Memory |
|        ID   ID                                                             Usage      |
|=======================================================================================|
|  No running processes found                                                           |
+---------------------------------------------------------------------------------------+
youkaichao commented 3 weeks ago

When I run NCCL_DEBUG=TRACE torchrun --nproc-per-node=2 test.py, the check passes. When I run NCCL_DEBUG=TRACE torchrun --nproc-per-node=8 test.py, the check fails.

Looks like GPU 2 is broken then.
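
One way to narrow this down is to run the same test over every two-GPU subset and see which pairs fail; a consistently broken GPU will show up in every failing pair. A hypothetical helper, assuming the same test.py as above and that torchrun is on PATH:

import itertools
import os
import subprocess

# Try every pair of physical GPUs by restricting CUDA_VISIBLE_DEVICES.
for pair in itertools.combinations(range(8), 2):
    env = dict(os.environ, CUDA_VISIBLE_DEVICES=",".join(map(str, pair)))
    result = subprocess.run(
        ["torchrun", "--nproc-per-node=2", "test.py"],
        env=env,
        capture_output=True,
        text=True,
    )
    print(f"GPUs {pair}: {'OK' if result.returncode == 0 else 'FAIL'}")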

youkaichao commented 3 weeks ago

@xiejibing btw, I think you can also try setting export NCCL_ALGO=^NV to avoid the broken nvls path.

I learned this from https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html#nccl-algo .

transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'

Your error log seems to indicate something wrong with nvls.
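
If you want to apply this when starting vLLM from Python, here is a hedged sketch (the model name and tensor_parallel_size are placeholders; exporting the variable in the shell before torchrun or the vLLM server achieves the same thing, since worker processes inherit the environment):

import os

# Exclude the NVLS-based algorithms, as suggested above, before any NCCL
# communicator is created.
os.environ["NCCL_ALGO"] = "^NV"

from vllm import LLM

llm = LLM(model="meta-llama/Meta-Llama-3-70B-Instruct", tensor_parallel_size=8)
outputs = llm.generate(["Hello, my name is"])
print(outputs[0].outputs[0].text)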

xiejibing commented 3 weeks ago

@youkaichao I have set export NCCL_ALGO=^NV.

As you mentioned before, it seems that rank 6 is broken.

Root Cause (first observed failure):
[0]:
  time      : 2024-08-20_00:41:41
  host      : h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8
  rank      : 6 (local_rank: 6)
  exitcode  : 1 (pid: 1912)
  error_file: <N/A>

And here is the log:


W0820 00:41:31.018000 139845152036672 torch/distributed/run.py:779] 
W0820 00:41:31.018000 139845152036672 torch/distributed/run.py:779] *****************************************
W0820 00:41:31.018000 139845152036672 torch/distributed/run.py:779] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
W0820 00:41:31.018000 139845152036672 torch/distributed/run.py:779] *****************************************
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1906 [0] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1906 [0] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1906 [0] NCCL INFO cudaDriverVersion 12020
NCCL version 2.20.5+cuda12.4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1908 [2] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1908 [2] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1908 [2] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1913 [7] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1913 [7] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1911 [5] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1911 [5] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1913 [7] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1911 [5] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1912 [6] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1912 [6] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1912 [6] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1909 [3] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1909 [3] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1909 [3] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1907 [1] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1907 [1] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1910 [4] NCCL INFO cudaDriverVersion 12020
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1910 [4] NCCL INFO Bootstrap : Using eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1907 [1] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1910 [4] NCCL INFO NET/Plugin : dlerror=libnccl-net.so: cannot open shared object file: No such file or directory No plugin found (libnccl-net.so), using internal implementation
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Failed to open libibverbs.so[.1]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO NET/Socket : Using [0]eth0:10.177.136.142<0>
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Using non-device net plugin version 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Using network Socket
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO comm 0x8923d50 rank 2 nranks 8 cudaDev 2 nvmlDev 2 busId 3a000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO comm 0x7363940 rank 7 nranks 8 cudaDev 7 nvmlDev 7 busId db000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO comm 0x7f2c0e0 rank 1 nranks 8 cudaDev 1 nvmlDev 1 busId 2a000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO comm 0x79e8900 rank 6 nranks 8 cudaDev 6 nvmlDev 6 busId ba000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO comm 0x71379e0 rank 0 nranks 8 cudaDev 0 nvmlDev 0 busId 18000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO comm 0x8e16340 rank 4 nranks 8 cudaDev 4 nvmlDev 4 busId 9a000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO comm 0x81eaf30 rank 5 nranks 8 cudaDev 5 nvmlDev 5 busId ab000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO comm 0x7b4de00 rank 3 nranks 8 cudaDev 3 nvmlDev 3 busId 5d000 commId 0xfc6662075671e134 - Init START
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Setting affinity for GPU 0 to 0fff,ffff0000,00000000,00000000,0fffffff
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO NVLS multicast support is available on dev 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Setting affinity for GPU 1 to ff,fffff000,00000000,00000000,00ffffff,f0000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO NVLS multicast support is available on dev 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Setting affinity for GPU 5 to fffffff0,00000000,00000000,0000ffff,fff00000,00000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO NVLS multicast support is available on dev 5
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Setting affinity for GPU 4 to 0f,ffffff00,00000000,00000000,000fffff,ff000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO NVLS multicast support is available on dev 4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Setting affinity for GPU 6 to fffffff0,00000000,00000000,0000ffff,fff00000,00000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO NVLS multicast support is available on dev 6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Setting affinity for GPU 3 to 0fff,ffff0000,00000000,00000000,0fffffff
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO NVLS multicast support is available on dev 3
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Setting affinity for GPU 2 to ff,fffff000,00000000,00000000,00ffffff,f0000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO NVLS multicast support is available on dev 2
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Setting affinity for GPU 7 to 0f,ffffff00,00000000,00000000,000fffff,ff000000,00000000
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO NVLS multicast support is available on dev 7
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO comm 0x79e8900 rank 6 nRanks 8 nNodes 1 localRanks 8 localRank 6 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO comm 0x8923d50 rank 2 nRanks 8 nNodes 1 localRanks 8 localRank 2 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO comm 0x7f2c0e0 rank 1 nRanks 8 nNodes 1 localRanks 8 localRank 1 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO comm 0x81eaf30 rank 5 nRanks 8 nNodes 1 localRanks 8 localRank 5 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Trees [0] -1/-1/-1->6->5 [1] -1/-1/-1->6->5 [2] -1/-1/-1->6->5 [3] -1/-1/-1->6->5 [4] -1/-1/-1->6->5 [5] -1/-1/-1->6->5 [6] -1/-1/-1->6->5 [7] -1/-1/-1->6->5 [8] -1/-1/-1->6->5 [9] -1/-1/-1->6->5 [10] -1/-1/-1->6->5 [11] -1/-1/-1->6->5 [12] -1/-1/-1->6->5 [13] -1/-1/-1->6->5 [14] -1/-1/-1->6->5 [15] -1/-1/-1->6->5 [16] -1/-1/-1->6->5 [17] -1/-1/-1->6->5 [18] -1/-1/-1->6->5 [19] -1/-1/-1->6->5 [20] -1/-1/-1->6->5 [21] -1/-1/-1->6->5 [22] -1/-1/-1->6->5 [23] -1/-1/-1->6->5
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO comm 0x71379e0 rank 0 nRanks 8 nNodes 1 localRanks 8 localRank 0 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Trees [0] 6/-1/-1->5->7 [1] 6/-1/-1->5->7 [2] 6/-1/-1->5->7 [3] 6/-1/-1->5->7 [4] 6/-1/-1->5->7 [5] 6/-1/-1->5->7 [6] 6/-1/-1->5->7 [7] 6/-1/-1->5->7 [8] 6/-1/-1->5->7 [9] 6/-1/-1->5->7 [10] 6/-1/-1->5->7 [11] 6/-1/-1->5->7 [12] 6/-1/-1->5->7 [13] 6/-1/-1->5->7 [14] 6/-1/-1->5->7 [15] 6/-1/-1->5->7 [16] 6/-1/-1->5->7 [17] 6/-1/-1->5->7 [18] 6/-1/-1->5->7 [19] 6/-1/-1->5->7 [20] 6/-1/-1->5->7 [21] 6/-1/-1->5->7 [22] 6/-1/-1->5->7 [23] 6/-1/-1->5->7
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Trees [0] 4/-1/-1->2->1 [1] 4/-1/-1->2->1 [2] 4/-1/-1->2->1 [3] 4/-1/-1->2->1 [4] 4/-1/-1->2->1 [5] 4/-1/-1->2->1 [6] 4/-1/-1->2->1 [7] 4/-1/-1->2->1 [8] 4/-1/-1->2->1 [9] 4/-1/-1->2->1 [10] 4/-1/-1->2->1 [11] 4/-1/-1->2->1 [12] 4/-1/-1->2->1 [13] 4/-1/-1->2->1 [14] 4/-1/-1->2->1 [15] 4/-1/-1->2->1 [16] 4/-1/-1->2->1 [17] 4/-1/-1->2->1 [18] 4/-1/-1->2->1 [19] 4/-1/-1->2->1 [20] 4/-1/-1->2->1 [21] 4/-1/-1->2->1 [22] 4/-1/-1->2->1 [23] 4/-1/-1->2->1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Trees [0] 2/-1/-1->1->3 [1] 2/-1/-1->1->3 [2] 2/-1/-1->1->3 [3] 2/-1/-1->1->3 [4] 2/-1/-1->1->3 [5] 2/-1/-1->1->3 [6] 2/-1/-1->1->3 [7] 2/-1/-1->1->3 [8] 2/-1/-1->1->3 [9] 2/-1/-1->1->3 [10] 2/-1/-1->1->3 [11] 2/-1/-1->1->3 [12] 2/-1/-1->1->3 [13] 2/-1/-1->1->3 [14] 2/-1/-1->1->3 [15] 2/-1/-1->1->3 [16] 2/-1/-1->1->3 [17] 2/-1/-1->1->3 [18] 2/-1/-1->1->3 [19] 2/-1/-1->1->3 [20] 2/-1/-1->1->3 [21] 2/-1/-1->1->3 [22] 2/-1/-1->1->3 [23] 2/-1/-1->1->3
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 00/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO comm 0x7363940 rank 7 nRanks 8 nNodes 1 localRanks 8 localRank 7 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 01/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO comm 0x8e16340 rank 4 nRanks 8 nNodes 1 localRanks 8 localRank 4 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 02/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO comm 0x7b4de00 rank 3 nRanks 8 nNodes 1 localRanks 8 localRank 3 MNNVL 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 03/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 04/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Trees [0] 5/-1/-1->7->4 [1] 5/-1/-1->7->4 [2] 5/-1/-1->7->4 [3] 5/-1/-1->7->4 [4] 5/-1/-1->7->4 [5] 5/-1/-1->7->4 [6] 5/-1/-1->7->4 [7] 5/-1/-1->7->4 [8] 5/-1/-1->7->4 [9] 5/-1/-1->7->4 [10] 5/-1/-1->7->4 [11] 5/-1/-1->7->4 [12] 5/-1/-1->7->4 [13] 5/-1/-1->7->4 [14] 5/-1/-1->7->4 [15] 5/-1/-1->7->4 [16] 5/-1/-1->7->4 [17] 5/-1/-1->7->4 [18] 5/-1/-1->7->4 [19] 5/-1/-1->7->4 [20] 5/-1/-1->7->4 [21] 5/-1/-1->7->4 [22] 5/-1/-1->7->4 [23] 5/-1/-1->7->4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 05/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 06/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Trees [0] 7/-1/-1->4->2 [1] 7/-1/-1->4->2 [2] 7/-1/-1->4->2 [3] 7/-1/-1->4->2 [4] 7/-1/-1->4->2 [5] 7/-1/-1->4->2 [6] 7/-1/-1->4->2 [7] 7/-1/-1->4->2 [8] 7/-1/-1->4->2 [9] 7/-1/-1->4->2 [10] 7/-1/-1->4->2 [11] 7/-1/-1->4->2 [12] 7/-1/-1->4->2 [13] 7/-1/-1->4->2 [14] 7/-1/-1->4->2 [15] 7/-1/-1->4->2 [16] 7/-1/-1->4->2 [17] 7/-1/-1->4->2 [18] 7/-1/-1->4->2 [19] 7/-1/-1->4->2 [20] 7/-1/-1->4->2 [21] 7/-1/-1->4->2 [22] 7/-1/-1->4->2 [23] 7/-1/-1->4->2
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 07/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Trees [0] 1/-1/-1->3->0 [1] 1/-1/-1->3->0 [2] 1/-1/-1->3->0 [3] 1/-1/-1->3->0 [4] 1/-1/-1->3->0 [5] 1/-1/-1->3->0 [6] 1/-1/-1->3->0 [7] 1/-1/-1->3->0 [8] 1/-1/-1->3->0 [9] 1/-1/-1->3->0 [10] 1/-1/-1->3->0 [11] 1/-1/-1->3->0 [12] 1/-1/-1->3->0 [13] 1/-1/-1->3->0 [14] 1/-1/-1->3->0 [15] 1/-1/-1->3->0 [16] 1/-1/-1->3->0 [17] 1/-1/-1->3->0 [18] 1/-1/-1->3->0 [19] 1/-1/-1->3->0 [20] 1/-1/-1->3->0 [21] 1/-1/-1->3->0 [22] 1/-1/-1->3->0 [23] 1/-1/-1->3->0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 08/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 09/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 10/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 11/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 12/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 13/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 14/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 15/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 16/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 17/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 18/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 19/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 20/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 21/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 22/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 23/24 :    0   3   1   2   4   7   5   6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Trees [0] 3/-1/-1->0->-1 [1] 3/-1/-1->0->-1 [2] 3/-1/-1->0->-1 [3] 3/-1/-1->0->-1 [4] 3/-1/-1->0->-1 [5] 3/-1/-1->0->-1 [6] 3/-1/-1->0->-1 [7] 3/-1/-1->0->-1 [8] 3/-1/-1->0->-1 [9] 3/-1/-1->0->-1 [10] 3/-1/-1->0->-1 [11] 3/-1/-1->0->-1 [12] 3/-1/-1->0->-1 [13] 3/-1/-1->0->-1 [14] 3/-1/-1->0->-1 [15] 3/-1/-1->0->-1 [16] 3/-1/-1->0->-1 [17] 3/-1/-1->0->-1 [18] 3/-1/-1->0->-1 [19] 3/-1/-1->0->-1 [20] 3/-1/-1->0->-1 [21] 3/-1/-1->0->-1 [22] 3/-1/-1->0->-1 [23] 3/-1/-1->0->-1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO P2P Chunksize set to 524288
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 00/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 00/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 01/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 02/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 03/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 04/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 05/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 06/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 07/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 08/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 09/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 10/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 11/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 12/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 13/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 14/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 15/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 16/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 17/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 18/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 19/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 20/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 21/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 22/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 23/0 : 1[1] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 01/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 00/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 02/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 01/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 03/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 02/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 04/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 03/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 05/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 04/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 06/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 05/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 07/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 06/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 08/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 07/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 09/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 08/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 10/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 09/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 11/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 10/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 12/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 11/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 13/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 12/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 14/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 13/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 15/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 14/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 16/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 15/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 17/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 16/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 18/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 17/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 19/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 18/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 20/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 19/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 21/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 20/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 22/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 21/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 23/0 : 5[5] -> 6[6] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 22/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 23/0 : 2[2] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 00/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 00/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 01/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 02/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 03/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 01/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 04/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 05/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 02/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 06/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 03/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 07/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 04/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 08/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 09/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 05/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 10/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 11/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 12/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 13/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 14/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 15/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 16/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 17/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 06/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 18/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 19/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 20/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 21/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 07/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 22/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 23/0 : 6[6] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 08/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 09/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 00/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 10/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 01/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 11/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 02/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 12/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 03/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 13/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 04/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 14/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 05/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 15/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 06/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 16/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 07/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 17/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 08/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 18/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 19/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 09/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 20/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 10/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 21/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 11/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 22/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 12/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 23/0 : 4[4] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 13/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 00/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 14/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 01/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 15/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 02/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 16/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 03/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 17/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 04/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 18/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 05/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 19/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 06/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 20/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 07/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 21/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 08/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 22/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 09/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Channel 23/0 : 0[0] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 10/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 00/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 11/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 01/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 12/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 02/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 13/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 03/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 14/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 04/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 05/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 15/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 06/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 16/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 07/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 17/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 08/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 18/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 09/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 19/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 10/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 20/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 11/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 21/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 12/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 22/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 13/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 23/0 : 7[7] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 14/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 15/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 16/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 17/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 18/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 19/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 20/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 21/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 22/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 23/0 : 3[3] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 00/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 00/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Connected all rings
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 00/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 01/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 01/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 01/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 02/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 02/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 02/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 03/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 03/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 03/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 04/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 04/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 04/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 05/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 05/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 05/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 06/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 06/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 07/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 07/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 06/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 08/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 08/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 07/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 09/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 09/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 08/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 10/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 10/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 09/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 11/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 11/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 10/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 12/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 12/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 11/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 13/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 13/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 12/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 14/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 14/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 13/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 15/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 15/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 14/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 16/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 16/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 15/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 17/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 17/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 16/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 18/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 18/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 17/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 19/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 19/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 18/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 20/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 20/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 19/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 21/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 21/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 20/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 22/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 22/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 21/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Channel 23/0 : 6[6] -> 5[5] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Channel 23/0 : 5[5] -> 7[7] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 22/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 00/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Channel 23/0 : 1[1] -> 3[3] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 01/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 02/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 00/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 03/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 01/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 04/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 02/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 05/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 03/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 06/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 04/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 07/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 05/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 08/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 06/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 09/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 07/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 10/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 08/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 11/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 09/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 12/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 10/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 13/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 11/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 14/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 12/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 15/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 13/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 16/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 14/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 17/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 15/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 18/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 16/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 19/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 17/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 20/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 18/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 21/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 19/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 22/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 20/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Channel 23/0 : 7[7] -> 4[4] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 21/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 00/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 22/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 01/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Channel 23/0 : 3[3] -> 0[0] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 02/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 03/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 04/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 05/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 06/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 07/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 08/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 09/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 10/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 11/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 12/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 13/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 14/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 15/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 16/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 17/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 18/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 19/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 20/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 21/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 22/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Channel 23/0 : 4[4] -> 2[2] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 00/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 01/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 02/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 03/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 04/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 05/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 06/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 07/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 08/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 09/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 10/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 11/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 12/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 13/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 14/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 15/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 16/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 17/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 18/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 19/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 20/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 21/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 22/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Channel 23/0 : 2[2] -> 1[1] via P2P/CUMEM
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO Connected all trees
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO NVLS comm 0x81eaf30 headRank 6 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO NVLS comm 0x79e8900 headRank 7 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO NVLS comm 0x7b4de00 headRank 1 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO NVLS comm 0x71379e0 headRank 0 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO NVLS comm 0x7f2c0e0 headRank 2 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO NVLS comm 0x8e16340 headRank 4 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO NVLS comm 0x8923d50 headRank 3 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO NVLS comm 0x7363940 headRank 5 nHeads 8 buffSize 4194304 memSize 2097152 nvlsPerRankSize 301989888 nvlsTotalSize 2415919104

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO transport/nvls.cc:328 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO init.cc:1236 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO init.cc:1501 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1955 [7] NCCL INFO group.cc:64 -> 1 [Async thread]

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO init.cc:1501 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO init.cc:1236 -> 1

h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] transport/nvls.cc:157 NCCL WARN Cuda failure 1 'invalid argument'
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1957 [6] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO transport/nvls.cc:328 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1960 [4] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1956 [5] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO init.cc:1236 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1954 [2] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1958 [3] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1959 [1] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO init.cc:1501 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1953 [0] NCCL INFO group.cc:64 -> 1 [Async thread]
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1913 [7] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1912 [6] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1913 [7] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1912 [6] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1911 [5] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1911 [5] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1910 [4] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1910 [4] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1909 [3] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1907 [1] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1909 [3] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1908 [2] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1907 [1] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1908 [2] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1906 [0] NCCL INFO group.cc:418 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1906 [0] NCCL INFO init.cc:1876 -> 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1961 [6] NCCL INFO [Service thread] Connection closed by localRank 6
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1965 [7] NCCL INFO [Service thread] Connection closed by localRank 7
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1975 [3] NCCL INFO [Service thread] Connection closed by localRank 3
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1973 [4] NCCL INFO [Service thread] Connection closed by localRank 4
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1971 [1] NCCL INFO [Service thread] Connection closed by localRank 1
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1969 [2] NCCL INFO [Service thread] Connection closed by localRank 2
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1967 [0] NCCL INFO [Service thread] Connection closed by localRank 0
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1963 [5] NCCL INFO [Service thread] Connection closed by localRank 5
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1912:1912 [6] NCCL INFO comm 0x79e8900 rank 6 nranks 8 cudaDev 6 busId ba000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1906:1906 [0] NCCL INFO comm 0x71379e0 rank 0 nranks 8 cudaDev 0 busId 18000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1909:1909 [3] NCCL INFO comm 0x7b4de00 rank 3 nranks 8 cudaDev 3 busId 5d000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1907:1907 [1] NCCL INFO comm 0x7f2c0e0 rank 1 nranks 8 cudaDev 1 busId 2a000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1913:1913 [7] NCCL INFO comm 0x7363940 rank 7 nranks 8 cudaDev 7 busId db000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1908:1908 [2] NCCL INFO comm 0x8923d50 rank 2 nranks 8 cudaDev 2 busId 3a000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1911:1911 [5] NCCL INFO comm 0x81eaf30 rank 5 nranks 8 cudaDev 5 busId ab000 - Abort COMPLETE
h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8:1910:1910 [4] NCCL INFO comm 0x8e16340 rank 4 nranks 8 cudaDev 4 busId 9a000 - Abort COMPLETE
[rank6]: Traceback (most recent call last):
[rank6]:   File "/tmp/test.py", line 8, in <module>
[rank6]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank6]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank6]:     return func(*args, **kwargs)
[rank6]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank6]:     work = group.allreduce([tensor], opts)
[rank6]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank6]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank6]: Last error:
[rank6]: Cuda failure 1 'invalid argument'
[rank0]: Traceback (most recent call last):
[rank0]:   File "/tmp/test.py", line 8, in <module>
[rank0]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank0]:     return func(*args, **kwargs)
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank0]:     work = group.allreduce([tensor], opts)
[rank0]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank0]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank0]: Last error:
[rank0]: Cuda failure 1 'invalid argument'
[rank3]: Traceback (most recent call last):
[rank3]:   File "/tmp/test.py", line 8, in <module>
[rank3]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank3]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank3]:     return func(*args, **kwargs)
[rank3]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank3]:     work = group.allreduce([tensor], opts)
[rank3]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank3]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank3]: Last error:
[rank3]: Cuda failure 1 'invalid argument'
[rank4]: Traceback (most recent call last):
[rank4]:   File "/tmp/test.py", line 8, in <module>
[rank4]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank4]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank4]:     return func(*args, **kwargs)
[rank4]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank4]:     work = group.allreduce([tensor], opts)
[rank4]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank4]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank4]: Last error:
[rank4]: Cuda failure 1 'invalid argument'
[rank2]: Traceback (most recent call last):
[rank2]:   File "/tmp/test.py", line 8, in <module>
[rank2]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank2]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank2]:     return func(*args, **kwargs)
[rank2]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank2]:     work = group.allreduce([tensor], opts)
[rank2]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank2]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank2]: Last error:
[rank2]: Cuda failure 1 'invalid argument'
[rank5]: Traceback (most recent call last):
[rank5]:   File "/tmp/test.py", line 8, in <module>
[rank5]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank5]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank5]:     return func(*args, **kwargs)
[rank5]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank5]:     work = group.allreduce([tensor], opts)
[rank5]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank5]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank5]: Last error:
[rank5]: Cuda failure 1 'invalid argument'
[rank1]: Traceback (most recent call last):
[rank1]:   File "/tmp/test.py", line 8, in <module>
[rank1]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank1]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank1]:     return func(*args, **kwargs)
[rank1]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank1]:     work = group.allreduce([tensor], opts)
[rank1]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank1]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank1]: Last error:
[rank1]: Cuda failure 1 'invalid argument'
[rank7]: Traceback (most recent call last):
[rank7]:   File "/tmp/test.py", line 8, in <module>
[rank7]:     dist.all_reduce(data, op=dist.ReduceOp.SUM)
[rank7]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/c10d_logger.py", line 79, in wrapper
[rank7]:     return func(*args, **kwargs)
[rank7]:   File "/opt/conda/lib/python3.10/site-packages/torch/distributed/distributed_c10d.py", line 2288, in all_reduce
[rank7]:     work = group.allreduce([tensor], opts)
[rank7]: torch.distributed.DistBackendError: NCCL error in: ../torch/csrc/distributed/c10d/NCCLUtils.hpp:275, unhandled cuda error (run with NCCL_DEBUG=INFO for details), NCCL version 2.20.5
[rank7]: ncclUnhandledCudaError: Call to CUDA function failed.
[rank7]: Last error:
[rank7]: Cuda failure 1 'invalid argument'
[rank0]:[W820 00:41:41.469713207 ProcessGroupNCCL.cpp:1168] Warning: WARNING: process group has NOT been destroyed before we destruct ProcessGroupNCCL. On normal program exit, the application should call destroy_process_group to ensure that any pending NCCL operations have finished in this process. In rare cases this process can exit before this point and block the progress of another member of the process group. This constraint has always been present,  but this warning has only been added since PyTorch 2.4 (function operator())
W0820 00:41:41.444000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 1906 closing signal SIGTERM
W0820 00:41:41.444000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 1907 closing signal SIGTERM
W0820 00:41:41.444000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 1908 closing signal SIGTERM
W0820 00:41:41.444000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 1909 closing signal SIGTERM
W0820 00:41:41.444000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 1910 closing signal SIGTERM
W0820 00:41:41.444000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 1911 closing signal SIGTERM
W0820 00:41:41.445000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:858] Sending process 1913 closing signal SIGTERM
E0820 00:41:41.823000 139845152036672 torch/distributed/elastic/multiprocessing/api.py:833] failed (exitcode: 1) local_rank: 6 (pid: 1912) of binary: /opt/conda/bin/python
Traceback (most recent call last):
  File "/opt/conda/bin/torchrun", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 348, in wrapper
    return f(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/run.py", line 901, in main
    run(args)
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/run.py", line 892, in run
    elastic_launch(
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 133, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
  File "/opt/conda/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 264, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError: 
============================================================
test.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2024-08-20_00:41:41
  host      : h100-kickoff-llama-70b-chat-20230328-bcf75c4f6-c9ph8
  rank      : 6 (local_rank: 6)
  exitcode  : 1 (pid: 1912)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
youkaichao commented 3 weeks ago

If you want to learn more, you can try `export NCCL_DEBUG_SUBSYS=ALL`. It will give you more information.
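
For reference, a minimal NCCL sanity check along the lines of the `/tmp/test.py` in the traceback above might look like the sketch below (the real script's contents and tensor sizes are assumptions; only the `dist.all_reduce` call is taken from the traceback):

```python
# Minimal NCCL all-reduce sanity check (a sketch, not the exact /tmp/test.py).
# Run with, e.g.:
#   NCCL_DEBUG=INFO NCCL_DEBUG_SUBSYS=ALL torchrun --nproc-per-node=8 test.py
import os

import torch
import torch.distributed as dist

dist.init_process_group(backend="nccl")        # torchrun supplies rank/world size
local_rank = int(os.environ["LOCAL_RANK"])     # set by torchrun for each worker
torch.cuda.set_device(local_rank)              # pin this process to its own GPU

data = torch.ones(1024, device="cuda")         # arbitrary test tensor
dist.all_reduce(data, op=dist.ReduceOp.SUM)    # the call that fails in the logs above
torch.cuda.synchronize()

print(f"rank {dist.get_rank()}: all_reduce ok, data[0] = {data[0].item()}")
dist.destroy_process_group()                   # avoids the PyTorch 2.4 shutdown warning
```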

xiejibing commented 3 weeks ago

@youkaichao Thanks. The NCCL error may be caused by the hardware or driver. We changed to another node and the NCCL check passed.

We also ran into another issue when loading the Llama-3.1-405B-FP8 model. We used the official script to start vLLM; could you please help take a look:

/opt/conda/lib/python3.10/site-packages/transformers/utils/hub.py:127: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.
  warnings.warn(
The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
WARNING 08-20 23:07:21 arg_utils.py:766] Chunked prefill is enabled by default for models with max_model_len > 32K. Currently, chunked prefill might not work with some features or models. If you encounter any issues, please disable chunked prefill by setting --enable-chunked-prefill=False.
INFO 08-20 23:07:21 config.py:820] Chunked prefill is enabled with max_num_batched_tokens=512.
INFO 08-20 23:07:21 llm_engine.py:174] Initializing an LLM engine (v0.5.4) with config: model='/mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723', speculative_config=None, tokenizer='/mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723', skip_tokenizer_init=False, tokenizer_mode=auto, revision=None, rope_scaling=None, rope_theta=None, tokenizer_revision=None, trust_remote_code=False, dtype=torch.bfloat16, max_seq_len=131072, download_dir=None, load_format=LoadFormat.AUTO, tensor_parallel_size=8, pipeline_parallel_size=1, disable_custom_all_reduce=False, quantization=fbgemm_fp8, enforce_eager=False, kv_cache_dtype=auto, quantization_param_path=None, device_config=cuda, decoding_config=DecodingConfig(guided_decoding_backend='outlines'), observability_config=ObservabilityConfig(otlp_traces_endpoint=None), seed=0, served_model_name=/mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723, use_v2_block_manager=False, enable_prefix_caching=False)
WARNING 08-20 23:07:21 multiproc_gpu_executor.py:59] Reducing Torch parallelism from 112 threads to 1 to avoid unnecessary CPU contention. Set OMP_NUM_THREADS in the external environment to tune this value as needed.
INFO 08-20 23:07:21 custom_cache_manager.py:17] Setting Triton cache manager to: vllm.triton_utils.custom_cache_manager:CustomCacheManager
(VllmWorkerProcess pid=1615) INFO 08-20 23:07:21 multiproc_worker_utils.py:215] Worker ready; awaiting tasks
(VllmWorkerProcess pid=1616) INFO 08-20 23:07:21 multiproc_worker_utils.py:215] Worker ready; awaiting tasks
(VllmWorkerProcess pid=1617) INFO 08-20 23:07:21 multiproc_worker_utils.py:215] Worker ready; awaiting tasks
(VllmWorkerProcess pid=1618) INFO 08-20 23:07:21 multiproc_worker_utils.py:215] Worker ready; awaiting tasks
(VllmWorkerProcess pid=1620) INFO 08-20 23:07:21 multiproc_worker_utils.py:215] Worker ready; awaiting tasks
(VllmWorkerProcess pid=1619) INFO 08-20 23:07:21 multiproc_worker_utils.py:215] Worker ready; awaiting tasks
(VllmWorkerProcess pid=1614) INFO 08-20 23:07:21 multiproc_worker_utils.py:215] Worker ready; awaiting tasks
(VllmWorkerProcess pid=1620) INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
(VllmWorkerProcess pid=1620) INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
(VllmWorkerProcess pid=1614) INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
(VllmWorkerProcess pid=1614) INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
(VllmWorkerProcess pid=1615) INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
(VllmWorkerProcess pid=1616) INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
(VllmWorkerProcess pid=1615) INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
(VllmWorkerProcess pid=1617) INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
(VllmWorkerProcess pid=1616) INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
(VllmWorkerProcess pid=1619) INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
(VllmWorkerProcess pid=1618) INFO 08-20 23:07:24 utils.py:841] Found nccl from library libnccl.so.2
(VllmWorkerProcess pid=1617) INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
(VllmWorkerProcess pid=1618) INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
(VllmWorkerProcess pid=1619) INFO 08-20 23:07:24 pynccl.py:63] vLLM is using nccl==2.20.5
INFO 08-20 23:07:30 custom_all_reduce_utils.py:203] generating GPU P2P access cache in /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
(VllmWorkerProcess pid=1620) INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
(VllmWorkerProcess pid=1619) INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
(VllmWorkerProcess pid=1617) INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
(VllmWorkerProcess pid=1616) INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
(VllmWorkerProcess pid=1614) INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
(VllmWorkerProcess pid=1618) INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
(VllmWorkerProcess pid=1615) INFO 08-20 23:07:55 custom_all_reduce_utils.py:234] reading GPU P2P access cache from /tmp/gpu_p2p_access_cache_for_0,1,2,3,4,5,6,7.json
INFO 08-20 23:07:55 shm_broadcast.py:235] vLLM message queue communication handle: Handle(connect_ip='127.0.0.1', local_reader_ranks=[1, 2, 3, 4, 5, 6, 7], buffer=<vllm.distributed.device_communicators.shm_broadcast.ShmRingBuffer object at 0x7f110fe0cd90>, local_subscribe_port=34023, remote_subscribe_port=None)
INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
(VllmWorkerProcess pid=1617) INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
(VllmWorkerProcess pid=1615) INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
(VllmWorkerProcess pid=1614) INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
(VllmWorkerProcess pid=1616) INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
(VllmWorkerProcess pid=1618) INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
(VllmWorkerProcess pid=1619) INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
(VllmWorkerProcess pid=1620) INFO 08-20 23:07:55 model_runner.py:720] Starting to load model /mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723...
Loading safetensors checkpoint shards:   0% Completed | 0/109 [00:00<?, ?it/s]
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method load_model: start out of range (expected to be in range of [-1024, 1024], but got 1280), Traceback (most recent call last):
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 139, in load_model
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model_runner.load_model()
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/model_runner.py", line 722, in load_model
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model = get_model(model_config=self.model_config,
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/__init__.py", line 21, in get_model
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     return loader.load_model(model_config=model_config,
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/loader.py", line 327, in load_model
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     model.load_weights(
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/models/llama.py", line 498, in load_weights
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     weight_loader(param, loaded_weight, shard_id)
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/layers/linear.py", line 660, in weight_loader
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     loaded_weight = loaded_weight.narrow(output_dim, start_idx,
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] IndexError: start out of range (expected to be in range of [-1024, 1024], but got 1280)
(VllmWorkerProcess pid=1618) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] 
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method load_model: start out of range (expected to be in range of [-1024, 1024], but got 1792), Traceback (most recent call last):
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 139, in load_model
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model_runner.load_model()
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/model_runner.py", line 722, in load_model
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model = get_model(model_config=self.model_config,
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/__init__.py", line 21, in get_model
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     return loader.load_model(model_config=model_config,
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/loader.py", line 327, in load_model
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     model.load_weights(
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/models/llama.py", line 498, in load_weights
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     weight_loader(param, loaded_weight, shard_id)
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/layers/linear.py", line 660, in weight_loader
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     loaded_weight = loaded_weight.narrow(output_dim, start_idx,
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] IndexError: start out of range (expected to be in range of [-1024, 1024], but got 1792)
(VllmWorkerProcess pid=1620) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] 
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method load_model: start out of range (expected to be in range of [-1024, 1024], but got 1536), Traceback (most recent call last):
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 139, in load_model
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model_runner.load_model()
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/model_runner.py", line 722, in load_model
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model = get_model(model_config=self.model_config,
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/__init__.py", line 21, in get_model
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     return loader.load_model(model_config=model_config,
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/loader.py", line 327, in load_model
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     model.load_weights(
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/models/llama.py", line 498, in load_weights
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     weight_loader(param, loaded_weight, shard_id)
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/layers/linear.py", line 660, in weight_loader
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     loaded_weight = loaded_weight.narrow(output_dim, start_idx,
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] IndexError: start out of range (expected to be in range of [-1024, 1024], but got 1536)
(VllmWorkerProcess pid=1619) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] 
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] Exception in worker VllmWorkerProcess while processing method load_model: start (1024) + length (256) exceeds dimension size (1024)., Traceback (most recent call last):
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 223, in _run_worker_process
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     output = executor(*args, **kwargs)
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/worker.py", line 139, in load_model
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model_runner.load_model()
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/worker/model_runner.py", line 722, in load_model
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     self.model = get_model(model_config=self.model_config,
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/__init__.py", line 21, in get_model
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     return loader.load_model(model_config=model_config,
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/model_loader/loader.py", line 327, in load_model
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     model.load_weights(
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/models/llama.py", line 498, in load_weights
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     weight_loader(param, loaded_weight, shard_id)
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]   File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/layers/linear.py", line 660, in weight_loader
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226]     loaded_weight = loaded_weight.narrow(output_dim, start_idx,
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] RuntimeError: start (1024) + length (256) exceeds dimension size (1024).
(VllmWorkerProcess pid=1617) ERROR 08-20 23:07:59 multiproc_worker_utils.py:226] 
Loading safetensors checkpoint shards:   1% Completed | 1/109 [00:03<06:59,  3.89s/it]
Loading safetensors checkpoint shards:   2% Completed | 2/109 [00:11<11:20,  6.36s/it]
Loading safetensors checkpoint shards:   3% Completed | 3/109 [00:16<10:01,  5.68s/it]
Loading safetensors checkpoint shards:   4% Completed | 4/109 [00:22<09:58,  5.70s/it]
Loading safetensors checkpoint shards:   5% Completed | 5/109 [00:27<09:38,  5.56s/it]
Loading safetensors checkpoint shards:   6% Completed | 6/109 [00:32<09:03,  5.27s/it]
Loading safetensors checkpoint shards:   6% Completed | 7/109 [00:37<08:55,  5.25s/it]
Loading safetensors checkpoint shards:   7% Completed | 8/109 [00:43<09:02,  5.38s/it]
Loading safetensors checkpoint shards:   8% Completed | 9/109 [00:48<08:51,  5.31s/it]
Loading safetensors checkpoint shards:   9% Completed | 10/109 [00:53<08:31,  5.17s/it]
Loading safetensors checkpoint shards:  10% Completed | 11/109 [00:58<08:07,  4.97s/it]
Loading safetensors checkpoint shards:  11% Completed | 12/109 [01:02<07:50,  4.85s/it]
Loading safetensors checkpoint shards:  12% Completed | 13/109 [01:07<07:39,  4.79s/it]
Loading safetensors checkpoint shards:  13% Completed | 14/109 [01:12<07:38,  4.82s/it]
Loading safetensors checkpoint shards:  14% Completed | 15/109 [01:16<07:24,  4.73s/it]
Loading safetensors checkpoint shards:  15% Completed | 16/109 [01:21<07:15,  4.68s/it]
Loading safetensors checkpoint shards:  16% Completed | 17/109 [01:26<07:15,  4.73s/it]
Loading safetensors checkpoint shards:  17% Completed | 18/109 [01:30<07:06,  4.68s/it]
Loading safetensors checkpoint shards:  17% Completed | 19/109 [01:35<07:16,  4.85s/it]
Loading safetensors checkpoint shards:  18% Completed | 20/109 [01:43<08:33,  5.77s/it]
Loading safetensors checkpoint shards:  19% Completed | 21/109 [01:48<08:04,  5.51s/it]
Loading safetensors checkpoint shards:  20% Completed | 22/109 [01:56<08:53,  6.14s/it]
Loading safetensors checkpoint shards:  21% Completed | 23/109 [02:00<08:04,  5.64s/it]
Loading safetensors checkpoint shards:  22% Completed | 24/109 [02:06<08:07,  5.74s/it]
Loading safetensors checkpoint shards:  23% Completed | 25/109 [02:15<09:17,  6.64s/it]
Loading safetensors checkpoint shards:  24% Completed | 26/109 [02:23<09:38,  6.97s/it]
Loading safetensors checkpoint shards:  25% Completed | 27/109 [02:27<08:32,  6.25s/it]
Loading safetensors checkpoint shards:  26% Completed | 28/109 [02:32<07:43,  5.72s/it]
Loading safetensors checkpoint shards:  27% Completed | 29/109 [02:37<07:25,  5.57s/it]
Loading safetensors checkpoint shards:  28% Completed | 30/109 [02:42<07:06,  5.40s/it]
Loading safetensors checkpoint shards:  28% Completed | 31/109 [02:47<06:50,  5.26s/it]
Loading safetensors checkpoint shards:  29% Completed | 32/109 [02:51<06:29,  5.05s/it]
Loading safetensors checkpoint shards:  30% Completed | 33/109 [02:56<06:11,  4.88s/it]
Loading safetensors checkpoint shards:  31% Completed | 34/109 [03:02<06:40,  5.35s/it]
Loading safetensors checkpoint shards:  32% Completed | 35/109 [03:07<06:19,  5.13s/it]
Loading safetensors checkpoint shards:  33% Completed | 36/109 [03:16<07:28,  6.15s/it]
Loading safetensors checkpoint shards:  34% Completed | 37/109 [03:20<06:51,  5.72s/it]
Loading safetensors checkpoint shards:  35% Completed | 38/109 [03:26<06:46,  5.72s/it]
Loading safetensors checkpoint shards:  36% Completed | 39/109 [03:35<07:41,  6.60s/it]
Loading safetensors checkpoint shards:  37% Completed | 40/109 [03:40<07:15,  6.31s/it]
Loading safetensors checkpoint shards:  38% Completed | 41/109 [03:45<06:41,  5.90s/it]
Loading safetensors checkpoint shards:  39% Completed | 42/109 [03:50<06:20,  5.68s/it]
Loading safetensors checkpoint shards:  39% Completed | 43/109 [03:55<05:58,  5.43s/it]
Loading safetensors checkpoint shards:  40% Completed | 44/109 [04:00<05:37,  5.19s/it]
Loading safetensors checkpoint shards:  41% Completed | 45/109 [04:05<05:27,  5.11s/it]
Loading safetensors checkpoint shards:  42% Completed | 46/109 [04:14<06:45,  6.44s/it]
Loading safetensors checkpoint shards:  43% Completed | 47/109 [04:23<07:29,  7.25s/it]
Loading safetensors checkpoint shards:  44% Completed | 48/109 [04:29<06:55,  6.81s/it]
Loading safetensors checkpoint shards:  45% Completed | 49/109 [04:36<06:40,  6.67s/it]
Loading safetensors checkpoint shards:  46% Completed | 50/109 [04:40<06:01,  6.13s/it]
Loading safetensors checkpoint shards:  47% Completed | 51/109 [04:42<04:29,  4.64s/it]
Loading safetensors checkpoint shards:  48% Completed | 52/109 [04:48<04:50,  5.09s/it]
Loading safetensors checkpoint shards:  49% Completed | 53/109 [04:53<04:43,  5.06s/it]
Loading safetensors checkpoint shards:  50% Completed | 54/109 [04:59<04:51,  5.31s/it]
Loading safetensors checkpoint shards:  50% Completed | 55/109 [05:08<05:56,  6.60s/it]
Loading safetensors checkpoint shards:  51% Completed | 56/109 [05:14<05:31,  6.26s/it]
Loading safetensors checkpoint shards:  52% Completed | 57/109 [05:19<05:09,  5.95s/it]
Loading safetensors checkpoint shards:  53% Completed | 58/109 [05:24<04:56,  5.81s/it]
Loading safetensors checkpoint shards:  54% Completed | 59/109 [05:30<04:46,  5.72s/it]
Loading safetensors checkpoint shards:  55% Completed | 60/109 [05:38<05:18,  6.50s/it]
Loading safetensors checkpoint shards:  56% Completed | 61/109 [05:44<04:57,  6.20s/it]
Loading safetensors checkpoint shards:  57% Completed | 62/109 [05:50<04:45,  6.07s/it]
Loading safetensors checkpoint shards:  58% Completed | 63/109 [05:55<04:36,  6.01s/it]
Loading safetensors checkpoint shards:  59% Completed | 64/109 [06:04<05:01,  6.71s/it]
Loading safetensors checkpoint shards:  60% Completed | 65/109 [06:09<04:39,  6.36s/it]
Loading safetensors checkpoint shards:  61% Completed | 66/109 [06:17<04:47,  6.68s/it]
Loading safetensors checkpoint shards:  61% Completed | 67/109 [06:25<05:04,  7.26s/it]
Loading safetensors checkpoint shards:  62% Completed | 68/109 [06:31<04:33,  6.68s/it]
Loading safetensors checkpoint shards:  63% Completed | 69/109 [06:36<04:11,  6.28s/it]
Loading safetensors checkpoint shards:  64% Completed | 70/109 [06:41<03:52,  5.97s/it]
Loading safetensors checkpoint shards:  65% Completed | 71/109 [06:48<04:00,  6.34s/it]
Loading safetensors checkpoint shards:  66% Completed | 72/109 [06:57<04:23,  7.12s/it]
Loading safetensors checkpoint shards:  67% Completed | 73/109 [07:06<04:35,  7.64s/it]
Loading safetensors checkpoint shards:  68% Completed | 74/109 [07:13<04:23,  7.52s/it]
Loading safetensors checkpoint shards:  69% Completed | 75/109 [07:22<04:26,  7.83s/it]
Loading safetensors checkpoint shards:  70% Completed | 76/109 [07:27<03:51,  7.03s/it]
Loading safetensors checkpoint shards:  71% Completed | 77/109 [07:34<03:38,  6.82s/it]
Loading safetensors checkpoint shards:  72% Completed | 78/109 [07:39<03:15,  6.30s/it]
Loading safetensors checkpoint shards:  72% Completed | 79/109 [07:43<02:53,  5.77s/it]
Loading safetensors checkpoint shards:  73% Completed | 80/109 [07:45<02:11,  4.52s/it]
Loading safetensors checkpoint shards:  74% Completed | 81/109 [07:53<02:39,  5.68s/it]
Loading safetensors checkpoint shards:  75% Completed | 82/109 [07:59<02:34,  5.73s/it]
Loading safetensors checkpoint shards:  76% Completed | 83/109 [08:05<02:33,  5.90s/it]
Loading safetensors checkpoint shards:  77% Completed | 84/109 [08:14<02:46,  6.64s/it]
Loading safetensors checkpoint shards:  78% Completed | 85/109 [08:19<02:29,  6.23s/it]
Loading safetensors checkpoint shards:  79% Completed | 86/109 [08:24<02:18,  6.01s/it]
Loading safetensors checkpoint shards:  80% Completed | 87/109 [08:29<02:03,  5.63s/it]
Loading safetensors checkpoint shards:  81% Completed | 88/109 [08:37<02:12,  6.31s/it]
Loading safetensors checkpoint shards:  82% Completed | 89/109 [08:42<02:00,  6.03s/it]
Loading safetensors checkpoint shards:  83% Completed | 90/109 [08:44<01:30,  4.74s/it]
Loading safetensors checkpoint shards:  83% Completed | 91/109 [08:49<01:23,  4.63s/it]
Loading safetensors checkpoint shards:  84% Completed | 92/109 [08:56<01:32,  5.44s/it]
Loading safetensors checkpoint shards:  85% Completed | 93/109 [09:01<01:26,  5.41s/it]
Loading safetensors checkpoint shards:  86% Completed | 94/109 [09:05<01:15,  5.03s/it]
Loading safetensors checkpoint shards:  87% Completed | 95/109 [09:11<01:13,  5.22s/it]
Loading safetensors checkpoint shards:  88% Completed | 96/109 [09:16<01:08,  5.25s/it]
Loading safetensors checkpoint shards:  89% Completed | 97/109 [09:21<01:01,  5.11s/it]
Loading safetensors checkpoint shards:  90% Completed | 98/109 [09:29<01:03,  5.80s/it]
Loading safetensors checkpoint shards:  91% Completed | 99/109 [09:34<00:55,  5.55s/it]
Loading safetensors checkpoint shards:  92% Completed | 100/109 [09:40<00:53,  5.94s/it]
Loading safetensors checkpoint shards:  93% Completed | 101/109 [09:45<00:44,  5.61s/it]
Loading safetensors checkpoint shards:  94% Completed | 102/109 [09:49<00:36,  5.20s/it]
Loading safetensors checkpoint shards:  94% Completed | 103/109 [09:54<00:29,  4.90s/it]
Loading safetensors checkpoint shards:  95% Completed | 104/109 [09:58<00:24,  4.84s/it]
Loading safetensors checkpoint shards:  96% Completed | 105/109 [10:06<00:22,  5.73s/it]
Loading safetensors checkpoint shards:  97% Completed | 106/109 [10:11<00:16,  5.51s/it]
Loading safetensors checkpoint shards:  98% Completed | 107/109 [10:16<00:10,  5.45s/it]
Loading safetensors checkpoint shards:  99% Completed | 108/109 [10:18<00:04,  4.33s/it]
Loading safetensors checkpoint shards: 100% Completed | 109/109 [10:23<00:00,  4.33s/it]
Loading safetensors checkpoint shards: 100% Completed | 109/109 [10:23<00:00,  5.72s/it]

(VllmWorkerProcess pid=1616) INFO 08-20 23:18:19 model_runner.py:732] Loading model weights took 57.7520 GB
INFO 08-20 23:18:19 model_runner.py:732] Loading model weights took 57.7520 GB
(VllmWorkerProcess pid=1614) INFO 08-20 23:18:19 model_runner.py:732] Loading model weights took 57.7520 GB
(VllmWorkerProcess pid=1615) INFO 08-20 23:18:19 model_runner.py:732] Loading model weights took 57.7520 GB
[rank0]: Traceback (most recent call last):
[rank0]:   File "/dev/shm/test_vllm.py", line 16, in <module>
[rank0]:     llm = LLM(model="/mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723", tensor_parallel_size=8, distributed_executor_backend="mp")
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 158, in __init__
[rank0]:     self.llm_engine = LLMEngine.from_engine_args(
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 445, in from_engine_args
[rank0]:     engine = cls(
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 249, in __init__
[rank0]:     self.model_executor = executor_class(
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/distributed_gpu_executor.py", line 25, in __init__
[rank0]:     super().__init__(*args, **kwargs)
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/executor_base.py", line 47, in __init__
[rank0]:     self._init_executor()
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_gpu_executor.py", line 138, in _init_executor
[rank0]:     self._run_workers("load_model",
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_gpu_executor.py", line 196, in _run_workers
[rank0]:     ] + [output.get() for output in worker_outputs]
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_gpu_executor.py", line 196, in <listcomp>
[rank0]:     ] + [output.get() for output in worker_outputs]
[rank0]:   File "/opt/conda/lib/python3.10/site-packages/vllm/executor/multiproc_worker_utils.py", line 58, in get
[rank0]:     raise self.result.exception
[rank0]: RuntimeError: start (1024) + length (256) exceeds dimension size (1024).
ERROR 08-20 23:18:20 multiproc_worker_utils.py:120] Worker VllmWorkerProcess pid 1615 died, exit code: -15
INFO 08-20 23:18:20 multiproc_worker_utils.py:123] Killing local vLLM worker processes
[rank0]:[W820 23:18:21.755823376 CudaIPCTypes.cpp:16] Producer process has been terminated before all shared CUDA tensors released. See Note [Sharing CUDA tensors]
youkaichao commented 3 weeks ago

@xiejibing

/mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723

Where did you get this model? Is it the official release?

RuntimeError: start (1024) + length (256) exceeds dimension size (1024).

It looks like your model checkpoint does not agree with the model configuration you are loading.
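
For reference, here is a minimal sketch (not vLLM's actual loader code) of how a tensor-parallel rank takes its slice of a k/v projection with torch.narrow, and why a num_key_value_heads value in config.json that disagrees with the checkpoint tensors produces exactly the errors above. The concrete numbers (head_dim = 128, 8 KV heads' worth of rows in the checkpoint, 16 claimed by a stale config, tensor_parallel_size = 8) are assumptions read off the traceback, not official model specs.

```python
import torch

head_dim = 128          # assumed Llama head dimension
ckpt_kv_heads = 8       # rows actually in the checkpoint tensor: 8 * 128 = 1024
config_kv_heads = 16    # what a stale config.json might claim
tp_size = 8
hidden_size = 16384     # input dimension, only needed for the tensor shape

# k_proj weight as it would sit on disk: (ckpt_kv_heads * head_dim, hidden_size)
loaded_weight = torch.empty(ckpt_kv_heads * head_dim, hidden_size)

# Each rank computes its slice from the *config*, then narrows the *checkpoint* tensor.
shard_size = config_kv_heads * head_dim // tp_size   # 256 rows per rank under the wrong config
for tp_rank in range(tp_size):
    start_idx = tp_rank * shard_size                  # 0, 256, ..., 1792
    try:
        shard = loaded_weight.narrow(0, start_idx, shard_size)
        print(f"rank {tp_rank}: ok, shard {tuple(shard.shape)}")
    except (IndexError, RuntimeError) as e:
        # ranks 4-7 fail just like the log: starts 1024/1280/1536/1792 against a dim of 1024
        print(f"rank {tp_rank}: {e}")
```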

xiejibing commented 3 weeks ago

@youkaichao Thanks. We downloaded the model files from https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-FP8/tree/main about 8 days ago. I will check the integrity of the model files.
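
One way to do that check without loading the full 405B tensors is to compare num_key_value_heads in config.json against the actual shape of a k_proj tensor in the safetensors shards. A rough sketch follows; the layer-0 tensor name and the assumption that head_dim = hidden_size / num_attention_heads are illustrative, not verified against this exact repo.

```python
import glob
import json

from safetensors import safe_open

model_dir = "/mnt/llm-models/405B/models/llm-demo-project/Llama-3_1-405B-FP8/240723"

with open(f"{model_dir}/config.json") as f:
    cfg = json.load(f)

head_dim = cfg["hidden_size"] // cfg["num_attention_heads"]
print("config.json expects k_proj output rows:", cfg["num_key_value_heads"] * head_dim)

# Find which shard holds layer 0's k_proj; get_slice only reads metadata, so this stays cheap.
target = "model.layers.0.self_attn.k_proj.weight"
for shard in sorted(glob.glob(f"{model_dir}/*.safetensors")):
    with safe_open(shard, framework="pt", device="cpu") as f:
        if target in f.keys():
            print("checkpoint k_proj shape:", f.get_slice(target).get_shape())
            break
```

If the first number and the leading dimension of the second disagree, the config and the weights came from different revisions of the repo.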

xiejibing commented 3 weeks ago

Update: the IndexError: start out of range comes from the num_key_value_heads change in the config.json at https://huggingface.co/meta-llama/Meta-Llama-3.1-405B-FP8/tree/main. That file was updated 7 days ago, but we downloaded the model 8 days ago, so our local copy predates the change.
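
Working backwards from the error messages is consistent with that reading. Assuming head_dim = 128 and tensor_parallel_size = 8, the "dimension size (1024)" implies the checkpoint tensors carry 8 KV heads' worth of rows, while the per-rank "length (256)" implies the config used at load time declared 16 KV heads; the update to config.json presumably brought the two back into agreement.

```python
# Back-of-the-envelope check of the numbers in the traceback.
# Assumptions: Llama head_dim = 128, tensor_parallel_size = 8, and that the
# narrowed dimension is the k/v projection's output rows.
head_dim, tp_size = 128, 8

ckpt_dim = 1024    # "dimension size (1024)" from the traceback
shard_size = 256   # "length (256)" from the traceback

print("KV heads in the checkpoint weights:", ckpt_dim // head_dim)                 # -> 8
print("KV heads implied by the config in use:", shard_size * tp_size // head_dim)  # -> 16
```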