randaller / llama-chat

Chat with Meta's LLaMA models at home made easy
GNU General Public License v3.0

We found a way to hide the annoying progress bar #27

Open snow-wind-001 opened 1 year ago

snow-wind-001 commented 1 year ago

We found a way to hide the annoying progress bar:

In `/llama-chat/llama/model.py`, line 261, replace
`for layer in tqdm(self.layers, desc="flayers", leave=True):`
with
`for layer in self.layers:`

In `/llama-chat/llama/generation.py`, line 60, replace
`for cur_pos in trange(start_pos, total_len, desc="forward"):`
with
`for cur_pos in range(start_pos, total_len):`
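For anyone applying the change, here is a minimal, self-contained sketch of the idea (the `layers` list and loop bodies are stand-ins, not the repository's actual code); note that tqdm also accepts a `disable=True` keyword, which silences the bar without removing the call:

```python
from tqdm import tqdm

layers = ["layer0", "layer1", "layer2"]  # stand-in for self.layers

# Option 1 (what the patch above does): drop tqdm and iterate directly
for layer in layers:
    pass  # forward pass through this layer would go here

# Option 2: keep the tqdm call but suppress its output with disable=True
for layer in tqdm(layers, desc="layers", disable=True):
    pass  # forward pass through this layer would go here
```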

meeeo commented 1 year ago

Thanks, but it still seems... can we have a chat just like llama.cpp does? (two screenshots attached: Screen Shot 2023-04-04 at 11:20:41 AM, Screen Shot 2023-04-04 at 11:22:22 AM)