meta-llama / llama

Inference code for Llama models

how to use llama to finish the text summarization task #310

Open whzyf951620 opened 1 year ago

whzyf951620 commented 1 year ago

I am a beginner in NLP. I want to use LLaMA for a text summarization task. I have tried many prompts, such as `[abstract]:` and `[text summarization]:`, and I have also given the model an example summarization in the prompt, but none of them helped much. Is fine-tuning or some other method needed for text summarization?

ChukwumaChukwuma commented 1 year ago

```python
import os
import sys

# setup_model_parallel and load are the helpers defined in this repo's example.py

def main(
    ckpt_dir: str,
    tokenizer_path: str,
    temperature: float = 0.8,
    top_p: float = 0.95,
    max_seq_len: int = 512,
    max_batch_size: int = 32,
):
    local_rank, world_size = setup_model_parallel()
    if local_rank > 0:
        sys.stdout = open(os.devnull, "w")

    generator = load(
        ckpt_dir, tokenizer_path, local_rank, world_size, max_seq_len, max_batch_size
    )

    # Define the input text for summarization
    text = """
    This is a long piece of text that you want to summarize. It contains multiple sentences
    and paragraphs. The goal is to generate a concise summary that captures the main points
    of the text.
    """

    # Generate the summary
    prompt = f"[text summarization]: {text}"
    results = generator.generate(
        [prompt], max_gen_len=256, temperature=temperature, top_p=top_p
    )

    for result in results:
        print(result)
        print("\n==================================\n")
```

In this example, I've added a `main` function that takes the input text and passes it to the generator as a summarization prompt. You can change the `text` variable to hold your actual input. The prompt is prefixed with `[text summarization]:` to signal the task to the model, and the generated summaries are printed out.
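One caveat: the base LLaMA checkpoints are plain completion models, not instruction-tuned, so a bracketed tag like `[text summarization]:` may not steer them reliably. A few-shot completion format that ends with an open `TL;DR:` cue often works better, since the model only has to continue the pattern. Here is a minimal sketch of building such a prompt; the helper name and the example article/summary pair are my own placeholders, not part of this repo:

```python
# Sketch: build a few-shot "TL;DR:"-style prompt for a base (non-instruction-tuned)
# model. The example pairs below are placeholders -- substitute real in-domain ones.
def build_summarization_prompt(examples, text):
    """examples: list of (article, summary) pairs; text: the article to summarize."""
    parts = []
    for article, summary in examples:
        parts.append(f"{article.strip()}\nTL;DR: {summary.strip()}\n")
    # End with the target article and an open "TL;DR:" for the model to complete.
    parts.append(f"{text.strip()}\nTL;DR:")
    return "\n".join(parts)

examples = [
    (
        "The city council met on Tuesday and voted 7-2 to extend the park's "
        "opening hours through the summer months.",
        "The council voted to extend summer park hours.",
    ),
]
prompt = build_summarization_prompt(examples, "Your long input text goes here.")
print(prompt)
```

You would then pass this `prompt` to `generator.generate` in place of the `[text summarization]:` one above, and optionally truncate the output at the first newline to keep only the summary.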

Make sure you have the necessary dependencies and models installed to run the code successfully.
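For completeness, the scripts in this repo are normally launched with `torchrun` rather than plain `python`, with `--nproc_per_node` set to the number of model-parallel shards for your checkpoint (1 for the 7B model). A rough launch command, with placeholder paths and a hypothetical `summarize.py` containing the code above:

```shell
# Placeholder paths -- point these at your downloaded checkpoint and tokenizer.
torchrun --nproc_per_node 1 summarize.py \
    --ckpt_dir ./llama-7b/ \
    --tokenizer_path ./tokenizer.model
```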