jina-ai / thinkgpt

Agent techniques to augment your LLM and push it beyond its limits

chunked_summarize() just returns original input data #14

Open alexcg1 opened 1 year ago

alexcg1 commented 1 year ago

`llm.summarize()` works exactly as expected. `llm.chunked_summarize()` just returns the original input data without doing any processing behind the scenes (the output comes back instantly, so I don't think it's sending anything to OpenAI).

My code:

```python
import requests
from thinkgpt.llm import ThinkGPT

llm = ThinkGPT(model_name='gpt-3.5-turbo', temperature=0)

# Use the project's own README as a long input to summarize
URL = 'https://raw.githubusercontent.com/jina-ai/thinkgpt/main/README.md'
input_data = requests.get(URL).text

summary = llm.chunked_summarize(
    input_data,
    instruction_hint='Rewrite the following into an informal, SEO-friendly blog post in markdown format',
    max_tokens=4096,
)
```
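
For what it's worth, a quick check like the one below (just a sketch; `tiktoken` is not part of thinkgpt and would need to be installed separately) should confirm that the returned "summary" is byte-for-byte the original input, and roughly how many tokens that input is:

```python
import tiktoken

# If chunked_summarize() is silently skipping the LLM call, the "summary"
# should be identical to the raw input
print(summary == input_data)

# Rough token count for the input, using the gpt-3.5-turbo encoding
encoding = tiktoken.encoding_for_model('gpt-3.5-turbo')
print(len(encoding.encode(input_data)))
```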

It returns:

# ThinkGPT 🧠🤖
<a href="https://discord.jina.ai"><img src="https://img.shields.io/discord/1106542220112302130?logo=discord&logoColor=white&style=flat-square"></a>

ThinkGPT is a Python library aimed at implementing Chain of Thoughts for Large Language Models (LLMs), prompting the model to think, reason, and to create generative agents. 
The library aims to help with the following:
* solve limited context with long memory and compressed knowledge
* enhance LLMs' one-shot reasoning with higher order reasoning primitives
* add intelligent decisions to your code base

And then the rest of the README.

However, if I change that to `llm.summarize()`, I get:

Introducing ThinkGPT: A Python Library for Large Language Models

ThinkGPT is a Python library that implements Chain of Thoughts for Large Language Models (LLMs), prompting the model to think, reason, and create generative agents. The library aims to solve limited context with long memory and compressed knowledge, enhance LLMs' one-shot reasoning with higher order reasoning primitives, and add intelligent decisions to your code base.

Key Features:
- Thinking building blocks: Memory, Self-refinement, Compress knowledge, Inference, and Natural Language Conditions
- Efficient and Measurable GPT context length
- Extremely easy setup and pythonic API thanks to DocArray

And then the rest of a short blog post.
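
In case it helps anyone hitting the same issue, here's a rough manual workaround built only on `llm.summarize()`. This is just a sketch, not thinkgpt's own API: `summarize_in_chunks` and the character-based `chunk_chars` budget are mine, and it assumes `llm.summarize()` accepts the same `instruction_hint` argument as `chunked_summarize()`:

```python
def summarize_in_chunks(llm, text, instruction_hint='', chunk_chars=8000):
    """Summarize long text by splitting it into chunks and summarizing each one."""
    # Naive fixed-size character chunking; a token-aware splitter would be better
    chunks = [text[i:i + chunk_chars] for i in range(0, len(text), chunk_chars)]
    partial = [llm.summarize(chunk, instruction_hint=instruction_hint) for chunk in chunks]
    # One final pass to merge the per-chunk summaries
    return llm.summarize('\n\n'.join(partial), instruction_hint=instruction_hint)


summary = summarize_in_chunks(
    llm,
    input_data,
    instruction_hint='Rewrite the following into an informal, SEO-friendly blog post in markdown format',
)
```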

alexcg1 commented 1 year ago

I'm using the latest version of thinkgpt, installed via `pip install git+...`