Significant-Gravitas / AutoGPT

AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
https://agpt.co

Send_tweet gives an error #2194

Closed JTemmink closed 1 year ago

JTemmink commented 1 year ago

Steps to reproduce 🕹

I get the following error when running the send_tweet command on the stable version. The API keys should be correct.

NEXT ACTION: COMMAND = send_tweet ARGUMENTS = {'text': 'Test'} SYSTEM: Command send_tweet returned: Error: 'Forbidden' object has no attribute 'reason'

Current behavior 😯

No response

Expected behavior 🤔

No response

Pinguu-dev commented 1 year ago

Same issue here

theRealOZ commented 1 year ago

Also getting this issue.

mossy426 commented 1 year ago

Same

ntindle commented 1 year ago

Are you authorized to use the Twitter API normally? Can you use it outside of AutoGPT?
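
(If you want a quick way to check outside of AutoGPT, a minimal tweepy script along these lines should tell you whether the keys themselves are the problem. This is only a sketch: it reuses the TW_* variables from AutoGPT's .env and tweepy's v2 Client; nothing here is AutoGPT code.)

import os

import tweepy
from dotenv import load_dotenv

load_dotenv()

# Build a v2 client from the same TW_* variables AutoGPT reads from .env
client = tweepy.Client(consumer_key=os.environ["TW_CONSUMER_KEY"],
                consumer_secret=os.environ["TW_CONSUMER_SECRET"],
                access_token=os.environ["TW_ACCESS_TOKEN"],
                access_token_secret=os.environ["TW_ACCESS_TOKEN_SECRET"])

try:
    me = client.get_me()  # simple authenticated call using the user tokens
    print("Authenticated as: {}".format(me.data.username))
except tweepy.errors.TweepyException as e:
    print("Credentials are not working: {}".format(e))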

Kendhal7 commented 1 year ago

Try modifying the twitter.py file in the /autogpt/commands/ folder with this:



import os

import tweepy
from dotenv import load_dotenv

load_dotenv()

def send_tweet(tweet_text):
    consumer_key = os.environ.get("TW_CONSUMER_KEY")
    consumer_secret = os.environ.get("TW_CONSUMER_SECRET")
    access_token = os.environ.get("TW_ACCESS_TOKEN")
    access_token_secret = os.environ.get("TW_ACCESS_TOKEN_SECRET")
    # Authenticate to Twitter
    client = tweepy.Client(consumer_key=consumer_key,
                    consumer_secret=consumer_secret,
                    access_token=access_token,
                    access_token_secret=access_token_secret)
    # Send tweet
    try:
        client.create_tweet(text=tweet_text)
        print("Tweet sent successfully!")
    except tweepy.TweepyException as e:
        # tweepy 4.x exceptions have no .reason attribute, so print the exception itself
        print("Error sending tweet: {}".format(e))

doutv commented 1 year ago

Update: See @mossy426 solution below.

The problem is that a free Twitter API account can only use Twitter API v2 for tweet creation: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api

You can use tweepy.Client, which supports Twitter API v2, instead of tweepy.API.
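
Roughly, the difference looks like this (just a sketch assuming tweepy 4.x; the placeholder strings stand for real credentials, and update_status is the v1.1 endpoint the free tier rejects):

import tweepy

# Twitter API v1.1 via tweepy.API - free accounts get 403 Forbidden on statuses/update
auth = tweepy.OAuth1UserHandler("CONSUMER_KEY", "CONSUMER_SECRET",
                                "ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)
# api.update_status("Hello")  # -> Forbidden on the free tier

# Twitter API v2 via tweepy.Client - tweet creation is allowed on the free tier
client = tweepy.Client(consumer_key="CONSUMER_KEY",
                consumer_secret="CONSUMER_SECRET",
                access_token="ACCESS_TOKEN",
                access_token_secret="ACCESS_TOKEN_SECRET")
client.create_tweet(text="Hello")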

However, after I changed to tweepy.Client, the following error occurred:

{
    "title": "Forbidden",
    "status": 403,
    "detail": "Your client app is not configured with the appropriate oauth1 app permissions for this endpoint.",
    "type": "https://api.twitter.com/2/problems/oauth1-permissions"
}

I am not able to set the write permission for the Access Token and Secret in the Developer Portal. The official docs say only a paid account can set permissions: https://developer.twitter.com/en/docs/apps/app-permissions https://stackoverflow.com/a/74495869

mossy426 commented 1 year ago

I've figured it out using Bing! You need to update the code as @Kendhal7 mentioned, but v2 requires a bearer token, and you will need to set up OAuth 2.0 in your dev portal. Here's the solution step by step (for the free Twitter API):

  1. Update the twitter.py file to use the following code for API v2 and include the bearer token in the request:
import os

import tweepy
from dotenv import load_dotenv

load_dotenv()

def send_tweet(tweet_text):
    consumer_key = os.environ.get("TW_CONSUMER_KEY")
    consumer_secret = os.environ.get("TW_CONSUMER_SECRET")
    access_token = os.environ.get("TW_ACCESS_TOKEN")
    access_token_secret = os.environ.get("TW_ACCESS_TOKEN_SECRET")
    bearer_token = os.environ.get("TW_BEARER_TOKEN")
    # Authenticate to Twitter
    client = tweepy.Client(bearer_token=bearer_token,
                    consumer_key=consumer_key,
                    consumer_secret=consumer_secret,
                    access_token=access_token,
                    access_token_secret=access_token_secret)
    # Send tweet
    try:
        client.create_tweet(text=tweet_text)
        print("Tweet sent successfully!")
    except tweepy.TweepyException as e:
        # tweepy 4.x exceptions have no .reason attribute, so print the exception itself
        print("Error sending tweet: {}".format(e))

  2. Go to your Twitter Developer Portal (https://developer.twitter.com/en).
  3. Select your project and app.
  4. In the Settings tab, scroll down and hit the Edit / Set up button under User authentication settings.
  5. Change App permissions to Read and write and Direct message.
  6. Set the type of app to Web App (the 2nd option).
  7. Set the Callback URI & Website URL to just https://twitter.com/ since they don't matter here.
  8. Save, then regenerate your keys and tokens in the Keys and Tokens tab. (Ignore the newly created Client ID/Secret generated after setting up auth; those are not needed.)
  9. Generate a bearer token.
  10. Update the .env file with the new tokens and add the bearer token like so:
    TW_CONSUMER_KEY=[YOUR CREDENTIALS]
    TW_CONSUMER_SECRET=[YOUR CREDENTIALS]
    TW_ACCESS_TOKEN=[YOUR CREDENTIALS]
    TW_ACCESS_TOKEN_SECRET=[YOUR CREDENTIALS]
    TW_BEARER_TOKEN=[YOUR CREDENTIALS]

    That should work :)
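
To double-check that the regenerated access token and secret actually picked up the new write permission, a quick sanity check like this should either succeed or surface the oauth1-permissions 403 again (just a sketch; the placeholder strings stand for your real .env values):

import tweepy

client = tweepy.Client(bearer_token="BEARER_TOKEN",
                consumer_key="CONSUMER_KEY",
                consumer_secret="CONSUMER_SECRET",
                access_token="ACCESS_TOKEN",
                access_token_secret="ACCESS_TOKEN_SECRET")

try:
    client.create_tweet(text="Permission check")
    print("Write permission is active")
except tweepy.errors.Forbidden as e:
    # If the oauth1-permissions 403 still shows up here, the access token/secret
    # predate the permission change - regenerate them (step 8) and try again.
    print("Still forbidden - regenerate your access token and secret: {}".format(e))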

doutv commented 1 year ago

@mossy426 @Kendhal7 Thanks for your help! Can I make a pull request to wrap up all the code changes and update the README?

fewmatty commented 1 year ago

Wow, I have been trying to figure this out for days. Thank you guys!!!

mossy426 commented 1 year ago

@doutv Yes of course! Thanks :)

theRealOZ commented 1 year ago

Tried all of the above and still ran into errors. Then I tried it on the new master branch and it works perfectly without a bearer token; not sure what has changed.

LudovicoPapalia commented 1 year ago

I had the same problem before; now I've solved it thanks to @mossy426. However, now I can't "convince" my AutoGPT to use this script. I created 3 instances and all 3 keep trying to create or edit a file to post on Twitter (which doesn't work). I even tried to create an instance whose only goal is to post "hello world" with the given Twitter API, but it keeps doing random stuff instead of posting...

mossy426 commented 1 year ago

@LudovicoPapalia Interesting, what are the goals for your AI? I specified making a tweet in one of them and it works.

@theRealOZ Did you regenerate all your tokens after adding OAuth2? And then update them in your env variables?

LudovicoPapalia commented 1 year ago

@mossy426 In fact, I even created an instance called "post-tweetGPT" whose only goal is to post a tweet. How did you write the goal? Are you on the free version of the Twitter API?

mossy426 commented 1 year ago

@LudovicoPapalia I am using the free Twitter API version. You might have altered something in memory?

Here's an example I've used in the ai_settings.yaml:

ai_goals:

ai_name: TwitterBot

ai_role: The most popular and ethical social media content creator ai in the world designed to create entertaining, engaging, epic, and sometimes witty posts about cool advancements in science

LudovicoPapalia commented 1 year ago

@mossy426 I will try right now.

Let me just say that to make AutoGPT "see" the twitter.py file, I had to copy it (the @mossy426 version) into the auto_gpt_workspace folder. The keys are in the .env file. I'm trying again now and waiting for the output.

mossy426 commented 1 year ago

Are you using the stable version? Or the master version? That might make a difference as well.

LudovicoPapalia commented 1 year ago

@mossy426 I'm using the stable version. I'm on a Mac, but it has worked well so far... maybe that's the difference. Anyway, here's the output I got when creating a brand new instance.

Create an AI-Assistant: Enter the name of your AI and its role below. Entering nothing will load defaults.
Name your AI: For example, 'Entrepreneur-GPT'
AI Name: post2
post2 here! I am at your service.
Describe your AI's role: For example, 'an AI designed to autonomously develop and run businesses with the sole goal of increasing your net worth.'
post2 is: post a tweet with the string "123" using the given api keys
Enter up to 5 goals for your AI: For example: Increase net worth, Grow Twitter Account, Develop and manage multiple businesses autonomously'
Enter nothing to load defaults, enter nothing when finished.
Goal 1: post a tweet using the given api keys in .env file. use the post_tweet.py file to post
Goal 2:
Using memory of type: LocalCache
Using Browser: chrome
THOUGHTS: I will use the write_to_file command to write the string '123' to a file named 'tweet.txt'
REASONING: I need to save the string '123' to a file so that I can use the post_tweet.py file to post it as a tweet using the given api keys.
PLAN:

mossy426 commented 1 year ago

I would try again but make it less specific about how it should make a tweet. I'd try just telling it to make a tweet without telling it "how" to make one (i.e. use env variables, twitter.py, etc.). It should already know how to do that. Telling it all of that could possibly mess it up? Not entirely sure, but you shouldn't have to specify anything in the prompts/goals about what to use.

LudovicoPapalia commented 1 year ago

I tried it and got the same error... don't know! I'll update if something new happens! Every new idea on how to solve this is very welcome!

doutv commented 1 year ago

@LudovicoPapalia You can try my ai_settings.yaml. It works for me.

Example 1:

ai_goals:
ai_name: Twitter-GPT
ai_role: send a tweet "Hello World! I am Auto-GPT."

Example 2:

ai_goals:
ai_name: Twitter-GPT
ai_role: Search latest research paper published in 3 days about GPT, summarize it within 50 characters, and send a tweet with the paper url.

ntindle commented 1 year ago

This has moved to a module in Auto-GPT-Plugins. Please comment there.

katmai commented 1 year ago

This has moved to a module in Auto-GPT-Plugins. Please comment there.

That plugin is not working. I suggest making tweeting a core functionality that is actively maintained, rather than a plugin.

anonhostpi commented 1 year ago

That plugin is not working. I suggest making tweeting a core functionality that is actively maintained, rather than a plugin.

No. While the plugin is currently broken, there is a reason that we offloaded it from the main repository.

The issue is the size and traction/traffic of this repository. There have been so many requests for various APIs to be supported by AutoGPT core. In terms of APIs that would be useful to the core, there are new LLM and vector database APIs being released weekly.

We simply don't have the manpower to keep up with these requests (we get them multiple times a day). Just take a look at our issue count. This repository isn't even 3 months old, yet it has more issues than most major source code repositories. Even with ~20 team members, we can barely keep up with all of the feature requests. Keep in mind none of us are paid; we do this for free. In fact, maintaining this repo costs most of the team members money.

However, we can support abstracting components of AutoGPT, which is why we ended up favoring plugins.

The current plan is to abstract every possible part of AutoGPT, so it can be built from replaceable modules. This would allow the tool to support any LLM API, memory API, command, and plugin.
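
To illustrate what "replaceable modules" means in practice (this is just a rough sketch, not AutoGPT's actual plugin interface), the idea is that an integration like Twitter implements a small command-provider interface and registers its commands, instead of living in core:

from abc import ABC, abstractmethod
from typing import Callable, Dict

# Hypothetical interface - the names here are illustrative, not AutoGPT's real plugin API.
class CommandProvider(ABC):
    @abstractmethod
    def commands(self) -> Dict[str, Callable[..., str]]:
        """Map command names to callables the agent can invoke."""

class TwitterPlugin(CommandProvider):
    def commands(self) -> Dict[str, Callable[..., str]]:
        return {"send_tweet": self.send_tweet}

    def send_tweet(self, text: str) -> str:
        # A real plugin would call tweepy here; the core never needs to know about Twitter.
        return "tweeted: {}".format(text)

def register_plugins(providers):
    # Core-side: collect commands from whichever plugins happen to be installed.
    registry = {}
    for provider in providers:
        registry.update(provider.commands())
    return registry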

katmai commented 1 year ago

(quoting @anonhostpi's reply above)

As you wish. I think you're making a mistake though; actually it's mistakes, but who's counting. Let me address your points first.

The issue with size and traffic is a non-issue once you figure out what the end game / goal is. That has not been set, and that's why things seem confusing and the requests feel overwhelming. Once a shape is formed, the requests should add up to polish that shape, not pull it in all directions and put strain on the core.

You do have the manpower. And I don't mean the core team, I mean the rest of the world that gathered around in excitement over this brand new thing, thinking they have the ability to run their own AI locally (which is currently a lie, because we're not doing anything but the same thing people did for Google when billions trained reCAPTCHA for free). Yes, it's costing the community money, because the community thinks it is doing something meaningful when the core is rotten. By core I don't necessarily mean the core people, because I don't know them, but the core premise: that we're working with "intelligence". Although I have my doubts about the core people's intentions as well, since I saw that the prompts you put into the AI are not exactly what goes into asking the AI a question; they get another wrapper of pre-baked prompts every time you start, so you're in essence altering whatever people say every time they say it. That's a shady move, and that's what prompted me to stop in this endeavor altogether.

I've got nothing to say about the plugins part. I think you're doing yourself a disservice, as well as the people who are showing enthusiasm for this project.

With that said, I will show you a way forward in which you can naturally strengthen your position and rely on the enthusiasm and passion of the people who gathered around this project with renewed optimism:

Any other discussions about LLMs and vector databases should be on mute until that is figured out.

But for all of this to happen, you'd need honest people, with no agenda other than pure discovery and exploration.

Until these points can be addressed, you're just spinning on other people's hamster wheels, thinking you're making progress. You might make what you consider progress, but it will be within the confines that have been defined at the core. People love limits.

mahesh-ashom commented 1 year ago

(quoting @mossy426's step-by-step solution above)

What's in the callback URI?

github-actions[bot] commented 1 year ago

This issue has automatically been marked as stale because it has not had any activity in the last 50 days. You can unstale it by commenting or removing the label. Otherwise, this issue will be closed in 10 days.

github-actions[bot] commented 1 year ago

This issue was closed automatically because it has been stale for 10 days with no activity.