MineDojo / Voyager

An Open-Ended Embodied Agent with Large Language Models
https://voyager.minedojo.org/
MIT License

Bot fails to deposit items into chest, claims "no chest at coordinates, it is chest" #56

Closed: TimeLordRaps closed this issue 1 year ago

TimeLordRaps commented 1 year ago

Before submitting an issue, make sure you read the FAQ.md

I have read the FAQ.

Briefly describe your issue

Bot fails to deposit items into chest.

Error occurs in your program code: throw new Error(`No chest at (62.5, 52, -47.470345346655634), it is chest`) at line 17: await depositItemIntoChest(bot, position, uselessItems);

Please provide your python, nodejs, Minecraft, and Fabric versions here

Python 3.10.11, Node.js v16.14.0, Minecraft 1.19, fabric-0.14.18-1.19. I have all the exact mods except Better Respawn, since I want my bots to be able to handle death.

[If applicable] Please provide the Mineflayer and Minecraft logs, you can find the log under logs folder

Important lines from the Mineflayer logs:

2023-05-31 21:36:53,884 - mineflayer - INFO - Error: No chest at (62.5, 52, -47.470345346655634), it is chest
2023-05-31 21:36:53,884 - mineflayer - INFO -     at moveToChest (eval at evaluateCode (Voyager\voyager\env\mineflayer\index.js:256:19), <anonymous>:1382:15)
2023-05-31 21:36:53,885 - mineflayer - INFO -     at depositItemIntoChest (eval at evaluateCode (Voyager\voyager\env\mineflayer\index.js:256:19), <anonymous>:1331:11)
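For anyone hitting the same message, a quick way to confirm what is actually at those coordinates is to ask the mineflayer bot directly. The sketch below is purely illustrative and not part of Voyager; it assumes an already-connected mineflayer bot object, describeBlockAt is a hypothetical helper name, and it only relies on bot.blockAt from mineflayer and Vec3 from the vec3 package:

const { Vec3 } = require("vec3");

// Print what block really occupies the cell containing the (possibly fractional)
// coordinates reported in the error. `bot` must be an already-connected mineflayer bot.
// This is a diagnostic sketch, not Voyager's code; describeBlockAt is a made-up name.
function describeBlockAt(bot, x, y, z) {
  const pos = new Vec3(x, y, z).floored(); // snap to the containing block cell
  const block = bot.blockAt(pos);
  if (!block) {
    console.log(`Chunk at ${pos} is not loaded`);
  } else {
    console.log(`Block at ${pos} is "${block.name}"`);
  }
  return block;
}

// Example, using the coordinates from the log above:
// describeBlockAt(bot, 62.5, 52, -47.470345346655634);

If this reports "chest" while depositItemIntoChest still throws, the failure is likely in Voyager's own chest check rather than in the world state, which is consistent with the fix described below.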

[If applicable] Please provide the GPT conversations that are printed each round.

This has happened across many different attempts at this point, through multiple manual critiques, but I recall this problem also ending my first fully automatic run.

xieleo5 commented 1 year ago

I sincerely apologize for the inconvenience. During the codebase cleanup prior to the release, a small code segment related to the chest appeared to be unused, so I deleted it, which introduced this bug into the current codebase. Once I complete some final testing, I will push the updated code. I deeply regret the oversight and any trouble it may have caused. Additionally, I have discovered that this issue also affects the two community checkpoints. Thanks to @daswer123 and @swen128 for sharing the checkpoints, as it was through their shared checkpoints that I located this bug.

go-maple commented 1 year ago

@swen128 Bro, how did you export the OpenAI requests? I saw an openai_requests.json file shared in your checkpoint repo. Please tell me how to do it.

go-maple commented 1 year ago

@xieleo5 Bro, will you update the code?

xieleo5 commented 1 year ago

Hi, I just fixed the issue in the latest commit; you can resume and continue running your experiments.

go-maple commented 1 year ago

@xieleo5 Thanks, Bro. Can I use this code to resume and continue?

voyager = Voyager(
    azure_login=azure_login,
    openai_api_key=openai_api_key,
    env_wait_ticks=30,
    resume=True,
)

voyager.learn(reset_env=False)

Is it right?

TimeLordRaps commented 1 year ago

Exact code:

voyager = Voyager(
    mc_port=mc_port,
    openai_api_key=openai_api_key,
    env_wait_ticks=100,
    env_request_timeout=1800,
    max_iterations=150,
    resume=True,
    ckpt_dir='manual4',
    curriculum_agent_mode="manual",
    critic_agent_mode="manual",
    action_agent_task_max_retries=16,
)

voyager.learn(reset_env=True)

go-maple commented 1 year ago

@TimeLordRaps Thanks, Bro. I will try it.


It works now, thanks brother. Last time I always lost all of the inventories; now I found they can be loaded again.

Hahaha! I can check every task now; it feels like debugging every task. But I found a bug, let me check it tomorrow.

swen128 commented 1 year ago

@go-maple

@swen128 Bro, how did you export the OpenAI requests? I saw an openai_requests.json file shared in your checkpoint repo. Please tell me how to do it.

I used the tracing feature of LangChain to capture all requests to the API. https://python.langchain.com/en/latest/additional_resources/tracing.html

You can host the tracing backend locally using a docker-compose.yml like this:

version: '3'
services:
  langchain-frontend:
    image: notlangchain/langchainplus-frontend:latest
    ports:
      - 4173:4173
    environment:
      - BACKEND_URL=http://langchain-backend:1984
      - PUBLIC_BASE_URL=http://localhost:1984
      - PUBLIC_DEV_MODE=true
    depends_on:
      - langchain-backend
  langchain-backend:
    image: notlangchain/langchainplus:latest
    environment:
      - PORT=1984
      - LANGCHAIN_ENV=local
    ports:
      - 1984:1984
    depends_on:
      - langchain-db
  langchain-db:
    image: postgres:14.1
    ports:
      - 5432:5432
    volumes:
      - ./postgres:/var/lib/postgresql/data
    environment:
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_USER=postgres
      - POSTGRES_DB=postgres
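Assuming this file is saved as docker-compose.yml, running docker-compose up -d should start the stack; with the port mappings above, the tracing frontend is then reachable at http://localhost:4173 and the backend on port 1984.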

Then set the environment variables LANGCHAIN_TRACING and LANGCHAIN_HANDLER when running Voyager:

import os

os.environ["LANGCHAIN_TRACING"] = "true"
os.environ["LANGCHAIN_HANDLER"] = "langchain"

from voyager import Voyager

# ...

voyager = Voyager(
    azure_login=azure_login,
    openai_api_key=openai_api_key,
)

voyager.learn()