All-Hands-AI / OpenHands


[Bug]: Error when trying to pull all docker evaluation containers (SWEBench) #4242

Closed AlexCuadron closed 1 month ago

AlexCuadron commented 1 month ago

Is there an existing issue for the same bug?

Describe the bug

I tried to pull all the Docker evaluation containers with `evaluation/swe_bench/scripts/docker/pull_all_eval_docker.sh instance lite`, as described in the documentation, but I ran into the following error:

❯ ./evaluation/swe_bench/scripts/docker/pull_all_eval_docker.sh instance lite
Pulling images from ./evaluation/swe_bench/scripts/docker/all-swebench-lite-instance-images.txt
Pulling docker images for [instance] level
Pattern: sweb.base\|sweb.env\|sweb.eval
Image file: ./evaluation/swe_bench/scripts/docker/all-swebench-lite-instance-images.txt
Pulling lite/sweb.base.x86_64:latest into sweb.base.x86_64:latest
Error response from daemon: pull access denied for lite/sweb.base.x86_64, repository does not exist or may require 'docker login': denied: requested access to the resource is denied
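For context, the error suggests the script is treating the evaluation level (`lite`) as a Docker Hub namespace when it builds the pull target, producing the nonexistent repository `lite/sweb.base.x86_64`. A minimal sketch of the kind of name mapping the pull loop needs (the `xingyaoww` namespace and the `map_image` helper are assumptions for illustration, not the actual script contents):

```shell
#!/usr/bin/env sh
# Hypothetical helper: turn a line from the image list file into a
# fully qualified pull source plus a short local tag.
# NAMESPACE is an assumption; the real script may use a different one.
NAMESPACE="xingyaoww"

map_image() {
    line="$1"                 # e.g. "sweb.base.x86_64:latest" or "lite/sweb.base.x86_64:latest"
    short="${line##*/}"       # strip any stray namespace prefix
    echo "${NAMESPACE}/${short} ${short}"
}

map_image "lite/sweb.base.x86_64:latest"
# A pull loop would then do: docker pull "$src" && docker tag "$src" "$short"
```

The key point is that the pull source must carry a registry namespace that actually exists, while the local tag can stay short.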

Current OpenHands version

0.9.7

Installation and Configuration

git clone https://github.com/AlexCuadron/OpenHands.git
cd OpenHands
git remote add upstream git@github.com:All-Hands-AI/OpenHands.git
git remote -v
git fetch upstream
git checkout main
git merge upstream/main
git push origin main
conda create --name open-hands python=3.11 conda-forge::nodejs conda-forge::poetry
conda activate open-hands
brew install netcat
pip install git+https://github.com/OpenDevin/SWE-bench.git@7b0c4b1c249ed4b4600a5bba8afb916d543e034a
make build
poetry self update
make build
make setup-config
Setting up config.toml...
Enter your workspace directory (as absolute path) [default: ./workspace]:
Enter your LLM model name, used for running without UI. Set the model in the UI after you start the app. (see https://docs.litellm.ai/docs/providers for full list) [default: gpt-4o]: o1-mini
Enter your LLM api key:
Enter your LLM base URL [mostly used for local LLMs, leave blank if not needed - example: http://localhost:5001/v1/]:
Enter your LLM Embedding Model
Choices are:
  - openai
  - azureopenai
  - Embeddings available only with OllamaEmbedding:
    - llama2
    - mxbai-embed-large
    - nomic-embed-text
    - all-minilm
    - stable-code
    - bge-m3
    - bge-large
    - paraphrase-multilingual
    - snowflake-arctic-embed
  - Leave blank to default to 'BAAI/bge-small-en-v1.5' via huggingface
>
Config.toml setup completed.

export DEBUG=1

Model and Agent

Operating System

Kubuntu 22.04.4 LTS x86_64

Reproduction Steps

Run: evaluation/swe_bench/scripts/docker/pull_all_eval_docker.sh instance lite
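When reproducing, a quick way to tell a naming bug apart from an authentication problem is to probe the registry with `docker manifest inspect`, which checks whether the image reference resolves without downloading any layers. A small sketch (the `probe_image` helper is hypothetical; `DOCKER` is parameterized only so the logic is easy to test):

```shell
#!/usr/bin/env sh
# Hypothetical probe: report whether a registry image reference resolves.
# `docker manifest inspect` queries the registry without pulling layers.
DOCKER="${DOCKER:-docker}"

probe_image() {
    if "$DOCKER" manifest inspect "$1" >/dev/null 2>&1; then
        echo "ok: $1"
    else
        echo "missing or denied: $1"
    fi
}

probe_image "lite/sweb.base.x86_64:latest"
```

For the image name in the error above, the probe fails the same way `docker pull` does, which points at the constructed name rather than missing credentials.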

Logs, Errors, Screenshots, and Additional Context

No response

AlexCuadron commented 1 month ago

Fixed in #4244