Issue: Setup and Demo Issues

Instructions and Setup

I've followed the instructions in the Quick Setup guide, but I ran into a few issues along the way. I successfully cloned the API-Miner repository and installed the required dependencies, but I hit difficulties while setting up the environment, downloading the models, and booting up the demo.

Following along the instructions in Quick Setup:
:white_check_mark: 1) Clone API-Miner into your local system
:white_check_mark: 2) Load your dataset
```shell
wget https://github.com/APIs-guru/openapi-directory/blob/main/APIs/openai.com/1.2.0/openapi.yaml
ls -lrt data/raw/
total 408
-rw-r--r-- 1 gitpod gitpod 414736 Aug  7 21:24 openapi.yaml
```
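One thing worth double-checking here: `wget` on a github.com `.../blob/...` URL fetches the rendered HTML page, not the raw YAML file. A small helper (illustrative only, not part of API-Miner) that rewrites such a URL to its raw.githubusercontent.com equivalent:

```python
def github_blob_to_raw(url: str) -> str:
    """Rewrite a github.com 'blob' page URL to its raw-content URL."""
    return url.replace(
        "https://github.com/", "https://raw.githubusercontent.com/", 1
    ).replace("/blob/", "/", 1)

print(github_blob_to_raw(
    "https://github.com/APIs-guru/openapi-directory/blob/main/APIs/openai.com/1.2.0/openapi.yaml"
))
```

Fetching the rewritten URL should yield the actual `openapi.yaml` content rather than a GitHub HTML page.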
:warning: 3) Set up your environment
:white_check_mark: pip install all dependencies
:warning: load WordNet library
```shell
gitpod ~ $ ls -lrt ~/nltk_data/
total 0
drwxr-xr-x 2 gitpod gitpod 25 Aug 10 07:42 corpora
```
:eyes: Observation: By default, the data is downloaded into the `~/nltk_data` directory, but the instructions don't specify a download location.
:white_check_mark: Download BERT-base uncased model
Source: https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip
:eyes: Observation: Successfully matched the directory structure depicted in the provided image.
:warning: Download sentence bert model (MiniLM-L6-H384-uncased)
Source A: https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2/tree/main
Source B: https://huggingface.co/nreimers/MiniLM-L6-H384-uncased/tree/main
:eyes: Observation: Neither directory structure aligns with the provided image.
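To pin down exactly which files are missing relative to the expected layout, a quick sanity check like the following can help. The file names and the model path below are illustrative (typical of Hugging Face checkouts) — they should be adjusted to whatever the repo's image actually shows:

```python
from pathlib import Path

# Hypothetical expected files for a downloaded model directory; replace with
# the names shown in the repo's reference image.
EXPECTED = {"config.json", "vocab.txt"}

def missing_model_files(model_dir: str) -> set:
    """Return the expected file names not present in model_dir."""
    d = Path(model_dir)
    present = {p.name for p in d.iterdir()} if d.is_dir() else set()
    return EXPECTED - present

# Path is a placeholder; a missing directory reports everything as missing.
print(sorted(missing_model_files("models/MiniLM-L6-H384-uncased")))
```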
:warning: Boot up demo
```shell
python demo.py
Traceback (most recent call last):
  File "/workspace/api-miner/demo.py", line 7, in <module>
    from experiments.utils import transform_endpoint_name_to_features, transform_specs_to_features, retrieve_endpoints_from_database, normalize_scores
ModuleNotFoundError: No module named 'experiments'
```
:eyes: Observation: The code structure is inconsistent with the docs; the mentioned functions are not under the `experiments` package as expected.
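Since the functions are not where the docs say, a throwaway helper (my own, not part of the repo) that scans the tree for their definitions can locate the right import path:

```python
import pathlib
import tempfile

def find_definition(name: str, root: str) -> list:
    """Return the .py files under `root` that define a function called `name`."""
    hits = []
    for path in pathlib.Path(root).rglob("*.py"):
        try:
            if f"def {name}(" in path.read_text(encoding="utf-8"):
                hits.append(str(path))
        except OSError:
            pass
    return sorted(hits)

# Tiny self-demo on a throwaway tree; against the real checkout you would run
# e.g. find_definition("normalize_scores", "/workspace/api-miner").
with tempfile.TemporaryDirectory() as tmp:
    pathlib.Path(tmp, "utils.py").write_text("def normalize_scores(x):\n    return x\n")
    found = find_definition("normalize_scores", tmp)

print(len(found))
```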
:hammer: Solution attempt: revised the script to locate the functions.
```python
# from experiments.utils import transform_endpoint_name_to_features, transform_specs_to_features, retrieve_endpoints_from_database, normalize_scores
# from experiments.setup import initialize_demo
from api_miner.data_processing.utils import testing_func
from api_miner.data_processing.utils import transform_endpoint_name_to_features, transform_specs_to_features, retrieve_endpoints_from_database, normalize_scores
from tests.setup import initialize_demo
from experiments.sample_generator import MaskedSpecGenerator, MangledSpecGenerator, EndpointNameGenerator, UserStudySpecGenerator
```
However, the problem persists, as the `setup` and `sample_generator` scripts mentioned above were not found in the repository.
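Until the layout is settled, one hedged workaround is to try the documented import path first and fall back to the observed one. Sketched below with `importlib`; the second demo call uses a stdlib module as a stand-in, since `experiments.setup` and `tests.setup` only exist inside the repo:

```python
import importlib

def first_importable(*module_names):
    """Import and return the first module in `module_names` that resolves."""
    for name in module_names:
        try:
            return importlib.import_module(name)
        except ImportError:
            continue
    raise ImportError(f"none of {module_names} could be imported")

# In demo.py this could be, e.g.:
#   setup = first_importable("experiments.setup", "tests.setup")
# Stdlib demo: the first name is deliberately missing, so it falls back.
mod = first_importable("experiments.setup", "json")
print(mod.__name__)
```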