mikeizbicki / cmc-csci143

big data course materials

Port issue when running `load_tweets_parallel.sh` for indexes homework #534

Closed: adamzterenyi closed this issue 6 months ago

adamzterenyi commented 6 months ago

Hello @mikeizbicki,

I'm running into an error when I run `load_tweets_parallel.sh` and then cat the output:

$ nohup ./load_tweets_parallel.sh &
$ cat nohup.out

The above returns a long error message indicating that my ports are wrong. I changed my ports, double- and triple-checking them in my `docker-compose.yml`, `load_denormalized.sh`, and `load_tweets_parallel.sh` files, but the error still references the old ports (1362 and 1363), which I am no longer using (I now use 41362 and 41363 instead).
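For reference, here is roughly what I expect the relevant part of my `docker-compose.yml` to look like; the service names and the container-side port 5432 below are placeholders I'm using for illustration, not copied from my actual file:

services:
  pg_normalized:
    ports:
      - "41362:5432"   # host port : container port
  pg_denormalized:
    ports:
      - "41363:5432"

My understanding is that already-running containers keep the port mappings they were created with, so the old 1362/1363 mappings could persist until the containers are recreated.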

I am wondering if this is because the data in my volumes needs to be reset, but I haven't been able to delete those directories myself.

In the homework, you say: "Hint: If you need help deleting the data for whatever reason, let me know and I can delete it for you as a root user." First, I'd like to take you up on this offer and am wondering if you could delete the pg_denormalized and pg_normalized volumes for me. Second, I am curious if you have any other suggestions as to how I could/should move forward. Thanks!
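(For what it's worth, my guess about why I can't delete the folders myself: assuming the Postgres containers created these data directories, they are presumably owned by the container's postgres user rather than by my account, so a plain rm fails with permission errors. Something like

$ ls -ld ~/bigdata/pg_normalized ~/bigdata/pg_denormalized

should show the ownership, though the exact directory names and paths depend on the volume mounts in my compose file.)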

abizermamnoon commented 6 months ago

Can you share your error message and your docker-compose.yml file?

mikeizbicki commented 6 months ago

I have deleted these folders by running

sudo rm -rf /home/aterenyi25/bigdata/pg_denormalized /home/aterenyi25/bigdata/pg_normalized_batch
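Once those directories are gone, you will likely want to recreate your containers so the new port mappings take effect and Postgres can initialize fresh data directories. Roughly (adjust for your own compose file; this is just a sketch):

$ docker-compose down
$ docker-compose up -d --build

and then rerun load_tweets_parallel.sh.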