adelnehme opened this issue 4 years ago
Hi @adelnehme,
In general, the POC was pretty smooth in terms of the syntax, but I'm having a few issues with input dependencies. You can follow this Jupyter notebook for the content, along with the related data and assets needed for the initial POC, in PR #2.
A few things I would still like to confirm and discuss:
Yes indeed - all files outputted in Colab will live in Colab's temporary file navigation system - not in GitHub.
I'd like to include a `requirements.txt` file for teaching `pip install`, a `database.db` file for teaching querying against a localhost database, a `.zip` file to teach how to unzip on the command line, etc. How can I confirm that they can be called from the notebook?
Of course! As long as you can import them correctly into the Colab session - feel free to add whatever data/assets you want!
Not really - we can always do a part II for using cron 😄
Fair enough - we'll have a much clearer idea of what to include once we reach session specs 🚀
Hi @sunwsusan :wave:
I'm really happy things turned out okay in Colab 🎉
I've updated your comment above with answers to your questions! Given that we have the green light to develop the session, I suggest confirming the final date and time of the session with Kelsey 😄
Cheers,
Adel
Hi @adelnehme,
Thank you for the fast response. One last open issue, though. While I'm able to bring the notebook from GitHub into Colab (via Google Drive) using the Chrome extension button, I haven't been able to pull the data from GitHub into my Colab content folder using the same extension.
I can, of course, clone the repo, manually upload its contents to Google Drive, and link the data and assets to the Colab notebook by hand, but that doesn't seem like a reasonable setup for students to go through.
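For reference, the clone route can at least be collapsed into a single Colab cell. The sketch below substitutes a throwaway local repository for the real GitHub remote, purely so the commands are runnable offline; in Colab you would clone the actual repo URL instead, and the data folder arrives with it.

```shell
# Offline sketch of the "clone the repo" workaround. A throwaway local
# repository stands in for the real GitHub remote; in a Colab cell you would
# run `!git clone <repo URL>` instead.
git init -q sandbox-origin
mkdir sandbox-origin/data
echo 'title,tempo' > sandbox-origin/data/Spotify_MusicAttributes.csv
git -C sandbox-origin add -A
git -C sandbox-origin -c user.email=ci@example.com -c user.name=ci commit -qm 'init'
git clone -q sandbox-origin sandbox-clone       # one command, data included
ls sandbox-clone/data
```

This avoids the manual Google Drive upload, though it does pull the whole repo rather than a single asset.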
Am I missing a key step here?
Best, Susan
Hi @sunwsusan :wave:
To get data/images/assets hosted on the GitHub repository into the Colab file system, all you need to do is grab the raw link of the asset you want to import and use that link when making your import. For example, you can import the `Spotify_MusicAttributes.csv` file in the `data` folder by running:

```shell
!curl -o Spotify_MusicAttributes.csv -L https://raw.githubusercontent.com/datacamp/shell-notebook-sandbox/sunwsusan/proof_of_concept/data/Spotify_MusicAttributes.csv
```
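To make curl's `-o` and `-L` flags concrete without touching the network, here is a self-contained stand-in that downloads from a `file://` URL (the sample CSV and its columns are made up); against the real repo you would point `-L` at the raw.githubusercontent.com link instead.

```shell
# curl accepts file:// URLs, so we can demo -o (choose the output file name)
# and -L (follow redirects) completely offline. Hypothetical sample data.
printf 'title,danceability\nHello,0.5\n' > Spotify_sample.csv
curl -s -o downloaded.csv -L "file://$PWD/Spotify_sample.csv"
head -n 1 downloaded.csv                        # → title,danceability
```

In a Colab cell the `curl` line would again be prefixed with `!`.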
I hope that answers your question! If not, I'm happy to discuss this further.
Cheers,
Adel
Ah, that clears things up! I didn't know the final repo for the course would be a public one. I've been pulling files from the AWS buckets used for the DataCamp online course Intermediate Shell.
In this case, I'm feeling pretty comfortable with the POC part of the file access needed for this live training course. Happy to talk next steps!
Hi @sunwsusan :wave:
Thanks for the time today and I'm really stoked about a potential shell session. Following up on today's discussion - this is a GitHub repository for you to experiment with different shell syntax and see how it fits in a live session.
At the end of the README file, there are instructions on authoring the notebook. For the sake of this exercise, feel free to try out the different syntax you may want to teach in the session on any sample data you see fit - we just want to make sure it works.
The more exhaustive we are in this testing phase, the more confident we can be about proceeding with the training.
Let me know if you have any comments or questions!
Cheers and stay safe 😷
Adel