recodehive / Scrape-ML

For new data generation in the Semi-supervised-sequence-learning project, we have written a Python script to fetch 📊 data from the IMDb website 🌐 and convert it into .txt files.
https://scrape-ml.streamlit.app/
MIT License
85 stars 116 forks

Issue in code [scrapping.py] #14

Closed kairveeehh closed 2 months ago

kairveeehh commented 4 months ago

@sanjay-kv In the Amazon scrapping folder, if we analyze the code carefully, there is a bug to be fixed: the name_list variable is referenced inside the product_listing function without being passed as an argument or defined as a global variable within the function's scope. I would like to fix this issue.
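For reference, here is a minimal sketch of the suggested fix, passing name_list into the function instead of relying on an outer-scope variable. The parser and selectors below are illustrative assumptions, not the actual scrapping.py code.

```python
# Hypothetical sketch of the fix: pass name_list into product_listing
# explicitly rather than referencing a variable defined outside the function.
# The tag/class selector here is an assumption for illustration only.

from bs4 import BeautifulSoup


def product_listing(page_html: str, name_list: list) -> list:
    """Append product names found in page_html to name_list and return it."""
    soup = BeautifulSoup(page_html, "html.parser")
    # Assumed selector for product titles; the real scrapping.py may differ.
    for tag in soup.find_all("span", class_="a-size-medium"):
        name_list.append(tag.get_text(strip=True))
    return name_list


# The caller owns the list and passes it in explicitly:
# name_list = []
# name_list = product_listing(html, name_list)
```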

Please assign the issue to me under GSSoC'24.

Blackphoenix-15 commented 4 months ago

@sanjay-kv In scrapping.py, I found that the variable name_list is not defined within the product_listing function. Since name_list is defined outside the function, it is not accessible within the function's scope. I would like to fix this, so please assign this issue to me.

sanjay-kv commented 4 months ago

Assigned to you, @kairveeehh. Issues are assigned to only one person on a first-come, first-served basis. Others get a chance if the issue goes stale and the assignee is inactive for 5+ days. If you would like to work on more issues, create a new issue and I will add a label and assign it to you.

varshithar12 commented 3 months ago

I agree to follow this project's Code of Conduct. I'm a GSSoC'24 contributor and I want to work on this issue.

khushikunte commented 3 months ago

Hi, I would like to work on this issue. Please assign it to me under GSSoC'24.

lassmara commented 3 months ago

Please assign me this issue.

ananyabansal16 commented 3 months ago

If this issue is still pending, I can take it up under GSSoC'24. Let me know. :)

Taranpreet10451 commented 3 months ago

Can you please assign this to me?

github-actions[bot] commented 2 months ago

This issue has been automatically closed because it has been inactive for more than 30 days. If you believe this is still relevant, feel free to reopen it or create a new one. Thank you!