pydatadelhi / talks

Talks at PyData Delhi Meetups

How to do Data Science with larger than memory data using Dask? #132

Open arnabbiswas1 opened 3 years ago

arnabbiswas1 commented 3 years ago

As Data Scientists, we face a few challenges when dealing with large volumes of data:

  1. Popular Python libraries like NumPy & Pandas are not designed to scale beyond a single processor/core
  2. NumPy, Pandas, and Scikit-Learn are not designed to scale beyond a single machine
  3. If the data is bigger than RAM, these libraries can't be used
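The third challenge is the central one: a pandas DataFrame must fit entirely in RAM. A minimal sketch of the manual workaround (chunked, out-of-core processing with plain pandas; the file and column names here are made up for illustration) shows the bookkeeping that a library like Dask automates:

```python
import os
import tempfile

import pandas as pd

# A small CSV standing in for a file too large to load at once
# (hypothetical example data, not from the talk).
path = os.path.join(tempfile.mkdtemp(), "big.csv")
pd.DataFrame({"x": range(1_000)}).to_csv(path, index=False)

# Out-of-core pattern: stream the file in fixed-size chunks and
# combine partial results, so only one chunk is in memory at a time.
total = 0
for chunk in pd.read_csv(path, chunksize=100):
    total += chunk["x"].sum()

print(total)  # same answer as loading the whole file at once
```

This works for simple aggregations, but the splitting, partial-result tracking, and recombination must be written by hand for every operation, which is exactly the kind of logic Dask's collections handle for you.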

In this session, I will discuss how these challenges can be addressed using the parallel computing library Dask.

The talk is divided into two portions:

  1. Understanding the challenges of large data (will be delivered through a presentation)
     a. Fundamentals of computer architecture (with a focus on the computing unit & memory unit)
     b. Why parallelism is necessary in a multi-core architecture
     c. Challenges with large data (data that doesn't fit in RAM) & how to address them
     d. Introduction to distributed computing

  2. How does Dask handle large data? (code walk-through)
     a. What is Dask and why is it needed?
     b. How does Dask parallelize jobs across cores/processors?
     c. How does Dask handle larger-than-memory data using out-of-core computing and distributed computing?

Basic knowledge of Python-based Data Science libraries like Pandas, NumPy, and Scikit-Learn is expected.

45 minutes to 1 hour. This talk can also be extended to a 2-hour workshop.

https://speakerdeck.com/arnabbiswas1/scale-up-your-data-science-work-flow-using-dask

Yes.

https://github.com/arnabbiswas1/dask_workshop

https://arnab.blog/about/

Yes

This talk (45 minutes) was delivered recently to the Bangalore Python User Group, BangPypers. Here is the recording for your reference: Link

MSanKeys963 commented 3 years ago

Hi @arnabbiswas1. Thanks for the proposal. @shagunsodhani please have a look.

shagunsodhani commented 3 years ago

Hey @arnabbiswas1 ! Thanks for proposing the talk. The content looks good. Best of luck :)

arnabbiswas1 commented 3 years ago

@MSanKeys963 @shagunsodhani

As mentioned in the proposal, this talk can be delivered as-is (45 minutes), or it can be extended to 1.5-2 hours focusing on the usage of Dask for Data Science use cases (existing content + Dask Machine Learning + the Dask Dashboard for debugging).

Please let me know your thoughts.