dmlc / dgl

Python package built to ease deep learning on graph, on top of existing DL frameworks.
http://dgl.ai
Apache License 2.0
13.58k stars · 3.02k forks

Running TODOs #23

Closed jermainewang closed 6 years ago

jermainewang commented 6 years ago

Hi all,

Let's use this thread to keep track of all the major TODOs before our first major milestone.

High priority: tasks that have a major contributor in charge.

Low priority: tasks that anyone can pick up when they have no high-priority work at the moment.

Applications

zzhang-cn commented 6 years ago

Some thoughts:

- [ ] Extend networkx's dict-of-dicts to store tensors
- [ ] Cache the adjacency matrix (for the reduce function)
- [ ] Establish scale testing
- [ ] Think about how to scale out to multiple cards
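The first item can be illustrated with a small sketch. networkx already stores edge attributes in a nested dict-of-dicts, so tensors can sit directly in those attribute dicts; the attribute name `"feat"` below is purely illustrative, not an API defined by networkx or DGL.

```python
# Sketch: attaching a tensor to a networkx edge via its dict-of-dicts storage.
import networkx as nx
import numpy as np

g = nx.DiGraph()
g.add_edge(0, 1)

# Edge attributes are plain dict values, so a tensor works as-is.
# The attribute name "feat" is a hypothetical convention.
g[0][1]["feat"] = np.ones(4)

# The underlying adjacency structure is dict-of-dict-of-dict;
# the tensor is reachable through it.
print(g.adj[0][1]["feat"].shape)  # (4,)
```

The open question behind the TODO is efficiency: per-edge dict entries fragment tensor storage, which is why caching a consolidated adjacency matrix (the next item) matters.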

jermainewang commented 6 years ago

Incorporated ZZ's advice.

BarclayII commented 6 years ago

I have gone through the nvGraph API, and I doubt we can use the SrSPMV function in the reducers: their graph structure and SrSPMV operate only on scalars.
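For context, the reason SpMV is attractive for reducers at all: a "sum" reduce over incoming messages is mathematically a sparse matrix product of the adjacency matrix with the node feature matrix. A minimal sketch with SciPy (not DGL's or nvGraph's actual implementation; the toy graph and feature values are made up):

```python
# Sketch: sum-reduction over in-neighbors expressed as sparse matmul.
import numpy as np
import scipy.sparse as sp

# Toy graph with edges 0->2 and 1->2, so node 2 aggregates from 0 and 1.
src = np.array([0, 1])
dst = np.array([2, 2])
n = 3

# adj[i, j] = 1 iff there is an edge j -> i.
adj = sp.coo_matrix((np.ones(len(src)), (dst, src)), shape=(n, n)).tocsr()

feat = np.array([[1.0, 2.0],
                 [3.0, 4.0],
                 [0.0, 0.0]])  # one feature row per node

# Row i of the product is the sum of features of i's in-neighbors.
reduced = adj @ feat
print(reduced[2])  # [4. 6.]
```

The catch noted above is that this only covers scalar edge weights and element-wise semiring ops; reducers over tensor-valued messages need more than what a scalar SrSPMV exposes.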

Also, for keeping track of TODOs I recommend Trello. It can set deadlines, assign people, and send email reminders.

zzhang-cn commented 6 years ago

Also, let's re-organize the folders and make a more complete listing of examples. We have:

I might have missed some. Let's put a README in the examples folder, with links to the appropriate papers.

jermainewang commented 6 years ago

Closing, as we are using Trello to keep track of progress.

kitaev-chen commented 5 years ago

I think "Dynamic graph (in the generative_model branch)" will be a genuinely competitive feature compared with other graph libraries, since graph-based generative models and Gaussian processes are likely to become very prevalent.

mufeili commented 5 years ago

Thanks a lot for the suggestion @kitaev-chen. I'm personally a big fan of generative models, and yes, we are likely to incorporate more examples of generative models in the future. As this issue is closed now, you may want to post your suggestion at #450 instead. Also, let us know if there are models you are particularly interested in.