huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Add DALL-E: Zero-Shot Text-to-Image Generation #10935

Open slavakurilyak opened 3 years ago

slavakurilyak commented 3 years ago

🚀 Feature request

Please add the DALL·E model to Hugging Face's Transformers library.

  1. Announcement
  2. Abstract
  3. Paper
  4. Code:

Motivation

DALL·E is a 12-billion parameter version of GPT-3 trained to generate images from text descriptions, using a dataset of text–image pairs

We (OpenAI) decided to name our model using a portmanteau of the artist Salvador Dalí and Pixar's WALL·E.
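To make the description above concrete: per the paper, DALL·E first compresses images into a grid of discrete codebook tokens with a dVAE, then trains a transformer to model text tokens followed by image tokens as one autoregressive sequence. The following is a minimal toy sketch of that generation loop, not the real model; the vocabulary sizes and the 32×32 token grid are taken from the paper, while `toy_next_image_token` is a hypothetical stand-in that samples uniformly instead of running the 12-billion-parameter transformer.

```python
import random

TEXT_VOCAB = 16384      # BPE text vocabulary size (per the paper)
IMAGE_VOCAB = 8192      # dVAE codebook size (per the paper)
IMAGE_TOKENS = 32 * 32  # images are encoded as a 32x32 grid of tokens

def toy_next_image_token(context):
    """Hypothetical stand-in for the transformer: sample one image token.

    The real model would condition on `context` (text tokens plus the
    image tokens generated so far) and sample from its predicted logits.
    """
    return random.randrange(IMAGE_VOCAB)

def generate_image_tokens(text_tokens):
    """Autoregressively extend the text prompt with image tokens."""
    sequence = list(text_tokens)
    image_tokens = []
    for _ in range(IMAGE_TOKENS):
        tok = toy_next_image_token(sequence)
        image_tokens.append(tok)
        sequence.append(tok)  # each new token becomes part of the context
    return image_tokens

# A real system would pass the 1024 tokens to the dVAE decoder for pixels.
tokens = generate_image_tokens([1, 2, 3])
print(len(tokens))  # 1024
```

The point of the sketch is the single shared sequence: text and image tokens live in one stream, so image generation is ordinary next-token prediction once the text prefix is in place.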

github-actions[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

krrishdholakia commented 3 years ago

+1

farzanehnakhaee70 commented 1 year ago

Has DALL·E mini been added to Transformers yet? Currently, DalleBart is not recognized in the transformers library.

julien-c commented 1 year ago

cc @patil-suraj who's currently working on making it easier to use from transformers

Tanman2001 commented 8 months ago

There are dozens of DALL-E models currently listed on the Hugging Face site. Unless this is a specific variant/implementation that has yet to be added, it seems this issue can be closed.