huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

RuntimeError: Unable to build docs using doc-builder #26416

Closed: ENate closed this issue 11 months ago

ENate commented 1 year ago

I am trying to build the docs with doc-builder using the following command:

doc-builder build transformers docs/source/en --build_dir ~/tmp/test-build 

but got the following error:

Traceback (most recent call last):
  File "/home/miniconda3/envs/transformers/bin/doc-builder", line 8, in <module>
    sys.exit(main())
  File "/home/miniconda3/envs/transformers/lib/python3.9/site-packages/doc_builder/commands/doc_builder_cli.py", line 47, in main
    args.func(args)
  File "/home/miniconda3/envs/transformers/lib/python3.9/site-packages/doc_builder/commands/build.py", line 96, in build_command
    build_doc(
  File "/home/miniconda3/envs/transformers/lib/python3.9/site-packages/doc_builder/build_doc.py", line 405, in build_doc
    sphinx_refs = check_toc_integrity(doc_folder, output_dir)
  File "/home/miniconda3/envs/transformers/lib/python3.9/site-packages/doc_builder/build_doc.py", line 460, in check_toc_integrity
    raise RuntimeError(
RuntimeError: The following files are not present in the table of contents:
- gpt_neo
- t5
- qdqbert
- videomae
- herbert
- graphormer
- reformer
- mvp
- visual_bert
- dinat
- esm
- ernie
- camembert
- openai-gpt
- bart
- trocr
- informer
- audio-spectrogram-transformer
- whisper
- unispeech-sat
- deberta
- deta
- xlsr_wav2vec2
- xglm
- speecht5
- swin
- gptsan-japanese
- sew
- mobilenet_v1
- layoutlmv3
- mobilevitv2
- align
- git
- perceiver
- xlm-v
- efficientnet
- wavlm
- encoder-decoder
- blenderbot-small
- sew-d
- xls_r
- vision-text-dual-encoder
- conditional_detr
- layoutlmv2
- flava
- trajectory_transformer
- pix2struct
- data2vec
- dinov2
- unispeech
- mctct
- bark
- xlm-prophetnet
- ernie_m
- clipseg
- tvlt
- decision_transformer
- bert-generation
- m2m_100
- ul2
- hubert
- cpm
- clip
- marian
- codegen
- table-transformer
- rembert
- van
- roformer
- opt
- mobilenet_v2
- altclip
- gpt_neox
- mms
- pegasus_x
- big_bird
- nllb-moe
- lxmert
- beit
- deberta-v2
- megatron-bert
- pvt
- dpr
- detr
- rag
- upernet
- bridgetower
- gptj
- levit
- glpn
- xmod
- sam
- llama2
- xlm
- vitdet
- maskformer
- owlvit
- instructblip
- transfo-xl
- pegasus
- longt5
- mbart
- layoutxlm
- barthez
- vision-encoder-decoder
- dit
- flan-ul2
- byt5
- roc_bert
- mega
- encodec
- bert
- phobert
- ibert
- gpt_bigcode
- deformable_detr
- vit_msn
- markuplm
- convnextv2
- yoso
- gpt-sw3
- wav2vec2_phoneme
- focalnet
- donut
- regnet
- jukebox
- mluke
- realm
- autoformer
- bertweet
- chinese_clip
- swiftformer
- umt5
- auto
- t5v1.1
- poolformer
- dialogpt
- vit_hybrid
- xclip
- deplot
- canine
- resnet
- bloom
- groupvit
- fsmt
- bert-japanese
- squeezebert
- mra
- convbert
- biogpt
- xlm-roberta
- nystromformer
- cpmant
- nezha
- bit
- segformer
- rwkv
- bort
- longformer
- time_series_transformer
- blip-2
- mpnet
- pop2piano
- flan-t5
- musicgen
- speech_to_text_2
- distilbert
- tapas
- lilt
- blenderbot
- gpt_neox_japanese
- xlm-roberta-xl
- xlnet
- nllb
- speech_to_text
- vilt
- llama
- roberta-prelayernorm
- yolos
- megatron_gpt2
- plbart
- cvt
- speech-encoder-decoder
- mt5
- vivit
- idefics
- imagegpt
- flaubert
- vit
- bartpho
- clap
- mobilebert
- mgp-str
- efficientformer
- tapex
- switch_transformers
- electra
- mask2former
Add them to docs/source/en/_toctree.yml.

Any ideas on why this is happening? Any help will be appreciated. Thanks
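(For context, the local doc build flow described in the docs README is roughly the following; the two install steps here are the commonly documented ones and may differ slightly between versions of the repo:)

pip install git+https://github.com/huggingface/doc-builder
pip install -e ".[docs]"
doc-builder build transformers docs/source/en --build_dir ~/tmp/test-build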

Wauplin commented 1 year ago

Hi @ENate , I'm transferring the issue to the transformers repo as it seems you are trying to build the docs for this library (instead of huggingface_hub). Also cc @mishig25 who worked on the doc builder.

ENate commented 1 year ago

Hi @Wauplin, thanks. I was building the docs after forking the transformers repo.

mishig25 commented 1 year ago

As the error suggests:

Add them to docs/source/en/_toctree.yml.

Could you check whether the listed doc pages exist in docs/source/en/_toctree.yml?
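A quick way to check is to diff the model_doc pages on disk against the local: entries in the toctree. A minimal sketch (assuming PyYAML is installed and the script is run from the transformers repository root; the helper name collect_locals is just illustrative):

# Sketch: list model doc pages on disk that have no entry in _toctree.yml.
# Assumes it is run from the transformers repository root and PyYAML is installed.
from pathlib import Path

import yaml

DOC_DIR = Path("docs/source/en")
TOC_FILE = DOC_DIR / "_toctree.yml"


def collect_locals(node, found):
    """Recursively gather every 'local' value from the nested toctree structure."""
    if isinstance(node, dict):
        if "local" in node:
            found.add(node["local"])
        for value in node.values():
            collect_locals(value, found)
    elif isinstance(node, list):
        for item in node:
            collect_locals(item, found)


toc_entries = set()
collect_locals(yaml.safe_load(TOC_FILE.read_text()), toc_entries)

# Pages present on disk under model_doc/ (both .md and .mdx sources)
on_disk = {f"model_doc/{p.stem}" for p in (DOC_DIR / "model_doc").glob("*.md*")}

missing = sorted(on_disk - toc_entries)
print(f"{len(missing)} model doc pages are not referenced in _toctree.yml")
for page in missing:
    print(" -", page)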

ENate commented 1 year ago

Yes, I saw the doc pages in the

 docs/source/en/_toctree.yml

file. The docs are listed in the following manner:

- local: model_doc/<name-of-doc>
  title: <name>

mishig25 commented 1 year ago

By the way, you are not on Windows, are you?

ENate commented 1 year ago

Nope, I am using Ubuntu 22.04 LTS.

ENate commented 1 year ago

Hi @mishig25, did you manage to find out why this is happening on Ubuntu (UNIX)? Any ideas for a workaround? Thanks

ENate commented 1 year ago

Hi @mishig25, I did add the model docs to the file as follows:

- sections:
      - local: model_doc/mt5
        title: Vision Transformer
      - local: model_doc/vivit
        title: Video Vision Transformer
      - local: model_doc/idefics
        title: Idefics
      - local: model_doc/imagegpt
        title: ImageGPT
      - local: model_doc/flaubert
        title: FlauBERT
      - local: model_doc/vit
        title: ViLT
      - local: model_doc/bartpho
        title: BARTpho
      - local: model_doc/clap
        title: CLAP
      - local: model_doc/mobilebert
        title: MobileBERT
      - local: model_doc/mgp-str
        title: MGP-STR
      - local: model_doc/efficientformer
        title: EfficientFormer
      - local: model_doc/tapex
        title: TAPEX
      - local: model_doc/switch_transformers
        title: SwitchTransformers
      - local: model_doc/electra
        title: ELECTRA
      - local: model_doc/mask2former
        title: Mask2Former
  title: Text models

in the _toctree.yml file, but got the same problem on Ubuntu (when trying to run pip install -e ".[docs]"):


Traceback (most recent call last):
  File "/home/
/miniconda3/envs/transformers/bin/doc-builder", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/home/miniconda3/envs/transformers/lib/python3.11/site-packages/doc_builder/commands/doc_builder_cli.py", line 47, in main
    args.func(args)
  File "/home/miniconda3/envs/transformers/lib/python3.11/site-packages/doc_builder/commands/build.py", line 96, in build_command
    build_doc(
  File "/home/miniconda3/envs/transformers/lib/python3.11/site-packages/doc_builder/build_doc.py", line 405, in build_doc
    sphinx_refs = check_toc_integrity(doc_folder, output_dir)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/miniconda3/envs/transformers/lib/python3.11/site-packages/doc_builder/build_doc.py", line 460, in check_toc_integrity
    raise RuntimeError(
RuntimeError: The following files are not present in the table of contents:
[... same list of model doc pages as in the first traceback above ...]

Add them to docs/source/en/_toctree.yml.

Is there any way to make this work (so that I can successfully build the docs locally for development) before opening a PR? Thanks again
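(As an aside, if the goal is only to view the docs locally before opening a PR, recent versions of doc-builder also ship a preview subcommand that serves the docs instead of writing a build directory; the exact invocation may vary by version:)

doc-builder preview transformers docs/source/en/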

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.

ENate commented 1 year ago

I am still not able to resolve this issue.

ArthurZucker commented 1 year ago

Hey! If you want to open a PR and check how the doc renders, I would recommend opening the PR anyway, but in draft mode. Then ping me if the doc does not render; make sure you rebase on main, and then I can help you. It's hard without looking at any code! 🤗
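(For the rebase step, the usual flow is roughly the following, assuming the huggingface/transformers remote is named upstream and your-branch is the name of the PR branch:)

git fetch upstream
git rebase upstream/main
git push --force-with-lease origin your-branch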

ENate commented 1 year ago

Thanks. I did open the PR successfully. I am surprised that it is no longer possible to build and view the docs locally; I was able to do so a few weeks ago.

github-actions[bot] commented 11 months ago

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.