opening-up-chatgpt / opening-up-chatgpt.github.io

Tracking instruction-tuned LLM openness. Paper: Liesenfeld, Andreas, Alianda Lopez, and Mark Dingemanse. 2023. “Opening up ChatGPT: Tracking Openness, Transparency, and Accountability in Instruction-Tuned Text Generators.” In Proceedings of the 5th International Conference on Conversational User Interfaces. doi:10.1145/3571884.3604316.
https://opening-up-chatgpt.github.io/
Apache License 2.0

Add ChatGLM-6B and ChatGLM2-6B #37

Open mdingemanse opened 1 year ago

mdingemanse commented 1 year ago
  1. https://github.com/THUDM/ChatGLM-6B/blob/main/README_en.md

    ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dialogue. The model is trained on about 1T tokens of Chinese and English corpus, supplemented by supervised fine-tuning, feedback bootstrap, and reinforcement learning with human feedback. With only about 6.2 billion parameters, the model is able to generate answers that are in line with human preference.

  2. https://github.com/THUDM/ChatGLM2-6B

    ChatGLM2-6B is the second-generation version of the open-source bilingual (Chinese-English) chat model ChatGLM-6B. It retains the smooth conversation flow and low deployment threshold of the first-generation model, while introducing the following new features:
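
For reference, a minimal usage sketch (not part of the openness assessment itself): both models can be loaded through Hugging Face `transformers`, following the quickstarts in the linked READMEs. The checkpoint names `THUDM/chatglm-6b` / `THUDM/chatglm2-6b` and the `chat()` helper come from those READMEs; `trust_remote_code=True` is needed because the modelling code ships with the checkpoint.

```python
from transformers import AutoModel, AutoTokenizer

# Swap in "THUDM/chatglm2-6b" for the second-generation model.
model_id = "THUDM/chatglm-6b"

# trust_remote_code=True: the model implementation is bundled with the
# checkpoint rather than being part of the transformers library itself.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True).half().cuda()
model = model.eval()

# chat() keeps the multi-turn history as a list of (query, reply) pairs.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```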

mdingemanse commented 1 year ago

Added ChatGLM-6B. ChatGLM2-6B is yet to be added; perhaps it's best to replace the existing entry, since the first-generation model seems superseded?