This plugin integrates ChatGPT (or any OpenAI-compatible LLM) into TinyMCE, allowing you to generate realistic and creative text at the push of a button.
To use the plugin, install and activate it in TinyMCE. Once activated, a new "ChatGPT" button appears in the TinyMCE toolbar. Click it to open a dialog box where you can enter a request to ChatGPT, which then generates the requested text and inserts it into your editor.
ChatGPT can be used to generate a variety of text formats, including articles, blog posts, emails, letters, and more. It can also translate languages, write different kinds of creative content, and answer your questions in an informative way.
ChatGPT is a powerful tool that can help you improve your productivity and the quality of your work. Try it out today!
| :warning: WARNING |
| --- |
| Consider this plugin a prototype; it is not suitable for production deployment at this time. Use it only in controlled environments and at your own risk. |
This plugin is distributed as an external plugin; however, you can also download the `.js` file and upload it to your own host.
To install it, edit your TinyMCE init configuration as follows:
```js
tinymce.init({
  // 1. Add the plugin to the list of external plugins
  external_plugins: {
    chatgpt:
      "https://cdn.jsdelivr.net/gh/The-3Labs-Team/tinymce-chatgpt-plugin@2/dist/chatgpt.js",
  },

  // 2. Configure the ChatGPT plugin
  openai: {
    api_key: "sk-yourapikeyhere", // Your OpenAI API key
    model: "gpt-3.5-turbo",
    temperature: 0.5,
    max_tokens: 150,
    prompts: [
      "Translate from English to Italian",
      "Summarize",
      "Proofread",
      "Write a blog post about",
    ],
    // Optional: Add your custom LLM
    // baseUri: "https://your-llm-endpoint.com",
  },

  // 3. Add the ChatGPT button to the toolbar
  toolbar: ["chatgpt"],
});
```
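If you prefer to self-host the plugin file instead of loading it from the CDN, only the `external_plugins` URL changes. A minimal sketch, assuming you uploaded `chatgpt.js` to a hypothetical `/js/tinymce/` path on your own host:

```js
tinymce.init({
  selector: "textarea",
  // Hypothetical self-hosted path; adjust to wherever you uploaded chatgpt.js
  external_plugins: {
    chatgpt: "/js/tinymce/chatgpt.js",
  },
  openai: {
    api_key: "sk-yourapikeyhere",
    model: "gpt-3.5-turbo",
    temperature: 0.5,
    max_tokens: 150,
    prompts: ["Summarize", "Proofread"],
  },
  toolbar: ["chatgpt"],
});
```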
If you are using our TinyMCE Laravel Nova Package 4, you can configure it as follows:
```php
<?php

// ...
'init' => [
    // 1. Add the plugin to the list of external plugins
    'external_plugins' => [
        'chatgpt' => 'https://cdn.jsdelivr.net/gh/The-3Labs-Team/tinymce-chatgpt-plugin@2/dist/chatgpt.js',
    ],

    // 2. Configure the plugin
    'openai' => [
        'api_key' => 'sk-yourapikeyhere', // Your OpenAI API key
        'model' => 'gpt-3.5-turbo',
        'temperature' => 0.5,
        'max_tokens' => 150,
        'prompts' => [
            'Translate from English to Italian',
            'Summarize',
            'Proofread',
            'Write a blog post about',
        ],
        // Optional: Add your custom LLM
        // 'base_uri' => 'https://your-llm-endpoint.com',
    ],
],

// 3. Add the plugin to the toolbar
'toolbar' => ['chatgpt'],
// ...
```
The plugin now uses the `/chat/completions` endpoint from OpenAI, so the default `model` must be a chat model, like `gpt-3.5-turbo`. If you want to use the `/completions` endpoint instead, you can use the `baseUri` option to point at it, e.g. `https://api.openai.com/v1/completions`.
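For example, a hedged sketch of the relevant part of the configuration (the model name is an assumption for a completions-style model; everything else stays as in the installation example above):

```js
openai: {
  api_key: "sk-yourapikeyhere",
  model: "gpt-3.5-turbo-instruct", // assumption: a non-chat, completions-style model
  baseUri: "https://api.openai.com/v1/completions", // legacy completions endpoint
  // ...other options as in the installation example above
},
```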
If you have a custom LLM, you can use it by setting the `baseUri` option in the configuration; the plugin will send its requests to this endpoint to generate text.

```js
baseUri: "https://your-llm-endpoint.com"
```

Please note that the custom LLM must be OpenAI compatible and follow the same API as OpenAI. It must also be reachable from the client side, since requests are made directly from the browser.
Check the `demo-lm-studio.html` file for an example of how to use a custom LLM.
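As a rough sketch of what such a setup might look like (the URL and model name below are assumptions; LM Studio's local server commonly listens on `http://localhost:1234/v1`, but check `demo-lm-studio.html` and your own server for the exact values, including whether `baseUri` expects the full endpoint path or just the base URL):

```js
tinymce.init({
  selector: "textarea",
  external_plugins: {
    chatgpt:
      "https://cdn.jsdelivr.net/gh/The-3Labs-Team/tinymce-chatgpt-plugin@2/dist/chatgpt.js",
  },
  openai: {
    api_key: "lm-studio", // assumption: local servers often ignore the key, but a value is still supplied
    model: "local-model", // assumption: use whatever model name your local server exposes
    temperature: 0.5,
    max_tokens: 150,
    prompts: ["Summarize", "Proofread"],
    baseUri: "http://localhost:1234/v1/chat/completions", // assumption: local OpenAI-compatible endpoint
  },
  toolbar: ["chatgpt"],
});
```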