# TinyMCE ChatGPT Plugin


This plugin integrates ChatGPT (or any OpenAI-compatible LLM) into TinyMCE, allowing you to generate realistic and creative text at the push of a button.

*TinyMCE demo GIF*

To use the plugin, install and activate it in TinyMCE. Once activated, a new "ChatGPT" button appears in the TinyMCE toolbar. Click it to open a dialog box where you can enter a prompt; ChatGPT then generates the requested text and inserts it into your editor.

ChatGPT can generate a variety of text formats, including articles, blog posts, emails, letters, and more. It can also translate between languages, write different kinds of creative content, and answer your questions in an informative way.

ChatGPT is a powerful tool that can help you improve your productivity and the quality of your work. Try it out today!

## Features

- 🤖 Works with OpenAI and any OpenAI-compatible custom LLM
- ⚙️ Supports custom prompts defined by you!
- 🧑‍🎨 Generates realistic and creative text at the push of a button
- 🧬 Can generate a variety of text formats
- 🈷️ Can translate between languages
- 🙋 Can answer your questions in an informative way

> ⚠️ **WARNING**: Consider this plugin a prototype, not suitable for production deployment at this time. In particular, this setup places your OpenAI API key in client-side code, where visitors can read it. Use it only in controlled environments and at your own risk.

## Requirements

- TinyMCE 6 or 7 (or later)
- OpenAI API key (available from the OpenAI platform)
- (Optional) Custom LLM API

## Installation

This plugin ships as a TinyMCE external plugin; alternatively, you can download the .js file and host it yourself.

To install it, edit your TinyMCE init configuration as follows:

```js
tinymce.init({
  // 1. Add the plugin to the list of external plugins
  external_plugins: {
    chatgpt:
      "https://cdn.jsdelivr.net/gh/The-3Labs-Team/tinymce-chatgpt-plugin@2/dist/chatgpt.js",
  },

  // 2. Configure the ChatGPT plugin
  openai: {
    api_key: "sk-yourapikeyhere", // Your OpenAI API key
    model: "gpt-3.5-turbo",
    temperature: 0.5,
    max_tokens: 150,
    prompts: [
      "Translate from English to Italian",
      "Summarize",
      "Proofread",
      "Write a blog post about",
    ],
    // Optional: Add your custom LLM
    // baseUri: "https://your-llm-endpoint.com",
  },

  // 3. Add the ChatGPT button to the toolbar
  toolbar: ["chatgpt"],
});
```
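
For context, here is a minimal self-contained page built around the snippet above. This is a sketch under assumptions: TinyMCE 7 is loaded from jsDelivr's npm mirror, and the `#editor` textarea id is an arbitrary choice for this example.

```html
<!doctype html>
<html>
  <head>
    <!-- Assumption: self-hosted TinyMCE 7 served via jsDelivr's npm mirror -->
    <script src="https://cdn.jsdelivr.net/npm/tinymce@7/tinymce.min.js"></script>
  </head>
  <body>
    <textarea id="editor">Hello world!</textarea>
    <script>
      tinymce.init({
        selector: "#editor", // attach the editor to the textarea above
        license_key: "gpl", // self-hosted TinyMCE 7 asks for a license key; "gpl" opts into the GPL build
        external_plugins: {
          chatgpt:
            "https://cdn.jsdelivr.net/gh/The-3Labs-Team/tinymce-chatgpt-plugin@2/dist/chatgpt.js",
        },
        openai: {
          api_key: "sk-yourapikeyhere", // placeholder, not a real key
          model: "gpt-3.5-turbo",
          temperature: 0.5,
          max_tokens: 150,
          prompts: ["Summarize", "Proofread"],
        },
        toolbar: ["chatgpt"],
      });
    </script>
  </body>
</html>
```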

If you are using our TinyMCE Laravel Nova Package 4, you can configure it as follows:

```php
<?php

return [

    'init' => [

        // 1. Add the plugin to the list of external plugins
        'external_plugins' => [
            'chatgpt' => 'https://cdn.jsdelivr.net/gh/The-3Labs-Team/tinymce-chatgpt-plugin@2/dist/chatgpt.js',
        ],

        // 2. Configure the plugin
        'openai' => [
            'api_key' => 'sk-yourapikeyhere', // Your OpenAI API key
            'model' => 'gpt-3.5-turbo',
            'temperature' => 0.5,
            'max_tokens' => 150,
            'prompts' => [
                'Translate from English to Italian',
                'Summarize',
                'Proofread',
                'Write a blog post about',
            ],
            // Optional: Add your custom LLM
            // 'base_uri' => 'https://your-llm-endpoint.com',
        ],

    ],

    // 3. Add the plugin to the toolbar
    'toolbar' => ['chatgpt'],

    // ...

];
```

## Breaking Changes from v1 to v2

- We now use OpenAI's new /chat/completions endpoint
- The default model must now be a chat model, such as gpt-3.5-turbo
- If you want to keep using the legacy /completions endpoint, set the baseUri option to that endpoint, e.g. https://api.openai.com/v1/completions (see the sketch below)
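
For reference, here is a sketch of the payload difference between the two endpoints, written as plain fetch calls against the OpenAI API (this is illustrative, not plugin code; the key and prompt strings are placeholders):

```js
// v2 default: /chat/completions expects a `messages` array of role/content pairs
fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer sk-yourapikeyhere", // placeholder key
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: "Summarize: ..." }],
    temperature: 0.5,
    max_tokens: 150,
  }),
});

// legacy /completions (reachable via the baseUri option) expects a flat `prompt` string
fetch("https://api.openai.com/v1/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer sk-yourapikeyhere", // placeholder key
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo-instruct", // completions-style models only
    prompt: "Summarize: ...",
    temperature: 0.5,
    max_tokens: 150,
  }),
});
```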

## Custom LLM

If you have a custom LLM, you can use it by setting the baseUri option in the configuration; the plugin will send its requests to this endpoint.

```js
baseUri: "https://your-llm-endpoint.com"
```

Please note that the custom LLM must expose an OpenAI-compatible API and must be reachable from the client side (the browser). See the demo-lm-studio.html file for an example of how to use a custom LLM.
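
As a sketch, a configuration pointing the plugin at a local LM Studio server might look like the following. The port and path reflect LM Studio's documented defaults, and the model name is a placeholder for whatever model you have loaded:

```js
openai: {
  api_key: "lm-studio", // local servers usually ignore the key, but the field is still expected
  model: "your-local-model", // placeholder: the model identifier loaded in LM Studio
  temperature: 0.5,
  max_tokens: 150,
  prompts: ["Summarize", "Proofread"],
  // Assumption: LM Studio's default OpenAI-compatible chat endpoint
  baseUri: "http://localhost:1234/v1/chat/completions",
},
```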