Example usage

Updates

Feb 19, 2023

We now provide three ways of accessing the unofficial ChatGPT API, all of which have tradeoffs:

| Method                    | Free? | Robust?  | Quality?          |
| ------------------------- | ----- | -------- | ----------------- |
| ChatGPTAPI                | No    | Yes      | ☑️ Mimics ChatGPT |
| ChatGPTUnofficialProxyAPI | Yes   | ☑️ Maybe | Real ChatGPT      |
| ChatGPTAPIBrowser (v3)    | Yes   | No       | Real ChatGPT      |

Note: I recommend that you use either ChatGPTAPI or ChatGPTUnofficialProxyAPI.

  1. ChatGPTAPI - Uses text-davinci-003 to mimic ChatGPT via the official OpenAI completions API (most robust approach, but it's not free and doesn't use a model fine-tuned for chat)
  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)
  3. ChatGPTAPIBrowser - (v3.5.1 of this package) Uses Puppeteer to access the official ChatGPT webapp (uses the real ChatGPT, but very flaky, heavyweight, and error prone)
Previous Updates
Feb 5, 2023

OpenAI has disabled the leaked chat model we were previously using, so we're now defaulting to text-davinci-003, which is not free.

We've found several other hidden, fine-tuned chat models, but OpenAI keeps disabling them, so we're searching for alternative workarounds.

Feb 1, 2023

This package no longer requires any browser hacks – it is now using the official OpenAI completions API with a leaked model that ChatGPT uses under the hood. 🔥

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY
})

const res = await api.sendMessage('Hello World!')
console.log(res.text)

Please upgrade to chatgpt@latest (at least v4.0.0). The updated version is significantly more lightweight and robust compared with previous versions. You also don't have to worry about IP issues or rate limiting.

Huge shoutout to @waylaidwanderer for discovering the leaked chat model!

If you run into any issues, we do have a pretty active Discord with a bunch of ChatGPT hackers from the Node.js & Python communities.

Lastly, please consider starring this repo and following me on Twitter to help support the project.

Thanks && cheers, Travis

ChatGPT API

Node.js client for the unofficial ChatGPT API.


Intro

This package is a Node.js wrapper around ChatGPT by OpenAI. TS batteries included.

You can use it to start building projects powered by ChatGPT like chatbots, websites, etc...

Install

npm install chatgpt

Make sure you're using node >= 18 so fetch is available (or node >= 14 if you install a fetch polyfill).

Usage

You need to pick between two methods:

| Method                    | Free? | Robust?  | Quality?          |
| ------------------------- | ----- | -------- | ----------------- |
| ChatGPTAPI                | No    | Yes      | ☑️ Mimics ChatGPT |
| ChatGPTUnofficialProxyAPI | Yes   | ☑️ Maybe | Real ChatGPT      |
  1. ChatGPTAPI - Uses text-davinci-003 to mimic ChatGPT via the official OpenAI completions API (most robust approach, but it's not free and doesn't use a model fine-tuned for chat). You can override the model, completion params, and prompt to fully customize your bot.

  2. ChatGPTUnofficialProxyAPI - Uses an unofficial proxy server to access ChatGPT's backend API in a way that circumvents Cloudflare (uses the real ChatGPT and is pretty lightweight, but relies on a third-party server and is rate-limited)

Usage - ChatGPTAPI

Sign up for an OpenAI API key and store it in your environment.

import { ChatGPTAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTAPI({
    apiKey: process.env.OPENAI_API_KEY
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

You can override the default model (text-davinci-003) and any OpenAI completion params using completionParams:

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  completionParams: {
    temperature: 0.5,
    top_p: 0.8
  }
})
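
You can also swap the underlying model through completionParams. Here's a minimal sketch (assuming the model name you pass is a completions model your API key has access to, e.g. text-davinci-002):

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  completionParams: {
    // any OpenAI completions model your key can access
    model: 'text-davinci-002'
  }
})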

If you want to track the conversation, you'll need to pass the parentMessageId and conversationId:

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

// send a message and wait for the response
let res = await api.sendMessage('What is OpenAI?')
console.log(res.text)

// send a follow-up
res = await api.sendMessage('Can you expand on that?', {
  conversationId: res.conversationId,
  parentMessageId: res.id
})
console.log(res.text)

// send another follow-up
res = await api.sendMessage('What were we talking about?', {
  conversationId: res.conversationId,
  parentMessageId: res.id
})
console.log(res.text)

You can add streaming via the onProgress handler:

const res = await api.sendMessage('Write a 500 word essay on frogs.', {
  // print the partial response as the AI is "typing"
  onProgress: (partialResponse) => console.log(partialResponse.text)
})

// print the full text at the end
console.log(res.text)

You can add a timeout using the timeoutMs option:

// timeout after 2 minutes (which will also abort the underlying HTTP request)
const response = await api.sendMessage(
  'write me a really really long essay on frogs',
  {
    timeoutMs: 2 * 60 * 1000
  }
)

If you want to see more info about what's actually being sent to OpenAI's completions API, set the debug: true option in the ChatGPTAPI constructor:

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  debug: true
})

You'll notice that we're using a reverse-engineered promptPrefix and promptSuffix. You can customize these via the sendMessage options:

const res = await api.sendMessage('what is the answer to the universe?', {
  promptPrefix: `You are ChatGPT, a large language model trained by OpenAI. You answer as concisely as possible for each response. If you are generating a list, do not have too many items.
Current date: ${new Date().toISOString()}\n\n`
})

Note that we automatically handle appending the previous messages to the prompt and attempt to optimize for the available tokens (which defaults to 4096).
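
If you need to adjust that token budget, here's a minimal sketch (assuming your version exposes the maxModelTokens and maxResponseTokens constructor options):

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  // total tokens (prompt + completion) available to the model
  maxModelTokens: 4096,
  // portion of that budget reserved for the model's response
  maxResponseTokens: 1000
})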

Usage in CommonJS (Dynamic import)
async function example() {
  // To use ESM in CommonJS, you can use a dynamic import
  const { ChatGPTAPI } = await import('chatgpt')

  const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

Usage - ChatGPTUnofficialProxyAPI

The API is almost exactly the same for the ChatGPTUnofficialProxyAPI; you just need to provide a ChatGPT accessToken instead of an OpenAI API key.

import { ChatGPTUnofficialProxyAPI } from 'chatgpt'

async function example() {
  const api = new ChatGPTUnofficialProxyAPI({
    accessToken: process.env.OPENAI_ACCESS_TOKEN
  })

  const res = await api.sendMessage('Hello World!')
  console.log(res.text)
}

See demos/demo-reverse-proxy for a full example:

npx tsx demos/demo-reverse-proxy.ts

Reverse Proxies

You can override the reverse proxy by passing apiReverseProxyUrl to ChatGPTUnofficialProxyAPI:

const api = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.OPENAI_ACCESS_TOKEN,
  apiReverseProxyUrl: 'https://your-example-server.com/api/conversation'
})

Known reverse proxies run by community members include:

| Reverse Proxy URL                              | Author      | Rate Limits | Last Checked |
| ---------------------------------------------- | ----------- | ----------- | ------------ |
| https://chat.duti.tech/api/conversation        | @acheong08  | 50 req/min  | 2/19/2023    |
| https://gpt.pawan.krd/backend-api/conversation | @PawanOsman | ?           | 2/19/2023    |

Access Tokens

To use ChatGPTUnofficialProxyAPI, you'll need a ChatGPT access token. You can either:

  1. Use acheong08/OpenAIAuth, a Python script that logs in and fetches an access token automatically. This works with email + password accounts (i.e., it does not support accounts that authenticate via Microsoft or Google).

  2. You can manually get an accessToken by logging in to the ChatGPT webapp and then opening https://chat.openai.com/api/auth/session, which will return a JSON object containing your accessToken string.

Note: using a reverse proxy will expose your access token to a third party. This shouldn't cause any adverse effects, but please consider the risks before using this method.

Docs

See the auto-generated docs for more info on methods and parameters.

Demos

Most of the demos use ChatGPTAPI. It should be pretty easy to convert them to use ChatGPTUnofficialProxyAPI if you'd rather use that approach. The only thing that needs to change is how you initialize the api with an accessToken instead of an apiKey.

To run the included demos:

  1. clone repo
  2. install node deps
  3. set OPENAI_API_KEY in .env

A basic demo is included for testing purposes:

npx tsx demos/demo.ts

A demo showing the onProgress handler:

npx tsx demos/demo-on-progress.ts

The on progress demo uses the optional onProgress parameter to sendMessage to receive intermediary results as ChatGPT is "typing".

A conversation demo:

npx tsx demos/demo-conversation.ts

A persistence demo shows how to store messages in Redis:

npx tsx demos/demo-persistence.ts

Any keyv adapter is supported for persistence, and there are overrides if you'd like to use a different way of storing / retrieving messages.
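
For example, here's a minimal sketch of a Redis-backed message store (assuming the messageStore constructor option and the keyv + @keyv/redis packages, with a hypothetical local Redis URL):

import KeyvRedis from '@keyv/redis'
import Keyv from 'keyv'
import { ChatGPTAPI } from 'chatgpt'

// persist ChatGPT messages in Redis instead of the default in-memory store
const messageStore = new Keyv({
  store: new KeyvRedis('redis://localhost:6379')
})

const api = new ChatGPTAPI({
  apiKey: process.env.OPENAI_API_KEY,
  messageStore
})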

Note that persisting messages is required for remembering the context of previous conversations beyond the scope of the current Node.js process, since by default, we only store messages in memory. Here's an external demo of using a completely custom database solution to persist messages.

Note: Persistence is handled automatically when using ChatGPTUnofficialProxyAPI because it is connecting indirectly to ChatGPT.

Projects

All of these awesome projects are built using the chatgpt package. 🤯

If you create a cool integration, feel free to open a PR and add it to the list.

Compatibility

  • This package is ESM-only.
  • This package supports node >= 14.
  • This module assumes that fetch is installed.
    • In node >= 18, it's installed by default.
    • In node < 18, you need to install a polyfill like unfetch/polyfill (guide) or isomorphic-fetch (guide); see the sketch after this list.
  • If you want to build a website using chatgpt, we recommend using it only from your backend API.
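
As a rough sketch of the polyfill approach (assuming isomorphic-fetch is installed as a dependency; unfetch/polyfill works the same way), import it once before importing chatgpt:

// node < 18: register a global fetch before loading the chatgpt package
import 'isomorphic-fetch'

import { ChatGPTAPI } from 'chatgpt'

const api = new ChatGPTAPI({ apiKey: process.env.OPENAI_API_KEY })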

Credits

License

MIT © Travis Fischer

If you found this project interesting, please consider sponsoring me or following me on Twitter.