# Class: ChatGPTUnofficialProxyAPI
## Table of contents

- [Constructors](#constructors)
- [Accessors](#accessors)
- [Methods](#methods)
## Constructors

### constructor

• **new ChatGPTUnofficialProxyAPI**(`opts`)
#### Parameters

| Name | Type | Description |
| :--- | :--- | :--- |
| `opts` | `Object` | - |
| `opts.accessToken` | `string` | - |
| `opts.apiReverseProxyUrl?` | `string` | **Default Value** `https://chat.openai.com/backend-api/conversation` |
| `opts.debug?` | `boolean` | **Default Value** `false` |
| `opts.fetch?` | `(input: RequestInfo \| URL, init?: RequestInit) => Promise<Response>` | - |
| `opts.headers?` | `Record<string, string>` | **Default Value** `undefined` |
| `opts.model?` | `string` | **Default Value** `text-davinci-002-render-sha` |
#### Defined in

src/chatgpt-unofficial-proxy-api.ts:20
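
A minimal instantiation sketch, assuming the class is imported from the `chatgpt` package export shown in this reference; the environment variable name is a placeholder:

```ts
import { ChatGPTUnofficialProxyAPI } from 'chatgpt'

// CHATGPT_ACCESS_TOKEN is a placeholder env var; supply your own session access token.
const api = new ChatGPTUnofficialProxyAPI({
  accessToken: process.env.CHATGPT_ACCESS_TOKEN ?? '',
  // apiReverseProxyUrl defaults to https://chat.openai.com/backend-api/conversation
  debug: false
})
```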
## Accessors

### accessToken

• `get` **accessToken**(): `string`

#### Returns

`string`

#### Defined in

src/chatgpt-unofficial-proxy-api.ts:66
• `set` **accessToken**(`value`): `void`

#### Parameters

| Name | Type |
| :--- | :--- |
| `value` | `string` |

#### Returns

`void`

#### Defined in

src/chatgpt-unofficial-proxy-api.ts:70
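
If the underlying session token changes, the setter lets you swap it on a live instance; a brief sketch using the `api` instance from the constructor example (`refreshedToken` is a hypothetical new token string):

```ts
// `refreshedToken` stands in for a newly obtained session access token.
const refreshedToken: string = '<new session token>'

api.accessToken = refreshedToken   // replace the token used for subsequent requests
console.log(api.accessToken)       // read back the token currently in use
```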
## Methods

### sendMessage

▸ **sendMessage**(`text`, `opts?`): `Promise`<`ChatMessage`>
Sends a message to ChatGPT, waits for the response to resolve, and returns the response.

If you want your response to have historical context, you must provide a valid `parentMessageId`.

If you want to receive a stream of partial responses, use `opts.onProgress`.

If you want to receive the full response, including message and conversation IDs, you can use `opts.onConversationResponse` or use the `ChatGPTAPI.getConversation` helper.

Set `debug: true` in the `ChatGPTAPI` constructor to log more info on the full prompt sent to the OpenAI completions API. You can override the `promptPrefix` and `promptSuffix` in `opts` to customize the prompt.
#### Parameters

| Name | Type |
| :--- | :--- |
| `text` | `string` |
| `opts` | `SendMessageBrowserOptions` |
#### Returns

`Promise`<`ChatMessage`>

The response from ChatGPT
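
A sketch of a threaded exchange with streaming, using `api` from the constructor example above; `parentMessageId` and `opts.onProgress` are documented here, while the `conversationId` option and the `id`/`conversationId`/`text` fields on the returned `ChatMessage` are assumptions about its shape:

```ts
const first = await api.sendMessage('Write a haiku about TypeScript.')

// Passing the previous message's id as parentMessageId keeps the conversation context;
// onProgress receives partial ChatMessage objects as the reply streams in.
const followUp = await api.sendMessage('Now translate it into French.', {
  conversationId: first.conversationId, // assumed field on ChatMessage
  parentMessageId: first.id,
  onProgress: (partial) => process.stdout.write(partial.text)
})

console.log('\n---\n' + followUp.text)
```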