Mirror of https://github.com/transitive-bullshit/chatgpt-api

docs: update readme for new version

parent a5f1f20f9a
commit 2937409f15

readme.md | 50
@@ -17,9 +17,10 @@ const api = new ChatGPTAPIBrowser({
   email: process.env.OPENAI_EMAIL,
   password: process.env.OPENAI_PASSWORD
 })
-await api.init()
+await api.initSession()
 
-const response = await api.sendMessage('Hello World!')
+const result = await api.sendMessage('Hello World!')
+console.log(result.response)
 ```
 
 Note that this solution is not lightweight, but it does work a lot more consistently than the REST API-based versions. I'm currently using this solution to power 10 OpenAI accounts concurrently across 10 minimized Chrome windows for my [Twitter bot](https://github.com/transitive-bullshit/chatgpt-twitter-bot). 😂
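A recurring change in this commit is the rename from `response` to `result`: `sendMessage` now resolves to an object whose text lives under `result.response` rather than resolving to a plain string. A minimal sketch of the implied shape — the interface name and example values are my assumptions; only the field names (`response`, `conversationId`, `messageId`) come from the `+` lines of the diff:

```typescript
// Hypothetical shape of the new sendMessage result, inferred from the
// field names that appear in this commit's added lines.
interface SendMessageResult {
  response: string
  conversationId: string
  messageId: string
}

// Before this commit, callers read the resolved value directly; after it,
// the markdown text must be read from result.response.
const result: SendMessageResult = {
  response: 'Hello! How can I help you today?',
  conversationId: 'example-conversation-id',
  messageId: 'example-message-id'
}

const text: string = result.response
```

The two id fields are what the later conversation-tracking hunk threads between calls.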
@@ -86,15 +87,13 @@ async function example() {
   })
 
   const api = new ChatGPTAPI({ ...openAIAuth })
-  await api.ensureAuth()
+  await api.initSession()
 
   // send a message and wait for the response
-  const response = await api.sendMessage(
-    'Write a python version of bubble sort.'
-  )
+  const result = await api.sendMessage('Write a python version of bubble sort.')
 
-  // response is a markdown-formatted string
-  console.log(response)
+  // result.response is a markdown-formatted string
+  console.log(result.response)
 }
 ```
 
@@ -110,10 +109,10 @@ async function example() {
     password: process.env.OPENAI_PASSWORD
   })
 
-  await api.init()
+  await api.initSession()
 
-  const response = await api.sendMessage('Hello World!')
-  console.log(response)
+  const result = await api.sendMessage('Hello World!')
+  console.log(result.response)
 }
 ```
 
@@ -123,21 +122,30 @@ ChatGPT responses are formatted as markdown by default. If you want to work with
 const api = new ChatGPTAPI({ ...openAIAuth, markdown: false })
 ```
 
-If you want to automatically track the conversation, you can use `ChatGPTAPI.getConversation()`:
+If you want to track the conversation, use the `conversationId` and `messageId` in the result object, and pass them to `sendMessage` as `conversationId` and `parentMessageId`, respectively.
 
 ```ts
 const api = new ChatGPTAPI({ ...openAIAuth, markdown: false })
+await api.initSession()
-const conversation = api.getConversation()
 
 // send a message and wait for the response
-const response0 = await conversation.sendMessage('What is OpenAI?')
+let res = await api.sendMessage('What is OpenAI?')
+console.log(res.response)
 
 // send a follow-up
-const response1 = await conversation.sendMessage('Can you expand on that?')
+res = await api.sendMessage('Can you expand on that?', {
+  conversationId: res.conversationId,
+  parentMessageId: res.messageId
+})
+console.log(res.response)
 
 // send another follow-up
-const response2 = await conversation.sendMessage('Oh cool; thank you')
+res = await api.sendMessage('What were we talking about?', {
+  conversationId: res.conversationId,
+  parentMessageId: res.messageId
+})
+console.log(res.response)
 ```
 
 Sometimes, ChatGPT will hang for an extended period of time before beginning to respond. This may be due to rate limiting or it may be due to OpenAI's servers being overloaded.
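The threading scheme in the hunk above can be sketched without the real API. The `sendMessage` stub below is hypothetical — it only mimics the result object named in the diff — but it shows the pattern the new README describes: each follow-up passes the previous `conversationId` and `messageId` back in as `conversationId` and `parentMessageId`.

```typescript
// Hypothetical stand-in for the library's sendMessage result,
// using the field names that appear in the diff.
interface ChatResult {
  response: string
  conversationId: string
  messageId: string
}

let nextId = 0

// Stub sendMessage: echoes the prompt and threads ids the way the
// README describes — reuse the conversationId, mint a new messageId.
async function sendMessage(
  prompt: string,
  opts?: { conversationId?: string; parentMessageId?: string }
): Promise<ChatResult> {
  return {
    response: `echo: ${prompt}`,
    conversationId: opts?.conversationId ?? `conv-${++nextId}`,
    messageId: `msg-${++nextId}`
  }
}

async function demo(): Promise<ChatResult> {
  let res = await sendMessage('What is OpenAI?')
  // follow-up: pass the previous ids back in
  res = await sendMessage('Can you expand on that?', {
    conversationId: res.conversationId,
    parentMessageId: res.messageId
  })
  return res
}
```

Because the caller, not the library, carries the ids forward, dropping the options object simply starts a fresh conversation.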
@@ -151,8 +159,6 @@ const response = await api.sendMessage('this is a timeout test', {
 })
 ```
 
-You can stream responses using the `onProgress` or `onConversationResponse` callbacks. See the [docs](./docs/classes/ChatGPTAPI.md) for more details.
-
 <details>
 <summary>Usage in CommonJS (Dynamic import)</summary>
 
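The hunk above sits in the README's timeout section (`sendMessage('this is a timeout test', { ... })`). Independently of the library, the underlying pattern — racing a request against a timer — can be sketched as follows; `withTimeout` and `slow` are hypothetical helpers, not part of the package:

```typescript
// Hypothetical helper: reject if a promise takes longer than ms milliseconds.
function withTimeout<T>(promise: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(() => reject(new Error('timeout')), ms)
    promise.then(
      (value) => { clearTimeout(timer); resolve(value) },
      (err) => { clearTimeout(timer); reject(err) }
    )
  })
}

// A slow fake request, to demonstrate both outcomes.
const slow = (ms: number) =>
  new Promise<string>((resolve) => setTimeout(() => resolve('done'), ms))
```

A fast request resolves normally; a slow one rejects with the timeout error, which is roughly what a `timeoutMs`-style option does internally.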
@@ -167,10 +173,10 @@ async function example() {
   })
 
   const api = new ChatGPTAPI({ ...openAIAuth })
-  await api.ensureAuth()
+  await api.initSession()
 
-  const response = await api.sendMessage('Hello World!')
-  console.log(response)
+  const result = await api.sendMessage('Hello World!')
+  console.log(result)
 }
 ```
 