Mirror of https://github.com/transitive-bullshit/chatgpt-api
pull/643/head^2
parent 594c8a0053
commit efa568f54d

readme.md | 24

@@ -3,7 +3,7 @@
 </p>
 
 <p align="center">
-  <em>AI agent stdlib that works with any TypeScript AI SDK and LLM</em>
+  <em>AI agent stdlib that works with any LLM and TypeScript AI SDK</em>
 </p>
 
 <p align="center">
@@ -15,12 +15,12 @@
 
 # Agentic <!-- omit from toc -->
 
 > [!WARNING]
 > TODO: this project is not published yet and is an active WIP.
 
 The goal of this project is to create a **set of standard AI functions / tools** which are **optimized for both normal TS-usage as well as LLM-based apps** and that work with all of the major AI SDKs (LangChain, LlamaIndex, Vercel AI SDK, OpenAI SDK, etc).
 
-For example, stdlib clients like `WeatherClient` can be used normally as a fully-typed TS client:
+For example, stdlib clients like `WeatherClient` can be used as normal TS classes:
 
 ```ts
 import { WeatherClient } from '@agentic/stdlib'
@@ -33,9 +33,9 @@ const result = await weather.getCurrentWeather({
 console.log(result)
 ```
 
-Or you can use this same function as an LLM-based tool which works across all of the major AI SDKs via adaptors.
+Or you can use them as LLM-based tools where the LLM decides when and how to invoke the underlying functions for you.
 
-Here's an example using [Vercel's AI SDK](https://github.com/vercel/ai):
+This works across all of the major AI SDKs via adaptors. Here's an example using [Vercel's AI SDK](https://github.com/vercel/ai):
 
 ```ts
 // sdk-specific imports
@@ -50,7 +50,7 @@ const weather = new WeatherClient()
 
 const result = await generateText({
   model: openai('gpt-4o'),
-  // this is the key line which uses the ai-sdk adaptor
+  // this is the key line which uses the `@agentic/stdlib/ai-sdk` adaptor
   tools: createAISDKTools(weather),
   toolChoice: 'required',
   prompt: 'What is the weather in San Francisco?'
@@ -59,7 +59,7 @@ const result = await generateText({
 console.log(result.toolResults[0])
 ```
 
-You can use all of our thoroughly tested stdlib AI functions with your favorite AI SDK – without having to write any glue code!
+You can use our standard library of thoroughly tested AI functions with your favorite AI SDK – without having to write any glue code!
 
 Here's a slightly more complex example which uses multiple clients and selects a subset of their functions using the `AIFunctionSet.pick` method:
 
@@ -72,6 +72,7 @@ import { createDexterFunctions } from '@agentic/stdlib/dexter'
 import { PerigonClient, SerperClient } from '@agentic/stdlib'
 
 async function main() {
+  // Perigon is a news API and Serper is a Google search API
   const perigon = new PerigonClient()
   const serper = new SerperClient()
 
@@ -131,14 +132,16 @@ All heavy third-party imports are isolated as _optional peer dependencies_ to ke
 | [Slack](https://api.slack.com/docs) | `SlackClient` | Send and receive Slack messages. |
 | [Tavily](https://tavily.com) | `TavilyClient` | Web search API tailored for LLMs. 🔥 |
 | [Twilio](https://www.twilio.com/docs/conversations/api) | `TwilioClient` | Twilio conversation API to send and receive SMS messages. |
-| [Twitter](https://developer.x.com/en/docs/twitter-api) | `TwitterClient` | Basic Twitter API methods for fetching users, tweets, and searching recent tweets. Includes support for plan-aware rate-limiting. |
+| [Twitter](https://developer.x.com/en/docs/twitter-api) | `TwitterClient` | Basic Twitter API methods for fetching users, tweets, and searching recent tweets. Includes support for plan-aware rate-limiting. Uses [Nango](https://www.nango.dev) for OAuth support. |
 | [WeatherAPI](https://www.weatherapi.com) | `WeatherClient` | Basic access to current weather data based on location. |
 | [Wikipedia](https://www.mediawiki.org/wiki/API) | `WikipediaClient` | Wikipedia page search and summaries. |
 | [Wolfram Alpha](https://products.wolframalpha.com/llm-api/documentation) | `WolframAlphaClient` | Wolfram Alpha LLM API client for answering computational, mathematical, and scientific questions. |
 
+Note that many of these clients expose multiple AI functions.
+
 ## Compound Tools
 
-- search and scrape
+- `SearchAndCrawl`
 
 ## AI SDKs
 
@@ -161,6 +164,7 @@ All heavy third-party imports are isolated as _optional peer dependencies_ to ke
 - clients should use `ky` and `zod` where possible
 - clients should have a strongly-typed TS DX
 - clients should expose select methods via the `@aiFunction(...)` decorator
+  - `inputSchema` zod schemas should be as minimal as possible with descriptions prompt engineered specifically for use with LLMs
 - clients and AIFunctions should be composable via `AIFunctionSet`
 - clients should work with all major TS AI SDKs
 - SDK adaptors should be as lightweight as possible and be optional peer dependencies of `@agentic/stdlib`
@@ -171,7 +175,9 @@ All heavy third-party imports are isolated as _optional peer dependencies_ to ke
 - sdks
   - modelfusion
 - services
+  - [phantombuster](https://phantombuster.com)
   - perplexity
+  - valtown
   - replicate
   - huggingface
   - [skyvern](https://github.com/Skyvern-AI/skyvern)
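
The readme diff above only shows the beginning of the `AIFunctionSet.pick` example. As a rough illustration of the pattern it describes, here is a hedged sketch that reuses the Vercel AI SDK adaptor from the earlier example; the `.functions` accessor, the picked function name `'search_news_stories'`, and passing multiple sources to `createAISDKTools` are assumptions for illustration, not part of this diff.

```ts
// Hedged sketch only: the picked function name and the multi-argument
// adaptor call are assumptions, not confirmed by this diff.
import { openai } from '@ai-sdk/openai'
import { generateText } from 'ai'

import { createAISDKTools } from '@agentic/stdlib/ai-sdk'
import { PerigonClient, SerperClient } from '@agentic/stdlib'

// Perigon is a news API and Serper is a Google search API
const perigon = new PerigonClient()
const serper = new SerperClient()

const result = await generateText({
  model: openai('gpt-4o'),
  tools: createAISDKTools(
    // expose only Perigon's news-search function to the model (assumed name)
    perigon.functions.pick('search_news_stories'),
    // expose all of Serper's functions
    serper
  ),
  toolChoice: 'required',
  prompt: 'Summarize the latest news about AI agents, with sources.'
})

console.log(result.toolResults[0])
```

The idea is that `pick` narrows a client's `AIFunctionSet` to just the functions you want the LLM to see, while whole clients can still be passed alongside it.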

@@ -1,12 +1,19 @@
 import defaultKy, { type KyInstance } from 'ky'
+import pThrottle from 'p-throttle'
 import { z } from 'zod'
 
 import { aiFunction, AIFunctionsProvider } from '../fns.js'
-import { assert, getEnv, pruneNullOrUndefined } from '../utils.js'
+import { assert, getEnv, pruneNullOrUndefined, throttleKy } from '../utils.js'
 
 export namespace tavily {
   export const API_BASE_URL = 'https://api.tavily.com'
 
+  // Allow up to 20 requests per minute by default.
+  export const throttle = pThrottle({
+    limit: 20,
+    interval: 60 * 1000
+  })
+
   export interface SearchOptions {
     /** Search query. (required) */
     query: string
@@ -86,10 +93,12 @@ export class TavilyClient extends AIFunctionsProvider {
   constructor({
     apiKey = getEnv('TAVILY_API_KEY'),
     apiBaseUrl = tavily.API_BASE_URL,
+    throttle = true,
     ky = defaultKy
   }: {
     apiKey?: string
    apiBaseUrl?: string
+    throttle?: boolean
     ky?: KyInstance
   } = {}) {
     assert(
@@ -101,7 +110,9 @@ export class TavilyClient extends AIFunctionsProvider {
     this.apiKey = apiKey
     this.apiBaseUrl = apiBaseUrl
 
-    this.ky = ky.extend({
+    const throttledKy = throttle ? throttleKy(ky, tavily.throttle) : ky
+
+    this.ky = throttledKy.extend({
       prefixUrl: this.apiBaseUrl
     })
   }
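
The Tavily client changes above add a default rate limit of 20 requests per minute and route requests through a `throttleKy` helper imported from `../utils.js`, whose implementation is not part of this diff. Below is a minimal sketch of how such a helper could work, assuming it installs the p-throttle gate as a ky `beforeRequest` hook; the real helper may differ.

```ts
import type { KyInstance } from 'ky'
import pThrottle from 'p-throttle'

// Minimal sketch (assumption): wrap a ky instance so that every outgoing
// request first awaits a p-throttle gate before being sent.
export function throttleKy(
  ky: KyInstance,
  throttleFn: ReturnType<typeof pThrottle>
): KyInstance {
  // The throttled no-op resolves only when a rate-limit slot is available.
  const waitForSlot = throttleFn(async () => undefined)

  return ky.extend({
    hooks: {
      // ky awaits beforeRequest hooks before sending each request, so this
      // delays any requests that exceed the configured limit.
      beforeRequest: [() => waitForSlot().then(() => undefined)]
    }
  })
}
```

With this wiring, `new TavilyClient()` is rate-limited by default, and callers can opt out via `new TavilyClient({ throttle: false })` or pass in their own pre-configured ky instance.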

@@ -7,6 +7,9 @@ import { SerpAPIClient } from '../services/serpapi-client.js'
 import { isValidCrawlableUrl, normalizeUrl } from '../url-utils.js'
 import { omit, pick } from '../utils.js'
 
+// TODO: allow `search` tool to support other search clients
+// (e.g. Bing, Exa, Searxng, Serper, Tavily)
+
 export class SearchAndCrawl extends AIFunctionsProvider {
   readonly serpapi: SerpAPIClient
   readonly diffbot: DiffbotClient
@@ -21,7 +24,7 @@ export class SearchAndCrawl extends AIFunctionsProvider {
   @aiFunction({
     name: 'search_and_crawl',
     description:
-      'Uses Google to search the web, crawls the results, and then summarizes the most relevant results.',
+      'Uses Google to search the web, crawls the results, and then summarizes the most relevant results. Useful for creating in-depth summaries of topics along with sources.',
     inputSchema: z.object({
       query: z.string().describe('search query')
     })
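
Since the compound `SearchAndCrawl` tool exposes its `search_and_crawl` method via the same `@aiFunction` decorator that the stdlib clients use, it can be handed to any of the SDK adaptors. Here is a hedged usage sketch, assuming the class and the SerpAPI/Diffbot clients are exported from the package root and that the constructor accepts `serpapi` and `diffbot` instances (neither is shown in this diff).

```ts
// Hedged sketch only: exports and constructor options are assumptions.
import { openai } from '@ai-sdk/openai'
import { generateText } from 'ai'

import { createAISDKTools } from '@agentic/stdlib/ai-sdk'
import { DiffbotClient, SearchAndCrawl, SerpAPIClient } from '@agentic/stdlib'

// Assumed constructor shape: pass in the underlying search and crawl clients.
const searchAndCrawl = new SearchAndCrawl({
  serpapi: new SerpAPIClient(),
  diffbot: new DiffbotClient()
})

const result = await generateText({
  model: openai('gpt-4o'),
  // expose the `search_and_crawl` AI function to the model
  tools: createAISDKTools(searchAndCrawl),
  toolChoice: 'required',
  prompt: 'Research and summarize the latest developments in LLM agents.'
})

console.log(result.toolResults[0])
```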