# Agentic

> AI agent stdlib that works with any LLM and TypeScript AI SDK.

## Intro
The goal of this project is to create a set of standard AI functions / tools which are optimized for both normal TS usage and LLM-based apps, and which work with all of the major TS AI SDKs (LangChain, LlamaIndex, Vercel AI SDK, OpenAI SDK, etc).
Agentic clients like `WeatherClient` can be used as normal TS classes:
```ts
import { WeatherClient } from '@agentic/stdlib'

// Requires `process.env.WEATHER_API_KEY` (from weatherapi.com)
const weather = new WeatherClient()

const result = await weather.getCurrentWeather({
  q: 'San Francisco'
})
console.log(result)
```
Or you can use these clients as LLM-based tools where the LLM decides when and how to invoke the underlying functions for you.
This works across all of the major AI SDKs via adapters. Here's an example using Vercel's AI SDK:
```ts
// sdk-specific imports
import { openai } from '@ai-sdk/openai'
import { generateText } from 'ai'
import { createAISDKTools } from '@agentic/ai-sdk'

// sdk-agnostic imports
import { WeatherClient } from '@agentic/stdlib'

const weather = new WeatherClient()

const result = await generateText({
  model: openai('gpt-4o-mini'),
  // this is the key line which uses the `@agentic/ai-sdk` adapter
  tools: createAISDKTools(weather),
  toolChoice: 'required',
  prompt: 'What is the weather in San Francisco?'
})

console.log(result.toolResults[0])
```
You can use our standard library of thoroughly tested AI functions with your favorite AI SDK – without having to write any glue code!
Here's a slightly more complex example which uses multiple clients and selects a subset of their functions using the `AIFunctionSet.pick` method:
```ts
// sdk-specific imports
import { ChatModel, createAIRunner } from '@dexaai/dexter'
import { createDexterFunctions } from '@agentic/dexter'

// sdk-agnostic imports
import { PerigonClient, SerperClient } from '@agentic/stdlib'

async function main() {
  // Perigon is a news API and Serper is a Google search API
  const perigon = new PerigonClient()
  const serper = new SerperClient()

  const runner = createAIRunner({
    chatModel: new ChatModel({
      params: { model: 'gpt-4o-mini', temperature: 0 }
    }),
    functions: createDexterFunctions(
      perigon.functions.pick('search_news_stories'),
      serper
    ),
    systemMessage: 'You are a helpful assistant. Be as concise as possible.'
  })

  const result = await runner(
    'Summarize the latest news stories about the upcoming US election.'
  )
  console.log(result)
}

await main()
```
Here we've exposed 2 functions to the LLM: `search_news_stories` (which comes from the `PerigonClient.searchStories` method) and `serper_google_search` (which implicitly comes from the `SerperClient.search` method).
All of the SDK adapters like `createDexterFunctions` accept very flexible `AIFunctionLike` objects, which include:

- `AIFunctionSet` - Sets of AI functions (like `perigon.functions.pick('search_news_stories')` or `perigon.functions` or `serper.functions`)
- `AIFunctionsProvider` - Client classes which expose an `AIFunctionSet` via the `.functions` property (like `perigon` or `serper`)
- `AIFunction` - Individual functions (like `perigon.functions.get('search_news_stories')` or `serper.functions.get('serper_google_search')` or AI functions created directly via the `createAIFunction` utility function)
You can pass as many of these `AIFunctionLike` objects as you'd like, and you can manipulate them as `AIFunctionSet` sets via `.pick`, `.omit`, `.get`, `.map`, etc.
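As a quick illustration (a sketch, not an official example), all three `AIFunctionLike` shapes can be mixed in a single adapter call, reusing the clients and function names from the examples above:

```ts
// Hedged sketch: any mix of AIFunctionLike objects can be passed to an
// adapter such as `createAISDKTools`, and AIFunctionSets can be filtered
// with `.pick` / `.omit` before being handed off.
import { createAISDKTools } from '@agentic/ai-sdk'
import { PerigonClient, SerperClient, WeatherClient } from '@agentic/stdlib'

const weather = new WeatherClient()
const perigon = new PerigonClient()
const serper = new SerperClient()

const tools = createAISDKTools(
  weather, // an AIFunctionsProvider; all of its functions are exposed
  perigon.functions.pick('search_news_stories'), // an AIFunctionSet subset
  serper.functions.get('serper_google_search')! // a single AIFunction
)
```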
## Install
```sh
npm install @agentic/stdlib @agentic/core zod
```
This package is ESM only and requires `Node.js >= 18` or an equivalent environment (bun, deno, CF workers, etc).
## AI SDKs
Each AI SDK adapter has its own package which needs to be installed.
#### Vercel AI SDK

```sh
npm install @agentic/ai-sdk ai
```

```ts
import { createAISDKTools } from '@agentic/ai-sdk'
```

See examples/ai-sdk for a full example.
#### LangChain

```sh
npm install @agentic/langchain @langchain/core langchain
```

```ts
import { createLangChainTools } from '@agentic/langchain'
```

See examples/langchain for a full example.
#### LlamaIndex

```sh
npm install @agentic/llamaindex llamaindex
```

```ts
import { createLlamaIndexTools } from '@agentic/llamaindex'
```

See examples/llamaindex for a full example.
#### Firebase Genkit

```sh
npm install @agentic/genkit @genkit-ai/ai @genkit-ai/core
```

```ts
import { createGenkitTools } from '@agentic/genkit'
```

See examples/genkit for a full example.
#### Dexa Dexter

```sh
npm install @agentic/dexter @dexaai/dexter
```

```ts
import { createDexterFunctions } from '@agentic/dexter'
```

See examples/dexter for a full example.
#### OpenAI SDK

```sh
npm install openai
```

There's no need for an adapter with the OpenAI SDK since all agentic tools are compatible with OpenAI by default. You can use `AIFunctionSet.specs` for function calling or `AIFunctionSet.toolSpecs` for parallel tool calling.
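As a rough, unofficial sketch of how the two spec formats map onto the OpenAI chat completion parameters (assuming `.specs` matches the legacy `functions` shape and `.toolSpecs` matches the `tools` shape):

```ts
// Hedged sketch: `.specs` targets the legacy `functions` / `function_call`
// parameters, while `.toolSpecs` targets the `tools` / `tool_choice`
// parameters that support parallel tool calls.
import { WeatherClient } from '@agentic/stdlib'
import OpenAI from 'openai'

const weather = new WeatherClient()
const openai = new OpenAI()
const prompt = 'What is the weather in San Francisco?'

// Legacy function calling
await openai.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: prompt }],
  functions: weather.functions.specs,
  function_call: 'auto'
})

// Parallel tool calling
await openai.chat.completions.create({
  model: 'gpt-4o-mini',
  messages: [{ role: 'user', content: prompt }],
  tools: weather.functions.toolSpecs,
  tool_choice: 'auto'
})
```

The full tool-calling flow, including feeding the tool result back to the model, looks like this: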
```ts
import { WeatherClient } from '@agentic/stdlib'
import assert from 'node:assert'
import OpenAI from 'openai'

const weather = new WeatherClient()
const openai = new OpenAI()

const messages: OpenAI.ChatCompletionMessageParam[] = [
  {
    role: 'system',
    content: 'You are a helpful assistant. Be as concise as possible.'
  },
  { role: 'user', content: 'What is the weather in San Francisco?' }
]

{
  // First call to OpenAI to invoke the weather tool
  const res = await openai.chat.completions.create({
    messages,
    model: 'gpt-4o-mini',
    temperature: 0,
    tools: weather.functions.toolSpecs,
    tool_choice: 'required'
  })
  const message = res.choices[0]?.message!
  console.log(JSON.stringify(message, null, 2))
  assert(message.tool_calls?.[0]?.function?.name === 'get_current_weather')

  const fn = weather.functions.get('get_current_weather')!
  assert(fn)

  const toolParams = message.tool_calls[0].function.arguments
  const toolResult = await fn(toolParams)

  messages.push(message)
  messages.push({
    role: 'tool',
    tool_call_id: message.tool_calls[0].id,
    content: JSON.stringify(toolResult)
  })
}

{
  // Second call to OpenAI to generate a text response
  const res = await openai.chat.completions.create({
    messages,
    model: 'gpt-4o-mini',
    temperature: 0,
    tools: weather.functions.toolSpecs
  })
  const message = res.choices?.[0]?.message
  console.log(JSON.stringify(message, null, 2))
}
```
See examples/openai for a full example.
See the examples directory for working examples of how to use each of these adapters.
## Optimized Imports
`@agentic/stdlib` is just a convenience wrapper which re-exports all of the built-in AI tool packages. If you want to optimize your imports, you can replace `@agentic/stdlib` with the specific AI tools you want. For example:
```sh
npm install @agentic/weather @agentic/core zod
```

```ts
import { WeatherClient } from '@agentic/weather'
```
Some of these individual tool packages have peer dependencies if they depend on large, external packages. If so, you'll need to install their peer deps as well. Take `e2b`, for example, which requires `@e2b/code-interpreter` as a peer dep:
```sh
npm install @agentic/e2b @agentic/core zod @e2b/code-interpreter
```

```ts
import { e2b } from '@agentic/e2b'
```
> [!NOTE]
> There is no functional difference between using `@agentic/stdlib` versus using the individual packages directly. The only difference is if you want to optimize your install size (when running on serverless functions, for instance), in which case installing and using the individual packages directly will be more efficient. The default examples use `@agentic/stdlib` because it provides a simpler DX.
## Services
| Service | Package | Named export | Description |
| --- | --- | --- | --- |
| Bing | `@agentic/bing` | `BingClient` | Bing web search. |
| Calculator | `@agentic/calculator` | `calculator` | Basic calculator for simple mathematical expressions. |
| Clearbit | `@agentic/clearbit` | `ClearbitClient` | Resolving and enriching people and company data. |
| Dexa | `@agentic/dexa` | `DexaClient` | Answers questions from the world's best podcasters. |
| Diffbot | `@agentic/diffbot` | `DiffbotClient` | Web page classification and scraping; person and company data enrichment. |
| E2B | `@agentic/e2b` | `e2b` | Hosted Python code interpreter sandbox which is really useful for data analysis, flexible code execution, and advanced reasoning on-the-fly. (peer dep `@e2b/code-interpreter`) |
| Exa | `@agentic/exa` | `ExaClient` | Web search tailored for LLMs. |
| Firecrawl | `@agentic/firecrawl` | `FirecrawlClient` | Website scraping and sanitization. |
| HackerNews | `@agentic/hacker-news` | `HackerNewsClient` | Official HackerNews API. |
| Hunter | `@agentic/hunter` | `HunterClient` | Email finder, verifier, and enrichment. |
| Jina | `@agentic/jina` | `JinaClient` | Clean URL reader and web search + URL top result reading with a generous free tier. |
| Midjourney | `@agentic/midjourney` | `MidjourneyClient` | Unofficial Midjourney client for generative images. |
| Novu | `@agentic/novu` | `NovuClient` | Sending notifications (email, SMS, in-app, push, etc). |
| People Data Labs | `@agentic/people-data-labs` | `PeopleDataLabsClient` | People & company data (WIP). |
| Perigon | `@agentic/perigon` | `PerigonClient` | Real-time news API and web content data from 140,000+ sources. Structured and enriched by AI, primed for LLMs. |
| Polygon | `@agentic/polygon` | `PolygonClient` | Stock market and company financial data. |
| PredictLeads | `@agentic/predict-leads` | `PredictLeadsClient` | In-depth company data including signals like fundraising events, hiring news, product launches, technologies used, etc. |
| Proxycurl | `@agentic/proxycurl` | `ProxycurlClient` | People and company data from LinkedIn & Crunchbase. |
| Searxng | `@agentic/searxng` | `SearxngClient` | OSS meta search engine capable of searching across many providers like Reddit, Google, Brave, Arxiv, Genius, IMDB, Rotten Tomatoes, Wikidata, Wolfram Alpha, YouTube, GitHub, etc. |
| SerpAPI | `@agentic/serpapi` | `SerpAPIClient` | Lightweight wrapper around SerpAPI for Google search. |
| Serper | `@agentic/serper` | `SerperClient` | Lightweight wrapper around Serper for Google search. |
| Slack | `@agentic/slack` | `SlackClient` | Send and receive Slack messages. |
| SocialData | `@agentic/social-data` | `SocialDataClient` | Unofficial Twitter / X client (readonly) which is much cheaper than the official Twitter API. |
| Tavily | `@agentic/tavily` | `TavilyClient` | Web search API tailored for LLMs. |
| Twilio | `@agentic/twilio` | `TwilioClient` | Twilio conversation API to send and receive SMS messages. |
| Twitter | `@agentic/twitter` | `TwitterClient` | Basic Twitter API methods for fetching users, tweets, and searching recent tweets. Includes support for plan-aware rate-limiting. Uses Nango for OAuth support. |
| Weather | `@agentic/weather` | `WeatherClient` | Basic access to current weather data based on location. |
| Wikidata | `@agentic/wikidata` | `WikidataClient` | Basic Wikidata client. |
| Wikipedia | `@agentic/wikipedia` | `WikipediaClient` | Wikipedia page search and summaries. |
| Wolfram Alpha | `@agentic/wolfram-alpha` | `WolframAlphaClient` | Wolfram Alpha LLM API client for answering computational, mathematical, and scientific questions. |
Note that you can import any of these AI tools from `@agentic/stdlib` OR from their individual packages. Installing and importing from their individual packages is more efficient, but it's less convenient so it isn't the default.
## Client Design Philosophy
- clients should be as minimal as possible
- clients should use `ky` and `zod` where possible
- clients should have a strongly-typed TS DX
- clients should expose select methods via the `@aiFunction(...)` decorator
- `inputSchema` zod schemas should be as minimal as possible with descriptions prompt engineered specifically for use with LLMs
- clients and AIFunctions should be composable via `AIFunctionSet`
- clients should work with all major TS AI SDKs
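To make these conventions concrete, here's a rough sketch of a minimal custom client. It is not from the official docs: it assumes `@agentic/core` exports an `AIFunctionsProvider` base class and an `aiFunction` decorator accepting `name`, `description`, and a zod `inputSchema`; adjust to the actual API if it differs.

```ts
// Hedged sketch of a minimal custom client following the conventions above.
// The decorator options and base class are assumptions, not a definitive API.
import { AIFunctionsProvider, aiFunction } from '@agentic/core'
import { z } from 'zod'

export class EchoClient extends AIFunctionsProvider {
  @aiFunction({
    name: 'echo',
    description: 'Echoes the given message back to the caller.',
    inputSchema: z.object({
      message: z.string().describe('The message to echo back')
    })
  })
  async echo(input: { message: string }) {
    return { message: input.message }
  }
}
```

Because `EchoClient` extends `AIFunctionsProvider`, it exposes an `AIFunctionSet` via `.functions` and can be passed to any of the SDK adapters just like the built-in clients.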
## TODO
- services
  - browserbase
  - brave search
  - phantombuster
  - apify
  - perplexity
  - valtown
  - replicate
  - huggingface
  - skyvern
  - pull from clay
  - pull from langchain
    - provide a converter for langchain `DynamicStructuredTool`
  - pull from nango
  - pull from activepieces
  - general openapi support ala workgpt
- compound tools / chains / flows / runnables
  - market maps
- incorporate zod-validation-error
- investigate autotool
- investigate alt search engines
- investigate data connectors
- add unit tests for individual providers
## Contributors
- Travis Fischer
- David Zhang
- Philipp Burckhardt
- And all of the amazing OSS contributors!
## License
MIT © Travis Fischer
To stay up to date or learn more, follow @transitive_bs on Twitter.