Use your own data with large language models (LLMs, such as OpenAI's ChatGPT) in JavaScript runtime environments, with TypeScript support.
Documentation: https://ts.llamaindex.ai/
Try examples online:
LlamaIndex.TS aims to be a lightweight, easy to use set of libraries to help you integrate large language models into your applications with your own data.
Multiple JS Environment Support

LlamaIndex.TS supports multiple JS environments, including:
For now, browser support is limited due to the lack of AsyncLocalStorage-like APIs.
```shell
npm install llamaindex
pnpm install llamaindex
yarn add llamaindex
```

Setup in Node.js, Deno, Bun, TypeScript...?
See our official document: https://ts.llamaindex.ai/docs/llamaindex/getting_started
In most cases, you'll also need to install provider packages to use LlamaIndex.TS. These add AI models, file readers for ingestion, or document storage, e.g. in vector databases.
For example, to use the OpenAI LLM, you would install the following package:
```shell
npm install @llamaindex/openai
pnpm install @llamaindex/openai
yarn add @llamaindex/openai
```
Check out our NextJS playground at https://llama-playground.vercel.app/. The source is available at https://github.com/run-llama/ts-playground
Core concepts for getting started:

See our documentation: https://ts.llamaindex.ai/docs/llamaindex/getting_started/concepts
Please see our contributing guide for more information. You are highly encouraged to contribute to LlamaIndex.TS!
Please join our Discord! https://discord.com/invite/eN6D2HQ4aX