LangChain Integration
As a provider of large language models (LLMs), the OCI Generative AI service integrates with LangChain.
About LangChain
LangChain is an open-source, modular framework for building applications with large language models (LLMs). You can use LangChain to build chatbots, analyze text, answer questions over structured data, interact with APIs, and create applications that use generative AI.
LangChain has six modules for building applications:
- Model I/O: An interface to interact with the LLMs.
- Data connection: An interface to ingest data from data sources.
- Chains: Hard-coded sequences of actions (steps) that make calls to LLMs or tools.
- Agents: Engines that decide which actions to take and in which order.
- Memory: A component that retains the application's state between runs of a chain.
- Callbacks: A component that logs and streams intermediate steps of the chains.
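To make the Model I/O and Chains modules concrete, the following sketch composes a prompt template, a model, and an output parser into a chain. It uses LangChain's built-in FakeListLLM as a stand-in model so it runs without any credentials or network calls; the prompt wording and canned response are illustrative only.
```python
# Minimal sketch of the Model I/O and Chains modules, using a stand-in
# FakeListLLM so the chain runs without calling a real model endpoint.
from langchain_community.llms import FakeListLLM
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate

# Model I/O: the prompt template and output parser frame the LLM call.
prompt = PromptTemplate.from_template("Summarize in one sentence: {text}")
llm = FakeListLLM(responses=["LangChain chains prompts, models, and parsers."])

# Chains: compose the steps into a single runnable pipeline.
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"text": "LangChain is a modular framework for LLM apps."}))
```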
LangChain Integrations with OCI Generative AI
LangChain integrations are adapters that connect LangChain to third-party services. OCI Generative AI has a LangChain integration that is supported for Python.
You can use the following OCI Generative AI models with LangChain:
- Pretrained out-of-the-box Cohere models
- Fine-tuned custom models
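As an illustration of the Python integration, the sketch below calls a pretrained Cohere model through the OCIGenAI class in langchain_community. The service endpoint URL, compartment OCID, and model settings are placeholders to replace with your own values, and the exact parameter names should be verified against the current version of the integration.
```python
# Hedged sketch of the Python integration; the model_id, endpoint URL,
# and compartment OCID below are placeholders for your own values.
from langchain_community.llms import OCIGenAI
from langchain_core.prompts import PromptTemplate

llm = OCIGenAI(
    model_id="cohere.command",  # a pretrained Cohere model
    service_endpoint="https://inference.generativeai.us-chicago-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",  # placeholder OCID
    model_kwargs={"temperature": 0.3, "max_tokens": 200},
    # Uses the default OCI API-key auth from the local OCI config unless
    # auth_type is set explicitly.
)

# For a fine-tuned custom model, model_id is typically the custom model's
# or hosting endpoint's OCID, with the provider set explicitly; check the
# integration documentation for the exact requirements.

prompt = PromptTemplate.from_template("Answer briefly: {question}")
chain = prompt | llm

print(chain.invoke({"question": "What is LangChain?"}))
```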