
OCI Data Science Model Deployment Endpoint

OCI Data Science is a fully managed and serverless platform for data science teams to build, train, and manage machine learning models on Oracle Cloud Infrastructure.

For the latest updates, examples, and experimental features, please see ADS LangChain Integration.

This notebook goes over how to use an LLM hosted on an OCI Data Science Model Deployment.

For authentication, the oracle-ads library is used to automatically load credentials required for invoking the endpoint.

Prerequisite

Deploy model

You can easily deploy, fine-tune, and evaluate foundation models using AI Quick Actions on OCI Data Science Model Deployment. For additional deployment examples, please visit the Oracle GitHub samples repository.

Policies

Make sure to have the required policies to access the OCI Data Science Model Deployment endpoint.

Set up

After deploying the model, you have to set up the following required parameters of the call:

Authentication

You can set authentication through either ads or environment variables. When you are working in an OCI Data Science Notebook Session, you can leverage the resource principal to access other OCI resources. Check out here to see more options.
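As a minimal sketch of the two options (assuming the oracle-ads package is installed and, for the environment-variable route, an API key configured in ~/.oci):

import ads
import os

# Option 1: authenticate through ads, e.g. with the resource principal
# available inside an OCI Data Science Notebook Session.
ads.set_auth("resource_principal")

# Option 2: authenticate through environment variables, here using an
# API key read from the default OCI config file.
os.environ["OCI_IAM_TYPE"] = "api_key"
os.environ["OCI_CONFIG_PROFILE"] = "default"
os.environ["OCI_CONFIG_LOCATION"] = "~/.oci"

In practice you would pick one of the two; the examples below show each route in context.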

Examples
import ads
from langchain_community.llms import OCIModelDeploymentLLM

# Set authentication through ads. Use the resource principal when running
# within an OCI service that has resource principal authentication configured.
ads.set_auth("resource_principal")

# Create an instance of the OCI Model Deployment Endpoint LLM.
# Replace the endpoint URI and model name with your own.
llm = OCIModelDeploymentLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
    model="odsc-llm",
)

# Run the LLM.
llm.invoke("Who is the first president of United States?")
For a model deployed with the vLLM framework, use OCIModelDeploymentVLLM:

import ads
from langchain_community.llms import OCIModelDeploymentVLLM

# Set authentication through ads.
ads.set_auth("resource_principal")

# Create an instance of the vLLM-specific endpoint class.
# Replace the endpoint URI with your own.
llm = OCIModelDeploymentVLLM(
    endpoint="https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict",
)

# Run the LLM.
llm.invoke("Who is the first president of United States?")
For a model deployed with the Text Generation Inference (TGI) framework, use OCIModelDeploymentTGI. This example sets authentication and the endpoint through environment variables instead of ads:

import os

from langchain_community.llms import OCIModelDeploymentTGI

# Set authentication through environment variables.
# Use API key authentication configured in ~/.oci/config.
os.environ["OCI_IAM_TYPE"] = "api_key"
os.environ["OCI_CONFIG_PROFILE"] = "default"
os.environ["OCI_CONFIG_LOCATION"] = "~/.oci"

# Set the endpoint through an environment variable.
# Replace the endpoint URI with your own.
os.environ["OCI_LLM_ENDPOINT"] = (
    "https://modeldeployment.<region>.oci.customer-oci.com/<md_ocid>/predict"
)

# Create an instance of the TGI-specific endpoint class.
# It picks up the endpoint and authentication from the environment.
llm = OCIModelDeploymentTGI()

# Run the LLM.
llm.invoke("Who is the first president of United States?")
Asynchronous calls
await llm.ainvoke("Tell me a joke.")
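The await form above assumes an already-running event loop, as in a notebook. In a plain Python script, a minimal sketch would wrap the call with asyncio (the main coroutine here is illustrative, not part of the integration):

import asyncio

async def main():
    # ainvoke is the asynchronous counterpart of invoke.
    response = await llm.ainvoke("Tell me a joke.")
    print(response)

asyncio.run(main())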
Streaming calls
for chunk in llm.stream("Tell me a joke."):
    print(chunk, end="", flush=True)
API reference

For comprehensive details on all features and configurations, please refer to the API reference documentation for each class: OCIModelDeploymentLLM, OCIModelDeploymentVLLM, and OCIModelDeploymentTGI.

