
Tune Gemini models by using supervised fine-tuning | Generative AI on Vertex AI


This document describes how to tune a Gemini model by using supervised fine-tuning.

Before you begin

Before you begin, you must prepare a supervised fine-tuning dataset. Depending on your use case, there are different requirements.
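Each line of the dataset is a JSON object that uses the same contents/role/parts structure as the generateContent request bodies shown later on this page. As a hedged illustration only (see the dataset preparation documentation for the authoritative schema), a single training example might look like this:

{"contents": [{"role": "user", "parts": [{"text": "Why is sky blue?"}]}, {"role": "model", "parts": [{"text": "The sky appears blue because shorter blue wavelengths of sunlight are scattered more strongly by the atmosphere (Rayleigh scattering)."}]}]}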

Supported models

The following Gemini models support supervised tuning:

Create a tuning job

You can create a supervised fine-tuning job by using the Google Cloud console, the Google Gen AI SDK, the Vertex AI SDK for Python, the REST API, or Colab Enterprise:

Console

To tune a text model with supervised fine-tuning by using the Google Cloud console, perform the following steps:

  1. In the Vertex AI section of the Google Cloud console, go to the Vertex AI Studio page.

    Go to Vertex AI Studio

  2. Click Create tuned model.

  3. Under Model details, configure the following:

    1. In the Tuned model name field, enter a name for your new tuned model, up to 128 characters.
    2. In the Base model field, select gemini-2.5-flash.
    3. In the Region drop-down field, select the region where the pipeline tuning job runs and where the tuned model is deployed.
  4. Under Tuning setting, configure the following:

    1. In the Number of epochs field, enter the number of complete passes over the training dataset to run for model tuning.
    2. In the Adapter Size field, enter the adapter size to use for model tuning.
    3. In the Learning rate multiplier field, enter the step size at each iteration. The default value is 1.
  5. Optional: To disable intermediate checkpoints and use only the latest checkpoint, click the Export last checkpoint only toggle.

  6. Click Continue.

    The Tuning dataset page opens.

  7. To upload a dataset file, select one of the following:

    1. If you haven't uploaded a dataset yet, select the radio button for Upload file to Cloud Storage.
    2. In the Select JSONL file field, click Browse and select your dataset file.
    3. In the Dataset location field, click Browse and select the Cloud Storage bucket where you want to store your dataset file.
    4. If your dataset file is already in a Cloud Storage bucket, select the radio button for Existing file on Cloud Storage.
    5. In the Cloud Storage file path field, click Browse and select the Cloud Storage bucket where your dataset file is located.
  8. (Optional) To get validation metrics during training, click the Enable model validation toggle.

    1. In the Validation dataset file field, enter the Cloud Storage path of your validation dataset.
  9. Click Start Tuning.

    Your new model appears under the Gemini Pro tuned models section on the Tune and Distill page. When the model is finished tuning, the Status says Succeeded.

REST

To create a model tuning job, send a POST request by using the tuningJobs.create method. Some of the parameters are not supported by all of the models. Ensure that you include only the applicable parameters for the model that you're tuning.

Before using any of the request data, make the following replacements:

HTTP method and URL:

POST https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs

Request JSON body:

{
  "baseModel": "BASE_MODEL",
  "supervisedTuningSpec" : {
      "trainingDatasetUri": "TRAINING_DATASET_URI",
      "validationDatasetUri": "VALIDATION_DATASET_URI",
      "hyperParameters": {
          "epochCount": "EPOCH_COUNT",
          "adapterSize": "ADAPTER_SIZE",
          "learningRateMultiplier": "LEARNING_RATE_MULTIPLIER"
      },
      "export_last_checkpoint_only": EXPORT_LAST_CHECKPOINT_ONLY,
  },
  "tunedModelDisplayName": "TUNED_MODEL_DISPLAYNAME",
  "encryptionSpec": {
    "kmsKeyName": "KMS_KEY_NAME"
  },
  "serviceAccount": "SERVICE_ACCOUNT"
}

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs"
PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs" | Select-Object -Expand Content

You should receive a JSON response similar to the following.

Response
{
  "name": "projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID",
  "createTime": CREATE_TIME,
  "updateTime": UPDATE_TIME,
  "status": "STATUS",
  "supervisedTuningSpec": {
        "trainingDatasetUri": "TRAINING_DATASET_URI",
        "validationDatasetUri": "VALIDATION_DATASET_URI",
        "hyperParameters": {
            "epochCount": EPOCH_COUNT,
            "adapterSize": "ADAPTER_SIZE",
            "learningRateMultiplier": LEARNING_RATE_MULTIPLIER
        }
  },
  "tunedModelDisplayName": "TUNED_MODEL_DISPLAYNAME",
  "encryptionSpec": {
    "kmsKeyName": "KMS_KEY_NAME"
  },
  "serviceAccount": "SERVICE_ACCOUNT"
}
Example curl command
PROJECT_ID=myproject
LOCATION=global
curl \
-X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
"https://${LOCATION}-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/${LOCATION}/tuningJobs" \
-d \
$'{
   "baseModel": "gemini-2.5-flash",
   "supervisedTuningSpec" : {
      "training_dataset_uri": "gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_train_data.jsonl",
      "validation_dataset_uri": "gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_validation_data.jsonl"
   },
   "tunedModelDisplayName": "tuned_gemini"
}'
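For comparison, the following Vertex AI SDK for Python snippet sketches the same tuning request. Treat it as a hedged example rather than the definitive API: the sft.train parameter names shown here (source_model, train_dataset, validation_dataset, epochs, adapter_size, learning_rate_multiplier, tuned_model_display_name) should be checked against the current SDK reference.

import time

import vertexai
from vertexai.tuning import sft

vertexai.init(project="PROJECT_ID", location="TUNING_JOB_REGION")

# Start a supervised fine-tuning job on the sample datasets used in the curl example.
sft_tuning_job = sft.train(
    source_model="gemini-2.5-flash",
    train_dataset="gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_train_data.jsonl",
    validation_dataset="gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_validation_data.jsonl",
    epochs=4,
    adapter_size=4,
    learning_rate_multiplier=1.0,
    tuned_model_display_name="tuned_gemini",
)

# Poll until the job finishes, then print the endpoint that serves the tuned model.
while not sft_tuning_job.has_ended:
    time.sleep(60)
    sft_tuning_job.refresh()

print(sft_tuning_job.tuned_model_endpoint_name)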
Colab Enterprise

You can create a model tuning job in Vertex AI by using the side panel in Colab Enterprise. The side panel adds the relevant code snippets to your notebook. Then, you modify the code snippets and run them to create your tuning job. To learn more about using the side panel with your Vertex AI tuning jobs, see Interact with Vertex AI to tune a model.

  1. In the Google Cloud console, go to the Colab Enterprise My notebooks page.

    Go to My notebooks

  2. In the Region menu, select the region that contains your notebook.

  3. Click the notebook that you want to open. If you haven't created a notebook yet, create a notebook.

  4. To the right of your notebook, in the side panel, click the Tuning button.

    The side panel expands the Tuning tab.

  5. Click the Tune a Gemini model button.

    Colab Enterprise adds code cells to your notebook for tuning a Gemini model.

  6. In your notebook, find the code cell that stores parameter values. You'll use these parameters to interact with Vertex AI.

  7. Update the values for the following parameters:

  8. In the next code cell, update the model tuning parameters:

  9. Run the code cells that the side panel added to your notebook.

  10. After the last code cell runs, click the View tuning job button that appears.

  11. The side panel shows information about your model tuning job.

  12. After the tuning job has completed, you can go directly from the Tuning details tab to a page where you can test your model. Click Test.

    The Google Cloud console opens to the Vertex AI Text chat page, where you can test your model.

Tuning hyperparameters

We recommend submitting your first tuning job without changing the hyperparameters. The default values are based on our benchmarking results and are recommended because they yield the best model output quality.

For a discussion of best practices for supervised fine-tuning, see the blog post Supervised Fine Tuning for Gemini: A best practices guide.

View a list of tuning jobs

You can view a list of tuning jobs in your current project by using the Google Cloud console, the Google Gen AI SDK, the Vertex AI SDK for Python, or by sending a GET request by using the tuningJobs method.

Console

To view your tuning jobs in the Google Cloud console, go to the Vertex AI Studio page.

Go to Vertex AI Studio

Your Gemini tuning jobs are listed in the table under the Gemini Pro tuned models section.

REST

To view a list of model tuning jobs, send a GET request by using the tuningJobs.list method.

Before using any of the request data, make the following replacements:

HTTP method and URL:

GET https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Execute the following command:

curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs"
PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs" | Select-Object -Expand Content

You should receive a JSON response similar to the following.

Response
{
  "tuning_jobs": [
    TUNING_JOB_1, TUNING_JOB_2, ...
  ]
}
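If you prefer to list jobs from Python, a hedged Vertex AI SDK for Python sketch follows; it assumes that sft.SupervisedTuningJob exposes a list() classmethod, as other Vertex AI resource classes do.

import vertexai
from vertexai.tuning import sft

vertexai.init(project="PROJECT_ID", location="TUNING_JOB_REGION")

# Print the resource name and state of each tuning job in the project and region.
for tuning_job in sft.SupervisedTuningJob.list():
    print(tuning_job.resource_name, tuning_job.state)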
Get details of a tuning job

You can get the details of a tuning job in your current project by using the Google Cloud console, the Google Gen AI SDK, the Vertex AI SDK for Python, or by sending a GET request by using the tuningJobs method.

Console
  1. To view details of a tuned model in the Google Cloud console, go to the Vertex AI Studio page.

    Go to Vertex AI Studio

  2. In the Gemini Pro tuned models table, find your model and click Details.

    The details of your model are shown.

REST

To get the details of a model tuning job, send a GET request by using the tuningJobs.get method and specify the TUNING_JOB_ID.

Before using any of the request data, make the following replacements:

HTTP method and URL:

GET https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Execute the following command:

curl -X GET \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID"
PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method GET `
-Headers $headers `
-Uri "https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID" | Select-Object -Expand Content

You should receive a JSON response similar to the following.

Response
{
  "name": "projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID",
  "tunedModelDisplayName": "TUNED_MODEL_DISPLAYNAME",
  "createTime": CREATE_TIME,
  "endTime": END_TIME,
  "tunedModel": {
      "model": "projects/PROJECT_ID/locations/TUNING_JOB_REGION/models/MODEL_ID",
      "endpoint": "projects/PROJECT_ID/locations/TUNING_JOB_REGION/endpoints/ENDPOINT_ID"
  },
  "experiment": "projects/PROJECT_ID/locations/TUNING_JOB_REGION/metadataStores/default/contexts/EXPERIMENT_ID",
  "tuning_data_statistics": {
      "supervisedTuningDataStats": {
          "tuninDatasetExampleCount": "TUNING_DATASET_EXAMPLE_COUNT",
          "totalBillableTokenCount": "TOTAL_BILLABLE_TOKEN_COUNT",
          "tuningStepCount": "TUNING_STEP_COUNT"
      }
  },
  "status": "STATUS",
  "supervisedTuningSpec" : {
        "trainingDatasetUri": "TRAINING_DATASET_URI",
        "validationDataset_uri": "VALIDATION_DATASET_URI",
        "hyperParameters": {
            "epochCount": EPOCH_COUNT,
            "learningRateMultiplier": LEARNING_RATE_MULTIPLIER
        }
    }
}
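To fetch the same details from Python, a hedged Vertex AI SDK for Python sketch is shown below; the resource name matches the REST URL above, and tuned_model_endpoint_name is the property used later on this page to query the tuned model.

import vertexai
from vertexai.tuning import sft

vertexai.init(project="PROJECT_ID", location="TUNING_JOB_REGION")

tuning_job = sft.SupervisedTuningJob(
    "projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID"
)

# Inspect the job state and, once tuning succeeds, the endpoint that serves the tuned model.
print(tuning_job.state)
print(tuning_job.tuned_model_endpoint_name)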
Cancel a tuning job

You can cancel a tuning job in your current project by using the Google Cloud console or the Vertex AI SDK for Python, or by sending a POST request using the tuningJobs method.

REST

To cancel a model tuning job, send a POST request by using the tuningJobs.cancel method and specify the TUNING_JOB_ID.

Before using any of the request data, make the following replacements:

HTTP method and URL:

POST https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID:cancel

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d "" \
"https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID:cancel"
PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-Uri "https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID:cancel" | Select-Object -Expand Content

You should receive a JSON response similar to the following.

Response
{}
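If you want to cancel a job from a script, the following Python sketch calls the same tuningJobs.cancel REST endpoint by using application default credentials; it assumes the google-auth package is installed and that the placeholders are replaced with your values.

import google.auth
from google.auth.transport.requests import AuthorizedSession

PROJECT_ID = "PROJECT_ID"
TUNING_JOB_REGION = "TUNING_JOB_REGION"
TUNING_JOB_ID = "TUNING_JOB_ID"

# Authenticate with application default credentials.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

# POST to the tuningJobs.cancel method, mirroring the curl example above.
url = (
    f"https://{TUNING_JOB_REGION}-aiplatform.googleapis.com/v1/"
    f"projects/{PROJECT_ID}/locations/{TUNING_JOB_REGION}/tuningJobs/{TUNING_JOB_ID}:cancel"
)
response = session.post(url)
response.raise_for_status()
print(response.json())  # An empty JSON object indicates that the cancellation was accepted.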
Console
  1. To cancel a tuning job in the Google Cloud console, go to the Vertex AI Studio page.

    Go to Vertex AI Studio

  2. In the Gemini Pro tuned models table, click Manage run.

  3. Click Cancel.

Evaluate the tuned model

You can interact with the tuned model endpoint in the same way as with the base Gemini model, by using the Vertex AI SDK for Python or the Google Gen AI SDK, or by sending a POST request with the generateContent method.

For thinking models such as Gemini 2.5 Flash, we recommend setting the thinking budget to 0 to turn off thinking on tuned tasks, for optimal performance and cost efficiency. During supervised fine-tuning, the model learns to mimic the ground truth in the tuning dataset and omits the thinking process, so the tuned model can handle the task effectively without a thinking budget.
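As a hedged illustration, the following Google Gen AI SDK snippet queries a tuned endpoint with thinking turned off; the endpoint resource name is a placeholder, and the ThinkingConfig usage should be checked against the current SDK reference.

from google import genai
from google.genai import types

# Point the Gen AI SDK client at Vertex AI in the tuning region.
client = genai.Client(vertexai=True, project="PROJECT_ID", location="TUNING_JOB_REGION")

response = client.models.generate_content(
    model="projects/PROJECT_ID/locations/TUNING_JOB_REGION/endpoints/ENDPOINT_ID",
    contents="Why is sky blue?",
    config=types.GenerateContentConfig(
        # Turn off thinking for the tuned task, as recommended above.
        thinking_config=types.ThinkingConfig(thinking_budget=0),
    ),
)
print(response.text)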

The following example prompts a model with the question "Why is sky blue?".

Console
  1. To view details of a tuned model in the Google Cloud console, go to the Vertex AI Studio page.

    Go to Vertex AI Studio

  2. In the Gemini Pro tuned models table, select Test.

    A page opens where you can create a conversation with your tuned model.

Vertex AI SDK for Python
import vertexai
from vertexai.generative_models import GenerativeModel
from vertexai.tuning import sft

vertexai.init(project="<PROJECT_ID>", location="<TUNING_JOB_REGION>")

# Load the tuning job and query the endpoint that serves the tuned model.
sft_tuning_job = sft.SupervisedTuningJob("projects/<PROJECT_ID>/locations/<TUNING_JOB_REGION>/tuningJobs/<TUNING_JOB_ID>")
tuned_model = GenerativeModel(sft_tuning_job.tuned_model_endpoint_name)
print(tuned_model.generate_content("Why is sky blue?"))
REST

To test a tuned model with a prompt, send a POST request and specify the TUNED_ENDPOINT_ID.

Before using any of the request data, make the following replacements:

HTTP method and URL:

POST https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/endpoints/ENDPOINT_ID:generateContent

Request JSON body:

{
    "contents": [
        {
            "role": "USER",
            "parts": {
                "text" : "Why is sky blue?"
            }
        }
    ],
    "generation_config": {
        "temperature":TEMPERATURE,
        "topP": TOP_P,
        "topK": TOP_K,
        "maxOutputTokens": MAX_OUTPUT_TOKENS
    }
}

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/endpoints/ENDPOINT_ID:generateContent"
PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/endpoints/ENDPOINT_ID:generateContent" | Select-Object -Expand Content

You should receive a JSON response similar to the following.

Response
{
  "candidates": [
    {
      "content": {
        "role": "model",
        "parts": [Why is sky blue?
          {
            "text": "The sky appears blue due to a phenomenon called Rayleigh scattering, where shorter blue wavelengths of sunlight are scattered more strongly by the Earth's atmosphere than longer red wavelengths."
          }
        ]
      },
      "finishReason": "STOP",
      "safetyRatings": [
        {
          "category": "HARM_CATEGORY_HATE_SPEECH",
          "probability": "NEGLIGIBLE",
          "probabilityScore": 0.06325052,
          "severity": "HARM_SEVERITY_NEGLIGIBLE",
          "severityScore": 0.03179867
        },
        {
          "category": "HARM_CATEGORY_DANGEROUS_CONTENT",
          "probability": "NEGLIGIBLE",
          "probabilityScore": 0.09334688,
          "severity": "HARM_SEVERITY_NEGLIGIBLE",
          "severityScore": 0.027742893
        },
        {
          "category": "HARM_CATEGORY_HARASSMENT",
          "probability": "NEGLIGIBLE",
          "probabilityScore": 0.17356819,
          "severity": "HARM_SEVERITY_NEGLIGIBLE",
          "severityScore": 0.025419652
        },
        {
          "category": "HARM_CATEGORY_SEXUALLY_EXPLICIT",
          "probability": "NEGLIGIBLE",
          "probabilityScore": 0.07864238,
          "severity": "HARM_SEVERITY_NEGLIGIBLE",
          "severityScore": 0.020332353
        }
      ]
    }
  ],
  "usageMetadata": {
    "promptTokenCount": 5,
    "candidatesTokenCount": 33,
    "totalTokenCount": 38
  }
}
Delete a tuned model

To delete a tuned model:

REST

Call the models.delete method.

Before using any of the request data, make the following replacements:

HTTP method and URL:

DELETE https://REGION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/models/MODEL_ID

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Execute the following command:

curl -X DELETE \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
"https://REGION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/models/MODEL_ID"
PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method DELETE `
-Headers $headers `
-Uri "https://REGION-aiplatform.googleapis.com/v1beta1/projects/PROJECT_ID/locations/REGION/models/MODEL_ID" | Select-Object -Expand Content

You should receive a successful status code (2xx) and an empty response.

Vertex AI SDK for Python
from google.cloud import aiplatform

aiplatform.init(project=PROJECT_ID, location=LOCATION)

# To find out which models are available in Model Registry
models = aiplatform.Model.list()

model = aiplatform.Model(MODEL_ID)
model.delete()
Tuning and validation metrics

You can configure a model tuning job to collect and report model tuning and model evaluation metrics, which can then be visualized in Vertex AI Studio.

  1. To view details of a tuned model in the Google Cloud console, go to the Vertex AI Studio page.

    Go to Vertex AI Studio

  2. In the Tune and Distill table, click the name of the tuned model that you want to view metrics for.

    The tuning metrics appear under the Monitor tab.

Model tuning metrics

The model tuning job automatically collects the following tuning metrics for Gemini 2.0 Flash:

Model validation metrics

You can configure a model tuning job to collect the following validation metrics for Gemini 2.0 Flash:

The metrics visualizations are available after the tuning job starts running and are updated in real time as tuning progresses. If you don't specify a validation dataset when you create the tuning job, only the visualizations for the tuning metrics are available.

What's next

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-08-14 UTC.

[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-14 UTC."],[],[]]


RetroSearch is an open source project built by @garambo | Open a GitHub Issue

Search and Browse the WWW like it's 1997 | Search results from DuckDuckGo

HTML: 3.2 | Encoding: UTF-8 | Version: 0.7.4