Create transfers | Storage Transfer Service


This page shows you how to create and start transfer jobs.

To see if your source and destination (also known as a sink) are supported by Storage Transfer Service, refer to Supported sources and sinks.

Agents and agent pools

Depending on your source and destination, you may need to create and configure an agent pool and install agents on a machine with access to your source or destination.
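
For example, with the gcloud CLI you can create an agent pool and then install agents on a machine that can reach your file system. This is a minimal sketch; the pool name and agent count are hypothetical:

gcloud transfer agent-pools create my-agent-pool
gcloud transfer agents install --pool=my-agent-pool --count=3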

Before you begin

Before configuring your transfers, make sure you have configured access:

If you're using gcloud commands, install the gcloud CLI.
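
After installing, you typically authenticate and set a default project before running transfer commands (the project ID here is hypothetical):

gcloud auth login
gcloud config set project my-project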

Create a transfer

Don't include sensitive information such as personally identifiable information (PII) or security data in your transfer job name. Resource names may be propagated to the names of other Google Cloud resources and may be exposed to Google-internal systems outside of your project.

Google Cloud console
  1. Go to the Storage Transfer Service page in the Google Cloud console.

    Go to Storage Transfer Service

  2. Click Create transfer job. The Create a transfer job page is displayed.

  3. Choose a source:

    Cloud Storage

    Your user account must have storage.buckets.get permission to select source and destination buckets. Alternatively, you can type the name of the bucket directly. For more information, see Troubleshooting access.

    1. Under Source type, select Cloud Storage.

    2. Select your Destination type.

    3. If your destination is Cloud Storage, select your Scheduling mode. Batch transfers execute on a one-time or scheduled basis. Event-driven transfers continuously monitor the source and transfer data when it's added or modified.

      To configure an event-driven transfer, follow the instructions at Event-driven transfers.

    4. Click Next step.

    5. Select a bucket and (optionally) a folder in that bucket by doing one of the following:

      • Enter an existing Cloud Storage bucket name and path in the Bucket or folder field without the prefix gs://. For example, my-test-bucket/path/to/files. To specify a Cloud Storage bucket from another project, type the name exactly into the Bucket or folder field.

      • Select a list of existing buckets in your projects by clicking Browse, then selecting a bucket.

        When you click Browse, you can select buckets in other projects by clicking the project ID, selecting a different project, and then selecting the bucket.

      • To create a new bucket, click Create new bucket.

    6. If this is an event-driven transfer, enter the Pub/Sub subscription name, which takes the following format:

      projects/PROJECT_NAME/subscriptions/SUBSCRIPTION_ID
      
    7. Optionally, choose to filter objects by prefix or by last modified date. If you specified a folder as your source location, prefix filters are relative to that folder. For example, if your source is my-test-bucket/path/, an include filter of file includes all files starting with my-test-bucket/path/file. (A gcloud CLI equivalent of this filter is sketched after these steps.)
    8. Click Next step.
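
    If you later script the same transfer with the gcloud CLI, the prefix filter above can be expressed with the --include-prefixes flag. A minimal sketch, with hypothetical bucket names:

    gcloud transfer jobs create \
      gs://my-test-bucket/path/ gs://my-destination-bucket \
      --include-prefixes=file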

    Amazon S3

    See Transfer from Amazon S3 to Cloud Storage.

    S3-compatible storage

    See Transfer from S3-compatible storage to Cloud Storage.

    Microsoft Azure Blob Storage

    Storage Transfer Service can transfer data from supported Microsoft Azure Storage regions.
    1. Under Source type, select Azure Blob Storage or Data Lake Storage Gen2.

    2. Click Next step.

    3. Specify the following:

      1. Storage account name — the source Microsoft Azure Storage account name.

        The storage account name is displayed in the Microsoft Azure Storage portal under All services > Storage > Storage accounts.

      2. Container name — the Microsoft Azure Storage container name.

        The container name is displayed in the Microsoft Azure Storage portal under Storage explorer > Blob containers.

      3. Shared access signature (SAS) — the Microsoft Azure Storage SAS token created from a stored access policy. For more information, see Grant limited access to Azure Storage resources using shared access signatures (SAS).

        The default expiration time for SAS tokens is 8 hours. When you create your SAS token, set an expiration time that gives your transfer enough time to complete.

        Caution: Basic SAS tokens can't be revoked; the only way to invalidate a basic SAS token is to remove the storage access key of your account. We strongly recommend that you create SAS tokens from stored access policies, so that you can revoke a policy to invalidate a SAS token. For more information, see Best practices when using SAS.

        Note: When creating a SAS token, avoid including an IP restriction. Storage Transfer Service uses various IP addresses and doesn't support IP address restriction.
    4. Optionally, choose to filter objects by prefix or by last modified date. If you specified a folder as your source location, prefix filters are relative to that folder. For example, if your source is my-test-bucket/path/, an include filter of file includes all files starting with my-test-bucket/path/file.
    5. Click Next step.

    File system
    1. Under Source type, select POSIX file system.

    2. Select your Destination type and click Next step.

    3. Select an existing agent pool, or select Create agent pool and follow the instructions to create a new pool.

    4. Specify the fully qualified path of the file system directory.

    5. Click Next step.
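
    To run the equivalent transfer from the gcloud CLI, you specify a posix:// source and the agent pool. A minimal sketch; the path, bucket, and pool names are hypothetical:

    gcloud transfer jobs create \
      posix:///path/to/source gs://my-destination-bucket \
      --source-agent-pool=my-agent-pool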

    HDFS

    See Transfer from HDFS to Cloud Storage.

    URL list
    1. Under Source type, select URL list and click Next step.

    2. Under URL of TSV file, provide the URL to a tab-separated values (TSV) file. See Creating a URL List for details about how to create the TSV file; a sketch of its general shape follows these steps.

    3. Optionally, choose to filter objects by prefix or by last modified date. If you specified a folder as your source location, prefix filters are relative to that folder. For example, if your source is my-test-bucket/path/, an include filter of file includes all files starting with my-test-bucket/path/file.
    4. Click Next step.
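
    As a rough illustration of the TSV file's shape (the URLs, sizes, and Base64-encoded MD5 values below are hypothetical; see Creating a URL List for the exact requirements):

    TsvHttpData-1.0
    https://example.com/file1.txt	1048576	wHENa08K36dMSZfnPpdzCg==
    https://example.com/file2.txt	2097152	hgsDrGnbuFNrY78BM2SGQw==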

  4. Choose a destination:

    Cloud Storage
    1. In the Bucket or folder field, enter the destination bucket and (optionally) folder name, or click Browse to select a bucket from a list of existing buckets in your current project. To create a new bucket, click Create new bucket.

    2. Click Next step.

    3. Choose your scheduling options:

      Note: Storage Transfer Service displays transfer job schedules in your local time zone, but stores those times in Coordinated Universal Time (UTC). If you are affected by daylight saving time (DST), your transfer job schedule may shift when DST starts or ends. (A gcloud CLI equivalent of these scheduling and overwrite/delete options is sketched after these steps.)
      1. From the Run once drop-down list, select one of the following:

        • Run once: Runs a single transfer, starting at a time that you select.

        • Run every day: Runs a transfer daily, starting at a time that you select.

          You can enter an optional End date, or leave End date blank to run the transfer continually.

        • Run every week: Runs a transfer weekly, starting at a time that you select.

        • Run with custom frequency: Runs a transfer at a frequency that you select. You can choose to repeat the transfer at a regular interval of Hours, Days, or Weeks.

          You can enter an optional End date, or leave End date blank to run the transfer continually.

      2. From the Starting now drop-down list, select one of the following:

        • Starting now: Starts the transfer after you click Create.

        • Starting on: Starts the transfer on the date and time that you select. Click Calendar to display a calendar to select the start date.

    4. Click Next step.

    5. Choose settings for the transfer job. Some options are only available for certain source/sink combinations.

      1. In the Description field, enter a description of the transfer. As a best practice, enter a description that is meaningful and unique so that you can tell jobs apart.

      2. Under Metadata options, choose to use the default options, or click View and select options to specify values for all supported metadata. See Metadata preservation for details.

      3. Under When to overwrite, select one of the following:

        • If different: Overwrites destination files if the source file with the same name has a different ETag or checksum value.

        • Always: Always overwrites destination files when the source file has the same name, even if they're identical.

      4. Under When to delete, select one of the following:

        • Never: Never delete files from either the source or destination.

        • Delete files from source after they're transferred: Deletes files from the source after they're transferred to the destination. If a source file isn't transferred, for example because it already exists in the destination, the source file is not deleted.

          Important: If you don't have a local backup, this action is not reversible.
        • Delete files from destination if they're not also at source: If files in the destination Cloud Storage bucket aren't also in the source, then delete the files from the Cloud Storage bucket.

          This option ensures that the destination Cloud Storage bucket exactly matches your source.

      5. For transfers between Cloud Storage buckets, choose whether to transfer managed folders.

      6. Enable or disable Logging for Storage Transfer Service.

      7. Under Notification options, optionally select your Pub/Sub topic and which events to notify for. See Pub/Sub notifications for more details.

    6. If you're delegating service agent permissions to a user-managed service account, select that option and enter the service account email address in the format SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com.

    7. Click Create.
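
    The console scheduling and overwrite/delete choices above map to gcloud CLI flags. A minimal sketch of a daily transfer; the bucket names and dates are hypothetical:

    gcloud transfer jobs create \
      gs://my-source-bucket gs://my-destination-bucket \
      --schedule-starts=2030-01-01T00:00:00Z \
      --schedule-repeats-every=1d \
      --overwrite-when=different \
      --delete-from=source-after-transfer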

    File system
    1. Select an existing agent pool, or select Create agent pool and follow the instructions to create a new pool.

    2. Specify the fully qualified destination directory path.

    3. Click Next step.

    4. Choose your scheduling options:

      Note: Storage Transfer Service displays transfer job schedules in your local time zone, but stores those times in Coordinated Universal Time (UTC). If you are affected by daylight saving time (DST), your transfer job schedule may shift when DST starts or ends.
      1. From the Run once drop-down list, select one of the following:

        • Run once: Runs a single transfer, starting at a time that you select.

        • Run every day: Runs a transfer daily, starting at a time that you select.

          You can enter an optional End date, or leave End date blank to run the transfer continually.

        • Run every week: Runs a transfer weekly, starting at a time that you select.

        • Run with custom frequency: Runs a transfer at a frequency that you select. You can choose to repeat the transfer at a regular interval of Hours, Days, or Weeks.

          You can enter an optional End date, or leave End date blank to run the transfer continually.

      2. From the Starting now drop-down list, select one of the following:

        • Starting now: Starts the transfer after you click Create.

        • Starting on: Starts the transfer on the date and time that you select. Click Calendar to display a calendar to select the start date.

    5. Click Next step.

    6. Specify whether to use a manifest file.

    7. Select whether to preserve metadata; when to overwrite; and when to delete files at the source or destination.

    8. Select your logging options.

    9. Under Notification options, optionally select your Pub/Sub topic and which events to notify for. See Pub/Sub notifications for more details.

    10. If you're delegating service agent permissions to a user-managed service account, select that option and enter the service account email address in the format SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com.

    11. To create your transfer job, click Create.

gcloud CLI

To create a new transfer job, use the gcloud transfer jobs create command. Creating a new job initiates the specified transfer, unless a schedule or --do-not-run is specified.

gcloud transfer jobs create \
  SOURCE DESTINATION

Where:

  • SOURCE is the transfer source, specified with a scheme such as gs:// (Cloud Storage), s3:// (Amazon S3), posix:// (a file system), or hdfs:// (HDFS), or as the HTTP(S) URL of a URL list.
  • DESTINATION is the transfer sink, specified as gs:// (Cloud Storage) or posix:// (a file system).

If the transfer requires transfer agents, agent pool flags such as --source-agent-pool and --destination-agent-pool are available. Additional options cover scheduling, object filtering, and metadata handling; transfers from S3-compatible sources also take options such as --source-endpoint.

To view all options, run gcloud transfer jobs create --help or refer to the gcloud reference documentation.
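
For example, a minimal one-time transfer between two Cloud Storage buckets might look like the following (bucket names are hypothetical):

gcloud transfer jobs create \
  gs://my-source-bucket gs://my-destination-bucket \
  --description="one-time bucket-to-bucket copy"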

Examples

Amazon S3 to Cloud Storage

See Transfer from Amazon S3 to Cloud Storage.

S3-compatible storage to Cloud Storage

See Transfer from S3-compatible storage to Cloud Storage.

File system to Cloud Storage

See Transfer from a file system to Cloud Storage.

Cloud Storage to file system

To transfer from a Cloud Storage bucket to a file system, specify the bucket, the destination directory, and the destination agent pool:

gcloud transfer jobs create \
  gs://my-storage-bucket posix:///tmp/destination \
  --destination-agent-pool=my-destination-agent-pool

File system to file system

To transfer between two file systems, you must specify a source agent pool, a destination agent pool, and an intermediate Cloud Storage bucket through which the data passes.

See Create a Cloud Storage bucket as an intermediary for details on the intermediate bucket.

Then, specify these three resources when calling transfer jobs create:

gcloud transfer jobs create \
  posix:///tmp/source/on/systemA posix:///tmp/destination/on/systemB \
  --source-agent-pool=source_agent_pool \
  --destination-agent-pool=destination_agent_pool \
  --intermediate-storage-path=gs://my-intermediary-bucket

REST

The following samples show you how to use Storage Transfer Service through the REST API.

When you configure or edit transfer jobs using the Storage Transfer Service API, the time must be in UTC. For more information on specifying the schedule of a transfer job, see Schedule.
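
For example, to run a job daily starting at 14:00 UTC, the schedule object might look like the following sketch (the dates are hypothetical; repeatInterval is expressed in seconds):

"schedule": {
    "scheduleStartDate": {
        "day": 1,
        "month": 6,
        "year": 2030
    },
    "startTimeOfDay": {
        "hours": 14,
        "minutes": 0
    },
    "repeatInterval": "86400s"
}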

Transfer between Cloud Storage buckets

In this example, you'll learn how to move files from one Cloud Storage bucket to another. For example, you can move data to a bucket in another location.

Note: The process is the same if the bucket is located in a different project.

Request using transferJobs create:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "YOUR DESCRIPTION",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "schedule": {
      "scheduleStartDate": {
          "day": 1,
          "month": 1,
          "year": 2015
      },
      "startTimeOfDay": {
          "hours": 1,
          "minutes": 1
      }
  },
  "transferSpec": {
      "gcsDataSource": {
          "bucketName": "GCS_SOURCE_NAME"
      },
      "gcsDataSink": {
          "bucketName": "GCS_SINK_NAME"
      },
      "transferOptions": {
          "deleteObjectsFromSourceAfterTransfer": true
      }
  }
}
Response:
200 OK
{
  "transferJob": [
      {
          "creationTime": "2015-01-01T01:01:00.000000000Z",
          "description": "YOUR DESCRIPTION",
          "name": "transferJobs/JOB_ID",
          "status": "ENABLED",
          "lastModificationTime": "2015-01-01T01:01:00.000000000Z",
          "projectId": "PROJECT_ID",
          "schedule": {
              "scheduleStartDate": {
                  "day": 1,
                  "month": 1,
                  "year": 2015
              },
              "startTimeOfDay": {
                  "hours": 1,
                  "minutes": 1
              }
          },
          "transferSpec": {
              "gcsDataSource": {
                  "bucketName": "GCS_SOURCE_NAME",
              },
              "gcsDataSink": {
                  "bucketName": "GCS_NEARLINE_SINK_NAME"
              },
              "objectConditions": {
                  "minTimeElapsedSinceLastModification": "2592000.000s"
              },
              "transferOptions": {
                  "deleteObjectsFromSourceAfterTransfer": true
              }
          }
      }
  ]
}
Transfer from Amazon S3 to Cloud Storage

See Transfer from Amazon S3 to Cloud Storage.

Transfer between Microsoft Azure Blob Storage and Cloud Storage

In this example, you'll learn how to move files from Microsoft Azure Storage to a Cloud Storage bucket, using a Microsoft Azure Storage shared access signature (SAS) token.

For more information on Microsoft Azure Storage SAS, see Grant limited access to Azure Storage resources using shared access signatures (SAS).

Before starting, review Configure access to Microsoft Azure Storage and Pricing to understand the implications of moving data from Microsoft Azure Storage to Cloud Storage.

Storage Transfer Service can transfer data from supported Microsoft Azure Storage regions.

Request using transferJobs create:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "YOUR DESCRIPTION",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "schedule": {
      "scheduleStartDate": {
          "day": 14,
          "month": 2,
          "year": 2020
      },
      "scheduleEndDate": {
          "day": 14
          "month": 2,
          "year": 2020
      },
      "startTimeOfDay": {
          "hours": 1,
          "minutes": 1
      }
  },
  "transferSpec": {
      "azureBlobStorageDataSource": {
          "storageAccount": "AZURE_SOURCE_NAME",
          "azureCredentials": {
              "sasToken": "AZURE_SAS_TOKEN",
          },
          "container": "AZURE_CONTAINER",
      },
      "gcsDataSink": {
          "bucketName": "GCS_SINK_NAME"
      }
  }
}
Response:
200 OK
{
  "transferJob": [
      {
          "creationTime": "2020-02-14T01:01:00.000000000Z",
          "description": "YOUR DESCRIPTION",
          "name": "transferJobs/JOB_ID",
          "status": "ENABLED",
          "lastModificationTime": "2020-02-14T01:01:00.000000000Z",
          "projectId": "PROJECT_ID",
          "schedule": {
              "scheduleStartDate": {
                  "day": 14
                  "month": 2,
                  "year": 2020
              },
              "scheduleEndDate": {
                  "day": 14,
                  "month": 2,
                  "year": 2020
              },
              "startTimeOfDay": {
                  "hours": 1,
                  "minutes": 1
              }
          },
          "transferSpec": {
              "azureBlobStorageDataSource": {
                  "storageAccount": "AZURE_SOURCE_NAME",
                  "azureCredentials": {
                      "sasToken": "AZURE_SAS_TOKEN",
                  },
                  "container": "AZURE_CONTAINER",
              },
              "objectConditions": {},
              "transferOptions": {}
          }
      }
  ]
}
Transfer from a file system

See Transfer from a file system to Cloud Storage.

Specifying source and destination paths

Source and destination paths enable you to specify source and destination directories when transferring data to your Cloud Storage bucket. For example, consider that you have files file1.txt and file2.txt and a Cloud Storage bucket named B. If you set a destination path named my-stuff, then after the transfer completes your files are located at gs://B/my-stuff/file1.txt and gs://B/my-stuff/file2.txt.

Specifying a source path

To specify a source path when creating a transfer job, add a path field to the gcsDataSource field in your TransferSpec:

{
  "gcsDataSource": {
    "bucketName": "SOURCE_BUCKET",
    "path": "SOURCE_PATH/"
  }
}

In this example, SOURCE_BUCKET is the name of the source bucket and SOURCE_PATH/ is the path within that bucket to transfer from; the path must end with a forward slash (/).

Specifying a destination path

To specify a destination folder when you create a transfer job, add a path field to the gcsDataSink field in your TransferSpec:

{
  "gcsDataSink": {
    "bucketName": "DESTINATION_BUCKET",
    "path": "DESTINATION_PATH/"
  }
}

In this example, DESTINATION_BUCKET is the name of the destination bucket and DESTINATION_PATH/ is the path within that bucket to transfer objects into; the path must end with a forward slash (/).

Complete example request

The following is an example of a full request:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "YOUR DESCRIPTION",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "schedule": {
      "scheduleStartDate": {
          "day": 1,
          "month": 1,
          "year": 2015
      },
      "startTimeOfDay": {
          "hours": 1,
          "minutes": 1
      }
  },
  "transferSpec": {
      "gcsDataSource": {
          "bucketName": "GCS_SOURCE_NAME",
          "path": "GCS_SOURCE_PATH",
      },
      "gcsDataSink": {
          "bucketName": "GCS_SINK_NAME",
          "path": "GCS_SINK_PATH",
      },
      "objectConditions": {
          "minTimeElapsedSinceLastModification": "2592000s"
      },
      "transferOptions": {
          "deleteObjectsFromSourceAfterTransfer": true
      }
  }
}
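
You can submit a request body like the one above with curl, authenticating with an access token. This is a sketch; the file name request.json is hypothetical:

curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d @request.json \
  https://storagetransfer.googleapis.com/v1/transferJobs
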
Client libraries

The following samples show you how to use Storage Transfer Service programmatically with Go, Java, Node.js, and Python.

When you configure or edit transfer jobs programmatically, the time must be in UTC. For more information on specifying the schedule of a transfer job, see Schedule.

For more information about the Storage Transfer Service client libraries, see Getting started with Storage Transfer Service client libraries.

Transfer between Cloud Storage buckets

In this example, you'll learn how to move files from one Cloud Storage bucket to another. For example, you can move data to a bucket in another location.

Note: The process is the same if the bucket is located in a different project.

Transfer from Amazon S3 to Cloud Storage

See Transfer from Amazon S3 to Cloud Storage.

Transfer between Microsoft Azure Blob Storage and Cloud Storage

In this example, you'll learn how to move files from Microsoft Azure Storage to a Cloud Storage bucket, using a Microsoft Azure Storage shared access signature (SAS) token.

For more information on Microsoft Azure Storage SAS, see Grant limited access to Azure Storage resources using shared access signatures (SAS).

Before starting, review Configure access to Microsoft Azure Storage and Pricing to understand the implications of moving data from Microsoft Azure Storage to Cloud Storage.

Storage Transfer Service can transfer data from supported Microsoft Azure Storage regions.

Go

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Go API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Java

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Java API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Node.js

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Node.js API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Python

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Python API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

Transfer from a file system

See Transfer from a file system to Cloud Storage.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-10-02 UTC.

[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-10-02 UTC."],[],[]]


RetroSearch is an open source project built by @garambo | Open a GitHub Issue

Search and Browse the WWW like it's 1997 | Search results from DuckDuckGo

HTML: 3.2 | Encoding: UTF-8 | Version: 0.7.5