This page describes how to export data from and import data into Cloud SQL instances using BAK files, and how to import data into Cloud SQL instances using transaction log files.
Note: In Cloud SQL, SQL Server currently supports the export of native BAK files. If you're exporting to create a new instance from the exported file, consider restoring from a backup to a different instance or cloning the instance.
Warning: Don't use a BAK file created from a read-only database or from a database that's in single-user mode. Importing such a file might cause an error.

Before you begin

Important: Before starting a large export, ensure that at least 25 percent of the database size is free on the instance. Doing so helps prevent issues with aggressive autogrowth, which can affect the availability of the instance.

Exports use database resources, but they don't interfere with normal database operations unless the instance is under-provisioned.
For best practices, see Best Practices for Importing and Exporting Data.
After completing an import operation, verify the results.
Export data from Cloud SQL for SQL Server

Cloud SQL supports the export of native BAK files.
If you aim to create a new instance from an exported file, then consider restoring from a backup to a different instance or cloning the instance.
Cloud SQL performs a full backup of the selected database during an export operation.
Note: For information about striped export, see Use striped export.

Required roles and permissions for exporting from Cloud SQL for SQL Server

To export data from Cloud SQL into Cloud Storage, the user initiating the export must have a role that includes the following permissions:

cloudsql.instances.get
cloudsql.instances.export
Additionally, the service account for the Cloud SQL instance must have one of the following:

The storage.objectAdmin Identity and Access Management (IAM) role
A custom role that includes the following permissions:
storage.objects.create
storage.objects.list (for striped export and transaction log export)
storage.objects.delete (for striped export and transaction log export)
storage.buckets.get (for transaction log export only)

For help with IAM roles, see Identity and Access Management.
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Export data to a BAK file from Cloud SQL for SQL Server

Note: You can't export a database snapshot to a BAK file.

Console

In the Google Cloud console, go to the Cloud SQL Instances page.
gcloud

Find the service account for the Cloud SQL instance that you're exporting from by using the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.

gcloud sql instances describe INSTANCE_NAME

Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For more information about setting IAM permissions, see Using IAM permissions.

Export the database:

gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/FILENAME \
--database=DATABASE_NAME

For information about using the gcloud sql export bak command, see the command reference page.
REST v1

Create a Cloud Storage bucket for the export.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

This step isn't required, but strongly recommended, so you don't open up access to any other data.

Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
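A basic export request body follows the same pattern as the later examples on this page; a minimal sketch (offload enables serverless export, which is described later on this page):

{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE",
    "databases": ["DATABASE_NAME"],
    "offload": TRUE | FALSE
  }
}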
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"

PowerShell (Windows)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_ID",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID",
  "targetProject": "PROJECT_ID"
}
REST v1beta4

Create a Cloud Storage bucket for the export.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

This step isn't required, but strongly recommended, so you don't open up access to any other data.

Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
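As in the REST v1 example, a minimal sketch of the request body:

{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE",
    "databases": ["DATABASE_NAME"],
    "offload": TRUE | FALSE
  }
}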
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export"

PowerShell (Windows)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_ID",
  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/operations/OPERATION_ID",
  "targetProject": "PROJECT_ID"
}
Export differential database backups

Before you export a differential database backup, you must export a differential base.
If other services or features, such as point-in-time recovery and read replica, trigger a full backup between your full backup export and differential backup export, then you must trigger a full backup export again.
To understand this better, consider the following example:
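For instance, suppose that you export a full backup with --differential-base at 10:00 AM, and point-in-time recovery triggers an automated full backup at 11:00 AM. The 11:00 AM backup becomes the most recent differential base, so a differential backup that you export at noon is based on it rather than on your 10:00 AM export, and it can't be restored on top of the base file that you exported. In that case, export a new full backup with --differential-base before exporting differential backups again.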
Cloud SQL doesn't support database export requests with --differential-base or --bak-type=DIFF on replica instances.
gcloud

Find the service account for the Cloud SQL instance that you're exporting from by using the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.

gcloud sql instances describe INSTANCE_NAME

Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For more information about setting IAM permissions, see Using IAM permissions.

Export the database as the differential base.
gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BASE_FILENAME \
--database=DATABASE_NAME --differential-base
For information about using the gcloud sql export bak command, see the command reference page.
Export a differential backup.
gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BACKUP_FILENAME \
--database=DATABASE_NAME --bak-type=DIFF
For information about using the gcloud sql export bak command, see the command reference page.
REST v1

Create a Cloud Storage bucket for the export.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

This step isn't required, but strongly recommended, so you don't open up access to any other data.

Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:
offload: to enable and use serverless export, set this value to TRUE.

Note: Serverless export costs extra. See the pricing page.

HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE",
    "databases": ["DATABASE_NAME"],
    "offload": TRUE | FALSE,
    "bakExportOptions": {
      "differentialBase": true
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "EXPORT",
  "exportContext": {
    "uri": {uri},
    "databases": [DATABASE_NAME],
    "kind": "sql#exportContext",
    "fileType": "BAK",
    "bakExportOptions": {
      "bakType": FULL,
      "differentialBase": true
    }
  },
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_ID",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID",
  "targetProject": "PROJECT_ID"
}
Before using any of the request data, make the following replacements:

offload: set this value to true to use serverless export.

Note: Serverless export costs extra. See the pricing page.

HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export
Request JSON body:
{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_dump_file",
    "databases": ["database_name"],
    "offload": true | false,
    "bakExportOptions": {
      "bakType": "DIFF"
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "EXPORT",
  "exportContext": {
    "uri": {uri},
    "databases": [database_name],
    "kind": "sql#exportContext",
    "fileType": "BAK",
    "bakExportOptions": {
      "bakType": DIFF,
      "differentialBase": false
    }
  },
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
REST v1beta4

Create a Cloud Storage bucket for the export.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

This step isn't required, but strongly recommended, so you don't open up access to any other data.

Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:

offload: to use serverless export, set this value to true.

Note: Serverless export costs extra. For more information, see the pricing page.

HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_dump_file",
    "databases": ["database_name"],
    "offload": true | false,
    "bakExportOptions": {
      "differentialBase": true
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "EXPORT",
  "exportContext": {
    "uri": {uri},
    "databases": [database_name],
    "kind": "sql#exportContext",
    "fileType": "BAK",
    "bakExportOptions": {
      "bakType": FULL,
      "differentialBase": true
    }
  },
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
Before using any of the request data, make the following replacements:

offload: to use serverless export, set this value to true.

Note: Serverless export costs extra. For more information, see the pricing page.

HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_dump_file",
    "databases": ["database_name"],
    "offload": true | false,
    "bakExportOptions": {
      "bakType": "DIFF"
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "EXPORT",
  "exportContext": {
    "uri": {uri},
    "databases": [database_name],
    "kind": "sql#exportContext",
    "fileType": "BAK",
    "bakExportOptions": {
      "bakType": DIFF,
      "differentialBase": false
    }
  },
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
Export transaction logs

You can export the transaction logs of any Cloud SQL for SQL Server instance that has point-in-time recovery (PITR) enabled and that stores its logs in Cloud Storage.
gcloud

Create a Cloud Storage bucket for the export.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step isn't required, but strongly recommended, so you don't open up access to any other data.
Find the service account for the Cloud SQL instance that you're exporting from by using the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.

gcloud sql instances describe INSTANCE_NAME

Use gcloud storage buckets add-iam-policy-binding to grant the storage.Admin IAM role to the service account. For more information about setting IAM permissions, see Set and manage IAM policies on buckets.

Export the transaction logs.
gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/FOLDER_PATH \
--export-log-start-time=START_DATE_AND_TIME \
--export-log-end-time=END_DATE_AND_TIME \
--database=DATABASE_NAME --bak-type=TLOG
Note: The --export-log-start-time and --export-log-end-time parameters are optional. The values for these parameters must be in the UTC time zone and in RFC 3339 format. For example: 2024-05-26T16:19:00.094Z.

If you don't provide a start date and time and an end date and time for these parameters, then Cloud SQL exports all transaction logs within the log retention period to the Cloud Storage bucket. The log retention period can range from 1 to 35 days for Cloud SQL Enterprise Plus edition and 1 to 7 days for Cloud SQL Enterprise edition.
If you want to export transaction logs continually, then don't use the --export-log-start-time parameter, and always export to the same location in Cloud Storage. Cloud SQL skips any log files that already exist in the destination, so they aren't exported repeatedly.
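For example, a scheduled job can run the same command on each invocation, omitting --export-log-start-time and always targeting the same folder (the folder name here is illustrative):

gcloud sql export bak INSTANCE_NAME gs://BUCKET_NAME/tlog-exports \
--database=DATABASE_NAME --bak-type=TLOG

Each run exports only the log files that aren't already present in the destination.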
For information about using the gcloud sql export bak command, see the command reference page.
REST v1

Create a Cloud Storage bucket.

This step isn't required, but strongly recommended, so you don't open up access to any other data.

Provide your instance with the storage.Admin IAM role for your bucket. For more information about setting IAM permissions, see Set and manage IAM policies on buckets.

Before using any of the request data, make the following replacements:
exportLogStartTime: the start date and time of the transaction logs to export.
exportLogEndTime: the end date and time of the transaction logs to export.

Note: The exportLogStartTime and exportLogEndTime parameters are optional. The values for these parameters must be in the UTC time zone and in RFC 3339 format. For example: 2024-05-26T16:19:00.094Z.

If you don't provide a start date and time and an end date and time for these parameters, then Cloud SQL exports all transaction logs within the log retention period to the Cloud Storage bucket.

If you want to export transaction logs continually, then don't use the exportLogStartTime parameter, and always export to the same location in Cloud Storage. Cloud SQL skips any log files that already exist in the destination, so they aren't exported repeatedly.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/FOLDER_PATH",
    "databases": ["DATABASE_NAME"],
    "bakExportOptions": {
      "bakType": "TLOG",
      "exportLogStartTime": "START_DATE_AND_TIME",
      "exportLogEndTime": "END_DATE_AND_TIME"
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "EXPORT",
  "exportContext": {
    "uri": {uri},
    "databases": [DATABASE_NAME],
    "kind": "sql#exportContext",
    "fileType": "BAK",
    "bakExportOptions": {
      "bakType": TLOG,
      "differentialBase": false
    }
  },
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_ID",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID",
  "targetProject": "PROJECT_ID"
}
Use striped export

Striped export offers advantages, chiefly improved export performance.
A potential disadvantage of using striped export is that the backup, rather than consisting of one file, is split across a set of files. This set is called a "stripe set"; see Backup devices in a striped media set (a stripe set). In Cloud SQL, you export to an empty folder in Cloud Storage instead of generating a single file. For more information, see How to use striped export.
Planning your operations

Striped export can improve the performance of exports. However, if your use case requires a single output file, or if your database is smaller than 5 TB and faster performance isn't critical, you may want to use a non-striped export.

If you decide to use striped export, consider the number of stripes. You can specify this value in your gcloud CLI command or REST API call. However, if you want an optimal number of stripes for performance, or if you don't know what number to use, omit it; Cloud SQL then sets an optimal number of stripes automatically.

The maximum number of stripes currently supported by Cloud SQL for SQL Server is 64.
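For example, the following command requests a striped export and omits the stripe count so that Cloud SQL chooses an optimal value automatically (a sketch using the same placeholders as the steps that follow):

gcloud beta sql export bak INSTANCE_NAME \
gs://BUCKET_NAME/STRIPED_EXPORT_FOLDER \
--database=DATABASE_NAME --striped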
How to use striped export

gcloud

Find the service account for the Cloud SQL instance that you're exporting from by using the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.

gcloud sql instances describe INSTANCE_NAME

Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For more information about setting IAM permissions, see Using IAM permissions.

To export, use the --striped parameter and/or specify a value for --stripe_count. Setting a value for --stripe_count implies that the --striped parameter is intended. An error occurs if you specify --no-striped but also specify a value for --stripe_count:
gcloud beta sql export bak INSTANCE_NAME \
gs://BUCKET_NAME/STRIPED_EXPORT_FOLDER \
--database=DATABASE_NAME --striped --stripe_count=NUMBER
For information about using the gcloud beta sql export bak command, see the command reference page.
REST v1

Create a Cloud Storage bucket for the export.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

This step isn't required, but strongly recommended, so you don't open up access to any other data.

Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:

striped: set to true to use striped export. If you specify true without specifying a stripe count, an optimal number of stripes is set automatically.
stripeCount: the number of stripes. If you specify a stripe count, striped is implied as true.

HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export
Request JSON body:
{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_folder",
    "databases": ["database_name"],
    "bakExportOptions": {
      "striped": true | false,
      "stripeCount": number_of_stripes
    }
  }
}
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export"

PowerShell (Windows)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2022-09-29T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
REST v1beta4

Create a Cloud Storage bucket for the export.

gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME

This step isn't required, but is strongly recommended, so you don't open up access to any other data.

Provide your instance with the legacyBucketWriter IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:

striped: set to true to use striped export. If you specify true without specifying a stripe count, an optimal number of stripes is set automatically.
stripeCount: the number of stripes. If you specify a stripe count, striped is implied as true.

HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{
  "exportContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_folder",
    "databases": ["database_name"],
    "bakExportOptions": {
      "striped": true | false,
      "stripeCount": number_of_stripes
    }
  }
}
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

PowerShell (Windows)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2022-09-29T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
Required roles and permissions for importing to Cloud SQL for SQL Server

To import data from Cloud Storage into Cloud SQL, the user initiating the import must have a role that includes the following permissions:
cloudsql.instances.get
cloudsql.instances.import
Additionally, the service account for the Cloud SQL instance must have one of the following:

The storage.objectAdmin IAM role
A custom role that includes the following permissions:
storage.objects.get
storage.objects.list (for striped import only)

For help with IAM roles, see Identity and Access Management.
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Import data from a BAK file to Cloud SQL for SQL Server

To use striped import, see Use striped import.
Various import frameworks are available. For example, Cloud SQL for SQL Server supports change data capture (CDC) for the following database versions:
When importing a CDC-enabled database, the KEEP_CDC flag is retained.
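If you import a CDC-enabled database, you can confirm that CDC remains enabled afterward with a standard SQL Server query; a minimal check (the database name is a placeholder):

SELECT name, is_cdc_enabled
FROM sys.databases
WHERE name = 'DATABASE_NAME';
-- is_cdc_enabled = 1 means CDC is enabled for the database.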
Note: You can't import a database that was exported from a higher version of SQL Server, or import from a higher compatibility level into a lower one. For example, if you exported from a SQL Server 2017 instance, you can't import into a SQL Server 2014 instance.

If your instance version is Microsoft SQL Server Enterprise Edition, then you can import encrypted BAK files. Microsoft SQL Server Standard Edition instances can also import encrypted BAK files, but only through the gcloud CLI.
The only supported BAK extensions are .bak and .bak.gz. GPG encrypted backups are not currently supported.
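Before uploading a BAK file, you can verify on the source server that it contains a single backup set, which Cloud SQL requires for full backup imports; a standard T-SQL check (the file path is illustrative):

RESTORE HEADERONLY FROM DISK = N'C:\Backups\DATABASE_NAME.bak';
-- Each row in the result corresponds to one backup set in the file.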
For the instructions below, prepare to specify a new database; don't create a database before starting the import of your BAK file.
Note: Cloud SQL only supports importing a full backup with a single backup set.

To import data to a Cloud SQL instance using a BAK file:
Console

In the Google Cloud console, go to the Cloud SQL Instances page.
You can import a compressed (.gz) or an uncompressed file.
In the File format section, select BAK.
If you're importing data into a Cloud SQL Enterprise Plus edition instance, then you can import encrypted BAK files. Cloud SQL Enterprise edition instances also support importing encrypted BAK files, but only through the gcloud CLI.
To import an encrypted BAK file, do the following:
gcloud

Create a Cloud Storage bucket for the import.
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step isn't required, but strongly recommended, so you don't open up access to any other data.
Describe the instance that you're importing to:

gcloud sql instances describe INSTANCE_NAME

Copy the serviceAccountEmailAddress field.

Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.

Import the data from the file:

gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME

For encrypted imports, use the following command:
gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--cert-path=gs://BUCKET_NAME/CERTIFICATE_NAME \
--pvk-path=gs://BUCKET_NAME/KEY_NAME \
--prompt-for-pvk-password
If you don't need to retain the IAM permissions that you set previously, remove them by using gcloud storage buckets remove-iam-policy-binding.

REST v1

Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_bak_file",
    "database": "database_name"
  }
}
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"

PowerShell (Windows)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
To use a different user for the import, specify the importContext.importUser property.
REST v1beta4

Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_bak_file",
    "database": "database_name"
  }
}
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"

PowerShell (Windows)

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "UPDATE",
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
To use a different user for the import, specify the importContext.importUser property.
If you get an error such as ERROR_RDBMS, ensure that the BAK file exists in the bucket and that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
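For example, you can review which principals have access to the bucket with the following command (the bucket name is a placeholder):

gcloud storage buckets get-iam-policy gs://BUCKET_NAME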
Import differential database backups

Before you import a differential database backup, you must import a full backup, and your database must be in the RESTORING state after the full backup import.
Cloud SQL doesn't support importing differential database backups on instances that have point-in-time recovery enabled. This is because importing a database backup with --no-recovery is a prerequisite for importing differential database backups. Additionally, you can't enable point-in-time recovery on an instance if the database is in the RESTORING state. If an import fails, do one of the following to enable point-in-time recovery:

Bring the database that's in the RESTORING state online by using the --recovery-only flag.
Remove the database.
To import data to a Cloud SQL instance using a differential database backup, perform the following steps:
gcloud

Create a Cloud Storage bucket for the import.
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step isn't mandatory, but we strongly recommend that you perform it so that you don't open up access to any other data.
Describe the instance that you're importing to:

gcloud sql instances describe INSTANCE_NAME

Copy the serviceAccountEmailAddress field.

Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.

Import a full backup with --no-recovery.
gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BASE_FILENAME \
--database=DATABASE_NAME --bak-type=FULL --no-recovery
Import a differential database backup.
gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/DIFFERENTIAL_BACKUP_FILENAME \
--database=DATABASE_NAME --bak-type=DIFF --no-recovery
After restoring all the backup files, use the --recovery-only flag to bring the imported database online from the RESTORING state. Don't use T-SQL commands to bring the imported database online.
gcloud sql import bak INSTANCE_NAME \
--database=DATABASE_NAME --recovery-only
If you don't need to retain the IAM permissions that you set previously, remove them by using gcloud storage buckets remove-iam-policy-binding.

REST v1

Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

To use a different user for the import, specify the importContext.importUser property. For the complete list of parameters for the request, see the instances:import page.

Import a full backup by using noRecovery.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE",
    "database": "DATABASE_NAME",
    "bakImportOptions": {
      "noRecovery": true,
      "bakType": "FULL"
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",
  "status": "PENDING",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "IMPORT",
  "importContext": {
    "uri": {uri},
    "database": DATABASE_NAME,
    "kind": "sql#importContext",
    "fileType": "BAK",
    "bakImportOptions": {
      "noRecovery": true,
      "bakType": FULL,
      "recoveryOnly": false
    }
  },
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_ID",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID",
  "targetProject": "PROJECT_ID"
}
Import a differential database backup.

Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_bak_file",
    "database": "database_name",
    "bakImportOptions": {
      "bakType": "DIFF",
      "noRecovery": true
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "IMPORT",
  "importContext": {
    "uri": {uri},
    "database": database_name,
    "kind": "sql#importContext",
    "fileType": "BAK",
    "bakImportOptions": {
      "noRecovery": false,
      "bakType": DIFF,
      "recoveryOnly": false
    }
  },
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
After restoring all the backup files, use the recoveryOnly flag to bring the imported database online from the RESTORING state. Don't use T-SQL commands to bring the imported database online.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE",
    "database": "DATABASE_NAME",
    "bakImportOptions": {
      "recoveryOnly": true
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID",
  "status": "PENDING",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "IMPORT",
  "importContext": {
    "uri": {uri},
    "database": DATABASE_NAME,
    "kind": "sql#importContext",
    "fileType": "BAK",
    "bakImportOptions": {
      "recoveryOnly": true
    }
  },
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_ID",
  "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID",
  "targetProject": "PROJECT_ID"
}
REST v1beta4

Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.

To use a different user for the import, specify the importContext.importUser property. For the complete list of parameters for the request, see the instances:import page.

Import a full backup by using noRecovery.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE",
    "database": "DATABASE_NAME",
    "bakImportOptions": {
      "noRecovery": true,
      "bakType": "FULL"
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/TARGET_INSTANCE_ID",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "IMPORT",
  "importContext": {
    "uri": {uri},
    "database": DATABASE_NAME,
    "kind": "sql#importContext",
    "fileType": "BAK",
    "bakImportOptions": {
      "noRecovery": true,
      "bakType": FULL,
      "recoveryOnly": false
    }
  },
  "name": "OPERATION_ID",
  "targetId": "INSTANCE_ID",
  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/operations/operation-id",
  "targetProject": "PROJECT-ID"
}
Import a differential database backup.

Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://bucket_name/path_to_bak_file",
    "database": "database_name",
    "bakImportOptions": {
      "bakType": "DIFF",
      "noRecovery": true
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response

{
  "kind": "sql#operation",
  "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id",
  "status": "PENDING",
  "user": "user@example.com",
  "insertTime": "2020-01-21T22:43:37.981Z",
  "operationType": "IMPORT",
  "importContext": {
    "uri": {uri},
    "database": database_name,
    "kind": "sql#importContext",
    "fileType": "BAK",
    "bakImportOptions": {
      "noRecovery": false,
      "bakType": DIFF,
      "recoveryOnly": false
    }
  },
  "name": "operation-id",
  "targetId": "instance-id",
  "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id",
  "targetProject": "project-id"
}
After restoring all the backup files, use the recoveryOnly flag to bring the imported database online from the RESTORING state. Don't use T-SQL commands to bring the imported database online.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{
  "importContext": {
    "fileType": "BAK",
    "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE",
    "database": "DATABASE_NAME",
    "bakImportOptions": {
      "recoveryOnly": true
    }
  }
}
To send your request, choose one of these options:
curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "IMPORT", "importContext": { "uri": {uri}, "database": DATABASE_NAME, "kind": "sql#importContext", "fileType": "BAK", "bakImportOptions": { "recoveryOnly": true } }, "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }
If you get an error such as ERROR_RDBMS, then ensure that the BAK file exists in the bucket and that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
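As a quick diagnostic sketch (substitute your own bucket and object names), you can confirm that the BAK file exists and inspect who holds roles on the bucket:

# Verify that the backup object is present.
gcloud storage ls gs://BUCKET_NAME/PATH_TO_BAK_FILE

# Review the bucket's IAM policy for the instance's service account.
gcloud storage buckets get-iam-policy gs://BUCKET_NAME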
Import data using transaction log files
A transaction log is a record of your database's transactions and the modifications made by each transaction. You can use it to re-establish database consistency in the event of a system failure.
Note: The --no-recovery flag is a prerequisite for importing transaction log backups. Additionally, you can't enable point-in-time recovery on an instance if the database is in the RESTORING state. If an import fails, you can enable point-in-time recovery by bringing the database that's in the RESTORING state online with the --recovery-only flag.
To import data to a Cloud SQL instance using a transaction log backup, perform the following steps:
gcloud
Optional: Create a Cloud Storage bucket for the import.
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
Find the service account for your Cloud SQL instance:
gcloud sql instances describe INSTANCE_NAME
Look for the serviceAccountEmailAddress field in the output.
Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.
Import a full backup using the --no-recovery parameter. Ensure that your database is in the RESTORING state after the full backup import.
gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/BACKUP_FILENAME \
--database=DATABASE_NAME --bak-type=FULL --no-recovery
Import a transaction log backup.
gcloud sql import bak INSTANCE_NAME gs://BUCKET_NAME/BACKUP_FILENAME \
--database=DATABASE_NAME --bak-type=TLOG \
--stop-at=STOP_AT_TIMESTAMP --stop-at-mark=STOP_AT_MARK_NAME --no-recovery
Replace the placeholders with your values. If you set STOP_AT_MARK_NAME to lsn:log-sequence-number, then the transaction log import stops at the given log sequence number. A consolidated sketch with example values follows these steps.
After restoring all the backup files, use the --recovery-only flag to bring the imported database online from a RESTORING state. Users are strongly encouraged not to use T-SQL commands to bring the imported database online.
gcloud sql import bak INSTANCE_NAME \
--database=DATABASE_NAME --recovery-only
If you don't need to retain the IAM role that you set previously, remove it by using gcloud storage buckets remove-iam-policy-binding.
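To make the sequence concrete, the following is a minimal end-to-end sketch. The instance name (sqlserver-1), bucket and file names (my-sql-backups, mydb-full.bak, mydb-log1.trn), database name (mydb), and the --stop-at timestamp format are all hypothetical example values, not defaults:

# Restore the full backup; the database remains in the RESTORING state.
gcloud sql import bak sqlserver-1 gs://my-sql-backups/mydb-full.bak \
--database=mydb --bak-type=FULL --no-recovery

# Replay a transaction log backup, stopping at a point in time.
gcloud sql import bak sqlserver-1 gs://my-sql-backups/mydb-log1.trn \
--database=mydb --bak-type=TLOG --stop-at=2024-01-15T13:00:00Z --no-recovery

# Bring the database online after all backups are applied.
gcloud sql import bak sqlserver-1 --database=mydb --recovery-only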
REST v1
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
Import a full backup using noRecovery. Ensure that your database is in the RESTORING state after the full backup import.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "noRecovery": true, "bakType": "FULL", } } }
To send your request, choose one of these options:
curl
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "IMPORT", "importContext": { "uri": {uri}, "database": DATABASE_NAME, "kind": "sql#importContext", "fileType": "BAK", "bakImportOptions": { "noRecovery": true, "bakType": FULL, "recoveryOnly": false } }, "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }
Import a transaction log backup.
Before using any of the request data, make the following replacements. If you set stopAtMark to lsn:log-sequence-number, then the transaction log import stops at the given log sequence number.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_TLOG_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "bakType": "TLOG", "stopAt": STOP_AT_TIMESTAMP, "stopAtMark": STOP_AT_MARK_NAME, "noRecovery": true, } } }
To send your request, choose one of these options:
curl
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "IMPORT", "importContext": { "uri": {uri}, "database": DATABASE_NAME, "kind": "sql#importContext", "fileType": "BAK", "bakImportOptions": { "noRecovery": false, "bakType": TLOG, "recoveryOnly": false } }, "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }Repeat this step until all transaction log backups are imported.
After restoring all the backup files, use the recoveryOnly flag to bring the imported database online from a RESTORING state. Users are strongly encouraged not to use T-SQL commands to bring the imported database online.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "recoveryOnly": true, } } }
To send your request, choose one of these options:
curl
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET-INSTANCE_ID", "status": "PENDING", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "IMPORT", "importContext": { "uri": {uri}, "database": DATABASE_NAME, "kind": "sql#importContext", "fileType": "BAK", "bakImportOptions": { "recoveryOnly": true } }, "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }
REST v1beta4
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
To use a different user for the import, specify the importContext.importUser property. For the complete list of parameters for the request, see the instances:import page.
Import a full backup using noRecovery. Ensure that your database is in the RESTORING state after the full backup import.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "noRecovery": true, "bakType": "FULL", } } }
To send your request, choose one of these options:
curl
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "IMPORT", "importContext": { "uri": {uri}, "database": DATABASE_NAME, "kind": "sql#importContext", "fileType": "BAK", "bakImportOptions": { "noRecovery": true, "bakType": FULL, "recoveryOnly": false } }, "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT-ID/operations/operation-id", "targetProject": "PROJECT-ID" }
The stopAt and stopAtMark fields are optional.
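For example, to replay an entire log backup, omit both fields. The following one-request sketch uses hypothetical project, instance, bucket, and database names:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d '{"importContext": {"fileType": "BAK", "uri": "gs://my-sql-backups/mydb-log1.trn", "database": "mydb", "bakImportOptions": {"bakType": "TLOG", "noRecovery": true}}}' \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/my-project/instances/sqlserver-1/import"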
Import a transaction log backup.
Before using any of the request data, make the following replacements. If you set stopAtMark to lsn:log-sequence-number, then the transaction log import stops at the given log sequence number.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "bakType": "TLOG", "stopAt": STOP_AT_TIMESTAMP, "stopAtMark":STOP_AT_MARK_NAME, "noRecovery": true, } } }
To send your request, choose one of these options:
curl
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "IMPORT", "importContext": { "uri": {uri}, "database": DATABASE_NAME, "kind": "sql#importContext", "fileType": "BAK", "bakImportOptions": { "noRecovery": false, "bakType": TLOG, "recoveryOnly": false } }, "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }Repeat this step until all transaction log backups are imported.
After restoring all the backup files, use recoveryOnly to bring the imported database online.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://BUCKET_NAME/PATH_TO_BAK_FILE", "database": "DATABASE_NAME" "bakImportOptions": { "recoveryOnly": true, } } }
To send your request, choose one of these options:
curl
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "IMPORT", "importContext": { "uri": {uri}, "database": DATABASE_NAME, "kind": "sql#importContext", "fileType": "BAK", "bakImportOptions": { "recoveryOnly": true } }, "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }
Use striped import
The advantages of striped import are the following:
- Faster import performance.
- Support for importing databases larger than 5 TB.
A potential disadvantage of using striped import is that all of the files in the striped set (rather than a single file) must be uploaded to the same folder in your Cloud Storage bucket before you perform the import.
Planning your operations
In most use cases, striped import enables better performance with no disadvantages. However, if you can't back up to a striped set from a given instance, or if your database is smaller than 5 TB and faster performance isn't critical, you might want to use a non-striped import.
How to use striped import
gcloud
Create a Cloud Storage bucket for the import.
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step isn't required, but strongly recommended, so you don't open up access to any other data.
Find the service account for your Cloud SQL instance:
gcloud sql instances describe INSTANCE_NAME
Look for the serviceAccountEmailAddress field in the output.
Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectViewer IAM role to the service account for the bucket. For more information about setting IAM permissions, see Using IAM permissions.
Import the data from the folder by using the --striped parameter:
gcloud beta sql import bak INSTANCE_NAME gs://BUCKET_NAME/FOLDER_NAME \
--database=DATABASE_NAME --striped
If you don't need to retain the IAM role that you set previously, remove it by using gcloud storage buckets remove-iam-policy-binding. A short end-to-end sketch follows these steps.
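As an end-to-end sketch with hypothetical names (bucket my-sql-backups, folder striped-set, stripe files mydb-stripe1.bak and mydb-stripe2.bak, instance sqlserver-1, database mydb), upload every stripe into one folder and then import the folder:

# Upload all stripes of the backup set into a single folder.
gcloud storage cp mydb-stripe1.bak mydb-stripe2.bak gs://my-sql-backups/striped-set/

# Import the whole folder as one striped import.
gcloud beta sql import bak sqlserver-1 gs://my-sql-backups/striped-set \
--database=mydb --striped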
REST v1
Upload the files to your bucket. For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
striped: set to true to use striped import.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_folder", "database": "database_name", "bakImportOptions": { "striped": true | false } } }
To send your request, choose one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2022-09-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id", "targetProject": "project-id" }
To use a different user for the import, specify the importContext.importUser property.
REST v1beta4
Upload the files to your bucket. For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For more information about setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
striped: set to true to use striped import.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import
Request JSON body:
{ "importContext": { "fileType": "BAK", "uri": "gs://bucket_name/path_to_folder", "database": "database_name", "bakImportOptions": { "striped": true | false } } }
To send your request, choose one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import"

PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2022-09-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id", "targetProject": "project-id" }
To use a different user for the import, specify the importContext.importUser property.
If you get an error such as ERROR_RDBMS, then ensure that the table exists. If the table exists, then confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
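Because a striped import reads every file in the folder, it can also help to confirm that all stripes were uploaded before retrying. A sketch with a hypothetical bucket and folder:

# List the contents of the striped-set folder to verify that every stripe is present.
gcloud storage ls gs://my-sql-backups/striped-set/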