This page describes how to export data from and import data into Cloud SQL instances by using SQL dump files.
Note: If you're migrating an entire database from a supported database server (on-premises, in AWS, or Cloud SQL) to a new Cloud SQL instance, you can use Database Migration Service instead of exporting and then importing files. If you're exporting to create a new instance from the exported file, consider restoring from a backup to a different instance or cloning the instance.
You can cancel the import of data into Cloud SQL instances and the export of data from them; in both cases, the data is contained in SQL dump files. For more information about cancelling an import or export operation, see Cancel the import and export of data.
Before you begin
Important: Before starting a large export, ensure that at least 25 percent of the database size is free on the instance. Doing so helps prevent issues with aggressive autogrowth, which can affect the availability of the instance.
Exports use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.
For best practices, see Best Practices for Importing and Exporting Data.
After completing an import operation, verify the results.
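For example, one quick check is to connect to the imported database and count rows in a table you expect to be populated. The following is only a sketch: it assumes a connection through the Cloud SQL Auth Proxy listening on 127.0.0.1:5432, and my_table is a hypothetical table name:

# Spot-check that an expected table exists and contains rows after the import.
psql "host=127.0.0.1 port=5432 user=postgres dbname=DATABASE_NAME" \
  -c "SELECT count(*) FROM my_table;"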
Export data from Cloud SQL for PostgreSQL
Required roles and permissions for exporting from Cloud SQL for PostgreSQL
To export data from Cloud SQL into Cloud Storage, the user initiating the export must have a role that includes the following permissions:
cloudsql.instances.get
cloudsql.instances.export
Additionally, the service account for the Cloud SQL instance must have one of the following:
The storage.objectAdmin Identity and Access Management (IAM) role
A custom role that includes the following permissions:
storage.objects.create
storage.objects.list (for exporting files in parallel only)
storage.objects.delete (for exporting files in parallel only)
For help with IAM roles, see Identity and Access Management.
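As an illustration, granting that role on a bucket with the gcloud CLI might look like the following sketch (BUCKET_NAME and SERVICE_ACCOUNT_EMAIL are placeholders; the same command appears in the import steps later on this page):

# Grant the instance's service account object-admin access to the export bucket.
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
  --role=roles/storage.objectAdmin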
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.
Export to a SQL dump file from Cloud SQL for PostgreSQL
When you use Cloud SQL to perform an export, whether from the Google Cloud console, the gcloud CLI, or the API, you are using the pg_dump utility with the options required to ensure that the resulting export file is valid for import back into Cloud SQL.
If you plan to import your data into Cloud SQL, you must follow the instructions provided in Exporting data from an external database server so that your SQL dump file is formatted correctly for Cloud SQL.
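For illustration only, a pg_dump invocation along those lines might look like the following sketch. SOURCE_HOST, USERNAME, and DATABASE_NAME are placeholders, and the flag choices (--format=plain, --no-owner, --no-acl) are assumptions about what keeps the dump importable into Cloud SQL; defer to the linked instructions for the authoritative flag list:

# Dump a single database as plain SQL, omitting ownership and ACL statements
# that may reference roles that don't exist on the Cloud SQL instance.
pg_dump --host=SOURCE_HOST --username=USERNAME \
  --format=plain --no-owner --no-acl \
  DATABASE_NAME > sqldumpfile.sql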
Note: If your data contains large objects (blobs), the export can consume a large amount of memory, impacting instance performance. For help, see Known Issues.
To export data from a database on a Cloud SQL instance to a SQL dump file in a Cloud Storage bucket:
Console
In the Google Cloud console, go to the Cloud SQL Instances page.
gcloud
Find the service account for your Cloud SQL instance by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
gcloud sql instances describe INSTANCE_NAME
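To print only that field, you can add a --format expression, as in this sketch (INSTANCE_NAME is a placeholder):

# Print just the instance's service account email address.
gcloud sql instances describe INSTANCE_NAME \
  --format="value(serviceAccountEmailAddress)"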
Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account. For help with setting IAM permissions, see Using IAM permissions.
Note: If you want to use serverless exports, then use the offload parameter. If you want to include the DROP <object> SQL statement that's required to drop (clean) database objects before you import them, then use the clean parameter. If you want to include the IF EXISTS SQL statement with each DROP statement that's produced by the clean parameter, then use the if-exists parameter. Otherwise, remove these parameters from the following command.
gcloud sql export sql INSTANCE_NAME gs://BUCKET_NAME/sqldumpfile.gz \
--database=DATABASE_NAME \
--offload
The file created by the export sql command does not include triggers or stored procedures, but it does include views. To export triggers or stored procedures, use the pg_dump tool.
For more information about using the export sql command, see the sql export sql command reference page.
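As a sketch of that approach: a plain-format pg_dump of a single database includes its functions and triggers by default. Connecting as postgres through the Cloud SQL Auth Proxy on 127.0.0.1 is an assumption here, and dump_with_routines.sql is a hypothetical filename:

# Dump the database directly with pg_dump so that triggers and
# stored procedures (functions) are included in the output.
pg_dump --host=127.0.0.1 --username=postgres \
  --format=plain --no-owner --no-acl \
  DATABASE_NAME > dump_with_routines.sql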
REST v1
Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME
This step is not required, but strongly recommended, so you do not open up access to any other data.
Provide your instance's service account with the legacyBucketWriter IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
Note: If you want to use serverless exports, then use the offload parameter. If you want to include the DROP <object> SQL statement that's required to drop (clean) database objects before you import them, then use the clean parameter. If you want to include the IF EXISTS SQL statement with each DROP statement that's produced by the clean parameter, then use the ifExists parameter. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.
Serverless exports cost extra. See the pricing page.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{ "exportContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE", "databases": ["DATABASE_NAME"], "offload": TRUE | FALSE, "sqlExportOptions": { "clean": [TRUE|FALSE], "ifExists": [TRUE|FALSE] } } }
To send your request, use one of the following options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/v1/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }
REST v1beta4
Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --project=PROJECT_NAME --location=LOCATION_NAME
This step is not required, but strongly recommended, so you do not open up access to any other data.
Provide your instance's service account with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
Note: If you want to use serverless exports, then use the offload parameter. If you want to include the DROP <object> SQL statement that's required to drop (clean) database objects before you import them, then use the clean parameter. If you want to include the IF EXISTS SQL statement with each DROP statement that's produced by the clean parameter, then use the ifExists parameter. To use these features, set the values of these parameters to TRUE. Otherwise, set their values to FALSE.
Serverless exports cost extra. See the pricing page.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export
Request JSON body:
{ "exportContext": { "fileType": "SQL", "uri": "gs://BUCKET_NAME/PATH_TO_DUMP_FILE", "databases": ["DATABASE_NAME"], "offload": TRUE | FALSE, "sqlExportOptions": { "clean": [TRUE|FALSE], "ifExists": [TRUE|FALSE] } } }
To send your request, use one of the following options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/INSTANCE_ID/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/instances/TARGET_INSTANCE_ID", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "OPERATION_ID", "targetId": "INSTANCE_ID", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/PROJECT_ID/operations/OPERATION_ID", "targetProject": "PROJECT_ID" }
Import data to Cloud SQL for PostgreSQL
Required roles and permissions for importing to Cloud SQL for PostgreSQL
To import data from Cloud Storage into Cloud SQL, the user initiating the import must have a role that includes the following permissions:
cloudsql.instances.get
cloudsql.instances.import
Additionally, the service account for the Cloud SQL instance must have one of the following:
The storage.objectAdmin IAM role
A custom role that includes the following permissions:
storage.objects.get
storage.objects.list (for importing files in parallel only)
For help with IAM roles, see Identity and Access Management.
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.
Import a SQL dump file to Cloud SQL for PostgreSQL
SQL files are plain text files with a sequence of SQL commands.
Console
In the Google Cloud console, go to the Cloud SQL Instances page.
You can import a compressed (.gz) or an uncompressed (.sql) file.
Select the database you want the data to be imported into. This causes Cloud SQL to run the USE DATABASE statement before the import.
Note: If your SQL dump file includes a USE DATABASE statement, it overrides the database you set in the Google Cloud console.
If you want to specify a user to perform the import, select the user. If your import file contains statements that must be performed by a specific user, use this field to specify that user.
gcloud
Upload the file to your bucket. For help with uploading files to buckets, see Uploading objects.
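For example, a minimal sketch with the gcloud CLI (IMPORT_FILE_NAME and BUCKET_NAME are placeholders):

# Upload the dump file to the Cloud Storage bucket.
gcloud storage cp IMPORT_FILE_NAME gs://BUCKET_NAME/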
Describe the instance that you are importing to, and copy the serviceAccountEmailAddress field from the output:
gcloud sql instances describe INSTANCE_NAME
Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the service account for the bucket:
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
--member=serviceAccount:SERVICE-ACCOUNT \
--role=roles/storage.objectAdmin
For help with setting IAM permissions, see Using IAM permissions.
Import the file:
gcloud sql import sql INSTANCE_NAME gs://BUCKET_NAME/IMPORT_FILE_NAME \
--database=DATABASE_NAME
For information about using the import sql command, see the sql import sql command reference page.
If the command returns an error like ERROR_RDBMS, review the permissions; this error is often due to permissions issues.
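As a sketch for that review, list the bucket's IAM bindings and confirm that the instance's service account holds roles/storage.objectAdmin:

# Show the IAM policy on the import bucket.
gcloud storage buckets get-iam-policy gs://BUCKET_NAME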
If you do not need to retain the IAM permissions that you set previously, remove them by using gcloud storage buckets remove-iam-policy-binding.
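A sketch of that cleanup, mirroring the add-iam-policy-binding command shown above:

# Revoke the role granted to the instance's service account.
gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
  --member=serviceAccount:SERVICE-ACCOUNT \
  --role=roles/storage.objectAdmin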
REST v1
Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance's service account with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name" } }
To send your request, use one of the following options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id", "targetProject": "project-id" }
To use a different user for the import, specify the importContext.importUser property.
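As a sketch, the request body shown above extended with that property might look like the following; the user name postgres is a placeholder:

{
  "importContext": {
    "fileType": "SQL",
    "uri": "gs://bucket_name/path_to_sql_file",
    "database": "database_name",
    "importUser": "postgres"
  }
}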
REST v1beta4
Create a SQL dump file. The linked instructions set certain flags that make the dump file compatible with Cloud SQL.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance's service account with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "SQL", "uri": "gs://bucket_name/path_to_sql_file", "database": "database_name" } }
To send your request, use one of the following options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"
PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id", "targetProject": "project-id" }
To use a different user for the import, specify the importContext.importUser property.