This page describes how to export data from and import data into Cloud SQL instances by using CSV files.
Note: If you're migrating an entire database from a supported database server (on-premises, in AWS, or Cloud SQL) to a new Cloud SQL instance, you can use the Database Migration Service instead of exporting and then importing files. If you're exporting because you want to create a new instance from the exported file, consider restoring from a backup to a different instance or cloning the instance.
You can cancel the import of data into Cloud SQL instances and the export of data from them when the data is contained in CSV files. For more information about canceling an import or export operation, see Cancel the import and export of data.
Before you begin
Before you begin an export or import operation:
Export and import operations use database resources, but they do not interfere with normal database operations unless the instance is under-provisioned.
Important: Before starting a large operation, ensure that at least 25 percent of the disk is free on the instance. Doing so helps prevent issues with aggressive autogrowth, which can adversely affect the availability of the instance.

Required roles and permissions for exporting from Cloud SQL for PostgreSQL
To export data from Cloud SQL into Cloud Storage, the user initiating the export must have a role that includes the following permissions:
cloudsql.instances.get
cloudsql.instances.export
Additionally, the service account for the Cloud SQL instance must have one of the following:
The storage.objectAdmin Identity and Access Management (IAM) role
A custom role that includes the following permissions:
storage.objects.create
storage.objects.list (for exporting files in parallel only)
storage.objects.delete (for exporting files in parallel only)
For help with IAM roles, see Identity and Access Management.
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Export data to a CSV file from Cloud SQL for PostgreSQL
You can export your data in CSV format, which is usable by other tools and environments. Exports happen at the database level. During a CSV export, you can specify the schemas to export. All schemas at the database level are eligible for export.
Note: Cloud SQL uses double quotes (hex value "22") as the default escape character. This can be a problem for databases where values for NULL are entered as string literals. When you import a file that was exported using the default escape character, the import doesn't treat the value as NULL but as the string "NULL". We recommend that you use --escape="5C" to override the default when you export the file.

Note: You cannot export to a CSV file from a read replica instance. The export operation creates an export user and grants that user select permissions on the database to export. Because read replica instances run in read-only mode, these operations fail.
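For example, a minimal sketch of an export command that overrides the default escape character (the INSTANCE_NAME, BUCKET_NAME, FILE_NAME, DATABASE_NAME, and SELECT_QUERY placeholders follow the same conventions as the commands later on this page):
gcloud sql export csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--query=SELECT_QUERY \
--escape="5C"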
To export data from a database on a Cloud SQL instance to a CSV file in a Cloud Storage bucket:
Console
In the Google Cloud console, go to the Cloud SQL Instances page.
If you click Browse:
In the Name box, add a name for the CSV file, or select an existing file from the list in the Location section.
You can use a file extension of .gz (the complete extension would be .csv.gz) to compress your export file.
For SQL query, enter a SQL query to specify the table to export data from.
For example, to export the entire contents of the entries table in the guestbook database, you enter:
SELECT * FROM guestbook.entries;
Your query must specify a table in the specified database. You can't export an entire database in CSV format.
Note: While in transit, the query might be processed in intermediate locations other than the location of the target instance.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
gcloud
Identify the service account for the Cloud SQL instance that you're exporting from. You can do this by running the gcloud sql instances describe command. Look for the serviceAccountEmailAddress field in the output.
gcloud sql instances describe INSTANCE_NAME
Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the Cloud SQL instance service account (an example binding command appears after this procedure). For help with setting IAM permissions, see Using IAM permissions.
Export the database to your Cloud Storage bucket. Use the --offload flag if you want to use serverless export. Otherwise, remove it from the following command.
gcloud sql export csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--offload \
--query=SELECT_QUERY
For information about using the export csv command, see the sql export csv command reference page.
Note: While in transit, the SELECT_QUERY might be processed in intermediate locations other than the location of the target instance.
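For reference, a sketch of the bucket-role grant from the earlier step, assuming SERVICE_ACCOUNT_EMAIL stands for the serviceAccountEmailAddress value returned by gcloud sql instances describe:
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
--member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
--role=roles/storage.objectAdmin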
REST v1
Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step is not required, but strongly recommended, so you do not open up access to any other data.
Provide your instance with the legacyBucketWriter IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Export your database:
Before using the request data, replace the placeholder values in the URL and request body with your own values. Set offload to true to use serverless export. Note: Serverless export costs extra. See the pricing page.
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "databases": "database_name", "offload": true | false "csvExportOptions": { "selectQuery":"select_query", "escapeCharacter":"escape_character", "quoteCharacter":"quote_character", "fieldsTerminatedBy":"fields_terminated_by", "linesTerminatedBy":"lines_terminated_by" } } }
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export"

PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id", "targetProject": "project-id" }
You must specify exactly one database with the databases property, and if the select query specifies a database, it must be the same.
REST v1beta4
Create a Cloud Storage bucket:
gcloud storage buckets create gs://BUCKET_NAME --location=LOCATION_NAME --project=PROJECT_NAME
This step is not required, but strongly recommended, so you do not open up access to any other data.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Export your database:
Before using the request data, replace the placeholder values in the URL and request body with your own values. Set offload to true to use serverless export. Note: Serverless export costs extra. See the pricing page.
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export
Request JSON body:
{ "exportContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "databases": "database_name", "offload": true | false "csvExportOptions": { "selectQuery": "select_query", "escapeCharacter": "escape_character", "quoteCharacter": "quote_character", "fieldsTerminatedBy": "fields_terminated_by", "linesTerminatedBy": "lines_terminated_by" } } }
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export"

PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/export" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id", "targetProject": "project-id" }
You must specify exactly one database with the databases property, and if the select query specifies a database, it must be the same.
You can use gcloud or the REST API to customize your CSV file format. When you perform an export, you can specify the following formatting options:
"5C"
ASCII hex code for file separator.
--escape
escapeCharacter
Character that appears before a data character that needs to be escaped.
Available only for MySQL and PostgreSQL.
Quote"22"
ASCII hex code for double quotes.
--quote
quoteCharacter
Character that encloses values from columns that have a string data type.
Available only for MySQL and PostgreSQL.
Field delimiter"2C"
ASCII hex code for comma.
--fields-terminated-by
fieldsTerminatedBy
Character that splits column values.
Available only for MySQL and PostgreSQL.
Newline character"0A"
ASCII hex code for newline.
--lines-terminated-by
linesTerminatedBy
Character that splits line records.
Available only for MySQL.
For example, a gcloud command using all of these arguments might look like the following:
gcloud sql export csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--offload \
--query=SELECT_QUERY \
--quote="22" \
--escape="5C" \
--fields-terminated-by="2C" \
--lines-terminated-by="0A"
The equivalent REST API request body would look like this:
{ "exportContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "databases": "DATABASE_NAME", "offload": true, "csvExportOptions": { "selectQuery": "SELECT_QUERY", "escapeCharacter": "5C", "quoteCharacter": "22", "fieldsTerminatedBy": "2C", "linesTerminatedBy": "0A" } } }
CSV export creates standard CSV output by default. If you need even more options than Cloud SQL provides, you can use the following statement in a psql client:
\copy [table_name] TO '[csv_file_name].csv' WITH (FORMAT csv, ESCAPE '[escape_character]', QUOTE '[quote_character]', DELIMITER '[delimiter_character]', ENCODING 'UTF8', NULL '[null_marker_string]');

Import data to Cloud SQL for PostgreSQL
Required roles and permissions for importing to Cloud SQL for PostgreSQL
To import data from Cloud Storage into Cloud SQL, the user initiating the import must have a role that includes the following permissions:
cloudsql.instances.get
cloudsql.instances.import
Additionally, the service account for the Cloud SQL instance must have one of the following:
The storage.objectAdmin IAM role
A custom role that includes the following permissions:
storage.objects.get
storage.objects.list (for importing files in parallel only)
For help with IAM roles, see Identity and Access Management.
Note: The changes that you make to the IAM permissions and roles might take a few minutes to take effect. For more information, see Access change propagation.

Import data from a CSV file to Cloud SQL for PostgreSQL
CSV files must have one line for each row of data and use comma-separated fields.
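For illustration only, a small file for a hypothetical three-column table might look like the following (the values are invented; fields that contain the delimiter are quoted):
1,"Alice","First entry"
2,"Bob","An entry with a comma, inside quotes"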
To import data to a Cloud SQL instance using a CSV file:
Console
In the Google Cloud console, go to the Cloud SQL Instances page.
You can import a compressed (.gz) or an uncompressed (.csv) file.
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
gcloud
Identify the service account for the Cloud SQL instance that you're importing to. You can do this by running the gcloud sql instances describe command with the instance name. Look for the serviceAccountEmailAddress field in the output.
gcloud sql instances describe INSTANCE_NAME
Use gcloud storage buckets add-iam-policy-binding to grant the storage.objectAdmin IAM role to the Cloud SQL instance service account for the bucket. For help with setting IAM permissions, see Using IAM permissions.
Import the file:
gcloud sql import csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--table=TABLE_NAME
For information about using the import csv command, see the sql import csv command reference page.
If you don't need to retain the IAM permissions that you set previously, remove them by using gcloud storage buckets remove-iam-policy-binding.
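For example, a sketch of the removal, assuming BUCKET_NAME and SERVICE_ACCOUNT_EMAIL are the same values you used when granting the role:
gcloud storage buckets remove-iam-policy-binding gs://BUCKET_NAME \
--member=serviceAccount:SERVICE_ACCOUNT_EMAIL \
--role=roles/storage.objectAdmin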
REST v1
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the legacyBucketWriter and objectViewer IAM roles for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": "database_name", "csvImportOptions": { "table": "table_name", "escapeCharacter": "escape_character", "quoteCharacter": "quote_character", "fieldsTerminatedBy": "fields_terminated_by", "linesTerminatedBy": "lines_terminated_by" } } }
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import"

PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/v1/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/v1/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/v1/projects/project-id/operations/operation-id", "targetProject": "project-id" }For the complete list of parameters for the request, see the instances:import page.
REST v1beta4
Upload the file to your bucket.
For help with uploading files to buckets, see Uploading objects.
Provide your instance with the storage.objectAdmin IAM role for your bucket. For help with setting IAM permissions, see Using IAM permissions.
Before using any of the request data, make the following replacements:
HTTP method and URL:
POST https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import
Request JSON body:
{ "importContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": "database_name", "csvImportOptions": { "table": "table_name", "escapeCharacter": "escape_character", "quoteCharacter": "quote_character", "fieldsTerminatedBy": "fields_terminated_by", "linesTerminatedBy": "lines_terminated_by" } } }
To send your request, expand one of these options:
curl (Linux, macOS, or Cloud Shell)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import"

PowerShell (Windows)
Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.
Save the request body in a file named request.json, and execute the following command:
$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }
Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/instance-id/import" | Select-Object -Expand Content
You should receive a JSON response similar to the following:
Response{ "kind": "sql#operation", "targetLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/instances/target-instance-id", "status": "PENDING", "user": "user@example.com", "insertTime": "2020-01-21T22:43:37.981Z", "operationType": "UPDATE", "name": "operation-id", "targetId": "instance-id", "selfLink": "https://sqladmin.googleapis.com/sql/v1beta4/projects/project-id/operations/operation-id", "targetProject": "project-id" }For the complete list of parameters for the request, see the instances:import page.
You can use gcloud or the REST API to customize your CSV file format. A sample gcloud command follows:
gcloud sql import csv INSTANCE_NAME gs://BUCKET_NAME/FILE_NAME \
--database=DATABASE_NAME \
--table=TABLE_NAME \
--quote="22" \
--escape="5C" \
--fields-terminated-by="2C" \
--lines-terminated-by="0A"
The equivalent REST API request body would look like this:
{ "importContext": { "fileType": "CSV", "uri": "gs://bucket_name/path_to_csv_file", "database": "DATABASE_NAME", "csvImportOptions": { "table": "TABLE_NAME", "escapeCharacter": "5C", "quoteCharacter": "22", "fieldsTerminatedBy": "2C", "linesTerminatedBy": "0A" } } }Note: If you use custom format options in your import commands, make sure the exported file was created with the same options.
If you get an error such as ERROR_RDBMS, ensure the table exists. If the table exists, confirm that you have the correct permissions on the bucket. For help configuring access control in Cloud Storage, see Create and Manage Access Control Lists.
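One way to check the bucket permissions is to list the bucket's IAM policy and confirm that the instance's service account appears with a suitable role. A sketch, with BUCKET_NAME as a placeholder:
gcloud storage buckets get-iam-policy gs://BUCKET_NAME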
To see how the underlying REST API request is constructed for this task, see the APIs Explorer on the instances:import page.