Exporting a MySQL database using mysqldump | Database Migration Service


Important: If you're using the Database Migration Service API to create migration jobs, then make sure you use the mysqldump utility to update the dump when you run the REST API call. For more information, see Manage migration jobs using the API.
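
For example, after you export and upload a new dump file, the migration job needs to reference it. The following is a minimal sketch, assuming the migrationJobs.patch REST method and a dumpPath field on the migration job resource (verify the exact resource path and request body in Manage migration jobs using the API); the project, region, and job names are placeholders:

    # Hypothetical example: point an existing migration job at a newly uploaded dump file.
    # PROJECT_ID, REGION, and MIGRATION_JOB_ID are placeholders, not values from this page.
    curl -X PATCH \
        -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -H "Content-Type: application/json" \
        -d '{"dumpPath": "gs://[BUCKET_NAME]/[DUMP_FILENAME].gz"}' \
        "https://datamigration.googleapis.com/v1/projects/PROJECT_ID/locations/REGION/migrationJobs/MIGRATION_JOB_ID?updateMask=dumpPath"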

You can run the mysqldump utility directly against your MySQL database, using whatever options you require. However, if you're exporting the data to import it into a Cloud SQL database, then use the mysqldump utility with the flags shown in the following commands. In particular, use the --databases flag to specify the databases to export (don't include the MySQL system databases), the --hex-blob flag to export binary fields safely, the --single-transaction flag to get a consistent snapshot, and the --routines flag to include stored procedures and functions. If you run mysqldump version 8 or later against a source database older than version 8, also add the --column-statistics=0 flag.

It's also recommended to use the --no-autocommit and --default-character-set=utf8mb4 flags, which appear in the example commands below.

Important: While mysqldump is running, do not perform any DDL operations on the database. Doing so could cause inconsistencies in the export file.

From a machine with network connectivity to your MySQL server, run the following command:

    mysqldump \
        -h [SOURCE_ADDR] -P [SOURCE_PORT] -u [USERNAME] -p \
        --databases [DBS]  \
        --hex-blob \
        --no-autocommit \
        --default-character-set=utf8mb4 \
        --master-data=1  \
        --single-transaction \
        --routines \
        | gzip \
        | gcloud storage cp - gs://[BUCKET_NAME]/[DUMP_FILENAME].gz
Note: If your source database server doesn't support GTID, then use the --master-data=1 flag, as shown in the preceding command; otherwise, don't use it. If your source database server does support GTID, then use the --set-gtid-purged=on flag instead, as shown in the next example.
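
If you're not sure whether the source server supports GTID, you can check its gtid_mode variable before choosing between the two flags. A quick check, using the same connection placeholders as the command above:

    # Shows the gtid_mode server variable; ON means the source uses GTID.
    mysql -h [SOURCE_ADDR] -P [SOURCE_PORT] -u [USERNAME] -p \
        -e "SHOW VARIABLES LIKE 'gtid_mode';"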

If the source of the migration is Amazon Relational Database Service (RDS) for MySQL, then note that the --master-data flag isn't supported. If the source database supports GTID, use the --set-gtid-purged=on flag instead, and make sure that you use GTID when you migrate manually.

This command might look like the following example:

    mysqldump \
        -h [SOURCE_ADDR] -P [SOURCE_PORT] -u [USERNAME] -p \
        --databases [DBS]  \
        --hex-blob  \
        --no-autocommit \
        --default-character-set=utf8mb4 \
        --set-gtid-purged=on \
        --single-transaction \
        --routines \
        | gzip \
        | gcloud storage cp - gs://[BUCKET_NAME]/[DUMP_FILENAME].gz
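
After the dump command finishes, it can be worth confirming that the compressed dump file landed in the bucket before you start the migration job. A quick check, using the same placeholders:

    # Lists the objects in the Cloud Storage bucket; the dump file should appear here.
    gcloud storage ls gs://[BUCKET_NAME]/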

Also, you should configure RDS instances to retain binary logs for a longer period of time. The command to do so might look like the following example:

    # Sets the retention period to one week.
    call mysql.rds_set_configuration('binlog retention hours', 168);
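
To confirm that the new retention value took effect, you can read the RDS configuration back. This assumes the stock RDS stored procedure mysql.rds_show_configuration is available on your instance:

    # Prints the current RDS settings, including 'binlog retention hours'.
    call mysql.rds_show_configuration;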

Replace [PROPERTIES_IN_BRACKETS] with the following values:

[SOURCE_ADDR]: The IPv4 address or hostname for the source database server.
[SOURCE_PORT]: The port for the source database server.
[USERNAME]: The MySQL user account.
[DBS]: A space-separated list of the databases on the source database server to include in the dump. Use the SHOW DATABASES MySQL command to list your databases (see the example after this list).
[BUCKET_NAME]: The bucket in Cloud Storage that's created by the user and that's used for storing the dump file (for example, replica-bucket).
[DUMP_FILENAME]: The dump's filename, ending with a .gz file extension (for example, source-database.sql.gz).

Note: The gzip command causes the dump file to be compressed. If your database contains data that does not compress well, such as binary incompressible data or JPG images, then remove | gzip from the command.
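
To build the [DBS] list, you can list the databases on the source server and copy the names you want into the dump command; skip MySQL's own system schemas (for example, mysql, sys, information_schema, and performance_schema):

    # Lists all databases on the source server.
    mysql -h [SOURCE_ADDR] -P [SOURCE_PORT] -u [USERNAME] -p -e "SHOW DATABASES;"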

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-08-07 UTC.

[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[[["If using the Database Migration Service API, ensure `mysqldump` is used to update the dump when making REST API calls."],["When exporting to Cloud SQL, use `mysqldump` with flags such as `--databases` (excluding system databases), `--hex-blob` for binary fields, `--single-transaction` for consistency, and `--routines` for stored procedures/functions, as well as `--column-statistics=0` if you are on `mysqldump` version 8 or later and the database is older than version 8."],["While `mysqldump` is running, avoid DDL operations on the database to prevent inconsistencies in the export file."],["When dealing with Relational Database Service (RDS) for MySQL, note that the `master-data` property is unsupported, and if the source database supports GTID, use `--set-gtid-purged=on`; also be sure to use GTID when manually migrating."],["When calling `mysqldump`, make sure to properly replace the `[PROPERTIES_IN_BRACKETS]` values with proper values such as `[SOURCE_ADDR]`, `[SOURCE_PORT]`, `[USERNAME]`, `[DBS]`, `[BUCKET_NAME]`, and `[DUMP_FILENAME]`, and be mindful that the `gzip` command may not be ideal if your database contains non-compressible data."]]],[]]


RetroSearch is an open source project built by @garambo | Open a GitHub Issue

Search and Browse the WWW like it's 1997 | Search results from DuckDuckGo

HTML: 3.2 | Encoding: UTF-8 | Version: 0.7.4