Manage tables

This document describes how to manage tables in BigQuery. You can manage your BigQuery tables in the following ways:

- Update table properties, such as the description, expiration time, and default rounding mode.
- Rename a table.
- Copy a table.
- Delete a table.
For information about how to restore (or undelete) a deleted table, see Restore deleted tables.
For more information about creating and using tables, including getting table information, listing tables, and controlling access to table data, see Creating and using tables.
Before you begin

Grant Identity and Access Management (IAM) roles that give users the necessary permissions to perform each task in this document. The permissions required to perform a task (if any) are listed in the "Required permissions" section of the task.
Update table properties

You can update the following elements of a table:

- Description
- Expiration time
- Schema definition
- Default rounding mode
Required permissions

To get the permissions that you need to update table properties, ask your administrator to grant you the Data Editor (roles/bigquery.dataEditor) IAM role on a table. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to update table properties. To see the exact permissions that are required, expand the Required permissions section:
Required permissions

The following permissions are required to update table properties:
bigquery.tables.update
bigquery.tables.get
You might also be able to get these permissions with custom roles or other predefined roles.
Additionally, if you have the bigquery.datasets.create permission, you can update the properties of tables in the datasets that you create.
Update a table's description

You can update a table's description in the following ways:
- The ALTER TABLE statement.
- The bq update command.
- The tables.patch API method.

To update a table's description:
Console

You can't add a description when you create a table using the Google Cloud console. After the table is created, you can add a description on the Details page.
In the Explorer panel, expand your project and dataset, then select the table.
In the details panel, click Details.
In the Description section, click the pencil icon to edit the description.
Enter a description in the box, and click Update to save.
SQL

Use the ALTER TABLE SET OPTIONS statement. The following example updates the description of a table named mytable:
In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
ALTER TABLE mydataset.mytable
SET OPTIONS (
    description = 'Description of mytable');
Click Run.
For more information about how to run queries, see Run an interactive query.
bq

In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Issue the bq update command with the --description flag. If you are updating a table in a project other than your default project, add the project ID to the dataset name in the following format: project_id:dataset.
bq update \
--description "description" \
project_id:dataset.table
Replace the following:
- description: the text describing the table, in quotes
- project_id: your project ID
- dataset: the name of the dataset that contains the table you're updating
- table: the name of the table you're updating

Examples:
To change the description of the mytable table in the mydataset dataset to "Description of mytable", enter the following command. The mydataset dataset is in your default project.
bq update --description "Description of mytable" mydataset.mytable
To change the description of the mytable table in the mydataset dataset to "Description of mytable", enter the following command. The mydataset dataset is in the myotherproject project, not your default project.
bq update \
--description "Description of mytable" \
myotherproject:mydataset.mytable
API

Call the tables.patch method and use the description property in the table resource to update the table's description. Because the tables.update method replaces the entire table resource, the tables.patch method is preferred.
Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Configure the Table.description property and call Client.update_table() to send the update to the API.
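For example, a minimal Python sketch, assuming the mydataset.mytable table from the earlier examples and the google-cloud-bigquery library:

from google.cloud import bigquery

client = bigquery.Client()
table = client.get_table("mydataset.mytable")  # Fetch the current table metadata.
table.description = "Description of mytable"
# Patch only the description field; other table properties are left unchanged.
table = client.update_table(table, ["description"])
print(f"Updated description: {table.description}")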
Update a table's expiration time

You can set a default table expiration time at the dataset level, or you can set a table's expiration time when the table is created. A table's expiration time is often referred to as "time to live" or TTL.

When a table expires, it is deleted along with all of the data it contains. If necessary, you can undelete the expired table within the time travel window specified for the dataset. For more information, see Restore deleted tables.
If you set the expiration when the table is created, the dataset's default table expiration is ignored. If you do not set a default table expiration at the dataset level, and you do not set a table expiration when the table is created, the table never expires and you must delete the table manually.
At any point after the table is created, you can update the table's expiration time in the following ways:
- The ALTER TABLE statement.
- The bq update command.
- The tables.patch API method.

To update a table's expiration time:
Console

You can't add an expiration time when you create a table using the Google Cloud console. After a table is created, you can add or update a table expiration on the Table Details page.
In the Explorer panel, expand your project and dataset, then select the table.
In the details panel, click Details.
Click the pencil icon next to Table info.
For Table expiration, select Specify date. Then select the expiration date using the calendar widget.
Click Update to save. The updated expiration time appears in the Table info section.
SQL

Use the ALTER TABLE SET OPTIONS statement. The following example updates the expiration time of a table named mytable:
In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
ALTER TABLE mydataset.mytable
SET OPTIONS (
    -- Sets table expiration to timestamp 2025-02-03 12:34:56
    expiration_timestamp = TIMESTAMP '2025-02-03 12:34:56');
Click Run.
For more information about how to run queries, see Run an interactive query.
bq

In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Issue the bq update command with the --expiration flag. If you are updating a table in a project other than your default project, add the project ID to the dataset name in the following format: project_id:dataset.
bq update \
--expiration integer \
project_id:dataset.table
Replace the following:
- integer: the default lifetime (in seconds) for the table. The minimum value is 3600 seconds (one hour). The expiration time evaluates to the current time plus the integer value. If you specify 0, the table expiration is removed, and the table never expires. Tables with no expiration must be manually deleted.
- project_id: your project ID.
- dataset: the name of the dataset that contains the table you're updating.
- table: the name of the table you're updating.

Examples:
To update the expiration time of the mytable table in the mydataset dataset to 5 days (432000 seconds), enter the following command. The mydataset dataset is in your default project.
bq update --expiration 432000 mydataset.mytable
To update the expiration time of the mytable table in the mydataset dataset to 5 days (432000 seconds), enter the following command. The mydataset dataset is in the myotherproject project, not your default project.
bq update --expiration 432000 myotherproject:mydataset.mytable
API

Call the tables.patch method and use the expirationTime property in the table resource to update the table expiration in milliseconds. Because the tables.update method replaces the entire table resource, the tables.patch method is preferred.
Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Configure the Table.expires property and call Client.update_table() to send the update to the API.
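For example, a minimal Python sketch that sets the expiration to 5 days from now, assuming the mydataset.mytable table from the earlier examples:

import datetime

from google.cloud import bigquery

client = bigquery.Client()
table = client.get_table("mydataset.mytable")  # Fetch the current table metadata.
# Table.expires takes a datetime; BigQuery stores it as a millisecond timestamp.
table.expires = datetime.datetime.now(datetime.timezone.utc) + datetime.timedelta(days=5)
table = client.update_table(table, ["expires"])
print(f"New expiration: {table.expires}")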
For information about updating a dataset's default partition expiration time, see Update dataset properties.

Update a table's rounding mode

You can update a table's default rounding mode by using the ALTER TABLE SET OPTIONS DDL statement. The following example updates the default rounding mode for mytable to ROUND_HALF_EVEN:
ALTER TABLE mydataset.mytable
SET OPTIONS (
    default_rounding_mode = "ROUND_HALF_EVEN");
When you add a NUMERIC or BIGNUMERIC field to a table and do not specify a rounding mode, the rounding mode is automatically set to the table's default rounding mode. Changing a table's default rounding mode doesn't alter the rounding mode of existing fields.
For more information about updating a table's schema definition, see Modifying table schemas.
Rename a table

You can rename a table after it has been created by using the ALTER TABLE RENAME TO statement. The following example renames mytable to mynewtable:
ALTER TABLE mydataset.mytable
RENAME TO mynewtable;

Copy a table

This section describes how to create a full copy of a table. For information about other types of table copies, see table clones and table snapshots.
You can copy a table in the following ways:
- The bq cp command.
- The CREATE TABLE COPY statement.
- A copy job.

Table copy jobs are subject to the following limitations:
When copying multiple source tables to a destination table using the API, bq command-line tool, or the client libraries, all source tables must have identical schemas, including any partitioning or clustering.
Certain table schema updates, such as dropping or renaming columns, can cause tables to have apparently identical schemas but different internal representations. This might cause a table copy job to fail with the error Maximum limit on diverging physical schemas reached. In this case, you can use the CREATE TABLE LIKE statement to ensure that your source table's schema matches the destination table's schema exactly.
The time that BigQuery takes to copy tables might vary significantly across different runs because the underlying storage is managed dynamically.
You can't copy and append a source table to a destination table that has more columns than the source table, where the additional columns have default values. Instead, you can run INSERT destination_table SELECT * FROM source_table to copy over the data.
If the copy operation overwrites an existing table, then the table-level access for the existing table is maintained. Tags from the source table aren't copied to the overwritten table, while tags on the existing table are retained. However, when you copy tables across regions, tags on the existing table are removed.
If the copy operation creates a new table, then the table-level access for the new table is determined by the access policies of the dataset in which the new table is created. Additionally, tags are copied from the source table to the new table.
When you copy multiple source tables to a destination table, all source tables must have identical tags.
To perform the tasks in this document, you need the following permissions.
Roles to copy tables and partitions

To get the permissions that you need to copy tables and partitions, ask your administrator to grant you the Data Editor (roles/bigquery.dataEditor) IAM role on the source and destination datasets. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to copy tables and partitions. To see the exact permissions that are required, expand the Required permissions section:
Required permissions

The following permissions are required to copy tables and partitions:

- bigquery.tables.getData on the source and destination datasets
- bigquery.tables.get on the source and destination datasets
- bigquery.tables.create on the destination dataset
- bigquery.tables.update on the destination dataset

You might also be able to get these permissions with custom roles or other predefined roles.
Permission to run a copy job

To get the permission that you need to run a copy job, ask your administrator to grant you the Job User (roles/bigquery.jobUser) IAM role on the source and destination datasets. For more information about granting roles, see Manage access to projects, folders, and organizations.

This predefined role contains the bigquery.jobs.create permission, which is required to run a copy job.
You might also be able to get this permission with custom roles or other predefined roles.
Copy a single source table

You can copy a single table in the following ways:

- The bq cp command.
- The CREATE TABLE COPY statement.
- The jobs.insert API method, configuring a copy job and specifying the sourceTable property.

The Google Cloud console and the CREATE TABLE COPY statement support only one source table and one destination table in a copy job. To copy multiple source tables to a destination table, you must use the bq command-line tool or the API.

To copy a single source table:
Console

In the Explorer panel, expand your project and dataset, then select the table.
In the details panel, click Copy table.
In the Copy table dialog, under Destination, specify the destination project, dataset, and table name.
Click Copy to start the copy job.
SQL

Use the CREATE TABLE COPY statement to copy a table named table1 to a new table named table1copy:
In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
CREATE TABLE `myproject.mydataset.table1copy`
COPY `myproject.mydataset.table1`;
Click Run.
For more information about how to run queries, see Run an interactive query.
bq

In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Issue the bq cp command. Optional flags can be used to control the write disposition of the destination table:

- -a or --append_table appends the data from the source table to an existing table in the destination dataset.
- -f or --force overwrites an existing table in the destination dataset and doesn't prompt you for confirmation.
- -n or --no_clobber returns the following error message if the table exists in the destination dataset: Table 'project_id:dataset.table' already exists, skipping. If -n is not specified, the default behavior is to prompt you to choose whether to replace the destination table.
- --destination_kms_key is the customer-managed Cloud KMS key used to encrypt the destination table. --destination_kms_key is not demonstrated here. See Protecting data with Cloud Key Management Service keys for more information.
If the source or destination dataset is in a project other than your default project, add the project ID to the dataset names in the following format: project_id:dataset.
(Optional) Supply the --location flag and set the value to your location.
bq --location=location cp \
-a -f -n \
project_id:dataset.source_table \
project_id:dataset.destination_table
Replace the following:
- location: the name of your location. The --location flag is optional. For example, if you are using BigQuery in the Tokyo region, you can set the flag's value to asia-northeast1. You can set a default value for the location using the .bigqueryrc file.
- project_id: your project ID.
- dataset: the name of the source or destination dataset.
- source_table: the table you're copying.
- destination_table: the name of the table in the destination dataset.

Examples:
To copy the mydataset.mytable table to the mydataset2.mytable2 table, enter the following command. Both datasets are in your default project.
bq cp mydataset.mytable mydataset2.mytable2
To copy the mydataset.mytable table and to overwrite a destination table with the same name, enter the following command. The source dataset is in your default project. The destination dataset is in the myotherproject project. The -f shortcut is used to overwrite the destination table without a prompt.
bq cp -f \
mydataset.mytable \
myotherproject:myotherdataset.mytable
To copy the mydataset.mytable table and to return an error if the destination dataset contains a table with the same name, enter the following command. The source dataset is in your default project. The destination dataset is in the myotherproject project. The -n shortcut is used to prevent overwriting a table with the same name.
bq cp -n \
mydataset.mytable \
myotherproject:myotherdataset.mytable
To copy the mydataset.mytable table and to append the data to a destination table with the same name, enter the following command. The source dataset is in your default project. The destination dataset is in the myotherproject project. The -a shortcut is used to append to the destination table.
bq cp -a mydataset.mytable myotherproject:myotherdataset.mytable
API

You can copy an existing table through the API by calling the bigquery.jobs.insert method and configuring a copy job. Specify your location in the location property in the jobReference section of the job resource.
You must specify the following values in your job configuration:
"copy": { "sourceTable": { // Required "projectId": string, // Required "datasetId": string, // Required "tableId": string // Required }, "destinationTable": { // Required "projectId": string, // Required "datasetId": string, // Required "tableId": string // Required }, "createDisposition": string, // Optional "writeDisposition": string, // Optional },
Where sourceTable provides information about the table to be copied, destinationTable provides information about the new table, createDisposition specifies whether to create the table if it doesn't exist, and writeDisposition specifies whether to overwrite or append to an existing table.
C#

Before trying this sample, follow the C# setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery C# API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
PHP

Before trying this sample, follow the PHP setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery PHP API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
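A minimal Python sketch of a single-table copy, assuming the table1 and table1copy names from the SQL example:

from google.cloud import bigquery

client = bigquery.Client()

# copy_table starts a copy job; the default job configuration applies.
job = client.copy_table("mydataset.table1", "mydataset.table1copy")
job.result()  # Wait for the copy job to complete.
print("Table copied.")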
Copy multiple source tables

You can copy multiple source tables to a destination table in the following ways:

- The bq cp command.
- The jobs.insert method, configuring a copy job and specifying the sourceTables property.

All source tables must have identical schemas and tags, and only one destination table is allowed.
Source tables must be specified as a comma-separated list. You can't use wildcards when you copy multiple source tables.
To copy multiple source tables, select one of the following choices:
bq

In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Issue the bq cp command and include multiple source tables as a comma-separated list. Optional flags can be used to control the write disposition of the destination table:

- -a or --append_table appends the data from the source tables to an existing table in the destination dataset.
- -f or --force overwrites an existing destination table in the destination dataset and doesn't prompt you for confirmation.
- -n or --no_clobber returns the following error message if the table exists in the destination dataset: Table 'project_id:dataset.table' already exists, skipping. If -n is not specified, the default behavior is to prompt you to choose whether to replace the destination table.
- --destination_kms_key is the customer-managed Cloud Key Management Service key used to encrypt the destination table. --destination_kms_key is not demonstrated here. See Protecting data with Cloud Key Management Service keys for more information.
If the source or destination dataset is in a project other than your default project, add the project ID to the dataset names in the following format: project_id:dataset.
(Optional) Supply the --location flag and set the value to your location.
bq --location=location cp \
-a -f -n \
project_id:dataset.source_table,project_id:dataset.source_table \
project_id:dataset.destination_table
Replace the following:
- location: the name of your location. The --location flag is optional. For example, if you are using BigQuery in the Tokyo region, you can set the flag's value to asia-northeast1. You can set a default value for the location using the .bigqueryrc file.
- project_id: your project ID.
- dataset: the name of the source or destination dataset.
- source_table: the table that you're copying.
- destination_table: the name of the table in the destination dataset.

Examples:
To copy the mydataset.mytable table and the mydataset.mytable2 table to the mydataset2.tablecopy table, enter the following command. All datasets are in your default project.
bq cp \
mydataset.mytable,mydataset.mytable2 \
mydataset2.tablecopy
To copy the mydataset.mytable table and the mydataset.mytable2 table to the myotherdataset.mytable table and to overwrite a destination table with the same name, enter the following command. The destination dataset is in the myotherproject project, not your default project. The -f shortcut is used to overwrite the destination table without a prompt.
bq cp -f \
mydataset.mytable,mydataset.mytable2 \
myotherproject:myotherdataset.mytable
To copy the myproject:mydataset.mytable table and the myproject:mydataset.mytable2 table and to return an error if the destination dataset contains a table with the same name, enter the following command. The destination dataset is in the myotherproject project. The -n shortcut is used to prevent overwriting a table with the same name.
bq cp -n \
myproject:mydataset.mytable,myproject:mydataset.mytable2 \
myotherproject:myotherdataset.mytable
To copy the mydataset.mytable table and the mydataset.mytable2 table and to append the data to a destination table with the same name, enter the following command. The source dataset is in your default project. The destination dataset is in the myotherproject project. The -a shortcut is used to append to the destination table.
bq cp -a \
mydataset.mytable,mydataset.mytable2 \
myotherproject:myotherdataset.mytable
API

To copy multiple tables using the API, call the jobs.insert method, configure a table copy job, and specify the sourceTables property.

Specify your region in the location property in the jobReference section of the job resource.
Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
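A minimal Python sketch of a multi-source copy, assuming the table names from the bq examples; copy_table accepts a sequence of source tables:

from google.cloud import bigquery

client = bigquery.Client()

sources = ["mydataset.mytable", "mydataset.mytable2"]
# All source tables must share an identical schema (see the limitations above).
job = client.copy_table(sources, "mydataset2.tablecopy")
job.result()  # Wait for the copy job to complete.
print("Tables copied.")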
Copy tables across regions

Preview
This product or feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA products and features are available "as is" and might have limited support. For more information, see the launch stage descriptions.
You can copy a table, table snapshot, or table clone from one BigQuery region or multi-region to another. This includes any tables that have customer-managed Cloud KMS keys (CMEK) applied.
Copying a table across regions incurs additional data transfer charges according to BigQuery pricing. Additional charges are incurred even if you cancel the cross-region table copy job before it completes.
To copy a table across regions, select one of the following options:
bq

In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Run the bq cp command:

bq cp \
-f -n \
SOURCE_PROJECT:SOURCE_DATASET.SOURCE_TABLE \
DESTINATION_PROJECT:DESTINATION_DATASET.DESTINATION_TABLE
Replace the following:
- SOURCE_PROJECT: the source project ID. If the source dataset is in a project other than your default project, add the project ID to the source dataset name.
- DESTINATION_PROJECT: the destination project ID. If the destination dataset is in a project other than your default project, add the project ID to the destination dataset name.
- SOURCE_DATASET: the name of the source dataset.
- DESTINATION_DATASET: the name of the destination dataset.
- SOURCE_TABLE: the table that you are copying.
- DESTINATION_TABLE: the name of the table in the destination dataset.
The following example command copies the mydataset_us.mytable table from the us multi-region to the mydataset_eu.mytable2 table in the eu multi-region. Both datasets are in the default project.
bq cp --sync=false mydataset_us.mytable mydataset_eu.mytable2
To copy a table across regions into a CMEK-enabled destination dataset, you must enable CMEK on the table with a key from the table's region. The CMEK on the table doesn't have to be the same CMEK in use by the destination dataset. The following example copies a CMEK-enabled table to a destination dataset using the bq cp command.
bq cp \
source-project-id:source-dataset-id.source-table-id \
destination-project-id:destination-dataset-id.destination-table-id
Conversely, to copy a CMEK-enabled table across regions into a destination dataset, you can enable CMEK on the destination dataset with a key from the destination dataset's region. You can also use the --destination_kms_key flag in the bq cp command, as shown in the following example:

bq cp \
--destination_kms_key=projects/project_id/locations/eu/keyRings/eu_key/cryptoKeys/eu_region \
mydataset_us.mytable \
mydataset_eu.mytable2

API
To copy a table across regions using the API, call the jobs.insert method and configure a table copy job. Specify your region in the location property in the jobReference section of the job resource.
C#

Before trying this sample, follow the C# setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery C# API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
PHP

Before trying this sample, follow the PHP setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery PHP API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
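A minimal Python sketch of a cross-region copy, assuming the mydataset_us and mydataset_eu datasets from the bq example; the location argument is assumed here to be the destination region (eu):

from google.cloud import bigquery

client = bigquery.Client()

job = client.copy_table(
    "mydataset_us.mytable",
    "mydataset_eu.mytable2",
    location="eu",  # Assumption: the copy job runs in the destination region.
)
job.result()  # Cross-region copies can take longer than same-region copies.
print("Table copied across regions.")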
Limitations

Copying a table across regions is subject to the following limitations:

- You can't copy a table across regions by using the Google Cloud console or the TABLE COPY DDL statement.

You can view your current usage of query, load, extract, or copy jobs by running an INFORMATION_SCHEMA query to view metadata about the jobs run over a specified time period. You can compare your current usage against the quota limit to determine your quota usage for a particular type of job. The following example query uses the INFORMATION_SCHEMA.JOBS_BY_PROJECT view to list the number of query, load, extract, and copy jobs by project:
SELECT
  SUM(CASE WHEN job_type = "QUERY" THEN 1 ELSE 0 END) AS QRY_CNT,
  SUM(CASE WHEN job_type = "LOAD" THEN 1 ELSE 0 END) AS LOAD_CNT,
  SUM(CASE WHEN job_type = "EXTRACT" THEN 1 ELSE 0 END) AS EXT_CNT,
  SUM(CASE WHEN job_type = "COPY" THEN 1 ELSE 0 END) AS CPY_CNT
FROM `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE DATE(creation_time) = CURRENT_DATE()

Note: The INFORMATION_SCHEMA view does not display cross-region copy jobs.
To view the quota limits for copy jobs, see Quotas and limits - Copy jobs.
Delete tables

You can delete a table in the following ways:

- The DROP TABLE statement.
- The bq rm command.
- The tables.delete API method.

To delete all of the tables in the dataset, delete the dataset.
When you delete a table, any data in the table is also deleted. To automatically delete tables after a specified period of time, set the default table expiration for the dataset or set the expiration time when you create the table.
Deleting a table also deletes any permissions associated with this table. When you recreate a deleted table, you must also manually reconfigure any access permissions previously associated with it.
Required roles

To get the permissions that you need to delete a table, ask your administrator to grant you the Data Editor (roles/bigquery.dataEditor) IAM role on the dataset. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to delete a table. To see the exact permissions that are required, expand the Required permissions section:
Required permissions

The following permissions are required to delete a table:
bigquery.tables.delete
bigquery.tables.get
You might also be able to get these permissions with custom roles or other predefined roles.
Delete a table

To delete a table:
Console

In the Explorer panel, expand your project and dataset, then select the table.
In the details panel, click Delete table.
Type "delete"
in the dialog, then click Delete to confirm.
SQL

Use the DROP TABLE statement. The following example deletes a table named mytable:
In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
DROP TABLE mydataset.mytable;
Click Run.
For more information about how to run queries, see Run an interactive query.
bq

In the Google Cloud console, activate Cloud Shell.
At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.
Use the bq rm command with the --table flag (or -t shortcut) to delete a table. When you use the bq command-line tool to remove a table, you must confirm the action. You can use the --force flag (or -f shortcut) to skip confirmation.
If the table is in a dataset in a project other than your default project, add the project ID to the dataset name in the following format: project_id:dataset.
bq rm \
-f \
-t \
project_id:dataset.table
Replace the following:
- project_id: your project ID
- dataset: the name of the dataset that contains the table
- table: the name of the table that you're deleting

Examples:
To delete the mytable table from the mydataset dataset, enter the following command. The mydataset dataset is in your default project.
bq rm -t mydataset.mytable
To delete the mytable table from the mydataset dataset, enter the following command. The mydataset dataset is in the myotherproject project, not your default project.
bq rm -t myotherproject:mydataset.mytable
To delete the mytable table from the mydataset dataset, enter the following command. The mydataset dataset is in your default project. The command uses the -f shortcut to bypass confirmation.
bq rm -f -t mydataset.mytable

Note: You can enter the bq ls dataset command in the bq command-line tool to confirm that a table was removed from a dataset.

API

Call the tables.delete API method and specify the table to delete using the tableId parameter.
C#

Before trying this sample, follow the C# setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery C# API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
PHP

Before trying this sample, follow the PHP setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery PHP API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
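A minimal Python sketch, assuming the mydataset.mytable table from the earlier examples:

from google.cloud import bigquery

client = bigquery.Client()

# not_found_ok=True suppresses the error if the table has already been deleted.
client.delete_table("mydataset.mytable", not_found_ok=True)
print("Deleted table mydataset.mytable.")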
Ruby

Before trying this sample, follow the Ruby setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Ruby API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
Restore deleted tables

To learn how to restore or undelete deleted tables, see Restore deleted tables.
Table security

To control access to tables in BigQuery, see Control access to resources with IAM.