Introduction to external tables

This document describes how to work with data stored outside of BigQuery in external tables. To work with external data sources, you can also use External datasets.
Non-BigLake external tables let you query structured data in external data stores. To query a non-BigLake external table, you must have permissions to both the external table and the external data source. For example, to query a non-BigLake external table that uses a data source in Cloud Storage, you must have the following permissions:
bigquery.tables.getData
bigquery.jobs.create
storage.buckets.get
storage.objects.get
You can use non-BigLake external tables with the following data stores:

- Cloud Storage
- Bigtable
- Google Drive
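As a concrete illustration of the permissions and data stores described above, a non-BigLake external table over Cloud Storage can be defined with a `CREATE EXTERNAL TABLE` statement. This is a minimal sketch; the dataset `mydataset`, table `sales_ext`, and bucket path are hypothetical placeholders:

```sql
-- Hypothetical names: mydataset, sales_ext, and the gs:// path are placeholders.
CREATE EXTERNAL TABLE mydataset.sales_ext (
  sale_id INT64,
  amount NUMERIC,
  sale_date DATE
)
OPTIONS (
  format = 'CSV',
  uris = ['gs://my-example-bucket/sales/sales_2024.csv'],
  skip_leading_rows = 1
);

-- Querying the table requires bigquery.tables.getData on the table
-- and storage.objects.get on the underlying objects.
SELECT sale_date, SUM(amount) AS total
FROM mydataset.sales_ext
GROUP BY sale_date;
```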
Temporary table support

You can query an external data source in BigQuery by using a permanent table or a temporary table. A permanent table is a table that is created in a dataset and is linked to your external data source. Because the table is permanent, you can use access controls to share the table with others who also have access to the underlying external data source, and you can query the table at any time.
When you query an external data source using a temporary table, you submit a command that includes a query and creates a non-permanent table linked to the external data source. When you use a temporary table, you do not create a table in one of your BigQuery datasets. Because the table is not permanently stored in a dataset, it cannot be shared with others. Querying an external data source using a temporary table is useful for one-time, ad-hoc queries over external data, or for extract, transform, and load (ETL) processes.
Multiple source files

If you create a non-BigLake external table based on Cloud Storage, then you can use multiple external data sources, provided those data sources have the same schema. This isn't supported for non-BigLake external tables based on Bigtable or Google Drive.
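For instance, multiple Cloud Storage sources with the same schema can be listed in `uris`, or matched with a wildcard. A hedged sketch with placeholder dataset, table, and bucket names:

```sql
-- Placeholder names: mydataset, logs_ext, and the gs:// paths.
CREATE EXTERNAL TABLE mydataset.logs_ext (
  ts TIMESTAMP,
  message STRING
)
OPTIONS (
  format = 'CSV',
  uris = [
    'gs://my-example-bucket/logs/2024-01-*.csv',  -- wildcard matches many files
    'gs://my-example-bucket/archive/logs_2023.csv'
  ]
);
```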
Limitations

The following limitations apply to external tables:

- You cannot use the TableDataList JSON API method to retrieve data from external tables. For more information, see tabledata.list. To work around this limitation, you can save query results in a destination table, and then use the TableDataList method on the results table.
- You cannot use _object_metadata as a column name in external tables. It is reserved for internal use.

Location considerations

When you choose a location for your external table, you need to take into consideration both the location of the BigQuery dataset and the external data source.
Cloud Storage

When you query data in Cloud Storage by using a BigLake or a non-BigLake external table, the bucket must be colocated with the BigQuery dataset that contains the external table definition. For example:
- If your Cloud Storage bucket is in the us-central1 (Iowa) region, your BigQuery dataset must be in the us-central1 (Iowa) region or the US multi-region.
- If your Cloud Storage bucket is in the europe-west4 (Netherlands) region, your BigQuery dataset must be in the europe-west4 (Netherlands) region or the EU multi-region.
- If your Cloud Storage bucket is in the europe-west1 (Belgium) region, the corresponding BigQuery dataset must be in the europe-west1 (Belgium) region or the EU multi-region.
- If your Cloud Storage bucket is in the NAM4 predefined dual-region or any configurable dual-region that includes the us-central1 (Iowa) region, the corresponding BigQuery dataset must be in the us-central1 (Iowa) region or the US multi-region.
- If your Cloud Storage bucket is in the EUR4 predefined dual-region or any configurable dual-region that includes the europe-west4 (Netherlands) region, the corresponding BigQuery dataset must be in the europe-west4 (Netherlands) region or the EU multi-region.
- If your Cloud Storage bucket is in the ASIA1 predefined dual-region, the corresponding BigQuery dataset must be in the asia-northeast1 (Tokyo) or the asia-northeast2 (Osaka) region.
- If your Cloud Storage bucket uses a configurable dual-region that includes the australia-southeast1 (Sydney) and the australia-southeast2 (Melbourne) regions, the corresponding BigQuery dataset must be in either the australia-southeast1 (Sydney) or the australia-southeast2 (Melbourne) region.
Using multi-region dataset locations with multi-region Cloud Storage buckets is not recommended for external tables, because external query performance depends on minimal latency and optimal network bandwidth.
- If your BigQuery dataset is in the US multi-region, the corresponding Cloud Storage bucket must be in the US multi-region, in the single region us-central1 (Iowa), or in a dual-region that includes us-central1 (Iowa), such as the NAM4 predefined dual-region or a configurable dual-region that includes us-central1.
- If your BigQuery dataset is in the EU multi-region, the corresponding Cloud Storage bucket must be in the EU multi-region, in the single region europe-west1 (Belgium) or europe-west4 (Netherlands), or in a dual-region that includes europe-west1 (Belgium) or europe-west4 (Netherlands), such as the EUR4 predefined dual-region or a configurable dual-region that includes europe-west1 or europe-west4.
For more information about supported Cloud Storage locations, see Bucket locations in the Cloud Storage documentation.
Bigtable

When you query data in Bigtable through a BigQuery external table, your Bigtable instance must be in the same location as your BigQuery dataset. For example, if your BigQuery dataset is in the Belgium (europe-west1) regional location, the corresponding Bigtable instance must be in the Belgium region.

For more information about supported Bigtable locations, see Bigtable locations.
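A non-BigLake external table over Bigtable can also be defined in GoogleSQL. The sketch below assumes placeholder project, instance, and table identifiers, and a single column family named `stats`; adapt the `bigtable_options` JSON to your own families:

```sql
-- Placeholder project/instance/table identifiers and column family.
CREATE EXTERNAL TABLE mydataset.bigtable_ext
OPTIONS (
  format = 'CLOUD_BIGTABLE',
  uris = ['https://googleapis.com/bigtable/projects/my-project/instances/my-instance/tables/my-table'],
  bigtable_options = """{"columnFamilies": [{"familyId": "stats", "onlyReadLatest": true}]}"""
);
```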
Google Drive

Location considerations don't apply to Google Drive external data sources.
Moving data between locations

To manually move a dataset from one location to another, follow these steps:
1. Export the data from your BigQuery tables to a Cloud Storage bucket.

   There are no charges for exporting data from BigQuery, but you do incur charges for storing the exported data in Cloud Storage. BigQuery exports are subject to the limits on export jobs.

2. Copy or move the data from your export Cloud Storage bucket to a new bucket you created in the destination location. For example, if you are moving your data from the US multi-region to the asia-northeast1 (Tokyo) region, you would transfer the data to a bucket that you created in Tokyo. For information about transferring Cloud Storage objects, see Copy, rename, and move objects in the Cloud Storage documentation.

   Transferring data between regions incurs network egress charges in Cloud Storage.

3. Create a new BigQuery dataset in the new location, and then load your data from the Cloud Storage bucket into the new dataset.

   You are not charged for loading the data into BigQuery, but you do incur charges for storing the data in Cloud Storage until you delete the data or the bucket. You are also charged for storing the data in BigQuery after it is loaded. Loading data into BigQuery is subject to the load job limits.
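The export and load steps above can also be expressed in GoogleSQL with `EXPORT DATA` and `LOAD DATA`. The bucket and dataset names below are placeholders, and the cross-region copy in step 2 happens in Cloud Storage, outside these statements:

```sql
-- Step 1: export from the source dataset to a bucket in the source location.
EXPORT DATA OPTIONS (
  uri = 'gs://my-us-bucket/export/mytable-*.avro',
  format = 'AVRO'
) AS
SELECT * FROM my_us_dataset.mytable;

-- Step 3: after copying the exported files to a bucket in the destination
-- location, load them into a dataset created there.
LOAD DATA INTO my_tokyo_dataset.mytable
FROM FILES (
  format = 'AVRO',
  uris = ['gs://my-tokyo-bucket/export/mytable-*.avro']
);
```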
You can also use Cloud Composer to move and copy large datasets programmatically.
For more information about using Cloud Storage to store and move large datasets, see Use Cloud Storage with big data.
Pricing

When you query an external table from BigQuery, you are charged for running the query: the applicable bytes read if you use BigQuery on-demand (per TiB) pricing, or slot consumption if you use BigQuery capacity (per slot-hour) pricing.
If your data is stored in ORC or Parquet on Cloud Storage, see Data size calculation.
You are also charged for storing the data and for any resources used by the source application, subject to that application's pricing guidelines.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-08-12 UTC.