Export statements in GoogleSQL

EXPORT DATA statement

The EXPORT DATA statement exports the results of a query to an external storage location. You can export to the following services:

Cloud Storage
Amazon S3
Blob Storage
Bigtable
Pub/Sub
Spanner

Syntax
EXPORT DATA
[WITH CONNECTION connection_name]
OPTIONS (export_option_list) AS
query_statement
Arguments

connection_name: The name of a connection that has credentials for accessing the external storage location. Specify a connection when the destination requires one, such as Amazon S3 or Blob Storage.

export_option_list: A list of options for the export operation, specified in the format NAME=VALUE, ...

query_statement: The SQL query whose results are exported to the external destination.

Export to Cloud Storage, Amazon S3, or Blob Storage

You can export BigQuery data to Cloud Storage, Amazon S3, or Blob Storage in Avro, CSV, JSON, and Parquet formats. For more information about exporting to Cloud Storage, see Export table data to Cloud Storage.

Use the format option to specify the format of the exported data.

You are not billed for the export operation, but you are billed for running the query and for storing data in Cloud Storage, Amazon S3, or Blob Storage. For more information, see Cloud Storage pricing, Amazon S3 pricing, or Blob Storage pricing.

Cloud Storage, Amazon S3, and Blob Storage export option list

The option list specifies options for exporting to Cloud Storage, Amazon S3, or Blob Storage. Specify the option list in the following format: NAME=VALUE, ...

Options

compression (STRING): Specifies a compression format. If not specified, the exported files are uncompressed. Supported values include GZIP, DEFLATE, and SNAPPY.

field_delimiter (STRING): The delimiter used to separate fields. Default: ',' (comma). Applies to: CSV.

format (STRING): Required. The format of the exported data. Supported values include AVRO, CSV, JSON, and PARQUET.

header (BOOL): If true, generates column headers for the first row of each data file. Default: false. Applies to: CSV.

overwrite (BOOL): If true, overwrites any existing files with the same URI. Otherwise, if the destination storage bucket is not empty, the statement returns an error. Default: false.

Note: When overwrite is true, files are only overwritten; no files are ever deleted, even if they match the wildcard specified in the URI.

uri (STRING): Required. The destination URI for the export. The uri option must be a single-wildcard URI as described in Exporting data into one or more files. Examples: "gs://bucket/path/file_*.csv" or "s3://bucket/path/file_*.csv".

use_avro_logical_types (BOOL): Whether to use the appropriate Avro logical types when exporting TIMESTAMP, DATETIME, TIME, and DATE types. Applies to: AVRO. For more information, see Avro export details.

Examples

The following examples show common use cases for exporting to Cloud Storage, Amazon S3, or Blob Storage.

Export data to Cloud Storage in CSV format

The following example exports data to a CSV file. It includes options to overwrite the destination location, write header rows, and use ';' as a delimiter.

EXPORT DATA OPTIONS(
  uri='gs://bucket/folder/*.csv',
  format='CSV',
  overwrite=true,
  header=true,
  field_delimiter=';') AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10
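
You can also compress the CSV output in the same statement. The following is a minimal sketch, assuming GZIP compression; the bucket, table, and file names are placeholders:

EXPORT DATA OPTIONS(
  uri='gs://bucket/folder/*.csv.gz',
  format='CSV',
  compression='GZIP',
  header=true) AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10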
Export data to Cloud Storage in Avro format

The following example exports data to Avro format using Snappy compression.

EXPORT DATA OPTIONS(
  uri='gs://bucket/folder/*',
  format='AVRO',
  compression='SNAPPY') AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10
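
To preserve TIMESTAMP, DATETIME, TIME, and DATE values as the corresponding Avro logical types, enable the use_avro_logical_types option. A minimal sketch, assuming a hypothetical TIMESTAMP column named event_time:

EXPORT DATA OPTIONS(
  uri='gs://bucket/folder/*',
  format='AVRO',
  use_avro_logical_types=true) AS
SELECT field1, event_time FROM mydataset.table1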
Export data to Cloud Storage in Parquet format

The following example exports data to Parquet format. It includes the option to overwrite the destination location.

EXPORT DATA OPTIONS(
  uri='gs://bucket/folder/*',
  format='PARQUET',
  overwrite=true) AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10
Export data to Amazon S3 in JSON format

The following example exports the results of a query against an Amazon S3-based BigLake table to your Amazon S3 bucket:

EXPORT DATA
  WITH CONNECTION myproject.us.myconnection
  OPTIONS(
  uri='s3://bucket/folder/*',
  format='JSON',
  overwrite=true) AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10
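
Exporting to Blob Storage follows the same pattern with an azure:// URI. The following is a minimal sketch, assuming a connection in the azure-eastus2 region; the storage account and container names are hypothetical:

EXPORT DATA
  WITH CONNECTION myproject.azure-eastus2.myconnection
  OPTIONS(
  uri='azure://mystorageaccount.blob.core.windows.net/mycontainer/folder/*',
  format='CSV',
  overwrite=true) AS
SELECT field1, field2 FROM mydataset.table1 ORDER BY field1 LIMIT 10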
Export to Bigtable

You can export BigQuery data to a Bigtable table by using the EXPORT DATA statement. For Bigtable export examples and configuration options, see Export data to Bigtable.

You are not billed for the export operation, but you are billed for running the query and for storing data in Bigtable. For more information, see Bigtable pricing.

Bigtable export option list

The option list specifies options for exporting to Bigtable. Specify the option list in the following format: NAME=VALUE, ...

Options

format (STRING): Required. When exporting to Bigtable, the value must always be CLOUD_BIGTABLE.

bigtable_options (STRING): A JSON string containing configurations for mapping exported fields to Bigtable column families and columns. For more information, see Configure exports with bigtable_options.

overwrite (BOOL): If true, allows the export to overwrite existing data in the destination Bigtable table. If false and the destination table is not empty, the statement returns an error. Default: false.

truncate (BOOL): If true, all existing data in the destination table is deleted before any new data is written. Otherwise, the export proceeds even if the destination table is not empty. Default: false.

Warning: Only use truncate when the destination table doesn't contain data that needs to be retained.

uri (STRING): Required. The destination URI for the export. We recommend specifying an app profile for traffic routing and for visibility in the monitoring dashboards provided by Bigtable. The uri option for a Bigtable export must be provided in the following format: https://bigtable.googleapis.com/projects/PROJECT_ID/instances/INSTANCE_ID/appProfiles/APP_PROFILE/tables/TABLE_NAME

auto_create_column_families (BOOL): If true, allows the export to create missing column families in the target table. If false and the destination table is missing a column family, the statement returns an error. Default: false.

Example

The following example exports data to a Bigtable table. The value of field1 becomes the row key in the destination Bigtable table. The fields field2, field3, and field4 are written as columns cbtField2, cbtField3, and cbtField4 in the column family column_family.

EXPORT DATA OPTIONS (
  uri="https://bigtable.googleapis.com/projects/my-project/instances/my-instance/tables/my-table",
  format="CLOUD_BIGTABLE",
  bigtable_options="""{
    "columnFamilies": [
      {
        "familyId": "column_family",
        "columns": [
          {"qualifierString": "cbtField2", "fieldName": "field2"},
          {"qualifierString": "cbtField3", "fieldName": "field3"},
          {"qualifierString": "cbtField4", "fieldName": "field4"}
        ]
      }
    ]
  }"""
) AS
SELECT
  CAST(field1 AS STRING) AS rowkey,
  STRUCT(field2, field3, field4) AS column_family
FROM `bigquery_table`
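
If you route the export through an app profile, as recommended in the option list above, the uri gains an appProfiles segment. A sketch with hypothetical project, instance, and app profile IDs:

uri="https://bigtable.googleapis.com/projects/my-project/instances/my-instance/appProfiles/my-app-profile/tables/my-table"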
Export to Pub/Sub

Preview

This feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA features are available "as is" and might have limited support. For more information, see the launch stage descriptions.

Note: To enroll in the continuous queries preview, fill out the request form. To give feedback or request support for this feature, contact bq-continuous-queries-feedback@google.com.

You can export BigQuery data to a Pub/Sub topic by using the EXPORT DATA statement in a continuous query. For more information about Pub/Sub configuration options, see Export data to Pub/Sub.

For information about the costs involved with exporting to Pub/Sub by using a continuous query, see Costs.

Pub/Sub export option list

The option list specifies options for exporting to Pub/Sub. Specify the option list in the following format: NAME=VALUE, ...

Options

format (STRING): Required. When exporting to Pub/Sub, the value must always be CLOUD_PUBSUB.

uri (STRING): Required. The destination URI for the export. The uri option for a Pub/Sub export must be provided in the following format: https://pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID

Example

The following example shows a continuous query that filters data from a BigQuery table that is receiving streaming taxi ride information, and publishes the data to a Pub/Sub topic in real time:

EXPORT DATA
  OPTIONS (
    format = 'CLOUD_PUBSUB',
    uri = 'https://pubsub.googleapis.com/projects/myproject/topics/taxi-real-time-rides')
AS (
  SELECT
    TO_JSON_STRING(
      STRUCT(
        ride_id,
        timestamp,
        latitude,
        longitude)) AS message
  FROM `myproject.real_time_taxi_streaming.taxi_rides`
  WHERE ride_status = 'enroute'
);
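
Each published message carries the JSON string produced by TO_JSON_STRING. With illustrative values, a message body would look like the following:

{"ride_id":"abc123","timestamp":"2025-01-01T00:00:00Z","latitude":40.7128,"longitude":-74.006}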
Export to Spanner

Preview

This feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA features are available "as is" and might have limited support. For more information, see the launch stage descriptions.

To provide feedback or request support for this feature, send email to bq-cloud-spanner-federation-preview@google.com.

You can export data from a BigQuery table to a Spanner table by using the EXPORT DATA statement.

Spanner export option list

The option list specifies options for the export operation. Specify the option list in the following format: NAME=VALUE, ...

Options

format (STRING): Required. To export data from BigQuery to Spanner, the value must always be CLOUD_SPANNER.

uri (STRING): Required. The destination URI for the export. For Spanner, the URI must be provided in the following format: https://spanner.googleapis.com/projects/PROJECT_ID/instances/INSTANCE_ID/databases/DATABASE_ID

spanner_options (STRING): Required. A JSON string containing configurations for mapping exported fields to the destination Spanner table and columns. For more information, see Configure exports with the spanner_options option.

Examples

Export data to Spanner

The following example exports data to a Spanner table:

EXPORT DATA OPTIONS (
  uri="https://spanner.googleapis.com/projects/my-project/instances/my-instance/databases/my-database",
  format="CLOUD_SPANNER",
  spanner_options="""{ "table": "my_table" }"""
)
AS SELECT * FROM `bigquery_table`

For more Spanner export examples and configuration options, see Export data to Spanner.

