
CREATE PIPE

Creates a new pipe in the system for defining the COPY INTO <table> statement used by Snowpipe to load data from an ingestion queue into tables.

See also:

ALTER PIPE, DROP PIPE, SHOW PIPES, DESCRIBE PIPE

Syntax
CREATE [ OR REPLACE ] PIPE [ IF NOT EXISTS ] <name>
  [ AUTO_INGEST = [ TRUE | FALSE ] ]
  [ ERROR_INTEGRATION = <integration_name> ]
  [ AWS_SNS_TOPIC = '<string>' ]
  [ INTEGRATION = '<string>' ]
  [ COMMENT = '<string_literal>' ]
  AS <copy_statement>

Required parameters
name

Identifier for the pipe; must be unique for the schema in which the pipe is created.

The identifier must start with an alphabetic character and cannot contain spaces or special characters unless the entire identifier string is enclosed in double quotes (e.g. "My object"). Identifiers enclosed in double quotes are also case-sensitive.

For more details, see Identifier requirements.
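
For example (an illustrative sketch; the pipe, table, and stage names are hypothetical), a double-quoted identifier preserves case and may contain spaces:

CREATE PIPE "My Pipe"
  AS
  COPY INTO mytable
  FROM @mystage
  FILE_FORMAT = (TYPE = 'JSON');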

copy_statement

COPY INTO <table> statement used to load data from queued files into a Snowflake table. This statement serves as the text/definition for the pipe and is displayed in the SHOW PIPES output.
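
For example, the stored definition can be inspected after the pipe is created (the pipe name here is illustrative):

SHOW PIPES LIKE 'mypipe%';

The definition column in the output contains the pipe's COPY INTO <table> statement.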

Note

We currently do not recommend using time-context functions (for example, CURRENT_TIMESTAMP) in the copy_statement for Snowpipe:

It is a known issue that the time values inserted using these functions can be a few hours earlier than the LOAD_TIME values returned by the COPY_HISTORY function or the COPY_HISTORY view.

It is recommended to query METADATA$START_SCAN_TIME instead, which provides a more accurate representation of when each record was loaded.
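
For example (a minimal sketch; the pipe, table, stage, and column names are hypothetical), the scan time can be captured during the load instead of calling CURRENT_TIMESTAMP:

CREATE PIPE mypipe_scan_time
  AS
  COPY INTO mytable (col1, load_time)
  FROM (SELECT $1, METADATA$START_SCAN_TIME FROM @mystage)
  FILE_FORMAT = (TYPE = 'JSON');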

Optional parameters
AUTO_INGEST = TRUE | FALSE

Specifies whether to automatically load data files from the internal or external stage:

  • TRUE enables automatic data loading.

    Snowpipe supports loading from external stages (Amazon S3, Google Cloud Storage, or Microsoft Azure).

  • FALSE disables automatic data loading. You must make calls to the Snowpipe REST API endpoints to load data files.

    Snowpipe supports loading from internal stages (i.e. Snowflake named stages or table stages, but not user stages) or external stages (Amazon S3, Google Cloud Storage, or Microsoft Azure); see the sketch after this list.
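
A minimal sketch of this configuration (pipe, table, and stage names are hypothetical); files staged to the stage are then submitted for loading by calling the Snowpipe REST API insertFiles endpoint:

CREATE PIPE mypipe_rest
  AUTO_INGEST = FALSE
  AS
  COPY INTO mytable
  FROM @my_named_stage
  FILE_FORMAT = (TYPE = 'JSON');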

ERROR_INTEGRATION = 'integration_name'

Required only when configuring Snowpipe to send error notifications to a cloud messaging service.

Specifies the name of the notification integration used to communicate with the messaging service. For more information, see Snowpipe error notifications.
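
For example (a sketch, assuming a notification integration named my_error_int has already been created):

CREATE PIPE mypipe_notify
  AUTO_INGEST = TRUE
  ERROR_INTEGRATION = my_error_int
  AS
  COPY INTO mytable
  FROM @mystage
  FILE_FORMAT = (TYPE = 'JSON');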

AWS_SNS_TOPIC = 'string'

Required only when configuring AUTO_INGEST for Amazon S3 external stages using SNS.

Specifies the Amazon Resource Name (ARN) for the SNS topic for your S3 bucket. The CREATE PIPE statement subscribes the Amazon Simple Queue Service (SQS) queue to the specified SNS topic. The pipe copies files to the ingest queue triggered by event notifications via the SNS topic. For more information, see Automating Snowpipe for Amazon S3.

INTEGRATION = 'string'

Required only when configuring AUTO_INGEST for Google Cloud Storage or Microsoft Azure external stages.

Specifies the existing notification integration used to access the storage queue. For more information, see Automating Snowpipe for Google Cloud Storage and Automating Snowpipe for Microsoft Azure.

The integration name must be typed in all uppercase.

COMMENT = 'string_literal'

Specifies a comment for the pipe.

Default: No value
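
For example (the pipe, table, and stage names are illustrative):

CREATE PIPE mypipe_commented
  COMMENT = 'Loads JSON event files from mystage'
  AS
  COPY INTO mytable
  FROM @mystage
  FILE_FORMAT = (TYPE = 'JSON');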

Usage notes

Important

If you recreate a pipe (using the CREATE OR REPLACE PIPE syntax), see Recreating pipes for related considerations and best practices.
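
For example, redefining an existing pipe in place (a sketch; review Recreating pipes before running this against a production pipe):

CREATE OR REPLACE PIPE mypipe
  AS
  COPY INTO mytable
  FROM @mystage
  FILE_FORMAT = (TYPE = 'JSON');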

Examples

Create a pipe in the current schema that loads all the data from files staged in the mystage stage into mytable:

CREATE PIPE mypipe
  AS
  COPY INTO mytable
  FROM @mystage
  FILE_FORMAT = (TYPE = 'JSON');

Same as the previous example, but with a data transformation. Only load data from the 4th and 5th columns in the staged files, in reverse order:

CREATE PIPE mypipe2
  AS
  COPY INTO mytable(C1, C2)
  FROM (SELECT $5, $4 FROM @mystage)
  FILE_FORMAT = (TYPE = 'JSON');

Create a pipe that loads all the data into columns in the target table that match corresponding columns represented in the data. Column names are case-insensitive.

In addition, load metadata from the METADATA$START_SCAN_TIME and METADATA$FILENAME metadata columns to the columns named c1 and c2.

CREATE PIPE mypipe3
  AS
  (COPY INTO mytable
    FROM @mystage
    MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    INCLUDE_METADATA = (c1 = METADATA$START_SCAN_TIME, c2 = METADATA$FILENAME)
    FILE_FORMAT = (TYPE = 'JSON'));

Create a pipe in the current schema for automatic loading of data using event notifications received from a messaging service:

Amazon S3

CREATE PIPE mypipe_s3
  AUTO_INGEST = TRUE
  AWS_SNS_TOPIC = 'arn:aws:sns:us-west-2:001234567890:s3_mybucket'
  AS
  COPY INTO snowpipe_db.public.mytable
  FROM @snowpipe_db.public.mystage
  FILE_FORMAT = (TYPE = 'JSON');

Google Cloud Storage

CREATE PIPE mypipe_gcs
  AUTO_INGEST = TRUE
  INTEGRATION = 'MYINT'
  AS
  COPY INTO snowpipe_db.public.mytable
  FROM @snowpipe_db.public.mystage
  FILE_FORMAT = (TYPE = 'JSON');

Microsoft Azure

CREATE PIPE mypipe_azure
  AUTO_INGEST = TRUE
  INTEGRATION = 'MYINT'
  AS
  COPY INTO snowpipe_db.public.mytable
  FROM @snowpipe_db.public.mystage
  FILE_FORMAT = (TYPE = 'JSON');

Internal named stage

Create a pipe in the current schema that automatically loads all the data files from the internal named stage mystage:

CREATE PIPE mypipe_aws
  AUTO_INGEST = TRUE
  AS
  COPY INTO snowpipe_db.public.mytable
  FROM @snowpipe_db.public.mystage
  FILE_FORMAT = (TYPE = 'JSON');
