
PUT

Uploads one or more data files from a local file system onto an internal stage.

After you upload files onto an internal stage, you can load data from the files into a table using the COPY INTO <table> command.
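
For example, a minimal sketch of this two-step flow, assuming a local file /tmp/data/mydata.csv, a named internal stage my_int_stage, and a target table mytable (all hypothetical names):

-- Upload the local file onto the named internal stage
-- (AUTO_COMPRESS defaults to TRUE, so the staged copy becomes mydata.csv.gz)
PUT file:///tmp/data/mydata.csv @my_int_stage;

-- Load the staged data into the target table
COPY INTO mytable
  FROM @my_int_stage
  FILE_FORMAT = (TYPE = CSV);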

See also:

GET, LIST, REMOVE, COPY FILES, CREATE STAGE, Overview of data loading

Syntax
PUT file://<absolute_path_to_file>/<filename> internalStage
    [ PARALLEL = <integer> ]
    [ AUTO_COMPRESS = TRUE | FALSE ]
    [ SOURCE_COMPRESSION = AUTO_DETECT | GZIP | BZ2 | BROTLI | ZSTD | DEFLATE | RAW_DEFLATE | NONE ]
    [ OVERWRITE = TRUE | FALSE ]

Where:

internalStage ::=
    @[<namespace>.]<int_stage_name>[/<path>]
  | @[<namespace>.]%<table_name>[/<path>]
  | @~[/<path>]

Required parameters
file://absolute_path_to_file/filename

Specifies the URI for the data files on the client machine.

The URI formatting differs depending on your client operating system.

Linux/macOS:

Specify the absolute path to the file from the root directory (/). For example, for a file named my-data.csv use file:///my/file/path/my-data.csv.

Windows:

Specify the absolute path from the root of the drive where the file or files are located. For example, for a file named my-data.csv use file://C:\temp\my-data.csv.

If the file path includes special characters, you must enclose the entire path in single quotes and change the drive and path separators from backslashes to forward slashes (/). For example, for a file named my$data.csv, use 'file://C:/temp/my$data.csv'.

Note

Snowflake doesn’t support tar (tape archive) files.

internalStage

Specifies the internal stage location to upload the files onto:

@[namespace.]int_stage_name[/path]

Files are uploaded onto the specified named internal stage.

@[namespace.]%table_name[/path]

Files are uploaded onto the stage for the specified table.

@~[/path]

Files are uploaded onto the stage for the current user.

Where:

namespace is the database and/or schema in which the internal stage or table resides. It is optional if a database and schema are currently in use within the session; otherwise, it is required.

path is an optional case-sensitive path that specifies where the files are uploaded within the stage.

Note

If the stage name or path includes spaces or special characters, enclose it in single quotes. For example, use '@"my stage"' for a stage named "my stage".
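
For example, a minimal sketch of uploading a local file to a stage named "my stage" (the local path shown here is hypothetical):

-- The stage name contains a space, so the stage reference is enclosed in single quotes
PUT file:///tmp/data/mydata.csv '@"my stage"';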

Optional parameters
PARALLEL = integer

Specifies the number of threads to use for uploading files. Snowflake uploads separate batches of data files by size: small files are staged in parallel as individual files, while larger files are automatically split into chunks that are staged concurrently and reassembled in the target stage.

Increasing the number of threads can improve performance when uploading large files.

Supported values: Any integer value from 1 (no parallelism) to 99 (use 99 threads for uploading files).

Default: 4

Note

A 16 MB limit applies to older versions of Snowflake drivers.
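
As an illustration, a sketch of raising the thread count for a large upload (the file path and stage name are hypothetical):

-- Use 8 upload threads instead of the default 4
PUT file:///tmp/data/large_dataset.csv @my_int_stage
  PARALLEL = 8;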

AUTO_COMPRESS = TRUE | FALSE

Specifies whether Snowflake uses gzip to compress files during upload:

  • TRUE: Snowflake compresses the files using gzip (if they are not already compressed).

  • FALSE: Snowflake does not compress the files.

This option does not support other compression types. To use a different compression type, compress the file separately before executing the PUT command. Then, identify the compression type using the SOURCE_COMPRESSION option.

Ensure your local folder has sufficient space for Snowflake to compress the data files before staging them. If necessary, set the TEMP, TMPDIR or TMP environment variable in your operating system to point to a local folder that contains additional free space.

Default: TRUE
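
For instance, a sketch of staging a file that was already compressed with bzip2 outside of Snowflake, as described above (path and stage name are hypothetical):

-- The file was compressed before the upload, so skip gzip compression
-- and declare the compression method that was used
PUT file:///tmp/data/mydata.csv.bz2 @my_int_stage
  AUTO_COMPRESS = FALSE
  SOURCE_COMPRESSION = BZ2;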

SOURCE_COMPRESSION = AUTO_DETECT | GZIP | BZ2 | BROTLI | ZSTD | DEFLATE | RAW_DEFLATE | NONE

Specifies the method of compression used on already-compressed files that are being staged:

  • AUTO_DETECT: Compression algorithm detected automatically, except for Brotli-compressed files, which cannot currently be detected automatically. If you’re uploading Brotli-compressed files, explicitly use BROTLI instead of AUTO_DETECT.

  • GZIP: Doesn’t support the *.tar.gz file format.

  • BZ2: Doesn’t support the *.tar.bz2 file format.

  • BROTLI: Must be used if uploading Brotli-compressed files.

  • ZSTD: Zstandard v0.8 (and higher) supported.

  • DEFLATE: Deflate-compressed files (with zlib header, RFC1950).

  • RAW_DEFLATE: Raw Deflate-compressed files (without header, RFC1951).

  • NONE: Data files have not been compressed.

Default: AUTO_DETECT

Note

Snowflake uses this option to detect how the data files were compressed so that they can be uncompressed and the data extracted for uploading; it does not use this option to compress the files.

Loading files that were compressed with other utilities is not currently supported.
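
For example, because Brotli compression cannot be auto-detected, a sketch of staging a Brotli-compressed file (hypothetical path and stage name) declares the method explicitly:

-- Brotli cannot be auto-detected, so specify it explicitly
PUT file:///tmp/data/mydata.csv.br @my_int_stage
  SOURCE_COMPRESSION = BROTLI;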

OVERWRITE = TRUE | FALSE

Specifies whether Snowflake overwrites an existing file with the same name during upload:

  • TRUE: An existing file with the same name is overwritten.

  • FALSE: An existing file with the same name is not overwritten.

If your Snowflake account is hosted on Google Cloud, PUT statements don’t recognize when the OVERWRITE parameter is set to TRUE. A PUT operation always overwrites any existing files in the target stage with the local files you’re uploading.

The following clients support the OVERWRITE option for Snowflake accounts hosted on Amazon Web Services or Microsoft Azure:

  • SnowSQL

  • Snowflake ODBC Driver

  • Snowflake JDBC Driver

  • Snowflake Connector for Python

Supported values: TRUE, FALSE.

Default: FALSE.
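
A minimal sketch of re-uploading a file and replacing the staged copy (the path and stage name are hypothetical):

-- Replace the previously staged copy of mydata.csv, if one exists
PUT file:///tmp/data/mydata.csv @my_int_stage
  OVERWRITE = TRUE;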

Usage notes

Tip

For security reasons, the command times out after a set period of time. This can occur when uploading large, uncompressed data files. To avoid timeout issues, we recommend compressing large data files using one of the supported compression types before uploading the files. Then, specify the compression type for the files using the SOURCE_COMPRESSION option.

You can also consider increasing the value of the PARALLEL option, which can help with performance when uploading large data files.

Furthermore, to take advantage of parallel operations when loading data into tables (using the COPY INTO <table> command), we recommend using data files ranging in size from roughly 100 to 250 MB compressed. If your data files are larger, consider using a third-party tool to split them into smaller files before compressing and uploading them.

Examples

Linux and macOS

Load a file onto an internal stage

Load a file named mydata.csv in the /tmp/data directory to an internal stage named my_int_stage:

PUT file:///tmp/data/mydata.csv @my_int_stage;

Load a file onto a table stage

Load a file named orders_001.csv in the /tmp/data directory to the stage for the orderstiny_ext table, with automatic data compression disabled:

PUT file:///tmp/data/orders_001.csv @%orderstiny_ext
  AUTO_COMPRESS = FALSE;

Load multiple files onto an internal stage

Use wildcard characters in the filename to upload multiple files:

PUT file:///tmp/data/orders_*01.csv @my_int_stage
  AUTO_COMPRESS = FALSE;

Specify a file path with special characters

Enclose a file path with special characters or spaces in single quotes:

PUT 'file:///tmp/data/orders 001.csv' @my_int_stage
  AUTO_COMPRESS = FALSE;

Windows

Load a file onto the current user’s stage

Load a file named mydata.csv in the C:\temp\data directory onto the stage for the current user, with automatic data compression enabled:

PUT file://C:\temp\data\mydata.csv @~
  AUTO_COMPRESS = TRUE;

Specify a file path with special characters

To specify a Windows file path with special characters, you must enclose the path in single quotes and change backslashes to forward slashes.

In this example, the file name contains a space (my data.csv):

PUT 'file://C:/temp/data/my data.csv' @my_int_stage
  AUTO_COMPRESS = TRUE;
