DataLakeFileClient class

A DataLakeFileClient represents a URL to an Azure Storage Data Lake file.

Properties

fileSystemName

Name of current file system.

name

Name of current path (directory or file).

Inherited Properties

accountName

credential

Such as AnonymousCredential, StorageSharedKeyCredential or any credential from the @azure/identity package to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.

url

Encoded URL string value.

Methods

append(RequestBodyType, number, number, FileAppendOptions)

Uploads data to be appended to a file. Data can only be appended to a file. To apply previously uploaded data to a file, call flush.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/update
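
A minimal sketch (not from the original page) of the append/flush workflow; the account, file system, and file names are placeholders:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

const content = "Hello world!";

// The file must exist before data can be appended to it.
await fileClient.create();
// Upload the data at offset 0, then commit it by flushing at the total length.
await fileClient.append(content, 0, content.length);
await fileClient.flush(content.length);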

create(FileCreateOptions)

Create a file.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/create

create(PathResourceType, PathCreateOptions)

Create a file.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/create

createIfNotExists(FileCreateIfNotExistsOptions)

Create a file if it doesn't already exist.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/create

createIfNotExists(PathResourceType, PathCreateIfNotExistsOptions)

Create a file if it doesn't already exist.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/create
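
A minimal sketch (not from the original page) contrasting the two calls; it assumes the createIfNotExists response exposes a succeeded flag, as the other Azure Storage *IfNotExists helpers do:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Unconditionally (re)create the file, overwriting any existing content.
await fileClient.create();

// Create the file only when it is not already there.
const createResult = await fileClient.createIfNotExists();
console.log(`File was created: ${createResult.succeeded}`); // assumed response field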

flush(number, FileFlushOptions)

Flushes (writes) previously appended data to a file.

generateSasStringToSign(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates string to sign for a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

generateSasUrl(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

generateUserDelegationSasStringToSign(FileGenerateSasUrlOptions, UserDelegationKey)

Generates string to sign for a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the input user delegation key.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

generateUserDelegationSasUrl(FileGenerateSasUrlOptions, UserDelegationKey)

Generates a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the input user delegation key.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

query(string, FileQueryOptions)

Quick query for a JSON or CSV formatted file.

Example usage (Node.js):

import { DataLakeServiceClient } from "@azure/storage-file-datalake";

const account = "<account>";
const sas = "<sas token>";
const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net${sas}`,
);

const fileSystemName = "<file system name>";
const fileName = "<file name>";
const fileSystemClient = datalakeServiceClient.getFileSystemClient(fileSystemName);
const fileClient = fileSystemClient.getFileClient(fileName);

// Query and convert a file to a string
const queryResponse = await fileClient.query("select * from BlobStorage");
if (queryResponse.readableStreamBody) {
  const responseBuffer = await streamToBuffer(queryResponse.readableStreamBody);
  const downloaded = responseBuffer.toString();
  console.log(`Query file content: ${downloaded}`);
}

async function streamToBuffer(readableStream: NodeJS.ReadableStream): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}

read(number, number, FileReadOptions)

Downloads a file from the service, including its metadata and properties.

See https://learn.microsoft.com/rest/api/storageservices/get-blob

import { DataLakeServiceClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const account = "<account>";
const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net`,
  new DefaultAzureCredential(),
);

const fileSystemName = "<file system name>";
const fileName = "<file name>";
const fileSystemClient = datalakeServiceClient.getFileSystemClient(fileSystemName);
const fileClient = fileSystemClient.getFileClient(fileName);

// Get file content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadResponse.readableStreamBody
const downloadResponse = await fileClient.read();
if (downloadResponse.readableStreamBody) {
  const downloaded = await streamToBuffer(downloadResponse.readableStreamBody);
  console.log("Downloaded file content:", downloaded.toString());
}

// [Node.js only] A helper method used to read a Node.js readable stream into a Buffer.
async function streamToBuffer(readableStream: NodeJS.ReadableStream): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}

Example usage (browser):

import { DataLakeServiceClient } from "@azure/storage-file-datalake";

const account = "<account>";
const sas = "<sas token>";
const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net${sas}`,
);

const fileSystemName = "<file system name>";
const fileName = "<file name>";
const fileSystemClient = datalakeServiceClient.getFileSystemClient(fileSystemName);
const fileClient = fileSystemClient.getFileClient(fileName);

// Get file content from position 0 to the end
// In browsers, get downloaded data by accessing downloadResponse.contentAsBlob
const downloadResponse = await fileClient.read();
if (downloadResponse.contentAsBlob) {
  const blob = await downloadResponse.contentAsBlob;
  const downloaded = await blob.text();
  console.log(`Downloaded file content ${downloaded}`);
}

readToBuffer(Buffer, number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional, pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.

readToBuffer(number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional, pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.

readToFile(string, number, number, FileReadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Downloads a Data Lake file to a local file. Fails if the given file path already exists. Offset and count are optional, pass 0 and undefined respectively to download the entire file.

setExpiry(PathExpiryOptions, FileSetExpiryOptions)

Sets an expiry time on a file; once that time is reached, the file is deleted.

upload(Blob | ArrayBuffer | ArrayBufferView | Buffer, FileParallelUploadOptions)

Uploads a Buffer (Node.js), Blob, ArrayBuffer, or ArrayBufferView to a file.

uploadFile(string, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a local file to a Data Lake file.

uploadStream(Readable, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a Node.js Readable stream into a Data Lake file. This method tries to create a file, then starts uploading it chunk by chunk. Make sure the potential size of the stream doesn't exceed FILE_MAX_SIZE_BYTES and the potential number of chunks doesn't exceed BLOCK_BLOB_MAX_BLOCKS.

PERFORMANCE IMPROVEMENT TIPS:

Inherited Methods

delete(boolean, PathDeleteOptions)

Delete current path (directory or file).

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/delete

deleteIfExists(boolean, PathDeleteOptions)

Delete current path (directory or file) if it exists.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/delete
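
A minimal sketch (not from the original page); the succeeded flag on the deleteIfExists response is assumed from the other Azure Storage *IfExists helpers:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// delete() throws if the path is missing; for a file the recursive flag has no effect.
await fileClient.delete(false);

// deleteIfExists() reports the outcome instead of throwing.
const deleteResult = await fileClient.deleteIfExists(false);
console.log(`File was deleted: ${deleteResult.succeeded}`); // assumed response field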

exists(PathExistsOptions)

Returns true if the Data Lake file represented by this client exists; false otherwise.

NOTE: Use this function with care, since an existing file might be deleted by other clients or applications. Conversely, new files might be added by other clients or applications after this function completes.

getAccessControl(PathGetAccessControlOptions)

Returns the access control data for a path (directory or file).

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/getproperties

getDataLakeLeaseClient(string)

Get a DataLakeLeaseClient that manages leases on the path (directory or file).

getProperties(PathGetPropertiesOptions)

Returns all user-defined metadata, standard HTTP properties, and system properties for the path (directory or file).

WARNING: The metadata object returned in the response will have its keys in lowercase, even if they originally contained uppercase characters. This differs from the metadata keys returned by the methods of DataLakeFileSystemClient that list paths using the includeMetadata option, which will retain their original casing.

See https://learn.microsoft.com/rest/api/storageservices/get-blob-properties
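
A minimal sketch (not from the original page); the contentLength, contentType, and metadata fields on the response are assumed from the usual Get Properties response shape:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

const properties = await fileClient.getProperties();
console.log(`Size: ${properties.contentLength} bytes`);
console.log(`Content type: ${properties.contentType}`);
// Note: metadata keys are returned in lowercase, as described above.
console.log("Metadata:", properties.metadata);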

move(string, PathMoveOptions)

Move directory or file within same file system.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/create

move(string, string, PathMoveOptions)

Move directory or file to another file system.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/create

removeAccessControlRecursive(RemovePathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Removes the Access Control on a path and sub paths.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/update

setAccessControl(PathAccessControlItem[], PathSetAccessControlOptions)

Sets the access control data for a path (directory or file).

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/update

setAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Sets the Access Control on a path and sub paths.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/update

setHttpHeaders(PathHttpHeaders, PathSetHttpHeadersOptions)

Sets system properties on the path (directory or file).

If no value is provided, or no value is provided for the specified blob HTTP headers, these blob HTTP headers without a value will be cleared.

See https://learn.microsoft.com/rest/api/storageservices/set-blob-properties

setMetadata(Metadata, PathSetMetadataOptions)

Sets user-defined metadata for the specified path (directory or file) as one or more name-value pairs.

If no option is provided, or no metadata is defined in the parameter, the path metadata will be removed.

See https://learn.microsoft.com/rest/api/storageservices/set-blob-metadata
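
A minimal sketch (not from the original page) of replacing the headers and metadata; the PathHttpHeaders field names (contentType, cacheControl) are assumptions not listed on this page:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Replace the path's HTTP headers; headers omitted here are cleared.
await fileClient.setHttpHeaders({
  contentType: "application/json", // assumed field name
  cacheControl: "max-age=300",     // assumed field name
});

// Replace the user-defined metadata; calling setMetadata() with no pairs removes it.
await fileClient.setMetadata({ project: "demo", reviewed: "true" });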

setPermissions(PathPermissions, PathSetPermissionsOptions)

Sets the file permissions on a path.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/update

toDirectoryClient()

Converts the current DataLakePathClient to a DataLakeDirectoryClient if the current path is a directory.

toFileClient()

Converts the current DataLakePathClient to a DataLakeFileClient if the current path is a file.

updateAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

Modifies the Access Control on a path and sub paths.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/update

Constructor Details

DataLakeFileClient(string, Pipeline)

Creates an instance of DataLakeFileClient from url and pipeline.

new DataLakeFileClient(url: string, pipeline: Pipeline)
Parameters
url

string

A URL string pointing to an Azure Storage Data Lake file, such as "https://myaccount.dfs.core.windows.net/filesystem/file". You can append a SAS if using AnonymousCredential, such as "https://myaccount.dfs.core.windows.net/filesystem/directory/file?sasString".

pipeline
Pipeline

Call newPipeline() to create a default pipeline, or provide a customized pipeline.

DataLakeFileClient(string, StorageSharedKeyCredential | AnonymousCredential | TokenCredential, StoragePipelineOptions)

Creates an instance of DataLakeFileClient from url and credential.

new DataLakeFileClient(url: string, credential?: StorageSharedKeyCredential | AnonymousCredential | TokenCredential, options?: StoragePipelineOptions)
Parameters
url

string

A URL string pointing to an Azure Storage Data Lake file, such as "https://myaccount.dfs.core.windows.net/filesystem/file". You can append a SAS if using AnonymousCredential, such as "https://myaccount.dfs.core.windows.net/filesystem/directory/file?sasString".

credential

StorageSharedKeyCredential | AnonymousCredential | TokenCredential

Such as AnonymousCredential, StorageSharedKeyCredential or any credential from the @azure/identity package to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.
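
A minimal sketch (not from the original page) of both construction styles; the URLs are placeholders:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

// With a token credential from @azure/identity.
const fileClient = new DataLakeFileClient(
  "https://myaccount.dfs.core.windows.net/filesystem/directory/file",
  new DefaultAzureCredential(),
);

// Or anonymously, with a SAS token appended to the URL.
const sasFileClient = new DataLakeFileClient(
  "https://myaccount.dfs.core.windows.net/filesystem/directory/file?sasString",
);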

Property Details

fileSystemName

Name of current file system.

string fileSystemName
Property Value

string

name

Name of current path (directory or file).

string name
Property Value

string

Inherited Property Details

credential

Such as AnonymousCredential, StorageSharedKeyCredential or any credential from the @azure/identity package to authenticate requests to the service. You can also provide an object that implements the TokenCredential interface. If not specified, AnonymousCredential is used.

credential: StorageSharedKeyCredential | AnonymousCredential | TokenCredential
Property Value

StorageSharedKeyCredential | AnonymousCredential | TokenCredential

Inherited From DataLakePathClient.credential

Method Details

append(RequestBodyType, number, number, FileAppendOptions)

create(FileCreateOptions)

create(PathResourceType, PathCreateOptions)

createIfNotExists(FileCreateIfNotExistsOptions)

createIfNotExists(PathResourceType, PathCreateIfNotExistsOptions)

flush(number, FileFlushOptions)

Flushes (writes) previously appended data to a file.

function flush(position: number, options?: FileFlushOptions): Promise<FileFlushResponse>
Parameters
position

number

File position to flush. This parameter allows the caller to upload data in parallel and control the order in which it is appended to the file. It is required when uploading data to be appended to the file and when flushing previously uploaded data to the file. The value must be the position where the data is to be appended. Uploaded data is not immediately flushed, or written, to the file. To flush, the previously uploaded data must be contiguous, the position parameter must be specified and equal to the length of the file after all data has been written, and there must not be a request entity body included with the request.

Returns

Promise<FileFlushResponse>
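
A minimal sketch (not from the original page) showing how the position parameter tracks the cumulative length of the appended data:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

const part1 = "hello, ";
const part2 = "world";

await fileClient.create();
// Each append declares the offset at which its data starts...
await fileClient.append(part1, 0, part1.length);
await fileClient.append(part2, part1.length, part2.length);
// ...and flush() is called with the final length once the data is contiguous.
await fileClient.flush(part1.length + part2.length);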

generateSasStringToSign(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates string to sign for a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

function generateSasStringToSign(options: FileGenerateSasUrlOptions): string
Parameters Returns

string

The string to sign for a Service SAS based on the client properties and parameters passed in.

generateSasUrl(FileGenerateSasUrlOptions)

Only available for clients constructed with a shared key credential.

Generates a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the shared key credential of the client.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

function generateSasUrl(options: FileGenerateSasUrlOptions): Promise<string>
Parameters Returns

Promise<string>

The SAS URI consisting of the URI to the resource represented by this client, followed by the generated SAS token.
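
A minimal sketch (not from the original page); DataLakeSASPermissions and StorageSharedKeyCredential are exported by @azure/storage-file-datalake but are not documented on this page, so treat the option names below as assumptions:

import {
  DataLakeFileClient,
  DataLakeSASPermissions,
  StorageSharedKeyCredential,
} from "@azure/storage-file-datalake";

const sharedKeyCredential = new StorageSharedKeyCredential("<account>", "<account key>");
const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  sharedKeyCredential,
);

// A read-only SAS URL that expires in one hour.
const sasUrl = await fileClient.generateSasUrl({
  permissions: DataLakeSASPermissions.parse("r"),   // assumed option name
  expiresOn: new Date(Date.now() + 60 * 60 * 1000), // assumed option name
});
console.log(sasUrl);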

generateUserDelegationSasStringToSign(FileGenerateSasUrlOptions, UserDelegationKey)

Generates string to sign for a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the input user delegation key.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

function generateUserDelegationSasStringToSign(options: FileGenerateSasUrlOptions, userDelegationKey: UserDelegationKey): string
Parameters
userDelegationKey
UserDelegationKey

Return value of blobServiceClient.getUserDelegationKey()

Returns

string

The string to sign for a user delegation SAS based on the client properties and parameters passed in.

generateUserDelegationSasUrl(FileGenerateSasUrlOptions, UserDelegationKey)

Generates a Service Shared Access Signature (SAS) URI based on the client properties and parameters passed in. The SAS is signed by the input user delegation key.

See https://learn.microsoft.com/rest/api/storageservices/constructing-a-service-sas

function generateUserDelegationSasUrl(options: FileGenerateSasUrlOptions, userDelegationKey: UserDelegationKey): Promise<string>
Parameters
userDelegationKey
UserDelegationKey

Return value of blobServiceClient.getUserDelegationKey()

Returns

Promise<string>

The SAS URI consisting of the URI to the resource represented by this client, followed by the generated SAS token.
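
A minimal sketch (not from the original page); it assumes DataLakeServiceClient.getUserDelegationKey(startsOn, expiresOn) for obtaining the user delegation key, and the same SAS option names as above:

import {
  DataLakeSASPermissions,
  DataLakeServiceClient,
} from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const serviceClient = new DataLakeServiceClient(
  "https://<account>.dfs.core.windows.net",
  new DefaultAzureCredential(),
);
const fileClient = serviceClient
  .getFileSystemClient("<file system name>")
  .getFileClient("<file name>");

// Ask the service for a user delegation key that covers the SAS lifetime...
const startsOn = new Date();
const expiresOn = new Date(Date.now() + 60 * 60 * 1000);
const userDelegationKey = await serviceClient.getUserDelegationKey(startsOn, expiresOn);

// ...and sign the SAS with that key instead of an account key.
const sasUrl = await fileClient.generateUserDelegationSasUrl(
  { permissions: DataLakeSASPermissions.parse("r"), expiresOn },
  userDelegationKey,
);
console.log(sasUrl);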

query(string, FileQueryOptions)

Quick query for a JSON or CSV formatted file.

Example usage (Node.js):

import { DataLakeServiceClient } from "@azure/storage-file-datalake";

const account = "<account>";
const sas = "<sas token>";
const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net${sas}`,
);

const fileSystemName = "<file system name>";
const fileName = "<file name>";
const fileSystemClient = datalakeServiceClient.getFileSystemClient(fileSystemName);
const fileClient = fileSystemClient.getFileClient(fileName);

// Query and convert a file to a string
const queryResponse = await fileClient.query("select * from BlobStorage");
if (queryResponse.readableStreamBody) {
  const responseBuffer = await streamToBuffer(queryResponse.readableStreamBody);
  const downloaded = responseBuffer.toString();
  console.log(`Query file content: ${downloaded}`);
}

async function streamToBuffer(readableStream: NodeJS.ReadableStream): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}
function query(query: string, options?: FileQueryOptions): Promise<FileReadResponse>
Parameters Returns

Promise<FileReadResponse>

read(number, number, FileReadOptions)

Downloads a file from the service, including its metadata and properties.

See https://learn.microsoft.com/rest/api/storageservices/get-blob

import { DataLakeServiceClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const account = "<account>";
const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net`,
  new DefaultAzureCredential(),
);

const fileSystemName = "<file system name>";
const fileName = "<file name>";
const fileSystemClient = datalakeServiceClient.getFileSystemClient(fileSystemName);
const fileClient = fileSystemClient.getFileClient(fileName);

// Get file content from position 0 to the end
// In Node.js, get downloaded data by accessing downloadResponse.readableStreamBody
const downloadResponse = await fileClient.read();
if (downloadResponse.readableStreamBody) {
  const downloaded = await streamToBuffer(downloadResponse.readableStreamBody);
  console.log("Downloaded file content:", downloaded.toString());
}

// [Node.js only] A helper method used to read a Node.js readable stream into a Buffer.
async function streamToBuffer(readableStream: NodeJS.ReadableStream): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    const chunks: Buffer[] = [];
    readableStream.on("data", (data) => {
      chunks.push(data instanceof Buffer ? data : Buffer.from(data));
    });
    readableStream.on("end", () => {
      resolve(Buffer.concat(chunks));
    });
    readableStream.on("error", reject);
  });
}

Example usage (browser):

import { DataLakeServiceClient } from "@azure/storage-file-datalake";

const account = "<account>";
const sas = "<sas token>";
const datalakeServiceClient = new DataLakeServiceClient(
  `https://${account}.dfs.core.windows.net${sas}`,
);

const fileSystemName = "<file system name>";
const fileName = "<file name>";
const fileSystemClient = datalakeServiceClient.getFileSystemClient(fileSystemName);
const fileClient = fileSystemClient.getFileClient(fileName);

// Get file content from position 0 to the end
// In browsers, get downloaded data by accessing downloadResponse.contentAsBlob
const downloadResponse = await fileClient.read();
if (downloadResponse.contentAsBlob) {
  const blob = await downloadResponse.contentAsBlob;
  const downloaded = await blob.text();
  console.log(`Downloaded file content ${downloaded}`);
}
function read(offset?: number, count?: number, options?: FileReadOptions): Promise<FileReadResponse>
Parameters
offset

number

Optional. Offset to read file, default value is 0.

count

number

Optional. How many bytes to read, default will read from offset to the end.

Returns

Promise<FileReadResponse>

readToBuffer(Buffer, number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional, pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.

function readToBuffer(buffer: Buffer, offset?: number, count?: number, options?: FileReadToBufferOptions): Promise<Buffer>
Parameters
buffer

Buffer

Buffer to be filled; must have a length larger than count

offset

number

From which position of the Data Lake file to read

count

number

How much data to be read. Will read to the end when passing undefined

Returns

Promise<Buffer>

readToBuffer(number, number, FileReadToBufferOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Reads a Data Lake file in parallel to a buffer. Offset and count are optional, pass 0 for both to read the entire file.

Warning: Buffers can only support files up to about one gigabyte on 32-bit systems or about two gigabytes on 64-bit systems due to limitations of Node.js/V8. For files larger than this size, consider readToFile.

function readToBuffer(offset?: number, count?: number, options?: FileReadToBufferOptions): Promise<Buffer>
Parameters
offset

number

From which position of the Data Lake file to read (in bytes)

count

number

How much data (in bytes) to read. Will read to the end when passing undefined

Returns

Promise<Buffer>
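
A minimal sketch (not from the original page) covering both overloads; the contentLength property used to size the buffer is assumed from the Get Properties response:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Fill a pre-allocated buffer with the whole file...
const properties = await fileClient.getProperties();
const buffer = Buffer.alloc(properties.contentLength ?? 0); // assumed response field
await fileClient.readToBuffer(buffer, 0);

// ...or let the client allocate one and read only the first 1024 bytes.
const head = await fileClient.readToBuffer(0, 1024);
console.log(head.toString());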

readToFile(string, number, number, FileReadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Downloads a Data Lake file to a local file. Fails if the given file path already exists. Offset and count are optional, pass 0 and undefined respectively to download the entire file.

function readToFile(filePath: string, offset?: number, count?: number, options?: FileReadOptions): Promise<FileReadResponse>
Parameters
offset

number

From which position of the file to download.

count

number

How much data to be downloaded. Will download to the end when passing undefined.

Returns

Promise<FileReadResponse>

The response data for file read operation, but with readableStreamBody set to undefined since its content is already read and written into a local file at the specified path.
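
A minimal sketch (not from the original page); the local path is a placeholder:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Download the entire file; fails if ./downloaded-file.txt already exists.
await fileClient.readToFile("./downloaded-file.txt", 0, undefined);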

setExpiry(PathExpiryOptions, FileSetExpiryOptions)

Sets an expiry time on a file; once that time is reached, the file is deleted.

function setExpiry(mode: PathExpiryOptions, options?: FileSetExpiryOptions): Promise<FileSetExpiryResponse>
Parameters Returns

Promise<FileSetExpiryResponse>
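
A minimal sketch (not from the original page); the mode strings and option names ("RelativeToNow", timeToExpireInMs, "Absolute", expiresOn) are not listed on this page and should be treated as assumptions:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Expire (delete) the file 24 hours from now (assumed mode/option names).
await fileClient.setExpiry("RelativeToNow", { timeToExpireInMs: 24 * 60 * 60 * 1000 });

// Or pin an absolute expiry time (assumed mode/option names).
await fileClient.setExpiry("Absolute", { expiresOn: new Date("2030-01-01T00:00:00Z") });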

upload(Blob | ArrayBuffer | ArrayBufferView | Buffer, FileParallelUploadOptions)

Uploads a Buffer (Node.js), Blob, ArrayBuffer, or ArrayBufferView to a file.

function upload(data: Blob | ArrayBuffer | ArrayBufferView | Buffer, options?: FileParallelUploadOptions): Promise<FileUploadResponse>
Parameters
data

Blob | ArrayBuffer | ArrayBufferView | Buffer

Buffer(Node), Blob, ArrayBuffer or ArrayBufferView

Returns

Promise<FileUploadResponse>
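
A minimal sketch (not from the original page); in Node.js the data is typically a Buffer, while in the browser a Blob or ArrayBuffer works the same way:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Upload an in-memory payload in one call; the client handles chunking internally.
const content = Buffer.from("Hello upload!");
await fileClient.upload(content);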

uploadFile(string, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a local file to a Data Lake file.

function uploadFile(filePath: string, options?: FileParallelUploadOptions): Promise<FileUploadResponse>
Parameters
filePath

string

Full path of the local file

Returns

Promise<FileUploadResponse>
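
A minimal sketch (not from the original page); the onProgress callback and its loadedBytes field are assumed from the common Azure Storage upload options:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Upload a local file, logging progress as chunks complete.
await fileClient.uploadFile("./local-data.csv", {
  onProgress: (progress) => console.log(`Uploaded ${progress.loadedBytes} bytes`), // assumed option
});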

uploadStream(Readable, FileParallelUploadOptions)

ONLY AVAILABLE IN NODE.JS RUNTIME.

Uploads a Node.js Readable stream into a Data Lake file. This method tries to create a file, then starts uploading it chunk by chunk. Make sure the potential size of the stream doesn't exceed FILE_MAX_SIZE_BYTES and the potential number of chunks doesn't exceed BLOCK_BLOB_MAX_BLOCKS.

PERFORMANCE IMPROVEMENT TIPS:

function uploadStream(stream: Readable, options?: FileParallelUploadOptions): Promise<FileUploadResponse>
Parameters
stream

Readable

Node.js Readable stream.

Returns

Promise<FileUploadResponse>
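
A minimal sketch (not from the original page); the chunkSize and maxConcurrency option names are assumptions not documented on this page:

import { createReadStream } from "node:fs";
import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Stream a large local file in 4 MiB chunks, five chunks in flight at a time.
await fileClient.uploadStream(createReadStream("./large-file.bin"), {
  chunkSize: 4 * 1024 * 1024, // assumed option name
  maxConcurrency: 5,          // assumed option name
});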

Inherited Method Details

delete(boolean, PathDeleteOptions)

deleteIfExists(boolean, PathDeleteOptions)

exists(PathExistsOptions)

Returns true if the Data Lake file represented by this client exists; false otherwise.

NOTE: Use this function with care, since an existing file might be deleted by other clients or applications. Conversely, new files might be added by other clients or applications after this function completes.

function exists(options?: PathExistsOptions): Promise<boolean>
Parameters Returns

Promise<boolean>

Inherited From DataLakePathClient.exists
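
A minimal sketch (not from the original page):

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Remember that another client may delete or create the file right after this check.
if (await fileClient.exists()) {
  console.log("File exists.");
}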

getAccessControl(PathGetAccessControlOptions)

getDataLakeLeaseClient(string)

getProperties(PathGetPropertiesOptions)

move(string, PathMoveOptions)

move(string, string, PathMoveOptions)

Move directory or file to another file system.

See https://learn.microsoft.com/rest/api/storageservices/datalakestoragegen2/path/create

function move(destinationFileSystem: string, destinationPath: string, options?: PathMoveOptions): Promise<PathMoveResponse>
Parameters
destinationFileSystem

string

Destination file system like "filesystem".

destinationPath

string

Destination directory path like "directory" or file path "directory/file". If the destinationPath is authenticated with SAS, add the SAS to the destination path like "directory/file?sasToken".

Returns

Promise<PathMoveResponse>

Inherited From DataLakePathClient.move
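
A minimal sketch (not from the original page) of both overloads; the destination names are placeholders:

import { DataLakeFileClient } from "@azure/storage-file-datalake";
import { DefaultAzureCredential } from "@azure/identity";

const fileClient = new DataLakeFileClient(
  "https://<account>.dfs.core.windows.net/<file system name>/<file name>",
  new DefaultAzureCredential(),
);

// Rename/move within the same file system...
await fileClient.move("<destination directory>/<new file name>");

// ...or move into a different file system.
await fileClient.move("<destination file system>", "<destination directory>/<new file name>");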

removeAccessControlRecursive(RemovePathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

setAccessControl(PathAccessControlItem[], PathSetAccessControlOptions)

setAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)

setMetadata(Metadata, PathSetMetadataOptions)

setPermissions(PathPermissions, PathSetPermissionsOptions)

toDirectoryClient()

Converts the current DataLakePathClient to a DataLakeDirectoryClient if the current path is a directory.

function toDirectoryClient(): DataLakeDirectoryClient
Returns

Inherited From DataLakePathClient.toDirectoryClient

toFileClient()

Converts the current DataLakePathClient to a DataLakeFileClient if the current path is a file.

function toFileClient(): DataLakeFileClient
Returns

Inherited From DataLakePathClient.toFileClient

updateAccessControlRecursive(PathAccessControlItem[], PathChangeAccessControlRecursiveOptions)
