This article shows you how to use Node.js to get, set, and update the access control lists of directories and files.
Prerequisites
Azure CLI version 2.6.0 or higher.
Set up your project
This section walks you through preparing a project to work with the Azure Data Lake Storage client library for JavaScript.
Install packages
Install packages for the Azure Data Lake Storage and Azure Identity client libraries using the npm install command. The @azure/identity package is needed for passwordless connections to Azure services.
npm install @azure/storage-file-datalake
npm install @azure/identity
Load modules
Add the following code at the top of your file to load the required modules:
const {
    DataLakeServiceClient,
    StorageSharedKeyCredential
} = require("@azure/storage-file-datalake");
const { DefaultAzureCredential } = require("@azure/identity");
Connect to the account
To run the code examples in this article, you need to create a DataLakeServiceClient instance that represents the storage account. You can authorize the client object with Microsoft Entra ID credentials or with an account key.
You can use the Azure identity client library for JavaScript to authenticate your application with Microsoft Entra ID.
First, assign an Azure role-based access control (Azure RBAC) role that grants access to blob data, such as Storage Blob Data Owner, to your security principal.
Next, create a DataLakeServiceClient instance and pass in a new instance of the DefaultAzureCredential class.
function GetDataLakeServiceClientAD(accountName) {
    const dataLakeServiceClient = new DataLakeServiceClient(
        `https://${accountName}.dfs.core.windows.net`,
        new DefaultAzureCredential()
    );
    return dataLakeServiceClient;
}
To learn more about using DefaultAzureCredential
to authorize access to data, see Overview: Authenticate JavaScript apps to Azure using the Azure SDK.
You can authorize access to data using your account access keys (Shared Key). This example creates a DataLakeServiceClient instance that is authorized with the account key.
function GetDataLakeServiceClient(accountName, accountKey) {
    const sharedKeyCredential =
        new StorageSharedKeyCredential(accountName, accountKey);
    const dataLakeServiceClient = new DataLakeServiceClient(
        `https://${accountName}.dfs.core.windows.net`,
        sharedKeyCredential
    );
    return dataLakeServiceClient;
}
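Both factory functions target the storage account's Data Lake endpoint, which differs from the Blob endpoint only in its host suffix (dfs rather than blob). As a minimal sketch (the account name here is a placeholder), the endpoint URL both functions build is:

```javascript
// Sketch: the Data Lake (dfs) endpoint URL used by both factory functions above.
// Data Lake Storage operations such as ACL management go through the dfs
// endpoint, not the blob endpoint (https://<account>.blob.core.windows.net).
function getDfsEndpoint(accountName) {
    return `https://${accountName}.dfs.core.windows.net`;
}

console.log(getDfsEndpoint("mystorageaccount"));
// https://mystorageaccount.dfs.core.windows.net
```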
Caution
Authorization with Shared Key is not recommended as it may be less secure. For optimal security, disable authorization via Shared Key for your storage account, as described in Prevent Shared Key authorization for an Azure Storage account.
Use of access keys and connection strings should be limited to initial proof of concept apps or development prototypes that don't access production or sensitive data. Otherwise, the token-based authentication classes available in the Azure SDK should always be preferred when authenticating to Azure resources.
Microsoft recommends that clients use either Microsoft Entra ID or a shared access signature (SAS) to authorize access to data in Azure Storage. For more information, see Authorize operations for data access.
Get and set a directory ACL
This example gets and then sets the ACL of a directory named my-directory. It gives the owning user read, write, and execute permissions, gives the owning group only read and execute permissions, and gives all others only read access.
Note
If your application authorizes access by using Microsoft Entra ID, then make sure that the security principal that your application uses to authorize access has been assigned the Storage Blob Data Owner role. To learn more about how ACL permissions are applied and the effects of changing them, see Access control in Azure Data Lake Storage.
async function ManageDirectoryACLs(fileSystemClient) {
    const directoryClient = fileSystemClient.getDirectoryClient("my-directory");
    const permissions = await directoryClient.getAccessControl();
    console.log(permissions.acl);

    // Owning user: rwx, owning group: r-x, all others: r--
    const acl = [
        {
            accessControlType: "user",
            entityId: "",
            defaultScope: false,
            permissions: {
                read: true,
                write: true,
                execute: true
            }
        },
        {
            accessControlType: "group",
            entityId: "",
            defaultScope: false,
            permissions: {
                read: true,
                write: false,
                execute: true
            }
        },
        {
            accessControlType: "other",
            entityId: "",
            defaultScope: false,
            permissions: {
                read: true,
                write: false,
                execute: false
            }
        }
    ];

    await directoryClient.setAccessControl(acl);
}
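This article also covers updating ACLs. Because setAccessControl replaces the entire ACL, the usual update pattern is to fetch the current ACL, modify only the entries you need in memory, and write the whole list back. The helper below is a hypothetical sketch of the in-memory step; it works on plain ACL entry objects, so it runs without a storage connection.

```javascript
// Hypothetical helper: return a copy of an ACL with one entry's permissions
// merged with new values. setAccessControl replaces the entire ACL, so you
// modify the fetched list and write it all back.
function withPermissions(acl, accessControlType, permissions) {
    return acl.map((entry) =>
        entry.accessControlType === accessControlType && entry.entityId === ""
            ? { ...entry, permissions: { ...entry.permissions, ...permissions } }
            : entry
    );
}

// Example: revoke write from the owning user, leaving other entries untouched.
const current = [
    { accessControlType: "user", entityId: "", defaultScope: false,
      permissions: { read: true, write: true, execute: true } },
    { accessControlType: "other", entityId: "", defaultScope: false,
      permissions: { read: true, write: false, execute: false } }
];
const updated = withPermissions(current, "user", { write: false });
console.log(updated[0].permissions.write); // false
// Then apply the result: await directoryClient.setAccessControl(updated);
```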
You can also get and set the ACL of the root directory of a container. To get the root directory, pass a slash (/) into the DataLakeFileSystemClient.getDirectoryClient method.
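The root-directory flow can be sketched as follows. The stand-in object below is a minimal mock so the snippet runs offline; in a real app, fileSystemClient would come from dataLakeServiceClient.getFileSystemClient, and the ACL values would be whatever the service returns.

```javascript
// Sketch: reading the root directory's ACL. The mock below stands in for a
// real DataLakeFileSystemClient so the flow can be followed without Azure.
async function getRootAcl(fileSystemClient) {
    const rootClient = fileSystemClient.getDirectoryClient("/");
    const permissions = await rootClient.getAccessControl();
    return permissions.acl;
}

// Minimal stand-in for a DataLakeFileSystemClient (hypothetical data):
const mockFileSystemClient = {
    getDirectoryClient: () => ({
        getAccessControl: async () => ({
            acl: [
                { accessControlType: "user", entityId: "", defaultScope: false,
                  permissions: { read: true, write: true, execute: true } }
            ]
        })
    })
};

getRootAcl(mockFileSystemClient).then((acl) => console.log(acl.length)); // prints 1
```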
Get and set a file ACL
This example gets and then sets the ACL of a file named upload-file.txt. It gives the owning user read, write, and execute permissions, gives the owning group only read and execute permissions, and gives all others only read access.
Note
If your application authorizes access by using Microsoft Entra ID, then make sure that the security principal that your application uses to authorize access has been assigned the Storage Blob Data Owner role. To learn more about how ACL permissions are applied and the effects of changing them, see Access control in Azure Data Lake Storage.
async function ManageFileACLs(fileSystemClient) {
    const fileClient = fileSystemClient.getFileClient("my-directory/upload-file.txt");
    const permissions = await fileClient.getAccessControl();
    console.log(permissions.acl);

    // Owning user: rwx, owning group: r-x, all others: r--
    const acl = [
        {
            accessControlType: "user",
            entityId: "",
            defaultScope: false,
            permissions: {
                read: true,
                write: true,
                execute: true
            }
        },
        {
            accessControlType: "group",
            entityId: "",
            defaultScope: false,
            permissions: {
                read: true,
                write: false,
                execute: true
            }
        },
        {
            accessControlType: "other",
            entityId: "",
            defaultScope: false,
            permissions: {
                read: true,
                write: false,
                execute: false
            }
        }
    ];

    await fileClient.setAccessControl(acl);
}
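Each entry in these ACL arrays corresponds to one part of the POSIX-style short form that Data Lake Storage ACLs are commonly written in, such as user::rwx,group::r-x,other::r--. A hypothetical helper to render that short form, useful for logging what you are about to set:

```javascript
// Hypothetical helper: render ACL entries in the POSIX-style short form,
// e.g. "user::rwx,group::r-x,other::r--".
function toShortForm(acl) {
    return acl
        .map((e) => {
            const p = e.permissions;
            const bits =
                (p.read ? "r" : "-") + (p.write ? "w" : "-") + (p.execute ? "x" : "-");
            return `${e.accessControlType}:${e.entityId}:${bits}`;
        })
        .join(",");
}

const exampleAcl = [
    { accessControlType: "user", entityId: "", defaultScope: false,
      permissions: { read: true, write: true, execute: true } },
    { accessControlType: "group", entityId: "", defaultScope: false,
      permissions: { read: true, write: false, execute: true } },
    { accessControlType: "other", entityId: "", defaultScope: false,
      permissions: { read: true, write: false, execute: false } }
];
console.log(toShortForm(exampleAcl)); // user::rwx,group::r-x,other::r--
```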
See also