How to Fix the "Status 403, Code AuthorizationFailure, Message: This request is not authorized to perform this operation" Error in Azure Data Factory
If you are using Azure Data Factory to copy data from or to Azure Blob Storage, you may encounter the following error message:
Status: 403
Code: AuthorizationFailure
Message: This request is not authorized to perform this operation.
This error means that the Azure Data Factory service does not have the required permissions to access the Azure Blob Storage account or container. This can happen for several reasons, such as:
- The Azure Blob Storage account or container has a firewall or network rule that blocks the Azure Data Factory service.
- The Azure Blob Storage account or container has a private endpoint that prevents the Azure Data Factory service from accessing it.
- The Azure Data Factory service does not have the correct role assignment or access policy to access the Azure Blob Storage account or container.
In this blog post, we will show you how to fix this error by following these steps:
- Check and update your firewall and network settings for Azure Blob Storage.
- Check and update your private endpoint settings for Azure Blob Storage.
- Check and update your role assignment and access policy settings for Azure Data Factory.
Check and update your firewall and network settings for Azure Blob Storage
The first step is to check and update your firewall and network settings for Azure Blob Storage to make sure they allow the Azure Data Factory service to access it. You can do this by following these steps:
- In the Azure portal, go to your Azure Blob Storage account and select Networking under Settings.
- Under Firewall and virtual networks, check whether public network access is restricted to selected networks. The broadest fix is to allow access from all networks, which lets the Azure Data Factory service reach the storage account from any IP address.
- If you want to keep the firewall enabled, under Exceptions select Allow Azure services on the trusted services list to access this storage account instead. This lets the Azure Data Factory service bypass the firewall rules; note that for Data Factory this exception generally requires authenticating with a managed identity.
- Save your changes and test your data factory pipeline again.
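If you prefer the command line, the portal steps above can be sketched with the Azure CLI. The resource group and storage account names below are placeholders; substitute your own:

```shell
# Placeholder names - replace with your own resource group and storage account.
RG="my-resource-group"
ACCOUNT="mystorageaccount"

# Broadest option: allow access from all networks.
az storage account update \
  --resource-group "$RG" --name "$ACCOUNT" \
  --default-action Allow

# Safer option: keep the firewall on (deny by default), but let
# trusted Azure services bypass it.
az storage account update \
  --resource-group "$RG" --name "$ACCOUNT" \
  --default-action Deny --bypass AzureServices
```

Only one of the two `az storage account update` calls is needed, depending on how open you want the account to be.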
Check and update your private endpoint settings for Azure Blob Storage
The second step is to check and update your private endpoint settings for Azure Blob Storage to make sure they allow the Azure Data Factory service to access it. You can do this by following these steps:
- In the Azure portal, go to your Azure Blob Storage account and select Private endpoint connections under Settings.
- If you have any private endpoint connections, make sure they are in Approved state. If they are in Pending or Rejected state, you need to approve them or delete them.
- If you don’t have any private endpoint connections, you can create one. Note that if your data factory runs inside a managed virtual network, you should instead create a managed private endpoint from the Data Factory UI (Manage -> Managed private endpoints); the steps below create a regular private endpoint in one of your own virtual networks:
- In the same page, select Create private endpoint.
- In the Basics tab, enter a name and resource group for your private endpoint, and select Next: Resource.
- In the Resource tab, select Microsoft.Storage/storageAccounts as the resource type, select your storage account as the resource, and select blob as the target sub-resource. Then select Next: Configuration.
- In the Configuration tab, select an existing virtual network and subnet where you want to create your private endpoint, or create a new one. Then select Review + create.
- In the Review + create tab, review your settings and select Create.
- Wait for the deployment to complete and verify that your private endpoint connection is in Approved state.
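The same private endpoint setup can be sketched with the Azure CLI. The resource, network, and endpoint names below are placeholders:

```shell
# Placeholder names - replace with your own resources.
RG="my-resource-group"
ACCOUNT="mystorageaccount"
VNET="my-vnet"
SUBNET="my-subnet"

# Resource ID of the storage account (the private endpoint's target).
STORAGE_ID=$(az storage account show \
  --resource-group "$RG" --name "$ACCOUNT" --query id -o tsv)

# Create a private endpoint targeting the blob sub-resource.
az network private-endpoint create \
  --resource-group "$RG" --name "my-blob-pe" \
  --vnet-name "$VNET" --subnet "$SUBNET" \
  --private-connection-resource-id "$STORAGE_ID" \
  --group-id blob \
  --connection-name "my-blob-pe-conn"

# Verify that the connection state is Approved.
az network private-endpoint-connection list --id "$STORAGE_ID" -o table
```

If a connection shows as Pending, it can be approved with `az network private-endpoint-connection approve`.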
Check and update your role assignment and access policy settings for Azure Data Factory
The third step is to check and update your role assignment and access policy settings for Azure Data Factory to make sure they grant the required permissions to access the Azure Blob Storage account or container. You can do this by following these steps:
- In the Azure portal, go to your data factory resource and select Identity under Settings.
- Copy the Object (principal) ID of your data factory’s system-assigned managed identity. This unique identifier represents your data factory service in Azure AD.
- Go to your Azure Blob Storage account and select Access control (IAM).
- Select Add -> Add role assignment.
- In the Add role assignment pane, select Storage Blob Data Contributor as the role. Under Assign access to, choose Managed identity, select your data factory’s system-assigned managed identity (you can search by the data factory name or the Object ID you copied), and complete the assignment. This grants your data factory service read/write/delete permissions on all blobs in your storage account.
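The role assignment can also be sketched with the Azure CLI (the `az datafactory` command requires the datafactory CLI extension; the names below are placeholders):

```shell
# Placeholder names - replace with your own resources.
RG="my-resource-group"
ACCOUNT="mystorageaccount"
ADF="my-data-factory"

# Object (principal) ID of the data factory's system-assigned managed identity.
PRINCIPAL_ID=$(az datafactory show \
  --resource-group "$RG" --name "$ADF" \
  --query identity.principalId -o tsv)

# Resource ID of the storage account, used as the assignment scope.
STORAGE_ID=$(az storage account show \
  --resource-group "$RG" --name "$ACCOUNT" --query id -o tsv)

# Grant the identity read/write/delete access to blobs in the account.
az role assignment create \
  --assignee-object-id "$PRINCIPAL_ID" \
  --assignee-principal-type ServicePrincipal \
  --role "Storage Blob Data Contributor" \
  --scope "$STORAGE_ID"
```

Scoping the assignment to the storage account resource ID keeps the grant from spilling over to other resources in the resource group.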
- Alternatively, if you want to grant more granular permissions on a specific container, you can use Access policies instead of Role assignments. To do this, follow these steps:
- Go to your container under Containers in your storage account and select Access policy under Settings.
- Select Add policy under Container access policy.
- In the Add policy pane, enter a name for your policy, select the permissions you want to grant (such as Read, Write, Delete), enter a start date and an expiry date for your policy, and select OK.
- A stored access policy is not itself a SAS token, so next generate a SAS token that references the policy: go to Shared access tokens under Settings for your container, choose your policy under Stored access policy, select Generate SAS token and URL, and copy the Blob SAS token.
- Go to your data factory resource and open Azure Data Factory Studio (formerly Author & Monitor).
- In the Data Factory UI, go to your linked service for Azure Blob Storage and select Edit.
- In the Edit linked service pane, select SAS URI as the authentication method, enter the SAS URL for your container’s blob endpoint, and paste the SAS token you generated. Then select Test connection and save the linked service.
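The stored-access-policy approach can also be sketched with the Azure CLI. The account, container, and policy names below are placeholders, and the account key is shown only as a placeholder value:

```shell
# Placeholder names - replace with your own resources.
ACCOUNT="mystorageaccount"
CONTAINER="my-container"
KEY="<storage-account-key>"

# Create a stored access policy on the container with
# read/write/delete/list permissions and an expiry date.
az storage container policy create \
  --account-name "$ACCOUNT" --account-key "$KEY" \
  --container-name "$CONTAINER" \
  --name "adf-policy" \
  --permissions rwdl \
  --expiry 2025-12-31T00:00:00Z

# Generate a SAS token that references the stored access policy.
# Permissions and expiry come from the policy, not the token itself.
az storage container generate-sas \
  --account-name "$ACCOUNT" --account-key "$KEY" \
  --name "$CONTAINER" \
  --policy-name "adf-policy" \
  -o tsv
```

Because the SAS token inherits its permissions and expiry from the stored access policy, you can later revoke or shorten access by editing the policy, without regenerating tokens.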
By following these steps, you should be able to fix the "Status 403, Code AuthorizationFailure, Message: This request is not authorized to perform this operation" error and copy data to or from Azure Blob Storage using Azure Data Factory.