#questions

Olivia Lihn

12/13/2022, 9:30 PM
[DATABRICKS - AZURE] Hi team! I'm loading data into the catalog from an Azure Blob Storage account, but I only have access to SAS auth with a fixed token. This is how the credentials would look in a Databricks notebook:
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", "<token>")
How should I set these credentials in the catalog? Do I need to create a custom dataset?
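(For context, the dataset being loaded would typically be declared in conf/base/catalog.yml along these lines; the entry below is an illustrative sketch with hypothetical names, not the actual config from this thread:)

my_azure_dataset:
  type: spark.SparkDataSet
  filepath: abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/data
  file_format: parquet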

William Caicedo

12/13/2022, 10:15 PM
Are you using the Spark project starter? If you do, maybe you can put those configurations into the conf/base/spark.yml config file.
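(For reference, in the 0.18.x PySpark starter those settings get applied at startup by a hook in src/<package_name>/hooks.py, roughly like the sketch below; this is an approximation based on the docs, not this project's actual code:)

from kedro.framework.hooks import hook_impl
from pyspark import SparkConf
from pyspark.sql import SparkSession


class SparkHooks:
    @hook_impl
    def after_context_created(self, context) -> None:
        # Read every key (e.g. the fs.azure.* settings) from conf/<env>/spark.yml
        parameters = context.config_loader.get("spark*", "spark*/**")
        spark_conf = SparkConf().setAll(parameters.items())

        # Apply that config when building the shared SparkSession
        spark_session = (
            SparkSession.builder.appName(context.project_path.name)
            .enableHiveSupport()
            .config(conf=spark_conf)
            .getOrCreate()
        )
        spark_session.sparkContext.setLogLevel("WARN")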
👍 2

Olivia Lihn

12/14/2022, 3:53 PM
Good point... in that case, should we add this to spark.yml like this:
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net: SAS
fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net: org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider
fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net: <token>
? If so, could we parametrize the storage account and token using globals or credentials?
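(One way to do that parametrization — assuming Kedro 0.18's TemplatedConfigLoader, which this thread doesn't confirm — is to enable it in settings.py and keep the values in a globals file; the key names below are made up for illustration:)

# src/<package_name>/settings.py
from kedro.config import TemplatedConfigLoader

CONFIG_LOADER_CLASS = TemplatedConfigLoader
CONFIG_LOADER_ARGS = {"globals_pattern": "*globals.yml"}

# conf/base/globals.yml (the token might be better kept in an environment-specific conf or a secrets store)
storage_account: <storage-account>
abfs_sas_token: <token>

# conf/base/spark.yml
fs.azure.account.auth.type.${storage_account}.dfs.core.windows.net: SAS
fs.azure.sas.token.provider.type.${storage_account}.dfs.core.windows.net: org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider
fs.azure.sas.fixed.token.${storage_account}.dfs.core.windows.net: ${abfs_sas_token}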

William Caicedo

12/14/2022, 10:32 PM

Olivia Lihn

12/14/2022, 10:32 PM
Thanks William!
👍 1