# questions
o
[DATABRICKS - AZURE] Hi team! I'm loading data into the catalog from an Azure Blob Storage account, but I only have access to SAS auth with a fixed token. This is how the credentials would look in a Databricks notebook:
```python
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", "<token>")
```
How should I set these credentials in the catalog? Do I need to create a custom dataset?
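For context: once the Spark session itself carries the SAS configuration, the catalog entry can usually be a plain Spark dataset pointing at the `abfss://` path, with no custom dataset class. A sketch, where the dataset name, container, account, and path are all placeholders:

```yaml
# conf/base/catalog.yml -- hypothetical entry
raw_events:
  type: spark.SparkDataset  # spark.SparkDataSet in older kedro-datasets releases
  filepath: abfss://<container>@<storage-account>.dfs.core.windows.net/path/to/data
  file_format: parquet
```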
w
Are you using the Spark project starter? If you do, maybe you can put those configurations into the `conf/base/spark.yml` conf file
👍 2
o
Good point... in that case, should we add this to `spark.yml` like this:
```yaml
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net: SAS
fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net: org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider
fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net: <token>
```
If so, could we parametrize the storage account and token using globals or credentials?
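One possible approach for the credentials part (a sketch, not from the thread): instead of templating `spark.yml` (config interpolation usually applies to YAML values, not to keys like these), inject the account and token into the Spark session from the project's `SparkHooks`. The `azure_sas` credentials key below is made up, and the dict-style `config_loader[...]` access assumes a recent Kedro with `OmegaConfigLoader`:

```python
# src/<package>/hooks.py -- sketch of a SparkHooks that injects SAS credentials
from kedro.framework.context import KedroContext
from kedro.framework.hooks import hook_impl
from pyspark import SparkConf
from pyspark.sql import SparkSession


class SparkHooks:
    @hook_impl
    def after_context_created(self, context: KedroContext) -> None:
        # Static Spark options from conf/base/spark.yml
        parameters = context.config_loader["spark"]
        spark_conf = SparkConf().setAll(parameters.items())

        # Hypothetical entry in conf/local/credentials.yml:
        #   azure_sas:
        #     storage_account: <storage-account>
        #     token: <token>
        creds = context.config_loader["credentials"]["azure_sas"]
        suffix = f"{creds['storage_account']}.dfs.core.windows.net"
        spark_conf.set(f"fs.azure.account.auth.type.{suffix}", "SAS")
        spark_conf.set(
            f"fs.azure.sas.token.provider.type.{suffix}",
            "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider",
        )
        spark_conf.set(f"fs.azure.sas.fixed.token.{suffix}", creds["token"])

        # Build the session with both the static and the injected options
        spark = (
            SparkSession.builder.appName(context.project_path.name)
            .config(conf=spark_conf)
            .getOrCreate()
        )
        spark.sparkContext.setLogLevel("WARN")
```

The hook would be registered in `settings.py` with `HOOKS = (SparkHooks(),)`, as the Spark starter typically does; this keeps the token in `conf/local/credentials.yml` and out of version control.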
w
o
Thanks William!
👍 1