Olivia Lihn
12/13/2022, 9:30 PM
spark.conf.set("fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net", "SAS")
spark.conf.set("fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net", "org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider")
spark.conf.set("fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net", "<token>")
How should I set these credentials in the catalog? Do I need to create a custom dataset?
William Caicedo
12/13/2022, 10:15 PM
You can set those in the conf/base/spark.yml conf file.
Olivia Lihn
12/14/2022, 3:53 PM
In spark.yml, like this:
fs.azure.account.auth.type.<storage-account>.dfs.core.windows.net: SAS
fs.azure.sas.token.provider.type.<storage-account>.dfs.core.windows.net: org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider
fs.azure.sas.fixed.token.<storage-account>.dfs.core.windows.net: <token>
If so, could we parametrize the storage account and token using globals or credentials?
William Caicedo
12/14/2022, 10:32 PM
Olivia Lihn
12/14/2022, 10:32 PM
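One possible approach to the parametrization question above is Kedro's TemplatedConfigLoader, which substitutes `${...}` placeholders in config files from a globals file. The file names and placeholder keys below (`storage_account`, `sas_token`) are illustrative assumptions, not taken from the thread:

```yaml
# conf/base/spark.yml — placeholders filled in from globals (hypothetical keys)
fs.azure.account.auth.type.${storage_account}.dfs.core.windows.net: SAS
fs.azure.sas.token.provider.type.${storage_account}.dfs.core.windows.net: org.apache.hadoop.fs.azurebfs.sas.FixedSASTokenProvider
fs.azure.sas.fixed.token.${storage_account}.dfs.core.windows.net: ${sas_token}
```

```yaml
# conf/base/globals.yml — values supplied in one place (sketch; keep real
# secrets out of version control, e.g. in conf/local/ instead)
storage_account: <storage-account>
sas_token: <token>
```

For this to take effect, `TemplatedConfigLoader` would need to be registered as the project's config loader (in `settings.py`, with a `globals_pattern` such as `*globals.yml`); whether that fits depends on the Kedro version in use.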