# questions
b
Hi all, I've recently begun exploring Kedro as a way to better structure our ML projects. Our company relies heavily on Databricks for our data processes, so I'm quite excited to see the upcoming changes related to Kedro & Databricks. I just have a question about loading Databricks Delta tables through the Kedro data catalog. I see there is a `DeltaTableDataSet` class - is it possible to just provide a table name to this class to load the data, or must it be an absolute path? And is the result then exposed as a Spark DataFrame to pass into pipelines? Thanks, and I'm very keen to learn more about Kedro
d
so we built this about a year ago, and Delta Lake has changed a bit in the time since. If you could give us a bit more context, we could think about how best to do what you're trying to do - whether we extend the existing dataset, introduce a new one, or accept a contribution. It looks like we only support the `DeltaTable.forPath` method today, but in the immediate term it's super easy for you to copy and paste the class above and start using your own custom dataset that uses `DeltaTable.forName` instead. What we need to do is put some thought into how best to change the dataset itself to achieve this
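In the meantime, here's a minimal sketch of what such a custom dataset could look like. The class name `DeltaTableByNameDataSet` and the `table` constructor argument are illustrative, not part of kedro-datasets; only the Kedro `AbstractDataSet` interface and the delta-spark `DeltaTable` API are real:

```python
# A rough sketch of a custom Kedro dataset that loads a Delta table by
# its metastore name via DeltaTable.forName instead of DeltaTable.forPath.
# The class name and constructor argument are hypothetical.
from typing import Any, Dict

from delta.tables import DeltaTable
from kedro.io import AbstractDataSet
from pyspark.sql import SparkSession


class DeltaTableByNameDataSet(AbstractDataSet):
    """Loads a Delta table registered in the metastore by its name."""

    def __init__(self, table: str) -> None:
        self._table = table  # e.g. "my_database.my_table"

    @staticmethod
    def _get_spark() -> SparkSession:
        # On Databricks this returns the already-active session.
        return SparkSession.builder.getOrCreate()

    def _load(self) -> DeltaTable:
        # forName resolves the table through the metastore, so no
        # absolute path is required.
        return DeltaTable.forName(self._get_spark(), self._table)

    def _save(self, data: Any) -> None:
        raise NotImplementedError("This sketch is load-only.")

    def _describe(self) -> Dict[str, Any]:
        return {"table": self._table}
```

You'd then point a catalog entry's `type` at wherever you keep this class and pass `table: my_database.my_table`. On the second part of your question: `_load` here returns a `DeltaTable` object rather than a DataFrame, so a node would call `.toDF()` on it to work with a plain Spark DataFrame.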