# plugins-integrations
m
Hey there, another question on the ibis.TableDataset. We're moving a bunch of our local (DuckDB) code to Databricks and hit a snag. We're using Unity Catalog (UC). I loaded raw tables into UC manually for simplicity and confirmed I can load them using an Ibis connection (see screenshot 1). When I try to load the same table using the TableDataset, I get an error saying "`raw_tracks` cannot be found" (see screenshot 2). I think this is because the load() method doesn't pull in `database` from the config...
```yaml
raw_tracks:
  type: ibis.TableDataset
  table_name: raw_tracks
  connection:
    backend: pyspark
    database: comms_media_dev.dart_extensions
```
```python
def load(self) -> ir.Table:
    return self.connection.table(self._table_name)
```
Updating load() seems fairly simple; something like the code below works. But was the original intent that we could pass a catalog / database through the config here? If so, then perhaps I'm not using the Spark config properly, or Databricks is doing something strange... I posted a question about that here for context.
```python
def load(self) -> ir.Table:
    return self.connection.table(name=self._table_name, database=self._database)
```
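To make the pass-through concrete, here is a minimal, self-contained sketch of the wiring being proposed. The `StubConnection` and `StubTableDataset` names are hypothetical stand-ins (the real ibis.TableDataset builds an actual Ibis backend connection); the stub just records the arguments so you can see the configured `database` reaching the table lookup:

```python
# Hypothetical stand-ins, NOT the real kedro-datasets classes: the point is
# only to show `database` flowing from the config dict into load().

class StubConnection:
    """Stands in for an Ibis backend connection."""

    def table(self, name, database=None):
        # A real Ibis backend would resolve the table here; we echo the call.
        return {"name": name, "database": database}


class StubTableDataset:
    def __init__(self, *, table_name, connection):
        self._table_name = table_name
        # Keep the configured database so load() can qualify the lookup
        # instead of relying on the connection's current default.
        self._database = connection.get("database")
        self.connection = StubConnection()

    def load(self):
        return self.connection.table(self._table_name, database=self._database)


ds = StubTableDataset(
    table_name="raw_tracks",
    connection={"backend": "pyspark", "database": "comms_media_dev.dart_extensions"},
)
print(ds.load())
```

With the `database` threaded through, the lookup is fully qualified even when the session's current catalog/schema points somewhere else.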
d
Would this be the same issue as https://github.com/kedro-org/kedro-plugins/issues/833? I can look a bit more later
m
I was just rereading that to try to figure that out, somehow already forgot about it 🤯
d
> I was just rereading that to try to figure that out, somehow already forgot about it 🤯
So did I 😀
m
It's definitely similar, but DuckDB has no catalog, so there are at least additional requirements here. Honestly, this might be an issue with how I tried to update my Spark session, as I referenced in my SO post. I'm still trying to wrap my head around it...
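One quick way to test whether the session defaults are the culprit: set the catalog and schema explicitly on the Spark session and retry the unqualified name. This is a sketch assuming a Databricks runtime where `spark` is the active session, with the catalog and schema names taken from the config above; it is not verified against this setup:

```python
# Force the session defaults so an unqualified name like `raw_tracks`
# resolves against the intended Unity Catalog location.
spark.sql("USE CATALOG comms_media_dev")  # UC catalog
spark.sql("USE SCHEMA dart_extensions")   # schema within that catalog

# If this now succeeds, the dataset's unqualified lookup was failing
# because the session defaulted to a different catalog/schema.
spark.sql("SELECT * FROM raw_tracks LIMIT 1").show()
```

If that fixes it, passing `database` through in load() (as above) would make the dataset independent of whatever defaults the session happens to carry.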