# questions
e
Hi guys, I'm looking for an MLflow example where:
1. The client sends a query to my MLflow service in Databricks.
2. Inside my `ModelWrapper()`, I receive the data needed to query a Databricks table.
3. The query returns the data needed to execute my model.
4. MLflow returns the model's response.
I can't run the query before the request reaches my MLflow service, because the client is in a different environment. That's why I need to run the query from within my MLflow model. Does anyone have an example?
j
hi @Ecram, just to clarify - is your question about MLflow Serving? or what do you mean by "querying the MLflow service"?
e
Hi @Juan Luis, it's about MLflow Serving.
I'm looking for a working example or guidance on how to achieve the following with MLflow and Databricks:
1. I have an MLflow model deployed in Databricks, accessible via the REST API.
2. The client sends a request to the model endpoint, including some input (e.g., an ID or filter parameters).
3. Inside the `ModelWrapper` (used in the `pyfunc` flavor), I need to connect to a different Databricks workspace and query a table there.
4. The queried data is then used as input to the actual model logic.
5. Finally, the model returns the prediction/result via the MLflow API.
Thanks @Juan Luis
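The steps above can be sketched as a `pyfunc` wrapper that runs the lookup query at prediction time. This is only a sketch, not a tested deployment: the table name `main.prod.features`, the `OTHER_WS_*` environment variables, and the `id` input column are assumptions you would replace with your own, and it uses `databricks-sql-connector` (v3 named parameters) to reach the other workspace.

```python
import os

# The try/except only keeps this sketch importable outside a serving
# environment; in Databricks, mlflow is already installed.
try:
    from mlflow.pyfunc import PythonModel
except ImportError:
    PythonModel = object


def build_lookup_query(table, n_ids):
    """Build a SELECT with one named parameter marker per id.

    databricks-sql-connector v3 uses the ":name" parameter style.
    """
    markers = ", ".join(f":id{i}" for i in range(n_ids))
    return f"SELECT * FROM {table} WHERE id IN ({markers})"


class LookupModelWrapper(PythonModel):
    """pyfunc wrapper that fetches features from another workspace at inference."""

    def load_context(self, context):
        import joblib
        # Assumes the fitted estimator was logged as an artifact named "model".
        self._model = joblib.load(context.artifacts["model"])

    def predict(self, context, model_input):
        from databricks import sql  # databricks-sql-connector
        import pandas as pd

        # The client sends the ids to look up (step 2 in the thread).
        ids = model_input["id"].tolist()
        query = build_lookup_query("main.prod.features", len(ids))  # hypothetical table
        params = {f"id{i}": v for i, v in enumerate(ids)}

        # Credentials for the *other* workspace, e.g. surfaced as environment
        # variables on the serving endpoint via a Databricks secret scope.
        with sql.connect(
            server_hostname=os.environ["OTHER_WS_HOSTNAME"],
            http_path=os.environ["OTHER_WS_HTTP_PATH"],
            access_token=os.environ["OTHER_WS_TOKEN"],
        ) as conn, conn.cursor() as cur:
            cur.execute(query, params)
            cols = [c[0] for c in cur.description]
            features = pd.DataFrame(cur.fetchall(), columns=cols)

        # The queried rows become the model input (steps 4-5).
        return self._model.predict(features.drop(columns=["id"]))
```

You would log this with `mlflow.pyfunc.log_model(python_model=LookupModelWrapper(), ...)` and make sure the endpoint's environment carries the secrets for the other workspace.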
j
and to clarify, where in this sequence would you be using Kedro?
e
At inference time, the model is deployed via MLflow in Azure Databricks, and we access our production tables (also hosted in Azure Databricks) from within the `ModelWrapper`. This means the wrapper includes logic to connect to another Databricks workspace to query the latest data, independent of Kedro.