Dynamical arguments for function inside ExternalPythonOperator scheduled using asset events #63806
michaltarana asked this question in Q&A · Unanswered
I am struggling with passing dynamic parameters to a task and would be grateful for any ideas or suggestions. This is Apache Airflow 3.1.7.
In my DAG, there is a `@task.external_python` task that is executed using ExternalPythonOperator, since it depends on a Conda environment. The task itself works fine, and so does the DAG. I am able to pass arguments to the external Python task using `templates_dict`, as long as their values are known at the time the DAG is triggered.

However, I would like to run this DAG as part of an asset-event-aware scheduling pipeline. The DAG is triggered by an `@asset` that emits some additional information to the asset event, and I would like to use this metadata in the external Python task.
Would something like this be possible? If not, is there any other way to dynamically pass parameters from other tasks to the ExternalPythonOperator? I am aware that the context is not available there. It seems, however, that even templating does not work: the code above tells me that the name `triggering_asset_events` is not recognized. Similarly, adding another argument `inlet_events` to `consume()` does not work. For some particular reasons, I would like to avoid installing Apache Airflow into the environment of the external Python interpreter.
I would be very grateful for any insight or hint.