How do I log output from an Airflow DAG for debugging?

2024-04-20

I'm writing an Airflow DAG and having some trouble with a function. I'm trying to debug by printing data to standard output and by using the logging library.

My example DAG is:

    from datetime import timedelta
    
    import airflow
    import logging
    
    from airflow.models import DAG
    from airflow.operators.dummy_operator import DummyOperator
    from airflow.contrib.hooks.datadog_hook import DatadogHook
    
    def datadog_event(title, text, dag_id, task_id):
        hook = DatadogHook()
        tags = [
            f'dag:{dag_id}',
            f'task:{task_id}',
        ]
    
        hook.post_event(title, text, tags)
    
    def datadog_event_success(context):
        dag_id = context['task_instance'].dag_id
        task_id = context['task_instance'].task_id
        text = f'Airflow DAG success for {dag_id}\n\nDAG: {dag_id}\nTasks: {task_id}'
        title = f'Airflow DAG success for {dag_id}'
    
        logging.info(title)
        logging.info(text)
        logging.info(dag_id)
        logging.info(task_id)
    
        datadog_event(title, text, dag_id, task_id)
    
    args = {
        'owner': 'airflow',
        'start_date': airflow.utils.dates.days_ago(2),
    }
    
    dag = DAG(
        dag_id='example_callback',
        default_args=args,
        schedule_interval='*/5 * * * *',
        dagrun_timeout=timedelta(minutes=60),
        on_success_callback=datadog_event_success,
    )
    
    my_task = DummyOperator(
        task_id='run_this_last',
        dag=dag,
    )

During a run I get this error:

    airflow[9490]: Process DagFileProcessor4195-Process:
    airflow[9490]: Traceback (most recent call last):
    airflow[9490]:   File "/usr/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
    airflow[9490]:     self.run()
    airflow[9490]:   File "/usr/lib/python3.6/multiprocessing/process.py", line 93, in run
    airflow[9490]:     self._target(*self._args, **self._kwargs)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 148, in _run_file_processor
    airflow[9490]:     result = scheduler_job.process_file(file_path, pickle_dags)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    airflow[9490]:     return func(*args, **kwargs)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1542, in process_file
    airflow[9490]:     self._process_dags(dagbag, dags, ti_keys_to_schedule)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1239, in _process_dags
    airflow[9490]:     self._process_task_instances(dag, tis_out)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
    airflow[9490]:     return func(*args, **kwargs)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 732, in _process_task_instances
    airflow[9490]:     run.update_state(session=session)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
    airflow[9490]:     return func(*args, **kwargs)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/models/dagrun.py", line 318, in update_state
    airflow[9490]:     dag.handle_callback(self, success=True, reason='success', session=session)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
    airflow[9490]:     return func(*args, **kwargs)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/models/dag.py", line 620, in handle_callback
    airflow[9490]:     callback(context)
    airflow[9490]:   File "/home/airflow/analytics/etl_v2/airflow_data/dags/example_bash_operator_andy.py", line 68, in datadog_event_success
    airflow[9490]:     datadog_event(title, text, dag_id, task_id)
    airflow[9490]:   File "/home/airflow/analytics/etl_v2/airflow_data/dags/example_bash_operator_andy.py", line 45, in datadog_event
    airflow[9490]:     hook.post_event(title, text, tags)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/contrib/hooks/datadog_hook.py", line 157, in post_event
    airflow[9490]:     self.validate_response(response)
    airflow[9490]:   File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/contrib/hooks/datadog_hook.py", line 58, in validate_response
    airflow[9490]:     if response['status'] != 'ok':
    airflow[9490]: KeyError: 'status'

But none of what I'm logging shows up in the scheduler, webserver, worker, or task logs, either before or after the error.

I've tested the datadog_event call by manually importing the code on my Airflow worker, and it logs correctly when I run it that way:

    airflow@airflow-worker-0:~/analytics$ /home/airflow/virtualenv/bin/python -i /home/airflow/analytics/etl_v2/airflow_data/dags/example_bash_operator_andy.py
    [2019-08-07 20:48:01,890] {settings.py:213} INFO - settings.configure_orm(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=29941
    [2019-08-07 20:48:02,227] {__init__.py:51} INFO - Using executor DaskExecutor

    >>> datadog_event('My title', 'My task', 'example_bash_operator_andy', 'run_this_last')
    [2019-08-07 20:51:17,542] {datadog_hook.py:54} INFO - Setting up api keys for Datadog
    [2019-08-07 20:51:17,544] {example_bash_operator_andy.py:38} INFO - My title
    [2019-08-07 20:51:17,544] {example_bash_operator_andy.py:39} INFO - My task
    [2019-08-07 20:51:17,544] {example_bash_operator_andy.py:40} INFO - example_bash_operator_andy
    [2019-08-07 20:51:17,545] {example_bash_operator_andy.py:41} INFO - run_this_last
    [2019-08-07 20:51:17,658] {api_client.py:139} INFO - 202 POST https://api.datadoghq.com/api/v1/events (113.2174ms)

My airflow.cfg is posted at https://gist.github.com/andyshinn/d743ddc61956ed7440c500fca962ce92 and I'm using Airflow 1.10.4.

How can I output logging or messages from the DAG itself to better debug what might be going on?


DAG-level callbacks (on_success_callback, on_failure_callback) run in the main scheduler loop; see this open issue about moving their execution out of the scheduler thread: https://issues.apache.org/jira/browse/AIRFLOW-6253?jql=project%20%3D%20AIRFLOW%20AND%20resolution%20%3D%20Unresolved%20AND%20text%20~%20%22on_failure_callback%22%20ORDER%20BY%20priority%20DESC%2C%20updated%20DESC. Exceptions raised in the callback function will show up in the scheduler logs. Irritatingly, though, print and logging output does not seem to make it into the scheduler logs.

For debugging purposes, I usually just raise the information I want to log as an exception, so that it shows up in the scheduler logs.
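For instance, a minimal sketch of that trick applied to the callback from the question; the exception type and message format here are just illustrative:

    def datadog_event_success(context):
        dag_id = context['task_instance'].dag_id
        task_id = context['task_instance'].task_id
        # Raising surfaces these values in the scheduler log, where
        # logging.info() from a DAG-level callback never appears.
        raise RuntimeError(f'DEBUG datadog_event_success: dag_id={dag_id}, task_id={task_id}')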

Alternatively, you can move the callback to the task level, which means moving it into your default_args like so:

    args = {
        'owner': 'airflow',
        'start_date': airflow.utils.dates.days_ago(2),
        'on_success_callback': datadog_event_success
    }

    dag = DAG(
        dag_id='example_callback',
        default_args=args,
        schedule_interval='*/5 * * * *',
        dagrun_timeout=timedelta(minutes=60)
    )

Your callback's logging will now show up in the task logs rather than the scheduler logs. The tradeoff is that the callback fires for every qualifying task, not just once per DAG run; see the sketch below for one way around that.
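If you do want the Datadog event posted only once per DAG run while keeping the task-level callback, one workaround is to guard the callback on a designated task. A minimal sketch, assuming run_this_last remains the final task in the DAG and reusing the datadog_event helper from the question:

    def datadog_event_success(context):
        ti = context['task_instance']
        # Only fire for the designated final task, so the event is posted
        # once per DAG run instead of once per qualifying task.
        if ti.task_id != 'run_this_last':
            return
        title = f'Airflow DAG success for {ti.dag_id}'
        text = f'Airflow DAG success for {ti.dag_id}\n\nDAG: {ti.dag_id}\nTask: {ti.task_id}'
        datadog_event(title, text, ti.dag_id, ti.task_id)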
