I'm writing an Airflow DAG and having some problems with the callback functions. I'm trying to debug by printing data to stdout and using the logging library.

My example DAG is:
from datetime import timedelta

import airflow
import logging

from airflow.models import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.contrib.hooks.datadog_hook import DatadogHook


def datadog_event(title, text, dag_id, task_id):
    hook = DatadogHook()
    tags = [
        f'dag:{dag_id}',
        f'task:{task_id}',
    ]
    hook.post_event(title, text, tags)


def datadog_event_success(context):
    dag_id = context['task_instance'].dag_id
    task_id = context['task_instance'].task_id
    text = f'Airflow DAG success for {dag_id}\n\nDAG: {dag_id}\nTasks: {task_id}'
    title = f'Airflow DAG success for {dag_id}'
    logging.info(title)
    logging.info(text)
    logging.info(dag_id)
    logging.info(task_id)
    datadog_event(title, text, dag_id, task_id)


args = {
    'owner': 'airflow',
    'start_date': airflow.utils.dates.days_ago(2),
}

dag = DAG(
    dag_id='example_callback',
    default_args=args,
    schedule_interval='*/5 * * * *',
    dagrun_timeout=timedelta(minutes=60),
    on_success_callback=datadog_event_success,
)

my_task = DummyOperator(
    task_id='run_this_last',
    dag=dag,
)
During a run I get this error:
airflow[9490]: Process DagFileProcessor4195-Process:
airflow[9490]: Traceback (most recent call last):
airflow[9490]: File "/usr/lib/python3.6/multiprocessing/process.py", line 258, in _bootstrap
airflow[9490]: self.run()
airflow[9490]: File "/usr/lib/python3.6/multiprocessing/process.py", line 93, in run
airflow[9490]: self._target(*self._args, **self._kwargs)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 148, in _run_file_processor
airflow[9490]: result = scheduler_job.process_file(file_path, pickle_dags)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
airflow[9490]: return func(*args, **kwargs)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1542, in process_file
airflow[9490]: self._process_dags(dagbag, dags, ti_keys_to_schedule)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 1239, in _process_dags
airflow[9490]: self._process_task_instances(dag, tis_out)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 74, in wrapper
airflow[9490]: return func(*args, **kwargs)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/jobs/scheduler_job.py", line 732, in _process_task_instances
airflow[9490]: run.update_state(session=session)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
airflow[9490]: return func(*args, **kwargs)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/models/dagrun.py", line 318, in update_state
airflow[9490]: dag.handle_callback(self, success=True, reason='success', session=session)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/utils/db.py", line 70, in wrapper
airflow[9490]: return func(*args, **kwargs)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/models/dag.py", line 620, in handle_callback
airflow[9490]: callback(context)
airflow[9490]: File "/home/airflow/analytics/etl_v2/airflow_data/dags/example_bash_operator_andy.py", line 68, in datadog_event_success
airflow[9490]: datadog_event(title, text, dag_id, task_id)
airflow[9490]: File "/home/airflow/analytics/etl_v2/airflow_data/dags/example_bash_operator_andy.py", line 45, in datadog_event
airflow[9490]: hook.post_event(title, text, tags)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/contrib/hooks/datadog_hook.py", line 157, in post_event
airflow[9490]: self.validate_response(response)
airflow[9490]: File "/home/airflow/virtualenv/lib/python3.6/site-packages/airflow/contrib/hooks/datadog_hook.py", line 58, in validate_response
airflow[9490]: if response['status'] != 'ok':
airflow[9490]: KeyError: 'status'
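Reading the last frames, the check that blows up seems to boil down to the following (my paraphrase of `validate_response` in `datadog_hook.py` based on the traceback, using `ValueError` in place of Airflow's own exception type to keep the sketch self-contained). An error response from the Datadog API apparently has no `'status'` key at all, so the lookup itself fails before any status comparison happens:

```python
def validate_response(response):
    # Mirrors the line shown in the traceback:
    # the hook only checks response['status'] != 'ok'
    if response['status'] != 'ok':
        raise ValueError('Datadog returned: {}'.format(response))

# A success response passes the check:
validate_response({'status': 'ok'})

# But a response without a 'status' key (e.g. an API error payload)
# raises KeyError before the hook can report anything useful:
try:
    validate_response({'errors': ['Forbidden']})
except KeyError as exc:
    print('KeyError:', exc)
```

So the `KeyError: 'status'` itself may just be masking whatever the API actually returned.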
But none of what I log appears before or after the error, in the scheduler, webserver, worker, or task logs.

I have tested `datadog_event` by importing the code manually on my Airflow worker, and it logs correctly when I run it that way:
airflow@airflow-worker-0:~/analytics$ /home/airflow/virtualenv/bin/python -i /home/airflow/analytics/etl_v2/airflow_data/dags/example_bash_operator_andy.py
[2019-08-07 20:48:01,890] {settings.py:213} INFO - settings.configure_orm(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=29941
[2019-08-07 20:48:02,227] {__init__.py:51} INFO - Using executor DaskExecutor
>>> datadog_event('My title', 'My task', 'example_bash_operator_andy', 'run_this_last')
[2019-08-07 20:51:17,542] {datadog_hook.py:54} INFO - Setting up api keys for Datadog
[2019-08-07 20:51:17,544] {example_bash_operator_andy.py:38} INFO - My title
[2019-08-07 20:51:17,544] {example_bash_operator_andy.py:39} INFO - My task
[2019-08-07 20:51:17,544] {example_bash_operator_andy.py:40} INFO - example_bash_operator_andy
[2019-08-07 20:51:17,545] {example_bash_operator_andy.py:41} INFO - run_this_last
[2019-08-07 20:51:17,658] {api_client.py:139} INFO - 202 POST https://api.datadoghq.com/api/v1/events (113.2174ms)
My airflow.cfg is posted at https://gist.github.com/andyshinn/d743ddc61956ed7440c500fca962ce92 and I'm using Airflow 1.10.4.
How can I output logging or messages from the DAG itself so I can better debug what might be happening?
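For reference, one thing I've considered is wrapping the callback so any exception it raises is logged explicitly before propagating. This is a minimal sketch of a hypothetical helper (`log_callback_errors` is my own name, not something Airflow provides):

```python
import functools
import logging

log = logging.getLogger(__name__)


def log_callback_errors(callback):
    """Wrap a DAG-level callback so failures are logged with a
    traceback before being re-raised (hypothetical helper)."""
    @functools.wraps(callback)
    def wrapper(context):
        try:
            return callback(context)
        except Exception:
            log.exception('callback %s failed', callback.__name__)
            raise
    return wrapper
```

which would be used in the DAG definition as `on_success_callback=log_callback_errors(datadog_event_success)`, but I'm not sure where that log output would land, given that the callback runs inside the scheduler's DAG file processor rather than a task.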