GCP R3M15-CSV Ingestion: `execute` raises PipelineFailedError("Dag failed")
*** Reading remote log from gs://us-central1-airflow-v2-36b0ff41-bucket/logs/csv_ingestion/update_status_finished/2022-12-27T15:27:44.295473+00:00/1.log.
[2022-12-27, 15:34:14 UTC] {taskinstance.py:1044} INFO - Dependencies all met for <TaskInstance: csv_ingestion.update_status_finished test_workflow707265 [queued]>
[2022-12-27, 15:34:14 UTC] {taskinstance.py:1250} INFO -
[2022-12-27, 15:34:14 UTC] {taskinstance.py:1251} INFO - Starting attempt 1 of 2
[2022-12-27, 15:34:14 UTC] {taskinstance.py:1252} INFO -
[2022-12-27, 15:34:14 UTC] {taskinstance.py:1271} INFO - Executing <Task(UpdateStatusOperator): update_status_finished> on 2022-12-27 15:27:44.295473+00:00
[2022-12-27, 15:34:14 UTC] {standard_task_runner.py:52} INFO - Started process 185313 to run task
[2022-12-27, 15:34:14 UTC] {standard_task_runner.py:79} INFO - Running: ['airflow', 'tasks', 'run', 'csv_ingestion', 'update_status_finished', 'test_workflow707265', '--job-id', '217613', '--raw', '--subdir', 'DAGS_FOLDER/csv-parser/csv_ingestion_all_steps.py', '--cfg-path', '/tmp/tmpltcu6ban', '--error-file', '/tmp/tmpfmf51_8b']
[2022-12-27, 15:34:14 UTC] {standard_task_runner.py:80} INFO - Job 217613: Subtask update_status_finished
[2022-12-27, 15:34:15 UTC] {task_command.py:298} INFO - Running <TaskInstance: csv_ingestion.update_status_finished test_workflow707265 [running]> on host airflow-worker-6467cc5845-fqzgm
[2022-12-27, 15:34:15 UTC] {taskinstance.py:1448} INFO - Exporting the following env vars: AIRFLOW_CTX_DAG_EMAIL=airflow@example.com AIRFLOW_CTX_DAG_OWNER=airflow AIRFLOW_CTX_DAG_ID=csv_ingestion AIRFLOW_CTX_TASK_ID=update_status_finished AIRFLOW_CTX_EXECUTION_DATE=2022-12-27T15:27:44.295473+00:00 AIRFLOW_CTX_DAG_RUN_ID=test_workflow707265
[2022-12-27, 15:34:15 UTC] {update_status.py:67} INFO - There are failed tasks before this one. So it has status FAILED
[2022-12-27, 15:34:16 UTC] {warnings.py:109} WARNING - /opt/python3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:1043: InsecureRequestWarning: Unverified HTTPS request is being made to host 'preship.gcp.gnrg-osdu.projects.epam.com'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings warnings.warn(
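An aside on the InsecureRequestWarning above: it means the status-update call is hitting the host with certificate verification disabled (the `verify=False` pattern urllib3 warns about). It is not what fails the task here, but the fix it asks for is to use a verifying TLS context rather than suppressing the warning. A minimal stdlib sketch of the verified default:

```python
import ssl

# ssl.create_default_context() gives the verifying configuration that the
# warning above recommends: certificates are required and the hostname is
# checked -- the opposite of an unverified (verify=False) request.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname
```

With `requests`/urllib3-based clients, the equivalent is leaving `verify=True` (the default) or pointing `verify` at the CA bundle that signed the endpoint's certificate, instead of passing `verify=False`.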
[2022-12-27, 15:34:17 UTC] {taskinstance.py:1776} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/opt/python3.8/lib/python3.8/site-packages/osdu_airflow/operators/update_status.py", line 153, in execute
    raise PipelineFailedError("Dag failed")
osdu_ingestion.libs.exceptions.PipelineFailedError: Dag failed
[2022-12-27, 15:34:17 UTC] {taskinstance.py:1279} INFO - Marking task as UP_FOR_RETRY. dag_id=csv_ingestion, task_id=update_status_finished, execution_date=20221227T152744, start_date=20221227T153414, end_date=20221227T153417
[2022-12-27, 15:34:17 UTC] {standard_task_runner.py:93} ERROR - Failed to execute job 217613 for task update_status_finished (Dag failed; 185313)
[2022-12-27, 15:34:17 UTC] {local_task_job.py:154} INFO - Task exited with return code 1
[2022-12-27, 15:34:17 UTC] {local_task_job.py:264} INFO - 0 downstream tasks scheduled from follow-on schedule check
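The line from update_status.py:67 ("There are failed tasks before this one. So it has status FAILED") shows what is going on: `update_status_finished` runs at the end of the DAG, sees that an earlier task in this run already failed, records the FAILED status, and then raises `PipelineFailedError("Dag failed")` on purpose so the DAG run is marked failed. The root cause is therefore in an upstream task's log, not in this one. A minimal plain-Python sketch of that pattern (names are illustrative, not the real osdu_airflow implementation):

```python
class PipelineFailedError(Exception):
    """Raised when the DAG run contains failed upstream tasks."""


def update_status(upstream_states):
    """Return "FINISHED" if every upstream task succeeded, else raise.

    upstream_states: mapping of task_id -> terminal state string,
    e.g. {"validate_csv": "failed", "parse_csv": "success"}.
    """
    failed = [tid for tid, state in upstream_states.items() if state == "failed"]
    if failed:
        # Mirrors the log: "There are failed tasks before this one."
        raise PipelineFailedError("Dag failed")
    return "FINISHED"
```

So to debug, open the logs of the other tasks in run `test_workflow707265` (e.g. via the Airflow UI or the GCS logs folder for this DAG run) and find the first task that failed; fixing that task should let `update_status_finished` report FINISHED instead.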