M20 GCP Performance Loading Test: batch load of a ~12 MB JSON payload leaves the workflow DAG run in "failed" status
I ran a manifest ingestion with a JSON payload of ~12 MB. (Let me know if you need the sample JSON; it can't be uploaded here because the attachment limit is 10 MB.)
DAG run status request:
GET: https://preship.gcp.gnrg-osdu.projects.epam.com/api/workflow/v1/workflow/Osdu_ingest/workflowRun/21e76859-0ea5-489d-ad20-6aa4b27c07ec
Response:
{
    "workflowId": "a1ac5b94-dd98-4c90-8d48-916096076453",
    "runId": "21e76859-0ea5-489d-ad20-6aa4b27c07ec",
    "startTimeStamp": 1695436403031,
    "endTimeStamp": 1695437043735,
    "status": "failed",
    "submittedBy": "preshipping_test_user@osdu-gcp.go3-nrg.projects.epam.com"
}
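For reference, here is a minimal sketch of how I interpret the status response above (the helper function name is mine, not part of the Workflow API; the sample body is copied verbatim from the response):

```python
import json

# Response body from the workflowRun status endpoint, copied from above.
SAMPLE = '''{
  "workflowId": "a1ac5b94-dd98-4c90-8d48-916096076453",
  "runId": "21e76859-0ea5-489d-ad20-6aa4b27c07ec",
  "startTimeStamp": 1695436403031,
  "endTimeStamp": 1695437043735,
  "status": "failed",
  "submittedBy": "preshipping_test_user@osdu-gcp.go3-nrg.projects.epam.com"
}'''

def summarize_run(body: str) -> str:
    """Return a one-line summary of a workflow run, including its duration.

    Timestamps are epoch milliseconds, so the difference is divided by
    1000 to report seconds.
    """
    run = json.loads(body)
    duration_s = (run["endTimeStamp"] - run["startTimeStamp"]) / 1000.0
    return f'run {run["runId"]} {run["status"]} after {duration_s:.0f}s'

print(summarize_run(SAMPLE))
```

So the run was marked failed roughly 10–11 minutes after submission.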
I have checked the DAG's Airflow logs, and below are a few concerns:
- As this is a batch load, one of the process chunks failed, but it shows no logs whatsoever.
- For the last process, I checked the XCom, and it shows that all records were saved successfully. None of the records appear in skipped_ids, which indicates all records were ingested.
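The skipped_ids check described above can be expressed as a small helper; a minimal sketch under the assumption that the XCom value is a JSON object with "record_ids" and "skipped_ids" keys (both key names and the sample payload are hypothetical, for illustration only):

```python
import json

# Hypothetical XCom payload as pulled from the final ingestion task.
# Key names ("record_ids", "skipped_ids") are assumptions for illustration.
XCOM_VALUE = json.dumps({
    "record_ids": ["rec-1", "rec-2", "rec-3"],
    "skipped_ids": [],
})

def all_records_saved(xcom_json: str) -> bool:
    """True when no record was skipped, i.e. skipped_ids is empty."""
    payload = json.loads(xcom_json)
    return len(payload.get("skipped_ids", [])) == 0

print(all_records_saved(XCOM_VALUE))
```

This is why the "failed" DAG run status is confusing: by this check, every record appears to have been ingested.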