GCP 50K Batch Performance Loading Fails
Hello,
I have run the BATCH performance loading for 500 records (runId: 6e861338-c55d-42c8-8c74-1a6707a8658a) and 1000 records (runId: 96a5ffd6-9b12-4e03-9715-3b1b4b3a5391). Both runs completed successfully, with data evidenced in the XCom summary.
When I run the 50,000-record batch load (attempted twice), no runId is generated, and the same error is observed both times:
DEBUG:root:Response: 413
DEBUG:root:json = {"code": 413, "reason": "Failed to send request.", "message": "Unable to send request to Airflow. 413 Request Entity Too Large ... nginx"}

(The message field contains nginx's HTML "413 Request Entity Too Large" error page, trimmed here for readability.)
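For what it's worth, the "413 Request Entity Too Large ... nginx" text suggests the request is being rejected by an nginx reverse proxy in front of Airflow before it reaches the API: nginx's `client_max_body_size` defaults to 1 MB, which a 50,000-record payload could easily exceed, while the 500- and 1000-record payloads stay under it. This is only a guess based on the error text; the host name, port, and limit below are placeholders, not values from my environment:

```nginx
# Hypothetical nginx config for the proxy in front of the Airflow webserver.
# client_max_body_size defaults to 1m; raising it allows larger batch payloads.
server {
    listen 443 ssl;
    server_name airflow.example.com;        # placeholder host

    location / {
        client_max_body_size 50m;           # raise the request-body limit
        proxy_pass http://airflow-web:8080; # placeholder upstream
    }
}
```

If whoever operates the proxy can confirm the current `client_max_body_size`, that would tell us whether this is the cause or whether the 50K batch needs to be split into smaller requests instead.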