Open Subsurface Data Universe Software / Platform / Pre Shipping · Issue #90
Issue created Sep 17, 2021 by Grant Marblestone (@gmarblestone), Maintainer

Unable to upload larger file to Azure ('Connection aborted.', timeout('The write operation timed out'))

I attempted to write a SEG-Y file to Azure via sdutil.

A small file (17 KB) uploads successfully with the following command: python sdutil cp data1.txt sd://opendes/grant-test/grant/data1.txt

However, an 80 MB file fails with the following command: python sdutil cp data2.txt sd://opendes/grant-test/grant/data2.txt

I attempted to track down the source of the issue but was not able to spend much time on it. In \seismic-store-sdutil\sdlib\cmd\cp\cmd.py, at approximately line 47, the method upload_data_chunks is called.

The small file goes through the if/else for a single put in \seismic-store-sdutil\sdutilenv\Lib\site-packages\azure\storage\blob\_upload_helpers.py, while the larger file goes through use_original_upload_path (around line 122).

After that I followed the data through the code to \seismic-store-sdutil\sdutilenv\Lib\site-packages\azure\storage\blob\_shared\uploads.py. The large file appears to be uploaded with a max_concurrency of 1, which seems strange.
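
For context, my understanding is that max_concurrency of 1 is simply the azure-storage-blob default, since sdutil does not appear to pass anything else through. A minimal sketch of what raising it on the upload call would look like (the connection string, container, and blob names here are placeholders for illustration; sdutil builds its client through the seismic-store token flow, not like this):

```python
from azure.storage.blob import BlobClient

# Placeholder client setup; sdutil obtains credentials via seismic-store,
# not from a raw storage connection string like this.
blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="grant-test",
    blob_name="data2.txt",
)

with open("data2.txt", "rb") as data:
    # max_concurrency defaults to 1, so each block of a large blob is
    # sent sequentially; a higher value uploads blocks in parallel.
    blob.upload_blob(data, overwrite=True, max_concurrency=4)
```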

Anyway, at this point I gave up. I was unable to find anywhere to set the timeout.
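
For whoever picks this up: the SDK does seem to accept client-level timeout keywords that are forwarded to the underlying HTTP transport, which is where I would have expected to raise the write timeout. A hedged sketch only; the values are guesses and I have not verified this against the 80 MB file:

```python
from azure.storage.blob import BlobClient

# connection_timeout / read_timeout (seconds) are forwarded to the
# transport when the client is created, so a slow uplink gets more
# time before 'The write operation timed out' fires.
blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="grant-test",
    blob_name="data2.txt",
    connection_timeout=600,
    read_timeout=600,
)
```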

Note: Debasis and I are unable to upload, but Chris can.
