Chunked Cache Upload APIs #128
Conversation
zarenner
left a comment
Reserve before tarring up cache?
(force-pushed from 7f6523f to 71d9426)
Implement a (very small number of) retries as discussed, or did you decide not to?

@zarenner I've added retries to the upload chunk requests
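A small, fixed number of retries around each chunk request could be sketched as below. This is an illustrative helper, not the PR's actual implementation; the names `withRetries` and `MAX_RETRIES` are assumptions.

```typescript
// Hypothetical sketch: retry an async operation (e.g. a chunk upload)
// a very small, fixed number of times before giving up.
const MAX_RETRIES = 2; // illustrative; the PR's actual count may differ

async function withRetries<T>(
    operation: () => Promise<T>,
    retries: number = MAX_RETRIES
): Promise<T> {
    let lastError: unknown;
    // One initial attempt plus `retries` re-attempts.
    for (let attempt = 0; attempt <= retries; attempt++) {
        try {
            return await operation();
        } catch (error) {
            lastError = error;
        }
    }
    throw lastError;
}
```

Each chunk upload would then be wrapped as `withRetries(() => uploadChunk(...))`, so a transient network failure only re-sends that one chunk rather than the whole archive.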
src/cacheHttpClient.ts (outdated)
```typescript
    ? fileSize - offset
    : MAX_CHUNK_SIZE;
const start = offset;
const end = offset + chunkSize - 1;
```
What happens when we try to upload a size 0 file, i.e. when end is -1?
In that case:
offset = 0, chunkSize = fileSize - offset = 0, so end = -1, which isn't a valid range since the content range is inclusive
Is a size 0 file valid? I'd imagine something else will fail before it gets here
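The edge case above can be checked in a standalone sketch. This assumes an illustrative `MAX_CHUNK_SIZE` constant and a hypothetical `chunkRange` helper; it mirrors the arithmetic in the diff, not the actual `cacheHttpClient.ts` code.

```typescript
const MAX_CHUNK_SIZE = 4 * 1024 * 1024; // illustrative 4 MB chunk size

// Compute the inclusive [start, end] byte range for the chunk at `offset`,
// matching the arithmetic discussed in the review thread.
function chunkRange(
    fileSize: number,
    offset: number
): { start: number; end: number } {
    const chunkSize =
        fileSize - offset < MAX_CHUNK_SIZE ? fileSize - offset : MAX_CHUNK_SIZE;
    // Content-Range is inclusive, hence the -1.
    return { start: offset, end: offset + chunkSize - 1 };
}

// For fileSize = 0: chunkSize = 0 and end = -1, an invalid inclusive range,
// so a zero-length archive would need to be rejected before reaching here.
```

This makes the reviewer's point concrete: the inclusive range representation simply cannot describe an empty body, so the empty-file case has to fail earlier in the pipeline.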
To enable larger cache sizes, we're updating our internal APIs to upload the cache in chunks, preventing the network timeouts caused by the previous iteration's single large upload stream.
Bumping the per-cache limit to match the per-repo limit, as that's the only limit we have now.
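The overall chunked-upload loop might look like the sketch below. The `uploadChunk` callback and the default chunk size are assumptions for illustration; the actual internal API and its endpoints are not shown in this thread.

```typescript
// Hypothetical sketch: upload a file of `fileSize` bytes in fixed-size
// chunks, passing each chunk's inclusive [start, end] byte range to an
// assumed `uploadChunk` helper (e.g. one that sets a Content-Range header).
async function uploadFileInChunks(
    fileSize: number,
    uploadChunk: (start: number, end: number) => Promise<void>,
    maxChunkSize: number = 4 * 1024 * 1024 // illustrative 4 MB chunks
): Promise<void> {
    let offset = 0;
    while (offset < fileSize) {
        const chunkSize = Math.min(maxChunkSize, fileSize - offset);
        // Content-Range is inclusive, hence the -1.
        await uploadChunk(offset, offset + chunkSize - 1);
        offset += chunkSize;
    }
}
```

Splitting the stream this way bounds the duration of any single request, which is what avoids the timeouts seen with one large upload, and it gives each chunk an independent unit to retry on failure.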