
fix(services/s3): Batch max operations from config #2354

Closed

Conversation

manulpatel (Contributor)

Related issue: #2228

This PR allows `batch_max_operations` to be read from the config, falling back to a default value of 1000 when it is not explicitly specified.

@@ -314,6 +314,8 @@ pub struct S3Builder {
/// the part size of s3 multipart upload, which should be 5 MiB to 5 GiB.
/// There is no minimum size limit on the last part of your multipart upload
write_min_size: Option<usize>,

Member

Please add a new function for batch_max_operations so that users can assign it.
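Such a setter would presumably follow the chained-setter style of the other `S3Builder` methods. A minimal sketch, assuming a simplified stand-in struct rather than OpenDAL's actual `S3Builder`:

```rust
// Hypothetical sketch of the requested setter, modeled on the chained
// setter style of other S3Builder methods. The simplified struct here
// is an assumption for illustration, not OpenDAL's real S3Builder.
#[derive(Default, Debug)]
pub struct S3Builder {
    /// Maximum number of operations sent in one batch request.
    batch_max_operations: Option<usize>,
}

impl S3Builder {
    /// Set the maximum batch operations for this backend.
    pub fn batch_max_operations(&mut self, max: usize) -> &mut Self {
        self.batch_max_operations = Some(max);
        self
    }
}

fn main() {
    let mut builder = S3Builder::default();
    builder.batch_max_operations(500);
    assert_eq!(builder.batch_max_operations, Some(500));
}
```

Keeping the field an `Option<usize>` lets `build` distinguish "user never set it" from any explicit value.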

Xuanwo (Member) commented Jun 1, 2023

The next step is to load `batch_max_operations` in `build`, setting it to the default if the user has not set it.
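The resolution step described above can be sketched as follows; the struct, method, and constant names are assumptions for illustration, and 1000 is the default named in the PR description:

```rust
// Hypothetical sketch of resolving the value during `build`: use the
// user-supplied setting when present, otherwise fall back to 1000.
// Names here are illustrative, not OpenDAL's actual identifiers.
const DEFAULT_BATCH_MAX_OPERATIONS: usize = 1000;

#[derive(Default)]
struct S3Builder {
    batch_max_operations: Option<usize>,
}

impl S3Builder {
    /// Resolve the effective batch size while building the backend.
    fn effective_batch_max_operations(&self) -> usize {
        self.batch_max_operations
            .unwrap_or(DEFAULT_BATCH_MAX_OPERATIONS)
    }
}

fn main() {
    // Unset: falls back to the default of 1000.
    let unset = S3Builder::default();
    assert_eq!(unset.effective_batch_max_operations(), 1000);

    // Explicitly set: the user's value wins.
    let set = S3Builder { batch_max_operations: Some(250) };
    assert_eq!(set.effective_batch_max_operations(), 250);
}
```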

@@ -19,6 +19,7 @@ use std::collections::HashMap;
use std::fmt::Debug;
use std::fmt::Formatter;
use std::fmt::Write;
use std::string;
Contributor

I don't see why we need `use std::string` here.
