
Continuing #576 #619

Merged — 1 commit merged into LAION-AI:main on Jan 12, 2023
Conversation

ekurtulus (Contributor):

Please see.

Comment on lines +97 to +98
parser.add_argument("--poly_eps", type=float, default=1.0)
parser.add_argument("--seq2seq_model", action="store_true")
Collaborator:

I would add these to the config files

ekurtulus (Contributor, Author):

> I would add these to the config files

Yes, it normally was, but because of the file problems I had, it seems that part was not updated.
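The reviewer's suggestion above — keeping the new flags' defaults in the config files rather than hard-coded in `add_argument` — could be sketched as follows. This is a hypothetical illustration, not the repo's actual code: the `CONFIG_DEFAULTS` dict stands in for values loaded from the project's YAML config files.

```python
import argparse

# Defaults that would come from the config files; the dict name and its
# values are assumptions for this sketch (the values mirror the PR diff).
CONFIG_DEFAULTS = {
    "poly_eps": 1.0,
    "seq2seq_model": False,
}

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser()
    # No defaults here: they live in the config, not in the parser.
    parser.add_argument("--poly_eps", type=float)
    parser.add_argument("--seq2seq_model", action="store_true")
    # Config-supplied defaults are applied in one place.
    parser.set_defaults(**CONFIG_DEFAULTS)
    return parser

conf = build_parser().parse_args([])
print(conf.poly_eps, conf.seq2seq_model)  # 1.0 False
```

With this layout, command-line flags still override the config-sourced defaults, so nothing changes for callers who pass the flags explicitly.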

    metrics, preprocess_fn = [evaluate.load("squad_v2")], preprocess_qa
elif any(dataset in summarization_datasets for dataset in conf.datasets):
    metrics, preprocess_fn = [evaluate.load("rouge")], partial(
        preprocess_summarization, tokenizer, ignore_pad_token_for_loss=conf.ignore_pad_token_for_loss
    )
Collaborator:

Is ignore_pad_token_for_loss defined in the config?

ekurtulus (Contributor, Author):

> Is ignore_pad_token_for_loss defined in the config?

It is one of the default arguments of the Hugging Face Trainer, so I assumed it would be in the config.
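The dataset-based dispatch in the diff above can be sketched as a small self-contained function. Everything here is a simplified stand-in: the dataset sets, the preprocess functions, and the metric names replace the real `evaluate.load(...)` calls and preprocessing code from the PR; only the `partial(...)`-based dispatch pattern is the point.

```python
from functools import partial

# Stand-in dataset groups; the real lists live in the training code.
QA_DATASETS = {"squad_v2"}
SUMMARIZATION_DATASETS = {"cnn_dailymail", "xsum"}  # assumed examples

def preprocess_qa(example):
    return example

def preprocess_summarization(tokenizer, example, ignore_pad_token_for_loss=True):
    return example

def get_metrics(datasets, tokenizer, ignore_pad_token_for_loss=True):
    """Pick (metric names, preprocess_fn) based on the configured datasets."""
    if any(d in QA_DATASETS for d in datasets):
        return ["squad_v2"], preprocess_qa
    elif any(d in SUMMARIZATION_DATASETS for d in datasets):
        # Bind the tokenizer and the loss flag now; callers only pass examples.
        return ["rouge"], partial(
            preprocess_summarization,
            tokenizer,
            ignore_pad_token_for_loss=ignore_pad_token_for_loss,
        )
    return ["accuracy"], preprocess_qa

metrics, preprocess_fn = get_metrics(["xsum"], tokenizer=None)
print(metrics)  # ['rouge']
```

Binding `ignore_pad_token_for_loss` through `partial` is what makes the reviewer's question matter: the value must exist on the config object at dispatch time, not at call time.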

bitsandbytes.optim.GlobalOptimManager.get_instance().register_module_override(
    module, "weight", {"optim_bits": 32}
)
metrics, preprocess_fn = get_metrics(training_conf)
Collaborator:

I am not sure these are used.

theblackcat102 added a commit that referenced this pull request on Jan 12, 2023:
Changes on #619. Datasets is getting a bit dirty. I will do a refactoring this week.
@theblackcat102 theblackcat102 merged commit 5b77dd2 into LAION-AI:main Jan 12, 2023