
Upgrade to xgboost 1.0.1 #133

Closed
stevedodson opened this issue Feb 24, 2020 · 1 comment · Fixed by #200
Labels
good first issue (Good for newcomers) · help wanted (Solution is fleshed out and ready to be worked on) · topic:NLP (Issue or PR about NLP model support and eland_import_hub_model)
Milestone
v7.9.0

Comments

stevedodson (Contributor)

xgboost 1.0.1 (https://pypi.org/project/xgboost/#history) is now on PyPI and has slightly different semantics from 0.90.

@benwtrent can add more detail.

stevedodson added the help wanted (Solution is fleshed out and ready to be worked on) and good first issue (Good for newcomers) labels on Feb 24, 2020
benwtrent (Member)

The upgrade should be trivial. From what I can tell from the docs, the external API changes in 1.0.x should not require many changes on our side.

This is the only API change I can see: dmlc/xgboost#4541

We need to verify that the model transformers work well with that estimator.

Key file: https://github.com/elastic/eland/blob/master/eland/ml/_model_transformers.py#L228
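
A minimal sketch of the kind of manual check this implies, assuming xgboost 1.0.1 is installed. The synthetic data, tree parameters, and the focus on the objective string and JSON tree dump are illustrative assumptions, not a description of exactly what the transformer reads:

```python
import numpy as np
import xgboost as xgb

# Tiny synthetic regression problem (values are arbitrary, for illustration only).
rng = np.random.RandomState(0)
X = rng.rand(100, 4)
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=100)

# Train a small scikit-learn-style XGBoost estimator under 1.0.1.
model = xgb.XGBRegressor(n_estimators=10, max_depth=3)
model.fit(X, y)

# Inspect what a transformer would have to handle: the resolved objective
# (one place where 0.90 and 1.0.x defaults may differ) and the per-tree JSON dump.
booster = model.get_booster()
print(model.get_xgb_params()["objective"])
print(booster.get_dump(dump_format="json")[0][:200])
```

From there, the transformer linked above can be exercised against the same model to confirm nothing in the serialized output trips it up.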

sethmlarson added the topic:NLP (Issue or PR about NLP model support and eland_import_hub_model) label on Apr 3, 2020
sethmlarson added this to the v7.9.0 milestone on Apr 9, 2020