Uncertainties are displayed although result is below Detection Limit #1575

Merged: 1 commit merged from uncertainties into master on Apr 27, 2020

Conversation

xispa (Member) commented Apr 26, 2020

Description of the issue/feature this PR addresses

In Analyses listings, uncertainties are displayed even if the result is below (or above) the Detection Limit. Also, the units and the ± symbol are not displayed.

Current behavior before PR

(Screenshot: Captura de 2020-04-26 23-56-14)

Desired behavior after PR is merged

(Screenshot: Captura de 2020-04-26 23-56-36)
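
The screenshots do not survive as text, so here is a rough sketch of the display rule this PR aims for. The function name and signature are hypothetical illustrations, not actual senaite.core code:

```python
# Hypothetical sketch of the intended display rule (not senaite.core code):
# if the result is below/above the Detection Limit, show only the operand
# and the result; otherwise show "result ± uncertainty unit".
def format_result_cell(result, dl_operand, uncertainty, unit):
    """Return the text to display for a result in the Analyses listing."""
    if dl_operand in ("<", ">"):
        # Detection Limit case: do not display the uncertainty
        return "{}{} {}".format(dl_operand, result, unit).strip()
    if uncertainty:
        # Regular result: display the ± symbol, the uncertainty and the unit
        return "{} ± {} {}".format(result, uncertainty, unit).strip()
    return "{} {}".format(result, unit).strip()


# e.g. format_result_cell("0.05", "<", "0.01", "mg/L") -> "<0.05 mg/L"
#      format_result_cell("0.25", "",  "0.01", "mg/L") -> "0.25 ± 0.01 mg/L"
```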

--
I confirm I have tested this PR thoroughly and coded it according to PEP8
and Plone's Python styleguide standards.

ramonski (Contributor) left a comment


Is something missing in this PR?
I only see that before/after changes to the result were made.

xispa (Member, Author) commented Apr 27, 2020

@ramonski note I've removed the else

xispa (Member, Author) commented Apr 27, 2020

The function format_uncertainty already takes care of this case, so there is no need for an else to "force" the retrieval of the uncertainty that was set on analysis creation.
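
For illustration, a minimal sketch of that pattern, assuming a hypothetical listing-item dict and passing format_uncertainty in as a parameter; this is not the actual diff:

```python
# Minimal sketch with hypothetical names (item dict and helper signature are
# assumptions, not the real senaite.core code). format_uncertainty is assumed
# to return a falsy value when the result is below/above the Detection Limit.
def fill_uncertainty(item, analysis, result, format_uncertainty):
    # No else branch: when format_uncertainty returns nothing (e.g. for a
    # Detection Limit result), the Uncertainty cell is simply left empty,
    # instead of "forcing" the uncertainty set on analysis creation.
    formatted = format_uncertainty(analysis, result)
    if formatted:
        item["Uncertainty"] = formatted
    return item
```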

ramonski (Contributor) commented

Ah, missed that little else removal, thanks

ramonski merged commit 9a008d8 into master on Apr 27, 2020
ramonski deleted the uncertainties branch on April 27, 2020 at 08:13