
pip_import / requirement() does not work for the pypi azure package #273

Closed
jasonh-logic2020 opened this issue Jan 11, 2020 · 8 comments

@jasonh-logic2020

Hello.

The official azure package does not install correctly using the Bazel pip_import / requirement() rules. My project loads a dozen libraries via pip_import and py_library / deps.

They all import successfully except azure.

The PyPI azure readme says that it installs no code itself but rather pulls in a list of other packages. I assume Bazel does not cover this case.

I am using the latest revs of every Bazel package I use. I can provide my WORKSPACE and BUILD files if it will help, but this seems to be a pretty clear, isolated case.

Any workarounds or fixes would be welcome.

@thundergolfer

thundergolfer commented Jan 12, 2020

Can you please post the actual error you get?

At work we were trying to use the Snowflake connector package, which pulled in an Azure storage package that didn't work. We addressed the issue in this project: https://github.com/dillon-giacoppo/rules_python_external. We now use that alternative packaging solution at work.


You can see my original comment in another issue describing a problem with an Azure package: #93 (comment)

@jasonh-logic2020
Author

Thanks for asking.

I do not get an error until the import fails in my Python code; Bazel itself does not report any issues.

To explain more fully: my project requires Python 3.7 language features, so I had to get Bazel to install 3.7 (which I got working using code from a distroless example), but Bazel pip_import cannot currently target this install.

In order to get past that hurdle, I used rules_docker's py_layer to build a layer which only contains my external dependencies, azure among them. My py_image includes this layer, and it all seems to hold together. I can verify my version and the existence of the code for every library I list as a requirement() in my py_layer deps /except/ azure.

Also, since I can grep the layer manifest I can see that Bazel does not attempt to include any Python code from azure, just

  • METADATA
  • RECORD
  • WHEEL
  • top_level.txt

Other libraries have lots of code in the manifest.

Further, since I can untar the layer, I can see that all the other libraries actually exist, or at least their `__init__.py` does.

There is no mention of the azure package at all in the untarred layer.

@thundergolfer

> I do not get an error until the import in my python code fails.

Yes, that's consistent with the behaviour I saw. Since Python isn't compiled, the import errors only show up at runtime.

> Bazel pip_import cannot currently target this install.

I don't think this is true. pip_import has the python_interpreter attribute. If you make your python3 interpreter 3.7 then it will be used by pip_import. Can you explain this a bit more?
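For reference, here's a rough sketch of what I mean. This is illustrative, not your actual setup: the repo name `py_deps` is hypothetical, the load path may differ depending on your rules_python version, and `python3.7` is assumed to be on the PATH of the machine running Bazel.

```python
# WORKSPACE -- illustrative sketch only, adapt names/paths to your setup.
load("@rules_python//python:pip.bzl", "pip_import", "pip_repositories")

pip_repositories()

pip_import(
    name = "py_deps",                      # hypothetical repository name
    requirements = "//:requirements.txt",
    # Point pip_import at a specific interpreter instead of the default
    # `python` found on PATH.
    python_interpreter = "python3.7",
)

load("@py_deps//:requirements.bzl", "pip_install")
pip_install()
```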

Would you consider trying the rules_python_external project I linked above? Currently rules_python's pip_import does not work with Azure packages, so I think if you want your code to work you need to stop using it for packaging.

With rules_python_external we are able to successfully use the Azure packages we want. This is part of our requirements.txt:

azure-common==1.1.23 \
    --hash=sha256:53b1195b8f20943ccc0e71a17849258f7781bc6db1c72edc7d6c055f79bd54e3 \
    --hash=sha256:99ef36e74b6395329aada288764ce80504da16ecc8206cb9a72f55fb02e8b484 \
    # via azure-storage-blob, azure-storage-common, snowflake-connector-python
azure-storage-blob==2.1.0 \
    --hash=sha256:a8e91a51d4f62d11127c7fd8ba0077385c5b11022f0269f8a2a71b9fc36bef31 \
    --hash=sha256:b90323aad60f207f9f90a0c4cf94c10acc313c20b39403398dfba51f25f7b454 \
    # via snowflake-connector-python
azure-storage-common==2.1.0 \
    --hash=sha256:b01a491a18839b9d05a4fe3421458a0ddb5ab9443c14e487f40d16f9a1dc2fbe \
    --hash=sha256:ccedef5c67227bc4d6670ffd37cec18fb529a1b7c3a5e53e4096eb0cf23dc73f \
    # via azure-storage-blob

All of those work.
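For anyone landing here, the WORKSPACE side looks roughly like this. This is a sketch based on the rules_python_external README at the time, not a guaranteed current API; the repo name `pip_deps` is illustrative.

```python
# WORKSPACE -- sketch of rules_python_external usage; check the project
# README for the exact, current API.
load("@rules_python_external//:defs.bzl", "pip_install")

pip_install(
    name = "pip_deps",                     # hypothetical repository name
    requirements = "//:requirements.txt",  # the hash-pinned file shown above
)
```

In a BUILD file you would then depend on the packages via the generated `requirement()` helper, e.g. `deps = [requirement("azure-storage-blob")]` (again, consult the README for the exact load path).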

@jasonh-logic2020
Author

rules_python_external seems to work. I'm ironing out some non-import-related container issues and will report back when I'm absolutely sure everything is working as expected.

🥇 @thundergolfer

@thundergolfer

thundergolfer commented Jan 15, 2020

Good to hear. You can PM me in the bazelbuild.slack.com workspace if you have further questions or problems in this area. Also happy to hear any feedback on rules_python_external.

@jasonh-logic2020
Author

Following this problem step by step, I finally got images building with Python 3.7 and all of my dependencies, but I'm now getting errors when Python starts:

Fatal Python error: initfsencoding: Unable to get the locale encoding
ModuleNotFoundError: No module named 'encodings'

This did not happen when I was building stock py_image Python 3.5 images.

Any guesses?

@thundergolfer

I'd guess it's a missing environment variable that tells Python which locale encoding to use, something like "LC_CTYPE": "en_US.UTF-8". That's a complete guess though.
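One other thing worth checking (a guess on my part, not a confirmed diagnosis): `encodings` is a standard-library package, so that error often means the interpreter in the image can't locate its own stdlib at all, e.g. because `sys.prefix` points at a path that doesn't exist in the container. A quick sanity check you could run inside the image:

```python
# Sketch: sanity-check the interpreter inside the image. If Python can't
# import the stdlib 'encodings' package at startup, sys.prefix usually
# points at a location that isn't present in the image.
import sys
import encodings  # the stdlib package whose absence triggers the initfsencoding error

print("prefix:", sys.prefix)                        # where Python looks for its stdlib
print("fs encoding:", sys.getfilesystemencoding())  # non-empty with a working locale setup
```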

This problem is, I think, out of scope for this GitHub issue though. Feel free to ping me in the Bazel Slack.

@thundergolfer

rules_python adopted rules_python_external, so azure now works fine with the pip_install repository rule.
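For future readers, the adopted rule is used roughly like this (a sketch only; the repo name `pip_deps` is illustrative, and the load path may change between rules_python releases, so check the rules_python docs for the current API):

```python
# WORKSPACE -- sketch of the pip_install repository rule after the adoption.
load("@rules_python//python:pip.bzl", "pip_install")

pip_install(
    name = "pip_deps",                     # hypothetical repository name
    requirements = "//:requirements.txt",
)
```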
