//python:pip.bzl — rules_python 0.0.0 documentation

Rules for pip integration.

This contains a set of rules that are used to support inclusion of third-party dependencies via fully locked requirements.txt files. Some of the exported symbols should not be used; they are either undocumented here or marked as for internal use only.

If you are using Bazel 7 or above with bzlmod, you should only care about the compile_pip_requirements macro exposed in this file. The rest of the symbols are for legacy WORKSPACE setups.

compile_pip_requirements(name, srcs=None, src=None, extra_args=[], extra_deps=[], generate_hashes=True, py_binary='<function py_binary from //python:py_binary.bzl>', py_test='<function py_test from //python:py_test.bzl>', requirements_in=None, requirements_txt=None, requirements_darwin=None, requirements_linux=None, requirements_windows=None, visibility=['//visibility:private'], tags=None, **kwargs)

Generates targets for managing pip dependencies with pip-compile.

By default this rule generates a filegroup named “[name]” which can be included in the data of some other compile_pip_requirements rule that references these requirements (e.g. with -r ../other/requirements.txt).

It also generates two targets for running pip-compile: validate with bazel test [name]_test, and update with bazel run [name].update.

If you are using a version control system, the requirements.txt generated by this rule should be checked into it to ensure that all developers/users have the same dependency versions.
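For a typical setup, a BUILD file sketch might look like the following (the requirements.in and requirements_lock.txt file names are assumptions; adjust them to your project):

load("@rules_python//python:pip.bzl", "compile_pip_requirements")

# Sketch: compiles requirements.in into a checked-in requirements_lock.txt.
compile_pip_requirements(
    name = "requirements",
    src = "requirements.in",
    requirements_txt = "requirements_lock.txt",
)

With this in place, bazel run //:requirements.update would regenerate requirements_lock.txt and bazel test //:requirements_test would check that it is still in sync with requirements.in (target names assumed from the defaults described above).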

Args:

multi_pip_parse(name, default_version, python_versions, python_interpreter_target, requirements_lock, minor_mapping, **kwargs)

NOT INTENDED FOR DIRECT USE!

This is intended to be used by the multi_pip_parse implementation in the template of the multi_toolchain_aliases repository rule.

Args:

Returns:

The internal implementation of multi_pip_parse repository rule.

package_annotation(additive_build_content=None, copy_files={}, copy_executables={}, data=[], data_exclude_glob=[], srcs_exclude_glob=[])

Annotations to apply to the BUILD file content of packages generated from a pip_repository rule.

Args:

Returns:

str: A json encoded string of the provided content.
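As an illustration, annotations are typically built in a WORKSPACE (or .bzl) file and passed to pip_parse; the package name, added filegroup, and exclude glob below are hypothetical:

load("@rules_python//python:pip.bzl", "package_annotation", "pip_parse")

# Sketch: append extra BUILD content to the generated "foo" package and
# exclude its bundled tests from the default data glob. Names are illustrative.
ANNOTATIONS = {
    "foo": package_annotation(
        additive_build_content = """\
filegroup(
    name = "extra_metadata",
    srcs = glob(["**/*.json"], allow_empty = True),
)
""",
        data_exclude_glob = ["**/tests/**"],
    ),
}

pip_parse(
    name = "pypi",
    annotations = ANNOTATIONS,
    requirements_lock = "//:requirements.txt",
)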

repo rule pip_parse(name, repo_mapping, annotations={}, download_only=False, enable_implicit_namespace_pkgs=False, environment={}, envsubst=[], experimental_requirement_cycles={}, experimental_target_platforms=[], extra_hub_aliases={}, extra_pip_args=[], isolated=True, pip_data_exclude=[], python_interpreter='', python_interpreter_target=None, quiet=True, requirements_by_platform={}, requirements_darwin=None, requirements_linux=None, requirements_lock=None, requirements_windows=None, timeout=600, use_hub_alias_dependencies=False)

Accepts a locked/compiled requirements file and installs the dependencies listed within.

Those dependencies become available in a generated requirements.bzl file. You can instead check this requirements.bzl file into your repo; see the “vendoring” section below.

In your WORKSPACE file:

load("@rules_python//python:pip.bzl", "pip_parse")

pip_parse(
    name = "pypi",
    requirements_lock = ":requirements.txt",
)

load("@pypi//:requirements.bzl", "install_deps")

install_deps()

You can then reference installed dependencies from a BUILD file with the alias targets generated in the same repo. For example, for numpy and requests we would have the following:

py_library(
    name = "bar",
    ...
    deps = [
        "//my/other:dep",
        "@pypi//numpy",
        "@pypi//requests",
    ],
)

or

load("@pypi//:requirements.bzl", "requirement")

py_library(
    name = "bar",
    ...
    deps = [
        "//my/other:dep",
        requirement("numpy"),
        requirement("requests"),
    ],
)

In addition to the requirement macro, which is used to access the py_library target generated from a package’s wheel, the generated requirements.bzl file also contains functionality for exposing entry points as py_binary targets.

load("@pypi//:requirements.bzl", "entry_point")

alias( name = "pip-compile", actual = entry_point( pkg = "pip-tools", script = "pip-compile", ), )

Note that for packages whose name and script are the same, only the name of the package is needed when calling the entry_point macro.

load("@pip//:requirements.bzl", "entry_point")

alias( name = "flake8", actual = entry_point("flake8"), )

Vendoring the requirements.bzl file

In some cases you may not want to generate the requirements.bzl file as a repository rule while Bazel is fetching dependencies. For example, if you produce a reusable Bazel module such as a ruleset, you may want to include the requirements.bzl file rather than require your users to run the WORKSPACE setup to generate it. See https://github.com/bazelbuild/rules_python/issues/608

This is the same workflow as Gazelle, which creates go_repository rules with update-repos.

To do this, use the “write to source file” pattern documented in https://blog.aspect.dev/bazel-can-write-to-the-source-folder to put a copy of the generated requirements.bzl into your project. Then load the requirements.bzl file directly rather than from the generated repository. See the example in rules_python/examples/pip_parse_vendored.
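A minimal sketch of that pattern, assuming the generated @pypi repository exposes its requirements.bzl file as a target (as in the pip_parse_vendored example):

# Sketch: copy the generated file into bazel-bin so it can be compared with,
# or copied over, the vendored copy checked into the source tree.
genrule(
    name = "vendor_requirements",
    srcs = ["@pypi//:requirements.bzl"],
    outs = ["requirements.vendored.bzl"],
    cmd = "cp $< $@",
)

A diff_test (for example from bazel-skylib) can then assert that the checked-in copy matches the generated one.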

Attributes:

Envvars:

RULES_PYTHON_PIP_ISOLATED, RULES_PYTHON_REPO_DEBUG

pip_utils.normalize_name(name)

Normalize a PyPI package name and return a valid Bazel label.

Args:

Returns:

a normalized name as a string.
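For example, a macro sketch that maps a PyPI project name onto the alias targets of a hub repo named @pypi (the pypi_dep helper and the hub name are hypothetical):

load("@rules_python//python:pip.bzl", "pip_utils")

def pypi_dep(name):
    # Sketch: e.g. "Foo.Bar-baz" is expected to normalize to "foo_bar_baz"
    # (exact normalization behaviour assumed), yielding "@pypi//foo_bar_baz".
    return "@pypi//" + pip_utils.normalize_name(name)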

rule whl_filegroup(name, whl, pattern='', runfiles=False)

Extract files matching a regular expression from a wheel file.

An empty pattern will match all files.

Example usage:

load("@rules_cc//cc:cc_library.bzl", "cc_library") load("@rules_python//python:pip.bzl", "whl_filegroup")

whl_filegroup(
    name = "numpy_includes",
    pattern = "numpy/core/include/numpy",
    whl = "@pypi//numpy:whl",
)

cc_library(
    name = "numpy_headers",
    hdrs = [":numpy_includes"],
    includes = ["numpy_includes/numpy/core/include"],
    deps = ["@rules_python//python/cc:current_py_cc_headers"],
)

Attributes:

repo rule whl_library_alias(name, minor_mapping, repo_mapping, version_map, wheel_name, default_version='')

Attributes: