Scientific Python: guide, cookie, & sp-repo-review
Cookie
A copier/cookiecutter template for new Python projects based on the Scientific Python Developer Guide. What makes this different from other templates for Python packages?
- Lives with the Scientific-Python Development Guide: every decision is clearly documented, every tool is described, and everything is kept in sync.
- Nine different backends to choose from for building packages.
- Optional VCS versioning for most backends.
- Template generation tested in GitHub Actions using nox.
- Supports generation with copier, cookiecutter, and cruft.
- Supports GitHub Actions if targeting a github.com URL (the default), and adds experimental GitLab CI support otherwise.
- Includes several compiled backends using pybind11, with wheels produced for all platforms using cibuildwheel.
- Provides sp-repo-review to evaluate existing repos against the guidelines, with a WebAssembly version integrated with the guide. All checks are cross-linked.
- Follows PyPA best practices and is regularly updated. Recent additions:
- Uses uv for high performance CI and task running.
Be sure you have read the Scientific-Python Development Guide first, and possibly applied it to a project or two. This is not a minimal example or tutorial. It is a collection of useful tooling for starting a new project using cookiecutter, or for copying individual files into an existing project (by hand, from {{cookiecutter.project_name}}/).
During generation you can select from the following backends for your package:
- hatch: This uses hatchling, a modern builder with nice file inclusion, extendable via plugins, and good error messages. (Recommended for pure Python projects)
- flit: A modern, lightweight PEP 621 build system for pure Python projects. Replaces setuptools; no MANIFEST.in, setup.py, or setup.cfg. Low learning curve and easy to bootstrap into new distributions, but it can be difficult to get the right files included, and there is little dynamic metadata support.
- pdm: A modern, less opinionated all-in-one solution to pure Python projects supporting standards. Replaces setuptools, venv/pipenv, pip, wheel, and twine. Supports PEP 621.
- poetry: An all-in-one solution to pure Python projects. Replaces setuptools, venv/pipenv, pip, wheel, and twine. Higher learning curve, but is all-in-one. Makes some bad default assumptions for libraries.
- setuptools: The classic build system, but with the new standardized configuration.
- pybind11: This is setuptools but with a C++ extension written in pybind11 and wheels generated by cibuildwheel.
- scikit-build: A scikit-build (CMake) project also using pybind11, using scikit-build-core. (Recommended for C++ projects)
- meson-python: A Meson project also using pybind11. (No VCS versioning)
- maturin: A PEP 621 builder for Rust binary extensions. (No VCS versioning) (Recommended for Rust projects)
Currently, the best choice is probably hatch for pure Python projects, and scikit-build (such as the scikit-build-core + pybind11 choice) for binary projects.
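For a concrete sense of what a backend choice means in pyproject.toml, here is a rough sketch of a hatchling build-system table (standard hatchling configuration; the exact template output may differ):

```toml
[build-system]
requires = ["hatchling"]  # add "hatch-vcs" here if you opt in to VCS versioning (an assumption; check the generated file)
build-backend = "hatchling.build"
```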
To use (copier version)
Install copier and copier-templates-extensions. Using uv, that's:
uv tool install --with copier-templates-extensions copier
Now, run copier to generate your project:
copier copy gh:scientific-python/cookie <pkg> --trust --vcs-ref=HEAD
(<pkg> is the path to put the new project. --vcs-ref=HEAD gets the current version instead of the last tag, matching cookiecutter's behavior. Note you can combine these two lines into one with uvx, just remember to pass --with before the program name in that case.)
You will get a nicer CLI experience with answer validation. You will also get a .copier-answers.yml file, which will allow you to perform updates in the future.
Note: Add --vcs-ref=HEAD to get the latest version instead of the last tagged version; HEAD always passes tests (and is what cookiecutter uses).
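Because copier records your answers in .copier-answers.yml, you can later bring the project up to date with newer template versions. A minimal sketch, run from inside the generated project (assumes a clean git working tree):

```bash
# Re-apply the template using the answers saved in .copier-answers.yml
copier update --trust --vcs-ref=HEAD
```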
To use (cookiecutter version)
Install cookiecutter, ideally with brew install cookiecutter if you use brew, otherwise with uv tool install cookiecutter (or prepend uvx to the command below, and skip installation). Then run:
cookiecutter gh:scientific-python/cookie
If you are using cookiecutter 2.2.3+, you will get nice descriptions for the options, just like with copier!
To use (cruft version)
You can also use cruft, which adds the ability to update cookiecutter projects. Install with uv tool install cruft (or prepend uvx to the command below, and skip installation). Then run:
cruft create https://github.com/scientific-python/cookie
Post generation
Check the key setup files, pyproject.toml, and possibly setup.cfg and setup.py (pybind11 example). Update README.md. Also update and add docs to docs/.
There are a few example dependencies and a minimum Python version of 3.9; feel free to change these to whatever you actually need or want. There is also a basic backports structure with a small typing example.
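As a rough sketch, the fields you will most likely touch in pyproject.toml look like this (the name and dependency shown are placeholders, not what the template generates):

```toml
[project]
name = "my-package"          # placeholder; use your own package name
requires-python = ">=3.9"    # raise this if you need a newer minimum Python
dependencies = [
  "numpy",                   # replace the example dependencies with what you actually need
]
```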
Contained components:
- GitHub Actions runs testing for the generation itself
- Uses nox so cookie development can be checked locally
- Uses uv for high performance CI
- GitHub Actions deploy
- C++ backends include cibuildwheel for wheel builds
- Uses PyPI trusted publisher deployment
- Dependabot keeps actions up to date periodically, through useful pull requests
- Formatting handled by pre-commit
- No reason not to be strict on a new project; remove what you don't want.
- Includes MyPy - static typing
- Includes Ruff - standard formatting, linting and autofixes
  - Replaces Flake8, isort, pyupgrade, yesqa, pycln, and dozens of plugins
- Includes spell checking
- A pylint nox target can be used to run pylint, which integrates GHA annotations
- A ReadTheDocs-ready Sphinx docs folder and docs dependency-group
- A test folder and pytest test dependency-group
- A dev group for uv run integration
- A noxfile is included with a few common targets
For developers:
You can test locally with nox:
```bash
# See all commands
nox -l

# Run a specific check
nox -s "lint(scikit-build)"

# Run a noxfile command on the project noxfile
nox -s "nox(hatch)" -- docs
```
If you don't have nox locally, you can use uv, e.g. uvx nox, instead.
Other similar projects
Hypermodern-Python is another project worth checking out with many similarities, like great documentation for each feature and many of the same tools used. It has a slightly different set of features, and has a stronger focus on GitHub Actions; most of our guide could be adapted to a different CI system fairly easily if you don't want to use GHA. It also forces the use of Poetry (instead of offering a backend selection), and doesn't support compiled projects. It currently dumps all development dependencies into a shared environment, causing long solve times and a high chance of conflicts. It also does not use pre-commit the way it was intended to be used, and it has quite a bit of custom code.
History
A lot of the guide, cookiecutter, and repo-review started out as part of Scikit-HEP. These projects were merged, generalized, and combined with the NSLS-II guide during the 2023 Scientific-Python Developers Summit.
sp-repo-review
sp-repo-review provides checks based on the Scientific-Python Development Guide at scientific-python/cookie for repo-review.
This tool can check the style of a repository. Use it like this, pointing it at the repository you want to check:
uvx sp-repo-review[cli] <path to repository>
This will produce a list of results - green checkmarks mean the rule is followed, red x's mean it is not. A yellow warning sign means the check was skipped because a previous required check failed. Some checks will fail; that's okay - the goal is to bring all possible issues to your attention, not to force compliance with arbitrary checks. Eventually there might be a way to mark checks as ignored.
For example, GH101 expects all your action files to have a nice name: field. If you are happy with the file-based names you see in CI, feel free to ignore this check (you can specify ignored checks in pyproject.toml or by passing arguments to repo-review; see the repo-review docs).
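For instance, a minimal sketch of ignoring that check in pyproject.toml, following the repo-review configuration table (the check code is just an example):

```toml
[tool.repo-review]
ignore = ["GH101"]  # skip the "has nice names" check
```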
All checks are mentioned at least in some way in the Scientific-Python Development Guide. You should read that first; if you are not attempting to follow the guidelines, some of the checks might not work. For example, the guidelines specify that pytest configuration be placed in pyproject.toml. If you place it somewhere else, all of the pytest checks will be skipped.
This was originally developed for Scikit-HEP before moving to Scientific Python.
Extras
- cli: Dependencies to run the CLI (not needed for programmatic access, like on WebAssembly)
- pyproject: Includes validate-pyproject with schema store support.
- all: All extras.
Other ways to use
You can also use GitHub Actions:
- uses: scientific-python/cookie@
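For context, a minimal workflow sketch using the action (the job layout and the checkout step are assumptions, not taken from this repo; pin the action to a released tag in place of the placeholder):

```yaml
name: repo-review
on: [push, pull_request]
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4              # assumed: the action checks the working tree
      - uses: scientific-python/cookie@<tag>   # replace <tag> with a released version
```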
Or pre-commit:
```yaml
- repo: https://github.com/scientific-python/cookie
  rev:
  hooks:
    - id: sp-repo-review
```
If you use additional_dependencies to add more plugins, like validate-pyproject, you should also include "repo-review[cli]" to ensure the CLI requirements are included.
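A sketch of what that might look like (the rev placeholder and the validate-pyproject entry are illustrative):

```yaml
- repo: https://github.com/scientific-python/cookie
  rev:   # pin to a released version
  hooks:
    - id: sp-repo-review
      additional_dependencies: ["repo-review[cli]", "validate-pyproject"]
```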
List of checks
General
- PY001: Has a pyproject.toml
- PY002: Has a README.(md|rst) file
- PY003: Has a LICENSE* file
- PY004: Has docs folder
- PY005: Has tests folder
- PY006: Has pre-commit config
- PY007: Supports an easy task runner (nox, tox, pixi, etc.)
PyProject
- PP002: Has a proper build-system table
- PP003: Does not list wheel as a build-dep
- PP004: Does not upper cap Python requires
- PP005: Using SPDX project.license should not use deprecated trove classifiers
- PP301: Has pytest in pyproject
- PP302: Sets a minimum pytest to at least 6
- PP303: Sets the test paths
- PP304: Sets the log level in pytest
- PP305: Specifies xfail_strict
- PP306: Specifies strict config
- PP307: Specifies strict markers
- PP308: Specifies useful pytest summary
- PP309: Filter warnings specified
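For reference, a pyproject.toml sketch that would satisfy the pytest-related checks above (the values follow the guide's common recommendations and are not mandatory):

```toml
[tool.pytest.ini_options]
minversion = "6.0"                                        # PP302
addopts = ["-ra", "--strict-markers", "--strict-config"]  # PP306, PP307, PP308
xfail_strict = true                                       # PP305
log_cli_level = "INFO"                                    # PP304
filterwarnings = ["error"]                                # PP309
testpaths = ["tests"]                                     # PP303
```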
Documentation
- RTD100: Uses ReadTheDocs (pyproject config)
- RTD101: You have to set the RTD version number to 2
- RTD102: You have to set the RTD build image
- RTD103: You have to set the RTD python version
- RTD104: You have to specify a build configuration now for readthedocs.
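A minimal .readthedocs.yaml sketch along these lines (the OS and Python versions shown are just examples):

```yaml
version: 2
build:
  os: ubuntu-22.04              # build image / OS
  tools:
    python: "3.12"              # Python version for the docs build
sphinx:
  configuration: docs/conf.py   # explicit build configuration
```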
GitHub Actions
- GH100: Has GitHub Actions config
- GH101: Has nice names
- GH102: Auto-cancel on repeated PRs
- GH103: At least one workflow with manual dispatch trigger
- GH104: Use unique names for upload-artifact
- GH200: Maintained by Dependabot
- GH210: Maintains the GitHub action versions with Dependabot
- GH211: Do not pin core actions as major versions
- GH212: Require GHA update grouping
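A workflow fragment sketch touching on GH101-GH103 (the workflow name and concurrency group key are illustrative):

```yaml
name: CI                       # GH101: a human-readable workflow name
on:
  push:
  pull_request:
  workflow_dispatch:           # GH103: manual dispatch trigger
concurrency:                   # GH102: auto-cancel superseded runs on the same ref
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true
```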
MyPy
- MY100: Uses MyPy (pyproject config)
- MY101: MyPy strict mode
- MY102: MyPy show_error_codes deprecated
- MY103: MyPy warn unreachable
- MY104: MyPy enables ignore-without-code
- MY105: MyPy enables redundant-expr
- MY106: MyPy enables truthy-bool
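A [tool.mypy] sketch along the lines these checks expect (mirrors the guide's strict recommendations):

```toml
[tool.mypy]
strict = true               # MY101
warn_unreachable = true     # MY103
enable_error_code = [
  "ignore-without-code",    # MY104
  "redundant-expr",         # MY105
  "truthy-bool",            # MY106
]
```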
Pre-commit
- PC100: Has pre-commit-hooks
- PC110: Uses black or ruff-format
- PC111: Uses blacken-docs
- PC140: Uses a type checker
- PC160: Uses a spell checker
- PC170: Uses PyGrep hooks (only needed if rST present)
- PC180: Uses a markdown formatter
- PC190: Uses Ruff
- PC191: Ruff show fixes if fixes enabled
- PC901: Custom pre-commit CI message
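As an example, a fragment that would cover PC110, PC190, and PC191 using the upstream ruff-pre-commit mirror (pin rev yourself):

```yaml
- repo: https://github.com/astral-sh/ruff-pre-commit
  rev:   # pin to a released version
  hooks:
    - id: ruff
      args: ["--fix", "--show-fixes"]   # PC191: show fixes when fixing is enabled
    - id: ruff-format                   # PC110: formatting via ruff-format
```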
Ruff
- RF001: Has Ruff config
- RF002: Target version must be set
- RF003: src directory doesn't need to be specified anymore (0.6+)
- RF101: Bugbear must be selected
- RF102: isort must be selected
- RF103: pyupgrade must be selected
- RF201: Avoid using deprecated config settings
- RF202: Use (new) lint config section
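A matching [tool.ruff] sketch (the selected rule sets cover RF101-RF103; placing them under the lint table covers RF202):

```toml
[tool.ruff.lint]
extend-select = [
  "B",   # flake8-bugbear (RF101)
  "I",   # isort (RF102)
  "UP",  # pyupgrade (RF103)
]
```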
Setuptools Config
- SCFG001: Avoid deprecated setup.cfg names