To develop, we suggest using Python virtual environments together with pip, with development tasks steered by nox.
Once the virtual environment is activated and you have SSH keys set up with GitHub, clone the
repository from GitHub
git clone git@github.com:scikit-hep/pyhf
and install all necessary packages for development
python -m pip install --upgrade --editable '.[complete]'
Then set up the Git pre-commit hooks by running

pre-commit install

inside of the virtual environment. pre-commit.ci keeps the pre-commit hooks updated over time, so when the hooks are updated, pre-commit will automatically pick up the new versions the next time you run it locally.
It is then suggested that you use nox to run all development operations, which are defined in “sessions” in noxfile.py. To list all of the available sessions run

nox --list
Linting and code formatting is handled by pre-commit. To run the linting either run

pre-commit run --all-files

or

nox --session lint
A function-scoped fixture called
datadir exists for a given test module
which will automatically copy files from the associated test module's data
directory into a temporary directory for the given test execution. That is, for
example, if a test was defined in
test_schema.py, then data files located in
test_schema/ will be copied to a temporary directory whose path is made
available by the
datadir fixture. Therefore, one can do:
def test_patchset(datadir):
    data_file = open(datadir.join("test.txt"), encoding="utf-8")
    ...
which will load the copy of
test.txt in the temporary directory. This also
works for parameterizations, as this will effectively sandbox the file modifications.
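For illustration, the copying behavior such a fixture provides can be sketched with just the standard library (this is a conceptual sketch, not pyhf's actual fixture implementation, and the make_datadir name is hypothetical):

```python
import shutil
import tempfile
from pathlib import Path


def make_datadir(test_module: Path, tmp_root: Path) -> Path:
    """Copy the data directory named after a test module into a temp dir."""
    # test_schema.py has an associated data directory test_schema/
    data_dir = test_module.parent / test_module.stem
    target = tmp_root / test_module.stem
    shutil.copytree(data_dir, target)
    return target


# Demo: fake a test module with an adjacent data directory, then sandbox it.
repo = Path(tempfile.mkdtemp())
(repo / "test_schema.py").touch()
(repo / "test_schema").mkdir()
(repo / "test_schema" / "test.txt").write_text("hello")

sandbox = make_datadir(repo / "test_schema.py", Path(tempfile.mkdtemp()))
copied = (sandbox / "test.txt").read_text()
```

Because the test only ever sees the copy under the temporary directory, a test can freely modify its data files without affecting the originals in the repository, which is what makes parameterized runs safe.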
Running with pytest¶
To run the test suite in full, from the top level of the repository run

nox --session tests
More practically, for most local testing you will not want to test the benchmarks, contrib module, or notebooks, so to test just the core codebase a developer can run
nox --session tests --python 3.10
Contrib module matplotlib image tests¶
To run the visualization tests for the
contrib module with the
pytest-mpl pytest plugin run
nox --session tests --python 3.10 -- contrib
If the image files need to be regenerated, run the tests with the
--mpl-generate-path=tests/contrib/baseline option or just run
nox --session regenerate
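Conceptually, image comparison tests work by comparing a freshly generated image against a stored baseline within some tolerance. A minimal sketch of that idea (the RMS metric and tolerance value here are illustrative, not pytest-mpl's exact implementation):

```python
import math


def images_match(baseline, result, tolerance=2.0):
    """Return True if the RMS pixel difference is within tolerance."""
    if len(baseline) != len(result):
        return False
    rms = math.sqrt(
        sum((a - b) ** 2 for a, b in zip(baseline, result)) / len(baseline)
    )
    return rms <= tolerance


# Identical pixel values match; a visibly different image does not.
same = images_match([10, 20, 30], [10, 20, 30])
different = images_match([10, 20, 30], [200, 20, 30])
```

Regenerating the baseline images, as the option above does, simply replaces the stored reference pixels that future runs compare against.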
pyhf’s configuration of
pytest will automatically run
doctest on all of the pyhf
modules when the full test suite is run.
To run
doctest on an individual module or file, just run
pytest on its path (for example, on the path of the module implementing the JAX backend).
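For illustration, this is the mechanism doctest relies on: examples embedded in docstrings are executed and their output is compared against what the docstring claims (the scale function here is a hypothetical example, not part of pyhf):

```python
import doctest


def scale(values, factor):
    """Multiply each value by factor.

    >>> scale([1, 2, 3], 2)
    [2, 4, 6]
    """
    return [v * factor for v in values]


# Collect and run the docstring examples, as pytest's doctest support does.
finder = doctest.DocTestFinder()
runner = doctest.DocTestRunner()
for test in finder.find(scale, "scale"):
    runner.run(test)
```

A mismatch between the docstring's claimed output and the actual output counts as a test failure, which is why stale docstring examples get caught by the full test suite.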
To measure coverage for the codebase, run the tests under coverage

coverage run --module pytest

or pass
coverage as a positional argument to the nox tests session

nox --session tests --python 3.10 -- coverage
To generate a coverage report after running the tests under coverage,
or to also generate XML and HTML versions of the report, run the coverage session

nox --session coverage
To build the docs run
nox --session docs
To view the built docs locally, open the resulting built HTML pages
in a web browser or run
nox --session docs -- serve
As part of the release process a checklist must be completed to make sure steps aren’t missed. There is a GitHub Issue template for this that the maintainer in charge of the release should step through and update if needed.
The push of a tag to the repository will trigger a build of a sdist and wheel, and then the deployment of them to TestPyPI.
pyhf tests packaging and distribution by publishing to TestPyPI in advance of releases to PyPI.
Installation of the latest test release from TestPyPI can be tested
by first installing
pyhf normally, to ensure all dependencies are installed
from PyPI, and then upgrading
pyhf to a test release from TestPyPI
python -m pip install pyhf
python -m pip install --upgrade --extra-index-url https://test.pypi.org/simple/ --pre pyhf
This adds TestPyPI as an additional package index to search when installing.
PyPI will still be the default package index
pip will attempt to install
from for all dependencies, but if a package has a more recent release on
TestPyPI then the package will be installed from TestPyPI instead.
Note that dev releases are considered pre-releases, so
0.1.2 is a “newer” release than any 0.1.2.devN pre-release of it.
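The relevant slice of PEP 440 ordering can be sketched with a simplified sort key (this toy handles only X.Y.Z and X.Y.Z.devN forms; pip's resolver implements the full specification):

```python
def version_key(version: str) -> tuple:
    """Simplified PEP 440 sort key for 'X.Y.Z' and 'X.Y.Z.devN' versions only."""
    release, _, dev = version.partition(".dev")
    nums = tuple(int(part) for part in release.split("."))
    # A dev release orders before the final release it precedes.
    return nums + ((0, int(dev)) if dev else (1, 0))


# 0.1.2 final orders after its own dev pre-releases, but a dev release of
# the next version orders after 0.1.2.
final_newer = version_key("0.1.2") > version_key("0.1.2.dev3")
next_dev_newer = version_key("0.1.3.dev1") > version_key("0.1.2")
```

This is why a dev release of the next version on TestPyPI is picked up by the --pre upgrade above even though the stable release on PyPI is already installed.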
Once the TestPyPI deployment has been examined, installed, and tested locally by the maintainers, final deployment to PyPI can be done by creating a GitHub Release:
Select the release tag that was just pushed, and set the release title to be the tag.
Use the “Auto-generate release notes” button to generate a skeleton of the release notes and then augment them with the pre-prepared release notes the release maintainer has written.
Select “This is a pre-release” if the release is a release candidate.
Select “Create a discussion for this release” if the release is a stable release.
Select “Publish release”.
Once the release has been published to GitHub, the publishing workflow will build a sdist and wheel, and then deploy them to PyPI.
Context Files and Archive Metadata¶
The codemeta.json file has the version number
automatically updated through
tbump, though its additional metadata
should be checked periodically by the dev team (probably every release).
The codemeta.json file can be generated automatically from a PyPI install of pyhf

codemetapy --no-extras pyhf > codemeta.json

though the author metadata will still need to be checked and revised by hand.
The .zenodo.json file is currently generated by hand, so it is worth using the
codemeta.json file as a guide to edit it.