import pyproject-rpm-macros-1.16.2-1.el10

cs10 imports/cs10/pyproject-rpm-macros-1.16.2-1.el10
MSVSphere Packaging Team 1 month ago
parent d0e8e59743
commit 2ac580592b
Signed by: sys_gitsync
GPG Key ID: B2B0B9F29E528FE8

@ -79,8 +79,21 @@ using the `-R` flag:
%generate_buildrequires
%pyproject_buildrequires -R
Alternatively, the runtime dependencies can be obtained by building the wheel and reading the metadata from the built wheel.
This can be enabled by using the `-w` flag.
Alternatively, if the project specifies its dependencies in the pyproject.toml
`[project]` table (as defined in [PEP 621](https://www.python.org/dev/peps/pep-0621/)),
the runtime dependencies can be obtained by reading that metadata.
This can be enabled by using the `-p` flag.
This flag supports reading both the runtime dependencies and the selected extras
(see the `-x` flag described below).
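For example, a package that declares its dependencies statically in the `[project]` table could generate its runtime deps, plus a hypothetical `testing` extra, with:
%generate_buildrequires
%pyproject_buildrequires -p -x testing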
Please note that not all build backends which use pyproject.toml support the
`[project]` table scheme.
For example, poetry-core (at least in 1.9.0) defines package metadata in the
custom `[tool.poetry]` table which is not supported by the `%pyproject_buildrequires` macro.
Finally, the runtime dependencies can be obtained by building the wheel and reading the metadata from the built wheel.
This can be enabled with the `-w` flag and cannot be combined with `-p`.
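For example, to fall back to building the wheel during `%generate_buildrequires`:
%generate_buildrequires
%pyproject_buildrequires -w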
Support for building wheels with `%pyproject_buildrequires -w` is **provisional** and the behavior might change.
Please subscribe to Fedora's [python-devel list] if you use the option.
@ -111,6 +124,14 @@ For example, if upstream suggests installing test dependencies with
%generate_buildrequires
%pyproject_buildrequires -x testing
For projects that specify test requirements using [PEP 735] dependency groups,
these can be added using the `-g` flag.
Multiple groups can be supplied by repeating the flag or as a comma separated list.
For example, if upstream uses a dependency group called `tests`, the test deps would be generated by:
%generate_buildrequires
%pyproject_buildrequires -g tests
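Such a group is declared in the package's pyproject.toml; a minimal, purely illustrative example:
[dependency-groups]
tests = ["pytest", "pytest-cov"]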
For projects that specify test requirements in their [tox] configuration,
these can be added using the `-t` flag (default tox environment)
or the `-e` flag followed by the tox environment.
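For example, to generate the dependencies of the default tox environment:
%generate_buildrequires
%pyproject_buildrequires -t
To target a specific environment instead, pass its name via `-e`, e.g. `%pyproject_buildrequires -e %{default_toxenv}-integration` (the environment name here is only illustrative).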
@ -134,16 +155,21 @@ The `-e` option redefines `%{toxenv}` for further reuse.
Use `%{default_toxenv}` to get the default value.
The `-t`/`-e` option uses [tox-current-env]'s `--print-deps-to-file` behind the scenes.
It generates dependencies listed directly in `deps`,
dependencies defined through `extras`,
and on tox 4.22+ also dependencies defined through `dependency_groups`.
If your package specifies some tox plugins in `tox.requires`,
such plugins will be BuildRequired as well.
Not all plugins are guaranteed to play well with [tox-current-env];
in the worst case, patch/sed the requirement out of the tox configuration.
Note that neither `-x` or `-t` can be used with `-R`,
Note that neither `-x` nor `-t` can be used with `-R` or `-N`,
because runtime dependencies are always required for testing.
You can only use those options if the build backend supports the [prepare-metadata-for-build-wheel hook],
or together with `-w`.
or together with `-p` or `-w`.
However, using `-g` with `-R` or `-N` is supported because dependency groups don't need to be used for testing
and can be obtained by reading `pyproject.toml` only.
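For instance, a package that files its runtime dependencies manually but runs its test suite from a hypothetical `tests` dependency group could use:
%generate_buildrequires
%pyproject_buildrequires -R -g tests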
[tox]: https://tox.readthedocs.io/
[tox-current-env]: https://github.com/fedora-python/tox-current-env/
@ -158,7 +184,7 @@ Dependencies will be loaded from them:
For packages not using a build system, you can use `-N` to entirely skip the automatic
generation of requirements and install requirements only from manually specified files.
`-N` option implies `-R` and cannot be used in combination with other options mentioned above
(`-w`, `-e`, `-t`, `-x`).
(`-w`, `-e`, `-t`, `-x`, `-p`).
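A minimal sketch, assuming the dependencies are tracked in a hypothetical `requirements.txt`:
%generate_buildrequires
%pyproject_buildrequires -N requirements.txt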
The `%pyproject_buildrequires` macro also accepts the `-r` flag for backward compatibility;
it means "include runtime dependencies" which has been the default since version 0-53.
@ -288,7 +314,7 @@ However, in Fedora packages, always list executables explicitly to avoid uninten
`%pyproject_save_files` can automatically mark license files with `%license` macro
and language (`*.mo`) files with `%lang` macro and appropriate language code.
Only license files declared via [PEP 639] `License-File` field are detected.
[PEP 639] is still a draft and can be changed in the future.
[PEP 639] is still provisional and can be changed in the future.
It is possible to use the `-l` flag to declare that a missing license should
terminate the build or `-L` (the default) to explicitly disable this check.
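For example, to fail the build when no license file is detected (the module glob is only illustrative):
%pyproject_save_files -l mymodule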
Packagers are encouraged to use the `-l` flag when the `%license` file is not manually listed in `%files`
@ -507,6 +533,7 @@ so be prepared for problems.
[PEP 517]: https://www.python.org/dev/peps/pep-0517/
[PEP 518]: https://www.python.org/dev/peps/pep-0518/
[PEP 639]: https://www.python.org/dev/peps/pep-0639/
[PEP 735]: https://www.python.org/dev/peps/pep-0735/
[pip's documentation]: https://pip.pypa.io/en/stable/cli/pip_install/#vcs-support

@ -4,7 +4,7 @@
# this macro will cause the package with the real macro to be installed.
# When macros.pyproject is installed, it overrides this macro.
# Note: This needs to maintain the same set of options as the real macro.
%pyproject_buildrequires(rRxtNwe:C:) echo 'pyproject-rpm-macros' && exit 0
%pyproject_buildrequires(rRxtNwpe:g:C:) echo 'pyproject-rpm-macros' && exit 0
# Declarative buildsystem, requires RPM 4.20+ to work

@ -25,6 +25,11 @@
%_pyproject_record %{_builddir}/%{_pyproject_files_prefix}-pyproject-record
%_pyproject_buildrequires %{_builddir}/%{_pyproject_files_prefix}-pyproject-buildrequires
# Internal macro, takes %%set_build_flags and strips all the exports
# TODO: Make such a list an actual source of %%set_build_flags (in redhat-rpm-config)
# Cannot use %%gsub directly to preserve EL 9 compatibility
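# Illustration (values hypothetical): an expansion such as
#   CFLAGS="-O2" ; export CFLAGS ; LDFLAGS="-Wl,-z,now" ; export LDFLAGS ;
# is reduced to plain assignments usable as a command prefix:
#   CFLAGS="-O2" LDFLAGS="-Wl,-z,now"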
%_pyproject_build_flags %{lua:local exports = rpm.expand('%{set_build_flags} ;'); print((exports:gsub('%s*;+%s+export%s+[%u_]+%s*;+%s*', ' ')))}
# Avoid leaking %%{_pyproject_builddir} to pytest collection
# https://bugzilla.redhat.com/show_bug.cgi?id=1935212
# The value is read and used by the %%pytest and %%tox macros:
@ -33,7 +38,8 @@
%pyproject_wheel(C:) %{expand:\\\
%_set_pytest_addopts
mkdir -p "%{_pyproject_builddir}"
CFLAGS="${CFLAGS:-${RPM_OPT_FLAGS}}" LDFLAGS="${LDFLAGS:-${RPM_LD_FLAGS}}" TMPDIR="%{_pyproject_builddir}" \\\
%{_pyproject_build_flags} \\\
TMPDIR="%{_pyproject_builddir}" \\\
%{__python3} -Bs %{_rpmconfigdir}/redhat/pyproject_wheel.py %{?**} %{_pyproject_wheeldir}
}
@ -152,13 +158,17 @@ fi
%default_toxenv py%{python3_version_nodots}
%toxenv %{default_toxenv}
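# Internal macro: BuildRequire tomli only when the interpreter cannot use the stdlib tomllib (Python < 3.11)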
%_pyproject_tomlidep %["%{python3_pkgversion}" == "3"\
? "echo '(python%{python3_pkgversion}dist(tomli) if python%{python3_pkgversion}-devel < 3.11)'"\
: "%[v"%{python3_pkgversion}" < v"3.11"\
? "echo 'python%{python3_pkgversion}dist(tomli)'"\
: "true # will use tomllib, echo nothing"\
]"\
]
# Note: Keep the options in sync with this macro from macros.aaa-pyproject-srpm
%pyproject_buildrequires(rRxtNwe:C:) %{expand:\\\
%pyproject_buildrequires(rRxtNwpe:g:C:) %{expand:\\\
%_set_pytest_addopts
# The _auto_set_build_flags feature does not do this in %%generate_buildrequires section,
# but we want to get an environment consistent with %%build:
%{?_auto_set_build_flags:%set_build_flags}
# The default flags expect the package note file to exist
# see https://bugzilla.redhat.com/show_bug.cgi?id=2097535
%{?_package_note_flags:%_generate_package_note_file}
@ -168,6 +178,7 @@ fi
%{-e:%{error:The -R and -e options are mutually exclusive}}
%{-t:%{error:The -R and -t options are mutually exclusive}}
%{-w:%{error:The -R and -w options are mutually exclusive}}
%{-p:%{error:The -R and -p options are mutually exclusive}}
}
%{-N:
%{-r:%{error:The -N and -r options are mutually exclusive}}
@ -175,25 +186,25 @@ fi
%{-e:%{error:The -N and -e options are mutually exclusive}}
%{-t:%{error:The -N and -t options are mutually exclusive}}
%{-w:%{error:The -N and -w options are mutually exclusive}}
%{-p:%{error:The -N and -p options are mutually exclusive}}
%{-C:%{error:The -N and -C options are mutually exclusive}}
%{-g:if [ -f pyproject.toml ]; then
%_pyproject_tomlidep
fi}
}
%{-w:
%{-p:%{error:The -w and -p options are mutually exclusive}}
}
%{-e:%{expand:%global toxenv %(%{__python3} -s %{_rpmconfigdir}/redhat/pyproject_construct_toxenv.py %{?**})}}
echo 'pyproject-rpm-macros' # first stdout line matches the implementation in macros.aaa-pyproject-srpm
echo 'python%{python3_pkgversion}-devel'
echo 'python%{python3_pkgversion}dist(pip) >= 19'
echo 'python%{python3_pkgversion}dist(packaging)'
%{!-N:if [ -f pyproject.toml ]; then
%["%{python3_pkgversion}" == "3"
? "echo '(python%{python3_pkgversion}dist(tomli) if python%{python3_pkgversion}-devel < 3.11)'"
: "%[v"%{python3_pkgversion}" < v"3.11"
? "echo 'python%{python3_pkgversion}dist(tomli)'"
: "true # will use tomllib, echo nothing"
]"
]
%{!-N:echo 'python%{python3_pkgversion}dist(pip) >= 19'
if [ -f pyproject.toml ]; then
%_pyproject_tomlidep
elif [ -f setup.py ]; then
# Note: If the default requirements change, also change them in the script!
echo 'python%{python3_pkgversion}dist(setuptools) >= 40.8'
echo 'python%{python3_pkgversion}dist(wheel)'
else
echo 'ERROR: Neither pyproject.toml nor setup.py found, consider using %%%%pyproject_buildrequires -N <requirements-file> if this is not a Python package.' >&2
exit 1
@ -203,7 +214,8 @@ rm -rfv *.dist-info/ >&2
if [ -f %{__python3} ]; then
mkdir -p "%{_pyproject_builddir}"
echo -n > %{_pyproject_buildrequires}
CFLAGS="${CFLAGS:-${RPM_OPT_FLAGS}}" LDFLAGS="${LDFLAGS:-${RPM_LD_FLAGS}}" TMPDIR="%{_pyproject_builddir}" \\\
%{_pyproject_build_flags} \\\
TMPDIR="%{_pyproject_builddir}" \\\
RPM_TOXENV="%{toxenv}" HOSTNAME="rpmbuild" %{__python3} -Bs %{_rpmconfigdir}/redhat/pyproject_buildrequires.py %{?!_python_no_extras_requires:--generate-extras} --python3_pkgversion %{python3_pkgversion} --wheeldir %{_pyproject_wheeldir} --output %{_pyproject_buildrequires} %{?**} >&2
cat %{_pyproject_buildrequires}
fi

@ -10,6 +10,7 @@ import subprocess
import re
import tempfile
import email.parser
import functools
import pathlib
import zipfile
@ -34,6 +35,7 @@ def print_err(*args, **kwargs):
try:
from packaging.markers import Marker
from packaging.requirements import Requirement, InvalidRequirement
from packaging.utils import canonicalize_name
except ImportError as e:
@ -99,18 +101,23 @@ class Requirements:
return True
return False
def add(self, requirement_str, *, package_name=None, source=None):
def add(self, requirement, *, package_name=None, source=None, extra=None):
"""Output a Python-style requirement string as RPM dep"""
requirement_str = str(requirement)
print_err(f'Handling {requirement_str} from {source}')
try:
requirement = Requirement(requirement_str)
except InvalidRequirement:
hint = guess_reason_for_invalid_requirement(requirement_str)
message = f'Requirement {requirement_str!r} from {source} is invalid.'
if hint:
message += f' Hint: {hint}'
raise ValueError(message)
# requirements read initially from the metadata are strings
# further on we work with them as Requirement instances
if not isinstance(requirement, Requirement):
try:
requirement = Requirement(requirement)
except InvalidRequirement:
hint = guess_reason_for_invalid_requirement(requirement)
message = f'Requirement {requirement!r} from {source} is invalid.'
if hint:
message += f' Hint: {hint}'
raise ValueError(message)
if requirement.url:
print_err(
@ -118,10 +125,17 @@ class Requirements:
)
name = canonicalize_name(requirement.name)
if extra is not None:
extra_str = f'extra == "{extra}"'
if requirement.marker is not None:
extra_str = f'({requirement.marker}) and {extra_str}'
requirement.marker = Marker(extra_str)
if (requirement.marker is not None and
not self.evaluate_all_environments(requirement)):
print_err(f'Ignoring alien requirement:', requirement_str)
self.ignored_alien_requirements.append(requirement_str)
self.ignored_alien_requirements.append(requirement)
return
# Handle self-referencing requirements
@ -215,7 +229,8 @@ def toml_load(opened_binary_file):
return tomllib.load(opened_binary_file)
def get_backend(requirements):
@functools.cache
def load_pyproject():
try:
f = open('pyproject.toml', 'rb')
except FileNotFoundError:
@ -223,6 +238,11 @@ def get_backend(requirements):
else:
with f:
pyproject_data = toml_load(f)
return pyproject_data
def get_backend(requirements):
pyproject_data = load_pyproject()
buildsystem_data = pyproject_data.get('build-system', {})
requirements.extend(
@ -248,15 +268,6 @@ def get_backend(requirements):
# with pyproject.toml without a specified build backend.
# If the default requirements change, also change them in the macro!
requirements.add('setuptools >= 40.8', source='default build backend')
# PEP 517 doesn't mandate depending on wheel when the default backend is used.
# Historically, it used to be assumed as necessary, but later it turned out to be wrong.
# See the removal in pip and build:
# https://github.com/pypa/pip/pull/12449
# https://github.com/pypa/build/pull/716
# However, the requirement *will* be generated by setuptools anyway
# as part of get_requires_for_build_wheel().
# So we might as well keep it to skip one redundant step.
requirements.add('wheel', source='default build backend')
requirements.check(source='build backend')
@ -310,7 +321,9 @@ def generate_run_requirements_hook(backend, requirements):
raise ValueError(
'The build backend cannot provide build metadata '
'(incl. runtime requirements) before build. '
'Use the provisional -w flag to build the wheel and parse the metadata from it, '
'If the dependencies are specified in the pyproject.toml [project] '
'table, you can use the -p flag to read them. '
'Alternatively, use the provisional -w flag to build the wheel and parse the metadata from it, '
'or use the -R flag not to generate runtime dependencies.'
)
dir_basename = prepare_metadata('.', config_settings=requirements.config_settings)
@ -368,8 +381,35 @@ def generate_run_requirements_wheel(backend, requirements, wheeldir):
raise RuntimeError('Could not find *.dist-info/METADATA in built wheel.')
def generate_run_requirements(backend, requirements, *, build_wheel, wheeldir):
if build_wheel:
def generate_run_requirements_pyproject(requirements):
pyproject_data = load_pyproject()
if not (project_table := pyproject_data.get('project', {})):
raise ValueError('Could not find the [project] table in pyproject.toml.')
dynamic_fields = project_table.get('dynamic', [])
if 'dependencies' in dynamic_fields or 'optional-dependencies' in dynamic_fields:
raise ValueError('Could not read the dependencies or optional-dependencies '
'from the [project] table in pyproject.toml, as the field is dynamic.')
dependencies = project_table.get('dependencies', [])
name = project_table.get('name')
requirements.extend(dependencies,
package_name=name,
source=f'pyproject.toml generated metadata: [dependencies] ({name})')
optional_dependencies = project_table.get('optional-dependencies', {})
for extra, dependencies in optional_dependencies.items():
requirements.extend(dependencies,
package_name=name,
source=f'pyproject.toml generated metadata: [optional-dependencies] {extra} ({name})',
extra=extra)
def generate_run_requirements(backend, requirements, *, build_wheel, read_pyproject_dependencies, wheeldir):
if read_pyproject_dependencies:
generate_run_requirements_pyproject(requirements)
elif build_wheel:
generate_run_requirements_wheel(backend, requirements, wheeldir)
else:
generate_run_requirements_hook(backend, requirements)
@ -419,6 +459,103 @@ def generate_tox_requirements(toxenv, requirements):
source=f'tox --print-deps-only: {toxenv}')
def tox_dependency_groups(toxenv):
# We call this command separately instead of folding it into the previous one
# because --print-dependency-groups-to only works with tox 4.22+ and tox-current-env 0.0.14+.
# We handle failure gracefully: upstreams using dependency_groups should require tox >= 4.22.
toxenv = ','.join(toxenv)
with tempfile.NamedTemporaryFile('r') as groups:
r = subprocess.run(
[sys.executable, '-m', 'tox',
'--print-dependency-groups-to', groups.name,
'-q', '-e', toxenv],
check=False,
encoding='utf-8',
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
)
if r.returncode == 0:
if r.stdout:
print_err(r.stdout, end='')
if output := groups.read().strip():
return output.splitlines()
return []
def generate_dependency_groups(requested_groups, requirements):
"""Adapted from https://peps.python.org/pep-0735/#reference-implementation (public domain)"""
from collections import defaultdict
def _normalize_name(name: str) -> str:
return re.sub(r"[-_.]+", "-", name).lower()
def _normalize_group_names(dependency_groups: dict) -> dict:
original_names = defaultdict(list)
normalized_groups = {}
for group_name, value in dependency_groups.items():
normed_group_name = _normalize_name(group_name)
original_names[normed_group_name].append(group_name)
normalized_groups[normed_group_name] = value
errors = []
for normed_name, names in original_names.items():
if len(names) > 1:
errors.append(f"{normed_name} ({', '.join(names)})")
if errors:
raise ValueError(f"Duplicate dependency group names: {', '.join(errors)}")
return normalized_groups
def _resolve_dependency_group(
dependency_groups: dict, group: str, past_groups: tuple[str, ...] = ()
) -> list[str]:
if group in past_groups:
raise ValueError(f"Cyclic dependency group include: {group} -> {past_groups}")
if group not in dependency_groups:
raise LookupError(f"Dependency group '{group}' not found")
raw_group = dependency_groups[group]
if not isinstance(raw_group, list):
raise ValueError(f"Dependency group '{group}' is not a list")
realized_group = []
for item in raw_group:
if isinstance(item, str):
realized_group.append(item)
elif isinstance(item, dict):
if tuple(item.keys()) != ("include-group",):
raise ValueError(f"Invalid dependency group item: {item}")
include_group = _normalize_name(next(iter(item.values())))
realized_group.extend(
_resolve_dependency_group(
dependency_groups, include_group, past_groups + (group,)
)
)
else:
raise ValueError(f"Invalid dependency group item: {item}")
return realized_group
def resolve(dependency_groups: dict, group: str) -> list[str]:
if not isinstance(dependency_groups, dict):
raise TypeError("Dependency Groups table is not a dict")
return _resolve_dependency_group(dependency_groups, _normalize_name(group))
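# Example with hypothetical data:
#   resolve({"tests": ["pytest", {"include-group": "lint"}], "lint": ["ruff"]}, "tests")
#   returns ["pytest", "ruff"]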
pyproject_data = load_pyproject()
dependency_groups_raw = pyproject_data.get("dependency-groups", {})
dependency_groups = _normalize_group_names(dependency_groups_raw)
for group_names in requested_groups:
for group_name in group_names.split(","):
requirements.extend(
resolve(dependency_groups, group_name),
source=f"Dependency group {group_name}",
)
def python3dist(name, op=None, version=None, python3_pkgversion="3"):
prefix = f"python{python3_pkgversion}dist"
@ -431,9 +568,10 @@ def python3dist(name, op=None, version=None, python3_pkgversion="3"):
def generate_requires(
*, include_runtime=False, build_wheel=False, wheeldir=None, toxenv=None, extras=None,
*, include_runtime=False, build_wheel=False, wheeldir=None, toxenv=None, extras=None, dependency_groups=None,
get_installed_version=importlib.metadata.version, # for dep injection
generate_extras=False, python3_pkgversion="3", requirement_files=None, use_build_system=True,
read_pyproject_dependencies=False,
output, config_settings=None,
):
"""Generate the BuildRequires for the project in the current directory
@ -449,9 +587,10 @@ def generate_requires(
config_settings=config_settings,
)
dependency_groups = dependency_groups or []
try:
if (include_runtime or toxenv) and not use_build_system:
raise ValueError('-N option cannot be used in combination with -r, -e, -t, -x options')
if (include_runtime or toxenv or read_pyproject_dependencies) and not use_build_system:
raise ValueError('-N option cannot be used in combination with -r, -e, -t, -x, -p options')
if requirement_files:
for req_file in requirement_files:
requirements.extend(
@ -465,8 +604,12 @@ def generate_requires(
if toxenv:
include_runtime = True
generate_tox_requirements(toxenv, requirements)
dependency_groups.extend(tox_dependency_groups(toxenv))
if dependency_groups:
generate_dependency_groups(dependency_groups, requirements)
if include_runtime:
generate_run_requirements(backend, requirements, build_wheel=build_wheel, wheeldir=wheeldir)
generate_run_requirements(backend, requirements, build_wheel=build_wheel,
read_pyproject_dependencies=read_pyproject_dependencies, wheeldir=wheeldir)
except EndPass:
return
finally:
@ -493,7 +636,7 @@ def main(argv):
help=argparse.SUPPRESS,
)
parser.add_argument(
'-p', '--python3_pkgversion', metavar='PYTHON3_PKGVERSION',
'--python3_pkgversion', metavar='PYTHON3_PKGVERSION',
default="3", help=argparse.SUPPRESS,
)
parser.add_argument(
@ -508,6 +651,11 @@ def main(argv):
help='comma separated list of "extras" for runtime requirements '
'(e.g. -x testing,feature-x) (implies --runtime, can be repeated)',
)
parser.add_argument(
'-g', '--dependency-groups', metavar='GROUPS', action='append',
help='comma separated list of dependency groups (PEP 735) for requirements '
'(e.g. -g tests,docs) (can be repeated)',
)
parser.add_argument(
'-t', '--tox', action='store_true',
help=('generate test requirements from tox environment '
@ -523,6 +671,11 @@ def main(argv):
help=('Generate run-time requirements by building the wheel '
'(useful for build backends without the prepare_metadata_for_build_wheel hook)'),
)
parser.add_argument(
'-p', '--read-pyproject-dependencies', action='store_true', default=False,
help=('Generate dependencies from [project] table of pyproject.toml '
'instead of calling the prepare_metadata_for_build_wheel hook'),
)
parser.add_argument(
'-R', '--no-runtime', action='store_false', dest='runtime',
help="Don't generate run-time requirements (implied by -N)",
@ -571,10 +724,12 @@ def main(argv):
wheeldir=args.wheeldir,
toxenv=args.toxenv,
extras=args.extras,
dependency_groups=args.dependency_groups,
generate_extras=args.generate_extras,
python3_pkgversion=args.python3_pkgversion,
requirement_files=args.requirement_files,
use_build_system=args.use_build_system,
read_pyproject_dependencies=args.read_pyproject_dependencies,
output=args.output,
config_settings=parse_config_settings_args(args.config_settings),
)

File diff suppressed because it is too large.

@ -6,16 +6,36 @@ import pytest
import setuptools
import yaml
from pyproject_buildrequires import generate_requires
from pyproject_buildrequires import generate_requires, load_pyproject
SETUPTOOLS_VERSION = packaging.version.parse(setuptools.__version__)
SETUPTOOLS_60 = SETUPTOOLS_VERSION >= packaging.version.parse('60')
try:
import tox
except ImportError:
TOX_4_22 = False
else:
TOX_VERSION = packaging.version.parse(tox.__version__)
TOX_4_22 = TOX_VERSION >= packaging.version.parse('4.22')
testcases = {}
with Path(__file__).parent.joinpath('pyproject_buildrequires_testcases.yaml').open() as f:
testcases = yaml.safe_load(f)
@pytest.fixture(autouse=True)
def clear_pyproject_data():
"""
Clear pyproject data before each test.
In reality we build one RPM package at a time, so we can keep the once-loaded
pyproject.toml contents.
When testing, the cached data would leak the once-loaded data to all the
following test cases.
"""
load_pyproject.cache_clear()
@pytest.mark.parametrize('case_name', testcases)
def test_data(case_name, capfd, tmp_path, monkeypatch):
case = testcases[case_name]
@ -51,6 +71,7 @@ def test_data(case_name, capfd, tmp_path, monkeypatch):
requirement_files = case.get('requirement_files', [])
requirement_files = [open(f) for f in requirement_files]
use_build_system = case.get('use_build_system', True)
read_pyproject_dependencies = case.get('read_pyproject_dependencies', False)
try:
generate_requires(
get_installed_version=get_installed_version,
@ -58,10 +79,12 @@ def test_data(case_name, capfd, tmp_path, monkeypatch):
build_wheel=case.get('build_wheel', False),
wheeldir=str(wheeldir),
extras=case.get('extras', []),
dependency_groups=case.get('dependency_groups', []),
toxenv=case.get('toxenv', None),
generate_extras=case.get('generate_extras', False),
requirement_files=requirement_files,
use_build_system=use_build_system,
read_pyproject_dependencies=read_pyproject_dependencies,
output=output,
config_settings=case.get('config_settings'),
)

@ -14,8 +14,8 @@ License: MIT
# Increment Y and reset Z when new macros or features are added
# Increment Z when this is a bugfix or a cosmetic change
# Dropping support for EOL Fedoras is *not* considered a breaking change
Version: 1.14.0
Release: 2%{?dist}
Version: 1.16.2
Release: 1%{?dist}
# Macro files
Source001: macros.pyproject
@ -173,9 +173,28 @@ export HOSTNAME="rpmbuild" # to speedup tox in network-less mock, see rhbz#1856
%changelog
* Tue Oct 29 2024 Troy Dawson <tdawson@redhat.com> - 1.14.0-2
- Bump release for October 2024 mass rebuild:
Resolves: RHEL-64018
* Wed Nov 13 2024 Miro Hrončok <mhroncok@redhat.com> - 1.16.2-1
- Fix one remaining test for setuptools 70+
* Thu Nov 07 2024 Miro Hrončok <miro@hroncok.cz> - 1.16.1-1
- Support for setuptools 70+
- wheel is no longer generated as a dependency of the default build system
* Mon Nov 04 2024 Miro Hrončok <mhroncok@redhat.com> - 1.16.0-1
- %%pyproject_buildrequires: Add support for dependency groups (PEP 735), via the -g flag
- This is implied when the used tox testenvs depend on dependency groups (requires tox 4.22+)
- Fixes: rhbz#2318849
* Thu Oct 03 2024 Karolina Surma <ksurma@redhat.com> - 1.15.1-1
- Fix handling of self-referencing extras when reading pyproject.toml
* Tue Sep 17 2024 Python Maint <python-maint@redhat.com> - 1.15.0-1
- Add a possibility to read runtime requirements from pyproject.toml [project] table
- Fixes: rhbz#2261939
- Don't generate a dependency on pip when %%pyproject_buildrequires -N is used
- Fixes: rhbz#2294510
- Even when %%_auto_set_build_flags is disabled, set all compiler flags when building wheels
- Fixes: rhbz#2293616
* Tue Jul 23 2024 Miro Hrončok <mhroncok@redhat.com> - 1.14.0-1
- Add a provisional RPM Declarative Buildsystem (RPM 4.20+)
