Import upstream version 0.0.3, md5 0791dd16d6447ee2e85c86a7b3b1a959

commit 6d621f1903
Jelmer Vernooij 2021-03-27 17:41:02 +00:00

37 changed files with 3658 additions and 1490 deletions


@@ -13,28 +13,34 @@ jobs:
       fail-fast: false
     steps:
     - uses: actions/checkout@v2
     - name: Set up Python ${{ matrix.python-version }}
       uses: actions/setup-python@v2
       with:
         python-version: ${{ matrix.python-version }}
     - name: Install dependencies
       run: |
         python -m pip install --upgrade pip flake8 cython
-        python -m pip install git+https://github.com/jelmer/buildlog-consultant
         python setup.py develop
+    - name: Install Debian-specific dependencies
+      run: |
+        sudo apt install libapt-pkg-dev
+        python -m pip install wheel
+        python -m pip install git+https://salsa.debian.org/apt-team/python-apt
+        python -m pip install -e ".[debian]"
         mkdir -p ~/.config/breezy/plugins
         brz branch lp:brz-debian ~/.config/breezy/plugins/debian
+      if: "matrix.python-version != 'pypy3' && matrix.os == 'ubuntu-latest'"
     - name: Style checks
       run: |
         python -m flake8
     - name: Typing checks
       run: |
         pip install -U mypy
         python -m mypy ognibuild
       if: "matrix.python-version != 'pypy3'"
     - name: Test suite run
       run: |
         python -m unittest ognibuild.tests.test_suite
       env:
         PYTHONHASHSEED: random

PKG-INFO Normal file

@@ -0,0 +1,17 @@
Metadata-Version: 2.1
Name: ognibuild
Version: 0.0.3
Summary: Detect and run any build system
Home-page: https://jelmer.uk/code/ognibuild
Maintainer: Jelmer Vernooij
Maintainer-email: jelmer@jelmer.uk
License: GNU GPLv2 or later
Description: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Operating System :: POSIX
Provides-Extra: debian


@@ -1,5 +1,4 @@
-ognibuild
-=========
+# ognibuild
 
 Ognibuild is a simple wrapper with a common interface for invoking any kind of
 build tool.
@@ -10,8 +9,7 @@ parameters.
 
 It can also detect and install missing dependencies.
 
-Goals
------
+## Goals
 
 The goal of ognibuild is to provide a consistent CLI that can be used for any
 software package. It is mostly useful for automated building of
@@ -20,8 +18,7 @@ large sets of diverse packages (e.g. different programming languages).
 It is not meant to expose all functionality that is present in the underlying
 build systems. To use that, invoke those build systems directly.
 
-Usage
------
+## Usage
 
 Ognibuild has a number of subcommands:
@@ -34,7 +31,53 @@ Ognibuild has a number of subcommands:
 It also includes a subcommand that can fix up the build dependencies
 for Debian packages, called deb-fix-build.
 
-License
--------
+## Status
+
+Ognibuild is functional, but sometimes rough around the edges. If you run into
+issues (or lack of support for a particular ecosystem), please file a bug.
+
+### Supported Build Systems
+
+- Cabal
+- Cargo
+- Golang
+- Gradle
+- Make, including various makefile generators:
+  - autoconf/automake
+  - CMake
+  - Makefile.PL
+  - qmake
+- Maven
+- ninja, including ninja file generators:
+  - meson
+- Node
+- Octave
+- Perl
+  - Module::Build::Tiny
+- PHP Pear
+- Python - setup.py/setup.cfg/pyproject.toml
+- R
+- Ruby gems
+- Waf
+
+### Supported package repositories
+
+Package repositories are used to install missing dependencies.
+
+The following "native" repositories are supported:
+
+- pypi
+- cpan
+- hackage
+- npm
+- cargo
+- cran
+- golang\*
+
+As well as one distribution repository:
+
+- apt
+
+## License
 
 Ognibuild is licensed under the GNU GPL, v2 or later.
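The "detect and run any build system" idea from the README can be pictured with a toy sketch. This is not ognibuild's actual `detect_buildsystems` implementation; the function name `detect_buildsystem` and the marker-file table below are invented for illustration only.

```python
import os

def detect_buildsystem(path):
    # Toy sketch: probe a directory for well-known marker files and name
    # the build system they imply.  The real ognibuild detection supports
    # many more ecosystems (see the Supported Build Systems list above).
    markers = {
        "setup.py": "setuptools",
        "pyproject.toml": "python-pep517",
        "Cargo.toml": "cargo",
        "Makefile": "make",
    }
    for filename, name in markers.items():
        if os.path.exists(os.path.join(path, filename)):
            return name
    return None
```

A single consistent answer per directory is what lets the CLI offer uniform `build`/`test`/`dist` subcommands regardless of ecosystem.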


@@ -0,0 +1,17 @@
Metadata-Version: 2.1
Name: ognibuild
Version: 0.0.3
Summary: Detect and run any build system
Home-page: https://jelmer.uk/code/ognibuild
Maintainer: Jelmer Vernooij
Maintainer-email: jelmer@jelmer.uk
License: GNU GPLv2 or later
Description: UNKNOWN
Platform: UNKNOWN
Classifier: Development Status :: 4 - Beta
Classifier: License :: OSI Approved :: GNU General Public License v2 or later (GPLv2+)
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Programming Language :: Python :: Implementation :: CPython
Classifier: Operating System :: POSIX
Provides-Extra: debian


@@ -0,0 +1,52 @@
.flake8
.gitignore
AUTHORS
CODE_OF_CONDUCT.md
LICENSE
README.md
SECURITY.md
TODO
releaser.conf
setup.cfg
setup.py
.github/workflows/pythonpackage.yml
notes/architecture.md
notes/concepts.md
notes/roadmap.md
ognibuild/__init__.py
ognibuild/__main__.py
ognibuild/build.py
ognibuild/buildlog.py
ognibuild/buildsystem.py
ognibuild/clean.py
ognibuild/dist.py
ognibuild/dist_catcher.py
ognibuild/fix_build.py
ognibuild/fixers.py
ognibuild/info.py
ognibuild/install.py
ognibuild/outputs.py
ognibuild/requirements.py
ognibuild/test.py
ognibuild/vcs.py
ognibuild.egg-info/PKG-INFO
ognibuild.egg-info/SOURCES.txt
ognibuild.egg-info/dependency_links.txt
ognibuild.egg-info/entry_points.txt
ognibuild.egg-info/requires.txt
ognibuild.egg-info/top_level.txt
ognibuild/debian/__init__.py
ognibuild/debian/apt.py
ognibuild/debian/build.py
ognibuild/debian/build_deps.py
ognibuild/debian/file_search.py
ognibuild/debian/fix_build.py
ognibuild/debian/udd.py
ognibuild/resolver/__init__.py
ognibuild/resolver/apt.py
ognibuild/session/__init__.py
ognibuild/session/plain.py
ognibuild/session/schroot.py
ognibuild/tests/__init__.py
ognibuild/tests/test_debian_build.py
ognibuild/tests/test_debian_fix_build.py


@@ -0,0 +1 @@


@@ -0,0 +1,4 @@
[console_scripts]
deb-fix-build = ognibuild.debian.fix_build:main
ogni = ognibuild.__main__:main


@@ -0,0 +1,8 @@
breezy
buildlog-consultant>=0.0.4
requirements-parser

[debian]
debmutate
python_apt
python_debian


@@ -0,0 +1 @@
ognibuild


@@ -20,6 +20,12 @@ import os
 import stat
 
+__version__ = (0, 0, 3)
+
+USER_AGENT = "Ognibuild"
+
+
 class DetailedFailure(Exception):
     def __init__(self, retcode, argv, error):
         self.retcode = retcode
@@ -28,12 +34,22 @@ class DetailedFailure(Exception):
 
 class UnidentifiedError(Exception):
+    """An unidentified error."""
+
     def __init__(self, retcode, argv, lines, secondary=None):
         self.retcode = retcode
         self.argv = argv
         self.lines = lines
         self.secondary = secondary
 
+    def __repr__(self):
+        return "<%s(%r, %r, ..., secondary=%r)>" % (
+            type(self).__name__,
+            self.retcode,
+            self.argv,
+            self.secondary,
+        )
+
 
 def shebang_binary(p):
     if not (os.stat(p).st_mode & stat.S_IEXEC):
@@ -42,7 +58,7 @@ def shebang_binary(p):
         firstline = f.readline()
         if not firstline.startswith(b"#!"):
             return None
-        args = firstline[2:].split(b" ")
+        args = firstline[2:].strip().split(b" ")
         if args[0] in (b"/usr/bin/env", b"env"):
             return os.path.basename(args[1].decode()).strip()
         return os.path.basename(args[0].decode()).strip()
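The `.strip()` added in the last hunk normalizes the shebang line before splitting, so leading whitespace after `#!` (as in `#! /bin/sh`) and the trailing newline no longer corrupt the interpreter name. The fixed logic, lifted into a standalone sketch straight from the hunk above:

```python
import os
import stat

def shebang_binary(path):
    # Mirrors the fixed ognibuild.shebang_binary above: skip files that
    # are not executable, then parse the "#!" line.  strip() removes the
    # trailing newline and any spaces between "#!" and the interpreter.
    if not (os.stat(path).st_mode & stat.S_IEXEC):
        return None
    with open(path, "rb") as f:
        firstline = f.readline()
    if not firstline.startswith(b"#!"):
        return None
    args = firstline[2:].strip().split(b" ")
    if args[0] in (b"/usr/bin/env", b"env"):
        return os.path.basename(args[1].decode()).strip()
    return os.path.basename(args[0].decode()).strip()
```

Without the early `.strip()`, a line such as `#! /bin/sh` would split into an empty first token and misidentify the interpreter.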


@@ -20,12 +20,16 @@ import os
 import shlex
 import sys
 from . import UnidentifiedError, DetailedFailure
-from .buildlog import InstallFixer, ExplainInstallFixer, ExplainInstall
+from .buildlog import (
+    InstallFixer,
+    ExplainInstallFixer,
+    ExplainInstall,
+    install_missing_reqs,
+)
 from .buildsystem import NoBuildToolsFound, detect_buildsystems
 from .resolver import (
     auto_resolver,
     native_resolvers,
-    UnsatisfiedRequirements,
 )
 from .resolver.apt import AptResolver
@@ -35,8 +39,7 @@ def display_explain_commands(commands):
     for command, reqs in commands:
         if isinstance(command, list):
             command = shlex.join(command)
-        logging.info(
-            ' %s (to install %s)', command, ', '.join(map(str, reqs)))
+        logging.info(" %s (to install %s)", command, ", ".join(map(str, reqs)))
 
 
 def get_necessary_declared_requirements(resolver, requirements, stages):
@@ -47,12 +50,14 @@ def get_necessary_declared_requirements(resolver, requirements, stages):
     return missing
 
 
-def install_necessary_declared_requirements(session, resolver, buildsystems, stages, explain=False):
+def install_necessary_declared_requirements(
+    session, resolver, fixers, buildsystems, stages, explain=False
+):
     relevant = []
     declared_reqs = []
     for buildsystem in buildsystems:
         try:
-            declared_reqs.extend(buildsystem.get_declared_dependencies())
+            declared_reqs.extend(buildsystem.get_declared_dependencies(session, fixers))
         except NotImplementedError:
             logging.warning(
                 "Unable to determine declared dependencies from %r", buildsystem
@@ -60,21 +65,8 @@ def install_necessary_declared_requirements(
     relevant.extend(
         get_necessary_declared_requirements(resolver, declared_reqs, stages)
     )
-    missing = []
-    for req in relevant:
-        try:
-            if not req.met(session):
-                missing.append(req)
-        except NotImplementedError:
-            missing.append(req)
-    if missing:
-        if explain:
-            commands = resolver.explain(missing)
-            if not commands:
-                raise UnsatisfiedRequirements(missing)
-            raise ExplainInstall(commands)
-        else:
-            resolver.install(missing)
+    install_missing_reqs(session, resolver, relevant, explain=explain)
 
 
 # Types of dependencies:
@@ -154,35 +146,46 @@ def main():  # noqa: C901
     session = PlainSession()
     with session:
+        logging.info("Preparing directory %s", args.directory)
+        external_dir, internal_dir = session.setup_from_directory(args.directory)
+        session.chdir(internal_dir)
+        os.chdir(external_dir)
+
         if args.resolve == "apt":
             resolver = AptResolver.from_session(session)
         elif args.resolve == "native":
-            resolver = native_resolvers(session)
+            resolver = native_resolvers(session, user_local=args.user)
         elif args.resolve == "auto":
-            resolver = auto_resolver(session)
+            resolver = auto_resolver(session, explain=args.explain)
         logging.info("Using requirement resolver: %s", resolver)
-        os.chdir(args.directory)
         try:
             bss = list(detect_buildsystems(args.directory))
-            logging.info(
-                "Detected buildsystems: %s", ', '.join(map(str, bss)))
+            logging.info("Detected buildsystems: %s", ", ".join(map(str, bss)))
+            fixers = determine_fixers(session, resolver, explain=args.explain)
             if not args.ignore_declared_dependencies:
                 stages = STAGE_MAP[args.subcommand]
                 if stages:
                     logging.info("Checking that declared requirements are present")
                     try:
                         install_necessary_declared_requirements(
-                            session, resolver, bss, stages, explain=args.explain)
+                            session, resolver, fixers, bss, stages, explain=args.explain
+                        )
                     except ExplainInstall as e:
                         display_explain_commands(e.commands)
                         return 1
-            fixers = determine_fixers(session, resolver, explain=args.explain)
             if args.subcommand == "dist":
-                from .dist import run_dist
-                run_dist(
-                    session=session, buildsystems=bss, resolver=resolver, fixers=fixers
-                )
+                from .dist import run_dist, DistNoTarball
+                try:
+                    run_dist(
+                        session=session,
+                        buildsystems=bss,
+                        resolver=resolver,
+                        fixers=fixers,
+                        target_directory=".",
+                    )
+                except DistNoTarball:
+                    logging.fatal('No tarball created.')
+                    return 1
             if args.subcommand == "build":
                 from .build import run_build


@@ -34,10 +34,14 @@ from buildlog_consultant.common import (
     MissingPerlModule,
     MissingXmlEntity,
     MissingJDKFile,
+    MissingJDK,
+    MissingJRE,
     MissingNodeModule,
+    MissingNodePackage,
     MissingPhpClass,
     MissingRubyGem,
     MissingLibrary,
+    MissingSetupPyCommand,
     MissingJavaClass,
     MissingCSharpCompiler,
     MissingRPackage,
@@ -46,10 +50,16 @@ from buildlog_consultant.common import (
     MissingValaPackage,
     MissingXfceDependency,
     MissingHaskellDependencies,
+    MissingVagueDependency,
     DhAddonLoadFailure,
     MissingMavenArtifacts,
     GnomeCommonMissing,
     MissingGnomeCommonDependency,
+    UnknownCertificateAuthority,
+    CMakeFilesMissing,
+    MissingLibtool,
+    MissingQt,
+    MissingX11,
 )
 
 from .fix_build import BuildFixer
@@ -71,15 +81,24 @@ from .requirements import (
     XmlEntityRequirement,
     SprocketsFileRequirement,
     JavaClassRequirement,
+    CMakefileRequirement,
     HaskellPackageRequirement,
     MavenArtifactRequirement,
     GnomeCommonRequirement,
     JDKFileRequirement,
+    JDKRequirement,
+    JRERequirement,
     PerlModuleRequirement,
     PerlFileRequirement,
     AutoconfMacroRequirement,
     PythonModuleRequirement,
     PythonPackageRequirement,
+    CertificateAuthorityRequirement,
+    NodeModuleRequirement,
+    QTRequirement,
+    X11Requirement,
+    LibtoolRequirement,
+    VagueDependencyRequirement,
 )
 
 from .resolver import UnsatisfiedRequirements
@@ -108,7 +127,11 @@ def problem_to_upstream_requirement(problem):  # noqa: C901
     elif isinstance(problem, MissingRPackage):
         return RPackageRequirement(problem.package, problem.minimum_version)
     elif isinstance(problem, MissingNodeModule):
-        return NodePackageRequirement(problem.module)
+        return NodeModuleRequirement(problem.module)
+    elif isinstance(problem, MissingNodePackage):
+        return NodePackageRequirement(problem.package)
+    elif isinstance(problem, MissingVagueDependency):
+        return VagueDependencyRequirement(problem.name, minimum_version=problem.minimum_version)
     elif isinstance(problem, MissingLibrary):
         return LibraryRequirement(problem.library)
     elif isinstance(problem, MissingRubyFile):
@@ -119,16 +142,37 @@ def problem_to_upstream_requirement(problem):  # noqa: C901
         return SprocketsFileRequirement(problem.content_type, problem.name)
     elif isinstance(problem, MissingJavaClass):
         return JavaClassRequirement(problem.classname)
+    elif isinstance(problem, CMakeFilesMissing):
+        return [CMakefileRequirement(filename) for filename in problem.filenames]
     elif isinstance(problem, MissingHaskellDependencies):
         return [HaskellPackageRequirement.from_string(dep) for dep in problem.deps]
     elif isinstance(problem, MissingMavenArtifacts):
-        return [MavenArtifactRequirement(artifact) for artifact in problem.artifacts]
+        return [
+            MavenArtifactRequirement.from_str(artifact)
+            for artifact in problem.artifacts
+        ]
     elif isinstance(problem, MissingCSharpCompiler):
         return BinaryRequirement("msc")
     elif isinstance(problem, GnomeCommonMissing):
         return GnomeCommonRequirement()
     elif isinstance(problem, MissingJDKFile):
         return JDKFileRequirement(problem.jdk_path, problem.filename)
+    elif isinstance(problem, MissingJDK):
+        return JDKRequirement()
+    elif isinstance(problem, MissingJRE):
+        return JRERequirement()
+    elif isinstance(problem, MissingQt):
+        return QTRequirement()
+    elif isinstance(problem, MissingX11):
+        return X11Requirement()
+    elif isinstance(problem, MissingLibtool):
+        return LibtoolRequirement()
+    elif isinstance(problem, UnknownCertificateAuthority):
+        return CertificateAuthorityRequirement(problem.url)
+    elif isinstance(problem, MissingSetupPyCommand):
+        if problem.command == "test":
+            return PythonPackageRequirement("setuptools")
+        return None
     elif isinstance(problem, MissingGnomeCommonDependency):
         if problem.package == "glib-gettext":
             return BinaryRequirement("glib-gettextize")
@@ -159,7 +203,7 @@ def problem_to_upstream_requirement(problem):  # noqa: C901
         )
     elif isinstance(problem, MissingPythonDistribution):
         return PythonPackageRequirement(
-            problem.module,
+            problem.distribution,
             python_version=problem.python_version,
             minimum_version=problem.minimum_version,
         )
@@ -181,7 +225,7 @@ class InstallFixer(BuildFixer):
         req = problem_to_upstream_requirement(error)
         return req is not None
 
-    def fix(self, error, context):
+    def fix(self, error, phase):
         reqs = problem_to_upstream_requirement(error)
         if reqs is None:
             return False
@@ -197,7 +241,6 @@ class InstallFixer(BuildFixer):
 
 
 class ExplainInstall(Exception):
     def __init__(self, commands):
         self.commands = commands
-
 
@@ -216,7 +259,7 @@ class ExplainInstallFixer(BuildFixer):
         req = problem_to_upstream_requirement(error)
         return req is not None
 
-    def fix(self, error, context):
+    def fix(self, error, phase):
         reqs = problem_to_upstream_requirement(error)
         if reqs is None:
             return False
@@ -228,3 +271,23 @@ class ExplainInstallFixer(BuildFixer):
         if not explanations:
             return False
         raise ExplainInstall(explanations)
+
+
+def install_missing_reqs(session, resolver, reqs, explain=False):
+    if not reqs:
+        return
+    missing = []
+    for req in reqs:
+        try:
+            if not req.met(session):
+                missing.append(req)
+        except NotImplementedError:
+            missing.append(req)
+    if missing:
+        if explain:
+            commands = resolver.explain(missing)
+            if not commands:
+                raise UnsatisfiedRequirements(missing)
+            raise ExplainInstall(commands)
+        else:
+            resolver.install(missing)
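The new `install_missing_reqs` helper centralizes a pattern that previously lived inline in `__main__`: probe each requirement's `met()`, treat "cannot check" as missing, then either raise an explanation or install. A self-contained sketch of that flow, with `FakeReq` and `FakeResolver` stand-ins invented here for illustration (the real classes live in `ognibuild.requirements` and `ognibuild.resolver`):

```python
class ExplainInstall(Exception):
    def __init__(self, commands):
        self.commands = commands

class UnsatisfiedRequirements(Exception):
    def __init__(self, reqs):
        self.reqs = reqs

class FakeReq:
    # Stand-in requirement: just records a name and whether it is satisfied.
    def __init__(self, name, is_met):
        self.name = name
        self._met = is_met
    def met(self, session):
        return self._met

class FakeResolver:
    # Stand-in resolver: explains via apt-style commands, records installs.
    def __init__(self):
        self.installed = []
    def explain(self, missing):
        return [(["apt", "install", r.name], [r]) for r in missing]
    def install(self, missing):
        self.installed.extend(missing)

def install_missing_reqs(session, resolver, reqs, explain=False):
    # Same logic as the hunk above: unverifiable requirements count as
    # missing, and explain-mode raises instead of installing.
    missing = []
    for req in reqs:
        try:
            if not req.met(session):
                missing.append(req)
        except NotImplementedError:
            missing.append(req)
    if missing:
        if explain:
            commands = resolver.explain(missing)
            if not commands:
                raise UnsatisfiedRequirements(missing)
            raise ExplainInstall(commands)
        else:
            resolver.install(missing)
```

Raising `ExplainInstall` rather than printing directly lets the caller decide how to render the suggested commands, which is exactly how `__main__.display_explain_commands` consumes it.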

File diff suppressed because it is too large


@@ -15,14 +15,14 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 
+import os
+
 from debian.deb822 import Deb822
 
 from ..session import Session
 
-# TODO(jelmer): move this to debian/
-def satisfy_build_deps(session: Session, tree):
-    source = Deb822(tree.get_file("debian/control"))
+def satisfy_build_deps(session: Session, tree, debian_path):
+    source = Deb822(tree.get_file(os.path.join(debian_path, "control")))
     deps = []
     for name in ["Build-Depends", "Build-Depends-Indep", "Build-Depends-Arch"]:
         try:

@@ -17,8 +17,7 @@
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
 
 import logging
-import re
-from typing import List, Iterator, Optional, Set
+from typing import List, Optional
 import os
 
 from buildlog_consultant.apt import (
@@ -26,12 +25,22 @@ from buildlog_consultant.apt import (
 )
 
 from .. import DetailedFailure, UnidentifiedError
-from ..session import Session, run_with_tee
+from ..session import Session, run_with_tee, get_user
+from .file_search import (
+    FileSearcher,
+    AptCachedContentsFileSearcher,
+    GENERATED_FILE_SEARCHER,
+    get_packages_for_paths,
+)
 
 
-def run_apt(session: Session, args: List[str]) -> None:
+def run_apt(
+    session: Session, args: List[str], prefix: Optional[List[str]] = None
+) -> None:
     """Run apt."""
-    args = ["apt", "-y"] + args
+    if prefix is None:
+        prefix = []
+    args = prefix + ["apt", "-y"] + args
     retcode, lines = run_with_tee(session, args, cwd="/", user="root")
     if retcode == 0:
         return
@@ -43,25 +52,31 @@ def run_apt(
     raise UnidentifiedError(retcode, args, lines, secondary=match)
 
 
-class FileSearcher(object):
-    def search_files(self, path: str, regex: bool = False) -> Iterator[str]:
-        raise NotImplementedError(self.search_files)
-
-
 class AptManager(object):
 
     session: Session
     _searchers: Optional[List[FileSearcher]]
 
-    def __init__(self, session):
+    def __init__(self, session, prefix=None):
         self.session = session
         self._apt_cache = None
         self._searchers = None
+        if prefix is None:
+            prefix = []
+        self.prefix = prefix
+
+    @classmethod
+    def from_session(cls, session):
+        if get_user(session) != "root":
+            prefix = ["sudo"]
+        else:
+            prefix = []
+        return cls(session, prefix=prefix)
 
     def searchers(self):
         if self._searchers is None:
             self._searchers = [
-                AptContentsFileSearcher.from_session(self.session),
+                AptCachedContentsFileSearcher.from_session(self.session),
                 GENERATED_FILE_SEARCHER,
             ]
         return self._searchers
@@ -73,10 +88,12 @@ class AptManager(object):
             self._apt_cache = apt.Cache(rootdir=self.session.location)
         return package in self._apt_cache
 
-    def get_package_for_paths(self, paths, regex=False):
+    def get_packages_for_paths(self, paths, regex=False, case_insensitive=False):
         logging.debug("Searching for packages containing %r", paths)
         # TODO(jelmer): Make sure we use whatever is configured in self.session
-        return get_package_for_paths(paths, self.searchers(), regex=regex)
+        return get_packages_for_paths(
+            paths, self.searchers(), regex=regex, case_insensitive=case_insensitive
+        )
 
     def missing(self, packages):
         root = getattr(self.session, "location", "/")
@@ -98,220 +115,10 @@ class AptManager(object):
         logging.info("Installing using apt: %r", packages)
         packages = self.missing(packages)
         if packages:
-            run_apt(self.session, ["install"] + packages)
+            run_apt(self.session, ["install"] + packages, prefix=self.prefix)
 
     def satisfy(self, deps: List[str]) -> None:
-        run_apt(self.session, ["satisfy"] + deps)
-
-
-class ContentsFileNotFound(Exception):
-    """The contents file was not found."""
-
-
-class AptContentsFileSearcher(FileSearcher):
-    def __init__(self):
-        self._db = {}
-
-    @classmethod
-    def from_session(cls, session):
-        logging.info("Loading apt contents information")
-        # TODO(jelmer): what about sources.list.d?
-        from aptsources.sourceslist import SourcesList
-
-        sl = SourcesList()
-        sl.load(os.path.join(session.location, "etc/apt/sources.list"))
-        return cls.from_sources_list(
-            sl,
-            cache_dirs=[
-                os.path.join(session.location, "var/lib/apt/lists"),
-                "/var/lib/apt/lists",
-            ],
-        )
-
-    def __setitem__(self, path, package):
-        self._db[path] = package
-
-    def search_files(self, path, regex=False):
-        c = re.compile(path)
-        for p, pkg in sorted(self._db.items()):
-            if regex:
-                if c.match(p):
-                    yield pkg
-            else:
-                if path == p:
-                    yield pkg
-
-    def load_file(self, f):
-        for line in f:
-            (path, rest) = line.rsplit(maxsplit=1)
-            package = rest.split(b"/")[-1]
-            decoded_path = "/" + path.decode("utf-8", "surrogateescape")
-            self[decoded_path] = package.decode("utf-8")
-
-    @classmethod
-    def _load_cache_file(cls, url, cache_dir):
-        from urllib.parse import urlparse
-
-        parsed = urlparse(url)
-        p = os.path.join(
-            cache_dir, parsed.hostname + parsed.path.replace("/", "_") + ".lz4"
-        )
-        if not os.path.exists(p):
-            return None
-        logging.debug("Loading cached contents file %s", p)
-        import lz4.frame
-
-        return lz4.frame.open(p, mode="rb")
-
-    @classmethod
-    def from_urls(cls, urls, cache_dirs=None):
-        self = cls()
-        for url, mandatory in urls:
-            for cache_dir in cache_dirs or []:
-                f = cls._load_cache_file(url, cache_dir)
-                if f is not None:
-                    self.load_file(f)
-                    break
-            else:
-                if not mandatory and self._db:
-                    logging.debug(
-                        "Not attempting to fetch optional contents " "file %s", url
-                    )
-                else:
-                    logging.debug("Fetching contents file %s", url)
-                    try:
-                        self.load_url(url)
-                    except ContentsFileNotFound:
-                        if mandatory:
-                            logging.warning("Unable to fetch contents file %s", url)
-                        else:
-                            logging.debug(
-                                "Unable to fetch optional contents file %s", url
-                            )
-        return self
-
-    @classmethod
-    def from_sources_list(cls, sl, cache_dirs=None):
-        # TODO(jelmer): Use aptsources.sourceslist.SourcesList
-        from .build import get_build_architecture
-
-        # TODO(jelmer): Verify signatures, etc.
-        urls = []
-        arches = [(get_build_architecture(), True), ("all", False)]
-        for source in sl.list:
-            if source.invalid or source.disabled:
-                continue
-            if source.type == "deb-src":
-                continue
-            if source.type != "deb":
-                logging.warning("Invalid line in sources: %r", source)
-                continue
-            base_url = source.uri.rstrip("/")
-            name = source.dist.rstrip("/")
-            components = source.comps
-            if components:
-                dists_url = base_url + "/dists"
-            else:
-                dists_url = base_url
-            if components:
-                for component in components:
-                    for arch, mandatory in arches:
-                        urls.append(
-                            (
-                                "%s/%s/%s/Contents-%s"
-                                % (dists_url, name, component, arch),
-                                mandatory,
-                            )
-                        )
-            else:
-                for arch, mandatory in arches:
-                    urls.append(
-                        (
-                            "%s/%s/Contents-%s" % (dists_url, name.rstrip("/"), arch),
-                            mandatory,
-                        )
-                    )
-        return cls.from_urls(urls, cache_dirs=cache_dirs)
-
-    @staticmethod
-    def _get(url):
-        from urllib.request import urlopen, Request
-
-        request = Request(url, headers={"User-Agent": "Debian Janitor"})
-        return urlopen(request)
-
-    def load_url(self, url, allow_cache=True):
-        from urllib.error import HTTPError
-
-        for ext in [".xz", ".gz", ""]:
-            try:
-                response = self._get(url + ext)
-            except HTTPError as e:
-                if e.status == 404:
-                    continue
-                raise
-            break
-        else:
-            raise ContentsFileNotFound(url)
-        if ext == ".gz":
-            import gzip
-
-            f = gzip.GzipFile(fileobj=response)
-        elif ext == ".xz":
-            import lzma
-            from io import BytesIO
-
-            f = BytesIO(lzma.decompress(response.read()))
-        elif response.headers.get_content_type() == "text/plain":
-            f = response
-        else:
-            raise Exception(
-                "Unknown content type %r" % response.headers.get_content_type()
-            )
-        self.load_file(f)
-
-
-class GeneratedFileSearcher(FileSearcher):
-    def __init__(self, db):
-        self._db = db
-
-    def search_files(self, path: str, regex: bool = False) -> Iterator[str]:
-        for p, pkg in sorted(self._db.items()):
-            if regex:
-                if re.match(path, p):
-                    yield pkg
-            else:
-                if path == p:
-                    yield pkg
-
-
-# TODO(jelmer): read from a file
-GENERATED_FILE_SEARCHER = GeneratedFileSearcher(
-    {
-        "/etc/locale.gen": "locales",
-        # Alternative
-        "/usr/bin/rst2html": "/usr/share/docutils/scripts/python3/rst2html",
-    }
-)
-
-
-def get_package_for_paths(
-    paths: List[str], searchers: List[FileSearcher], regex: bool = False
-) -> Optional[str]:
-    candidates: Set[str] = set()
-    for path in paths:
-        for searcher in searchers:
-            candidates.update(searcher.search_files(path, regex=regex))
-            if candidates:
-                break
-    if len(candidates) == 0:
-        logging.warning("No packages found that contain %r", paths)
-        return None
-    if len(candidates) > 1:
-        logging.warning(
-            "More than 1 packages found that contain %r: %r", path, candidates
-        )
-        # Euhr. Pick the one with the shortest name?
-        return sorted(candidates, key=len)[0]
-    else:
-        return candidates.pop()
+        run_apt(self.session, ["satisfy"] + deps, prefix=self.prefix)
+
+    def satisfy_command(self, deps: List[str]) -> List[str]:
+        return self.prefix + ["apt", "satisfy"] + deps
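The removed `get_package_for_paths` (whose logic moves into `ognibuild/debian/file_search.py` as `get_packages_for_paths` in this commit) embodies a small heuristic worth noting: when several packages ship a matching file, pick the one with the shortest name as a rough proxy for "most fundamental". A self-contained sketch of that selection, with a `DictSearcher` stand-in invented here (mirroring `GeneratedFileSearcher`'s static path-to-package map) and made-up package names:

```python
class DictSearcher:
    # Stand-in for GeneratedFileSearcher above: a static map from file
    # path to the package(s) that provide it.
    def __init__(self, db):
        self._db = db

    def search_files(self, path):
        return list(self._db.get(path, []))

def get_package_for_paths(paths, searchers):
    # Same shape as the removed helper: gather candidates per path,
    # stop probing further searchers once one of them matches.
    candidates = set()
    for path in paths:
        for searcher in searchers:
            candidates.update(searcher.search_files(path))
            if candidates:
                break
    if not candidates:
        return None
    # Several packages match: pick the shortest name as a heuristic.
    return sorted(candidates, key=len)[0]
```

The shortest-name tie-break is admittedly crude (the original comments "Euhr. Pick the one with the shortest name?"), but it tends to prefer `gcc` over `gcc-10-cross-base`-style variants.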


@@ -33,9 +33,9 @@ import sys
 from debian.changelog import Changelog
 from debmutate.changelog import get_maintainer, format_datetime
 
-from breezy import osutils
 from breezy.mutabletree import MutableTree
 from breezy.plugins.debian.builder import BuildFailedError
+from breezy.tree import Tree
 
 from buildlog_consultant.sbuild import (
     worker_failure_from_sbuild_log,
@@ -71,6 +71,18 @@ def get_build_architecture():
         raise Exception("Could not find the build architecture: %s" % e)
 
 
+def control_files_in_root(tree: Tree, subpath: str) -> bool:
+    debian_path = os.path.join(subpath, "debian")
+    if tree.has_filename(debian_path):
+        return False
+    control_path = os.path.join(subpath, "control")
+    if tree.has_filename(control_path):
+        return True
+    if tree.has_filename(control_path + ".in"):
+        return True
+    return False
+
+
 def add_dummy_changelog_entry(
     tree: MutableTree,
     subpath: str,
@@ -99,7 +111,10 @@ def add_dummy_changelog_entry(
     else:
         return v + suffix + "1"
 
-    path = os.path.join(subpath, "debian", "changelog")
+    if control_files_in_root(tree, subpath):
+        path = os.path.join(subpath, "changelog")
+    else:
+        path = os.path.join(subpath, "debian", "changelog")
     if maintainer is None:
         maintainer = get_maintainer()
     if timestamp is None:
@@ -126,7 +141,10 @@ def add_dummy_changelog_entry(
 def get_latest_changelog_version(local_tree, subpath=""):
-    path = osutils.pathjoin(subpath, "debian/changelog")
+    if control_files_in_root(local_tree, subpath):
+        path = os.path.join(subpath, "changelog")
+    else:
+        path = os.path.join(subpath, "debian", "changelog")
     with local_tree.get_file(path) as f:
         cl = Changelog(f, max_blocks=1)
         return cl.package, cl.version

View file

@@ -0,0 +1,84 @@
#!/usr/bin/python3
# Copyright (C) 2021 Jelmer Vernooij
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
"""Tie breaking by build deps."""
import logging


class BuildDependencyTieBreaker(object):
    def __init__(self, rootdir):
        self.rootdir = rootdir
        self._counts = None

    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.rootdir)

    @classmethod
    def from_session(cls, session):
        return cls(session.location)

    def _count(self):
        counts = {}
        import apt_pkg

        apt_pkg.init()
        apt_pkg.config.set("Dir", self.rootdir)
        apt_cache = apt_pkg.SourceRecords()
        apt_cache.restart()
        while apt_cache.step():
            try:
                for d in apt_cache.build_depends.values():
                    for o in d:
                        for p in o:
                            counts.setdefault(p[0], 0)
                            counts[p[0]] += 1
            except AttributeError:
                pass
        return counts

    def __call__(self, reqs):
        if self._counts is None:
            self._counts = self._count()
        by_count = {}
        for req in reqs:
            try:
                by_count[req] = self._counts[list(req.package_names())[0]]
            except KeyError:
                pass
        if not by_count:
            return None
        top = max(by_count.items(), key=lambda k: k[1])
        logging.info(
            "Breaking tie between %r to %r based on build-depends count",
            [repr(r) for r in reqs],
            top[0],
        )
        return top[0]


if __name__ == "__main__":
    import argparse

    from ..resolver.apt import AptRequirement

    parser = argparse.ArgumentParser()
    parser.add_argument("req", nargs="+")
    args = parser.parse_args()
    reqs = [AptRequirement.from_str(req) for req in args.req]
    tie_breaker = BuildDependencyTieBreaker("/")
    print(tie_breaker(reqs))
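The selection logic in `__call__` can be illustrated without apt at all. The sketch below pre-seeds the count table (normally built from the apt source records) and mirrors the max-by-count step; the requirement class and the counts are hypothetical stand-ins.

```python
from typing import Dict, List, Optional


class FakeRequirement:
    """Hypothetical stand-in for AptRequirement: one candidate package name."""

    def __init__(self, name: str):
        self.name = name

    def package_names(self) -> List[str]:
        return [self.name]

    def __repr__(self) -> str:
        return "FakeRequirement(%r)" % self.name


def break_tie(
    counts: Dict[str, int], reqs: List[FakeRequirement]
) -> Optional[FakeRequirement]:
    # Mirror of BuildDependencyTieBreaker.__call__: score each requirement
    # by how often its package appears as a build-dependency, keep the max.
    by_count = {}
    for req in reqs:
        try:
            by_count[req] = counts[list(req.package_names())[0]]
        except KeyError:
            pass
    if not by_count:
        return None
    return max(by_count.items(), key=lambda k: k[1])[0]


# Hypothetical counts: python3-yaml is the far more common build-dependency.
counts = {"python3-yaml": 1500, "pypy-yaml": 3}
reqs = [FakeRequirement("pypy-yaml"), FakeRequirement("python3-yaml")]
print(break_tie(counts, reqs))  # FakeRequirement('python3-yaml')
```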

View file

@@ -0,0 +1,349 @@
#!/usr/bin/python
# Copyright (C) 2019-2020 Jelmer Vernooij <jelmer@jelmer.uk>
# encoding: utf-8
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import apt_pkg
from datetime import datetime
from debian.deb822 import Release
import os
import re
from typing import Iterator, List
import logging

from .. import USER_AGENT


class FileSearcher(object):
    def search_files(
        self, path: str, regex: bool = False, case_insensitive: bool = False
    ) -> Iterator[str]:
        raise NotImplementedError(self.search_files)
class ContentsFileNotFound(Exception):
    """The contents file was not found."""


def read_contents_file(f):
    for line in f:
        (path, rest) = line.rsplit(maxsplit=1)
        yield path, rest
def contents_urls_from_sources_entry(source, arches, load_url):
    if source.invalid or source.disabled:
        return
    if source.type == "deb-src":
        return
    if source.type != "deb":
        logging.warning("Invalid line in sources: %r", source)
        return
    base_url = source.uri.rstrip("/")
    name = source.dist.rstrip("/")
    components = source.comps
    if components:
        dists_url = base_url + "/dists"
    else:
        dists_url = base_url
    inrelease_url = "%s/%s/InRelease" % (dists_url, name)
    try:
        response = load_url(inrelease_url)
    except FileNotFoundError:
        release_url = "%s/%s/Release" % (dists_url, name)
        try:
            response = load_url(release_url)
        except FileNotFoundError as e:
            logging.warning(
                "Unable to download %s or %s: %s", inrelease_url, release_url, e
            )
            return
    existing_names = {}
    release = Release(response.read())
    for hn in ["MD5Sum", "SHA1Sum", "SHA256Sum"]:
        for entry in release.get(hn, []):
            existing_names[os.path.splitext(entry["name"])[0]] = entry["name"]
    contents_files = set()
    if components:
        for component in components:
            for arch in arches:
                contents_files.add("%s/Contents-%s" % (component, arch))
    else:
        for arch in arches:
            contents_files.add("Contents-%s" % (arch,))
    for fn in contents_files:
        if fn in existing_names:
            url = "%s/%s/%s" % (dists_url, name, fn)
            yield url
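The URL construction above can be checked in isolation: for a `deb` entry with components, the Contents indices live under `dists/<suite>/<component>/Contents-<arch>`, while flat repositories keep them next to the suite directory. A minimal sketch of just that layout logic (plain strings, no apt involved, and ignoring the Release-file check that the real code performs; the mirror URL is only an example):

```python
def contents_urls(base_url: str, suite: str, components: list, arches: list) -> set:
    # Same layout logic as above: with components the suite lives under
    # /dists, and each component/arch pair gets its own Contents file.
    base_url = base_url.rstrip("/")
    dists_url = base_url + "/dists" if components else base_url
    names = (
        {"%s/Contents-%s" % (c, a) for c in components for a in arches}
        if components
        else {"Contents-%s" % a for a in arches}
    )
    return {"%s/%s/%s" % (dists_url, suite, n) for n in names}


urls = contents_urls(
    "http://deb.debian.org/debian", "sid", ["main"], ["amd64", "all"]
)
print(sorted(urls))
```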
def contents_urls_from_sourceslist(sl, arch, load_url):
    # TODO(jelmer): Verify signatures, etc.
    arches = [arch, "all"]
    for source in sl.list:
        yield from contents_urls_from_sources_entry(source, arches, load_url)
def _unwrap(f, ext):
    if ext == ".gz":
        import gzip

        return gzip.GzipFile(fileobj=f)
    elif ext == ".xz":
        import lzma
        from io import BytesIO

        f = BytesIO(lzma.decompress(f.read()))
    return f
def load_direct_url(url):
    from urllib.error import HTTPError
    from urllib.request import urlopen, Request

    for ext in [".xz", ".gz", ""]:
        try:
            request = Request(url + ext, headers={"User-Agent": USER_AGENT})
            response = urlopen(request)
        except HTTPError as e:
            if e.status == 404:
                continue
            raise
        break
    else:
        raise FileNotFoundError(url)
    return _unwrap(response, ext)
def load_url_with_cache(url, cache_dirs):
    for cache_dir in cache_dirs:
        try:
            return load_apt_cache_file(url, cache_dir)
        except FileNotFoundError:
            pass
    return load_direct_url(url)
def load_apt_cache_file(url, cache_dir):
    fn = apt_pkg.uri_to_filename(url)
    for ext in [".xz", ".gz", ".lz4", ""]:
        p = os.path.join(cache_dir, fn + ext)
        if not os.path.exists(p):
            continue
        # return os.popen('/usr/lib/apt/apt-helper cat-file %s' % p)
        logging.debug("Loading cached contents file %s", p)
        if ext == ".lz4":
            import lz4.frame

            return lz4.frame.open(p, mode="rb")
        return _unwrap(open(p, "rb"), ext)
    raise FileNotFoundError(url)
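`apt_pkg.uri_to_filename` maps a download URL to the flattened file name apt uses under `/var/lib/apt/lists`. A rough, hedged approximation of that mapping (it skips the percent-escaping apt applies to some characters, so it is not byte-for-byte identical to the real function):

```python
from urllib.parse import urlparse


def approx_uri_to_filename(url: str) -> str:
    # Rough approximation of apt_pkg.uri_to_filename: drop the scheme,
    # then flatten the remaining host and path with "_" separators.
    parsed = urlparse(url)
    return (parsed.netloc + parsed.path).replace("/", "_")


print(approx_uri_to_filename(
    "http://deb.debian.org/debian/dists/sid/main/Contents-amd64"
))  # deb.debian.org_debian_dists_sid_main_Contents-amd64
```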
class AptCachedContentsFileSearcher(FileSearcher):
    def __init__(self):
        self._db = {}

    @classmethod
    def from_session(cls, session):
        logging.info("Loading apt contents information")
        self = cls()
        self.load_from_session(session)
        return self

    def load_local(self):
        # TODO(jelmer): what about sources.list.d?
        from aptsources.sourceslist import SourcesList

        sl = SourcesList()
        sl.load("/etc/apt/sources.list")
        from .build import get_build_architecture

        cache_dirs = set(["/var/lib/apt/lists"])

        def load_url(url):
            return load_url_with_cache(url, cache_dirs)

        urls = list(
            contents_urls_from_sourceslist(sl, get_build_architecture(), load_url)
        )
        self._load_urls(urls, cache_dirs, load_url)

    def load_from_session(self, session):
        # TODO(jelmer): what about sources.list.d?
        from aptsources.sourceslist import SourcesList

        sl = SourcesList()
        sl.load(os.path.join(session.location, "etc/apt/sources.list"))
        from .build import get_build_architecture

        cache_dirs = set(
            [
                os.path.join(session.location, "var/lib/apt/lists"),
                "/var/lib/apt/lists",
            ]
        )

        def load_url(url):
            return load_url_with_cache(url, cache_dirs)

        urls = list(
            contents_urls_from_sourceslist(sl, get_build_architecture(), load_url)
        )
        self._load_urls(urls, cache_dirs, load_url)

    def _load_urls(self, urls, cache_dirs, load_url):
        for url in urls:
            try:
                f = load_url(url)
                self.load_file(f, url)
            except ContentsFileNotFound:
                logging.warning("Unable to fetch contents file %s", url)

    def __setitem__(self, path, package):
        self._db[path] = package

    def search_files(self, path, regex=False, case_insensitive=False):
        path = path.lstrip("/").encode("utf-8", "surrogateescape")
        if case_insensitive and not regex:
            regex = True
            path = re.escape(path)
        if regex:
            flags = 0
            if case_insensitive:
                flags |= re.I
            c = re.compile(path, flags=flags)
            ret = []
            for p, rest in self._db.items():
                if c.match(p):
                    pkg = rest.split(b"/")[-1]
                    ret.append((p, pkg.decode("utf-8")))
            for p, pkg in sorted(ret):
                yield pkg
        else:
            try:
                yield self._db[path].split(b"/")[-1].decode("utf-8")
            except KeyError:
                pass

    def load_file(self, f, url):
        start_time = datetime.now()
        for path, rest in read_contents_file(f.readlines()):
            self[path] = rest
        logging.debug("Read %s in %s", url, datetime.now() - start_time)
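The parsing that `read_contents_file` and `search_files` perform together is compact enough to show end to end: a Contents line holds a path and a final `section/package` column, and the package name is the basename of that column. A self-contained sketch (the sample line is illustrative, not taken from a real Contents file):

```python
def parse_contents_line(line: bytes):
    # Same split as read_contents_file/search_files above: the last
    # whitespace-separated column is "section/package"; the basename of
    # that column is the binary package name.
    path, rest = line.rsplit(maxsplit=1)
    return path, rest.split(b"/")[-1].decode("utf-8")


line = b"usr/bin/mvn                                   devel/maven\n"
print(parse_contents_line(line))  # (b'usr/bin/mvn', 'maven')
```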
class GeneratedFileSearcher(FileSearcher):
    def __init__(self, db):
        self._db = db

    @classmethod
    def from_path(cls, path):
        self = cls({})
        self.load_from_path(path)
        return self

    def load_from_path(self, path):
        with open(path, "r") as f:
            for line in f:
                (path, pkg) = line.strip().split(None, 1)
                self._db[path] = pkg

    def search_files(
        self, path: str, regex: bool = False, case_insensitive: bool = False
    ) -> Iterator[str]:
        for p, pkg in sorted(self._db.items()):
            if regex:
                flags = 0
                if case_insensitive:
                    flags |= re.I
                if re.match(path, p, flags=flags):
                    yield pkg
            elif case_insensitive:
                if path.lower() == p.lower():
                    yield pkg
            else:
                if path == p:
                    yield pkg
# TODO(jelmer): read from a file
GENERATED_FILE_SEARCHER = GeneratedFileSearcher(
    {
        "/etc/locale.gen": "locales",
        # Alternative
        "/usr/bin/rst2html": "python3-docutils",
        # aclocal is a symlink to aclocal-1.XY
        "/usr/bin/aclocal": "automake",
        "/usr/bin/automake": "automake",
        # maven lives in /usr/share
        "/usr/bin/mvn": "maven",
    }
)
def get_packages_for_paths(
    paths: List[str],
    searchers: List[FileSearcher],
    regex: bool = False,
    case_insensitive: bool = False,
) -> List[str]:
    candidates: List[str] = list()
    for path in paths:
        for searcher in searchers:
            for pkg in searcher.search_files(
                path, regex=regex, case_insensitive=case_insensitive
            ):
                if pkg not in candidates:
                    candidates.append(pkg)
    return candidates
def main(argv):
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("path", help="Path to search for.", type=str, nargs="*")
    parser.add_argument("--regex", "-x", help="Search for regex.", action="store_true")
    parser.add_argument("--debug", action="store_true")
    args = parser.parse_args()

    if args.debug:
        logging.basicConfig(level=logging.DEBUG)
    else:
        logging.basicConfig(level=logging.INFO)

    main_searcher = AptCachedContentsFileSearcher()
    main_searcher.load_local()
    searchers = [main_searcher, GENERATED_FILE_SEARCHER]

    packages = get_packages_for_paths(args.path, searchers=searchers, regex=args.regex)
    for package in packages:
        print(package)


if __name__ == "__main__":
    import sys

    sys.exit(main(sys.argv))

View file

@@ -19,8 +19,10 @@ __all__ = [
     "build_incrementally",
 ]

+from functools import partial
 import logging
 import os
+import shutil
 import sys
 from typing import List, Set, Optional, Type
@@ -30,8 +32,8 @@ from debian.deb822 import (
 )

 from breezy.commit import PointlessCommit
-from breezy.mutabletree import MutableTree
 from breezy.tree import Tree
+from debmutate.changelog import ChangelogEditor
 from debmutate.control import (
     ensure_relation,
     ControlEditor,
@@ -49,12 +51,47 @@ from debmutate.reformatting import (

 try:
     from breezy.workspace import reset_tree
-except ImportError:
+except ImportError:  # breezy < 3.2

-    from lintian_brush import reset_tree
+    def delete_items(deletables, dry_run=False):
+        """Delete files in the deletables iterable"""
+        import errno
+        import shutil
+
+        def onerror(function, path, excinfo):
+            """Show warning for errors seen by rmtree."""
+            # Handle only permission error while removing files.
+            # Other errors are re-raised.
+            if function is not os.remove or excinfo[1].errno != errno.EACCES:
+                raise
+            logging.warning("unable to remove %s" % path)
+
+        for path, subp in deletables:
+            if os.path.isdir(path):
+                shutil.rmtree(path, onerror=onerror)
+            else:
+                try:
+                    os.unlink(path)
+                except OSError as e:
+                    # We handle only permission error here
+                    if e.errno != errno.EACCES:
+                        raise e
+                    logging.warning('unable to remove "%s": %s.', path, e.strerror)
+
+    def reset_tree(local_tree, subpath=""):
+        from breezy.transform import revert
+        from breezy.clean_tree import iter_deletables
+
+        revert(
+            local_tree,
+            local_tree.branch.basis_tree(),
+            [subpath] if subpath not in (".", "") else None,
+        )
+        deletables = list(
+            iter_deletables(local_tree, unknown=True, ignored=False, detritus=False)
+        )
+        delete_items(deletables)

-from lintian_brush.changelog import (
-    add_changelog_entry,
-)

 from debmutate._rules import (
     dh_invoke_add_with,
@@ -71,8 +108,6 @@ from buildlog_consultant.common import (
     MissingAutomakeInput,
     MissingConfigure,
     NeedPgBuildExtUpdateControl,
-    MissingPythonModule,
-    MissingPythonDistribution,
     MissingPerlFile,
 )
 from buildlog_consultant.sbuild import (
@@ -80,10 +115,9 @@ from buildlog_consultant.sbuild import (
 )

 from ..buildlog import problem_to_upstream_requirement
-from ..fix_build import BuildFixer, resolve_error, DependencyContext
+from ..fix_build import BuildFixer, resolve_error
 from ..resolver.apt import (
     AptRequirement,
-    get_package_for_python_module,
 )
 from .build import attempt_build, DEFAULT_BUILDER
@@ -98,10 +132,46 @@ class CircularDependency(Exception):
         self.package = package


-class PackageDependencyFixer(BuildFixer):
+class DebianPackagingContext(object):
+    def __init__(
+        self, tree, subpath, committer, update_changelog, commit_reporter=None
+    ):
+        self.tree = tree
+        self.subpath = subpath
+        self.committer = committer
+        self.update_changelog = update_changelog
+        self.commit_reporter = commit_reporter

-    def __init__(self, apt_resolver):
+    def abspath(self, *parts):
+        return self.tree.abspath(os.path.join(self.subpath, *parts))
+
+    def commit(self, summary: str, update_changelog: Optional[bool] = None) -> bool:
+        if update_changelog is None:
+            update_changelog = self.update_changelog
+        with self.tree.lock_write():
+            try:
+                if update_changelog:
+                    cl_path = self.abspath("debian/changelog")
+                    with ChangelogEditor(cl_path) as editor:
+                        editor.add_entry([summary])
+                    debcommit(self.tree, committer=self.committer, subpath=self.subpath)
+                else:
+                    self.tree.commit(
+                        message=summary,
+                        committer=self.committer,
+                        specific_files=[self.subpath],
+                        reporter=self.commit_reporter,
+                    )
+            except PointlessCommit:
+                return False
+            else:
+                return True
+
+
+class PackageDependencyFixer(BuildFixer):
+    def __init__(self, context, apt_resolver):
         self.apt_resolver = apt_resolver
+        self.context = context

     def __repr__(self):
         return "%s(%r)" % (type(self).__name__, self.apt_resolver)
@@ -113,7 +183,7 @@ class PackageDependencyFixer(BuildFixer):
         req = problem_to_upstream_requirement(error)
         return req is not None

-    def fix(self, error, context):
+    def fix(self, error, phase):
         reqs = problem_to_upstream_requirement(error)
         if reqs is None:
             return False
@@ -123,82 +193,29 @@ class PackageDependencyFixer(BuildFixer):
         changed = False
         for req in reqs:
-            package = self.apt_resolver.resolve(req)
-            if package is None:
+            apt_req = self.apt_resolver.resolve(req)
+            if apt_req is None:
                 return False
-            if context.phase[0] == "autopkgtest":
-                return add_test_dependency(
-                    context.tree,
-                    context.phase[1],
-                    package,
-                    committer=context.committer,
-                    subpath=context.subpath,
-                    update_changelog=context.update_changelog,
-                )
-            elif context.phase[0] == "build":
-                return add_build_dependency(
-                    context.tree,
-                    package,
-                    committer=context.committer,
-                    subpath=context.subpath,
-                    update_changelog=context.update_changelog,
-                )
-            else:
-                logging.warning('Unknown phase %r', context.phase)
-                return False
+            if add_dependency(self.context, phase, apt_req):
+                changed = True
         return changed


-class BuildDependencyContext(DependencyContext):
-    def __init__(
-        self, phase, tree, apt, subpath="", committer=None, update_changelog=True
-    ):
-        self.phase = phase
-        super(BuildDependencyContext, self).__init__(
-            tree, apt, subpath, committer, update_changelog
-        )
-
-    def add_dependency(self, requirement: AptRequirement):
-        return add_build_dependency(
-            self.tree,
-            requirement,
-            committer=self.committer,
-            subpath=self.subpath,
-            update_changelog=self.update_changelog,
-        )
-
-
-class AutopkgtestDependencyContext(DependencyContext):
-    def __init__(
-        self, phase, tree, apt, subpath="", committer=None, update_changelog=True
-    ):
-        self.phase = phase
-        super(AutopkgtestDependencyContext, self).__init__(
-            tree, apt, subpath, committer, update_changelog
-        )
-
-    def add_dependency(self, requirement):
-        return add_test_dependency(
-            self.tree,
-            self.testname,
-            requirement,
-            committer=self.committer,
-            subpath=self.subpath,
-            update_changelog=self.update_changelog,
-        )
-
-
-def add_build_dependency(
-    tree: Tree,
-    requirement: AptRequirement,
-    committer: Optional[str] = None,
-    subpath: str = "",
-    update_changelog: bool = True,
-):
+def add_dependency(context, phase, requirement: AptRequirement):
+    if phase[0] == "autopkgtest":
+        return add_test_dependency(context, phase[1], requirement)
+    elif phase[0] == "build":
+        return add_build_dependency(context, requirement)
+    else:
+        logging.warning("Unknown phase %r", phase)
+        return False
+
+
+def add_build_dependency(context, requirement: AptRequirement):
     if not isinstance(requirement, AptRequirement):
         raise TypeError(requirement)

-    control_path = os.path.join(tree.abspath(subpath), "debian/control")
+    control_path = context.abspath("debian/control")
     try:
         with ControlEditor(path=control_path) as updater:
             for binary in updater.binaries:
@@ -219,27 +236,17 @@ def add_build_dependency(
         return False

     logging.info("Adding build dependency: %s", desc)
-    return commit_debian_changes(
-        tree,
-        subpath,
-        "Add missing build dependency on %s." % desc,
-        committer=committer,
-        update_changelog=update_changelog,
-    )
+    return context.commit("Add missing build dependency on %s." % desc)


-def add_test_dependency(
-    tree,
-    testname,
-    requirement,
-    committer=None,
-    subpath="",
-    update_changelog=True,
-):
+def add_test_dependency(context, testname, requirement):
     if not isinstance(requirement, AptRequirement):
         raise TypeError(requirement)

-    tests_control_path = os.path.join(tree.abspath(subpath), "debian/tests/control")
+    tests_control_path = context.abspath("debian/tests/control")
+    # TODO(jelmer): If requirement is for one of our binary packages
+    # but "@" is already present then don't do anything.
     try:
         with Deb822Editor(path=tests_control_path) as updater:
@@ -265,176 +272,59 @@ def add_test_dependency(
     desc = requirement.pkg_relation_str()

     logging.info("Adding dependency to test %s: %s", testname, desc)
-    return commit_debian_changes(
-        tree,
-        subpath,
+    return context.commit(
         "Add missing dependency for test %s on %s." % (testname, desc),
-        update_changelog=update_changelog,
     )


-def commit_debian_changes(
-    tree: MutableTree,
-    subpath: str,
-    summary: str,
-    committer: Optional[str] = None,
-    update_changelog: bool = True,
-) -> bool:
-    with tree.lock_write():
-        try:
-            if update_changelog:
-                add_changelog_entry(
-                    tree, os.path.join(subpath, "debian/changelog"), [summary]
-                )
-                debcommit(tree, committer=committer, subpath=subpath)
-            else:
-                tree.commit(
-                    message=summary, committer=committer, specific_files=[subpath]
-                )
-        except PointlessCommit:
-            return False
-        else:
-            return True
-
-
-def targeted_python_versions(tree: Tree) -> Set[str]:
-    with tree.get_file("debian/control") as f:
+def targeted_python_versions(tree: Tree, subpath: str) -> List[str]:
+    with tree.get_file(os.path.join(subpath, "debian/control")) as f:
         control = Deb822(f)
     build_depends = PkgRelation.parse_relations(control.get("Build-Depends", ""))
     all_build_deps: Set[str] = set()
     for or_deps in build_depends:
         all_build_deps.update(or_dep["name"] for or_dep in or_deps)
-    targeted = set()
-    if any(x.startswith("pypy") for x in all_build_deps):
-        targeted.add("pypy")
-    if any(x.startswith("python-") for x in all_build_deps):
-        targeted.add("cpython2")
+    targeted = []
     if any(x.startswith("python3-") for x in all_build_deps):
-        targeted.add("cpython3")
+        targeted.append("python3")
+    if any(x.startswith("pypy") for x in all_build_deps):
+        targeted.append("pypy")
+    if any(x.startswith("python-") for x in all_build_deps):
+        targeted.append("python")
     return targeted


-def fix_missing_python_distribution(error, context):  # noqa: C901
-    targeted = targeted_python_versions(context.tree)
-    default = not targeted
-
-    pypy_pkg = context.apt.get_package_for_paths(
-        ["/usr/lib/pypy/dist-packages/%s-.*.egg-info" % error.distribution], regex=True
-    )
-    if pypy_pkg is None:
-        pypy_pkg = "pypy-%s" % error.distribution
-        if not context.apt.package_exists(pypy_pkg):
-            pypy_pkg = None
-
-    py2_pkg = context.apt.get_package_for_paths(
-        ["/usr/lib/python2\\.[0-9]/dist-packages/%s-.*.egg-info" % error.distribution],
-        regex=True,
-    )
-    if py2_pkg is None:
-        py2_pkg = "python-%s" % error.distribution
-        if not context.apt.package_exists(py2_pkg):
-            py2_pkg = None
-
-    py3_pkg = context.apt.get_package_for_paths(
-        ["/usr/lib/python3/dist-packages/%s-.*.egg-info" % error.distribution],
-        regex=True,
-    )
-    if py3_pkg is None:
-        py3_pkg = "python3-%s" % error.distribution
-        if not context.apt.package_exists(py3_pkg):
-            py3_pkg = None
-
-    extra_build_deps = []
-    if error.python_version == 2:
-        if "pypy" in targeted:
-            if not pypy_pkg:
-                logging.warning("no pypy package found for %s", error.module)
-            else:
-                extra_build_deps.append(pypy_pkg)
-        if "cpython2" in targeted or default:
-            if not py2_pkg:
-                logging.warning("no python 2 package found for %s", error.module)
-                return False
-            extra_build_deps.append(py2_pkg)
-    elif error.python_version == 3:
-        if not py3_pkg:
-            logging.warning("no python 3 package found for %s", error.module)
-            return False
-        extra_build_deps.append(py3_pkg)
-    else:
-        if py3_pkg and ("cpython3" in targeted or default):
-            extra_build_deps.append(py3_pkg)
-        if py2_pkg and ("cpython2" in targeted or default):
-            extra_build_deps.append(py2_pkg)
-        if pypy_pkg and "pypy" in targeted:
-            extra_build_deps.append(pypy_pkg)
-
-    if not extra_build_deps:
+def python_tie_breaker(tree, subpath, reqs):
+    targeted = targeted_python_versions(tree, subpath)
+    if not targeted:
+        return None
+
+    def same(pkg, python_version):
+        if pkg.startswith(python_version + "-"):
+            return True
+        if pkg.startswith("lib%s-" % python_version):
+            return True
         return False
-    for dep_pkg in extra_build_deps:
-        assert dep_pkg is not None
-        if not context.add_dependency(dep_pkg):
-            return False
+
+    for python_version in targeted:
+        for req in reqs:
+            if any(same(name, python_version) for name in req.package_names()):
+                logging.info(
+                    "Breaking tie between %r to %r, since package already "
+                    "has %r build-dependencies",
+                    [str(req) for req in reqs],
+                    str(req),
+                    python_version,
+                )
+                return req
+    return None
+
+
+def retry_apt_failure(error, phase, apt, context):
     return True


-def fix_missing_python_module(error, context):
-    if getattr(context, "tree", None) is not None:
-        targeted = targeted_python_versions(context.tree)
-    else:
-        targeted = set()
-    default = not targeted
-
-    if error.minimum_version:
-        specs = [(">=", error.minimum_version)]
-    else:
-        specs = []
-
-    pypy_pkg = get_package_for_python_module(context.apt, error.module, "pypy", specs)
-    py2_pkg = get_package_for_python_module(context.apt, error.module, "python2", specs)
-    py3_pkg = get_package_for_python_module(context.apt, error.module, "python3", specs)
-
-    extra_build_deps = []
-    if error.python_version == 2:
-        if "pypy" in targeted:
-            if not pypy_pkg:
-                logging.warning("no pypy package found for %s", error.module)
-            else:
-                extra_build_deps.append(pypy_pkg)
-        if "cpython2" in targeted or default:
-            if not py2_pkg:
-                logging.warning("no python 2 package found for %s", error.module)
-                return False
-            extra_build_deps.append(py2_pkg)
-    elif error.python_version == 3:
-        if not py3_pkg:
-            logging.warning("no python 3 package found for %s", error.module)
-            return False
-        extra_build_deps.append(py3_pkg)
-    else:
-        if py3_pkg and ("cpython3" in targeted or default):
-            extra_build_deps.append(py3_pkg)
-        if py2_pkg and ("cpython2" in targeted or default):
-            extra_build_deps.append(py2_pkg)
-        if pypy_pkg and "pypy" in targeted:
-            extra_build_deps.append(pypy_pkg)
-
-    if not extra_build_deps:
-        return False
-
-    for dep_pkg in extra_build_deps:
-        assert dep_pkg is not None
-        if not context.add_dependency(dep_pkg):
-            return False
-    return True
-
-
-def retry_apt_failure(error, context):
-    return True
-
-
-def enable_dh_autoreconf(context):
+def enable_dh_autoreconf(context, phase):
     # Debhelper >= 10 depends on dh-autoreconf and enables autoreconf by
     # default.
     debhelper_compat_version = get_debhelper_compat_level(context.tree.abspath("."))
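The `python_tie_breaker` added in the hunk above prefers requirements that match the Python stack the package already build-depends on. A self-contained sketch of that preference, outside the diff (the requirement class is a hypothetical stand-in for `AptRequirement`):

```python
from typing import List, Optional


class FakeRequirement:
    """Hypothetical stand-in exposing package_names(), like AptRequirement."""

    def __init__(self, *names: str):
        self.names = list(names)

    def package_names(self) -> List[str]:
        return self.names


def pick_for_stack(
    targeted: List[str], reqs: List[FakeRequirement]
) -> Optional[FakeRequirement]:
    # Mirrors python_tie_breaker: a requirement "matches" a stack when one
    # of its packages starts with "<stack>-" or "lib<stack>-".
    def same(pkg: str, python_version: str) -> bool:
        return pkg.startswith(python_version + "-") or pkg.startswith(
            "lib%s-" % python_version
        )

    for python_version in targeted:
        for req in reqs:
            if any(same(name, python_version) for name in req.package_names()):
                return req
    return None


reqs = [FakeRequirement("python-yaml"), FakeRequirement("python3-yaml")]
chosen = pick_for_stack(["python3"], reqs)
print(chosen.package_names())  # ['python3-yaml']
```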
@@ -448,28 +338,30 @@ def enable_dh_autoreconf(context):
             return dh_invoke_add_with(line, b"autoreconf")

     if update_rules(command_line_cb=add_with_autoreconf):
-        return context.add_dependency(AptRequirement.simple("dh-autoreconf"))
+        return add_dependency(
+            context, phase, AptRequirement.simple("dh-autoreconf")
+        )
     return False


-def fix_missing_configure(error, context):
+def fix_missing_configure(error, phase, context):
     if not context.tree.has_filename("configure.ac") and not context.tree.has_filename(
         "configure.in"
     ):
         return False

-    return enable_dh_autoreconf(context)
+    return enable_dh_autoreconf(context, phase)


-def fix_missing_automake_input(error, context):
+def fix_missing_automake_input(error, phase, context):
     # TODO(jelmer): If it's ./NEWS, ./AUTHORS or ./README that's missing, then
     # try to set 'export AUTOMAKE = automake --foreign' in debian/rules.
     # https://salsa.debian.org/jelmer/debian-janitor/issues/88
-    return enable_dh_autoreconf(context)
+    return enable_dh_autoreconf(context, phase)


-def fix_missing_config_status_input(error, context):
+def fix_missing_config_status_input(error, phase, context):
     autogen_path = "autogen.sh"
     rules_path = "debian/rules"
     if context.subpath not in (".", ""):
@@ -488,38 +380,39 @@ def fix_missing_config_status_input(error, context):
     if not update_rules(makefile_cb=add_autogen, path=rules_path):
         return False

-    if context.update_changelog:
-        commit_debian_changes(
-            context.tree,
-            context.subpath,
-            "Run autogen.sh during build.",
-            committer=context.committer,
-            update_changelog=context.update_changelog,
-        )
-    return True
+    return context.commit("Run autogen.sh during build.")


 class PgBuildExtOutOfDateControlFixer(BuildFixer):
-    def __init__(self, session):
+    def __init__(self, packaging_context, session, apt):
         self.session = session
+        self.context = packaging_context
+        self.apt = apt

     def can_fix(self, problem):
         return isinstance(problem, NeedPgBuildExtUpdateControl)

-    def _fix(self, error, context):
+    def __repr__(self):
+        return "%s()" % (type(self).__name__,)
+
+    def _fix(self, error, phase):
         logging.info("Running 'pg_buildext updatecontrol'")
+        self.apt.install(['postgresql-common'])
+        external_dir, internal_dir = self.session.setup_from_vcs(
+            self.context.tree, include_controldir=None,
+            subdir=self.context.subpath)
+        self.session.chdir(internal_dir)
         self.session.check_call(["pg_buildext", "updatecontrol"])
-        return commit_debian_changes(
-            context.tree,
-            context.subpath,
-            "Run 'pgbuildext updatecontrol'.",
-            committer=context.committer,
-            update_changelog=False,
-        )
+        shutil.copy(
+            os.path.join(external_dir, error.generated_path),
+            self.context.abspath(error.generated_path)
+        )
+        return self.context.commit(
+            "Run 'pgbuildext updatecontrol'.", update_changelog=False
+        )


-def fix_missing_makefile_pl(error, context):
+def fix_missing_makefile_pl(error, phase, context):
     if (
         error.filename == "Makefile.PL"
         and not context.tree.has_filename("Makefile.PL")
@ -531,38 +424,76 @@ def fix_missing_makefile_pl(error, context):
class SimpleBuildFixer(BuildFixer): class SimpleBuildFixer(BuildFixer):
def __init__(self, problem_cls: Type[Problem], fn): def __init__(self, packaging_context, problem_cls: Type[Problem], fn):
self.context = packaging_context
self._problem_cls = problem_cls self._problem_cls = problem_cls
self._fn = fn self._fn = fn
def __repr__(self): def __repr__(self):
return "%s(%r, %r)" % (type(self).__name__, self._problem_cls, self._fn) return "%s(%s, %s)" % (
type(self).__name__,
self._problem_cls.__name__,
self._fn.__name__,
)
def can_fix(self, problem: Problem): def can_fix(self, problem: Problem):
return isinstance(problem, self._problem_cls) return isinstance(problem, self._problem_cls)
def _fix(self, problem: Problem, context): def _fix(self, problem: Problem, phase):
return self._fn(problem, context) return self._fn(problem, phase, self.context)
-def versioned_package_fixers(session):
+class DependencyBuildFixer(BuildFixer):
+    def __init__(self, packaging_context, apt_resolver, problem_cls: Type[Problem], fn):
+        self.context = packaging_context
+        self.apt_resolver = apt_resolver
+        self._problem_cls = problem_cls
+        self._fn = fn
+
+    def __repr__(self):
+        return "%s(%s, %s)" % (
+            type(self).__name__,
+            self._problem_cls.__name__,
+            self._fn.__name__,
+        )
+
+    def can_fix(self, problem: Problem):
+        return isinstance(problem, self._problem_cls)
+
+    def _fix(self, problem: Problem, phase):
+        return self._fn(problem, phase, self.apt_resolver, self.context)
+
+
+def versioned_package_fixers(session, packaging_context, apt):
     return [
-        PgBuildExtOutOfDateControlFixer(session),
-        SimpleBuildFixer(MissingConfigure, fix_missing_configure),
-        SimpleBuildFixer(MissingAutomakeInput, fix_missing_automake_input),
-        SimpleBuildFixer(MissingConfigStatusInput, fix_missing_config_status_input),
-        SimpleBuildFixer(MissingPerlFile, fix_missing_makefile_pl),
+        PgBuildExtOutOfDateControlFixer(packaging_context, session, apt),
+        SimpleBuildFixer(packaging_context, MissingConfigure, fix_missing_configure),
+        SimpleBuildFixer(
+            packaging_context, MissingAutomakeInput, fix_missing_automake_input
+        ),
+        SimpleBuildFixer(
+            packaging_context, MissingConfigStatusInput, fix_missing_config_status_input
+        ),
+        SimpleBuildFixer(packaging_context, MissingPerlFile, fix_missing_makefile_pl),
     ]
-def apt_fixers(apt) -> List[BuildFixer]:
+def apt_fixers(apt, packaging_context) -> List[BuildFixer]:
     from ..resolver.apt import AptResolver
-    resolver = AptResolver(apt)
+    from .udd import popcon_tie_breaker
+    from .build_deps import BuildDependencyTieBreaker
+
+    apt_tie_breakers = [
+        partial(python_tie_breaker, packaging_context.tree, packaging_context.subpath),
+        BuildDependencyTieBreaker.from_session(apt.session),
+        popcon_tie_breaker,
+    ]
+    resolver = AptResolver(apt, apt_tie_breakers)
     return [
-        SimpleBuildFixer(MissingPythonModule, fix_missing_python_module),
-        SimpleBuildFixer(MissingPythonDistribution, fix_missing_python_distribution),
-        SimpleBuildFixer(AptFetchFailure, retry_apt_failure),
-        PackageDependencyFixer(resolver),
+        DependencyBuildFixer(
+            packaging_context, apt, AptFetchFailure, retry_apt_failure
+        ),
+        PackageDependencyFixer(packaging_context, resolver),
     ]
@@ -581,7 +512,12 @@ def build_incrementally(
     update_changelog=True,
 ):
     fixed_errors = []
-    fixers = versioned_package_fixers(apt.session) + apt_fixers(apt)
+    packaging_context = DebianPackagingContext(
+        local_tree, subpath, committer, update_changelog
+    )
+    fixers = versioned_package_fixers(apt.session, packaging_context, apt) + apt_fixers(
+        apt, packaging_context
+    )
     logging.info("Using fixers: %r", fixers)
     while True:
         try:
@@ -608,30 +544,9 @@ def build_incrementally(
             if max_iterations is not None and len(fixed_errors) > max_iterations:
                 logging.warning("Last fix did not address the issue. Giving up.")
                 raise
-            reset_tree(local_tree, local_tree.basis_tree(), subpath=subpath)
-            if e.phase[0] == "build":
-                context = BuildDependencyContext(
-                    e.phase,
-                    local_tree,
-                    apt,
-                    subpath=subpath,
-                    committer=committer,
-                    update_changelog=update_changelog,
-                )
-            elif e.phase[0] == "autopkgtest":
-                context = AutopkgtestDependencyContext(
-                    e.phase,
-                    local_tree,
-                    apt,
-                    subpath=subpath,
-                    committer=committer,
-                    update_changelog=update_changelog,
-                )
-            else:
-                logging.warning("unable to install for context %r", e.phase)
-                raise
+            reset_tree(local_tree, subpath=subpath)
             try:
-                if not resolve_error(e.error, context, fixers):
+                if not resolve_error(e.error, e.phase, fixers):
                     logging.warning("Failed to resolve error %r. Giving up.", e.error)
                     raise
             except GeneratedFile:
@@ -654,10 +569,9 @@ def build_incrementally(
                 os.path.join(output_directory, "build.log.%d" % i)
             ):
                 i += 1
-            os.rename(
-                os.path.join(output_directory, "build.log"),
-                os.path.join(output_directory, "build.log.%d" % i),
-            )
+            target_path = os.path.join(output_directory, "build.log.%d" % i)
+            os.rename(os.path.join(output_directory, "build.log"), target_path)
+            logging.debug("Storing build log at %s", target_path)


 def main(argv=None):
@@ -696,17 +610,23 @@ def main(argv=None):
         help="force updating of the changelog",
         default=None,
     )
+    parser.add_argument("--schroot", type=str, help="chroot to use.")
+    parser.add_argument("--verbose", action="store_true", help="Be verbose")
     args = parser.parse_args()

     from breezy.workingtree import WorkingTree
+    import breezy.git  # noqa: F401
+    import breezy.bzr  # noqa: F401

     from .apt import AptManager
     from ..session.plain import PlainSession
+    from ..session.schroot import SchrootSession
     import tempfile
     import contextlib

-    apt = AptManager(PlainSession())
-
-    logging.basicConfig(level=logging.INFO, format="%(message)s")
+    if args.verbose:
+        logging.basicConfig(level=logging.DEBUG, format="%(message)s")
+    else:
+        logging.basicConfig(level=logging.INFO, format="%(message)s")

     with contextlib.ExitStack() as es:
         if args.output_directory is None:
@@ -716,17 +636,43 @@ def main(argv=None):
             output_directory = args.output_directory

         tree = WorkingTree.open(".")
-        build_incrementally(
-            tree,
-            apt,
-            args.suffix,
-            args.suite,
-            output_directory,
-            args.build_command,
-            None,
-            committer=args.committer,
-            update_changelog=args.update_changelog,
-        )
+        if args.schroot:
+            session = SchrootSession(args.schroot)
+        else:
+            session = PlainSession()
+
+        es.enter_context(session)
+
+        apt = AptManager(session)
+
+        try:
+            (changes_filename, cl_version) = build_incrementally(
+                tree,
+                apt,
+                args.suffix,
+                args.suite,
+                output_directory,
+                args.build_command,
+                None,
+                committer=args.committer,
+                update_changelog=args.update_changelog,
+            )
+        except SbuildFailure as e:
+            if e.phase is None:
+                phase = "unknown phase"
+            elif len(e.phase) == 1:
+                phase = e.phase[0]
+            else:
+                phase = "%s (%s)" % (e.phase[0], e.phase[1])
+            if e.error:
+                logging.fatal("Error during %s: %s", phase, e.error)
+            else:
+                logging.fatal("Error during %s: %s", phase, e.description)
+            return 1
+        logging.info(
+            'Built %s - changes file at %s.',
+            cl_version,
+            os.path.join(output_directory, changes_filename))


 if __name__ == "__main__":
ognibuild/debian/udd.py (new file, 60 lines)

@@ -0,0 +1,60 @@
#!/usr/bin/python3
# Copyright (C) 2021 Jelmer Vernooij
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
"""Support for accessing UDD."""
import logging
class UDD(object):
    def connect(self):
        import psycopg2

        self._conn = psycopg2.connect(
            database="udd",
            user="udd-mirror",
            password="udd-mirror",
            port=5432,
            host="udd-mirror.debian.net",
        )

    def get_most_popular(self, packages):
        cursor = self._conn.cursor()
        cursor.execute(
            "SELECT package FROM popcon WHERE package IN %s ORDER BY insts DESC LIMIT 1",
            (tuple(packages),),
        )
        return cursor.fetchone()[0]


def popcon_tie_breaker(candidates):
    # TODO(jelmer): Pick package based on what appears most commonly in
    # build-depends{-indep,-arch}
    try:
        from .udd import UDD
    except ModuleNotFoundError:
        logging.warning("Unable to import UDD, not ranking by popcon")
        return sorted(candidates, key=len)[0]
    udd = UDD()
    udd.connect()
    names = {list(c.package_names())[0]: c for c in candidates}
    winner = udd.get_most_popular(list(names.keys()))
    if winner is None:
        logging.warning("No relevant popcon information found, not ranking by popcon")
        return None
    logging.info("Picked winner using popcon")
    return names[winner]
@@ -15,14 +15,18 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

+__all__ = [
+    "UnidentifiedError",
+    "DetailedFailure",
+    "create_dist",
+    "create_dist_schroot",
+]
+
 import errno
 import logging
 import os
-import shutil
 import sys
-import tempfile
-import time
-from typing import Optional
+from typing import Optional, List

 from debian.deb822 import Deb822

@@ -34,93 +38,69 @@ from buildlog_consultant.common import (
 )

-from . import DetailedFailure
+from . import DetailedFailure, UnidentifiedError
+from .dist_catcher import DistNoTarball
 from .buildsystem import NoBuildToolsFound
+from .resolver import auto_resolver
+from .session import Session
 from .session.schroot import SchrootSession
-from .vcs import dupe_vcs_tree, export_vcs_tree
-SUPPORTED_DIST_EXTENSIONS = [
-    ".tar.gz",
-    ".tgz",
-    ".tar.bz2",
-    ".tar.xz",
-    ".tar.lzma",
-    ".tbz2",
-    ".tar",
-    ".zip",
-]
-
-
-def is_dist_file(fn):
-    for ext in SUPPORTED_DIST_EXTENSIONS:
-        if fn.endswith(ext):
-            return True
-    return False
-
-
-class DistNoTarball(Exception):
-    """Dist operation did not create a tarball."""
-
-
-def run_dist(session, buildsystems, resolver, fixers, quiet=False):
+def run_dist(session, buildsystems, resolver, fixers, target_directory, quiet=False):
     # Some things want to write to the user's home directory,
     # e.g. pip caches in ~/.cache
     session.create_home()

     for buildsystem in buildsystems:
-        buildsystem.dist(session, resolver, fixers, quiet=quiet)
-        return
+        filename = buildsystem.dist(
+            session, resolver, fixers, target_directory, quiet=quiet
+        )
+        return filename

     raise NoBuildToolsFound()
-class DistCatcher(object):
-    def __init__(self, directory):
-        self.export_directory = directory
-        self.files = []
-        self.existing_files = None
-        self.start_time = time.time()
-
-    def __enter__(self):
-        self.existing_files = os.listdir(self.export_directory)
-        return self
-
-    def find_files(self):
-        new_files = os.listdir(self.export_directory)
-        diff_files = set(new_files) - set(self.existing_files)
-        diff = set([n for n in diff_files if is_dist_file(n)])
-        if len(diff) == 1:
-            fn = diff.pop()
-            logging.info("Found tarball %s in package directory.", fn)
-            self.files.append(os.path.join(self.export_directory, fn))
-            return fn
-        if "dist" in diff_files:
-            for entry in os.scandir(os.path.join(self.export_directory, "dist")):
-                if is_dist_file(entry.name):
-                    logging.info("Found tarball %s in dist directory.", entry.name)
-                    self.files.append(entry.path)
-                    return entry.name
-            logging.info("No tarballs found in dist directory.")
-
-        parent_directory = os.path.dirname(self.export_directory)
-        diff = set(os.listdir(parent_directory)) - set([self.export_directory])
-        if len(diff) == 1:
-            fn = diff.pop()
-            logging.info("Found tarball %s in parent directory.", fn)
-            self.files.append(os.path.join(parent_directory, fn))
-            return fn
-
-        if "dist" in new_files:
-            for entry in os.scandir(os.path.join(self.export_directory, "dist")):
-                if is_dist_file(entry.name) and entry.stat().st_mtime > self.start_time:
-                    logging.info("Found tarball %s in dist directory.", entry.name)
-                    self.files.append(entry.path)
-                    return entry.name
-
-    def __exit__(self, exc_type, exc_val, exc_tb):
-        self.find_files()
-        return False
+def create_dist(
+    session: Session,
+    tree: Tree,
+    target_dir: str,
+    include_controldir: bool = True,
+    subdir: Optional[str] = None,
+    cleanup: bool = False,
+) -> Optional[str]:
+    from .buildsystem import detect_buildsystems
+    from .buildlog import InstallFixer
+    from .fix_build import BuildFixer
+    from .fixers import (
+        GitIdentityFixer,
+        SecretGpgKeyFixer,
+        UnexpandedAutoconfMacroFixer,
+    )
+
+    if subdir is None:
+        subdir = "package"
+    try:
+        export_directory, reldir = session.setup_from_vcs(
+            tree, include_controldir=include_controldir, subdir=subdir
+        )
+    except OSError as e:
+        if e.errno == errno.ENOSPC:
+            raise DetailedFailure(1, ["mkdtemp"], NoSpaceOnDevice())
+        raise
+
+    # TODO(jelmer): use scan_buildsystems to also look in subdirectories
+    buildsystems = list(detect_buildsystems(export_directory))
+    resolver = auto_resolver(session)
+    fixers: List[BuildFixer] = [UnexpandedAutoconfMacroFixer(session, resolver)]
+
+    fixers.append(InstallFixer(resolver))
+
+    if session.is_temporary:
+        # Only muck about with temporary sessions
+        fixers.extend([GitIdentityFixer(session), SecretGpgKeyFixer(session)])
+
+    session.chdir(reldir)
+    return run_dist(session, buildsystems, resolver, fixers, target_dir)


 def create_dist_schroot(
@@ -128,54 +108,24 @@ def create_dist_schroot(
     target_dir: str,
     chroot: str,
     packaging_tree: Optional[Tree] = None,
+    packaging_subpath: Optional[str] = None,
     include_controldir: bool = True,
     subdir: Optional[str] = None,
-) -> str:
-    from .buildsystem import detect_buildsystems
-    from .resolver.apt import AptResolver
-    from .buildlog import InstallFixer
-
-    if subdir is None:
-        subdir = "package"
+    cleanup: bool = False,
+) -> Optional[str]:
     with SchrootSession(chroot) as session:
         if packaging_tree is not None:
             from .debian import satisfy_build_deps

-            satisfy_build_deps(session, packaging_tree)
-
-        build_dir = os.path.join(session.location, "build")
-
-        try:
-            directory = tempfile.mkdtemp(dir=build_dir)
-        except OSError as e:
-            if e.errno == errno.ENOSPC:
-                raise DetailedFailure(1, ["mkdtemp"], NoSpaceOnDevice())
-        reldir = "/" + os.path.relpath(directory, session.location)
-
-        export_directory = os.path.join(directory, subdir)
-        if not include_controldir:
-            export_vcs_tree(tree, export_directory)
-        else:
-            dupe_vcs_tree(tree, export_directory)
-
-        buildsystems = list(detect_buildsystems(export_directory))
-        resolver = AptResolver.from_session(session)
-        fixers = [InstallFixer(resolver)]
-
-        with DistCatcher(export_directory) as dc:
-            oldcwd = os.getcwd()
-            os.chdir(export_directory)
-            try:
-                session.chdir(os.path.join(reldir, subdir))
-                run_dist(session, buildsystems, resolver, fixers)
-            finally:
-                os.chdir(oldcwd)
-
-        for path in dc.files:
-            shutil.copy(path, target_dir)
-            return os.path.join(target_dir, os.path.basename(path))
-
-        logging.info("No tarball created :(")
-        raise DistNoTarball()
+            satisfy_build_deps(session, packaging_tree, packaging_subpath)
+
+        return create_dist(
+            session,
+            tree,
+            target_dir,
+            include_controldir=include_controldir,
+            subdir=subdir,
+            cleanup=cleanup,
+        )


 if __name__ == "__main__":
@@ -205,6 +155,9 @@ if __name__ == "__main__":
         "--target-directory", type=str, default="..", help="Target directory"
     )
     parser.add_argument("--verbose", action="store_true", help="Be verbose")
+    parser.add_argument(
+        "--include-controldir", action="store_true", help="Clone rather than export."
+    )

     args = parser.parse_args()
@@ -231,10 +184,23 @@ if __name__ == "__main__":
             target_dir=os.path.abspath(args.target_directory),
             packaging_tree=packaging_tree,
             chroot=args.chroot,
+            include_controldir=args.include_controldir,
         )
-    except NoBuildToolsFound:
+    except (NoBuildToolsFound, NotImplementedError):
         logging.info("No build tools found, falling back to simple export.")
         export(tree, "dist.tar.gz", "tgz", None)
-    except NotImplementedError:
-        logging.info(
-            "Build system does not support dist tarball creation, "
-            "falling back to simple export."
-        )
-        export(tree, "dist.tar.gz", "tgz", None)
+    except UnidentifiedError as e:
+        logging.fatal("Unidentified error: %r", e.lines)
+    except DetailedFailure as e:
+        logging.fatal("Identified error during dist creation: %s", e.error)
+    except DistNoTarball:
+        logging.fatal("dist operation did not create a tarball")
     else:
-        print("Created %s" % ret)
+        logging.info("Created %s", ret)

     sys.exit(0)
ognibuild/dist_catcher.py (new file, 117 lines)

@@ -0,0 +1,117 @@
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import os
import logging
import shutil
import time
class DistNoTarball(Exception):
    """Dist operation did not create a tarball."""


SUPPORTED_DIST_EXTENSIONS = [
    ".tar.gz",
    ".tgz",
    ".tar.bz2",
    ".tar.xz",
    ".tar.lzma",
    ".tbz2",
    ".tar",
    ".zip",
]


def is_dist_file(fn):
    for ext in SUPPORTED_DIST_EXTENSIONS:
        if fn.endswith(ext):
            return True
    return False
class DistCatcher(object):
    def __init__(self, directories):
        self.directories = [os.path.abspath(d) for d in directories]
        self.files = []
        self.existing_files = None
        self.start_time = time.time()

    @classmethod
    def default(cls, directory):
        return cls(
            [os.path.join(directory, "dist"), directory, os.path.join(directory, "..")]
        )

    def __enter__(self):
        self.existing_files = {}
        for directory in self.directories:
            try:
                self.existing_files[directory] = {
                    entry.name: entry for entry in os.scandir(directory)
                }
            except FileNotFoundError:
                self.existing_files[directory] = {}
        return self

    def find_files(self):
        for directory in self.directories:
            old_files = self.existing_files[directory]
            possible_new = []
            possible_updated = []
            if not os.path.isdir(directory):
                continue
            for entry in os.scandir(directory):
                if not entry.is_file() or not is_dist_file(entry.name):
                    continue
                old_entry = old_files.get(entry.name)
                if not old_entry:
                    possible_new.append(entry)
                    continue
                if entry.stat().st_mtime > self.start_time:
                    possible_updated.append(entry)
                    continue
            if len(possible_new) == 1:
                entry = possible_new[0]
                logging.info("Found new tarball %s in %s.", entry.name, directory)
                self.files.append(entry.path)
                return entry.name
            elif len(possible_new) > 1:
                logging.warning(
                    "Found multiple tarballs %r in %s.", possible_new, directory
                )
                return

            if len(possible_updated) == 1:
                entry = possible_updated[0]
                logging.info("Found updated tarball %s in %s.", entry.name, directory)
                self.files.append(entry.path)
                return entry.name

    def __exit__(self, exc_type, exc_val, exc_tb):
        self.find_files()
        return False

    def copy_single(self, target_dir):
        for path in self.files:
            try:
                shutil.copy(path, target_dir)
            except shutil.SameFileError:
                pass
            return os.path.basename(path)
        logging.info("No tarball created :(")
        raise DistNoTarball()
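The core idea of `DistCatcher` above is snapshot-and-diff: record a directory's contents before the build, then report files that appeared afterwards and match `SUPPORTED_DIST_EXTENSIONS`. A self-contained sketch of that approach (the temporary directory and tarball name are illustrative only):

```python
import os
import tempfile

SUPPORTED_DIST_EXTENSIONS = [
    ".tar.gz", ".tgz", ".tar.bz2", ".tar.xz", ".tar.lzma",
    ".tbz2", ".tar", ".zip",
]


def is_dist_file(fn):
    # Same check as above: match any supported archive extension.
    return any(fn.endswith(ext) for ext in SUPPORTED_DIST_EXTENSIONS)


with tempfile.TemporaryDirectory() as directory:
    before = {entry.name for entry in os.scandir(directory)}
    # A real build step would create the tarball; simulate it here.
    open(os.path.join(directory, "pkg-1.0.tar.gz"), "w").close()
    open(os.path.join(directory, "notes.txt"), "w").close()
    new_dist_files = [
        entry.name
        for entry in os.scandir(directory)
        if entry.name not in before and is_dist_file(entry.name)
    ]
    print(new_dist_files)  # -> ['pkg-1.0.tar.gz']
```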
@@ -15,60 +15,45 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

+from functools import partial
 import logging
-from typing import List, Optional
+from typing import List, Tuple, Callable, Any

+from buildlog_consultant import Problem
 from buildlog_consultant.common import (
     find_build_failure_description,
+    MissingCommand,
 )

-from breezy.mutabletree import MutableTree

 from . import DetailedFailure, UnidentifiedError
-from .debian.apt import AptManager
 from .session import Session, run_with_tee
 class BuildFixer(object):
     """Build fixer."""

-    def can_fix(self, problem):
+    def can_fix(self, problem: Problem):
         raise NotImplementedError(self.can_fix)

-    def _fix(self, problem, context):
+    def _fix(self, problem: Problem, phase: Tuple[str, ...]):
         raise NotImplementedError(self._fix)

-    def fix(self, problem, context):
+    def fix(self, problem: Problem, phase: Tuple[str, ...]):
         if not self.can_fix(problem):
             return None
-        return self._fix(problem, context)
+        return self._fix(problem, phase)
-class DependencyContext(object):
-    def __init__(
-        self,
-        tree: MutableTree,
-        apt: AptManager,
-        subpath: str = "",
-        committer: Optional[str] = None,
-        update_changelog: bool = True,
-    ):
-        self.tree = tree
-        self.apt = apt
-        self.subpath = subpath
-        self.committer = committer
-        self.update_changelog = update_changelog
-
-    def add_dependency(self, package) -> bool:
-        raise NotImplementedError(self.add_dependency)
-
-
-def run_with_build_fixers(session: Session, args: List[str], fixers: List[BuildFixer]):
-    logging.info("Running %r", args)
-    fixed_errors = []
-    while True:
-        retcode, lines = run_with_tee(session, args)
-        if retcode == 0:
-            return
-        match, error = find_build_failure_description(lines)
-        if error is None:
-            if match:
+def run_detecting_problems(session: Session, args: List[str], **kwargs):
+    try:
+        retcode, contents = run_with_tee(session, args, **kwargs)
+    except FileNotFoundError:
+        error = MissingCommand(args[0])
+        retcode = 1
+    else:
+        if retcode == 0:
+            return contents
+        lines = "".join(contents).splitlines(False)
+        match, error = find_build_failure_description(lines)
+        if error is None:
+            if match:
@@ -77,24 +62,56 @@ def run_with_build_fixers(session: Session, args: List[str], fixers: List[BuildFixer]):
             else:
                 logging.warning("Build failed and unable to find cause. Giving up.")
             raise UnidentifiedError(retcode, args, lines, secondary=match)
-
-        logging.info("Identified error: %r", error)
-        if error in fixed_errors:
-            logging.warning(
-                "Failed to resolve error %r, it persisted. Giving up.", error
-            )
-            raise DetailedFailure(retcode, args, error)
-        if not resolve_error(
-            error,
-            None,
-            fixers=fixers,
-        ):
-            logging.warning("Failed to find resolution for error %r. Giving up.", error)
-            raise DetailedFailure(retcode, args, error)
-        fixed_errors.append(error)
+    raise DetailedFailure(retcode, args, error)
-def resolve_error(error, context, fixers):
+def iterate_with_build_fixers(fixers: List[BuildFixer], cb: Callable[[], Any]):
+    """Call cb() until there are no more DetailedFailures we can fix.
+
+    Args:
+      fixers: List of fixers to use to resolve issues
+    """
+    fixed_errors = []
+    while True:
+        to_resolve = []
+        try:
+            return cb()
+        except DetailedFailure as e:
+            to_resolve.append(e)
+        while to_resolve:
+            f = to_resolve.pop(-1)
+            logging.info("Identified error: %r", f.error)
+            if f.error in fixed_errors:
+                logging.warning(
+                    "Failed to resolve error %r, it persisted. Giving up.", f.error
+                )
+                raise f
+            try:
+                resolved = resolve_error(f.error, None, fixers=fixers)
+            except DetailedFailure as n:
+                logging.info("New error %r while resolving %r", n, f)
+                if n in to_resolve:
+                    raise
+                to_resolve.append(f)
+                to_resolve.append(n)
+            else:
+                if not resolved:
+                    logging.warning(
+                        "Failed to find resolution for error %r. Giving up.", f.error
+                    )
+                    raise f
+                fixed_errors.append(f.error)
+
+
+def run_with_build_fixers(
+    session: Session, args: List[str], fixers: List[BuildFixer], **kwargs
+):
+    return iterate_with_build_fixers(
+        fixers, partial(run_detecting_problems, session, args, **kwargs)
+    )
+
+
+def resolve_error(error, phase, fixers):
     relevant_fixers = []
     for fixer in fixers:
         if fixer.can_fix(error):
@@ -104,7 +121,7 @@ def resolve_error(error, context, fixers):
         return False
     for fixer in relevant_fixers:
         logging.info("Attempting to use fixer %s to address %r", fixer, error)
-        made_changes = fixer.fix(error, context)
+        made_changes = fixer.fix(error, phase)
         if made_changes:
             return True
     return False
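The control flow of `iterate_with_build_fixers` above reduces to a small retry loop: run the callback, fix each newly identified error at most once, and give up when an error persists or cannot be fixed. A stripped-down sketch, with a plain `RuntimeError` standing in for `DetailedFailure` and toy `build`/`fix_error` callbacks in place of the real session machinery:

```python
def iterate_with_fixers(cb, fix_error):
    # Retry cb() until it succeeds; each distinct error is fixed at
    # most once, so a persisting error aborts the loop.
    fixed_errors = []
    while True:
        try:
            return cb()
        except RuntimeError as e:
            error = str(e)
            if error in fixed_errors or not fix_error(error):
                raise
            fixed_errors.append(error)


missing = {"autoconf"}


def build():
    if missing:
        raise RuntimeError("missing command: %s" % next(iter(missing)))
    return "build ok"


def fix_error(error):
    missing.clear()  # pretend we installed the missing tool
    return True


print(iterate_with_fixers(build, fix_error))  # -> build ok
```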
ognibuild/fixers.py (new file, 103 lines)

@@ -0,0 +1,103 @@
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import subprocess
from typing import Tuple
from buildlog_consultant import Problem
from buildlog_consultant.common import (
MissingGitIdentity,
MissingSecretGpgKey,
MissingAutoconfMacro,
)
from ognibuild.requirements import AutoconfMacroRequirement
from ognibuild.resolver import UnsatisfiedRequirements
from .fix_build import BuildFixer
class GitIdentityFixer(BuildFixer):
    def __init__(self, session):
        self.session = session

    def can_fix(self, problem: Problem):
        return isinstance(problem, MissingGitIdentity)

    def _fix(self, problem: Problem, phase: Tuple[str, ...]):
        for name in ["user.email", "user.name"]:
            value = (
                subprocess.check_output(["git", "config", "--global", name])
                .decode()
                .strip()
            )
            self.session.check_call(["git", "config", "--global", name, value])
        return True


class SecretGpgKeyFixer(BuildFixer):
    def __init__(self, session):
        self.session = session

    def can_fix(self, problem: Problem):
        return isinstance(problem, MissingSecretGpgKey)

    def _fix(self, problem: Problem, phase: Tuple[str, ...]):
        SCRIPT = b"""\
Key-Type: 1
Key-Length: 4096
Subkey-Type: 1
Subkey-Length: 4096
Name-Real: Dummy Key for ognibuild
Name-Email: dummy@example.com
Expire-Date: 0
Passphrase: ""
"""
        p = self.session.Popen(
            ["gpg", "--gen-key", "--batch", "/dev/stdin"],
            stdin=subprocess.PIPE,
            stdout=subprocess.PIPE,
        )
        p.communicate(SCRIPT)
        if p.returncode == 0:
            return True
        return False


class UnexpandedAutoconfMacroFixer(BuildFixer):
    def __init__(self, session, resolver):
        self.session = session
        self.resolver = resolver

    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.resolver)

    def __str__(self):
        return "unexpanded m4 macro fixer (%s)" % self.resolver

    def can_fix(self, error):
        return isinstance(error, MissingAutoconfMacro)

    def _fix(self, error, phase):
        try:
            self.resolver.install([AutoconfMacroRequirement(error.macro)])
        except UnsatisfiedRequirements:
            return False
        from .fix_build import run_detecting_problems

        run_detecting_problems(self.session, ["autoconf", "-f"])
        return True
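All of the fixers above follow the same `can_fix`/`_fix` protocol that `resolve_error` in fix_build.py dispatches on: filter fixers by problem type, then try each relevant one until a fix reports success. A minimal, self-contained sketch of that dispatch, using toy problem and fixer classes (the names `MissingTool`/`MissingToolFixer` are invented for illustration):

```python
class Problem:
    pass


class MissingTool(Problem):
    def __init__(self, name):
        self.name = name


class MissingToolFixer:
    # Same shape as the fixers above: can_fix() filters, _fix() acts.
    def can_fix(self, problem):
        return isinstance(problem, MissingTool)

    def fix(self, problem, phase):
        if not self.can_fix(problem):
            return None
        return self._fix(problem, phase)

    def _fix(self, problem, phase):
        print("installing %s" % problem.name)
        return True


def resolve_error(error, phase, fixers):
    relevant = [f for f in fixers if f.can_fix(error)]
    for fixer in relevant:
        if fixer.fix(error, phase):
            return True
    return False


print(resolve_error(MissingTool("autoconf"), None, [MissingToolFixer()]))  # -> True
```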
@@ -21,7 +21,7 @@ def run_info(session, buildsystems):
         print("%r:" % buildsystem)
         deps = {}
         try:
-            for kind, dep in buildsystem.get_declared_dependencies():
+            for kind, dep in buildsystem.get_declared_dependencies(session):
                 deps.setdefault(kind, []).append(dep)
         except NotImplementedError:
             print(

@@ -35,7 +35,7 @@ def run_info(session, buildsystems):
                 print("\t\t\t%s" % dep)
             print("")
         try:
-            outputs = list(buildsystem.get_declared_outputs())
+            outputs = list(buildsystem.get_declared_outputs(session))
         except NotImplementedError:
             print("\tUnable to detect declared outputs for this type of build system")
             outputs = []
@@ -46,3 +46,15 @@ class PythonPackageOutput(UpstreamOutput):
             self.name,
             self.python_version,
         )
+
+
+class RPackageOutput(UpstreamOutput):
+    def __init__(self, name):
+        super(RPackageOutput, self).__init__("r-package")
+        self.name = name
+
+    def __str__(self):
+        return "R package: %s" % self.name
+
+    def __repr__(self):
+        return "%s(%r)" % (type(self).__name__, self.name)
@@ -17,8 +17,9 @@
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

 import posixpath
+import re
 import subprocess
-from typing import Optional, List, Tuple
+from typing import Optional, List, Set

 from . import Requirement
@@ -71,14 +72,39 @@ class PythonPackageRequirement(Requirement):
             cmd = "python3"
         else:
             raise NotImplementedError
-        text = self.package + ','.join([''.join(spec) for spec in self.specs])
+        text = self.package + ",".join(["".join(spec) for spec in self.specs])
         p = session.Popen(
             [cmd, "-c", "import pkg_resources; pkg_resources.require(%r)" % text],
-            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
+            stdout=subprocess.DEVNULL,
+            stderr=subprocess.DEVNULL,
+        )
         p.communicate()
         return p.returncode == 0
+class PhpPackageRequirement(Requirement):
+    def __init__(
+        self,
+        package: str,
+        channel: Optional[str] = None,
+        min_version: Optional[str] = None,
+        max_version: Optional[str] = None,
+    ):
+        self.package = package
+        self.channel = channel
+        self.min_version = min_version
+        self.max_version = max_version
+
+    def __repr__(self):
+        return "%s(%r, %r, %r, %r)" % (
+            type(self).__name__,
+            self.package,
+            self.channel,
+            self.min_version,
+            self.max_version,
+        )
+
+
 class BinaryRequirement(Requirement):

     binary_name: str

@@ -87,10 +113,15 @@ class BinaryRequirement(Requirement):
         super(BinaryRequirement, self).__init__("binary")
         self.binary_name = binary_name

+    def __repr__(self):
+        return "%s(%r)" % (type(self).__name__, self.binary_name)
+
     def met(self, session):
         p = session.Popen(
-            ["which", self.binary_name], stdout=subprocess.DEVNULL,
-            stderr=subprocess.DEVNULL)
+            ["which", self.binary_name],
+            stdout=subprocess.DEVNULL,
+            stderr=subprocess.DEVNULL,
+        )
         p.communicate()
         return p.returncode == 0
@@ -107,9 +138,47 @@ class PerlModuleRequirement(Requirement):
         self.filename = filename
         self.inc = inc

+    @property
     def relfilename(self):
         return self.module.replace("::", "/") + ".pm"

+    def __repr__(self):
+        return "%s(%r)" % (type(self).__name__, self.module)
+
+
+class VagueDependencyRequirement(Requirement):
+
+    name: str
+    minimum_version: Optional[str] = None
+
+    def __init__(self, name, minimum_version=None):
+        super(VagueDependencyRequirement, self).__init__("vague")
+        self.name = name
+        self.minimum_version = minimum_version
+
+    def expand(self):
+        if " " not in self.name:
+            yield BinaryRequirement(self.name)
+            yield LibraryRequirement(self.name)
+            yield PkgConfigRequirement(self.name, minimum_version=self.minimum_version)
+            if self.name.lower() != self.name:
+                yield BinaryRequirement(self.name.lower())
+                yield LibraryRequirement(self.name.lower())
+                yield PkgConfigRequirement(self.name.lower(), minimum_version=self.minimum_version)
+            from .resolver.apt import AptRequirement
+
+            yield AptRequirement.simple(self.name.lower(), minimum_version=self.minimum_version)
+            yield AptRequirement.simple('lib%s-dev' % self.name.lower(), minimum_version=self.minimum_version)
+
+    def met(self, session):
+        for x in self.expand():
+            if x.met(session):
+                return True
+        return False
+
+    def __repr__(self):
+        return "%s(%r)" % (type(self).__name__, self.name)

class NodePackageRequirement(Requirement):

@@ -119,23 +188,53 @@ class NodePackageRequirement(Requirement):
        super(NodePackageRequirement, self).__init__("npm-package")
        self.package = package

    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.package)


class NodeModuleRequirement(Requirement):

    module: str

    def __init__(self, module):
        super(NodeModuleRequirement, self).__init__("npm-module")
        self.module = module

    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.module)

class CargoCrateRequirement(Requirement):

    crate: str
    features: Set[str]
    version: Optional[str]

    def __init__(self, crate, features=None, version=None):
        super(CargoCrateRequirement, self).__init__("cargo-crate")
        self.crate = crate
        if features is None:
            features = set()
        self.features = features
        self.version = version

    def __repr__(self):
        return "%s(%r, features=%r, version=%r)" % (
            type(self).__name__,
            self.crate,
            self.features,
            self.version,
        )

    def __str__(self):
        if self.features:
            return "cargo crate: %s %s (%s)" % (
                self.crate,
                self.version or "",
                ", ".join(sorted(self.features)),
            )
        else:
            return "cargo crate: %s %s" % (self.crate, self.version or "")
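As a standalone sketch of the formatting in `CargoCrateRequirement.__str__` above (the helper name here is hypothetical, not part of ognibuild), note that sorting the feature set keeps the description deterministic:

```python
# Hypothetical mirror of CargoCrateRequirement.__str__, for illustration only.
def describe_crate(crate, version=None, features=()):
    if features:
        # Sort features so set iteration order cannot change the output.
        return "cargo crate: %s %s (%s)" % (
            crate,
            version or "",
            ", ".join(sorted(features)),
        )
    return "cargo crate: %s %s" % (crate, version or "")


print(describe_crate("serde", "1.0", {"derive", "alloc"}))
# -> cargo crate: serde 1.0 (alloc, derive)
```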

class PkgConfigRequirement(Requirement):

@@ -194,10 +293,29 @@ class RubyGemRequirement(Requirement):
class GoPackageRequirement(Requirement):

    package: str
    version: Optional[str]

    def __init__(self, package: str, version: Optional[str] = None):
        super(GoPackageRequirement, self).__init__("go-package")
        self.package = package
        self.version = version

    def __str__(self):
        if self.version:
            return "go package: %s (= %s)" % (self.package, self.version)
        return "go package: %s" % self.package

class GoRequirement(Requirement):

    version: Optional[str]

    def __init__(self, version: Optional[str] = None):
        super(GoRequirement, self).__init__("go")
        self.version = version

    def __str__(self):
        return "go %s" % self.version


class DhAddonRequirement(Requirement):

@@ -228,6 +346,65 @@ class RPackageRequirement(Requirement):
        self.package = package
        self.minimum_version = minimum_version

    def __repr__(self):
        return "%s(%r, minimum_version=%r)" % (
            type(self).__name__,
            self.package,
            self.minimum_version,
        )

    def __str__(self):
        if self.minimum_version:
            return "R package: %s (>= %s)" % (self.package, self.minimum_version)
        else:
            return "R package: %s" % (self.package,)

    @classmethod
    def from_str(cls, text):
        # TODO(jelmer): More complex parser
        m = re.fullmatch(r"(.*)\s+\(>=\s+(.*)\)", text)
        if m:
            return cls(m.group(1), m.group(2))
        m = re.fullmatch(r"([^ ]+)", text)
        if m:
            return cls(m.group(1))
        raise ValueError(text)
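The `from_str` parser above accepts either a bare package name or a name with a `(>= version)` suffix. A throwaway sketch of just the parsing (the helper name is hypothetical; the two regexes are taken from the diff):

```python
import re


# Hypothetical standalone mirror of RPackageRequirement.from_str above.
def parse_r_requirement(text):
    # "name (>= version)" form
    m = re.fullmatch(r"(.*)\s+\(>=\s+(.*)\)", text)
    if m:
        return (m.group(1), m.group(2))
    # bare package name, no spaces allowed
    m = re.fullmatch(r"([^ ]+)", text)
    if m:
        return (m.group(1), None)
    raise ValueError(text)


print(parse_r_requirement("ggplot2 (>= 3.3.0)"))
# -> ('ggplot2', '3.3.0')
```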

class OctavePackageRequirement(Requirement):

    package: str
    minimum_version: Optional[str]

    def __init__(self, package: str, minimum_version: Optional[str] = None):
        super(OctavePackageRequirement, self).__init__("octave-package")
        self.package = package
        self.minimum_version = minimum_version

    def __repr__(self):
        return "%s(%r, minimum_version=%r)" % (
            type(self).__name__,
            self.package,
            self.minimum_version,
        )

    def __str__(self):
        if self.minimum_version:
            return "Octave package: %s (>= %s)" % (self.package, self.minimum_version)
        else:
            return "Octave package: %s" % (self.package,)

    @classmethod
    def from_str(cls, text):
        # TODO(jelmer): More complex parser
        m = re.fullmatch(r"(.*)\s+\(>=\s+(.*)\)", text)
        if m:
            return cls(m.group(1), m.group(2))
        m = re.fullmatch(r"([^ ]+)", text)
        if m:
            return cls(m.group(1))
        raise ValueError(text)

class LibraryRequirement(Requirement):

@@ -276,6 +453,15 @@ class JavaClassRequirement(Requirement):
        self.classname = classname


class CMakefileRequirement(Requirement):

    filename: str

    def __init__(self, filename: str):
        super(CMakefileRequirement, self).__init__("cmake-file")
        self.filename = filename

class HaskellPackageRequirement(Requirement):

    package: str

@@ -293,11 +479,43 @@ class HaskellPackageRequirement(Requirement):
class MavenArtifactRequirement(Requirement):

    group_id: str
    artifact_id: str
    version: Optional[str]
    kind: Optional[str]

    def __init__(self, group_id, artifact_id, version=None, kind=None):
        super(MavenArtifactRequirement, self).__init__("maven-artifact")
        self.group_id = group_id
        self.artifact_id = artifact_id
        self.version = version
        self.kind = kind

    def __str__(self):
        return "maven requirement: %s:%s:%s" % (
            self.group_id,
            self.artifact_id,
            self.version,
        )

    @classmethod
    def from_str(cls, text):
        return cls.from_tuple(text.split(":"))

    @classmethod
    def from_tuple(cls, parts):
        if len(parts) == 4:
            (group_id, artifact_id, kind, version) = parts
        elif len(parts) == 3:
            (group_id, artifact_id, version) = parts
            kind = "jar"
        elif len(parts) == 2:
            version = None
            (group_id, artifact_id) = parts
            kind = "jar"
        else:
            raise ValueError("invalid number of parts to artifact %r" % parts)
        return cls(group_id, artifact_id, version, kind)
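`from_tuple` above handles two-, three-, and four-part Maven coordinates, defaulting the packaging kind to `jar`. A self-contained sketch of the same dispatch (the function name is hypothetical, for illustration only):

```python
# Hypothetical standalone mirror of MavenArtifactRequirement.from_tuple above.
def parse_maven_coordinates(text):
    parts = text.split(":")
    if len(parts) == 4:
        group_id, artifact_id, kind, version = parts
    elif len(parts) == 3:
        group_id, artifact_id, version = parts
        kind = "jar"
    elif len(parts) == 2:
        group_id, artifact_id = parts
        version, kind = None, "jar"
    else:
        raise ValueError("invalid number of parts to artifact %r" % parts)
    return (group_id, artifact_id, version, kind)


print(parse_maven_coordinates("org.apache.commons:commons-lang3:3.12.0"))
# -> ('org.apache.commons', 'commons-lang3', '3.12.0', 'jar')
```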

class GnomeCommonRequirement(Requirement):

@@ -320,6 +538,32 @@ class JDKFileRequirement(Requirement):
        return posixpath.join(self.jdk_path, self.filename)


class JDKRequirement(Requirement):
    def __init__(self):
        super(JDKRequirement, self).__init__("jdk")


class JRERequirement(Requirement):
    def __init__(self):
        super(JRERequirement, self).__init__("jre")


class QTRequirement(Requirement):
    def __init__(self):
        super(QTRequirement, self).__init__("qt")


class X11Requirement(Requirement):
    def __init__(self):
        super(X11Requirement, self).__init__("x11")


class CertificateAuthorityRequirement(Requirement):
    def __init__(self, url):
        super(CertificateAuthorityRequirement, self).__init__("ca-cert")
        self.url = url

class PerlFileRequirement(Requirement):

    filename: str

@@ -338,6 +582,11 @@ class AutoconfMacroRequirement(Requirement):
        self.macro = macro


class LibtoolRequirement(Requirement):
    def __init__(self):
        super(LibtoolRequirement, self).__init__("libtool")

class PythonModuleRequirement(Requirement):

    module: str

@@ -346,6 +595,7 @@ class PythonModuleRequirement(Requirement):
    def __init__(self, module, python_version=None, minimum_version=None):
        super(PythonModuleRequirement, self).__init__("python-module")
        self.module = module
        self.python_version = python_version
        self.minimum_version = minimum_version

@@ -364,6 +614,8 @@ class PythonModuleRequirement(Requirement):
            raise NotImplementedError
        p = session.Popen(
            [cmd, "-c", "import %s" % self.module],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        )
        p.communicate()
        return p.returncode == 0
View file
@@ -17,6 +17,7 @@
import subprocess

from ..fix_build import run_detecting_problems


class UnsatisfiedRequirements(Exception):

@@ -34,13 +35,15 @@ class Resolver(object):
    def explain(self, requirements):
        raise NotImplementedError(self.explain)

    def env(self):
        return {}


class CPANResolver(Resolver):
    def __init__(self, session, user_local=False, skip_tests=True):
        self.session = session
        self.user_local = user_local
        self.skip_tests = skip_tests

    def __str__(self):
        return "cpan"

@@ -48,6 +51,13 @@ class CPANResolver(Resolver):
    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.session)

    def _cmd(self, reqs):
        ret = ["cpan", "-i"]
        if self.skip_tests:
            ret.append("-T")
        ret.extend([req.module for req in reqs])
        return ret

    def explain(self, requirements):
        from ..requirements import PerlModuleRequirement

@@ -57,28 +67,148 @@ class CPANResolver(Resolver):
                continue
            perlreqs.append(requirement)
        if perlreqs:
            yield (self._cmd(perlreqs), [perlreqs])

    def install(self, requirements):
        from ..requirements import PerlModuleRequirement

        env = {
            "PERL_MM_USE_DEFAULT": "1",
            "PERL_MM_OPT": "",
            "PERL_MB_OPT": "",
        }
        if not self.user_local:
            user = "root"
        else:
            user = None
        missing = []
        for requirement in requirements:
            if not isinstance(requirement, PerlModuleRequirement):
                missing.append(requirement)
                continue
            run_detecting_problems(
                self.session,
                self._cmd([requirement]),
                env=env,
                user=user,
            )
        if missing:
            raise UnsatisfiedRequirements(missing)

class RResolver(Resolver):
    def __init__(self, session, repos, user_local=False):
        self.session = session
        self.repos = repos
        self.user_local = user_local

    def __str__(self):
        return "cran"

    def __repr__(self):
        return "%s(%r, %r)" % (type(self).__name__, self.session, self.repos)

    def _cmd(self, req):
        # TODO(jelmer): Handle self.user_local
        return [
            "R",
            "-e",
            "install.packages('%s', repos=%r)" % (req.package, self.repos),
        ]

    def explain(self, requirements):
        from ..requirements import RPackageRequirement

        rreqs = []
        for requirement in requirements:
            if not isinstance(requirement, RPackageRequirement):
                continue
            rreqs.append(requirement)
        for req in rreqs:
            yield (self._cmd(req), [req])

    def install(self, requirements):
        from ..requirements import RPackageRequirement

        if self.user_local:
            user = None
        else:
            user = "root"
        missing = []
        for requirement in requirements:
            if not isinstance(requirement, RPackageRequirement):
                missing.append(requirement)
                continue
            self.session.check_call(self._cmd(requirement), user=user)
        if missing:
            raise UnsatisfiedRequirements(missing)

class OctaveForgeResolver(Resolver):
    def __init__(self, session, user_local=False):
        self.session = session
        self.user_local = user_local

    def __str__(self):
        return "octave-forge"

    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.session)

    def _cmd(self, req):
        # TODO(jelmer): Handle self.user_local
        return ["octave-cli", "--eval", "pkg install -forge %s" % req.package]

    def explain(self, requirements):
        from ..requirements import OctavePackageRequirement

        rreqs = []
        for requirement in requirements:
            if not isinstance(requirement, OctavePackageRequirement):
                continue
            rreqs.append(requirement)
        for req in rreqs:
            yield (self._cmd(req), [req])

    def install(self, requirements):
        from ..requirements import OctavePackageRequirement

        if self.user_local:
            user = None
        else:
            user = "root"
        missing = []
        for requirement in requirements:
            if not isinstance(requirement, OctavePackageRequirement):
                missing.append(requirement)
                continue
            self.session.check_call(self._cmd(requirement), user=user)
        if missing:
            raise UnsatisfiedRequirements(missing)

class CRANResolver(RResolver):
    def __init__(self, session, user_local=False):
        super(CRANResolver, self).__init__(
            session, "http://cran.r-project.org", user_local=user_local
        )


class BioconductorResolver(RResolver):
    def __init__(self, session, user_local=False):
        super(BioconductorResolver, self).__init__(
            session, "https://hedgehog.fhcrc.org/bioconductor", user_local=user_local
        )

class HackageResolver(Resolver):
    def __init__(self, session, user_local=False):
        self.session = session
        self.user_local = user_local

    def __str__(self):
        return "hackage"

@@ -86,17 +216,26 @@ class HackageResolver(Resolver):
    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.session)

    def _cmd(self, reqs):
        extra_args = []
        if self.user_local:
            extra_args.append("--user")
        return ["cabal", "install"] + extra_args + [req.package for req in reqs]

    def install(self, requirements):
        from ..requirements import HaskellPackageRequirement

        if self.user_local:
            user = None
        else:
            user = "root"
        missing = []
        for requirement in requirements:
            if not isinstance(requirement, HaskellPackageRequirement):
                missing.append(requirement)
                continue
            self.session.check_call(self._cmd([requirement]), user=user)
        if missing:
            raise UnsatisfiedRequirements(missing)

@@ -109,13 +248,13 @@ class HackageResolver(Resolver):
                continue
            haskellreqs.append(requirement)
        if haskellreqs:
            yield (self._cmd(haskellreqs), haskellreqs)

class PypiResolver(Resolver):
    def __init__(self, session, user_local=False):
        self.session = session
        self.user_local = user_local

    def __str__(self):
        return "pypi"

@@ -123,17 +262,27 @@ class PypiResolver(Resolver):
    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.session)

    def _cmd(self, reqs):
        extra_args = []
        if self.user_local:
            extra_args.append("--user")
        return ["pip", "install"] + extra_args + [req.package for req in reqs]

    def install(self, requirements):
        from ..requirements import PythonPackageRequirement

        if self.user_local:
            user = None
        else:
            user = "root"
        missing = []
        for requirement in requirements:
            if not isinstance(requirement, PythonPackageRequirement):
                missing.append(requirement)
                continue
            try:
                self.session.check_call(self._cmd([requirement]), user=user)
            except subprocess.CalledProcessError:
                missing.append(requirement)
        if missing:
@@ -148,14 +297,13 @@ class PypiResolver(Resolver):
                continue
            pyreqs.append(requirement)
        if pyreqs:
            yield (self._cmd(pyreqs), pyreqs)

class GoResolver(Resolver):
    def __init__(self, session, user_local):
        self.session = session
        self.user_local = user_local

    def __str__(self):
        return "go"

@@ -166,12 +314,18 @@ class GoResolver(Resolver):
    def install(self, requirements):
        from ..requirements import GoPackageRequirement

        if self.user_local:
            env = {}
        else:
            # TODO(jelmer): Isn't this Debian-specific?
            env = {"GOPATH": "/usr/share/gocode"}
        missing = []
        for requirement in requirements:
            if not isinstance(requirement, GoPackageRequirement):
                missing.append(requirement)
                continue
            self.session.check_call(["go", "get", requirement.package], env=env)
        if missing:
            raise UnsatisfiedRequirements(missing)

@@ -184,18 +338,20 @@ class GoResolver(Resolver):
                continue
            goreqs.append(requirement)
        if goreqs:
            yield (["go", "get"] + [req.package for req in goreqs], goreqs)

NPM_COMMAND_PACKAGES = {
    "del-cli": "del-cli",
    "husky": "husky",
}

class NpmResolver(Resolver):
    def __init__(self, session, user_local=False):
        self.session = session
        self.user_local = user_local
        # TODO(jelmer): Handle user_local

    def __str__(self):
        return "npm"

@@ -204,19 +360,35 @@ class NpmResolver(Resolver):
        return "%s(%r)" % (type(self).__name__, self.session)

    def install(self, requirements):
        from ..requirements import (
            NodePackageRequirement,
            NodeModuleRequirement,
            BinaryRequirement,
        )

        if self.user_local:
            user = None
        else:
            user = "root"
        missing = []
        for requirement in requirements:
            if isinstance(requirement, BinaryRequirement):
                try:
                    package = NPM_COMMAND_PACKAGES[requirement.binary_name]
                except KeyError:
                    pass
                else:
                    requirement = NodePackageRequirement(package)
            if isinstance(requirement, NodeModuleRequirement):
                # TODO: Is this legit?
                requirement = NodePackageRequirement(requirement.module.split("/")[0])
            if not isinstance(requirement, NodePackageRequirement):
                missing.append(requirement)
                continue
            self.session.check_call(
                ["npm", "-g", "install", requirement.package], user=user
            )
        if missing:
            raise UnsatisfiedRequirements(missing)
@@ -248,6 +420,13 @@ class StackedResolver(Resolver):
    def __str__(self):
        return "[" + ", ".join(map(str, self.subs)) + "]"

    def env(self):
        ret = {}
        # Reversed so earlier resolvers override later ones
        for sub in reversed(self.subs):
            ret.update(sub.env())
        return ret

    def explain(self, requirements):
        for sub in self.subs:
            yield from sub.explain(requirements)

@@ -260,6 +439,8 @@ class StackedResolver(Resolver):
                requirements = e.requirements
            else:
                return
        if requirements:
            raise UnsatisfiedRequirements(requirements)
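The reversed iteration in `StackedResolver.env` means that when two stacked resolvers set the same environment variable, the one listed first wins. A minimal sketch of that merge order (the `merge_envs` helper and the example values are hypothetical):

```python
# Hypothetical mirror of the dict-merge order in StackedResolver.env above.
def merge_envs(envs):
    # Apply later dicts first, so earlier resolvers' values overwrite them.
    ret = {}
    for env in reversed(list(envs)):
        ret.update(env)
    return ret


print(merge_envs([
    {"GOPATH": "/home/user/go"},
    {"GOPATH": "/usr/share/gocode", "GOFLAGS": "-mod=mod"},
]))
# -> {'GOPATH': '/home/user/go', 'GOFLAGS': '-mod=mod'}
```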

NATIVE_RESOLVER_CLS = [
@@ -268,35 +449,31 @@ NATIVE_RESOLVER_CLS = [
    NpmResolver,
    GoResolver,
    HackageResolver,
    CRANResolver,
    BioconductorResolver,
    OctaveForgeResolver,
]


def native_resolvers(session, user_local):
    return StackedResolver([kls(session, user_local) for kls in NATIVE_RESOLVER_CLS])


def auto_resolver(session, explain=False):
    # if session is SchrootSession or if we're root, use apt
    from .apt import AptResolver
    from ..session.schroot import SchrootSession
    from ..session import get_user

    user = get_user(session)
    resolvers = []
    # TODO(jelmer): Check VIRTUAL_ENV, and prioritize PypiResolver if
    # present?
    if isinstance(session, SchrootSession) or user == "root" or explain:
        user_local = False
    else:
        user_local = True
    if not user_local:
        resolvers.append(AptResolver.from_session(session))
    resolvers.extend([kls(session, user_local) for kls in NATIVE_RESOLVER_CLS])
    return StackedResolver(resolvers)
View file
@@ -19,6 +19,8 @@ from itertools import chain
import logging
import os
import posixpath
import re
from typing import Optional, List

from debian.changelog import Version
from debian.deb822 import PkgRelation

@@ -28,6 +30,7 @@ from ..debian.apt import AptManager
from . import Resolver, UnsatisfiedRequirements
from ..requirements import (
    Requirement,
    CargoCrateRequirement,
    BinaryRequirement,
    CHeaderRequirement,
    PkgConfigRequirement,
@@ -36,30 +39,43 @@ from ..requirements import (
    ValaPackageRequirement,
    RubyGemRequirement,
    GoPackageRequirement,
    GoRequirement,
    DhAddonRequirement,
    PhpClassRequirement,
    PhpPackageRequirement,
    RPackageRequirement,
    NodeModuleRequirement,
    NodePackageRequirement,
    LibraryRequirement,
    RubyFileRequirement,
    XmlEntityRequirement,
    SprocketsFileRequirement,
    JavaClassRequirement,
    CMakefileRequirement,
    HaskellPackageRequirement,
    MavenArtifactRequirement,
    GnomeCommonRequirement,
    JDKFileRequirement,
    JDKRequirement,
    JRERequirement,
    QTRequirement,
    X11Requirement,
    PerlModuleRequirement,
    PerlFileRequirement,
    AutoconfMacroRequirement,
    PythonModuleRequirement,
    PythonPackageRequirement,
    CertificateAuthorityRequirement,
    LibtoolRequirement,
    VagueDependencyRequirement,
)

class AptRequirement(Requirement):
    def __init__(self, relations):
        super(AptRequirement, self).__init__("apt")
        if not isinstance(relations, list):
            raise TypeError(relations)
        self.relations = relations

    @classmethod
@@ -76,17 +92,53 @@ class AptRequirement(Requirement):
    def pkg_relation_str(self):
        return PkgRelation.str(self.relations)

    def __hash__(self):
        return hash((type(self), self.pkg_relation_str()))

    def __eq__(self, other):
        return isinstance(self, type(other)) and self.relations == other.relations

    def __str__(self):
        return "apt requirement: %s" % self.pkg_relation_str()

    def __repr__(self):
        return "%s.from_str(%r)" % (type(self).__name__, self.pkg_relation_str())

    def package_names(self):
        for rel in self.relations:
            for entry in rel:
                yield entry["name"]

    def touches_package(self, package):
        for name in self.package_names():
            if name == package:
                return True
        return False

def find_package_names(
    apt_mgr: AptManager, paths: List[str], regex: bool = False, case_insensitive=False
) -> List[str]:
    if not isinstance(paths, list):
        raise TypeError(paths)
    return apt_mgr.get_packages_for_paths(paths, regex, case_insensitive)


def find_reqs_simple(
    apt_mgr: AptManager,
    paths: List[str],
    regex: bool = False,
    minimum_version=None,
    case_insensitive=False,
) -> List[AptRequirement]:
    if not isinstance(paths, list):
        raise TypeError(paths)
    return [
        AptRequirement.simple(package, minimum_version=minimum_version)
        for package in find_package_names(apt_mgr, paths, regex, case_insensitive)
    ]

def python_spec_to_apt_rels(pkg_name, specs):
    # TODO(jelmer): Dealing with epoch, etc?
    if not specs:
@@ -94,99 +146,153 @@ def python_spec_to_apt_rels(pkg_name, specs):
    else:
        rels = []
        for spec in specs:
            deb_version = Version(spec[1])
            if spec[0] == "~=":
                # PEP 440: For a given release identifier V.N, the compatible
                # release clause is approximately equivalent to the pair of
                # comparison clauses: >= V.N, == V.*
                parts = spec[1].split(".")
                parts.pop(-1)
                parts[-1] = str(int(parts[-1]) + 1)
                next_maj_deb_version = Version(".".join(parts))
                rels.extend(
                    [
                        [{"name": pkg_name, "version": (">=", deb_version)}],
                        [{"name": pkg_name, "version": ("<<", next_maj_deb_version)}],
                    ]
                )
            elif spec[0] == "!=":
                # "!= V" has no single Debian relation; express it as the
                # disjunction (>> V) | (<< V) in one relation group.
                rels.append(
                    [
                        {"name": pkg_name, "version": (">>", deb_version)},
                        {"name": pkg_name, "version": ("<<", deb_version)},
                    ]
                )
            elif spec[1].endswith(".*") and spec[0] == "==":
                s = spec[1].split(".")
                s.pop(-1)
                n = list(s)
                n[-1] = str(int(n[-1]) + 1)
                rels.extend(
                    [
                        [{"name": pkg_name, "version": (">=", Version(".".join(s)))}],
                        [{"name": pkg_name, "version": ("<<", Version(".".join(n)))}],
                    ]
                )
            else:
                c = {">=": ">=", "<=": "<=", "<": "<<", ">": ">>", "==": "="}[spec[0]]
                rels.append([{"name": pkg_name, "version": (c, deb_version)}])
    return rels
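The `~=` branch above derives a Debian-style upper bound by dropping the final version component and incrementing the one before it. A minimal sketch of just that arithmetic, without the python-debian dependency (the helper name is hypothetical):

```python
# Hypothetical mirror of the "~=" bound computation above: for "~= V.N",
# the lower bound is ">= V.N" and the exclusive upper bound is the next
# release series ("<< V.(N+1)" with the last component dropped).
def compatible_release_bounds(version):
    parts = version.split(".")
    parts.pop(-1)
    parts[-1] = str(int(parts[-1]) + 1)
    return (">= " + version, "<< " + ".".join(parts))


print(compatible_release_bounds("2.1.3"))
# -> ('>= 2.1.3', '<< 2.2')
```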

def get_package_for_python_package(
    apt_mgr, package, python_version: Optional[str], specs=None
):
    pypy_regex = "/usr/lib/pypy/dist-packages/%s-.*.egg-info" % re.escape(
        package.replace("-", "_")
    )
    cpython2_regex = (
        "/usr/lib/python2\\.[0-9]/dist-packages/%s-.*.egg-info"
        % re.escape(package.replace("-", "_"))
    )
    cpython3_regex = "/usr/lib/python3/dist-packages/%s-.*.egg-info" % re.escape(
        package.replace("-", "_")
    )
    if python_version == "pypy":
        paths = [pypy_regex]
    elif python_version == "cpython2":
        paths = [cpython2_regex]
    elif python_version == "cpython3":
        paths = [cpython3_regex]
    elif python_version is None:
        paths = [cpython3_regex, cpython2_regex, pypy_regex]
    else:
        raise NotImplementedError("unsupported python version %s" % python_version)
    names = find_package_names(apt_mgr, paths, regex=True, case_insensitive=True)
    return [AptRequirement(python_spec_to_apt_rels(name, specs)) for name in names]
def get_package_for_python_module(apt_mgr, module, python_version, specs): def get_package_for_python_module(apt_mgr, module, python_version, specs):
if python_version == "python3": cpython3_regexes = [
paths = [ posixpath.join(
posixpath.join( "/usr/lib/python3/dist-packages",
"/usr/lib/python3/dist-packages", re.escape(module.replace(".", "/")),
            "__init__.py",
        ),
        posixpath.join(
            "/usr/lib/python3/dist-packages",
            re.escape(module.replace(".", "/")) + ".py",
        ),
        posixpath.join(
            "/usr/lib/python3\\.[0-9]+/lib-dynload",
            re.escape(module.replace(".", "/")) + "\\.cpython-.*\\.so",
        ),
        posixpath.join(
            "/usr/lib/python3\\.[0-9]+/", re.escape(module.replace(".", "/")) + ".py"
        ),
        posixpath.join(
            "/usr/lib/python3\\.[0-9]+/",
            re.escape(module.replace(".", "/")),
            "__init__.py",
        ),
    ]
    cpython2_regexes = [
        posixpath.join(
            "/usr/lib/python2\\.[0-9]/dist-packages",
            re.escape(module.replace(".", "/")),
            "__init__.py",
        ),
        posixpath.join(
            "/usr/lib/python2\\.[0-9]/dist-packages",
            re.escape(module.replace(".", "/")) + ".py",
        ),
        posixpath.join(
            "/usr/lib/python2\\.[0-9]/lib-dynload",
            re.escape(module.replace(".", "/")) + ".so",
        ),
    ]
    pypy_regexes = [
        posixpath.join(
            "/usr/lib/pypy/dist-packages",
            re.escape(module.replace(".", "/")),
            "__init__.py",
        ),
        posixpath.join(
            "/usr/lib/pypy/dist-packages", re.escape(module.replace(".", "/")) + ".py"
        ),
        posixpath.join(
            "/usr/lib/pypy/dist-packages",
            re.escape(module.replace(".", "/")) + "\\.pypy-.*\\.so",
        ),
    ]
    if python_version == "cpython3":
        paths = cpython3_regexes
    elif python_version == "cpython2":
        paths = cpython2_regexes
    elif python_version == "pypy":
        paths = pypy_regexes
    elif python_version is None:
        paths = cpython3_regexes + cpython2_regexes + pypy_regexes
    else:
        raise AssertionError("unknown python version %r" % python_version)
    names = find_package_names(apt_mgr, paths, regex=True)
    return [AptRequirement(python_spec_to_apt_rels(name, specs)) for name in names]
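The dotted-module-to-path translation above can be checked in isolation. This sketch (with a hypothetical `module_regexes` helper) mirrors two of the cpython3 patterns:

```python
import posixpath
import re


def module_regexes(module):
    # Mirrors two of the cpython3 patterns above: a dotted module name
    # becomes either a package directory or a single-file module under
    # /usr/lib/python3/dist-packages.
    escaped = re.escape(module.replace(".", "/"))
    return [
        posixpath.join("/usr/lib/python3/dist-packages", escaped, "__init__.py"),
        posixpath.join("/usr/lib/python3/dist-packages", escaped + "\\.py"),
    ]


# The package form matches an installed package's __init__.py:
assert any(
    re.fullmatch(p, "/usr/lib/python3/dist-packages/xml/etree/__init__.py")
    for p in module_regexes("xml.etree")
)
```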


vague_map = {
    "the Gnu Scientific Library": "libgsl-dev",
    "the required FreeType library": "libfreetype-dev",
}


def resolve_vague_dep_req(apt_mgr, req):
    name = req.name
    options = []
    if name in vague_map:
        options.append(AptRequirement.simple(vague_map[name]))
    for x in req.expand():
        options.extend(resolve_requirement_apt(apt_mgr, x))
    return options


def resolve_binary_req(apt_mgr, req):
@@ -196,149 +302,142 @@ def resolve_binary_req(apt_mgr, req):
    paths = [
        posixpath.join(dirname, req.binary_name) for dirname in ["/usr/bin", "/bin"]
    ]
    return find_reqs_simple(apt_mgr, paths)


def resolve_pkg_config_req(apt_mgr, req):
    names = find_package_names(
        apt_mgr,
        [
            posixpath.join(
                "/usr/lib", ".*", "pkgconfig", re.escape(req.module) + "\\.pc"
            )
        ],
        regex=True,
    )
    if not names:
        names = find_package_names(
            apt_mgr, [posixpath.join("/usr/lib/pkgconfig", req.module + ".pc")]
        )
    return [
        AptRequirement.simple(name, minimum_version=req.minimum_version)
        for name in names
    ]
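The multiarch-first lookup above relies on a path regex of this shape; here it is checked against a representative multiarch `.pc` path (module name and path are illustrative):

```python
import posixpath
import re

# Build the multiarch pkg-config pattern used above and confirm it matches
# a typical multiarch .pc install path.
module = "glib-2.0"
pattern = posixpath.join("/usr/lib", ".*", "pkgconfig", re.escape(module) + "\\.pc")
assert re.fullmatch(pattern, "/usr/lib/x86_64-linux-gnu/pkgconfig/glib-2.0.pc")
```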


def resolve_path_req(apt_mgr, req):
    return find_reqs_simple(apt_mgr, [req.path])


def resolve_c_header_req(apt_mgr, req):
    reqs = find_reqs_simple(
        apt_mgr, [posixpath.join("/usr/include", req.header)], regex=False
    )
    if not reqs:
        reqs = find_reqs_simple(
            apt_mgr,
            [posixpath.join("/usr/include", ".*", re.escape(req.header))],
            regex=True,
        )
    return reqs


def resolve_js_runtime_req(apt_mgr, req):
    return find_reqs_simple(apt_mgr, ["/usr/bin/node", "/usr/bin/duk"])


def resolve_vala_package_req(apt_mgr, req):
    path = "/usr/share/vala-[0-9.]+/vapi/%s\\.vapi" % re.escape(req.package)
    return find_reqs_simple(apt_mgr, [path], regex=True)


def resolve_ruby_gem_req(apt_mgr, req):
    paths = [
        posixpath.join(
            "/usr/share/rubygems-integration/all/"
            "specifications/%s-.*\\.gemspec" % re.escape(req.gem)
        )
    ]
    return find_reqs_simple(
        apt_mgr, paths, regex=True, minimum_version=req.minimum_version
    )


def resolve_go_package_req(apt_mgr, req):
    return find_reqs_simple(
        apt_mgr,
        [posixpath.join("/usr/share/gocode/src", re.escape(req.package), ".*")],
        regex=True,
    )


def resolve_go_req(apt_mgr, req):
    return [AptRequirement.simple("golang-go", minimum_version="2:%s" % req.version)]
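The header resolver above tries an exact `/usr/include` path first and only then falls back to a regex that tolerates one intermediate directory (e.g. a multiarch dir). A standalone check of that two-step shape, with a sample header:

```python
import posixpath
import re

# Exact path first, then the regex fallback used above.
header = "openssl/ssl.h"
exact = posixpath.join("/usr/include", header)
fallback = posixpath.join("/usr/include", ".*", re.escape(header))
assert exact == "/usr/include/openssl/ssl.h"
assert re.fullmatch(fallback, "/usr/include/x86_64-linux-gnu/openssl/ssl.h")
```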


def resolve_dh_addon_req(apt_mgr, req):
    paths = [posixpath.join("/usr/share/perl5", req.path)]
    return find_reqs_simple(apt_mgr, paths)


def resolve_php_class_req(apt_mgr, req):
    path = "/usr/share/php/%s.php" % req.php_class.replace("\\", "/")
    return find_reqs_simple(apt_mgr, [path])


def resolve_php_package_req(apt_mgr, req):
    return [
        AptRequirement.simple("php-%s" % req.package, minimum_version=req.min_version)
    ]


def resolve_r_package_req(apt_mgr, req):
    paths = [
        posixpath.join("/usr/lib/R/site-library/.*/R/%s$" % re.escape(req.package))
    ]
    return find_reqs_simple(apt_mgr, paths, regex=True)


def resolve_node_module_req(apt_mgr, req):
    paths = [
        "/usr/share/nodejs/.*/node_modules/%s/index.js" % re.escape(req.module),
        "/usr/lib/nodejs/%s/index.js" % re.escape(req.module),
        "/usr/share/nodejs/%s/index.js" % re.escape(req.module),
    ]
    return find_reqs_simple(apt_mgr, paths, regex=True)


def resolve_node_package_req(apt_mgr, req):
    paths = [
        "/usr/share/nodejs/.*/node_modules/%s/package\\.json" % re.escape(req.package),
        "/usr/lib/nodejs/%s/package\\.json" % re.escape(req.package),
        "/usr/share/nodejs/%s/package\\.json" % re.escape(req.package),
    ]
    return find_reqs_simple(apt_mgr, paths, regex=True)


def resolve_library_req(apt_mgr, req):
    paths = [
        posixpath.join("/usr/lib/lib%s.so$" % re.escape(req.library)),
        posixpath.join("/usr/lib/.*/lib%s.so$" % re.escape(req.library)),
        posixpath.join("/usr/lib/lib%s.a$" % re.escape(req.library)),
        posixpath.join("/usr/lib/.*/lib%s.a$" % re.escape(req.library)),
    ]
    return find_reqs_simple(apt_mgr, paths, regex=True)


def resolve_ruby_file_req(apt_mgr, req):
    paths = [posixpath.join("/usr/lib/ruby/vendor_ruby/%s.rb" % req.filename)]
    reqs = find_reqs_simple(apt_mgr, paths, regex=False)
    if reqs:
        return reqs
    paths = [
        posixpath.join(
            r"/usr/share/rubygems-integration/all/gems/([^/]+)/"
            "lib/%s\\.rb" % re.escape(req.filename)
        )
    ]
    return find_reqs_simple(apt_mgr, paths, regex=True)
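The library patterns above end in `$` but have no `^` anchor, so they behave as suffix-anchored searches over candidate paths. A quick standalone check with an illustrative library name:

```python
import re

# The shared-library patterns from resolve_library_req above, matched the
# way a regex path search would apply them.
library = "z"
patterns = [
    "/usr/lib/lib%s.so$" % re.escape(library),
    "/usr/lib/.*/lib%s.so$" % re.escape(library),
]
assert any(re.search(p, "/usr/lib/x86_64-linux-gnu/libz.so") for p in patterns)
assert not any(re.search(p, "/usr/lib/x86_64-linux-gnu/libz.so.1") for p in patterns)
```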


def resolve_xml_entity_req(apt_mgr, req):
@@ -354,28 +453,21 @@ def resolve_xml_entity_req(apt_mgr, req):
    else:
        return None
    return find_reqs_simple(apt_mgr, [search_path], regex=False)


def resolve_sprockets_file_req(apt_mgr, req):
    if req.content_type == "application/javascript":
        path = "/usr/share/.*/app/assets/javascripts/%s\\.js$" % re.escape(req.name)
    else:
        logging.warning("unable to handle content type %s", req.content_type)
        return None
    return find_reqs_simple(apt_mgr, [path], regex=True)


def resolve_java_class_req(apt_mgr, req):
    # Unfortunately this only finds classes in jars installed on the host
    # system :(
    # TODO(jelmer): Call in session
    output = apt_mgr.session.check_output(
        ["java-propose-classpath", "-c" + req.classname]
    )
@@ -384,63 +476,70 @@ def resolve_java_class_req(apt_mgr, req):
        logging.warning("unable to find classpath for %s", req.classname)
        return False
    logging.info("Classpath for %s: %r", req.classname, classpath)
    return find_reqs_simple(apt_mgr, [classpath])


def resolve_cmake_file_req(apt_mgr, req):
    paths = ['/usr/lib/.*/cmake/.*/%s' % re.escape(req.filename)]
    return find_reqs_simple(apt_mgr, paths, regex=True)


def resolve_haskell_package_req(apt_mgr, req):
    path = "/var/lib/ghc/package\\.conf\\.d/%s-.*\\.conf" % re.escape(req.deps[0][0])
    return find_reqs_simple(apt_mgr, [path], regex=True)


def resolve_maven_artifact_req(apt_mgr, req):
    if req.version is None:
        version = ".*"
        regex = True
        escape = re.escape
    else:
        version = req.version
        regex = False

        def escape(x):
            return x

    kind = req.kind or "jar"
    path = posixpath.join(
        escape("/usr/share/maven-repo"),
        escape(req.group_id.replace(".", "/")),
        escape(req.artifact_id),
        version,
        escape("%s-" % req.artifact_id) + version + escape("." + kind),
    )
    return find_reqs_simple(apt_mgr, [path], regex=regex)
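The dual literal/regex path construction above can be exercised standalone. This hypothetical `maven_repo_path` helper reproduces the same logic: a concrete version yields a literal path, no version yields a regex with `.*` in the version slots (coordinates below are illustrative):

```python
import posixpath
import re


def maven_repo_path(group_id, artifact_id, version=None, kind="jar"):
    # With a version: escape() is the identity, the result is a literal path.
    # Without one: ".*" stands in for the version and the fixed parts are
    # regex-escaped, mirroring the resolver above.
    if version is None:
        version = ".*"
        escape = re.escape
    else:
        def escape(x):
            return x
    return posixpath.join(
        escape("/usr/share/maven-repo"),
        escape(group_id.replace(".", "/")),
        escape(artifact_id),
        version,
        escape("%s-" % artifact_id) + version + escape("." + kind),
    )


assert maven_repo_path("org.apache.commons", "commons-lang3", "3.8") == (
    "/usr/share/maven-repo/org/apache/commons/commons-lang3/3.8/commons-lang3-3.8.jar"
)
```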


def resolve_gnome_common_req(apt_mgr, req):
    return [AptRequirement.simple("gnome-common")]


def resolve_jdk_file_req(apt_mgr, req):
    path = re.escape(req.jdk_path) + ".*/" + re.escape(req.filename)
    return find_reqs_simple(apt_mgr, [path], regex=True)


def resolve_jdk_req(apt_mgr, req):
    return [AptRequirement.simple("default-jdk")]


def resolve_jre_req(apt_mgr, req):
    return [AptRequirement.simple("default-jre")]


def resolve_x11_req(apt_mgr, req):
    return [AptRequirement.simple("libx11-dev")]


def resolve_qt_req(apt_mgr, req):
    return find_reqs_simple(apt_mgr, ["/usr/lib/.*/qt[0-9]+/bin/qmake"], regex=True)


def resolve_libtool_req(apt_mgr, req):
    return [AptRequirement.simple("libtool")]


def resolve_perl_module_req(apt_mgr, req):
@@ -455,28 +554,24 @@ def resolve_perl_module_req(apt_mgr, req):
        paths = [req.filename]
    else:
        paths = [posixpath.join(inc, req.filename) for inc in req.inc]
    return find_reqs_simple(apt_mgr, paths, regex=False)


def resolve_perl_file_req(apt_mgr, req):
    return find_reqs_simple(apt_mgr, [req.filename], regex=False)


def _find_aclocal_fun(macro):
    # TODO(jelmer): Use the API for codesearch.debian.net instead?
    defun_prefix = b"AC_DEFUN([%s]," % macro.encode("ascii")
    au_alias_prefix = b"AU_ALIAS([%s]," % macro.encode("ascii")
    prefixes = [defun_prefix, au_alias_prefix]
    for entry in os.scandir("/usr/share/aclocal"):
        if not entry.is_file():
            continue
        with open(entry.path, "rb") as f:
            for line in f:
                if any([line.startswith(prefix) for prefix in prefixes]):
                    return entry.path
    raise KeyError
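The macro scan above accepts either definition form. A small standalone check of the prefix test (sample macro name and m4 lines are illustrative):

```python
# A macro counts as found when a line begins with either
# AC_DEFUN([NAME], or AU_ALIAS([NAME], -- the two prefixes built above.
macro = "AX_CHECK_GL"
prefixes = [
    b"AC_DEFUN([%s]," % macro.encode("ascii"),
    b"AU_ALIAS([%s]," % macro.encode("ascii"),
]
assert any(b"AC_DEFUN([AX_CHECK_GL], [dnl])".startswith(p) for p in prefixes)
assert not any(b"AC_DEFUN([AX_OTHER], [dnl])".startswith(p) for p in prefixes)
```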
@@ -487,17 +582,18 @@ def resolve_autoconf_macro_req(apt_mgr, req):
    except KeyError:
        logging.info("No local m4 file found defining %s", req.macro)
        return None
    return find_reqs_simple(apt_mgr, [path])


def resolve_python_module_req(apt_mgr, req):
    if req.minimum_version:
        specs = [(">=", req.minimum_version)]
    else:
        specs = []
    if req.python_version == 2:
        return get_package_for_python_module(apt_mgr, req.module, "cpython2", specs)
    elif req.python_version in (None, 3):
        return get_package_for_python_module(apt_mgr, req.module, "cpython3", specs)
    else:
        return None
@@ -515,8 +611,27 @@ def resolve_python_package_req(apt_mgr, req):
    return None


def resolve_cargo_crate_req(apt_mgr, req):
    paths = ["/usr/share/cargo/registry/%s-[0-9]+.*/Cargo.toml" % re.escape(req.crate)]
    return find_reqs_simple(apt_mgr, paths, regex=True)


def resolve_ca_req(apt_mgr, req):
    return [AptRequirement.simple("ca-certificates")]


def resolve_apt_req(apt_mgr, req):
    # TODO(jelmer): This should be checking whether versions match as well.
    for package_name in req.package_names():
        if not apt_mgr.package_exists(package_name):
            return []
    return [req]


APT_REQUIREMENT_RESOLVERS = [
    (AptRequirement, resolve_apt_req),
    (BinaryRequirement, resolve_binary_req),
    (VagueDependencyRequirement, resolve_vague_dep_req),
    (PkgConfigRequirement, resolve_pkg_config_req),
    (PathRequirement, resolve_path_req),
    (CHeaderRequirement, resolve_c_header_req),
@@ -524,44 +639,66 @@ APT_REQUIREMENT_RESOLVERS = [
    (ValaPackageRequirement, resolve_vala_package_req),
    (RubyGemRequirement, resolve_ruby_gem_req),
    (GoPackageRequirement, resolve_go_package_req),
    (GoRequirement, resolve_go_req),
    (DhAddonRequirement, resolve_dh_addon_req),
    (PhpClassRequirement, resolve_php_class_req),
    (PhpPackageRequirement, resolve_php_package_req),
    (RPackageRequirement, resolve_r_package_req),
    (NodeModuleRequirement, resolve_node_module_req),
    (NodePackageRequirement, resolve_node_package_req),
    (LibraryRequirement, resolve_library_req),
    (RubyFileRequirement, resolve_ruby_file_req),
    (XmlEntityRequirement, resolve_xml_entity_req),
    (SprocketsFileRequirement, resolve_sprockets_file_req),
    (JavaClassRequirement, resolve_java_class_req),
    (CMakefileRequirement, resolve_cmake_file_req),
    (HaskellPackageRequirement, resolve_haskell_package_req),
    (MavenArtifactRequirement, resolve_maven_artifact_req),
    (GnomeCommonRequirement, resolve_gnome_common_req),
    (JDKFileRequirement, resolve_jdk_file_req),
    (JDKRequirement, resolve_jdk_req),
    (JRERequirement, resolve_jre_req),
    (QTRequirement, resolve_qt_req),
    (X11Requirement, resolve_x11_req),
    (LibtoolRequirement, resolve_libtool_req),
    (PerlModuleRequirement, resolve_perl_module_req),
    (PerlFileRequirement, resolve_perl_file_req),
    (AutoconfMacroRequirement, resolve_autoconf_macro_req),
    (PythonModuleRequirement, resolve_python_module_req),
    (PythonPackageRequirement, resolve_python_package_req),
    (CertificateAuthorityRequirement, resolve_ca_req),
    (CargoCrateRequirement, resolve_cargo_crate_req),
]


def resolve_requirement_apt(apt_mgr, req: Requirement) -> List[AptRequirement]:
    for rr_class, rr_fn in APT_REQUIREMENT_RESOLVERS:
        if isinstance(req, rr_class):
            ret = rr_fn(apt_mgr, req)
            if not ret:
                return []
            if not isinstance(ret, list):
                raise TypeError(ret)
            return ret
    raise NotImplementedError(type(req))
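The dispatch in `resolve_requirement_apt` is a plain first-match walk over (class, handler) pairs. A minimal model of that shape, with hypothetical stand-in classes and handlers:

```python
# First isinstance match wins; an unhandled requirement type raises.
class Requirement:
    pass


class BinaryRequirement(Requirement):
    def __init__(self, binary_name):
        self.binary_name = binary_name


RESOLVERS = [
    (BinaryRequirement, lambda req: ["bin:%s" % req.binary_name]),
]


def resolve(req):
    for rr_class, rr_fn in RESOLVERS:
        if isinstance(req, rr_class):
            return rr_fn(req) or []
    raise NotImplementedError(type(req))


assert resolve(BinaryRequirement("make")) == ["bin:make"]
```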


class AptResolver(Resolver):
    def __init__(self, apt, tie_breakers=None):
        self.apt = apt
        if tie_breakers is None:
            tie_breakers = []
        self.tie_breakers = tie_breakers

    def __str__(self):
        return "apt"

    def __repr__(self):
        return "%s(%r, %r)" % (type(self).__name__, self.apt, self.tie_breakers)

    @classmethod
    def from_session(cls, session, tie_breakers=None):
        return cls(AptManager.from_session(session), tie_breakers=tie_breakers)

    def install(self, requirements):
        missing = []
@@ -595,7 +732,29 @@ class AptResolver(Resolver):
            if apt_req is not None:
                apt_requirements.append((r, apt_req))
        if apt_requirements:
            yield (
                self.apt.satisfy_command(
                    [
                        PkgRelation.str(
                            chain(*[r.relations for o, r in apt_requirements])
                        )
                    ]
                ),
                [o for o, r in apt_requirements],
            )

    def resolve(self, req: Requirement):
        ret = resolve_requirement_apt(self.apt, req)
        if not ret:
            return None
        if len(ret) == 1:
            return ret[0]
        logging.info("Need to break tie between %r with %r", ret, self.tie_breakers)
        for tie_breaker in self.tie_breakers:
            winner = tie_breaker(ret)
            if winner is not None:
                if not isinstance(winner, AptRequirement):
                    raise TypeError(winner)
                return winner
        logging.info("Unable to break tie over %r, picking first: %r", ret, ret[0])
        return ret[0]
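A tie-breaker of the shape `resolve()` expects takes the candidate list and returns one winner or `None` to pass. Real tie-breakers receive `AptRequirement` objects; this hypothetical example uses plain package-name strings for illustration:

```python
# Hypothetical tie-breaker: when several packages ship the needed file,
# prefer the one whose name looks like a -dev package; otherwise abstain.
def prefer_dev_package(candidates):
    for candidate in candidates:
        if candidate.endswith("-dev"):
            return candidate
    return None


assert prefer_dev_package(["libfoo1", "libfoo-dev"]) == "libfoo-dev"
assert prefer_dev_package(["libfoo1", "libfoo-bin"]) is None
```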
View file
@@ -16,11 +16,25 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

from typing import Optional, List, Dict, Tuple

import sys
import subprocess


class NoSessionOpen(Exception):
    """There is no session open."""

    def __init__(self, session):
        self.session = session


class SessionAlreadyOpen(Exception):
    """There is already a session open."""

    def __init__(self, session):
        self.session = session


class Session(object):
    def __enter__(self) -> "Session":
        return self
@@ -41,6 +55,7 @@ class Session(object):
        cwd: Optional[str] = None,
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
        close_fds: bool = True,
    ):
        raise NotImplementedError(self.check_call)
@@ -74,12 +89,27 @@ class Session(object):
    def scandir(self, path: str):
        raise NotImplementedError(self.scandir)

    def setup_from_vcs(
        self, tree, include_controldir: Optional[bool] = None, subdir="package"
    ) -> Tuple[str, str]:
        raise NotImplementedError(self.setup_from_vcs)

    def setup_from_directory(self, path, subdir="package") -> Tuple[str, str]:
        raise NotImplementedError(self.setup_from_directory)

    def external_path(self, path: str) -> str:
        raise NotImplementedError

    is_temporary: bool


class SessionSetupFailure(Exception):
    """Session failed to be set up."""


def run_with_tee(session: Session, args: List[str], **kwargs):
    if "stdin" not in kwargs:
        kwargs["stdin"] = subprocess.DEVNULL
    p = session.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kwargs)
    contents = []
    while p.poll() is None:
@@ -88,3 +118,19 @@ def run_with_tee(session: Session, args: List[str], **kwargs):
        sys.stdout.buffer.flush()
        contents.append(line.decode("utf-8", "surrogateescape"))
    return p.returncode, contents
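A standalone version of the tee loop above, run directly with `subprocess` instead of a session, and simplified to read lines to EOF rather than polling:

```python
import subprocess

# Stream the child's combined stdout/stderr while keeping a transcript,
# with stdin detached the same way run_with_tee does by default.
p = subprocess.Popen(
    ["sh", "-c", "echo one; echo two"],
    stdout=subprocess.PIPE,
    stderr=subprocess.STDOUT,
    stdin=subprocess.DEVNULL,
)
contents = []
for line in p.stdout:
    contents.append(line.decode("utf-8", "surrogateescape"))
p.wait()
assert p.returncode == 0
assert contents == ["one\n", "two\n"]
```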


def get_user(session):
    return session.check_output(["echo", "$USER"], cwd="/").decode().strip()


def which(session, name):
    try:
        ret = session.check_output(["which", name], cwd="/").decode().strip()
    except subprocess.CalledProcessError as e:
        if e.returncode == 1:
            return None
        raise
    if not ret:
        return None
    return ret
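For a plain (non-chroot) session, the standard library already answers the same question as `which()` above without spawning a subprocess:

```python
import shutil

# shutil.which searches PATH locally; None means not found, matching
# the None-on-exit-code-1 convention of which() above.
assert shutil.which("sh") is not None
assert shutil.which("definitely-not-a-real-binary-42") is None
```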
View file
@@ -16,10 +16,13 @@
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

from . import Session, NoSessionOpen, SessionAlreadyOpen

import contextlib
import os
import subprocess
import tempfile
from typing import Optional, Dict, List


class PlainSession(Session):
@@ -27,20 +30,63 @@ class PlainSession(Session):
    location = "/"

    def __init__(self):
        self.es = None

    def _prepend_user(self, user, args):
        if self.es is None:
            raise NoSessionOpen(self)
        if user is not None:
            import getpass

            if user != getpass.getuser():
                args = ["sudo", "-u", user] + args
        return args
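The sudo-prefixing decision in `_prepend_user` can be sketched in isolation; `current_user` here stands in for the `getpass.getuser()` call above:

```python
# Commands only get a sudo prefix when they must run as a different user
# than the one owning the session.
def prepend_user(user, args, current_user="builder"):
    if user is not None and user != current_user:
        args = ["sudo", "-u", user] + args
    return args


assert prepend_user(None, ["ls"]) == ["ls"]
assert prepend_user("builder", ["ls"]) == ["ls"]
assert prepend_user("root", ["whoami"]) == ["sudo", "-u", "root", "whoami"]
```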
    def __repr__(self):
        return "%s()" % (type(self).__name__,)

    def __enter__(self) -> "Session":
        if self.es is not None:
            raise SessionAlreadyOpen(self)
        self.es = contextlib.ExitStack()
        self.es.__enter__()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if self.es is None:
            raise NoSessionOpen(self)
        self.es.__exit__(exc_type, exc_val, exc_tb)
        self.es = None
        return False

    def create_home(self):
        pass

    def check_call(
        self,
        argv: List[str],
        cwd: Optional[str] = None,
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
        close_fds: bool = True,
    ):
        argv = self._prepend_user(user, argv)
        return subprocess.check_call(argv, cwd=cwd, env=env, close_fds=close_fds)

    def check_output(
        self,
        argv: List[str],
        cwd: Optional[str] = None,
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
    ) -> bytes:
        argv = self._prepend_user(user, argv)
        return subprocess.check_output(argv, cwd=cwd, env=env)

    def Popen(
        self, args, stdout=None, stderr=None, stdin=None, user=None, cwd=None, env=None
    ):
        args = self._prepend_user(user, args)
        return subprocess.Popen(
            args, stdout=stdout, stderr=stderr, stdin=stdin, cwd=cwd, env=env
        )

    def exists(self, path):
        return os.path.exists(path)
@@ -50,3 +96,27 @@ class PlainSession(Session):
    def chdir(self, path):
        os.chdir(path)

    def external_path(self, path):
        return os.path.abspath(path)

    def setup_from_vcs(self, tree, include_controldir=None, subdir="package"):
        from ..vcs import dupe_vcs_tree, export_vcs_tree

        if include_controldir is False or (
            not hasattr(tree, "base") and include_controldir is None
        ):
            td = self.es.enter_context(tempfile.TemporaryDirectory())
            export_vcs_tree(tree, td)
            return td, td
        elif not hasattr(tree, "base"):
            td = self.es.enter_context(tempfile.TemporaryDirectory())
            dupe_vcs_tree(tree, td)
            return td, td
        else:
            return tree.base, tree.base

    def setup_from_directory(self, path, subdir="package"):
        return path, path

    is_temporary = False
View file
@@ -19,11 +19,12 @@ import logging
import os
import shlex
import subprocess
import tempfile

from typing import Optional, List, Dict

from . import Session, SessionSetupFailure, NoSessionOpen, SessionAlreadyOpen


class SchrootSession(Session):
@@ -31,6 +32,7 @@ class SchrootSession(Session):
    _cwd: Optional[str]
    _location: Optional[str]
    chroot: str
    session_id: Optional[str]

    def __init__(self, chroot: str):
        if not isinstance(chroot, str):
@@ -38,8 +40,11 @@ class SchrootSession(Session):
        self.chroot = chroot
        self._location = None
        self._cwd = None
        self.session_id = None

    def _get_location(self) -> str:
        if self.session_id is None:
            raise NoSessionOpen(self)
        return (
            subprocess.check_output(
                ["schroot", "--location", "-c", "session:" + self.session_id]
@@ -48,10 +53,29 @@ class SchrootSession(Session):
            .decode()
        )

    def _end_session(self) -> bool:
        if self.session_id is None:
            raise NoSessionOpen(self)
        try:
            subprocess.check_output(
                ["schroot", "-c", "session:" + self.session_id, "-e"],
                stderr=subprocess.PIPE,
            )
        except subprocess.CalledProcessError as e:
            for line in e.stderr.splitlines(False):
                if line.startswith(b"E: "):
                    logging.error("%s", line[3:].decode(errors="replace"))
            logging.warning(
                "Failed to close schroot session %s, leaving stray.", self.session_id
            )
            self.session_id = None
            return False
        self.session_id = None
        return True
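The stderr filtering in `_end_session` surfaces only schroot's `E: `-prefixed lines as errors. The same extraction, checked against a sample stderr blob (the message text is illustrative):

```python
# Keep only "E: " lines from schroot's stderr; everything else is noise.
stderr = b"I: some info\nE: 10mount: failed to unmount\n"
errors = [
    line[3:].decode(errors="replace")
    for line in stderr.splitlines(False)
    if line.startswith(b"E: ")
]
assert errors == ["10mount: failed to unmount"]
```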
    def __enter__(self) -> "Session":
        if self.session_id is not None:
            raise SessionAlreadyOpen(self)
        try:
            self.session_id = (
                subprocess.check_output(["schroot", "-c", self.chroot, "-b"])
@@ -86,6 +110,8 @@ class SchrootSession(Session):
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
    ):
        if self.session_id is None:
            raise NoSessionOpen(self)
        base_argv = ["schroot", "-r", "-c", "session:" + self.session_id]
        if cwd is None:
            cwd = self._cwd
@@ -113,9 +139,12 @@ class SchrootSession(Session):
        cwd: Optional[str] = None,
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
        close_fds: bool = True,
    ):
        try:
            subprocess.check_call(
                self._run_argv(argv, cwd, user, env=env), close_fds=close_fds
            )
        except subprocess.CalledProcessError as e:
            raise subprocess.CalledProcessError(e.returncode, argv)
@ -151,19 +180,49 @@ class SchrootSession(Session):
            .decode()
            .rstrip("\n")
        )
        logging.info("Creating directory %s in schroot session.", home)
        self.check_call(["mkdir", "-p", home], cwd="/", user="root")
        self.check_call(["chown", user, home], cwd="/", user="root")

    def external_path(self, path: str) -> str:
        if self._cwd is None:
            raise ValueError("no cwd set")
        return os.path.join(self.location, os.path.join(self._cwd, path).lstrip("/"))

    def exists(self, path: str) -> bool:
        fullpath = self.external_path(path)
        return os.path.exists(fullpath)

    def scandir(self, path: str):
        fullpath = self.external_path(path)
        return os.scandir(fullpath)

    def setup_from_vcs(
        self, tree, include_controldir: Optional[bool] = None, subdir="package"
    ):
        from ..vcs import dupe_vcs_tree, export_vcs_tree

        build_dir = os.path.join(self.location, "build")
        directory = tempfile.mkdtemp(dir=build_dir)
        reldir = "/" + os.path.relpath(directory, self.location)
        export_directory = os.path.join(directory, subdir)
        if not include_controldir:
            export_vcs_tree(tree, export_directory)
        else:
            dupe_vcs_tree(tree, export_directory)
        return export_directory, os.path.join(reldir, subdir)

    def setup_from_directory(self, path, subdir="package"):
        import shutil

        build_dir = os.path.join(self.location, "build")
        directory = tempfile.mkdtemp(dir=build_dir)
        reldir = "/" + os.path.relpath(directory, self.location)
        export_directory = os.path.join(directory, subdir)
        shutil.copytree(path, export_directory, dirs_exist_ok=True)
        return export_directory, os.path.join(reldir, subdir)

    is_temporary = True

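The open/close guards in `__enter__`/`__exit__` above follow the standard context-manager protocol: opening twice raises `SessionAlreadyOpen`, closing an unopened session raises `NoSessionOpen`, and a failed close logs and leaves the session id cleared. A minimal, schroot-free sketch of the same pattern (`FakeSession` and the `"fake-session"` id are illustrative stand-ins; the real methods shell out to `schroot -b` / `schroot -e`):

```python
class SessionAlreadyOpen(Exception):
    pass


class NoSessionOpen(Exception):
    pass


class FakeSession:
    def __init__(self):
        self.session_id = None

    def __enter__(self):
        if self.session_id is not None:
            raise SessionAlreadyOpen(self)
        self.session_id = "fake-session"  # the real code runs: schroot -c <chroot> -b
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        if self.session_id is None:
            raise NoSessionOpen(self)
        self.session_id = None  # the real code runs: schroot -c session:<id> -e
        return False  # never swallow exceptions from the with-block


with FakeSession() as session:
    assert session.session_id == "fake-session"
assert session.session_id is None
```

Note that `setup_from_directory` above relies on `shutil.copytree(..., dirs_exist_ok=True)`, which requires Python 3.8 or newer.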
View file

@@ -35,8 +35,9 @@ from ..debian.fix_build import (
    resolve_error,
    versioned_package_fixers,
    apt_fixers,
    DebianPackagingContext,
)

from breezy.commit import NullCommitReporter
from breezy.tests import TestCaseWithTransport
@@ -44,10 +45,17 @@ class DummyAptSearcher(FileSearcher):
    def __init__(self, files):
        self._apt_files = files

    def search_files(self, path, regex=False, case_insensitive=False):
        for p, pkg in sorted(self._apt_files.items()):
            if case_insensitive:
                flags = re.I
            else:
                flags = 0
            if regex:
                if re.match(path, p, flags):
                    yield pkg
            elif case_insensitive:
                if path.lower() == p.lower():
                    yield pkg
            else:
                if path == p:
@@ -97,16 +105,15 @@ blah (0.1) UNRELEASED; urgency=medium
        session = PlainSession()
        apt = AptManager(session)
        apt._searchers = [DummyAptSearcher(self._apt_files)]
        context = DebianPackagingContext(
            self.tree,
            subpath="",
            committer="ognibuild <ognibuild@jelmer.uk>",
            update_changelog=True,
            commit_reporter=NullCommitReporter(),
        )
        fixers = versioned_package_fixers(session, context, apt) + apt_fixers(apt, context)
        return resolve_error(error, ("build",), fixers)

    def get_build_deps(self):
        with open(self.tree.abspath("debian/control"), "r") as f:

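The new `case_insensitive` matching in `search_files` can be exercised on its own; this sketch duplicates the generator body above outside the test class (the sample file map is made up for illustration):

```python
import re


def search_files(apt_files, path, regex=False, case_insensitive=False):
    # Regex matches honour re.I when case_insensitive is set;
    # literal matches fall back to a lowercased comparison.
    flags = re.I if case_insensitive else 0
    for p, pkg in sorted(apt_files.items()):
        if regex:
            if re.match(path, p, flags):
                yield pkg
        elif case_insensitive:
            if path.lower() == p.lower():
                yield pkg
        else:
            if path == p:
                yield pkg


files = {"/usr/bin/Foo": "foo", "/usr/bin/bar": "bar"}
assert list(search_files(files, "/usr/bin/foo")) == []
assert list(search_files(files, "/usr/bin/foo", case_insensitive=True)) == ["foo"]
assert list(search_files(files, "/usr/bin/.*", regex=True)) == ["foo", "bar"]
```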
View file

@@ -28,9 +28,9 @@ from buildlog_consultant.sbuild import (

from . import DetailedFailure


def export_vcs_tree(tree, directory, subpath=""):
    try:
        export(tree, directory, "dir", None, subdir=(subpath or None))
    except OSError as e:
        if e.errno == errno.ENOSPC:
            raise DetailedFailure(1, ["export"], NoSpaceOnDevice())

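The `export_vcs_tree` change keeps the existing errno-translation pattern: a low-level `OSError` with `ENOSPC` is re-raised as a structured failure that log analysis can recognise. A self-contained sketch of that pattern, with stand-in `DetailedFailure`/`NoSpaceOnDevice` classes rather than the real ognibuild imports:

```python
import errno


class NoSpaceOnDevice:
    """Stand-in for the buildlog-consultant problem class."""


class DetailedFailure(Exception):
    def __init__(self, retcode, argv, error):
        self.retcode = retcode
        self.argv = argv
        self.error = error


def export_tree(directory):
    try:
        # Simulate the export hitting a full disk.
        raise OSError(errno.ENOSPC, "No space left on device")
    except OSError as e:
        if e.errno == errno.ENOSPC:
            raise DetailedFailure(1, ["export"], NoSpaceOnDevice())
        raise


try:
    export_tree("/tmp/build")
except DetailedFailure as f:
    assert f.retcode == 1
    assert isinstance(f.error, NoSpaceOnDevice)
```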
View file

@@ -7,3 +7,8 @@ update_version {
  match: "^ version=\"(.*)\",$"
  new_line: " version=\"$VERSION\","
}
update_version {
  path: "ognibuild/__init__.py"
  match: "^__version__ = \\((.*)\\)$"
  new_line: "__version__ = $TUPLED_VERSION"
}
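The new stanza's `match` pattern (the config-level `\\(` unescapes to `\(`) can be sanity-checked against the line format it is meant to rewrite in `ognibuild/__init__.py`:

```python
import re

pattern = r"^__version__ = \((.*)\)$"
m = re.match(pattern, "__version__ = (0, 0, 3)")
assert m is not None
# The captured group is what $TUPLED_VERSION would replace.
assert m.group(1) == "0, 0, 3"
```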

View file

@@ -2,8 +2,12 @@
banned-modules = silver-platter = Should not use silver-platter

[mypy]
# A number of ognibuild's dependencies don't have type hints yet
ignore_missing_imports = True

[bdist_wheel]
universal = 1

[egg_info]
tag_build =
tag_date = 0

View file

@@ -6,7 +6,7 @@ from setuptools import setup

setup(name="ognibuild",
      description="Detect and run any build system",
      version="0.0.3",
      maintainer="Jelmer Vernooij",
      maintainer_email="jelmer@jelmer.uk",
      license="GNU GPLv2 or later",
@@ -29,12 +29,12 @@ setup(name="ognibuild",
      },
      install_requires=[
          'breezy',
          'buildlog-consultant>=0.0.4',
          'requirements-parser',
      ],
      extras_require={
          'debian': ['debmutate', 'python_debian', 'python_apt'],
      },
      tests_require=['python_debian', 'buildlog-consultant', 'breezy', 'testtools'],
      test_suite='ognibuild.tests.test_suite',
      )
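The bumped floor `buildlog-consultant>=0.0.4` is an ordinary minimum-version specifier. As a rough illustration of the ordering such a bound relies on (simplified integer-tuple comparison, not full PEP 440 parsing):

```python
def as_tuple(version):
    # "0.0.4" -> (0, 0, 4); tuples compare element-wise, left to right.
    return tuple(int(part) for part in version.split("."))


assert as_tuple("0.0.4") >= as_tuple("0.0.4")   # exact floor satisfies >=
assert as_tuple("0.0.3") < as_tuple("0.0.4")    # older release is rejected
assert as_tuple("0.1.0") > as_tuple("0.0.4")    # later minor release passes
```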