New upstream snapshot.

Jelmer Vernooij 2021-02-28 14:50:15 +00:00
commit 7964046324
No known key found for this signature in database
GPG key ID: 579C160D4C9E23E8
39 changed files with 4683 additions and 259 deletions

.flake8: new file (+5)

@@ -0,0 +1,5 @@
[flake8]
extend-ignore = E203, E266, E501, W293, W291
max-line-length = 88
max-complexity = 18
select = B,C,E,F,W,T4,B9


@@ -8,8 +8,8 @@ jobs:
     runs-on: ${{ matrix.os }}
     strategy:
       matrix:
-        os: [ubuntu-latest, macos-latest, windows-latest]
-        python-version: [3.7, 3.8, pypy3]
+        os: [ubuntu-latest, macos-latest]
+        python-version: [3.7, 3.8]
       fail-fast: false
     steps:
@@ -20,8 +20,11 @@ jobs:
         python-version: ${{ matrix.python-version }}
     - name: Install dependencies
       run: |
-        python -m pip install --upgrade pip flake8
+        python -m pip install --upgrade pip flake8 cython
+        python -m pip install git+https://github.com/jelmer/buildlog-consultant
         python setup.py develop
+        mkdir -p ~/.config/breezy/plugins
+        brz branch lp:brz-debian ~/.config/breezy/plugins/debian
     - name: Style checks
       run: |
         python -m flake8

.gitignore (vendored): +4

@@ -3,3 +3,7 @@ build
 ognibuild.egg-info
 dist
 __pycache__
+.eggs
+*.swp
+*.swo
+*.swn

AUTHORS: new file (+1)

@@ -0,0 +1 @@
Jelmer Vernooij <jelmer@jelmer.uk>

CODE_OF_CONDUCT.md: new file (+76)

@@ -0,0 +1,76 @@
# Contributor Covenant Code of Conduct
## Our Pledge
In the interest of fostering an open and welcoming environment, we as
contributors and maintainers pledge to making participation in our project and
our community a harassment-free experience for everyone, regardless of age, body
size, disability, ethnicity, sex characteristics, gender identity and expression,
level of experience, education, socio-economic status, nationality, personal
appearance, race, religion, or sexual identity and orientation.
## Our Standards
Examples of behavior that contributes to creating a positive environment
include:
* Using welcoming and inclusive language
* Being respectful of differing viewpoints and experiences
* Gracefully accepting constructive criticism
* Focusing on what is best for the community
* Showing empathy towards other community members
Examples of unacceptable behavior by participants include:
* The use of sexualized language or imagery and unwelcome sexual attention or
advances
* Trolling, insulting/derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or electronic
address, without explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Our Responsibilities
Project maintainers are responsible for clarifying the standards of acceptable
behavior and are expected to take appropriate and fair corrective action in
response to any instances of unacceptable behavior.
Project maintainers have the right and responsibility to remove, edit, or
reject comments, commits, code, wiki edits, issues, and other contributions
that are not aligned to this Code of Conduct, or to ban temporarily or
permanently any contributor for other behaviors that they deem inappropriate,
threatening, offensive, or harmful.
## Scope
This Code of Conduct applies both within project spaces and in public spaces
when an individual is representing the project or its community. Examples of
representing a project or community include using an official project e-mail
address, posting via an official social media account, or acting as an appointed
representative at an online or offline event. Representation of a project may be
further defined and clarified by project maintainers.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported by contacting the project lead at jelmer@jelmer.uk. All
complaints will be reviewed and investigated and will result in a response that
is deemed necessary and appropriate to the circumstances. The project team is
obligated to maintain confidentiality with regard to the reporter of an incident.
Further details of specific enforcement policies may be posted separately.
Project maintainers who do not follow or enforce the Code of Conduct in good
faith may face temporary or permanent repercussions as determined by other
members of the project's leadership.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4,
available at https://www.contributor-covenant.org/version/1/4/code-of-conduct.html
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see
https://www.contributor-covenant.org/faq


@@ -31,6 +31,9 @@ Ognibuild has a number of subcommands:
 * ``ogni install`` - install the package
 * ``ogni test`` - run the testsuite in the source directory
+
+It also includes a subcommand that can fix up the build dependencies
+for Debian packages, called deb-fix-build.
 License
 -------

SECURITY.md: new file (+10)

@@ -0,0 +1,10 @@
# Security Policy
## Supported Versions
ognibuild is still under heavy development. Only the latest version
receives security support.
## Reporting a Vulnerability
Please report security issues by e-mail to jelmer@jelmer.uk, ideally PGP encrypted to the key at https://jelmer.uk/D729A457.asc

TODO: new file (+2)

@@ -0,0 +1,2 @@
- Need to be able to check up front whether a requirement is satisfied, before attempting to install it (which is more expensive)
- Cache parsed Contents files during test suite runs and/or speed up reading

debian/changelog (vendored): +6

@@ -1,3 +1,9 @@
+ognibuild (0.0.1~git20210228.2528295-1) UNRELEASED; urgency=medium
+
+  * New upstream snapshot.
+
+ -- Jelmer Vernooij <jelmer@debian.org>  Sun, 28 Feb 2021 14:49:57 +0000
+
 ognibuild (0.0.1~git20201031.4cbc8df-1) unstable; urgency=low
   * Initial release. Closes: #981913

notes/architecture.md: new file (+51)

@@ -0,0 +1,51 @@
Upstream requirements are expressed as objects derived from UpstreamRequirement.
They can either be:
* extracted from the build system
* extracted from errors in build logs
The details of UpstreamRequirements are specific to the kind of requirement,
and otherwise opaque to ognibuild.
When building a package, we first make sure that all declared upstream
requirements are met.
Then we attempt to build.
If any Problems are found in the log, buildlog-consultant will report them.
ognibuild can then invoke "fixers" to address Problems. Fixers can do things
like e.g. upgrade configure.ac to a newer version, or invoke autoreconf.
A list of possible fixers can be provided. Each fixer will be called
(in order) until one of them claims to have fixed the issue.
Problems can be converted to UpstreamRequirements by UpstreamRequirementFixer.
UpstreamRequirementFixer uses an UpstreamRequirementResolver object that
can translate UpstreamRequirement objects into apt package names or
e.g. cpan commands.
ognibuild keeps finding problems, resolving them, and rebuilding until it
finds a problem it cannot resolve or one it thinks it has already resolved
(i.e. seen before).
Operations are run in a Session - this can represent a virtualized
environment of some sort (e.g. a chroot or virtualenv) or simply
on the host machine.
For e.g. a PerlModuleRequirement, we need to be able to:
* install from apt package
+ DebianInstallFixer(AptResolver()).fix(problem)
* update debian package (source, runtime, test) deps to include apt package
+ DebianPackageDepFixer(AptResolver()).fix(problem, ('test', 'foo'))
* suggest command to run to install from apt package
+ DebianInstallFixer(AptResolver()).command(problem)
* install from cpan
+ CpanInstallFixer().fix(problem)
* suggest command to run to install from cpan package
+ CpanInstallFixer().command(problem)
* update source package reqs to depend on perl module
+ PerlDepFixer().fix(problem)
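The resolve-and-rebuild loop these notes describe can be sketched roughly as follows. `Problem`, `Fixer` and `fix_build` here are illustrative stand-ins, not the actual ognibuild API:

```python
# Illustrative sketch of the resolve-and-rebuild loop described above.
# Problem, Fixer and fix_build are stand-ins, not the real ognibuild API.

class Problem:
    def __init__(self, kind):
        self.kind = kind

class Fixer:
    def can_fix(self, problem):
        raise NotImplementedError

    def fix(self, problem):
        raise NotImplementedError

def fix_build(build_once, fixers, max_iterations=10):
    """Keep building until success, an unfixable problem, or a repeat."""
    seen = set()
    for _ in range(max_iterations):
        problems = build_once()
        if not problems:
            return True  # build succeeded
        for problem in problems:
            if problem.kind in seen:
                return False  # seen before: assume it was not really fixed
            seen.add(problem.kind)
            # Call each fixer in order until one claims to have fixed it.
            for fixer in fixers:
                if fixer.can_fix(problem) and fixer.fix(problem):
                    break
            else:
                return False  # no fixer could address the problem
    return False
```

Tracking already-seen problems is what stops the loop from retrying a fix that did not actually work.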

notes/roadmap.md: new file (+44)

@@ -0,0 +1,44 @@
class UpstreamRequirement(object):
family: str
class PythonPackageRequirement(UpstreamRequirement):
package: str
SetupPy.get_build_requirements() yields some PythonPackageRequirement objects
apt_resolver.install([PythonPackageRequirement(...)]) then:
* needs to translate to apt package name
Once we find errors during build, buildlog consultant extracts them ("MissingPythonPackage", "configure.ac needs updating").
fix_build then takes the problem found and converts it to an action:
* modifying some of the source files
* resolving requirements
Resolving requirements means creating e.g. a PythonPackageRequirement() object and feeding it to resolver.install()
we have specific handlers for each kind of requirement
resolver.install() needs to translate the upstream information to an apt name or a cpan name or update dependencies or raise an exception or..
MissingPythonPackage() -> PythonPackageRequirement()
PythonPackageRequirement() can either:
* directly provide apt names, if they are known
* look up apt names
We specifically want to support multiple resolvers. In some cases a resolver can't deal with a particular kind of requirement.
Who is responsible for taking a PythonPackageRequirement and translating it to an apt package name?
1) PythonPackageRequirement itself? That would mean knowledge about package naming etc, is with the requirement object, which seems wrong.
2) PythonPackageRequirement.apt_name(apt_archive) - i.e. find the package name given an archive object of some sort
3) The apt resolver has a list of callbacks to map requirements to apt package names
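Option 3 above can be sketched as an apt resolver that holds a list of callbacks translating requirement objects into apt package names. All names here are illustrative, and the naive `python3-` prefix rule stands in for a real archive lookup:

```python
# Sketch of option 3: callbacks map requirement objects to apt package names.
# Illustrative only; a real resolver would consult the archive, not guess.

class PythonPackageRequirement:
    def __init__(self, package):
        self.package = package

def python_package_to_apt(req):
    # Debian convention for Python 3 libraries: python3-<name>.
    if isinstance(req, PythonPackageRequirement):
        return "python3-%s" % req.package.lower().replace("_", "-")
    return None

class AptResolver:
    def __init__(self, callbacks):
        self.callbacks = callbacks

    def resolve(self, req):
        # Try each registered callback until one knows this requirement kind.
        for callback in self.callbacks:
            name = callback(req)
            if name is not None:
                return name
        return None  # no callback could translate this requirement
```

Under this rule, `resolve(PythonPackageRequirement("dulwich"))` yields `"python3-dulwich"`; keeping the mapping in callbacks keeps naming knowledge out of the requirement objects themselves, which is the concern raised in point 1.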


@@ -18,210 +18,55 @@
 import os
 import stat
-import subprocess
-import sys
-from typing import List
-DEFAULT_PYTHON = 'python3'
+class DetailedFailure(Exception):
+    def __init__(self, retcode, argv, error):
+        self.retcode = retcode
+        self.argv = argv
+        self.error = error
 class UnidentifiedError(Exception):
-    def __init__(self, retcode, argv, lines):
+    def __init__(self, retcode, argv, lines, secondary=None):
         self.retcode = retcode
         self.argv = argv
         self.lines = lines
+        self.secondary = secondary
-class NoBuildToolsFound(Exception):
-    """No supported build tools were found."""
 def shebang_binary(p):
     if not (os.stat(p).st_mode & stat.S_IEXEC):
         return None
-    with open(p, 'rb') as f:
+    with open(p, "rb") as f:
         firstline = f.readline()
-        if not firstline.startswith(b'#!'):
+        if not firstline.startswith(b"#!"):
             return None
-        args = firstline[2:].split(b' ')
-        if args[0] in (b'/usr/bin/env', b'env'):
-            return os.path.basename(args[1].decode())
-        return os.path.basename(args[0].decode())
+        args = firstline[2:].split(b" ")
+        if args[0] in (b"/usr/bin/env", b"env"):
+            return os.path.basename(args[1].decode()).strip()
+        return os.path.basename(args[0].decode()).strip()
-def note(m):
-    sys.stdout.write('%s\n' % m)
+class UpstreamRequirement(object):
+    # Name of the family of requirements - e.g. "python-package"
+    family: str
+    def __init__(self, family):
+        self.family = family
+    def met(self, session):
+        raise NotImplementedError(self)
-def warning(m):
-    sys.stderr.write('WARNING: %s\n' % m)
+class UpstreamOutput(object):
+    def __init__(self, family, name):
+        self.family = family
+        self.name = name
-def run_with_tee(session, args: List[str], **kwargs):
-    p = session.Popen(
-        args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kwargs)
-    contents = []
-    while p.poll() is None:
-        line = p.stdout.readline()
-        sys.stdout.buffer.write(line)
-        sys.stdout.buffer.flush()
-        contents.append(line.decode('utf-8', 'surrogateescape'))
-    return p.returncode, contents
+    def __repr__(self):
+        return "%s(%r, %r)" % (type(self).__name__, self.family, self.name)
-def run_apt(session, args: List[str]) -> None:
-    args = ['apt', '-y'] + args
-    retcode, lines = run_with_tee(session, args, cwd='/', user='root')
-    if retcode == 0:
-        return
-    raise UnidentifiedError(retcode, args, lines)
-def apt_install(session, packages: List[str]) -> None:
-    run_apt(session, ['install'] + packages)
-def run_with_build_fixer(session, args):
-    session.check_call(args)
-def run_dist(session):
-    # TODO(jelmer): Check $PATH rather than hardcoding?
-    if not os.path.exists('/usr/bin/git'):
-        apt_install(session, ['git'])
-    # Some things want to write to the user's home directory,
-    # e.g. pip caches in ~/.cache
-    session.create_home()
-    if os.path.exists('package.xml'):
-        apt_install(session, ['php-pear', 'php-horde-core'])
-        note('Found package.xml, assuming pear package.')
-        session.check_call(['pear', 'package'])
-        return
-    if os.path.exists('pyproject.toml'):
-        import toml
-        with open('pyproject.toml', 'r') as pf:
-            pyproject = toml.load(pf)
-        if 'poetry' in pyproject.get('tool', []):
-            note('Found pyproject.toml with poetry section, '
-                 'assuming poetry project.')
-            apt_install(session, ['python3-venv', 'python3-pip'])
-            session.check_call(['pip3', 'install', 'poetry'], user='root')
-            session.check_call(['poetry', 'build', '-f', 'sdist'])
-            return
-    if os.path.exists('setup.py'):
-        note('Found setup.py, assuming python project.')
-        apt_install(session, ['python3', 'python3-pip'])
-        with open('setup.py', 'r') as f:
-            setup_py_contents = f.read()
-        try:
-            with open('setup.cfg', 'r') as f:
-                setup_cfg_contents = f.read()
-        except FileNotFoundError:
-            setup_cfg_contents = ''
-        if 'setuptools' in setup_py_contents:
-            note('Reference to setuptools found, installing.')
-            apt_install(session, ['python3-setuptools'])
-        if ('setuptools_scm' in setup_py_contents or
-                'setuptools_scm' in setup_cfg_contents):
-            note('Reference to setuptools-scm found, installing.')
-            apt_install(
-                session, ['python3-setuptools-scm', 'git', 'mercurial'])
-        # TODO(jelmer): Install setup_requires
-        interpreter = shebang_binary('setup.py')
-        if interpreter is not None:
-            if interpreter == 'python2' or interpreter.startswith('python2.'):
-                apt_install(session, [interpreter])
-            elif (interpreter == 'python3' or
-                    interpreter.startswith('python3.')):
-                apt_install(session, [interpreter])
-            else:
-                apt_install(session, [DEFAULT_PYTHON])
-            run_with_build_fixer(session, ['./setup.py', 'sdist'])
-        else:
-            # Just assume it's Python 3
-            apt_install(session, ['python3'])
-            run_with_build_fixer(session, ['python3', './setup.py', 'sdist'])
-        return
-    if os.path.exists('setup.cfg'):
-        note('Found setup.cfg, assuming python project.')
-        apt_install(session, ['python3-pep517', 'python3-pip'])
-        session.check_call(['python3', '-m', 'pep517.build', '-s', '.'])
-        return
-    if os.path.exists('dist.ini') and not os.path.exists('Makefile.PL'):
-        apt_install(session, ['libdist-inkt-perl'])
-        with open('dist.ini', 'rb') as f:
-            for line in f:
-                if not line.startswith(b';;'):
-                    continue
-                try:
-                    (key, value) = line[2:].split(b'=', 1)
-                except ValueError:
-                    continue
-                if (key.strip() == b'class' and
-                        value.strip().startswith(b"'Dist::Inkt")):
-                    note('Found Dist::Inkt section in dist.ini, '
-                         'assuming distinkt.')
-                    # TODO(jelmer): install via apt if possible
-                    session.check_call(
-                        ['cpan', 'install', value.decode().strip("'")],
-                        user='root')
-                    run_with_build_fixer(session, ['distinkt-dist'])
-                    return
-        # Default to invoking Dist::Zilla
-        note('Found dist.ini, assuming dist-zilla.')
-        apt_install(session, ['libdist-zilla-perl'])
-        run_with_build_fixer(session, ['dzil', 'build', '--in', '..'])
-        return
-    if os.path.exists('package.json'):
-        apt_install(session, ['npm'])
-        run_with_build_fixer(session, ['npm', 'pack'])
-        return
-    gemfiles = [name for name in os.listdir('.') if name.endswith('.gem')]
-    if gemfiles:
-        apt_install(session, ['gem2deb'])
-        if len(gemfiles) > 1:
-            warning('More than one gemfile. Trying the first?')
-        run_with_build_fixer(session, ['gem2tgz', gemfiles[0]])
-        return
-    if os.path.exists('waf'):
-        apt_install(session, ['python3'])
-        run_with_build_fixer(session, ['./waf', 'dist'])
-        return
-    if os.path.exists('Makefile.PL') and not os.path.exists('Makefile'):
-        apt_install(session, ['perl'])
-        run_with_build_fixer(session, ['perl', 'Makefile.PL'])
-    if not os.path.exists('Makefile') and not os.path.exists('configure'):
-        if os.path.exists('autogen.sh'):
-            if shebang_binary('autogen.sh') is None:
-                run_with_build_fixer(session, ['/bin/sh', './autogen.sh'])
-            else:
-                run_with_build_fixer(session, ['./autogen.sh'])
-        elif os.path.exists('configure.ac') or os.path.exists('configure.in'):
-            apt_install(session, [
-                'autoconf', 'automake', 'gettext', 'libtool', 'gnu-standards'])
-            run_with_build_fixer(session, ['autoreconf', '-i'])
-    if not os.path.exists('Makefile') and os.path.exists('configure'):
-        session.check_call(['./configure'])
-    if os.path.exists('Makefile'):
-        apt_install(session, ['make'])
-        run_with_build_fixer(session, ['make', 'dist'])
-    raise NoBuildToolsFound()
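The shebang parsing in `shebang_binary()` above (including the `.strip()` added in this commit to drop the trailing newline) can be exercised standalone. This sketch mirrors only the interpreter detection, without the file-permission check; the function name is illustrative:

```python
import os

def parse_shebang(firstline: bytes):
    # Mirrors the interpreter detection in shebang_binary(); takes the first
    # line of a script rather than a path, for easy standalone testing.
    if not firstline.startswith(b"#!"):
        return None
    args = firstline[2:].split(b" ")
    if args[0] in (b"/usr/bin/env", b"env"):
        # '#!/usr/bin/env python3' style: the interpreter is the argument.
        return os.path.basename(args[1].decode()).strip()
    # '#!/bin/sh' style: the interpreter is the path itself; strip() removes
    # the trailing newline that readline() would have left in place.
    return os.path.basename(args[0].decode()).strip()
```

Without the `.strip()`, the `env`-less branch would return e.g. `"sh\n"`, which is why the commit adds it.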


@@ -15,34 +15,161 @@
 # along with this program; if not, write to the Free Software
 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
+import logging
 import os
 import sys
-from . import run_dist, NoBuildToolsFound, note
+from . import UnidentifiedError
+from .buildsystem import NoBuildToolsFound, detect_buildsystems
+from .resolver import (
+    auto_resolver,
+    native_resolvers,
+    UnsatisfiedRequirements,
+)
+from .resolver.apt import AptResolver
-def main():
+def get_necessary_declared_requirements(resolver, requirements, stages):
+    missing = []
+    for stage, req in requirements:
+        if stage in stages:
+            missing.append(req)
+    return missing
+def install_necessary_declared_requirements(resolver, buildsystem, stages):
+    missing = []
+    try:
+        declared_reqs = buildsystem.get_declared_dependencies()
+    except NotImplementedError:
+        logging.warning(
+            'Unable to determine declared dependencies from %s', buildsystem)
+    else:
+        missing.extend(
+            get_necessary_declared_requirements(
+                resolver, declared_reqs, stages
+            )
+        )
+    resolver.install(missing)
+STAGE_MAP = {
+    "dist": [],
+    "info": [],
+    "install": ["build"],
+    "test": ["test", "dev"],
+    "build": ["build"],
+    "clean": [],
+}
+def determine_fixers(session, resolver):
+    from .buildlog import UpstreamRequirementFixer
+    from .resolver.apt import AptResolver
+    return [UpstreamRequirementFixer(resolver)]
+def main():  # noqa: C901
     import argparse
     parser = argparse.ArgumentParser()
-    parser.add_argument('subcommand', type=str, choices=['dist'])
     parser.add_argument(
-        '--directory', '-d', type=str, help='Directory for project.',
-        default='.')
+        "--directory", "-d", type=str, help="Directory for project.", default="."
+    )
+    parser.add_argument("--schroot", type=str, help="schroot to run in.")
     parser.add_argument(
-        '--schroot', type=str, help='schroot to run in.')
+        "--resolve",
+        choices=["apt", "native", "auto"],
+        default="auto",
+        help="What to do about missing dependencies",
+    )
+    parser.add_argument(
+        "--explain",
+        action='store_true',
+        help="Explain what needs to be done rather than making changes")
+    parser.add_argument(
+        "--ignore-declared-dependencies",
+        "--optimistic",
+        action="store_true",
+        help="Ignore declared dependencies, follow build errors only",
+    )
+    parser.add_argument(
+        "--verbose",
+        action="store_true",
+        help="Be verbose")
+    subparsers = parser.add_subparsers(dest='subcommand')
+    subparsers.add_parser('dist')
+    subparsers.add_parser('build')
+    subparsers.add_parser('clean')
+    subparsers.add_parser('test')
+    subparsers.add_parser('info')
+    install_parser = subparsers.add_parser('install')
+    install_parser.add_argument(
+        '--user', action='store_true', help='Install in local-user directories.')
     args = parser.parse_args()
+    if not args.subcommand:
+        parser.print_usage()
+        return 1
+    if args.verbose:
+        logging.basicConfig(level=logging.DEBUG)
+    else:
+        logging.basicConfig(level=logging.INFO)
     if args.schroot:
         from .session.schroot import SchrootSession
         session = SchrootSession(args.schroot)
     else:
         from .session.plain import PlainSession
         session = PlainSession()
     with session:
+        if args.resolve == "apt":
+            resolver = AptResolver.from_session(session)
+        elif args.resolve == "native":
+            resolver = native_resolvers(session)
+        elif args.resolve == "auto":
+            resolver = auto_resolver(session)
+        logging.info('Using requirement resolver: %s', resolver)
         os.chdir(args.directory)
         try:
-            if args.subcommand == 'dist':
-                run_dist(session)
+            bss = list(detect_buildsystems(args.directory))
+            logging.info('Detected buildsystems: %r', bss)
+            if not args.ignore_declared_dependencies and not args.explain:
+                stages = STAGE_MAP[args.subcommand]
+                if stages:
+                    for bs in bss:
+                        install_necessary_declared_requirements(
+                            resolver, bs, stages)
+            fixers = determine_fixers(session, resolver)
+            if args.subcommand == "dist":
+                from .dist import run_dist
+                run_dist(
+                    session=session, buildsystems=bss, resolver=resolver,
+                    fixers=fixers)
+            if args.subcommand == "build":
+                from .build import run_build
+                run_build(
+                    session, buildsystems=bss, resolver=resolver,
+                    fixers=fixers)
+            if args.subcommand == "clean":
+                from .clean import run_clean
+                run_clean(
+                    session, buildsystems=bss, resolver=resolver,
+                    fixers=fixers)
+            if args.subcommand == "install":
+                from .install import run_install
+                run_install(
+                    session, buildsystems=bss, resolver=resolver,
+                    fixers=fixers, user=args.user)
+            if args.subcommand == "test":
+                from .test import run_test
+                run_test(session, buildsystems=bss, resolver=resolver,
+                         fixers=fixers)
+            if args.subcommand == "info":
+                from .info import run_info
+                run_info(session, buildsystems=bss)
+        except UnidentifiedError:
+            return 1
         except NoBuildToolsFound:
-            note('No build tools found.')
+            logging.info("No build tools found.")
             return 1
     return 0
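The `STAGE_MAP` introduced in this file controls which declared dependencies are installed up front for a given subcommand, via `get_necessary_declared_requirements()`. Its filtering behaviour can be sketched in isolation (the sample declared-dependency list below is made up):

```python
# How STAGE_MAP filters declared dependencies: only requirements whose stage
# is relevant to the chosen subcommand get installed before the build.

STAGE_MAP = {
    "dist": [],
    "info": [],
    "install": ["build"],
    "test": ["test", "dev"],
    "build": ["build"],
    "clean": [],
}

def necessary_requirements(declared, subcommand):
    # declared is a list of (stage, requirement) pairs, as returned by
    # a buildsystem's get_declared_dependencies().
    stages = STAGE_MAP[subcommand]
    return [req for stage, req in declared if stage in stages]
```

So `dist` and `clean` install nothing up front and rely purely on build-error fixers, while `test` pulls in both test and dev requirements.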

ognibuild/build.py: new file (+30)

@@ -0,0 +1,30 @@
#!/usr/bin/python3
# Copyright (C) 2020-2021 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from .buildsystem import NoBuildToolsFound


def run_build(session, buildsystems, resolver, fixers):

    # Some things want to write to the user's home directory,
    # e.g. pip caches in ~/.cache
    session.create_home()

    for buildsystem in buildsystems:
        buildsystem.build(session, resolver, fixers)
        return

    raise NoBuildToolsFound()

ognibuild/buildlog.py: new file (+206)

@@ -0,0 +1,206 @@
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
"""Convert problems found in the buildlog to upstream requirements.
"""
import logging
from buildlog_consultant.common import (
MissingConfigStatusInput,
MissingPythonModule,
MissingPythonDistribution,
MissingCHeader,
MissingPkgConfig,
MissingCommand,
MissingFile,
MissingJavaScriptRuntime,
MissingSprocketsFile,
MissingGoPackage,
MissingPerlFile,
MissingPerlModule,
MissingXmlEntity,
MissingJDKFile,
MissingNodeModule,
MissingPhpClass,
MissingRubyGem,
MissingLibrary,
MissingJavaClass,
MissingCSharpCompiler,
MissingConfigure,
MissingAutomakeInput,
MissingRPackage,
MissingRubyFile,
MissingAutoconfMacro,
MissingValaPackage,
MissingXfceDependency,
MissingHaskellDependencies,
NeedPgBuildExtUpdateControl,
DhAddonLoadFailure,
MissingMavenArtifacts,
GnomeCommonMissing,
MissingGnomeCommonDependency,
)
from .fix_build import BuildFixer
from .requirements import (
BinaryRequirement,
PathRequirement,
PkgConfigRequirement,
CHeaderRequirement,
JavaScriptRuntimeRequirement,
ValaPackageRequirement,
RubyGemRequirement,
GoPackageRequirement,
DhAddonRequirement,
PhpClassRequirement,
RPackageRequirement,
NodePackageRequirement,
LibraryRequirement,
RubyFileRequirement,
XmlEntityRequirement,
SprocketsFileRequirement,
JavaClassRequirement,
HaskellPackageRequirement,
MavenArtifactRequirement,
GnomeCommonRequirement,
JDKFileRequirement,
PerlModuleRequirement,
PerlFileRequirement,
AutoconfMacroRequirement,
PythonModuleRequirement,
PythonPackageRequirement,
)
def problem_to_upstream_requirement(problem):
    if isinstance(problem, MissingFile):
        return PathRequirement(problem.path)
    elif isinstance(problem, MissingCommand):
        return BinaryRequirement(problem.command)
    elif isinstance(problem, MissingPkgConfig):
        return PkgConfigRequirement(
            problem.module, problem.minimum_version)
    elif isinstance(problem, MissingCHeader):
        return CHeaderRequirement(problem.header)
    elif isinstance(problem, MissingJavaScriptRuntime):
        return JavaScriptRuntimeRequirement()
    elif isinstance(problem, MissingRubyGem):
        return RubyGemRequirement(problem.gem, problem.version)
    elif isinstance(problem, MissingValaPackage):
        return ValaPackageRequirement(problem.package)
    elif isinstance(problem, MissingGoPackage):
        return GoPackageRequirement(problem.package)
    elif isinstance(problem, DhAddonLoadFailure):
        return DhAddonRequirement(problem.path)
    elif isinstance(problem, MissingPhpClass):
        return PhpClassRequirement(problem.php_class)
    elif isinstance(problem, MissingRPackage):
        return RPackageRequirement(problem.package, problem.minimum_version)
    elif isinstance(problem, MissingNodeModule):
        return NodePackageRequirement(problem.module)
    elif isinstance(problem, MissingLibrary):
        return LibraryRequirement(problem.library)
    elif isinstance(problem, MissingRubyFile):
        return RubyFileRequirement(problem.filename)
    elif isinstance(problem, MissingXmlEntity):
        return XmlEntityRequirement(problem.url)
    elif isinstance(problem, MissingSprocketsFile):
        return SprocketsFileRequirement(problem.content_type, problem.name)
    elif isinstance(problem, MissingJavaClass):
        return JavaClassRequirement(problem.classname)
    elif isinstance(problem, MissingHaskellDependencies):
        return [HaskellPackageRequirement(dep) for dep in problem.deps]
    elif isinstance(problem, MissingMavenArtifacts):
        return [MavenArtifactRequirement(artifact)
                for artifact in problem.artifacts]
    elif isinstance(problem, MissingCSharpCompiler):
        return BinaryRequirement('msc')
    elif isinstance(problem, GnomeCommonMissing):
        return GnomeCommonRequirement()
    elif isinstance(problem, MissingJDKFile):
        return JDKFileRequirement(problem.jdk_path, problem.filename)
    elif isinstance(problem, MissingGnomeCommonDependency):
        if problem.package == "glib-gettext":
            return BinaryRequirement('glib-gettextize')
        else:
            logging.warning(
                "No known command for gnome-common dependency %s",
                problem.package)
            return None
    elif isinstance(problem, MissingXfceDependency):
        if problem.package == "gtk-doc":
            return BinaryRequirement("gtkdocize")
        else:
            logging.warning(
                "No known command for xfce dependency %s",
                problem.package)
            return None
    elif isinstance(problem, MissingPerlModule):
        return PerlModuleRequirement(
            module=problem.module,
            filename=problem.filename,
            inc=problem.inc)
    elif isinstance(problem, MissingPerlFile):
        return PerlFileRequirement(filename=problem.filename)
    elif isinstance(problem, MissingAutoconfMacro):
        return AutoconfMacroRequirement(problem.macro)
    elif isinstance(problem, MissingPythonModule):
        return PythonModuleRequirement(
            problem.module,
            python_version=problem.python_version,
            minimum_version=problem.minimum_version)
    elif isinstance(problem, MissingPythonDistribution):
        return PythonPackageRequirement(
            problem.module,
            python_version=problem.python_version,
            minimum_version=problem.minimum_version)
    else:
        return None
class UpstreamRequirementFixer(BuildFixer):

    def __init__(self, resolver):
        self.resolver = resolver

    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.resolver)

    def __str__(self):
        return "upstream requirement fixer(%s)" % self.resolver

    def can_fix(self, error):
        req = problem_to_upstream_requirement(error)
        return req is not None

    def fix(self, error, context):
        reqs = problem_to_upstream_requirement(error)
        if reqs is None:
            return False
        if not isinstance(reqs, list):
            reqs = [reqs]
        changed = False
        for req in reqs:
            package = self.resolver.resolve(req)
            if package is None:
                return False
            if context.add_dependency(package):
                changed = True
        return changed
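The `fix()` flow above resolves each requirement derived from a build error and records it as a dependency. A toy walk-through, with `FakeResolver` and `FakeContext` as stand-ins for the real resolver and packaging context:

```python
# Toy walk-through of the fix() flow: each requirement is resolved to a
# package name and added as a dependency. These classes are stand-ins.

class FakeResolver:
    def resolve(self, req):
        # A real resolver maps requirement objects to package names.
        return {"binary:pkg-config": "pkg-config"}.get(req)

class FakeContext:
    def __init__(self):
        self.deps = []

    def add_dependency(self, package):
        if package in self.deps:
            return False  # already present, nothing changed
        self.deps.append(package)
        return True

def fix_requirements(reqs, resolver, context):
    changed = False
    for req in reqs:
        package = resolver.resolve(req)  # resolve one requirement at a time
        if package is None:
            return False  # an unresolvable requirement aborts the fix
        if context.add_dependency(package):
            changed = True
    return changed
```

The return value distinguishes "everything was already in place" (`False` with all resolutions succeeding) from "a dependency was actually added" (`True`), which is what the build loop uses to decide whether to retry.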

ognibuild/buildsystem.py: new file (+633)

@@ -0,0 +1,633 @@
#!/usr/bin/python
# Copyright (C) 2019-2020 Jelmer Vernooij <jelmer@jelmer.uk>
# encoding: utf-8
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import logging
import os
import re
from typing import Optional
import warnings
from . import shebang_binary, UpstreamOutput, UnidentifiedError
from .requirements import (
BinaryRequirement,
PythonPackageRequirement,
PerlModuleRequirement,
NodePackageRequirement,
CargoCrateRequirement,
)
from .fix_build import run_with_build_fixers
class NoBuildToolsFound(Exception):
    """No supported build tools were found."""


class InstallTarget(object):

    # Whether to prefer user-specific installation
    user: Optional[bool]

    # TODO(jelmer): Add information about target directory, layout, etc.


class BuildSystem(object):
    """A particular buildsystem."""

    name: str

    def __str__(self):
        return self.name

    def dist(self, session, resolver, fixers):
        raise NotImplementedError(self.dist)

    def test(self, session, resolver, fixers):
        raise NotImplementedError(self.test)

    def build(self, session, resolver, fixers):
        raise NotImplementedError(self.build)

    def clean(self, session, resolver, fixers):
        raise NotImplementedError(self.clean)

    def install(self, session, resolver, fixers, install_target):
        raise NotImplementedError(self.install)

    def get_declared_dependencies(self):
        raise NotImplementedError(self.get_declared_dependencies)

    def get_declared_outputs(self):
        raise NotImplementedError(self.get_declared_outputs)


class Pear(BuildSystem):

    name = "pear"

    def __init__(self, path):
        self.path = path

    def setup(self, resolver):
        resolver.install([BinaryRequirement("pear")])

    def dist(self, session, resolver, fixers):
        self.setup(resolver)
        run_with_build_fixers(session, ["pear", "package"], fixers)

    def test(self, session, resolver, fixers):
        self.setup(resolver)
        run_with_build_fixers(session, ["pear", "run-tests"], fixers)

    def build(self, session, resolver, fixers):
        self.setup(resolver)
        run_with_build_fixers(session, ["pear", "build", self.path], fixers)

    def clean(self, session, resolver, fixers):
        self.setup(resolver)
        # TODO

    def install(self, session, resolver, fixers, install_target):
        self.setup(resolver)
        run_with_build_fixers(session, ["pear", "install", self.path], fixers)


class SetupPy(BuildSystem):

    name = "setup.py"

    def __init__(self, path):
        self.path = path
        from distutils.core import run_setup
        self.result = run_setup(os.path.abspath(path), stop_after="init")

    def __repr__(self):
        return "%s(%r)" % (type(self).__name__, self.path)

    def setup(self, resolver):
        with open(self.path, "r") as f:
            setup_py_contents = f.read()
        try:
            with open("setup.cfg", "r") as f:
                setup_cfg_contents = f.read()
        except FileNotFoundError:
            setup_cfg_contents = ""
        if "setuptools" in setup_py_contents:
            logging.debug("Reference to setuptools found, installing.")
            resolver.install([PythonPackageRequirement("setuptools")])
        if (
            "setuptools_scm" in setup_py_contents
            or "setuptools_scm" in setup_cfg_contents
        ):
            logging.debug("Reference to setuptools-scm found, installing.")
            resolver.install(
                [
                    PythonPackageRequirement("setuptools-scm"),
                    BinaryRequirement("git"),
                    BinaryRequirement("mercurial"),
                ]
            )
        # TODO(jelmer): Install setup_requires

    def test(self, session, resolver, fixers):
        self.setup(resolver)
        self._run_setup(session, resolver, ["test"], fixers)

    def build(self, session, resolver, fixers):
        self.setup(resolver)
        self._run_setup(session, resolver, ["build"], fixers)

    def dist(self, session, resolver, fixers):
        self.setup(resolver)
        self._run_setup(session, resolver, ["sdist"], fixers)

    def clean(self, session, resolver, fixers):
        self.setup(resolver)
        self._run_setup(session, resolver, ["clean"], fixers)

    def install(self, session, resolver, fixers, install_target):
        self.setup(resolver)
        extra_args = []
        if install_target.user:
            extra_args.append('--user')
self._run_setup(session, resolver, ["install"] + extra_args, fixers)
def _run_setup(self, session, resolver, args, fixers):
interpreter = shebang_binary("setup.py")
if interpreter is not None:
resolver.install([BinaryRequirement(interpreter)])
run_with_build_fixers(session, ["./setup.py"] + args, fixers)
else:
# Just assume it's Python 3
resolver.install([BinaryRequirement("python3")])
run_with_build_fixers(
session, ["python3", "./setup.py"] + args,
fixers)
def get_declared_dependencies(self):
for require in self.result.get_requires():
yield "build", PythonPackageRequirement(require)
# Not present for distutils-only packages
if getattr(self.result, 'install_requires', []):
for require in self.result.install_requires:
yield "install", PythonPackageRequirement(require)
# Not present for distutils-only packages
if getattr(self.result, 'tests_require', []):
for require in self.result.tests_require:
yield "test", PythonPackageRequirement(require)
def get_declared_outputs(self):
for script in self.result.scripts or []:
yield UpstreamOutput("binary", os.path.basename(script))
entry_points = getattr(self.result, 'entry_points', None) or {}
for script in entry_points.get("console_scripts", []):
yield UpstreamOutput("binary", script.split("=")[0])
for package in self.result.packages or []:
yield UpstreamOutput("python3", package)
class PyProject(BuildSystem):
name = "pyproject"
def __init__(self, path):
self.path = path
self.pyproject = self.load_toml()
def load_toml(self):
import toml
with open(self.path, "r") as pf:
return toml.load(pf)
def dist(self, session, resolver, fixers):
if "poetry" in self.pyproject.get("tool", {}):
logging.debug(
"Found pyproject.toml with poetry section, "
"assuming poetry project."
)
resolver.install(
[
PythonPackageRequirement("venv"),
PythonPackageRequirement("poetry"),
]
)
session.check_call(["poetry", "build", "-f", "sdist"])
return
raise AssertionError("no supported section in pyproject.toml")
class SetupCfg(BuildSystem):
name = "setup.cfg"
def __init__(self, path):
self.path = path
def setup(self, resolver):
resolver.install(
[
PythonPackageRequirement("pep517"),
]
)
def dist(self, session, resolver, fixers):
self.setup(resolver)
session.check_call(["python3", "-m", "pep517.build", "-s", "."])
class Npm(BuildSystem):
name = "npm"
def __init__(self, path):
import json
with open(path, "r") as f:
self.package = json.load(f)
def get_declared_dependencies(self):
if "devDependencies" in self.package:
for name, unused_version in self.package["devDependencies"].items():
# TODO(jelmer): Look at version
yield "dev", NodePackageRequirement(name)
def setup(self, resolver):
resolver.install([BinaryRequirement("npm")])
def dist(self, session, resolver, fixers):
self.setup(resolver)
run_with_build_fixers(session, ["npm", "pack"], fixers)
class Waf(BuildSystem):
name = "waf"
def __init__(self, path):
self.path = path
def setup(self, session, resolver, fixers):
resolver.install([BinaryRequirement("python3")])
def dist(self, session, resolver, fixers):
self.setup(session, resolver, fixers)
run_with_build_fixers(session, ["./waf", "dist"], fixers)
def test(self, session, resolver, fixers):
self.setup(session, resolver, fixers)
run_with_build_fixers(session, ["./waf", "test"], fixers)
class Gem(BuildSystem):
name = "gem"
def __init__(self, path):
self.path = path
def setup(self, resolver):
resolver.install([BinaryRequirement("gem2deb")])
def dist(self, session, resolver, fixers):
self.setup(resolver)
gemfiles = [
entry.name for entry in session.scandir(".") if entry.name.endswith(".gem")
]
if len(gemfiles) > 1:
logging.warning("More than one .gem file found; using the first.")
run_with_build_fixers(session, ["gem2tgz", gemfiles[0]], fixers)
class DistInkt(BuildSystem):
def __init__(self, path):
self.path = path
self.name = "dist-zilla"
self.dist_inkt_class = None
with open("dist.ini", "rb") as f:
for line in f:
if not line.startswith(b";;"):
continue
try:
(key, value) = line[2:].split(b"=", 1)
except ValueError:
continue
if key.strip() == b"class" and value.strip().startswith(b"'Dist::Inkt"):
logging.debug(
"Found Dist::Inkt section in dist.ini, "
"assuming distinkt."
)
self.name = "dist-inkt"
self.dist_inkt_class = value.decode().strip("'")
return
logging.debug("Found dist.ini, assuming dist-zilla.")
def setup(self, resolver):
resolver.install(
[
PerlModuleRequirement("Dist::Inkt"),
]
)
def dist(self, session, resolver, fixers):
self.setup(resolver)
if self.name == "dist-inkt":
resolver.install([PerlModuleRequirement(self.dist_inkt_class)])
run_with_build_fixers(session, ["distinkt-dist"], fixers)
else:
# Default to invoking Dist::Zilla
resolver.install([PerlModuleRequirement("Dist::Zilla")])
run_with_build_fixers(
session, ["dzil", "build", "--in", ".."], fixers)
class Make(BuildSystem):
name = "make"
def __repr__(self):
return "%s()" % type(self).__name__
def setup(self, session, resolver, fixers):
resolver.install([BinaryRequirement("make")])
if session.exists("Makefile.PL") and not session.exists("Makefile"):
resolver.install([BinaryRequirement("perl")])
run_with_build_fixers(session, ["perl", "Makefile.PL"], fixers)
if not session.exists("Makefile") and not session.exists("configure"):
if session.exists("autogen.sh"):
if shebang_binary("autogen.sh") is None:
run_with_build_fixers(
session, ["/bin/sh", "./autogen.sh"], fixers)
try:
run_with_build_fixers(
session, ["./autogen.sh"], fixers)
except UnidentifiedError as e:
if (
"Gnulib not yet bootstrapped; "
"run ./bootstrap instead.\n" in e.lines
):
run_with_build_fixers(
session, ["./bootstrap"], fixers)
run_with_build_fixers(
session, ["./autogen.sh"], fixers)
else:
raise
elif session.exists("configure.ac") or session.exists("configure.in"):
resolver.install(
[
BinaryRequirement("autoconf"),
BinaryRequirement("automake"),
BinaryRequirement("gettextize"),
BinaryRequirement("libtoolize"),
]
)
run_with_build_fixers(session, ["autoreconf", "-i"], fixers)
if not session.exists("Makefile") and session.exists("configure"):
session.check_call(["./configure"])
def build(self, session, resolver, fixers):
self.setup(session, resolver, fixers)
run_with_build_fixers(session, ["make", "all"], fixers)
def test(self, session, resolver, fixers):
self.setup(session, resolver, fixers)
run_with_build_fixers(session, ["make", "check"], fixers)
def install(self, session, resolver, fixers, install_target):
self.setup(session, resolver, fixers)
run_with_build_fixers(session, ["make", "install"], fixers)
def dist(self, session, resolver, fixers):
self.setup(session, resolver, fixers)
try:
run_with_build_fixers(session, ["make", "dist"], fixers)
except UnidentifiedError as e:
if "make: *** No rule to make target 'dist'. Stop.\n" in e.lines:
pass
elif "make[1]: *** No rule to make target 'dist'. Stop.\n" in e.lines:
pass
elif (
"Reconfigure the source tree "
"(via './config' or 'perl Configure'), please.\n"
) in e.lines:
run_with_build_fixers(session, ["./config"], fixers)
run_with_build_fixers(session, ["make", "dist"], fixers)
elif (
"Please try running 'make manifest' and then run "
"'make dist' again.\n" in e.lines
):
run_with_build_fixers(session, ["make", "manifest"], fixers)
run_with_build_fixers(session, ["make", "dist"], fixers)
elif "Please run ./configure first\n" in e.lines:
run_with_build_fixers(session, ["./configure"], fixers)
run_with_build_fixers(session, ["make", "dist"], fixers)
elif any(
[
re.match(
r"Makefile:[0-9]+: \*\*\* Missing \'Make.inc\' "
r"Run \'./configure \[options\]\' and retry. Stop.\n",
line,
)
for line in e.lines
]
):
run_with_build_fixers(session, ["./configure"], fixers)
run_with_build_fixers(session, ["make", "dist"], fixers)
elif any(
[
re.match(
r"Problem opening MANIFEST: No such file or directory "
r"at .* line [0-9]+\.",
line,
)
for line in e.lines
]
):
run_with_build_fixers(session, ["make", "manifest"], fixers)
run_with_build_fixers(session, ["make", "dist"], fixers)
else:
raise
else:
return
def get_declared_dependencies(self):
# TODO(jelmer): Split out the perl-specific stuff?
if os.path.exists("META.yml"):
# See http://module-build.sourceforge.net/META-spec-v1.4.html for
# the specification of the format.
import ruamel.yaml
import ruamel.yaml.reader
with open("META.yml", "rb") as f:
try:
data = ruamel.yaml.load(f, ruamel.yaml.SafeLoader)
except ruamel.yaml.reader.ReaderError as e:
warnings.warn("Unable to parse META.yml: %s" % e)
return
for require in data.get("requires", []):
yield "build", PerlModuleRequirement(require)
class Cargo(BuildSystem):
name = "cargo"
def __init__(self, path):
from toml.decoder import load
with open(path, "r") as f:
self.cargo = load(f)
def get_declared_dependencies(self):
if "dependencies" in self.cargo:
for name, details in self.cargo["dependencies"].items():
# TODO(jelmer): Look at details['features'], details['version']
yield "build", CargoCrateRequirement(name)
def test(self, session, resolver, fixers):
run_with_build_fixers(session, ["cargo", "test"], fixers)
class Golang(BuildSystem):
"""Go builds."""
name = "golang"
class Maven(BuildSystem):
name = "maven"
def __init__(self, path):
self.path = path
class Cabal(BuildSystem):
name = "cabal"
def __init__(self, path):
self.path = path
def __repr__(self):
return "%s(%r)" % (type(self).__name__, self.path)
def _run(self, session, args, fixers):
try:
run_with_build_fixers(
session, ["runhaskell", "Setup.hs"] + args, fixers)
except UnidentifiedError as e:
if "Run the 'configure' command first.\n" in e.lines:
run_with_build_fixers(
session, ["runhaskell", "Setup.hs", "configure"], fixers)
run_with_build_fixers(
session, ["runhaskell", "Setup.hs"] + args, fixers)
else:
raise
def test(self, session, resolver, fixers):
self._run(session, ["test"], fixers)
def detect_buildsystems(path, trust_package=False): # noqa: C901
"""Detect build systems."""
if os.path.exists(os.path.join(path, "package.xml")):
logging.debug("Found package.xml, assuming pear package.")
yield Pear("package.xml")
if os.path.exists(os.path.join(path, "setup.py")):
logging.debug("Found setup.py, assuming python project.")
yield SetupPy("setup.py")
elif os.path.exists(os.path.join(path, "pyproject.toml")):
logging.debug("Found pyproject.toml, assuming python project.")
yield PyProject("pyproject.toml")
elif os.path.exists(os.path.join(path, "setup.cfg")):
logging.debug("Found setup.cfg, assuming python project.")
yield SetupCfg("setup.cfg")
if os.path.exists(os.path.join(path, "package.json")):
logging.debug("Found package.json, assuming node package.")
yield Npm("package.json")
if os.path.exists(os.path.join(path, "waf")):
logging.debug("Found waf, assuming waf package.")
yield Waf("waf")
if os.path.exists(os.path.join(path, "Cargo.toml")):
logging.debug("Found Cargo.toml, assuming rust cargo package.")
yield Cargo("Cargo.toml")
if os.path.exists(os.path.join(path, 'Setup.hs')):
logging.debug("Found Setup.hs, assuming haskell package.")
yield Cabal('Setup.hs')
if os.path.exists(os.path.join(path, "pom.xml")):
logging.debug("Found pom.xml, assuming maven package.")
yield Maven("pom.xml")
if os.path.exists(os.path.join(path, "dist.ini")) and not os.path.exists(
os.path.join(path, "Makefile.PL")
):
yield DistInkt("dist.ini")
gemfiles = [entry.name for entry in os.scandir(path) if entry.name.endswith(".gem")]
if gemfiles:
yield Gem(gemfiles[0])
if any(
[
os.path.exists(os.path.join(path, p))
for p in [
"Makefile",
"Makefile.PL",
"autogen.sh",
"configure.ac",
"configure.in",
]
]
):
yield Make()
if os.path.exists(os.path.join(path, ".travis.yml")):
import ruamel.yaml.reader
with open(".travis.yml", "rb") as f:
try:
data = ruamel.yaml.load(f, ruamel.yaml.SafeLoader)
except ruamel.yaml.reader.ReaderError as e:
warnings.warn("Unable to parse .travis.yml: %s" % (e,))
else:
language = data.get("language")
if language == "go":
yield Golang()
for entry in os.scandir(path):
if entry.name.endswith(".go"):
yield Golang()
break
def get_buildsystem(path, trust_package=False):
for buildsystem in detect_buildsystems(path, trust_package=trust_package):
return buildsystem
raise NoBuildToolsFound()
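At its core, detect_buildsystems() probes the source tree for well-known marker files and yields a matching build system for each hit. A minimal, stdlib-only sketch of that idea (the marker table below is an illustrative subset, not ognibuild's full list):

```python
import os

# Illustrative subset of the marker files detect_buildsystems() probes for.
MARKERS = [
    ("package.xml", "pear"),
    ("setup.py", "setup.py"),
    ("pyproject.toml", "pyproject"),
    ("package.json", "npm"),
    ("Cargo.toml", "cargo"),
    ("pom.xml", "maven"),
]


def detect(path):
    """Yield build-system names for marker files present under path."""
    for marker, name in MARKERS:
        if os.path.exists(os.path.join(path, marker)):
            yield name
```

get_buildsystem() then simply takes the first detected system, raising NoBuildToolsFound when the generator is empty.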

ognibuild/clean.py
#!/usr/bin/python3
# Copyright (C) 2020-2021 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from .buildsystem import NoBuildToolsFound
def run_clean(session, buildsystems, resolver, fixers):
# Some things want to write to the user's home directory,
# e.g. pip caches in ~/.cache
session.create_home()
for buildsystem in buildsystems:
buildsystem.clean(session, resolver, fixers)
return
raise NoBuildToolsFound()

#!/usr/bin/python
# Copyright (C) 2018 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from debian.deb822 import Deb822
from ..session import Session
# TODO(jelmer): move this to debian/
def satisfy_build_deps(session: Session, tree):
source = Deb822(tree.get_file("debian/control"))
deps = []
for name in ["Build-Depends", "Build-Depends-Indep", "Build-Depends-Arch"]:
try:
deps.append(source[name].strip().strip(","))
except KeyError:
pass
for name in ["Build-Conflicts", "Build-Conflicts-Indep", "Build-Conflicts-Arch"]:
try:
deps.append("Conflicts: " + source[name])
except KeyError:
pass
deps = [dep.strip().strip(",") for dep in deps]
from .apt import AptManager
apt = AptManager(session)
apt.satisfy(deps)

ognibuild/debian/apt.py
#!/usr/bin/python
# Copyright (C) 2019-2020 Jelmer Vernooij <jelmer@jelmer.uk>
# encoding: utf-8
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import logging
import re
from typing import List, Iterator, Optional, Set
import os
from buildlog_consultant.apt import (
find_apt_get_failure,
)
from debian.deb822 import Release
from .. import DetailedFailure, UnidentifiedError
from ..session import Session, run_with_tee
def run_apt(session: Session, args: List[str]) -> None:
"""Run apt."""
args = ["apt", "-y"] + args
retcode, lines = run_with_tee(session, args, cwd="/", user="root")
if retcode == 0:
return
match, error = find_apt_get_failure(lines)
if error is not None:
raise DetailedFailure(retcode, args, error)
if match is not None:
raise UnidentifiedError(retcode, args, lines, secondary=(match.lineno, match.line))
while lines and lines[-1] == "":
lines.pop(-1)
raise UnidentifiedError(retcode, args, lines)
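run_apt() follows a pattern used throughout ognibuild: run the command, capture its transcript, then hand the lines to a matcher that either identifies a known failure (DetailedFailure) or gives up with the trimmed log (UnidentifiedError). A stdlib-only sketch of that control flow, with a stand-in matcher instead of buildlog-consultant's find_apt_get_failure (the exception classes here are simplified stand-ins as well):

```python
import subprocess


class DetailedFailure(Exception):
    def __init__(self, retcode, argv, error):
        self.retcode, self.argv, self.error = retcode, argv, error


class UnidentifiedError(Exception):
    def __init__(self, retcode, argv, lines):
        self.retcode, self.argv, self.lines = retcode, argv, lines


def run_checked(argv, find_failure):
    """Run argv; classify a non-zero exit via find_failure(lines)."""
    p = subprocess.run(argv, capture_output=True, text=True)
    if p.returncode == 0:
        return
    lines = p.stdout.splitlines() + p.stderr.splitlines()
    error = find_failure(lines)
    if error is not None:
        raise DetailedFailure(p.returncode, argv, error)
    # Trim trailing blank lines before reporting the raw log.
    while lines and lines[-1] == "":
        lines.pop()
    raise UnidentifiedError(p.returncode, argv, lines)
```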
class FileSearcher(object):
def search_files(self, path: str, regex: bool = False) -> Iterator[str]:
raise NotImplementedError(self.search_files)
class AptManager(object):
session: Session
_searchers: Optional[List[FileSearcher]]
def __init__(self, session):
self.session = session
self._apt_cache = None
self._searchers = None
def searchers(self):
if self._searchers is None:
self._searchers = [
RemoteAptContentsFileSearcher.from_session(self.session),
GENERATED_FILE_SEARCHER]
return self._searchers
def package_exists(self, package):
if self._apt_cache is None:
import apt
self._apt_cache = apt.Cache(rootdir=self.session.location)
return package in self._apt_cache
def get_package_for_paths(self, paths, regex=False):
logging.debug('Searching for packages containing %r', paths)
# TODO(jelmer): Make sure we use whatever is configured in self.session
return get_package_for_paths(paths, self.searchers(), regex=regex)
def missing(self, packages):
root = getattr(self.session, "location", "/")
status_path = os.path.join(root, "var/lib/dpkg/status")
missing = set(packages)
import apt_pkg
with apt_pkg.TagFile(status_path) as tagf:
while missing:
tagf.step()
if not tagf.section:
break
if tagf.section["Package"] in missing:
if tagf.section["Status"] == "install ok installed":
missing.remove(tagf.section["Package"])
return list(missing)
def install(self, packages: List[str]) -> None:
logging.info('Installing using apt: %r', packages)
packages = self.missing(packages)
if packages:
run_apt(self.session, ["install"] + packages)
def satisfy(self, deps: List[str]) -> None:
run_apt(self.session, ["satisfy"] + deps)
class ContentsFileNotFound(Exception):
"""The contents file was not found."""
class RemoteAptContentsFileSearcher(FileSearcher):
def __init__(self):
self._db = {}
@classmethod
def from_session(cls, session):
logging.info('Loading apt contents information')
# TODO(jelmer): what about sources.list.d?
from aptsources.sourceslist import SourcesList
sl = SourcesList()
sl.load(os.path.join(session.location, 'etc/apt/sources.list'))
return cls.from_sources_list(
sl,
cache_dirs=[
os.path.join(session.location, 'var/lib/apt/lists'),
'/var/lib/apt/lists'])
def __setitem__(self, path, package):
self._db[path] = package
def search_files(self, path, regex=False):
c = re.compile(path)
for p, pkg in sorted(self._db.items()):
if regex:
if c.match(p):
yield pkg
else:
if path == p:
yield pkg
def load_file(self, f):
for line in f:
(path, rest) = line.rsplit(maxsplit=1)
package = rest.split(b"/")[-1]
decoded_path = "/" + path.decode("utf-8", "surrogateescape")
self[decoded_path] = package.decode("utf-8")
@classmethod
def _load_cache_file(cls, url, cache_dir):
from urllib.parse import urlparse
parsed = urlparse(url)
p = os.path.join(
cache_dir,
parsed.hostname + parsed.path.replace('/', '_') + '.lz4')
if not os.path.exists(p):
return None
logging.debug('Loading cached contents file %s', p)
import lz4.frame
return lz4.frame.open(p, mode='rb')
@classmethod
def from_urls(cls, urls, cache_dirs=None):
self = cls()
for url, mandatory in urls:
for cache_dir in cache_dirs or []:
f = cls._load_cache_file(url, cache_dir)
if f is not None:
self.load_file(f)
break
else:
if not mandatory and self._db:
logging.debug(
'Not attempting to fetch optional contents '
'file %s', url)
else:
logging.debug('Fetching contents file %s', url)
try:
self.load_url(url)
except ContentsFileNotFound:
if mandatory:
logging.warning(
'Unable to fetch contents file %s', url)
else:
logging.debug(
'Unable to fetch optional contents file %s',
url)
return self
@classmethod
def from_sources_list(cls, sl, cache_dirs=None):
# TODO(jelmer): Use aptsources.sourceslist.SourcesList
from .build import get_build_architecture
# TODO(jelmer): Verify signatures, etc.
urls = []
arches = [(get_build_architecture(), True), ("all", False)]
for source in sl.list:
if source.invalid or source.disabled:
continue
if source.type == 'deb-src':
continue
if source.type != 'deb':
logging.warning("Unsupported source type in sources list: %r", source)
continue
base_url = source.uri.rstrip('/')
name = source.dist.rstrip('/')
components = source.comps
if components:
dists_url = base_url + "/dists"
else:
dists_url = base_url
if components:
for component in components:
for arch, mandatory in arches:
urls.append(
("%s/%s/%s/Contents-%s" % (
dists_url, name, component, arch), mandatory))
else:
for arch, mandatory in arches:
urls.append(
("%s/%s/Contents-%s" % (dists_url, name.rstrip('/'), arch), mandatory))
return cls.from_urls(urls, cache_dirs=cache_dirs)
@staticmethod
def _get(url):
from urllib.request import urlopen, Request
request = Request(url, headers={"User-Agent": "Debian Janitor"})
return urlopen(request)
def load_url(self, url, allow_cache=True):
from urllib.error import HTTPError
for ext in ['.xz', '.gz', '']:
try:
response = self._get(url + ext)
except HTTPError as e:
if e.status == 404:
continue
raise
break
else:
raise ContentsFileNotFound(url)
if ext == '.gz':
import gzip
f = gzip.GzipFile(fileobj=response)
elif ext == '.xz':
import lzma
from io import BytesIO
f = BytesIO(lzma.decompress(response.read()))
elif response.headers.get_content_type() == "text/plain":
f = response
else:
raise Exception(
"Unknown content type %r" % response.headers.get_content_type()
)
self.load_file(f)
class GeneratedFileSearcher(FileSearcher):
def __init__(self, db):
self._db = db
def search_files(self, path: str, regex: bool = False) -> Iterator[str]:
for p, pkg in sorted(self._db.items()):
if regex:
if re.match(path, p):
yield pkg
else:
if path == p:
yield pkg
# TODO(jelmer): read from a file
GENERATED_FILE_SEARCHER = GeneratedFileSearcher(
{
"/etc/locale.gen": "locales",
# Alternative
"/usr/bin/rst2html": "/usr/share/docutils/scripts/python3/rst2html",
}
)
def get_package_for_paths(
paths: List[str], searchers: List[FileSearcher], regex: bool = False) -> Optional[str]:
candidates: Set[str] = set()
for path in paths:
for searcher in searchers:
candidates.update(searcher.search_files(path, regex=regex))
if candidates:
break
if len(candidates) == 0:
logging.warning("No packages found that contain %r", paths)
return None
if len(candidates) > 1:
logging.warning(
"More than one package found that contains %r: %r", paths, candidates
)
# Heuristic tie-break: pick the candidate with the shortest name.
return sorted(candidates, key=len)[0]
else:
return candidates.pop()
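GeneratedFileSearcher and get_package_for_paths() together implement a simple path-to-package lookup with a shortest-name tie-break. A self-contained sketch of that selection logic (the in-memory database and package names below are invented for illustration):

```python
import re


class GeneratedFileSearcher:
    """Look up which package ships a path, from a fixed in-memory table."""

    def __init__(self, db):
        self._db = db

    def search_files(self, path, regex=False):
        for p, pkg in sorted(self._db.items()):
            if (regex and re.match(path, p)) or (not regex and path == p):
                yield pkg


def get_package_for_paths(paths, searchers, regex=False):
    candidates = set()
    for path in paths:
        for searcher in searchers:
            candidates.update(searcher.search_files(path, regex=regex))
            if candidates:
                break
    if not candidates:
        return None
    # Prefer the candidate with the shortest name when ambiguous.
    return sorted(candidates, key=len)[0]
```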

ognibuild/debian/build.py
#!/usr/bin/python
# Copyright (C) 2018 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
__all__ = [
"changes_filename",
"get_build_architecture",
"add_dummy_changelog_entry",
"build",
"SbuildFailure",
]
from datetime import datetime
import logging
import os
import re
import subprocess
import sys
from debian.changelog import Changelog
from debmutate.changelog import get_maintainer, format_datetime
from breezy import osutils
from breezy.mutabletree import MutableTree
from breezy.plugins.debian.builder import BuildFailedError
from buildlog_consultant.sbuild import (
worker_failure_from_sbuild_log,
SbuildFailure,
)
DEFAULT_BUILDER = "sbuild --no-clean-source"
class MissingChangesFile(Exception):
"""Expected changes file was not written."""
def __init__(self, filename):
self.filename = filename
def changes_filename(package, version, arch):
non_epoch_version = version.upstream_version
if version.debian_version is not None:
non_epoch_version += "-%s" % version.debian_version
return "%s_%s_%s.changes" % (package, non_epoch_version, arch)
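changes_filename() mirrors dpkg's naming scheme: the epoch never appears in the filename, and the Debian revision is appended when present. A string-only sketch of the same logic, without python-debian's Version object:

```python
def changes_filename(package, upstream_version, debian_version, arch):
    # The epoch (the "1:" in "1:2.0-3") is dropped from filenames.
    non_epoch_version = upstream_version
    if debian_version is not None:
        non_epoch_version += "-%s" % debian_version
    return "%s_%s_%s.changes" % (package, non_epoch_version, arch)
```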
def get_build_architecture():
try:
return subprocess.check_output(
['dpkg-architecture', '-qDEB_BUILD_ARCH']).strip().decode()
except subprocess.CalledProcessError as e:
raise Exception(
"Could not find the build architecture: %s" % e)
def add_dummy_changelog_entry(
tree: MutableTree,
subpath: str,
suffix: str,
suite: str,
message: str,
timestamp=None,
maintainer=None,
):
"""Add a dummy changelog entry to a package.
Args:
tree: Tree to modify
subpath: Path within the tree where the package lives
suffix: Suffix for the version
suite: Debian suite
message: Changelog message
"""
def add_suffix(v, suffix):
m = re.fullmatch(
"(.*)(" + re.escape(suffix) + ")([0-9]+)",
v,
)
if m:
return m.group(1) + m.group(2) + "%d" % (int(m.group(3)) + 1)
else:
return v + suffix + "1"
path = os.path.join(subpath, "debian", "changelog")
if maintainer is None:
maintainer = get_maintainer()
if timestamp is None:
timestamp = datetime.now()
with tree.get_file(path) as f:
cl = Changelog()
cl.parse_changelog(f, max_blocks=None, allow_empty_author=True, strict=False)
version = cl[0].version
if version.debian_revision:
version.debian_revision = add_suffix(version.debian_revision, suffix)
else:
version.upstream_version = add_suffix(version.upstream_version, suffix)
cl.new_block(
package=cl[0].package,
version=version,
urgency="low",
distributions=suite,
author="%s <%s>" % maintainer,
date=format_datetime(timestamp),
changes=["", " * " + message, ""],
)
cl_str = cl._format(allow_missing_author=True)
tree.put_file_bytes_non_atomic(path, cl_str.encode(cl._encoding))
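The version-suffix logic inside add_dummy_changelog_entry() either bumps an existing counter or starts one at 1. The same helper, extracted as a standalone function:

```python
import re


def add_suffix(v, suffix):
    # "1.0-1~jan+1" with suffix "~jan+" becomes "1.0-1~jan+2";
    # a version without the suffix gets "<suffix>1" appended.
    m = re.fullmatch("(.*)(" + re.escape(suffix) + ")([0-9]+)", v)
    if m:
        return m.group(1) + m.group(2) + "%d" % (int(m.group(3)) + 1)
    return v + suffix + "1"
```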
def get_latest_changelog_version(local_tree, subpath=""):
path = osutils.pathjoin(subpath, "debian/changelog")
with local_tree.get_file(path) as f:
cl = Changelog(f, max_blocks=1)
return cl.package, cl.version
def build(
local_tree,
outf,
build_command=DEFAULT_BUILDER,
result_dir=None,
distribution=None,
subpath="",
source_date_epoch=None,
):
args = [
sys.executable,
"-m",
"breezy",
"builddeb",
"--guess-upstream-branch-url",
"--builder=%s" % build_command,
]
if result_dir:
args.append("--result-dir=%s" % result_dir)
outf.write("Running %r\n" % (build_command,))
outf.flush()
env = dict(os.environ.items())
if distribution is not None:
env["DISTRIBUTION"] = distribution
if source_date_epoch is not None:
env["SOURCE_DATE_EPOCH"] = "%d" % source_date_epoch
logging.info("Building debian packages, running %r.", build_command)
try:
subprocess.check_call(
args, cwd=local_tree.abspath(subpath), stdout=outf, stderr=outf, env=env
)
except subprocess.CalledProcessError:
raise BuildFailedError()
def build_once(
local_tree,
build_suite,
output_directory,
build_command,
subpath="",
source_date_epoch=None,
):
build_log_path = os.path.join(output_directory, "build.log")
try:
with open(build_log_path, "w") as f:
build(
local_tree,
outf=f,
build_command=build_command,
result_dir=output_directory,
distribution=build_suite,
subpath=subpath,
source_date_epoch=source_date_epoch,
)
except BuildFailedError:
with open(build_log_path, "rb") as f:
raise worker_failure_from_sbuild_log(f)
(cl_package, cl_version) = get_latest_changelog_version(local_tree, subpath)
changes_name = changes_filename(cl_package, cl_version, get_build_architecture())
changes_path = os.path.join(output_directory, changes_name)
if not os.path.exists(changes_path):
raise MissingChangesFile(changes_name)
return (changes_name, cl_version)
def gbp_dch(path):
subprocess.check_call(["gbp", "dch"], cwd=path)
def attempt_build(
local_tree,
suffix,
build_suite,
output_directory,
build_command,
build_changelog_entry="Build for debian-janitor apt repository.",
subpath="",
source_date_epoch=None,
):
"""Attempt a build, with a custom distribution set.
Args:
local_tree: Tree to build in
suffix: Suffix to add to version string
build_suite: Name of suite (i.e. distribution) to build for
output_directory: Directory to write output to
build_command: Build command to build package
build_changelog_entry: Changelog entry to use
subpath: Sub path in tree where package lives
source_date_epoch: Source date epoch to set
Returns: Tuple with (changes_name, cl_version)
"""
add_dummy_changelog_entry(
local_tree, subpath, suffix, build_suite, build_changelog_entry
)
return build_once(
local_tree,
build_suite,
output_directory,
build_command,
subpath,
source_date_epoch=source_date_epoch,
)

#!/usr/bin/python
# Copyright (C) 2018 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
__all__ = [
"build_incrementally",
]
import logging
import os
import sys
from typing import List, Set, Optional
from debian.deb822 import (
Deb822,
PkgRelation,
)
from debian.changelog import Version
from breezy.commit import PointlessCommit
from breezy.mutabletree import MutableTree
from breezy.tree import Tree
from debmutate.control import (
ensure_some_version,
ensure_minimum_version,
ControlEditor,
)
from debmutate.debhelper import (
get_debhelper_compat_level,
)
from debmutate.deb822 import (
Deb822Editor,
)
from debmutate.reformatting import (
FormattingUnpreservable,
GeneratedFile,
)
from lintian_brush import (
reset_tree,
)
from lintian_brush.changelog import (
add_changelog_entry,
)
from debmutate._rules import (
dh_invoke_add_with,
update_rules,
)
from breezy.plugins.debian.changelog import debcommit
from buildlog_consultant import Problem
from buildlog_consultant.apt import (
AptFetchFailure,
)
from buildlog_consultant.common import (
MissingConfigStatusInput,
MissingAutomakeInput,
MissingConfigure,
NeedPgBuildExtUpdateControl,
MissingPythonModule,
MissingPythonDistribution,
MissingPerlFile,
)
from buildlog_consultant.sbuild import (
SbuildFailure,
)
from ..fix_build import BuildFixer, resolve_error, DependencyContext
from ..buildlog import UpstreamRequirementFixer
from ..resolver.apt import (
AptRequirement,
get_package_for_python_module,
)
from .build import attempt_build, DEFAULT_BUILDER
DEFAULT_MAX_ITERATIONS = 10
class CircularDependency(Exception):
"""Adding dependency would introduce cycle."""
def __init__(self, package):
self.package = package
class BuildDependencyContext(DependencyContext):
def add_dependency(self, requirement: AptRequirement):
return add_build_dependency(
self.tree,
requirement,
committer=self.committer,
subpath=self.subpath,
update_changelog=self.update_changelog,
)
class AutopkgtestDependencyContext(DependencyContext):
def __init__(
self, testname, tree, apt, subpath="", committer=None, update_changelog=True
):
self.testname = testname
super(AutopkgtestDependencyContext, self).__init__(
tree, apt, subpath, committer, update_changelog
)
def add_dependency(self, requirement):
return add_test_dependency(
self.tree,
self.testname,
requirement,
committer=self.committer,
subpath=self.subpath,
update_changelog=self.update_changelog,
)
def add_build_dependency(
tree: Tree,
requirement: AptRequirement,
committer: Optional[str] = None,
subpath: str = "",
update_changelog: bool = True,
):
if not isinstance(requirement, AptRequirement):
raise TypeError(requirement)
control_path = os.path.join(tree.abspath(subpath), "debian/control")
try:
with ControlEditor(path=control_path) as updater:
for binary in updater.binaries:
if binary["Package"] == requirement.package:
raise CircularDependency(requirement.package)
if requirement.minimum_version:
updater.source["Build-Depends"] = ensure_minimum_version(
updater.source.get("Build-Depends", ""),
requirement.package, requirement.minimum_version
)
else:
updater.source["Build-Depends"] = ensure_some_version(
updater.source.get("Build-Depends", ""),
requirement.package
)
except FormattingUnpreservable as e:
logging.info("Unable to edit %s in a way that preserves formatting.", e.path)
return False
if requirement.minimum_version:
desc = "%s (>= %s)" % (requirement.package, requirement.minimum_version)
else:
desc = requirement.package
if not updater.changed:
logging.info("Giving up; dependency %s was already present.", desc)
return False
logging.info("Adding build dependency: %s", desc)
return commit_debian_changes(
tree,
subpath,
"Add missing build dependency on %s." % desc,
committer=committer,
update_changelog=update_changelog,
)
def add_test_dependency(
tree,
testname,
requirement,
committer=None,
subpath="",
update_changelog=True,
):
if not isinstance(requirement, AptRequirement):
raise TypeError(requirement)
tests_control_path = os.path.join(tree.abspath(subpath), "debian/tests/control")
try:
with Deb822Editor(path=tests_control_path) as updater:
command_counter = 1
for control in updater.paragraphs:
try:
name = control["Tests"]
except KeyError:
name = "command%d" % command_counter
command_counter += 1
if name != testname:
continue
if requirement.minimum_version:
control["Depends"] = ensure_minimum_version(
control.get("Depends", ""),
requirement.package, requirement.minimum_version
)
else:
control["Depends"] = ensure_some_version(
control.get("Depends", ""), requirement.package
)
except FormattingUnpreservable as e:
logging.info("Unable to edit %s in a way that preserves formatting.", e.path)
return False
if not updater.changed:
return False
if requirement.minimum_version:
desc = "%s (>= %s)" % (
requirement.package, requirement.minimum_version)
else:
desc = requirement.package
logging.info("Adding dependency to test %s: %s", testname, desc)
    return commit_debian_changes(
        tree,
        subpath,
        "Add missing dependency for test %s on %s." % (testname, desc),
        committer=committer,
        update_changelog=update_changelog,
    )
def commit_debian_changes(
tree: MutableTree,
subpath: str,
summary: str,
committer: Optional[str] = None,
update_changelog: bool = True,
) -> bool:
with tree.lock_write():
try:
if update_changelog:
add_changelog_entry(
tree, os.path.join(subpath, "debian/changelog"), [summary]
)
debcommit(tree, committer=committer, subpath=subpath)
else:
tree.commit(
message=summary, committer=committer, specific_files=[subpath]
)
except PointlessCommit:
return False
else:
return True
def targeted_python_versions(tree: Tree) -> Set[str]:
with tree.get_file("debian/control") as f:
control = Deb822(f)
build_depends = PkgRelation.parse_relations(control.get("Build-Depends", ""))
all_build_deps: Set[str] = set()
for or_deps in build_depends:
all_build_deps.update(or_dep["name"] for or_dep in or_deps)
targeted = set()
if any(x.startswith("pypy") for x in all_build_deps):
targeted.add("pypy")
if any(x.startswith("python-") for x in all_build_deps):
targeted.add("cpython2")
if any(x.startswith("python3-") for x in all_build_deps):
targeted.add("cpython3")
return targeted
def fix_missing_python_distribution(error, context): # noqa: C901
targeted = targeted_python_versions(context.tree)
default = not targeted
pypy_pkg = context.apt.get_package_for_paths(
["/usr/lib/pypy/dist-packages/%s-.*.egg-info/PKG-INFO" % error.distribution], regex=True
)
if pypy_pkg is None:
pypy_pkg = "pypy-%s" % error.distribution
if not context.apt.package_exists(pypy_pkg):
pypy_pkg = None
py2_pkg = context.apt.get_package_for_paths(
["/usr/lib/python2\\.[0-9]/dist-packages/%s-.*.egg-info/PKG-INFO" % error.distribution],
regex=True,
)
if py2_pkg is None:
py2_pkg = "python-%s" % error.distribution
if not context.apt.package_exists(py2_pkg):
py2_pkg = None
py3_pkg = context.apt.get_package_for_paths(
["/usr/lib/python3/dist-packages/%s-.*.egg-info/PKG-INFO" % error.distribution],
regex=True,
)
if py3_pkg is None:
py3_pkg = "python3-%s" % error.distribution
if not context.apt.package_exists(py3_pkg):
py3_pkg = None
extra_build_deps = []
if error.python_version == 2:
if "pypy" in targeted:
if not pypy_pkg:
logging.warning("no pypy package found for %s", error.module)
else:
extra_build_deps.append(pypy_pkg)
if "cpython2" in targeted or default:
if not py2_pkg:
logging.warning("no python 2 package found for %s", error.module)
return False
extra_build_deps.append(py2_pkg)
elif error.python_version == 3:
if not py3_pkg:
logging.warning("no python 3 package found for %s", error.module)
return False
extra_build_deps.append(py3_pkg)
else:
if py3_pkg and ("cpython3" in targeted or default):
extra_build_deps.append(py3_pkg)
if py2_pkg and ("cpython2" in targeted or default):
extra_build_deps.append(py2_pkg)
if pypy_pkg and "pypy" in targeted:
extra_build_deps.append(pypy_pkg)
if not extra_build_deps:
return False
for dep_pkg in extra_build_deps:
assert dep_pkg is not None
if not context.add_dependency(
AptRequirement(
dep_pkg.package, minimum_version=error.minimum_version)):
return False
return True
def fix_missing_python_module(error, context):
if getattr(context, "tree", None) is not None:
targeted = targeted_python_versions(context.tree)
else:
targeted = set()
default = not targeted
pypy_pkg = get_package_for_python_module(context.apt, error.module, "pypy", None)
py2_pkg = get_package_for_python_module(context.apt, error.module, "python2", None)
py3_pkg = get_package_for_python_module(context.apt, error.module, "python3", None)
extra_build_deps = []
if error.python_version == 2:
if "pypy" in targeted:
if not pypy_pkg:
logging.warning("no pypy package found for %s", error.module)
else:
extra_build_deps.append(pypy_pkg)
if "cpython2" in targeted or default:
if not py2_pkg:
logging.warning("no python 2 package found for %s", error.module)
return False
extra_build_deps.append(py2_pkg)
elif error.python_version == 3:
if not py3_pkg:
logging.warning("no python 3 package found for %s", error.module)
return False
extra_build_deps.append(py3_pkg)
else:
if py3_pkg and ("cpython3" in targeted or default):
extra_build_deps.append(py3_pkg)
if py2_pkg and ("cpython2" in targeted or default):
extra_build_deps.append(py2_pkg)
if pypy_pkg and "pypy" in targeted:
extra_build_deps.append(pypy_pkg)
if not extra_build_deps:
return False
for dep_pkg in extra_build_deps:
assert dep_pkg is not None
if not context.add_dependency(
AptRequirement(dep_pkg.package, error.minimum_version)):
return False
return True
def retry_apt_failure(error, context):
return True
def enable_dh_autoreconf(context):
# Debhelper >= 10 depends on dh-autoreconf and enables autoreconf by
# default.
debhelper_compat_version = get_debhelper_compat_level(context.tree.abspath("."))
if debhelper_compat_version is not None and debhelper_compat_version < 10:
def add_with_autoreconf(line, target):
if target != b"%":
return line
if not line.startswith(b"dh "):
return line
return dh_invoke_add_with(line, b"autoreconf")
if update_rules(command_line_cb=add_with_autoreconf):
return context.add_dependency(AptRequirement("dh-autoreconf"))
return False
def fix_missing_configure(error, context):
if (not context.tree.has_filename("configure.ac") and
not context.tree.has_filename("configure.in")):
return False
return enable_dh_autoreconf(context)
def fix_missing_automake_input(error, context):
# TODO(jelmer): If it's ./NEWS, ./AUTHORS or ./README that's missing, then
# try to set 'export AUTOMAKE = automake --foreign' in debian/rules.
# https://salsa.debian.org/jelmer/debian-janitor/issues/88
return enable_dh_autoreconf(context)
def fix_missing_config_status_input(error, context):
autogen_path = "autogen.sh"
rules_path = "debian/rules"
if context.subpath not in (".", ""):
autogen_path = os.path.join(context.subpath, autogen_path)
rules_path = os.path.join(context.subpath, rules_path)
if not context.tree.has_filename(autogen_path):
return False
def add_autogen(mf):
rule = any(mf.iter_rules(b"override_dh_autoreconf"))
if rule:
return
rule = mf.add_rule(b"override_dh_autoreconf")
rule.append_command(b"dh_autoreconf ./autogen.sh")
if not update_rules(makefile_cb=add_autogen, path=rules_path):
return False
if context.update_changelog:
commit_debian_changes(
context.tree,
context.subpath,
"Run autogen.sh during build.",
committer=context.committer,
update_changelog=context.update_changelog,
)
return True
class PgBuildExtOutOfDateControlFixer(BuildFixer):
def __init__(self, session):
self.session = session
def can_fix(self, problem):
return isinstance(problem, NeedPgBuildExtUpdateControl)
def _fix(self, error, context):
logging.info("Running 'pg_buildext updatecontrol'")
self.session.check_call(["pg_buildext", "updatecontrol"])
return commit_debian_changes(
context.tree,
context.subpath,
            "Run 'pg_buildext updatecontrol'.",
committer=context.committer,
update_changelog=False,
)
def fix_missing_makefile_pl(error, context):
if (
error.filename == "Makefile.PL"
and not context.tree.has_filename("Makefile.PL")
and context.tree.has_filename("dist.ini")
):
# TODO(jelmer): add dist-zilla add-on to debhelper
raise NotImplementedError
return False
class SimpleBuildFixer(BuildFixer):
def __init__(self, problem_cls, fn):
self._problem_cls = problem_cls
self._fn = fn
def can_fix(self, problem):
return isinstance(problem, self._problem_cls)
def _fix(self, problem, context):
return self._fn(problem, context)
def versioned_package_fixers(session):
return [
PgBuildExtOutOfDateControlFixer(session),
SimpleBuildFixer(MissingConfigure, fix_missing_configure),
SimpleBuildFixer(MissingAutomakeInput, fix_missing_automake_input),
SimpleBuildFixer(MissingConfigStatusInput, fix_missing_config_status_input),
SimpleBuildFixer(MissingPerlFile, fix_missing_makefile_pl),
]
def apt_fixers(apt) -> List[BuildFixer]:
from ..resolver.apt import AptResolver
resolver = AptResolver(apt)
return [
SimpleBuildFixer(MissingPythonModule, fix_missing_python_module),
SimpleBuildFixer(MissingPythonDistribution, fix_missing_python_distribution),
SimpleBuildFixer(AptFetchFailure, retry_apt_failure),
UpstreamRequirementFixer(resolver),
]
def build_incrementally(
local_tree,
apt,
suffix,
build_suite,
output_directory,
build_command,
build_changelog_entry="Build for debian-janitor apt repository.",
committer=None,
max_iterations=DEFAULT_MAX_ITERATIONS,
subpath="",
source_date_epoch=None,
update_changelog=True,
):
fixed_errors = []
fixers = versioned_package_fixers(apt.session) + apt_fixers(apt)
logging.info('Using fixers: %r', fixers)
while True:
try:
return attempt_build(
local_tree,
suffix,
build_suite,
output_directory,
build_command,
build_changelog_entry,
subpath=subpath,
source_date_epoch=source_date_epoch,
)
except SbuildFailure as e:
if e.error is None:
logging.warning("Build failed with unidentified error. Giving up.")
raise
if e.phase is None:
logging.info("No relevant context, not making any changes.")
raise
if (e.error, e.phase) in fixed_errors:
logging.warning("Error was still not fixed on second try. Giving up.")
raise
if max_iterations is not None and len(fixed_errors) > max_iterations:
logging.warning("Last fix did not address the issue. Giving up.")
raise
reset_tree(local_tree, local_tree.basis_tree(), subpath=subpath)
if e.phase[0] == "build":
context = BuildDependencyContext(
local_tree,
apt,
subpath=subpath,
committer=committer,
update_changelog=update_changelog,
)
elif e.phase[0] == "autopkgtest":
context = AutopkgtestDependencyContext(
e.phase[1],
local_tree,
apt,
subpath=subpath,
committer=committer,
update_changelog=update_changelog,
)
else:
logging.warning("unable to install for context %r", e.phase)
raise
try:
if not resolve_error(e.error, context, fixers):
logging.warning("Failed to resolve error %r. Giving up.", e.error)
raise
except GeneratedFile:
logging.warning(
                    "Control file is generated; unable to edit it to "
                    "resolve error %r.", e.error)
raise e
except CircularDependency:
logging.warning(
                "Unable to fix %r; it would introduce a circular dependency.",
e.error,
)
raise e
fixed_errors.append((e.error, e.phase))
if os.path.exists(os.path.join(output_directory, "build.log")):
i = 1
while os.path.exists(
os.path.join(output_directory, "build.log.%d" % i)
):
i += 1
os.rename(
os.path.join(output_directory, "build.log"),
os.path.join(output_directory, "build.log.%d" % i),
)
def main(argv=None):
import argparse
parser = argparse.ArgumentParser("ognibuild.debian.fix_build")
parser.add_argument(
"--suffix", type=str, help="Suffix to use for test builds.", default="fixbuild1"
)
parser.add_argument(
"--suite", type=str, help="Suite to target.", default="unstable"
)
parser.add_argument(
"--output-directory", type=str, help="Output directory.", default=None
)
parser.add_argument(
"--committer", type=str, help="Committer string (name and email)", default=None
)
parser.add_argument(
"--build-command",
type=str,
help="Build command",
default=(DEFAULT_BUILDER + " -A -s -v"),
)
parser.add_argument(
"--no-update-changelog",
action="store_false",
default=None,
dest="update_changelog",
help="do not update the changelog",
)
parser.add_argument(
"--update-changelog",
action="store_true",
dest="update_changelog",
help="force updating of the changelog",
default=None,
)
args = parser.parse_args()
from breezy.workingtree import WorkingTree
from .apt import AptManager
from ..session.plain import PlainSession
apt = AptManager(PlainSession())
tree = WorkingTree.open(".")
build_incrementally(
tree,
apt,
args.suffix,
args.suite,
args.output_directory,
args.build_command,
committer=args.committer,
update_changelog=args.update_changelog,
)
if __name__ == "__main__":
sys.exit(main(sys.argv))
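As a standalone sketch of the retry bookkeeping in build_incrementally above (illustrative names only, not the real ognibuild API): each (error, phase) pair that a fixer addressed is recorded, and the loop gives up as soon as the same pair recurs or the iteration budget is exceeded.

```python
def incremental_attempts(failures, max_iterations=10):
    """Simulate the loop: `failures` yields an (error, phase) pair per
    failed attempt, or None once the build finally succeeds."""
    fixed_errors = []
    for attempt, failure in enumerate(failures, start=1):
        if failure is None:
            return ("success", attempt)
        if failure in fixed_errors:
            # the same error came back after we "fixed" it: give up
            return ("persisted", attempt)
        if len(fixed_errors) > max_iterations:
            return ("gave-up", attempt)
        # pretend a fixer resolved the problem; record it and retry
        fixed_errors.append(failure)
    return ("exhausted", len(fixed_errors))
```

The real loop additionally resets the tree between attempts and distinguishes build-phase from autopkgtest-phase failures; this sketch keeps only the termination logic.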

ognibuild/dist.py (new file, 235 lines)
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import errno
import logging
import os
import shutil
import sys
import tempfile
from typing import Optional
from debian.deb822 import Deb822
from breezy.tree import Tree
from breezy.workingtree import WorkingTree
from buildlog_consultant.common import (
NoSpaceOnDevice,
)
from . import DetailedFailure
from .buildsystem import NoBuildToolsFound
from .session.schroot import SchrootSession
from .vcs import dupe_vcs_tree, export_vcs_tree
SUPPORTED_DIST_EXTENSIONS = [
".tar.gz",
".tgz",
".tar.bz2",
".tar.xz",
".tar.lzma",
".tbz2",
".tar",
".zip",
]
def is_dist_file(fn):
    return any(fn.endswith(ext) for ext in SUPPORTED_DIST_EXTENSIONS)
class DistNoTarball(Exception):
"""Dist operation did not create a tarball."""
def run_dist(session, buildsystems, resolver, fixers):
# Some things want to write to the user's home directory,
# e.g. pip caches in ~/.cache
session.create_home()
for buildsystem in buildsystems:
buildsystem.dist(session, resolver, fixers)
return
raise NoBuildToolsFound()
class DistCatcher(object):
def __init__(self, directory):
self.export_directory = directory
self.files = []
self.existing_files = None
def __enter__(self):
self.existing_files = os.listdir(self.export_directory)
return self
def find_files(self):
new_files = os.listdir(self.export_directory)
diff_files = set(new_files) - set(self.existing_files)
        diff = {n for n in diff_files if is_dist_file(n)}
if len(diff) == 1:
fn = diff.pop()
logging.info("Found tarball %s in package directory.", fn)
self.files.append(os.path.join(self.export_directory, fn))
return fn
if "dist" in diff_files:
for entry in os.scandir(os.path.join(self.export_directory, "dist")):
if is_dist_file(entry.name):
logging.info("Found tarball %s in dist directory.", entry.name)
self.files.append(entry.path)
return entry.name
logging.info("No tarballs found in dist directory.")
parent_directory = os.path.dirname(self.export_directory)
        diff = set(os.listdir(parent_directory)) - {
            os.path.basename(self.export_directory)}
if len(diff) == 1:
fn = diff.pop()
logging.info("Found tarball %s in parent directory.", fn)
self.files.append(os.path.join(parent_directory, fn))
return fn
def __exit__(self, exc_type, exc_val, exc_tb):
self.find_files()
return False
def create_dist_schroot(
tree: Tree,
target_dir: str,
chroot: str,
packaging_tree: Optional[Tree] = None,
include_controldir: bool = True,
subdir: Optional[str] = None,
) -> str:
from .buildsystem import detect_buildsystems
from .resolver.apt import AptResolver
from .buildlog import UpstreamRequirementFixer
if subdir is None:
subdir = "package"
with SchrootSession(chroot) as session:
if packaging_tree is not None:
from .debian import satisfy_build_deps
satisfy_build_deps(session, packaging_tree)
build_dir = os.path.join(session.location, "build")
try:
directory = tempfile.mkdtemp(dir=build_dir)
        except OSError as e:
            if e.errno == errno.ENOSPC:
                raise DetailedFailure(1, ["mkdtemp"], NoSpaceOnDevice())
            raise
reldir = "/" + os.path.relpath(directory, session.location)
export_directory = os.path.join(directory, subdir)
if not include_controldir:
export_vcs_tree(tree, export_directory)
else:
dupe_vcs_tree(tree, export_directory)
buildsystems = list(detect_buildsystems(export_directory))
resolver = AptResolver.from_session(session)
fixers = [UpstreamRequirementFixer(resolver)]
with DistCatcher(export_directory) as dc:
oldcwd = os.getcwd()
os.chdir(export_directory)
try:
session.chdir(os.path.join(reldir, subdir))
run_dist(session, buildsystems, resolver, fixers)
finally:
os.chdir(oldcwd)
for path in dc.files:
shutil.copy(path, target_dir)
return os.path.join(target_dir, os.path.basename(path))
logging.info("No tarball created :(")
raise DistNoTarball()
if __name__ == "__main__":
import argparse
import breezy.bzr # noqa: F401
import breezy.git # noqa: F401
from breezy.export import export
parser = argparse.ArgumentParser()
parser.add_argument(
"--chroot",
default="unstable-amd64-sbuild",
type=str,
help="Name of chroot to use",
)
parser.add_argument(
"directory",
default=".",
type=str,
nargs="?",
help="Directory with upstream source.",
)
parser.add_argument(
"--packaging-directory", type=str, help="Path to packaging directory."
)
parser.add_argument(
"--target-directory", type=str, default="..", help="Target directory"
)
parser.add_argument(
"--verbose",
action="store_true",
help="Be verbose")
args = parser.parse_args()
if args.verbose:
logging.basicConfig(level=logging.DEBUG)
else:
logging.basicConfig(level=logging.INFO)
tree = WorkingTree.open(args.directory)
if args.packaging_directory:
packaging_tree = WorkingTree.open(args.packaging_directory)
with packaging_tree.lock_read():
source = Deb822(packaging_tree.get_file("debian/control"))
package = source["Source"]
subdir = package
else:
packaging_tree = None
subdir = None
try:
ret = create_dist_schroot(
tree,
subdir=subdir,
target_dir=os.path.abspath(args.target_directory),
packaging_tree=packaging_tree,
chroot=args.chroot,
)
except NoBuildToolsFound:
logging.info("No build tools found, falling back to simple export.")
export(tree, "dist.tar.gz", "tgz", None)
else:
print("Created %s" % ret)
sys.exit(0)
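The DistCatcher idea above can be shown in isolation: snapshot the directory on entry, and on exit keep only the files that appeared while the block ran and carry a known dist extension. This is a simplified sketch with made-up names, not the ognibuild implementation:

```python
import os
import tempfile

SUPPORTED = (".tar.gz", ".tgz", ".tar.bz2", ".tar.xz", ".zip")

class NewDistFiles:
    """Record which plausible dist tarballs appeared inside the block."""

    def __init__(self, directory):
        self.directory = directory
        self.files = []

    def __enter__(self):
        # snapshot of the directory before the dist command runs
        self._before = set(os.listdir(self.directory))
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        # anything new with a dist-like extension is a candidate tarball
        for name in sorted(set(os.listdir(self.directory)) - self._before):
            if name.endswith(SUPPORTED):
                self.files.append(os.path.join(self.directory, name))
        return False

with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, "README"), "w").close()  # pre-existing file
    with NewDistFiles(d) as catcher:
        open(os.path.join(d, "pkg-1.0.tar.gz"), "w").close()
        open(os.path.join(d, "notes.txt"), "w").close()
found = [os.path.basename(p) for p in catcher.files]
```

The real class also looks inside a `dist/` subdirectory and in the parent directory; the before/after set difference is the core of it.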

ognibuild/fix_build.py (new file, 127 lines)
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import logging
from typing import List, Optional
from buildlog_consultant.common import (
    find_build_failure_description,
)
from breezy.mutabletree import MutableTree
from . import DetailedFailure, UnidentifiedError
from .debian.apt import AptManager
from .session import Session, run_with_tee
class BuildFixer(object):
"""Build fixer."""
def can_fix(self, problem):
raise NotImplementedError(self.can_fix)
def _fix(self, problem, context):
raise NotImplementedError(self._fix)
def fix(self, problem, context):
if not self.can_fix(problem):
return None
return self._fix(problem, context)
class DependencyContext(object):
def __init__(
self,
tree: MutableTree,
apt: AptManager,
subpath: str = "",
committer: Optional[str] = None,
update_changelog: bool = True,
):
self.tree = tree
self.apt = apt
self.subpath = subpath
self.committer = committer
self.update_changelog = update_changelog
def add_dependency(
self, package: str, minimum_version: Optional['Version'] = None
) -> bool:
raise NotImplementedError(self.add_dependency)
class SchrootDependencyContext(DependencyContext):
def __init__(self, session):
self.session = session
self.apt = AptManager(session)
def add_dependency(self, package, minimum_version=None):
# TODO(jelmer): Handle minimum_version
self.apt.install([package])
return True
def run_with_build_fixers(
session: Session, args: List[str], fixers: List[BuildFixer]):
logging.info("Running %r", args)
fixed_errors = []
while True:
retcode, lines = run_with_tee(session, args)
if retcode == 0:
return
match, error = find_build_failure_description(lines)
if error is None:
logging.warning("Build failed with unidentified error. Giving up.")
if match is not None:
raise UnidentifiedError(
retcode, args, lines, secondary=(match.lineno, match.line))
raise UnidentifiedError(retcode, args, lines)
logging.info("Identified error: %r", error)
if error in fixed_errors:
logging.warning(
"Failed to resolve error %r, it persisted. Giving up.", error
)
raise DetailedFailure(retcode, args, error)
if not resolve_error(
error,
SchrootDependencyContext(session),
fixers=fixers,
):
logging.warning("Failed to find resolution for error %r. Giving up.", error)
raise DetailedFailure(retcode, args, error)
fixed_errors.append(error)
def resolve_error(error, context, fixers):
relevant_fixers = []
for fixer in fixers:
if fixer.can_fix(error):
relevant_fixers.append(fixer)
if not relevant_fixers:
logging.warning("No fixer found for %r", error)
return False
for fixer in relevant_fixers:
logging.info("Attempting to use fixer %s to address %r", fixer, error)
made_changes = fixer.fix(error, context)
if made_changes:
return True
return False
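A minimal sketch of the dispatch that resolve_error performs, using stub classes with assumed names: collect the fixers that claim they can handle the problem, then try each in order until one reports that it made changes.

```python
class StubFixer:
    """Illustrative stand-in for a BuildFixer subclass."""

    def __init__(self, handles, makes_changes):
        self.handles = handles
        self.makes_changes = makes_changes

    def can_fix(self, problem):
        return problem == self.handles

    def fix(self, problem, context):
        # mirrors BuildFixer.fix: only act on problems we claim
        if not self.can_fix(problem):
            return None
        return self.makes_changes

def dispatch(problem, fixers, context=None):
    # resolve_error shape: filter to relevant fixers, first success wins
    relevant = [f for f in fixers if f.can_fix(problem)]
    if not relevant:
        return False
    return any(f.fix(problem, context) for f in relevant)
```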

ognibuild/info.py (new file, 45 lines)
#!/usr/bin/python3
# Copyright (C) 2020-2021 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from .buildsystem import NoBuildToolsFound
def run_info(session, buildsystems):
for buildsystem in buildsystems:
print('%r:' % buildsystem)
deps = {}
try:
for kind, dep in buildsystem.get_declared_dependencies():
deps.setdefault(kind, []).append(dep)
except NotImplementedError:
print('\tUnable to detect declared dependencies for this type of build system')
if deps:
print('\tDeclared dependencies:')
for kind in deps:
print('\t\t%s:' % kind)
for dep in deps[kind]:
print('\t\t\t%s' % dep)
print('')
try:
outputs = list(buildsystem.get_declared_outputs())
except NotImplementedError:
print('\tUnable to detect declared outputs for this type of build system')
outputs = []
if outputs:
print('\tDeclared outputs:')
for output in outputs:
print('\t\t%s' % output)
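The grouping that run_info applies to get_declared_dependencies() output, shown standalone with sample data: (kind, dep) pairs collapse into a dict keyed by kind, preserving order within each kind.

```python
# sample (kind, dep) pairs as a build system might declare them
declared = [
    ("build", "setuptools"),
    ("test", "pytest"),
    ("build", "cython"),
]
deps = {}
for kind, dep in declared:
    deps.setdefault(kind, []).append(dep)
```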

ognibuild/install.py (new file, 33 lines)
#!/usr/bin/python3
# Copyright (C) 2020-2021 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from .buildsystem import NoBuildToolsFound, InstallTarget
def run_install(session, buildsystems, resolver, fixers, user: bool = False):
# Some things want to write to the user's home directory,
# e.g. pip caches in ~/.cache
session.create_home()
install_target = InstallTarget()
install_target.user = user
for buildsystem in buildsystems:
buildsystem.install(session, resolver, fixers, install_target)
return
raise NoBuildToolsFound()
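run_install (like run_dist above) acts on the first detected build system and raises only when none were detected at all. A reduced sketch of that shape, with an illustrative exception name:

```python
class NoBuildSystemDetected(Exception):
    """No usable build system was found."""

def run_first(buildsystems, action):
    # act on the first build system and stop; the early return inside the
    # loop is what makes later entries mere fallbacks for detection
    for buildsystem in buildsystems:
        return action(buildsystem)
    raise NoBuildSystemDetected()
```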

ognibuild/requirements.py (new file, 293 lines)
#!/usr/bin/python
# Copyright (C) 2019-2020 Jelmer Vernooij <jelmer@jelmer.uk>
# encoding: utf-8
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import posixpath
from typing import Optional, List, Tuple
from . import UpstreamRequirement
class PythonPackageRequirement(UpstreamRequirement):
package: str
def __init__(self, package, python_version=None, minimum_version=None):
super(PythonPackageRequirement, self).__init__('python-package')
self.package = package
self.python_version = python_version
self.minimum_version = minimum_version
def __repr__(self):
return "%s(%r, python_version=%r, minimum_version=%r)" % (
type(self).__name__, self.package, self.python_version,
self.minimum_version)
def __str__(self):
return "python package: %s" % self.package
class BinaryRequirement(UpstreamRequirement):
binary_name: str
def __init__(self, binary_name):
super(BinaryRequirement, self).__init__('binary')
self.binary_name = binary_name
class PerlModuleRequirement(UpstreamRequirement):
module: str
filename: Optional[str]
inc: Optional[List[str]]
def __init__(self, module, filename=None, inc=None):
super(PerlModuleRequirement, self).__init__('perl-module')
self.module = module
self.filename = filename
self.inc = inc
def relfilename(self):
return self.module.replace("::", "/") + ".pm"
class NodePackageRequirement(UpstreamRequirement):
package: str
def __init__(self, package):
super(NodePackageRequirement, self).__init__('npm-package')
self.package = package
class CargoCrateRequirement(UpstreamRequirement):
crate: str
def __init__(self, crate):
super(CargoCrateRequirement, self).__init__('cargo-crate')
self.crate = crate
class PkgConfigRequirement(UpstreamRequirement):
module: str
def __init__(self, module, minimum_version=None):
super(PkgConfigRequirement, self).__init__('pkg-config')
self.module = module
self.minimum_version = minimum_version
class PathRequirement(UpstreamRequirement):
path: str
def __init__(self, path):
super(PathRequirement, self).__init__('path')
self.path = path
class CHeaderRequirement(UpstreamRequirement):
header: str
def __init__(self, header):
super(CHeaderRequirement, self).__init__('c-header')
self.header = header
class JavaScriptRuntimeRequirement(UpstreamRequirement):
def __init__(self):
super(JavaScriptRuntimeRequirement, self).__init__(
'javascript-runtime')
class ValaPackageRequirement(UpstreamRequirement):
package: str
def __init__(self, package: str):
super(ValaPackageRequirement, self).__init__('vala')
self.package = package
class RubyGemRequirement(UpstreamRequirement):
gem: str
minimum_version: Optional[str]
def __init__(self, gem: str, minimum_version: Optional[str]):
super(RubyGemRequirement, self).__init__('gem')
self.gem = gem
self.minimum_version = minimum_version
class GoPackageRequirement(UpstreamRequirement):
package: str
def __init__(self, package: str):
super(GoPackageRequirement, self).__init__('go')
self.package = package
class DhAddonRequirement(UpstreamRequirement):
path: str
def __init__(self, path: str):
super(DhAddonRequirement, self).__init__('dh-addon')
self.path = path
class PhpClassRequirement(UpstreamRequirement):
php_class: str
def __init__(self, php_class: str):
super(PhpClassRequirement, self).__init__('php-class')
self.php_class = php_class
class RPackageRequirement(UpstreamRequirement):
package: str
minimum_version: Optional[str]
def __init__(self, package: str, minimum_version: Optional[str] = None):
super(RPackageRequirement, self).__init__('r-package')
self.package = package
self.minimum_version = minimum_version
class LibraryRequirement(UpstreamRequirement):
library: str
def __init__(self, library: str):
super(LibraryRequirement, self).__init__('lib')
self.library = library
class RubyFileRequirement(UpstreamRequirement):
filename: str
def __init__(self, filename: str):
super(RubyFileRequirement, self).__init__('ruby-file')
self.filename = filename
class XmlEntityRequirement(UpstreamRequirement):
url: str
def __init__(self, url: str):
super(XmlEntityRequirement, self).__init__('xml-entity')
self.url = url
class SprocketsFileRequirement(UpstreamRequirement):
content_type: str
name: str
def __init__(self, content_type: str, name: str):
super(SprocketsFileRequirement, self).__init__('sprockets-file')
self.content_type = content_type
self.name = name
class JavaClassRequirement(UpstreamRequirement):
classname: str
def __init__(self, classname: str):
super(JavaClassRequirement, self).__init__('java-class')
self.classname = classname
class HaskellPackageRequirement(UpstreamRequirement):
package: str
def __init__(self, package: str):
super(HaskellPackageRequirement, self).__init__('haskell-package')
self.package = package
class MavenArtifactRequirement(UpstreamRequirement):
    artifacts: List[str]
def __init__(self, artifacts):
super(MavenArtifactRequirement, self).__init__('maven-artifact')
self.artifacts = artifacts
class GnomeCommonRequirement(UpstreamRequirement):
def __init__(self):
super(GnomeCommonRequirement, self).__init__('gnome-common')
class JDKFileRequirement(UpstreamRequirement):
jdk_path: str
filename: str
def __init__(self, jdk_path: str, filename: str):
super(JDKFileRequirement, self).__init__('jdk-file')
self.jdk_path = jdk_path
self.filename = filename
@property
def path(self):
return posixpath.join(self.jdk_path, self.filename)
class PerlFileRequirement(UpstreamRequirement):
filename: str
def __init__(self, filename: str):
super(PerlFileRequirement, self).__init__('perl-file')
self.filename = filename
class AutoconfMacroRequirement(UpstreamRequirement):
macro: str
def __init__(self, macro: str):
super(AutoconfMacroRequirement, self).__init__('autoconf-macro')
self.macro = macro
class PythonModuleRequirement(UpstreamRequirement):
module: str
python_version: Optional[str]
minimum_version: Optional[str]
    def __init__(self, module, python_version=None, minimum_version=None):
        super(PythonModuleRequirement, self).__init__('python-module')
        self.module = module
        self.python_version = python_version
        self.minimum_version = minimum_version
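Each of the classes above follows the same pattern: the subclass passes a fixed kind string to the base constructor and stores its own typed fields. A minimal runnable sketch of that pattern, with a stand-in base class (`UpstreamRequirement` is defined earlier in this module; the `family` attribute name here is an assumption for illustration only):

```python
from typing import Optional


class UpstreamRequirement:
    # Stand-in for the real base class; assumed to record the kind string.
    def __init__(self, family: str):
        self.family = family


class PerlModuleRequirement(UpstreamRequirement):
    module: str
    filename: Optional[str]

    def __init__(self, module: str, filename: Optional[str] = None):
        super().__init__("perl-module")
        self.module = module
        self.filename = filename

    def relfilename(self) -> str:
        # Map Foo::Bar to the relative path Foo/Bar.pm.
        return self.module.replace("::", "/") + ".pm"


req = PerlModuleRequirement("Test::More")
print(req.family, req.relfilename())
```

The kind string lets resolvers dispatch on requirement type without importing every subclass.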

ognibuild/resolver/__init__.py Normal file
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
class UnsatisfiedRequirements(Exception):
def __init__(self, reqs):
self.requirements = reqs
class Resolver(object):
def install(self, requirements):
raise NotImplementedError(self.install)
def resolve(self, requirement):
raise NotImplementedError(self.resolve)
def explain(self, requirements):
raise NotImplementedError(self.explain)
def met(self, requirement):
raise NotImplementedError(self.met)
class CPANResolver(Resolver):
def __init__(self, session):
self.session = session
def __str__(self):
return "cpan"
def install(self, requirements):
from ..requirements import PerlModuleRequirement
missing = []
for requirement in requirements:
if not isinstance(requirement, PerlModuleRequirement):
missing.append(requirement)
continue
# TODO(jelmer): Specify -T to skip tests?
self.session.check_call(
["cpan", "-i", requirement.module],
user="root", env={"PERL_MM_USE_DEFAULT": "1"}
)
if missing:
raise UnsatisfiedRequirements(missing)
def explain(self, requirements):
raise NotImplementedError(self.explain)
class HackageResolver(Resolver):
def __init__(self, session):
self.session = session
def __str__(self):
return "hackage"
def install(self, requirements):
from ..requirements import HaskellPackageRequirement
missing = []
for requirement in requirements:
if not isinstance(requirement, HaskellPackageRequirement):
missing.append(requirement)
continue
self.session.check_call(
["cabal", "install", requirement.package],
user="root")
if missing:
raise UnsatisfiedRequirements(missing)
def explain(self, requirements):
raise NotImplementedError(self.explain)
class CargoResolver(Resolver):
def __init__(self, session):
self.session = session
def __str__(self):
return "cargo"
def install(self, requirements):
from ..requirements import CargoCrateRequirement
missing = []
for requirement in requirements:
if not isinstance(requirement, CargoCrateRequirement):
missing.append(requirement)
continue
self.session.check_call(
["cargo", "install", requirement.crate],
user="root")
if missing:
raise UnsatisfiedRequirements(missing)
def explain(self, requirements):
raise NotImplementedError(self.explain)
class PypiResolver(Resolver):
def __init__(self, session):
self.session = session
def __str__(self):
return "pypi"
def install(self, requirements):
from ..requirements import PythonPackageRequirement
missing = []
for requirement in requirements:
if not isinstance(requirement, PythonPackageRequirement):
missing.append(requirement)
continue
self.session.check_call(["pip", "install", requirement.package])
if missing:
raise UnsatisfiedRequirements(missing)
def explain(self, requirements):
raise NotImplementedError(self.explain)
NPM_COMMAND_PACKAGES = {
"del-cli": "del-cli",
}
class NpmResolver(Resolver):
def __init__(self, session):
self.session = session
def __str__(self):
return "npm"
def install(self, requirements):
from ..requirements import NodePackageRequirement
missing = []
for requirement in requirements:
if not isinstance(requirement, NodePackageRequirement):
missing.append(requirement)
continue
try:
                package = NPM_COMMAND_PACKAGES[requirement.package]
except KeyError:
missing.append(requirement)
continue
self.session.check_call(["npm", "-g", "install", package])
if missing:
raise UnsatisfiedRequirements(missing)
def explain(self, requirements):
raise NotImplementedError(self.explain)
class StackedResolver(Resolver):
def __init__(self, subs):
self.subs = subs
def __repr__(self):
return "%s(%r)" % (type(self).__name__, self.subs)
def __str__(self):
return "[" + ", ".join(map(str, self.subs)) + "]"
    def install(self, requirements):
        for sub in self.subs:
            try:
                sub.install(requirements)
            except UnsatisfiedRequirements as e:
                requirements = e.requirements
            else:
                return
        if requirements:
            raise UnsatisfiedRequirements(requirements)
def native_resolvers(session):
return StackedResolver([
CPANResolver(session),
PypiResolver(session),
NpmResolver(session),
CargoResolver(session),
HackageResolver(session)])
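The stacking logic above can be exercised in isolation. A toy sketch, using integers in place of requirement objects: each sub-resolver installs what it understands and re-raises the remainder, which the stack hands to the next resolver.

```python
class UnsatisfiedRequirements(Exception):
    def __init__(self, reqs):
        self.requirements = reqs


class EvenResolver:
    # Toy resolver: only "installs" even numbers.
    def install(self, reqs):
        missing = [r for r in reqs if r % 2 != 0]
        if missing:
            raise UnsatisfiedRequirements(missing)


class OddResolver:
    # Toy resolver: only "installs" odd numbers.
    def install(self, reqs):
        missing = [r for r in reqs if r % 2 == 0]
        if missing:
            raise UnsatisfiedRequirements(missing)


class StackedResolver:
    def __init__(self, subs):
        self.subs = subs

    def install(self, requirements):
        for sub in self.subs:
            try:
                sub.install(requirements)
            except UnsatisfiedRequirements as e:
                # Let the next resolver try whatever is still missing.
                requirements = e.requirements
            else:
                return
        if requirements:
            raise UnsatisfiedRequirements(requirements)


# Succeeds: EvenResolver handles 2; OddResolver handles the leftover 1 and 3.
StackedResolver([EvenResolver(), OddResolver()]).install([1, 2, 3])
```

With only one sub-resolver, anything it cannot handle surfaces as `UnsatisfiedRequirements`.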
class ExplainResolver(Resolver):
def __init__(self, session):
self.session = session
@classmethod
def from_session(cls, session):
return cls(session)
def install(self, requirements):
raise UnsatisfiedRequirements(requirements)
def auto_resolver(session):
# TODO(jelmer): if session is SchrootSession or if we're root, use apt
from .apt import AptResolver
from ..session.schroot import SchrootSession
    user = session.check_output(['sh', '-c', 'echo $USER']).decode().strip()
resolvers = []
if isinstance(session, SchrootSession) or user == 'root':
resolvers.append(AptResolver.from_session(session))
resolvers.extend([
CPANResolver(session),
PypiResolver(session),
NpmResolver(session),
CargoResolver(session),
HackageResolver(session)])
return StackedResolver(resolvers)

ognibuild/resolver/apt.py Normal file
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import logging
import os
import posixpath
from ..debian.apt import AptManager
from . import Resolver, UnsatisfiedRequirements
from ..requirements import (
BinaryRequirement,
CHeaderRequirement,
PkgConfigRequirement,
PathRequirement,
UpstreamRequirement,
JavaScriptRuntimeRequirement,
ValaPackageRequirement,
RubyGemRequirement,
GoPackageRequirement,
DhAddonRequirement,
PhpClassRequirement,
RPackageRequirement,
NodePackageRequirement,
LibraryRequirement,
RubyFileRequirement,
XmlEntityRequirement,
SprocketsFileRequirement,
JavaClassRequirement,
HaskellPackageRequirement,
MavenArtifactRequirement,
GnomeCommonRequirement,
JDKFileRequirement,
PerlModuleRequirement,
PerlFileRequirement,
AutoconfMacroRequirement,
PythonModuleRequirement,
PythonPackageRequirement,
)
class AptRequirement(object):
def __init__(self, package, minimum_version=None):
self.package = package
self.minimum_version = minimum_version
def get_package_for_python_package(apt_mgr, package, python_version, minimum_version=None):
if python_version == "pypy":
pkg_name = apt_mgr.get_package_for_paths(
["/usr/lib/pypy/dist-packages/%s-.*.egg-info/PKG-INFO" % package],
regex=True)
elif python_version == "cpython2":
pkg_name = apt_mgr.get_package_for_paths(
["/usr/lib/python2\\.[0-9]/dist-packages/%s-.*.egg-info/PKG-INFO" % package],
regex=True)
elif python_version == "cpython3":
pkg_name = apt_mgr.get_package_for_paths(
["/usr/lib/python3/dist-packages/%s-.*.egg-info/PKG-INFO" % package],
regex=True)
else:
raise NotImplementedError
# TODO(jelmer): Dealing with epoch, etc?
if pkg_name is not None:
return AptRequirement(pkg_name, minimum_version)
return None
def get_package_for_python_module(apt_mgr, module, python_version, minimum_version):
    if python_version == "cpython3":
paths = [
posixpath.join(
"/usr/lib/python3/dist-packages",
module.replace(".", "/"),
"__init__.py",
),
posixpath.join(
"/usr/lib/python3/dist-packages", module.replace(".", "/") + ".py"
),
posixpath.join(
"/usr/lib/python3\\.[0-9]+/lib-dynload",
module.replace(".", "/") + "\\.cpython-.*\\.so",
),
posixpath.join(
"/usr/lib/python3\\.[0-9]+/", module.replace(".", "/") + ".py"
),
posixpath.join(
"/usr/lib/python3\\.[0-9]+/", module.replace(".", "/"), "__init__.py"
),
]
    elif python_version == "cpython2":
paths = [
posixpath.join(
"/usr/lib/python2\\.[0-9]/dist-packages",
module.replace(".", "/"),
"__init__.py",
),
posixpath.join(
"/usr/lib/python2\\.[0-9]/dist-packages",
module.replace(".", "/") + ".py",
),
posixpath.join(
                "/usr/lib/python2\\.[0-9]/lib-dynload",
module.replace(".", "/") + ".so",
),
]
elif python_version == "pypy":
paths = [
posixpath.join(
"/usr/lib/pypy/dist-packages", module.replace(".", "/"), "__init__.py"
),
posixpath.join(
"/usr/lib/pypy/dist-packages", module.replace(".", "/") + ".py"
),
posixpath.join(
"/usr/lib/pypy/dist-packages",
module.replace(".", "/") + "\\.pypy-.*\\.so",
),
]
else:
raise AssertionError("unknown python version %r" % python_version)
pkg_name = apt_mgr.get_package_for_paths(paths, regex=True)
if pkg_name is not None:
return AptRequirement(pkg_name, minimum_version=minimum_version)
return None
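The mapping from a dotted module name to candidate filesystem locations is purely mechanical and can be sketched on its own. This covers only the first two Python 3 patterns from the list above, as an illustration (`python3_module_paths` is an illustrative name, not part of the module):

```python
import posixpath


def python3_module_paths(module: str):
    # Candidate dist-packages locations for a dotted module name:
    # a package directory with __init__.py, or a plain .py file.
    rel = module.replace(".", "/")
    return [
        posixpath.join("/usr/lib/python3/dist-packages", rel, "__init__.py"),
        posixpath.join("/usr/lib/python3/dist-packages", rel + ".py"),
    ]


print(python3_module_paths("xml.sax"))
```

The real function also probes versioned `lib-dynload` paths with regexes, which is why `get_package_for_paths` is called with `regex=True`.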
def resolve_binary_req(apt_mgr, req):
if posixpath.isabs(req.binary_name):
paths = [req.binary_name]
else:
paths = [
posixpath.join(dirname, req.binary_name)
for dirname in ["/usr/bin", "/bin"]
]
pkg_name = apt_mgr.get_package_for_paths(paths)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_pkg_config_req(apt_mgr, req):
package = apt_mgr.get_package_for_paths(
[posixpath.join("/usr/lib/pkgconfig", req.module + ".pc")],
)
if package is None:
package = apt_mgr.get_package_for_paths(
[posixpath.join("/usr/lib", ".*", "pkgconfig", req.module + ".pc")],
regex=True)
if package is not None:
return AptRequirement(package, minimum_version=req.minimum_version)
return None
def resolve_path_req(apt_mgr, req):
package = apt_mgr.get_package_for_paths([req.path])
if package is not None:
return AptRequirement(package)
return None
def resolve_c_header_req(apt_mgr, req):
package = apt_mgr.get_package_for_paths(
[posixpath.join("/usr/include", req.header)], regex=False
)
if package is None:
package = apt_mgr.get_package_for_paths(
[posixpath.join("/usr/include", ".*", req.header)], regex=True
)
if package is None:
return None
return AptRequirement(package)
def resolve_js_runtime_req(apt_mgr, req):
package = apt_mgr.get_package_for_paths(
["/usr/bin/node", "/usr/bin/duk"], regex=False)
if package is not None:
return AptRequirement(package)
return None
def resolve_vala_package_req(apt_mgr, req):
path = "/usr/share/vala-[0-9.]+/vapi/%s.vapi" % req.package
package = apt_mgr.get_package_for_paths([path], regex=True)
if package is not None:
return AptRequirement(package)
return None
def resolve_ruby_gem_req(apt_mgr, req):
paths = [
posixpath.join(
"/usr/share/rubygems-integration/all/"
"specifications/%s-.*\\.gemspec" % req.gem
)
]
package = apt_mgr.get_package_for_paths(
paths, regex=True)
if package is not None:
return AptRequirement(package, minimum_version=req.minimum_version)
return None
def resolve_go_package_req(apt_mgr, req):
package = apt_mgr.get_package_for_paths(
[posixpath.join("/usr/share/gocode/src", req.package, ".*")],
regex=True
)
if package is not None:
return AptRequirement(package)
return None
def resolve_dh_addon_req(apt_mgr, req):
paths = [posixpath.join("/usr/share/perl5", req.path)]
package = apt_mgr.get_package_for_paths(paths)
if package is not None:
return AptRequirement(package)
return None
def resolve_php_class_req(apt_mgr, req):
path = "/usr/share/php/%s.php" % req.php_class.replace("\\", "/")
package = apt_mgr.get_package_for_paths([path])
if package is not None:
return AptRequirement(package)
return None
def resolve_r_package_req(apt_mgr, req):
paths = [posixpath.join("/usr/lib/R/site-library/.*/R/%s$" % req.package)]
package = apt_mgr.get_package_for_paths(paths, regex=True)
if package is not None:
return AptRequirement(package)
return None
def resolve_node_package_req(apt_mgr, req):
paths = [
"/usr/share/nodejs/.*/node_modules/%s/package.json" % req.package,
"/usr/lib/nodejs/%s/package.json" % req.package,
"/usr/share/nodejs/%s/package.json" % req.package,
]
pkg_name = apt_mgr.get_package_for_paths(paths, regex=True)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_library_req(apt_mgr, req):
paths = [
posixpath.join("/usr/lib/lib%s.so$" % req.library),
posixpath.join("/usr/lib/.*/lib%s.so$" % req.library),
posixpath.join("/usr/lib/lib%s.a$" % req.library),
posixpath.join("/usr/lib/.*/lib%s.a$" % req.library),
]
pkg_name = apt_mgr.get_package_for_paths(paths, regex=True)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_ruby_file_req(apt_mgr, req):
paths = [posixpath.join("/usr/lib/ruby/vendor_ruby/%s.rb" % req.filename)]
package = apt_mgr.get_package_for_paths(paths)
if package is not None:
return AptRequirement(package)
paths = [
posixpath.join(
r"/usr/share/rubygems-integration/all/gems/([^/]+)/"
"lib/%s.rb" % req.filename
)
]
pkg_name = apt_mgr.get_package_for_paths(paths, regex=True)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_xml_entity_req(apt_mgr, req):
# Ideally we should be using the XML catalog for this, but hardcoding
# a few URLs will do for now..
URL_MAP = {
"http://www.oasis-open.org/docbook/xml/": "/usr/share/xml/docbook/schema/dtd/"
}
for url, path in URL_MAP.items():
if req.url.startswith(url):
search_path = posixpath.join(path, req.url[len(url) :])
break
else:
return None
pkg_name = apt_mgr.get_package_for_paths([search_path], regex=False)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_sprockets_file_req(apt_mgr, req):
if req.content_type == "application/javascript":
path = "/usr/share/.*/app/assets/javascripts/%s.js$" % req.name
else:
logging.warning("unable to handle content type %s", req.content_type)
return None
pkg_name = apt_mgr.get_package_for_paths([path], regex=True)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_java_class_req(apt_mgr, req):
# Unfortunately this only finds classes in jars installed on the host
# system :(
# TODO(jelmer): Call in session
output = apt_mgr.session.check_output(
["java-propose-classpath", "-c" + req.classname])
classpath = [p for p in output.decode().strip(":").strip().split(":") if p]
if not classpath:
logging.warning("unable to find classpath for %s", req.classname)
        return None
logging.info("Classpath for %s: %r", req.classname, classpath)
package = apt_mgr.get_package_for_paths(classpath)
if package is None:
logging.warning("no package for files in %r", classpath)
return None
return AptRequirement(package)
def resolve_haskell_package_req(apt_mgr, req):
    path = "/var/lib/ghc/package.conf.d/%s-.*.conf" % req.package
pkg_name = apt_mgr.get_package_for_paths([path], regex=True)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_maven_artifact_req(apt_mgr, req):
artifact = req.artifacts[0]
parts = artifact.split(":")
if len(parts) == 4:
(group_id, artifact_id, kind, version) = parts
regex = False
elif len(parts) == 3:
(group_id, artifact_id, version) = parts
kind = "jar"
regex = False
elif len(parts) == 2:
version = ".*"
(group_id, artifact_id) = parts
kind = "jar"
regex = True
else:
raise AssertionError("invalid number of parts to artifact %s" % artifact)
paths = [
posixpath.join(
"/usr/share/maven-repo",
group_id.replace(".", "/"),
artifact_id,
version,
"%s-%s.%s" % (artifact_id, version, kind),
)
]
pkg_name = apt_mgr.get_package_for_paths(paths, regex=regex)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
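The coordinate handling in `resolve_maven_artifact_req` can be isolated into a small helper. A sketch of just the parsing step (`parse_maven_artifact` is an illustrative name, not part of the module):

```python
def parse_maven_artifact(artifact: str):
    # "group:artifact[:kind[:version]]"; two-part coordinates match any
    # version, so the caller must treat the resulting path as a regex.
    parts = artifact.split(":")
    if len(parts) == 4:
        group_id, artifact_id, kind, version = parts
        regex = False
    elif len(parts) == 3:
        group_id, artifact_id, version = parts
        kind, regex = "jar", False
    elif len(parts) == 2:
        group_id, artifact_id = parts
        kind, version, regex = "jar", ".*", True
    else:
        raise ValueError("invalid number of parts to artifact %s" % artifact)
    return group_id, artifact_id, kind, version, regex


print(parse_maven_artifact("org.slf4j:slf4j-api:1.7.30"))
```

The tuple then maps directly onto the `/usr/share/maven-repo/<group>/<artifact>/<version>/` path built above.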
def resolve_gnome_common_req(apt_mgr, req):
return AptRequirement('gnome-common')
def resolve_jdk_file_req(apt_mgr, req):
path = req.jdk_path + ".*/" + req.filename
pkg_name = apt_mgr.get_package_for_paths([path], regex=True)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_perl_module_req(apt_mgr, req):
DEFAULT_PERL_PATHS = ["/usr/share/perl5"]
if req.inc is None:
if req.filename is None:
            paths = [posixpath.join(inc, req.relfilename())
                     for inc in DEFAULT_PERL_PATHS]
elif not posixpath.isabs(req.filename):
            return None
else:
paths = [req.filename]
else:
paths = [posixpath.join(inc, req.filename) for inc in req.inc]
pkg_name = apt_mgr.get_package_for_paths(paths, regex=False)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_perl_file_req(apt_mgr, req):
pkg_name = apt_mgr.get_package_for_paths([req.filename], regex=False)
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def _find_aclocal_fun(macro):
# TODO(jelmer): Use the API for codesearch.debian.net instead?
defun_prefix = b"AC_DEFUN([%s]," % macro.encode("ascii")
for entry in os.scandir("/usr/share/aclocal"):
if not entry.is_file():
continue
with open(entry.path, "rb") as f:
for line in f:
if line.startswith(defun_prefix):
return entry.path
raise KeyError
def resolve_autoconf_macro_req(apt_mgr, req):
try:
path = _find_aclocal_fun(req.macro)
except KeyError:
logging.info("No local m4 file found defining %s", req.macro)
return None
pkg_name = apt_mgr.get_package_for_paths([path])
if pkg_name is not None:
return AptRequirement(pkg_name)
return None
def resolve_python_module_req(apt_mgr, req):
if req.python_version == 2:
return get_package_for_python_module(apt_mgr, req.module, "cpython2", req.minimum_version)
elif req.python_version in (None, 3):
return get_package_for_python_module(apt_mgr, req.module, "cpython3", req.minimum_version)
else:
return None
def resolve_python_package_req(apt_mgr, req):
if req.python_version == 2:
return get_package_for_python_package(apt_mgr, req.package, "cpython2", req.minimum_version)
elif req.python_version in (None, 3):
return get_package_for_python_package(apt_mgr, req.package, "cpython3", req.minimum_version)
else:
return None
APT_REQUIREMENT_RESOLVERS = [
(BinaryRequirement, resolve_binary_req),
(PkgConfigRequirement, resolve_pkg_config_req),
(PathRequirement, resolve_path_req),
(CHeaderRequirement, resolve_c_header_req),
(JavaScriptRuntimeRequirement, resolve_js_runtime_req),
(ValaPackageRequirement, resolve_vala_package_req),
(RubyGemRequirement, resolve_ruby_gem_req),
(GoPackageRequirement, resolve_go_package_req),
(DhAddonRequirement, resolve_dh_addon_req),
(PhpClassRequirement, resolve_php_class_req),
(RPackageRequirement, resolve_r_package_req),
(NodePackageRequirement, resolve_node_package_req),
(LibraryRequirement, resolve_library_req),
(RubyFileRequirement, resolve_ruby_file_req),
(XmlEntityRequirement, resolve_xml_entity_req),
(SprocketsFileRequirement, resolve_sprockets_file_req),
(JavaClassRequirement, resolve_java_class_req),
(HaskellPackageRequirement, resolve_haskell_package_req),
(MavenArtifactRequirement, resolve_maven_artifact_req),
(GnomeCommonRequirement, resolve_gnome_common_req),
(JDKFileRequirement, resolve_jdk_file_req),
(PerlModuleRequirement, resolve_perl_module_req),
(PerlFileRequirement, resolve_perl_file_req),
(AutoconfMacroRequirement, resolve_autoconf_macro_req),
(PythonModuleRequirement, resolve_python_module_req),
(PythonPackageRequirement, resolve_python_package_req),
]
def resolve_requirement_apt(apt_mgr, req: UpstreamRequirement) -> AptRequirement:
for rr_class, rr_fn in APT_REQUIREMENT_RESOLVERS:
if isinstance(req, rr_class):
return rr_fn(apt_mgr, req)
raise NotImplementedError(type(req))
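`resolve_requirement_apt` is a plain (class, function) dispatch table. The same shape, reduced to a self-contained sketch with stand-in requirement classes and string results in place of `AptRequirement` lookups:

```python
class Requirement:
    pass


class BinaryRequirement(Requirement):
    def __init__(self, binary_name):
        self.binary_name = binary_name


class PathRequirement(Requirement):
    def __init__(self, path):
        self.path = path


def resolve_binary(req):
    return "pkg-for:" + req.binary_name


def resolve_path(req):
    return "pkg-for:" + req.path


RESOLVERS = [
    (BinaryRequirement, resolve_binary),
    (PathRequirement, resolve_path),
]


def resolve(req):
    # First entry whose class matches wins, as in resolve_requirement_apt.
    for rr_class, rr_fn in RESOLVERS:
        if isinstance(req, rr_class):
            return rr_fn(req)
    raise NotImplementedError(type(req))


print(resolve(BinaryRequirement("make")))
```

Keeping the table as a list (rather than a dict keyed on type) means `isinstance` also catches subclasses, at the cost of a linear scan.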
class AptResolver(Resolver):
def __init__(self, apt):
self.apt = apt
def __str__(self):
return "apt"
@classmethod
def from_session(cls, session):
return cls(AptManager(session))
def install(self, requirements):
missing = []
for req in requirements:
try:
if not req.met(self.apt.session):
missing.append(req)
except NotImplementedError:
missing.append(req)
if not missing:
return
still_missing = []
apt_requirements = []
for m in missing:
apt_req = self.resolve(m)
if apt_req is None:
still_missing.append(m)
else:
apt_requirements.append(apt_req)
if apt_requirements:
self.apt.install([r.package for r in apt_requirements])
if still_missing:
raise UnsatisfiedRequirements(still_missing)
def explain(self, requirements):
raise NotImplementedError(self.explain)
def resolve(self, req: UpstreamRequirement):
return resolve_requirement_apt(self.apt, req)

ognibuild/session/__init__.py

@@ -17,11 +17,12 @@
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA

from typing import Optional, List, Dict

import sys
import subprocess


class Session(object):
    def __enter__(self) -> "Session":
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):

@@ -32,35 +33,58 @@ class Session(object):
    @property
    def location(self) -> str:
        raise NotImplementedError

    def check_call(
        self,
        argv: List[str],
        cwd: Optional[str] = None,
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
    ):
        raise NotImplementedError(self.check_call)

    def check_output(
        self,
        argv: List[str],
        cwd: Optional[str] = None,
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
    ) -> bytes:
        raise NotImplementedError(self.check_output)

    def Popen(
        self, argv, cwd: Optional[str] = None, user: Optional[str] = None, **kwargs
    ):
        raise NotImplementedError(self.Popen)

    def call(
        self, argv: List[str], cwd: Optional[str] = None, user: Optional[str] = None
    ):
        raise NotImplementedError(self.call)

    def create_home(self) -> None:
        """Create the user's home directory."""
        raise NotImplementedError(self.create_home)

    def exists(self, path: str) -> bool:
        """Check whether a path exists in the chroot."""
        raise NotImplementedError(self.exists)

    def scandir(self, path: str):
        raise NotImplementedError(self.scandir)


class SessionSetupFailure(Exception):
    """Session failed to be set up."""


def run_with_tee(session: Session, args: List[str], **kwargs):
    p = session.Popen(args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, **kwargs)
    contents = []
    while p.poll() is None:
        line = p.stdout.readline()
        sys.stdout.buffer.write(line)
        sys.stdout.buffer.flush()
        contents.append(line.decode("utf-8", "surrogateescape"))
    return p.returncode, contents

ognibuild/session/plain.py

@@ -18,18 +18,29 @@
from . import Session

import os
import subprocess


class PlainSession(Session):
    """Session ignoring user."""

    location = "/"

    def create_home(self):
        pass

    def check_call(self, args):
        return subprocess.check_call(args)

    def check_output(self, args):
        return subprocess.check_output(args)

    def Popen(self, args, stdout=None, stderr=None, user=None, cwd=None):
        return subprocess.Popen(args, stdout=stdout, stderr=stderr, cwd=cwd)

    def exists(self, path):
        return os.path.exists(path)

    def scandir(self, path):
        return os.scandir(path)

View file

@ -15,6 +15,8 @@
# along with this program; if not, write to the Free Software # along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import logging
import os
import shlex import shlex
import subprocess import subprocess
@ -32,27 +34,36 @@ class SchrootSession(Session):
def __init__(self, chroot: str): def __init__(self, chroot: str):
if not isinstance(chroot, str): if not isinstance(chroot, str):
raise TypeError('not a valid chroot: %r' % chroot) raise TypeError("not a valid chroot: %r" % chroot)
self.chroot = chroot self.chroot = chroot
self._location = None self._location = None
self._cwd = None self._cwd = None
def _get_location(self) -> str: def _get_location(self) -> str:
return subprocess.check_output( return (
['schroot', '--location', '-c', 'session:' + self.session_id subprocess.check_output(
]).strip().decode() ["schroot", "--location", "-c", "session:" + self.session_id]
)
.strip()
.decode()
)
def _end_session(self) -> None: def _end_session(self) -> None:
subprocess.check_output( subprocess.check_output(["schroot", "-c", "session:" + self.session_id, "-e"])
['schroot', '-c', 'session:' + self.session_id, '-e'])
def __enter__(self) -> 'Session': def __enter__(self) -> "Session":
try: try:
self.session_id = subprocess.check_output( self.session_id = (
['schroot', '-c', self.chroot, '-b']).strip().decode() subprocess.check_output(["schroot", "-c", self.chroot, "-b"])
.strip()
.decode()
)
except subprocess.CalledProcessError: except subprocess.CalledProcessError:
# TODO(jelmer): Capture stderr and forward in SessionSetupFailure # TODO(jelmer): Capture stderr and forward in SessionSetupFailure
raise SessionSetupFailure() raise SessionSetupFailure()
logging.info(
'Opened schroot session %s (from %s)', self.session_id,
self.chroot)
return self return self
def __exit__(self, exc_type, exc_val, exc_tb): def __exit__(self, exc_type, exc_val, exc_tb):
@ -68,30 +79,41 @@ class SchrootSession(Session):
self._location = self._get_location() self._location = self._get_location()
return self._location return self._location
def _run_argv(self, argv: List[str], cwd: Optional[str] = None, def _run_argv(
self,
argv: List[str],
cwd: Optional[str] = None,
user: Optional[str] = None, user: Optional[str] = None,
env: Optional[Dict[str, str]] = None): env: Optional[Dict[str, str]] = None,
base_argv = ['schroot', '-r', '-c', 'session:' + self.session_id] ):
base_argv = ["schroot", "-r", "-c", "session:" + self.session_id]
if cwd is None: if cwd is None:
cwd = self._cwd cwd = self._cwd
if cwd is not None: if cwd is not None:
base_argv.extend(['-d', cwd]) base_argv.extend(["-d", cwd])
if user is not None: if user is not None:
base_argv.extend(['-u', user]) base_argv.extend(["-u", user])
if env: if env:
argv = [ argv = [
'sh', '-c', "sh",
' '.join( "-c",
['%s=%s ' % (key, shlex.quote(value)) " ".join(
for (key, value) in env.items()] + [
[shlex.quote(arg) for arg in argv])] "%s=%s " % (key, shlex.quote(value))
return base_argv + ['--'] + argv for (key, value) in env.items()
]
+ [shlex.quote(arg) for arg in argv]
),
]
return base_argv + ["--"] + argv
def check_call( def check_call(
self, self,
argv: List[str], cwd: Optional[str] = None, argv: List[str],
cwd: Optional[str] = None,
user: Optional[str] = None, user: Optional[str] = None,
env: Optional[Dict[str, str]] = None): env: Optional[Dict[str, str]] = None,
):
try: try:
subprocess.check_call(self._run_argv(argv, cwd, user, env=env)) subprocess.check_call(self._run_argv(argv, cwd, user, env=env))
except subprocess.CalledProcessError as e: except subprocess.CalledProcessError as e:
@ -99,29 +121,49 @@ class SchrootSession(Session):
    def check_output(
        self,
        argv: List[str],
        cwd: Optional[str] = None,
        user: Optional[str] = None,
        env: Optional[Dict[str, str]] = None,
    ) -> bytes:
        try:
            return subprocess.check_output(self._run_argv(argv, cwd, user, env=env))
        except subprocess.CalledProcessError as e:
            raise subprocess.CalledProcessError(e.returncode, argv)

    def Popen(
        self, argv, cwd: Optional[str] = None, user: Optional[str] = None, **kwargs
    ):
        return subprocess.Popen(self._run_argv(argv, cwd, user), **kwargs)

    def call(
        self, argv: List[str], cwd: Optional[str] = None, user: Optional[str] = None
    ):
        return subprocess.call(self._run_argv(argv, cwd, user))
    def create_home(self) -> None:
        """Create the user's home directory."""
        home = (
            self.check_output(["sh", "-c", "echo $HOME"], cwd="/").decode().rstrip("\n")
        )
        user = (
            self.check_output(["sh", "-c", "echo $LOGNAME"], cwd="/")
            .decode()
            .rstrip("\n")
        )
        logging.info("Creating directory %s", home)
        self.check_call(["mkdir", "-p", home], cwd="/", user="root")
        self.check_call(["chown", user, home], cwd="/", user="root")

    def _fullpath(self, path: str) -> str:
        if self._cwd is None:
            raise ValueError('no cwd set')
        return os.path.join(self.location, os.path.join(self._cwd, path).lstrip("/"))

    def exists(self, path: str) -> bool:
        fullpath = self._fullpath(path)
        return os.path.exists(fullpath)

    def scandir(self, path: str):
        fullpath = self._fullpath(path)
        return os.scandir(fullpath)
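`_fullpath` maps a path inside the session onto the host filesystem: it resolves the path against the in-chroot cwd, then re-roots the result under the chroot's location. A minimal sketch of that mapping (`chroot_fullpath` and the example paths are made up for illustration):

```python
import os


def chroot_fullpath(location, cwd, path):
    # Resolve `path` against the in-chroot cwd, strip the leading "/",
    # then join the result under the chroot's location on the host.
    if cwd is None:
        raise ValueError("no cwd set")
    return os.path.join(location, os.path.join(cwd, path).lstrip("/"))


print(chroot_fullpath("/srv/chroot/unstable", "/build/pkg", "debian/control"))
# → /srv/chroot/unstable/build/pkg/debian/control
```

The `lstrip("/")` is what makes absolute in-chroot paths join correctly: `os.path.join` would otherwise discard the chroot location when given a second argument starting with "/".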

30
ognibuild/test.py Normal file

@ -0,0 +1,30 @@
#!/usr/bin/python3
# Copyright (C) 2020-2021 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
from .buildsystem import NoBuildToolsFound
def run_test(session, buildsystems, resolver, fixers):
    # Some things want to write to the user's home directory,
    # e.g. pip caches in ~/.cache
    session.create_home()

    for buildsystem in buildsystems:
        buildsystem.test(session, resolver, fixers)
        return

    raise NoBuildToolsFound()


@ -22,7 +22,9 @@ import unittest
def test_suite():
    names = [
        "debian_build",
        "debian_fix_build",
    ]
    module_names = ["ognibuild.tests.test_" + name for name in names]
    loader = unittest.TestLoader()
    return loader.loadTestsFromNames(module_names)


@ -0,0 +1,168 @@
#!/usr/bin/python
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import datetime
import os
from ..debian.build import add_dummy_changelog_entry, get_build_architecture
from breezy.tests import TestCaseWithTransport, TestCase
class AddDummyChangelogEntryTests(TestCaseWithTransport):
    def test_simple(self):
        tree = self.make_branch_and_tree(".")
        self.build_tree_contents(
            [
                ("debian/",),
                (
                    "debian/changelog",
                    """\
janitor (0.1-1) UNRELEASED; urgency=medium

  * Initial release. (Closes: #XXXXXX)

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 04 Apr 2020 14:12:13 +0000
""",
                ),
            ]
        )
        tree.add(["debian", "debian/changelog"])
        add_dummy_changelog_entry(
            tree,
            "",
            "jan+some",
            "some-fixes",
            "Dummy build.",
            timestamp=datetime.datetime(2020, 9, 5, 12, 35, 4, 899654),
            maintainer=("Jelmer Vernooij", "jelmer@debian.org"),
        )
        self.assertFileEqual(
            """\
janitor (0.1-1jan+some1) some-fixes; urgency=low

  * Dummy build.

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 05 Sep 2020 12:35:04 -0000

janitor (0.1-1) UNRELEASED; urgency=medium

  * Initial release. (Closes: #XXXXXX)

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 04 Apr 2020 14:12:13 +0000
""",
            "debian/changelog",
        )

    def test_native(self):
        tree = self.make_branch_and_tree(".")
        self.build_tree_contents(
            [
                ("debian/",),
                (
                    "debian/changelog",
                    """\
janitor (0.1) UNRELEASED; urgency=medium

  * Initial release. (Closes: #XXXXXX)

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 04 Apr 2020 14:12:13 +0000
""",
                ),
            ]
        )
        tree.add(["debian", "debian/changelog"])
        add_dummy_changelog_entry(
            tree,
            "",
            "jan+some",
            "some-fixes",
            "Dummy build.",
            timestamp=datetime.datetime(2020, 9, 5, 12, 35, 4, 899654),
            maintainer=("Jelmer Vernooij", "jelmer@debian.org"),
        )
        self.assertFileEqual(
            """\
janitor (0.1jan+some1) some-fixes; urgency=low

  * Dummy build.

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 05 Sep 2020 12:35:04 -0000

janitor (0.1) UNRELEASED; urgency=medium

  * Initial release. (Closes: #XXXXXX)

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 04 Apr 2020 14:12:13 +0000
""",
            "debian/changelog",
        )

    def test_exists(self):
        tree = self.make_branch_and_tree(".")
        self.build_tree_contents(
            [
                ("debian/",),
                (
                    "debian/changelog",
                    """\
janitor (0.1-1jan+some1) UNRELEASED; urgency=medium

  * Initial release. (Closes: #XXXXXX)

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 04 Apr 2020 14:12:13 +0000
""",
                ),
            ]
        )
        tree.add(["debian", "debian/changelog"])
        add_dummy_changelog_entry(
            tree,
            "",
            "jan+some",
            "some-fixes",
            "Dummy build.",
            timestamp=datetime.datetime(2020, 9, 5, 12, 35, 4, 899654),
            maintainer=("Jelmer Vernooij", "jelmer@debian.org"),
        )
        self.assertFileEqual(
            """\
janitor (0.1-1jan+some2) some-fixes; urgency=low

  * Dummy build.

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 05 Sep 2020 12:35:04 -0000

janitor (0.1-1jan+some1) UNRELEASED; urgency=medium

  * Initial release. (Closes: #XXXXXX)

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 04 Apr 2020 14:12:13 +0000
""",
            "debian/changelog",
        )


class BuildArchitectureTests(TestCase):
    def setUp(self):
        super(BuildArchitectureTests, self).setUp()
        if not os.path.exists('/usr/bin/dpkg-architecture'):
            self.skipTest('not a debian system')

    def test_is_str(self):
        self.assertIsInstance(get_build_architecture(), str)


@ -0,0 +1,230 @@
#!/usr/bin/python
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import os
import re
from debian.deb822 import Deb822
from buildlog_consultant.common import (
    MissingCommand,
    MissingGoPackage,
    MissingPerlModule,
    MissingPkgConfig,
    MissingPythonModule,
    MissingRubyFile,
    MissingRubyGem,
    MissingValaPackage,
)

from ..debian import apt
from ..debian.apt import AptManager, FileSearcher
from ..debian.fix_build import (
    resolve_error,
    versioned_package_fixers,
    apt_fixers,
    BuildDependencyContext,
)
from breezy.tests import TestCaseWithTransport
class DummyAptSearcher(FileSearcher):
    def __init__(self, files):
        self._apt_files = files

    def search_files(self, path, regex=False):
        for p, pkg in sorted(self._apt_files.items()):
            if regex:
                if re.match(path, p):
                    yield pkg
            else:
                if path == p:
                    yield pkg
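The searcher either compares paths literally or treats the query as a regular expression anchored at the start of the path (`re.match`). A quick standalone check of both modes, using the same lookup logic as the test helper, written as a plain function:

```python
import re


def search_files(apt_files, path, regex=False):
    # Same lookup as DummyAptSearcher.search_files, as a plain function:
    # literal comparison by default, prefix-anchored regex when asked.
    for p, pkg in sorted(apt_files.items()):
        if regex:
            if re.match(path, p):
                yield pkg
        else:
            if path == p:
                yield pkg


files = {"/usr/bin/b": "bash", "/usr/bin/brz": "brz", "/usr/bin/brzier": "bash"}
print(list(search_files(files, "/usr/bin/brz")))              # → ['brz']
print(list(search_files(files, "/usr/bin/brz", regex=True)))  # → ['brz', 'bash']
```

The regex mode also matches `/usr/bin/brzier` because `re.match` only anchors at the beginning, which is why the `test_missing_command_brz` case seeds near-miss paths like `/usr/bin/brzier`.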
class ResolveErrorTests(TestCaseWithTransport):
    def setUp(self):
        super(ResolveErrorTests, self).setUp()
        if not os.path.exists('/usr/bin/dpkg-architecture'):
            self.skipTest('not a debian system')
        self.tree = self.make_branch_and_tree(".")
        self.build_tree_contents(
            [
                ("debian/",),
                (
                    "debian/control",
                    """\
Source: blah
Build-Depends: libc6

Package: python-blah
Depends: ${python3:Depends}
Description: A python package
 Foo
""",
                ),
                (
                    "debian/changelog",
                    """\
blah (0.1) UNRELEASED; urgency=medium

  * Initial release. (Closes: #XXXXXX)

 -- Jelmer Vernooij <jelmer@debian.org>  Sat, 04 Apr 2020 14:12:13 +0000
""",
                ),
            ]
        )
        self.tree.add(["debian", "debian/control", "debian/changelog"])
        self.tree.commit("Initial commit")
        self._apt_files = {}

    def resolve(self, error, context=("build",)):
        from ..session.plain import PlainSession

        session = PlainSession()
        apt = AptManager(session)
        apt._searchers = [DummyAptSearcher(self._apt_files)]
        context = BuildDependencyContext(
            self.tree,
            apt,
            subpath="",
            committer="ognibuild <ognibuild@jelmer.uk>",
            update_changelog=True,
        )
        fixers = versioned_package_fixers(session) + apt_fixers(apt)
        return resolve_error(error, context, fixers)

    def get_build_deps(self):
        with open(self.tree.abspath("debian/control"), "r") as f:
            return next(Deb822.iter_paragraphs(f)).get("Build-Depends", "")
    def test_missing_command_unknown(self):
        self._apt_files = {}
        self.assertFalse(self.resolve(MissingCommand("acommandthatdoesnotexist")))

    def test_missing_command_brz(self):
        self._apt_files = {
            "/usr/bin/b": "bash",
            "/usr/bin/brz": "brz",
            "/usr/bin/brzier": "bash",
        }
        self.overrideEnv('DEBEMAIL', 'jelmer@debian.org')
        self.overrideEnv('DEBFULLNAME', 'Jelmer Vernooij')
        self.assertTrue(self.resolve(MissingCommand("brz")))
        self.assertEqual("libc6, brz", self.get_build_deps())
        rev = self.tree.branch.repository.get_revision(self.tree.branch.last_revision())
        self.assertEqual("Add missing build dependency on brz.\n", rev.message)
        self.assertFalse(self.resolve(MissingCommand("brz")))
        self.assertEqual("libc6, brz", self.get_build_deps())

    def test_missing_command_ps(self):
        self._apt_files = {
            "/bin/ps": "procps",
            "/usr/bin/pscal": "xcal",
        }
        self.assertTrue(self.resolve(MissingCommand("ps")))
        self.assertEqual("libc6, procps", self.get_build_deps())

    def test_missing_ruby_file(self):
        self._apt_files = {
            "/usr/lib/ruby/vendor_ruby/rake/testtask.rb": "rake",
        }
        self.assertTrue(self.resolve(MissingRubyFile("rake/testtask")))
        self.assertEqual("libc6, rake", self.get_build_deps())

    def test_missing_ruby_file_from_gem(self):
        self._apt_files = {
            "/usr/share/rubygems-integration/all/gems/activesupport-"
            "5.2.3/lib/active_support/core_ext/string/strip.rb": "ruby-activesupport"
        }
        self.assertTrue(
            self.resolve(MissingRubyFile("active_support/core_ext/string/strip"))
        )
        self.assertEqual("libc6, ruby-activesupport", self.get_build_deps())

    def test_missing_ruby_gem(self):
        self._apt_files = {
            "/usr/share/rubygems-integration/all/specifications/"
            "bio-1.5.2.gemspec": "ruby-bio",
            "/usr/share/rubygems-integration/all/specifications/"
            "bio-2.0.2.gemspec": "ruby-bio",
        }
        self.assertTrue(self.resolve(MissingRubyGem("bio", None)))
        self.assertEqual("libc6, ruby-bio", self.get_build_deps())
        self.assertTrue(self.resolve(MissingRubyGem("bio", "2.0.3")))
        self.assertEqual("libc6, ruby-bio (>= 2.0.3)", self.get_build_deps())

    def test_missing_perl_module(self):
        self._apt_files = {"/usr/share/perl5/App/cpanminus/fatscript.pm": "cpanminus"}
        self.assertTrue(
            self.resolve(
                MissingPerlModule(
                    "App/cpanminus/fatscript.pm",
                    "App::cpanminus::fatscript",
                    [
                        "/<<PKGBUILDDIR>>/blib/lib",
                        "/<<PKGBUILDDIR>>/blib/arch",
                        "/etc/perl",
                        "/usr/local/lib/x86_64-linux-gnu/perl/5.30.0",
                        "/usr/local/share/perl/5.30.0",
                        "/usr/lib/x86_64-linux-gnu/perl5/5.30",
                        "/usr/share/perl5",
                        "/usr/lib/x86_64-linux-gnu/perl/5.30",
                        "/usr/share/perl/5.30",
                        "/usr/local/lib/site_perl",
                        "/usr/lib/x86_64-linux-gnu/perl-base",
                        ".",
                    ],
                )
            )
        )
        self.assertEqual("libc6, cpanminus", self.get_build_deps())

    def test_missing_pkg_config(self):
        self._apt_files = {
            "/usr/lib/x86_64-linux-gnu/pkgconfig/xcb-xfixes.pc": "libxcb-xfixes0-dev"
        }
        self.assertTrue(self.resolve(MissingPkgConfig("xcb-xfixes")))
        self.assertEqual("libc6, libxcb-xfixes0-dev", self.get_build_deps())

    def test_missing_pkg_config_versioned(self):
        self._apt_files = {
            "/usr/lib/x86_64-linux-gnu/pkgconfig/xcb-xfixes.pc": "libxcb-xfixes0-dev"
        }
        self.assertTrue(self.resolve(MissingPkgConfig("xcb-xfixes", "1.0")))
        self.assertEqual("libc6, libxcb-xfixes0-dev (>= 1.0)", self.get_build_deps())

    def test_missing_python_module(self):
        self._apt_files = {"/usr/lib/python3/dist-packages/m2r.py": "python3-m2r"}
        self.assertTrue(self.resolve(MissingPythonModule("m2r")))
        self.assertEqual("libc6, python3-m2r", self.get_build_deps())

    def test_missing_go_package(self):
        self._apt_files = {
            "/usr/share/gocode/src/github.com/chzyer/readline/utils_test.go": "golang-github-chzyer-readline-dev",
        }
        self.assertTrue(self.resolve(MissingGoPackage("github.com/chzyer/readline")))
        self.assertEqual(
            "libc6, golang-github-chzyer-readline-dev", self.get_build_deps()
        )

    def test_missing_vala_package(self):
        self._apt_files = {
            "/usr/share/vala-0.48/vapi/posix.vapi": "valac-0.48-vapi",
        }
        self.assertTrue(self.resolve(MissingValaPackage("posix")))
        self.assertEqual("libc6, valac-0.48-vapi", self.get_build_deps())

63
ognibuild/vcs.py Normal file

@ -0,0 +1,63 @@
#!/usr/bin/python3
# Copyright (C) 2020 Jelmer Vernooij <jelmer@jelmer.uk>
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
import errno
from breezy.errors import NotBranchError
from breezy.export import export
from breezy.workingtree import WorkingTree
from buildlog_consultant.sbuild import (
    NoSpaceOnDevice,
)

from . import DetailedFailure


def export_vcs_tree(tree, directory):
    try:
        export(tree, directory, "dir", None)
    except OSError as e:
        if e.errno == errno.ENOSPC:
            raise DetailedFailure(1, ["export"], NoSpaceOnDevice())
        raise


def dupe_vcs_tree(tree, directory):
    with tree.lock_read():
        if isinstance(tree, WorkingTree):
            tree = tree.basis_tree()
    try:
        result = tree._repository.controldir.sprout(
            directory, create_tree_if_local=True, revision_id=tree.get_revision_id()
        )
    except OSError as e:
        if e.errno == errno.ENOSPC:
            raise DetailedFailure(1, ["sprout"], NoSpaceOnDevice())
        raise
    if not result.has_workingtree():
        raise AssertionError
    # Copy parent location - some scripts need this
    if isinstance(tree, WorkingTree):
        parent = tree.branch.get_parent()
    else:
        try:
            parent = tree._repository.controldir.open_branch().get_parent()
        except NotBranchError:
            parent = None
    if parent:
        result.open_branch().set_parent(parent)
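Both helpers follow the same error-translation pattern: let the VCS operation run, but convert an out-of-space `OSError` into a structured `DetailedFailure` so callers can report it, while re-raising everything else. A self-contained sketch of that pattern (the exception classes and `run_vcs_op` here are simplified stand-ins, not the real ognibuild/buildlog-consultant classes):

```python
import errno


class NoSpaceOnDevice(Exception):
    """Simplified stand-in for buildlog_consultant.sbuild.NoSpaceOnDevice."""


class DetailedFailure(Exception):
    """Simplified stand-in for ognibuild.DetailedFailure."""

    def __init__(self, retcode, argv, error):
        self.retcode = retcode
        self.argv = argv
        self.error = error


def run_vcs_op(op, argv):
    # Translate ENOSPC into a structured DetailedFailure; any other
    # OSError propagates unchanged.
    try:
        return op()
    except OSError as e:
        if e.errno == errno.ENOSPC:
            raise DetailedFailure(1, argv, NoSpaceOnDevice())
        raise


def failing_export():
    raise OSError(errno.ENOSPC, "No space left on device")


try:
    run_vcs_op(failing_export, ["export"])
except DetailedFailure as f:
    print(f.argv, type(f.error).__name__)  # → ['export'] NoSpaceOnDevice
```

Wrapping the low-level error like this keeps the disk-full case machine-readable for the build log analysis that ognibuild drives.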

9
releaser.conf Normal file

@ -0,0 +1,9 @@
name: "ognibuild"
timeout_days: 5
tag_name: "v$VERSION"
verify_command: "python3 setup.py test"
update_version {
path: "setup.py"
match: "^ version=\"(.*)\",$"
new_line: " version=\"$VERSION\","
}


@ -1,5 +1,5 @@
[flake8]
banned-modules = silver-platter = Should not use silver-platter

[mypy]
# A number of ognibuilds' dependencies don't have type hints yet


@ -23,5 +23,17 @@ setup(name="ognibuild",
      ],
      entry_points={
          "console_scripts": [
              "ogni=ognibuild.__main__:main",
              "deb-fix-build=ognibuild.debian.fix_build:main",
          ]
      },
      install_requires=[
          'breezy',
          'buildlog-consultant',
      ],
      extras_require={
          'debian': ['debmutate', 'python_debian', 'python_apt'],
      },
      tests_require=['python_debian', 'buildlog-consultant', 'breezy'],
      test_suite='ognibuild.tests.test_suite',
)