Compare commits

...

44 Commits

38405b883b  importer: cleanup should be happening if there's no meta  (2024-03-10 21:45:08 -07:00)
e8b72866ae  pv2: make python 3.9 the minimum  (2024-03-10 21:39:31 -07:00)
d9e2f2b370  importer: make lack of meta message more informative  (2024-03-10 21:38:37 -07:00)
32642b1efd  importer: fix metafile not found issue  (2024-03-10 21:11:25 -07:00)
ff35a29d08  importer: Address upstream hash ref list issue  (2024-03-10 20:54:43 -07:00)
            There are cases where upstream (stream 10 for example) have repos for
            packages that they plan or will ship, but there are no references
            associated. Meaning, there's no branches, there's no data, nothing to
            work off of.
7f2004719a  importer: Resolve invalid escape sequence on newer python  (2024-03-08 19:12:50 -07:00)
            Signed-off-by: Louis Abel <label@rockylinux.org>
c2297d6189  rpmautospec: lack of changelog file shouldn't matter  (2024-03-08 19:10:05 -07:00)
            Signed-off-by: Louis Abel <label@rockylinux.org>
01a6696adc  processor fix  (2024-03-04 17:27:16 -07:00)
99035af3e9  regex fixes  (2024-03-04 17:09:33 -07:00)
90aa3e2016  add autochangelog logic  (2024-03-04 17:06:59 -07:00)
3e925ca471  add autochangelog logic  (2024-03-04 17:06:07 -07:00)
b982646f3a  update readme to specify rpmautospec  (2024-02-23 12:22:44 -07:00)
0854a39c58  keep it all the same  (2024-02-23 11:06:39 -07:00)
45abe82b40  try conving name for pkg  (2024-02-23 10:53:25 -07:00)
f3213dafaf  preconv name support for git imports  (2024-02-23 09:59:56 -07:00)
1255a10ef9  preconv name support for git imports  (2024-02-23 09:34:15 -07:00)
7001664d9f  preconv name support for git imports  (2024-02-23 09:08:24 -07:00)
236f18412b  preconv name support for git imports  (2024-02-23 08:52:16 -07:00)
0913ae912f  temporarily allow md5  (2024-02-12 17:22:00 -07:00)
93f5fbe65d  make blank sources file  (2024-01-31 15:18:23 -07:00)
95c701aa38  hotfix: ~bootstrap should be removed from tag  (2023-11-07 10:36:28 -07:00)
f4499ca17d  hotfix: new branch should be the actual set branch name  (2023-10-29 15:29:29 -07:00)
655a580afd  pgp keys can go in lookaside  (2023-10-19 13:51:40 -07:00)
022d4ed03a  verify signature is on by default, switch is wrong  (2023-10-17 18:55:11 -07:00)
4fc3f849e1  hotfix: make sure modulemd.src.txt is original  (2023-09-03 01:51:08 -07:00)
a4136a972b  ready option not needed  (2023-08-30 14:17:56 -07:00)
1f22fc3196  disable bootstrap images  (2023-08-30 13:14:07 -07:00)
dae5bb1789  add quick print statement  (2023-08-24 15:30:20 -07:00)
67106beac7  add s3 uploader  (2023-08-24 15:26:07 -07:00)
46ad89c403  add s3 uploader  (2023-08-24 15:22:16 -07:00)
70a9af6b10  add initial importer  (2023-08-23 14:16:08 -07:00)
de3e4c56ed  make var names consistent  (2023-08-22 00:35:53 -07:00)
163f723677  hotfix: fix parser error  (2023-08-22 00:06:16 -07:00)
            remove swap file
c329303281  0.11.0: add importer script for poetry  (2023-08-21 23:53:31 -07:00)
d0d7581e61  attempt to add rhel macro  (2023-08-09 20:31:11 -07:00)
e6627c48b7  hotfix: add print statement for moving  (2023-08-09 18:18:43 -07:00)
d232ddce1d  do not upload empty file  (2023-08-03 08:15:48 -07:00)
cb9165a243  do not upload empty file  (2023-08-01 13:02:31 -07:00)
b1b4ee4e38  hotfix: add temporary .gitignore removal on git import  (2023-07-29 19:23:09 -07:00)
8a8fe26b95  hotfix: repair returncode catch  (Louis Abel, 2023-07-13 21:54:24 -07:00)
417a969736  bump to version 0.10.1  (Louis Abel, 2023-07-13 13:55:34 -07:00)
bcea55337f  hotfix: add returncode checks for operator  (Louis Abel, 2023-07-13 13:29:54 -07:00)
32f0c3520c  non-source rpms should not be used in the importer  (Louis Abel, 2023-07-13 09:50:45 -07:00)
3e8f3bb642  Add readme badges  (Louis Abel, 2023-07-11 13:18:12 -07:00)
10 changed files with 468 additions and 53 deletions


@ -1,19 +1,25 @@
# pv2
![pv2 last commit](https://img.shields.io/github/last-commit/peridotbuild/pv2/development) ![pv2 issues](https://img.shields.io/github/issues/peridotbuild/pv2?link=https%3A%2F%2Fgithub.com%2Fperidotbuild%2Fpv2%2Fissues) ![prs](https://img.shields.io/github/issues-pr/peridotbuild/pv2?link=https%3A%2F%2Fgithub.com%2Fperidotbuild%2Fpv2%2Fpulls)
![language](https://img.shields.io/badge/language-python-blue)
![license](https://img.shields.io/github/license/peridotbuild/pv2)
pv2 is a backend module framework for building and development. Initially
designed as a POC to support peridot's transition to python, it provides
utilities that can be used for developers in and outside of the projects
in the RESF (such as Rocky Linux).
designed as a POC to support peridot's potential transition to python, it
provides utilities that can be used for developers in and outside of the
projects in the RESF (such as Rocky Linux).
## Requirements
* An RPM Distribution
* Fedora
* Enterprise Linux 8, 9+ recommended
* CentOS Stream 8, 9+ recommended
* Enterprise Linux 9+ recommended
* CentOS Stream 9+ recommended
* Python 3.6 or higher - Python 3.9+ recommended
* Python 3.9 or higher
* rpm-build
* A few python modules
@ -28,6 +34,10 @@ in the RESF (such as Rocky Linux).
* \*-rpm-macros
* \*-srpm-macros
* additional packages either in Fedora Linux or EPEL
* rpmautospec-rpm-macros
## Example Scripts
Example scripts are found in the `examples` directory, which can utilize


@ -12,6 +12,7 @@ import datetime
from pv2.util import gitutil, fileutil, rpmutil, processor, generic
from pv2.util import error as err
from pv2.util import constants as const
from pv2.util import uploader as upload
#try:
# import gi
@ -21,6 +22,13 @@ from pv2.util import constants as const
#except ImportError:
# HAS_GI = False
try:
from rpmautospec.subcommands import process_distgit as rpmautocl
HAS_RPMAUTOSPEC = True
except ImportError:
HAS_RPMAUTOSPEC = False
print('WARNING! rpmautospec was not found on this system and is not loaded.')
__all__ = [
'Import',
'SrpmImport',
@ -80,10 +88,13 @@ class Import:
f"'%_topdir {local_repo_path}'"
]
command_to_send = ' '.join(command_to_send)
processor.run_proc_no_output_shell(command_to_send)
returned = processor.run_proc_no_output_shell(command_to_send)
if returned.returncode != 0:
rpmerr = returned.stderr
raise err.RpmOpenError(f'This package could not be unpacked:\n\n{rpmerr}')
@staticmethod
def pack_srpm(srpm_dir, spec_file, dist_tag):
def pack_srpm(srpm_dir, spec_file, dist_tag, release_ver):
"""
Packs an srpm from available sources
"""
@ -99,10 +110,15 @@ class Import:
'--define',
f"'_topdir {srpm_dir}'",
'--define',
f"'_sourcedir {srpm_dir}'"
f"'_sourcedir {srpm_dir}'",
'--define',
f"'rhel {release_ver}'"
]
command_to_send = ' '.join(command_to_send)
returned = processor.run_proc_no_output_shell(command_to_send)
if returned.returncode != 0:
rpmerr = returned.stderr
raise err.RpmBuildError(f'There was an error packing the rpm:\n\n{rpmerr}')
wrote_regex = r'Wrote:\s+(.*\.rpm)'
regex_search = re.search(wrote_regex, returned.stdout, re.MULTILINE)
if regex_search:
@ -141,9 +157,21 @@ class Import:
for file in os.scandir(f'{local_repo_path}/SOURCES'):
full_path = f'{local_repo_path}/SOURCES/{file.name}'
magic = fileutil.get_magic_file(full_path)
if magic.name == 'empty':
continue
# PGP public keys have been in the lookaside before. We'll
# just do it this way. It gets around weird gitignores and
# weird srpmproc behavior.
if 'PGP public' in magic.name:
source_dict[f'SOURCES/{file.name}'] = fileutil.get_checksum(full_path)
if magic.encoding == 'binary':
source_dict[f'SOURCES/{file.name}'] = fileutil.get_checksum(full_path)
# This is a list of possible file names that should be in
# lookaside, even if their type ISN'T that.
if full_path.endswith('.rpm'):
source_dict[f'SOURCES/{file.name}'] = fileutil.get_checksum(full_path)
return source_dict
@staticmethod
@ -179,9 +207,40 @@ class Import:
print(f'{dest_path} already exists, skipping')
os.remove(source_path)
else:
print(f'Moving {source_path} to {dest_path}')
shutil.move(src=source_path, dst=dest_path)
if os.path.exists('/usr/sbin/restorecon'):
processor.run_proc_foreground_shell(f'/usr/sbin/restorecon {dest_path}')
@staticmethod
# pylint: disable=too-many-arguments
def upload_to_s3(repo_path, file_dict: dict, bucket, aws_key_id: str,
aws_secret_key: str, overwrite: bool = False):
"""
Upload an object to s3
"""
print('Pushing sources to S3...')
for name, sha in file_dict.items():
source_path = f'{repo_path}/{name}'
dest_name = sha
upload.upload_to_s3(source_path, bucket, aws_key_id,
aws_secret_key, dest_name=dest_name,
overwrite=overwrite)
@staticmethod
def import_lookaside_peridot_cli(
repo_path: str,
repo_name: str,
file_dict: dict,
):
"""
Attempts to find and use the peridot-cli binary to upload to peridot's
lookaside. This assumes the environment is set up correctly with the
necessary variables.
Note: This is a temporary hack and will be removed in a future update.
"""
for name, _ in file_dict.items():
source_path = f'{repo_path}/{name}'
@staticmethod
def skip_import_lookaside(repo_path: str, file_dict: dict):
@ -300,8 +359,12 @@ class SrpmImport(Import):
distprefix: str = 'el',
git_user: str = 'git',
org: str = 'rpms',
preconv_names: bool = False,
dest_lookaside: str = '/var/www/html/sources',
verify_signature: bool = False
verify_signature: bool = False,
aws_access_key_id: str = '',
aws_access_key: str = '',
aws_bucket: str = ''
):
"""
Init the class.
@ -318,9 +381,20 @@ class SrpmImport(Import):
self.__dest_lookaside = dest_lookaside
pkg_name = self.__srpm_metadata['name']
git_url = f'ssh://{git_user}@{git_url_path}/{org}/{pkg_name}.git'
package_name = pkg_name
if preconv_names:
package_name = pkg_name.replace('+', 'plus')
git_url = f'ssh://{git_user}@{git_url_path}/{org}/{package_name}.git'
self.__git_url = git_url
file_name_search_srpm_res = re.search(r'.*?\.src\.rpm$',
self.__srpm_path, re.IGNORECASE)
if not file_name_search_srpm_res:
raise err.RpmInfoError('This is not a source package')
if len(release) == 0:
self.__release = self.__get_srpm_release_version
@ -332,6 +406,10 @@ class SrpmImport(Import):
self.__branch = f'c{release}'
print(f'Warning: Branch name not specified, defaulting to {self.__branch}')
self.__aws_access_key_id = aws_access_key_id
self.__aws_access_key = aws_access_key
self.__aws_bucket = aws_bucket
def __get_srpm_release_version(self):
"""
Gets the release version from the srpm
@ -344,7 +422,8 @@ class SrpmImport(Import):
return None
def pkg_import(self, skip_lookaside: bool = False):
# pylint: disable=too-many-locals
def pkg_import(self, skip_lookaside: bool = False, s3_upload: bool = False):
"""
Actually perform the import
@ -352,7 +431,7 @@ class SrpmImport(Import):
than uploaded to lookaside.
"""
check_repo = gitutil.lsremote(self.git_url)
git_repo_path = f'/var/tmp/{self.rpm_name}'
git_repo_path = f'/var/tmp/{self.rpm_name_replace}'
branch = self.__branch
repo_tags = []
@ -384,7 +463,7 @@ class SrpmImport(Import):
repo_name=self.rpm_name_replace,
branch=None
)
gitutil.checkout(repo, branch=self.__branch, orphan=True)
gitutil.checkout(repo, branch=branch, orphan=True)
# Remove everything, plain and simple. Only needed for clone.
self.remove_everything(repo.working_dir)
for tag_name in repo.tags:
@ -412,12 +491,30 @@ class SrpmImport(Import):
self.generate_metadata(git_repo_path, self.rpm_name, sources)
self.generate_filesum(git_repo_path, self.rpm_name, self.srpm_hash)
if s3_upload:
# I don't want to blatantly blow up here yet.
if len(self.__aws_access_key_id) == 0 or len(self.__aws_access_key) == 0 or len(self.__aws_bucket) == 0:
print('WARNING: No access key, ID, or bucket was provided. Skipping upload.')
else:
self.upload_to_s3(
git_repo_path,
sources,
self.__aws_bucket,
self.__aws_access_key_id,
self.__aws_access_key,
)
if skip_lookaside:
self.skip_import_lookaside(git_repo_path, sources)
else:
self.import_lookaside(git_repo_path, self.rpm_name, branch,
sources, self.dest_lookaside)
# Temporary hack like with git.
dest_gitignore_file = f'{git_repo_path}/.gitignore'
if os.path.exists(dest_gitignore_file):
os.remove(dest_gitignore_file)
gitutil.add_all(repo)
verify = repo.is_dirty()
@ -462,6 +559,14 @@ class SrpmImport(Import):
"""
return self.__srpm_metadata['name']
@property
def rpm_name_replace(self):
"""
Returns name of srpm
"""
new_name = self.__srpm_metadata['name'].replace('+', 'plus')
return new_name
@property
def rpm_version(self):
"""
@ -474,7 +579,9 @@ class SrpmImport(Import):
"""
Returns release of srpm
"""
return self.__srpm_metadata['release']
# Remove ~bootstrap
final_string = self.__srpm_metadata['release'].replace('~bootstrap', '')
return final_string
@property
def part_of_module(self):
@ -489,14 +596,6 @@ class SrpmImport(Import):
return False
@property
def rpm_name_replace(self):
"""
Returns a "fixed" version of the RPM name
"""
new_name = self.__srpm_metadata['name'].replace('+', 'plus')
return new_name
@property
def distprefix(self):
"""
@ -527,19 +626,24 @@ class GitImport(Import):
package: str,
source_git_url_path: str,
source_git_org_path: str,
git_url_path: str,
dest_git_url_path: str,
release: str,
branch: str,
source_branch: str,
upstream_lookaside: str,
scl_mode: bool = False,
scl_package: str = '',
alternate_spec_name: str = '',
preconv_names: bool = False,
dest_lookaside: str = '/var/www/html/sources',
source_git_protocol: str = 'https',
dest_branch: str = '',
distprefix: str = 'el',
git_user: str = 'git',
org: str = 'rpms'
source_git_user: str = 'git',
dest_git_user: str = 'git',
dest_org: str = 'rpms',
aws_access_key_id: str = '',
aws_access_key: str = '',
aws_bucket: str = ''
):
"""
Init the class.
@ -550,16 +654,28 @@ class GitImport(Import):
self.__rpm = package
self.__release = release
# pylint: disable=line-too-long
self.__source_git_url = f'{source_git_protocol}://{source_git_url_path}/{source_git_org_path}/{package}.git'
self.__git_url = f'ssh://{git_user}@{git_url_path}/{org}/{package}.git'
full_source_git_url_path = source_git_url_path
if source_git_protocol == 'ssh':
full_source_git_url_path = f'{source_git_user}@{source_git_url_path}'
package_name = package
if preconv_names:
package_name = package.replace('+', 'plus')
self.__source_git_url = f'{source_git_protocol}://{full_source_git_url_path}/{source_git_org_path}/{package_name}.git'
self.__dest_git_url = f'ssh://{dest_git_user}@{dest_git_url_path}/{dest_org}/{package_name}.git'
self.__dist_prefix = distprefix
self.__dist_tag = f'.{distprefix}{release}'
self.__branch = branch
self.__dest_branch = branch
self.__source_branch = source_branch
self.__dest_branch = source_branch
self.__dest_lookaside = dest_lookaside
self.__upstream_lookaside = upstream_lookaside
self.__upstream_lookaside_url = self.get_lookaside_template_path(upstream_lookaside)
self.__alternate_spec_name = alternate_spec_name
self.__preconv_names = preconv_names
self.__aws_access_key_id = aws_access_key_id
self.__aws_access_key = aws_access_key
self.__aws_bucket = aws_bucket
if len(dest_branch) > 0:
self.__dest_branch = dest_branch
@ -567,8 +683,8 @@ class GitImport(Import):
if not self.__upstream_lookaside:
raise err.ConfigurationError(f'{upstream_lookaside} is not valid.')
# pylint: disable=too-many-locals
def pkg_import(self, skip_lookaside: bool = False):
# pylint: disable=too-many-locals, too-many-statements, too-many-branches
def pkg_import(self, skip_lookaside: bool = False, s3_upload: bool = False):
"""
Actually perform the import
@ -579,12 +695,14 @@ class GitImport(Import):
check_dest_repo = gitutil.lsremote(self.dest_git_url)
source_git_repo_path = f'/var/tmp/{self.rpm_name}-source'
source_git_repo_spec = f'{source_git_repo_path}/{self.rpm_name}.spec'
source_git_repo_changelog = f'{source_git_repo_path}/changelog'
dest_git_repo_path = f'/var/tmp/{self.rpm_name}'
metadata_file = f'{source_git_repo_path}/.{self.rpm_name}.metadata'
sources_file = f'{source_git_repo_path}/sources'
source_branch = self.source_branch
dest_branch = self.dest_branch
_dist_tag = self.dist_tag
release_ver = self.__release
repo_tags = []
# If the upstream repo doesn't report anything, exit.
@ -647,6 +765,7 @@ class GitImport(Import):
# Within the confines of the source git repo, we need to find a
# "sources" file or a metadata file. One of these will determine which
# route we take.
metafile_to_use = None
if os.path.exists(metadata_file):
no_metadata_list = ['stream', 'fedora']
if any(ignore in self.upstream_lookaside for ignore in no_metadata_list):
@ -660,7 +779,16 @@ class GitImport(Import):
raise err.ConfigurationError(f'sources files are not supported with {self.upstream_lookaside}')
metafile_to_use = sources_file
else:
raise err.GenericError('sources or metadata file NOT found')
#raise err.GenericError('sources or metadata file NOT found')
# There isn't a reason to make a blank file right now.
print('WARNING: No sources or metadata file was found.')
with open(metadata_file, 'w+') as metadata_handle:
pass
if not metafile_to_use:
print('Source: There was no metadata file found. Skipping import attempt.')
self.perform_cleanup([source_git_repo_path, dest_git_repo_path])
return False
sources_dict = self.parse_metadata_file(metafile_to_use)
@ -689,8 +817,37 @@ class GitImport(Import):
if not os.path.exists(source_git_repo_spec) and len(self.alternate_spec_name) == 0:
source_git_repo_spec = self.find_spec_file(source_git_repo_path)
# do rpm autochangelog logic here
#if HAS_RPMAUTOSPEC and os.path.exists(source_git_repo_changelog):
if HAS_RPMAUTOSPEC:
# Check that the spec file really has %autochangelog
AUTOCHANGELOG = False
with open(source_git_repo_spec, 'r') as spec_file:
for line in spec_file:
if re.match(r'^%autochangelog', line):
print('autochangelog found')
AUTOCHANGELOG = True
spec_file.close()
# It was easier to do this than reimplement logic
if AUTOCHANGELOG:
try:
rpmautocl.process_distgit(
source_git_repo_spec,
f'/tmp/{self.rpm_name}.spec'
)
except Exception as exc:
raise err.GenericError('There was an error with autospec.') from exc
shutil.copy(f'/tmp/{self.rpm_name}.spec',
f'{source_git_repo_path}/{self.rpm_name}.spec')
os.remove(f'/tmp/{self.rpm_name}.spec')
# attempt to pack up the RPM, get metadata
packed_srpm = self.pack_srpm(source_git_repo_path, source_git_repo_spec, _dist_tag)
packed_srpm = self.pack_srpm(source_git_repo_path,
source_git_repo_spec,
_dist_tag,
release_ver)
if not packed_srpm:
raise err.MissingValueError(
'The srpm was not written, yet the command completed successfully.'
@ -711,12 +868,33 @@ class GitImport(Import):
self.generate_metadata(dest_git_repo_path, self.rpm_name, sources)
self.generate_filesum(dest_git_repo_path, self.rpm_name, "Direct Git Import")
if s3_upload:
# I don't want to blatantly blow up here yet.
if len(self.__aws_access_key_id) == 0 or len(self.__aws_access_key) == 0 or len(self.__aws_bucket) == 0:
print('WARNING: No access key, ID, or bucket was provided. Skipping upload.')
else:
self.upload_to_s3(
dest_git_repo_path,
sources,
self.__aws_bucket,
self.__aws_access_key_id,
self.__aws_access_key,
)
if skip_lookaside:
self.skip_import_lookaside(dest_git_repo_path, sources)
else:
self.import_lookaside(dest_git_repo_path, self.rpm_name, dest_branch,
sources, self.dest_lookaside)
# This is a temporary hack. There are cases where the .gitignore that's
# provided by upstream erroneously keeps out certain sources, despite
# the fact that they were pushed before. We're killing off any
# .gitignore we find in the root.
dest_gitignore_file = f'{dest_git_repo_path}/.gitignore'
if os.path.exists(dest_gitignore_file):
os.remove(dest_gitignore_file)
gitutil.add_all(dest_repo)
verify = dest_repo.is_dirty()
if verify:
@ -733,8 +911,11 @@ class GitImport(Import):
"""
Returns the translated URL to obtain sources
"""
rpm_name = self.rpm_name
if self.preconv_names:
rpm_name = self.rpm_name_replace
dict_template = {
'PKG_NAME': self.rpm_name,
'PKG_NAME': rpm_name,
'FILENAME': filename,
'HASH_TYPE': hashtype.lower(),
'HASH': checksum
@ -771,7 +952,7 @@ class GitImport(Import):
"""
Returns the starting branch
"""
return self.__branch
return self.__source_branch
@property
def dest_branch(self):
@ -792,7 +973,7 @@ class GitImport(Import):
"""
Returns the destination git url
"""
return self.__git_url
return self.__dest_git_url
@property
def dist_tag(self):
@ -822,6 +1003,13 @@ class GitImport(Import):
"""
return self.__dest_lookaside
@property
def preconv_names(self):
"""
Returns if names are being preconverted
"""
return self.__preconv_names
class ModuleImport(Import):
"""
Imports module repos
@ -962,7 +1150,7 @@ class ModuleImport(Import):
with open(modulemd_file, 'r') as module_yaml:
content = module_yaml.read()
content_new = re.sub('ref:\s+(.*)', f'ref: {dest_branch}', content)
content_new = re.sub(r'ref:\s+(.*)', f'ref: {dest_branch}', content)
module_yaml.close()
# Write to the root
@ -970,10 +1158,11 @@ class ModuleImport(Import):
module_yaml.write(content_new)
module_yaml.close()
# Write to the sources, should be the same content
with open(f'{dest_git_repo_path}/SOURCES/modulemd.src.txt', 'w') as module_yaml:
module_yaml.write(content_new)
module_yaml.close()
# Write to the sources. It needs to be the original content.
shutil.copy(modulemd_file, f'{dest_git_repo_path}/SOURCES/modulemd.src.txt')
#with open(f'{dest_git_repo_path}/SOURCES/modulemd.src.txt', 'w') as module_yaml:
# module_yaml.write(content_new)
# module_yaml.close()
self.generate_metadata(dest_git_repo_path, self.module_name, {})
gitutil.add_all(dest_repo)
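
The importer changes above thread optional S3 credentials through the constructors and gate the actual upload on the new `s3_upload` flag of `pkg_import()`. A minimal sketch of how a caller might wire this together, following the signatures shown in this diff; the host, SRPM path, keys, and bucket below are placeholders, not values from the repository:

```python
# Hypothetical wiring of the new S3 options; every value here is a placeholder.
import pv2.importer as importutil

importer = importutil.SrpmImport(
    git_url_path='git.example.org',                  # placeholder dist-git host
    srpm_path='/var/tmp/bash-5.2.15-1.el9.src.rpm',  # placeholder SRPM
    branch='r9',
    aws_access_key_id='AKIAEXAMPLE',
    aws_access_key='EXAMPLESECRET',
    aws_bucket='example-lookaside',
)
# s3_upload only takes effect when the key ID, secret key, and bucket are all
# non-empty; otherwise the importer prints a warning and skips the upload.
importer.pkg_import(skip_lookaside=True, s3_upload=True)
```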


@ -340,6 +340,7 @@ class MockConfig(MockConfigUtils):
packager: str = 'Default Packager <packager@noone.home>',
distsuffix=None,
distribution=None,
use_bootstrap_image=False,
**kwargs
):
"""
@ -418,6 +419,7 @@ class MockConfig(MockConfigUtils):
'rpmbuild_networking': enable_networking,
'print_main_output': print_main_output,
'macros': default_macros,
'use_bootstrap_image': use_bootstrap_image,
}
self.__config_opts.update(**kwargs)
self.__extra_config_opts = collections.defaultdict(list)

pv2/scripts/import_pkg.py (new file, 111 lines)

@ -0,0 +1,111 @@
#!/usr/bin/python3
# This is called to do imports, whether from an RPM or a git repo (e.g. CentOS
# stream gitlab)
import argparse
import pv2.importer as importutil
parser = argparse.ArgumentParser(description="Importer Utility")
subparser = parser.add_subparsers(dest='cmd')
subparser.required = True
rpm_parser = subparser.add_parser('rpm')
git_parser = subparser.add_parser('git')
rpm_parser.add_argument('--gituser', type=str, required=False, default='git')
rpm_parser.add_argument('--giturl', type=str, required=True)
rpm_parser.add_argument('--branch', type=str, required=True)
rpm_parser.add_argument('--srpm', type=str, required=True)
rpm_parser.add_argument('--release', type=str, required=False, default='')
rpm_parser.add_argument('--gitorg', type=str, required=False, default='rpms')
rpm_parser.add_argument('--distprefix', type=str, required=False, default='el')
rpm_parser.add_argument('--dest-lookaside', type=str, required=False, default='/var/www/html/sources')
rpm_parser.add_argument('--no-verify-signature', action='store_true')
rpm_parser.add_argument('--skip-lookaside-upload',
action='store_true',
help='Set this flag to skip uploading to /var/www/html/sources esque lookaside')
rpm_parser.add_argument('--upload-to-s3', action='store_true')
rpm_parser.add_argument('--aws-access-key-id', type=str, required=False, default='')
rpm_parser.add_argument('--aws-access-key', type=str, required=False, default='')
rpm_parser.add_argument('--aws-bucket', type=str, required=False, default='')
git_parser.add_argument('--name', type=str, required=True)
git_parser.add_argument('--source-gituser', type=str, required=False, default='git')
git_parser.add_argument('--source-giturl', type=str, required=True)
git_parser.add_argument('--source-gitorg', type=str, required=True)
git_parser.add_argument('--source-branch', type=str, required=True)
git_parser.add_argument('--dest-gituser', type=str, required=False, default='git')
git_parser.add_argument('--dest-giturl', type=str, required=True)
git_parser.add_argument('--dest-gitorg', type=str, required=False, default='rpms')
git_parser.add_argument('--dest-branch', type=str, required=False, default='')
git_parser.add_argument('--release', type=str, required=False, default='')
git_parser.add_argument('--preconv-names', action='store_true', help='Convert + to plus first')
git_parser.add_argument('--distprefix', type=str, required=False, default='el')
git_parser.add_argument('--dest-lookaside', type=str, required=False, default='/var/www/html/sources')
git_parser.add_argument('--upstream-lookaside',
choices=('rocky8', 'rocky', 'centos', 'stream', 'fedora'),
required=True)
git_parser.add_argument('--alternate-spec-name',
type=str, required=False,
default='',
help='ex: if kernel-rt, use kernel. only use if built-in finder is failing')
git_parser.add_argument('--skip-lookaside-upload',
action='store_true',
help='Set this flag to skip uploading to /var/www/html/sources esque lookaside')
git_parser.add_argument('--upload-to-s3', action='store_true')
git_parser.add_argument('--aws-access-key-id', type=str, required=False, default='')
git_parser.add_argument('--aws-access-key', type=str, required=False, default='')
git_parser.add_argument('--aws-bucket', type=str, required=False, default='')
results = parser.parse_args()
command = parser.parse_args().cmd
def main():
"""
Run the main program. Callable via poetry or __main__
"""
if command == 'rpm':
classy = importutil.SrpmImport(
git_url_path=results.giturl,
srpm_path=results.srpm,
release=results.release,
branch=results.branch,
distprefix=results.distprefix,
git_user=results.gituser,
org=results.gitorg,
dest_lookaside=results.dest_lookaside,
verify_signature=results.no_verify_signature,
aws_access_key_id=results.aws_access_key_id,
aws_access_key=results.aws_access_key,
aws_bucket=results.aws_bucket,
)
classy.pkg_import(skip_lookaside=results.skip_lookaside_upload,
s3_upload=results.upload_to_s3)
elif command == 'git':
classy = importutil.GitImport(
package=results.name,
source_git_user=results.source_gituser,
source_git_url_path=results.source_giturl,
source_git_org_path=results.source_gitorg,
dest_git_user=results.dest_gituser,
dest_git_url_path=results.dest_giturl,
dest_org=results.dest_gitorg,
release=results.release,
preconv_names=results.preconv_names,
source_branch=results.source_branch,
dest_branch=results.dest_branch,
upstream_lookaside=results.upstream_lookaside,
distprefix=results.distprefix,
alternate_spec_name=results.alternate_spec_name,
dest_lookaside=results.dest_lookaside,
aws_access_key_id=results.aws_access_key_id,
aws_access_key=results.aws_access_key,
aws_bucket=results.aws_bucket,
)
classy.pkg_import(skip_lookaside=results.skip_lookaside_upload,
s3_upload=results.upload_to_s3)
else:
print('Unknown command')
if __name__ == '__main__':
main()


@ -96,6 +96,10 @@ class ErrorConstants:
RPM_ERR_OPEN = 9400
RPM_ERR_SIG = 9401
RPM_ERR_INFO = 9402
RPM_ERR_BUILD = 9403
# Upload errors
UPLOAD_ERR = 9500
# pylint: disable=too-few-public-methods
class MockConstants:


@ -31,6 +31,8 @@ __all__ = [
'RpmOpenError',
'RpmSigError',
'RpmInfoError',
'RpmBuildError',
'UploadError',
]
@ -170,7 +172,19 @@ class RpmSigError(GenericError):
class RpmInfoError(GenericError):
"""
There was an issue opening the RPM because the signature could not be
verified
There was an issue opening the RPM because the RPM is not valid.
"""
fault_code = errconst.RPM_ERR_INFO
class RpmBuildError(GenericError):
"""
There was an issue building or packing the RPM.
"""
fault_code = errconst.RPM_ERR_BUILD
class UploadError(GenericError):
"""
There was an issue uploading an artifact, or the uploader is not
working.
"""
fault_code = errconst.UPLOAD_ERR
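
The two new exception classes slot into the same `GenericError` hierarchy as the existing RPM errors, so callers can catch them the usual way. A brief, hypothetical sketch; the `importer` object stands in for any of the import classes in this changeset:

```python
from pv2.util import error as err

try:
    importer.pkg_import(s3_upload=True)   # hypothetical importer instance
except err.RpmBuildError as exc:
    # Raised by pack_srpm() when the rpmbuild command exits non-zero.
    print(f'Packing the SRPM failed: {exc}')
except err.UploadError as exc:
    # Raised by the uploader, e.g. when boto3 is not available.
    print(f'Uploading sources failed: {exc}')
```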


@ -46,8 +46,8 @@ def get_checksum(file_path: str, hashtype: str = 'sha256') -> str:
Borrowed from empanadas with some modifications
"""
# We shouldn't be using sha1 or md5.
if hashtype in ('sha', 'sha1', 'md5'):
raise err.ProvidedValueError(f'{hashtype} is not allowed.')
#if hashtype in ('sha', 'sha1', 'md5'):
# raise err.ProvidedValueError(f'{hashtype} is not allowed.')
try:
checksum = hashlib.new(hashtype)


@ -139,8 +139,11 @@ def tag(repo, tag_name:str, message: str):
def lsremote(url):
"""
Helps check if a repo exists, and if it does, return references. If not,
return None and assume it doesn't exist.
Helps check if a repo exists.
If repo exists: return references
If repo exists and is completely empty: return empty dict
If repo does not exist: return None
"""
remote_refs = {}
git_cmd = rawgit.cmd.Git()
@ -153,5 +156,6 @@ def lsremote(url):
for ref in git_cmd.ls_remote(url).split('\n'):
hash_ref_list = ref.split('\t')
remote_refs[hash_ref_list[1]] = hash_ref_list[0]
if len(hash_ref_list) > 1:
remote_refs[hash_ref_list[1]] = hash_ref_list[0]
return remote_refs
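
With the clarified contract above, a caller has to distinguish a missing repo from an empty one (the "no references" case described in the ff35a29d08 commit message). A minimal sketch of that check, using the function as defined in this file; the URL is a placeholder:

```python
from pv2.util import gitutil

refs = gitutil.lsremote('ssh://git@git.example.org/rpms/some-package.git')
if refs is None:
    # Repo does not exist upstream; nothing to import.
    print('Repo not found; skipping.')
elif not refs:
    # Repo exists but has no branches or tags, so there is nothing to work from.
    print('Repo is empty; skipping.')
else:
    # Normal case: a dict mapping reference names to commit hashes.
    print(f'Found {len(refs)} references')
```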

pv2/util/uploader.py (new file, 75 lines)

@ -0,0 +1,75 @@
# -*- mode:python; coding:utf-8; -*-
# Louis Abel <label@rockylinux.org>
"""
Utility functions for uploading artifacts
"""
import os
import sys
import threading
import pv2.util.error as err
try:
import boto3
s3 = boto3.client('s3')
except ImportError:
s3 = None
__all__ = [
'S3ProgressPercentage',
'upload_to_s3'
]
class S3ProgressPercentage:
"""
Displays progress of uploads. Loosely borrowed from the aws documentation.
"""
def __init__(self, filename):
self.__filename = filename
self.__size = float(os.path.getsize(filename))
self.__seen = 0
self.__lock = threading.Lock()
def __call__(self, num_of_bytes):
with self.__lock:
self.__seen += num_of_bytes
percentage = (self.__seen / self.__size) * 100
sys.stdout.write(
"\r%s %s / %s (%.2f%%)" % (self.__filename,
self.__seen,
self.__size,
percentage)
)
sys.stdout.flush()
def upload_to_s3(
input_file,
bucket,
access_key_id: str,
access_key: str,
dest_name=None
):
"""
Uploads an artifact to s3.
"""
if dest_name is None:
dest_name = os.path.basename(input_file)
if s3 is None:
raise err.UploadError('s3 module is not available')
s3_client = boto3.client(
's3',
aws_access_key_id=access_key_id,
aws_secret_access_key=access_key
)
with open(input_file, 'rb') as inner:
s3_client.upload_fileobj(
inner,
bucket,
dest_name,
Callback=S3ProgressPercentage(input_file)
)
inner.close()
# Hacky way to get a new line
sys.stdout.write('\n')
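
A minimal usage sketch for the new uploader, following the signature above; the path, bucket, and credentials are placeholders, and `dest_name` mirrors how the importer passes a checksum as the object key:

```python
from pv2.util import uploader

# All values below are placeholders. dest_name defaults to the file's
# basename; the importer instead passes the source's checksum so the object
# key matches the lookaside layout.
uploader.upload_to_s3(
    '/var/tmp/package/SOURCES/source0.tar.gz',
    'example-lookaside',
    access_key_id='AKIAEXAMPLE',
    access_key='EXAMPLESECRET',
    dest_name='0123abcd',  # e.g. the sha256 checksum of the file
)
```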


@ -1,6 +1,6 @@
[project]
name = "pv2"
version = "0.10.0"
version = "0.12.0"
description = "PV2 backend framework module"
readme = "README.md"
authors = [
@ -11,13 +11,14 @@ maintainers = [
{ name = "Louis Abel", email = "label@rockylinux.org" }
]
requires-python = ">=3.6"
requires-python = ">=3.9"
dependencies = [
"GitPython >= 3.1.30",
"lxml >= 4.6.5",
"file-magic >= 0.4.0",
"pycurl >= 7.43.0.6"
"pycurl >= 7.43.0.6",
"boto3 >= 1.22.10"
]
[project.urls]
@ -29,3 +30,8 @@ file = "LICENSE"
[tool.setuptools]
package-dir = { "pv2" = "pv2" }
[tool.poetry.scripts]
import_pkg = "pv2.scripts.import_pkg:run"
#pkg_info = "pv2.scripts.pkg_info:run"
#build_pkg = "pv2.scripts.build_pkg:run"