Compare commits


1 Commit

Author: Neil Hanlon
SHA1: ff4af8b327
Message: Some fixes to run on el8 and build a cloud sig boot iso
Date: 2022-06-29 20:25:02 -04:00
249 changed files with 2430 additions and 14177 deletions

View File

@ -1,47 +0,0 @@
---
name: Build empanada images for imagefactory
on:
push:
branches: [ $default-branch, "devel" ]
pull_request:
branches: [ $default-branch ]
workflow_dispatch:
jobs:
buildx:
runs-on:
- ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v1
# https://github.com/docker/setup-buildx-action
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v1
with:
install: true
- name: Login to ghcr
if: github.event_name != 'pull_request'
uses: docker/login-action@v1
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Build and push
id: docker_build
uses: docker/build-push-action@v2
with:
builder: ${{ steps.buildx.outputs.name }}
platforms: linux/amd64,linux/arm64,linux/s390x,linux/ppc64le
context: ./iso/empanadas
file: ./iso/empanadas/Containerfile.imagefactory
push: ${{ github.event_name != 'pull_request' }}
tags: ghcr.io/rocky-linux/empanadas-imagefactory:latest
cache-from: type=gha
cache-to: type=gha,mode=max
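
For reference, the job above boils down to a single multi-arch buildx invocation. A rough local equivalent, assuming Docker with a configured buildx builder and QEMU emulation for the foreign architectures, would be:

```bash
# Rough local equivalent of the CI job above; paths, tag, and platforms are
# taken from the workflow, the buildx/QEMU setup is assumed.
docker buildx build \
  --platform linux/amd64,linux/arm64,linux/s390x,linux/ppc64le \
  --file ./iso/empanadas/Containerfile.imagefactory \
  --tag ghcr.io/rocky-linux/empanadas-imagefactory:latest \
  ./iso/empanadas
```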

View File

@ -1,5 +1,5 @@
---
name: Build empanada container images for lorax
name: Build empanada container images
on:
push:
@ -17,17 +17,17 @@ jobs:
uses: actions/checkout@v2
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
uses: docker/setup-qemu-action@v1
# https://github.com/docker/setup-buildx-action
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3
uses: docker/setup-buildx-action@v1
with:
install: true
- name: Login to ghcr
if: github.event_name != 'pull_request'
uses: docker/login-action@v3
uses: docker/login-action@v1
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
@ -35,7 +35,7 @@ jobs:
- name: Build and push
id: docker_build
uses: docker/build-push-action@v5
uses: docker/build-push-action@v2
with:
builder: ${{ steps.buildx.outputs.name }}
platforms: linux/amd64,linux/arm64,linux/s390x,linux/ppc64le
@ -44,4 +44,4 @@ jobs:
push: ${{ github.event_name != 'pull_request' }}
tags: ghcr.io/rocky-linux/sig-core-toolkit:latest
cache-from: type=gha
cache-to: type=inline
cache-to: type=gha,mode=max

.gitignore vendored (1 line changed)
View File

@ -1 +0,0 @@
*.sw[a-z]

View File

@ -4,8 +4,7 @@ sig-core-toolkit
Release Engineering toolkit for repeatable operations or functionality testing.
Currently mirrored at our [github](https://github.com/rocky-linux), and the
[RESF Git Service](https://git.resf.org). Changes will typically occur at the
RESF Git Service.
[RESF Git Service](https://git.resf.org). Changes will typically occur at GitHub.
What does this have?
--------------------
@ -13,9 +12,10 @@ What does this have?
* analyze -> Analysis utilities (such as download stats)
* chat -> mattermost related utilities
* func -> (mostly defunct) testing scripts and tools to test base functionality
* iso -> Contains `empanadas`, which provides ISO, Compose, and Sync related utilities.
* iso -> ISO and Compose related utilities, primarily for Rocky Linux 9+
* live -> Live image related utilities
* mangle -> Manglers and other misc stuff
* sync -> Sync tools, primarily for Rocky Linux 8 and will eventually be deprecated
* sync -> Sync tools, primarily for Rocky Linux 8
How can I help?
---------------
@ -23,17 +23,13 @@ How can I help?
Fork this repository and open a PR with your changes. Keep these things in mind
when you make changes:
* Your PR should be against the devel branch (not optional)
* Have pre-commit installed if possible
* Have shellcheck installed if possible
* Have pre-commit installed
* Have shellcheck installed
* Shell Scripts: These must pass a shellcheck test!
* Python scripts: Try your best to follow PEP8 guidelines (even the best linters get things wrong)
* Python scripts: Try your best to follow PEP8 guidelines
* Note that not everything has to pass. Just try your best.
PR's against the main branch will be closed.
PR's are preferred at the [RESF Git Service](https://git.resf.org).
Your PR should be against the devel branch at all times. PR's against the main
branch will be closed.
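
A minimal sketch of the local checks suggested above, assuming pre-commit and shellcheck are already installed:

```bash
# Run the repository's configured pre-commit hooks against everything
pre-commit run --all-files
# Lint the shell-based functional tests before pushing
shellcheck func/core/*/*.sh
```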
Will some of this be moved into separate repositories?
------------------------------------------------------

func/.gitignore vendored (3 lines changed)
View File

@ -1,5 +1,2 @@
log/*.log
log/*.log.*
clone_again/
cloned/
tftptest

View File

@ -5,20 +5,22 @@ These are a set of scripts that are designed to test the core functionality
of a Rocky Linux system. They are designed to work on current versions of
Rocky and are used to test a system as a Release Engineering self-QA but
can be used by others for their own personal testing (under the assumption
that you just want to see what happens, we don't judge.
that you just want to see what happens, we don't judge :).
These tests *must* pass for a X.0 release to be considered "Core Validated".
These tests *must* pass for a release to be considered "Core Validated"
Checking against the upstream repositories for package matches are not enough
and are/will be addressed by other tools.
* common -> Functions that our scripts and tests may or may not use. Templates
and other files should come here too under common/files and
scripts that use them should reference them as `./common/files/...`
* core -> Core functionality and testing. For example, packages and service
functionality.
* lib -> Library tests (these may be done elsewhere, such as openqa)
* lib -> Library tests (these may be done elsewhere)
* log -> Log output. This repository has example logs of running on Rocky
Linux.
* modules -> Tests for module streams and their basic tests
* stacks -> Software stacks, think like LAMP (may be done elsewhere, such as openqa)
* stacks -> Software stacks, think like LAMP.
How to Run
----------
@ -148,13 +150,6 @@ security is important, actually work and function correctly.
With that said, There is no reason to disable integral security layers on your
system.
### Should EPEL be enabled?
No. The point is to test Rocky packages, not EPEL. There are also package
differences that will break (eg: nc -> nmap-ncat vs netcat).
### What about CRB or extras?
It may say it's a failure, but it will continue anyway.
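
Illustrative only: the kind of guard this implies, mirroring the r_checkEPELEnabled helper shown in the following diff and its use in the main runner:

```bash
# Abort early if EPEL is enabled, since the suite is meant to test Rocky packages only.
if /usr/bin/dnf repolist | grep -q '^epel'; then
  echo "EPEL is enabled; disable it before running the functional tests."
  exit 1
fi
```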
Current Tree
------------
```

View File

@ -36,25 +36,11 @@ function r_processor() {
if [[ "$(basename ${file})" =~ README|^\.|^_ ]]; then
continue
fi
[ -x "${file}" ] && echo "Begin processing script: ${file}" && "${file}"
[ -x "${file}" ] && "${file}"
done
return 0
}
function r_checkEPELEnabled() {
/usr/bin/dnf repolist | grep -q '^epel'
return $?
}
function r_checkTmpNoExec() {
grep 'tmp' /etc/fstab | grep -q noexec
tmpexec=$?
if [ "$tmpexec" -eq "0" ]; then
r_log "internal" "WARN: noexec is set for temporary directories. Some tests may fail."
fi
}
################################################################################
# Functions that deal with (p)ackages
@ -113,11 +99,7 @@ function p_getPackageArch() {
}
function p_getDist() {
rpm -q --whatprovides redhat-release --queryformat '%{version}\n' | cut -d'.' -f1
}
function p_getMinorVersion() {
rpm -q --whatprovides redhat-release --queryformat '%{version}\n' | cut -d'.' -f2
rpm -q "$(rpm -qf /etc/redhat-release)" --queryformat '%{version}\n' | cut -d'.' -f1
}
################################################################################
@ -222,16 +204,12 @@ function m_recycleLog() {
rl_ver=$(p_getDist)
rl_arch=$(m_getArch)
rl_minor_ver=$(p_getMinorVersion)
export rl_ver
export rl_arch
export rl_minor_ver
export -f r_log
export -f r_checkExitStatus
export -f r_processor
export -f r_checkEPELEnabled
export -f r_checkTmpNoExec
export -f p_installPackageNormal
export -f p_installPackageNoWeaks
export -f p_removePackage
@ -240,7 +218,6 @@ export -f p_resetModule
export -f p_getPackageRelease
export -f p_getPackageArch
export -f p_getDist
export -f p_getMinorVersion
export -f m_serviceCycler
export -f m_checkForPort
export -f m_assertCleanExit
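
For context, the helpers exported above are what the individual test scripts call. A hedged sketch of such a script (the test name and the package checked are made up) follows:

```bash
#!/bin/bash
# Hypothetical example test following the r_log / r_checkExitStatus pattern
# used throughout func/core.
r_log "example" "Check that the bash package is installed"
rpm -q bash &> /dev/null
r_checkExitStatus $?
```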

View File

@ -1 +0,0 @@
Basic tests, such as repos

View File

@ -1,56 +0,0 @@
#!/usr/bin/env python3
# label <label@rockylinux.org>
import datetime
import sys
import dnf
import dnf.exceptions
# pylint: disable=unnecessary-lambda-assignment
now = datetime.datetime.today().strftime("%m-%d-%Y %T")
class DnfQuiet(dnf.Base):
"""
DNF object
This is in the event we need special functions
"""
def __init__(self):
dnf.Base.__init__(self)
def main():
"""
Main run
"""
dnfobj = DnfQuiet()
releasever = dnfobj.conf.releasever
try:
dnfobj.read_all_repos()
# pylint: disable=bare-except
except:
print(f'[-] {now} -> Could not read repos', file=sys.stderr)
sys.exit(1)
rocky_default_repos = {
'8': ['baseos', 'appstream', 'extras'],
'9': ['baseos', 'appstream', 'extras']
}.get(releasever, None)
if not rocky_default_repos:
print(f'[-] {now} -> Not a Rocky Linux system')
sys.exit(1)
print(f'[-] {now} -> Checking if non-default repo is enabled')
_not_allowed=False
for repo in list(dnfobj.repos.iter_enabled()):
if not repo.id in rocky_default_repos:
print(f'[-] {now} -> {repo.id} is enabled and should be disabled')
_not_allowed=True
if _not_allowed:
print(f'[-] {now} -> FAIL - There are extra repos enabled')
sys.exit(1)
print(f'[-] {now} -> PASS')
sys.exit(0)
if __name__ == "__main__":
main()

View File

@ -1,8 +1,6 @@
#!/bin/bash
r_log "acl" "Install the acl package"
p_installPackageNormal acl
p_installPackageNormal attr
# This normally is not needed.
#r_log "acl" "Remount filesystems with ACL support"
#mount -o remount,acl /
r_log "acl" "Remount filesystems with ACL support (this normally should not be needed)"
mount -o remount,acl /
sleep 3

View File

@ -2,7 +2,6 @@
ACLFILE=/tmp/testfile_acl
r_log "acl" "Test that the acl get and set functions work"
touch "${ACLFILE}"
trap '/bin/rm -f ${ACLFILE}' EXIT
# Use setfacl for readonly
r_log "acl" "Set readonly ACL for the user nobody"
@ -13,3 +12,4 @@ r_log "acl" "Verifying that the nobody user is set to read only"
getfacl "${ACLFILE}" | grep -q 'user:nobody:r--'
r_checkExitStatus $?
/bin/rm -f "${ACLFILE}"

View File

@ -1,22 +0,0 @@
#!/bin/bash
ACLIMG=/tmp/testacl.img
r_log "acl" "Test the use of xattr"
touch "${ACLIMG}"
trap '/bin/rm -f ${ACLIMG}' EXIT
# Use setfacl for readonly
r_log "acl" "Create image"
dd if=/dev/zero of=${ACLIMG} bs=1024000 count=100
echo -e 'y\n' | mkfs.ext3 "${ACLIMG}"
mkdir /mnt/xattr
mount -t ext3 -o loop,user_xattr "${ACLIMG}" /mnt/xattr
touch /mnt/xattr/testfile
r_log "acl" "Apply attrs as needed"
setfattr -n user.nobody /mnt/xattr/testfile
getfattr /mnt/xattr/testfile | grep -q 'user.nobody'
final_status=$?
umount /mnt/xattr
r_checkExitStatus $final_status

View File

@ -1,7 +1,6 @@
#!/bin/bash
r_log "archive" "Test bzip/bzcat/bunzip"
FILE=/var/tmp/bziptest.txt
trap '/bin/rm -f ${FILE}' EXIT
cat > "$FILE" <<EOF
testing text
@ -24,3 +23,5 @@ fi
grep -q 'testing text' "${FILE}"
r_checkExitStatus $?
/bin/rm -f "${FILE}*"

View File

@ -1,9 +1,8 @@
#!/bin/bash
r_log "archive" "Verifying gzip binaries"
echo -n "Processing; "
for bin in gunzip gzexe gzip zcat zcmp zdiff zegrep zfgrep zforce zgrep zless zmore znew; do
echo -n "$bin "
echo -n "$bin"
r_log "archive" "$bin"
$bin --version &> /dev/null || r_checkExitStatus 1
done

View File

@ -4,9 +4,6 @@ r_log "archive" "Test gzip/zcat/gunzip"
FILE=/var/tmp/gzip-test.txt
MD5HASH=e6331c582fbad6653832860f469f7d1b
# clean up
trap '/bin/rm $FILE* &> /dev/null && /bin/rm -rf /var/tmp/gziptest &> /dev/null' EXIT
# Double check that stuff is cleared out
/bin/rm $FILE* &> /dev/null
/bin/rm -rf /var/tmp/gziptest &> /dev/null
@ -110,3 +107,7 @@ tar -czf $FILE.tgz $FILE &> /dev/null
gunzip $FILE.tgz
[ -e $FILE.tar ]
r_checkExitStatus $?
# clean up
/bin/rm $FILE* &> /dev/null
/bin/rm -rf /var/tmp/gziptest &> /dev/null

View File

@ -2,8 +2,6 @@
r_log "archive" "Checking gzexe"
r_log "archive" "Creating archive"
FILE=/var/tmp/gzexe-test-script
trap '/bin/rm -f $FILE* 2>/dev/null' EXIT
/bin/rm -f $FILE* &>/dev/null
cat > $FILE <<EOF
@ -20,3 +18,5 @@ r_log "archive" "Test gzexe"
r_log "archive" "Check that it actually runs"
$FILE | grep -q "Hello!"
r_checkExitStatus $?
/bin/rm -f $FILE* 2>/dev/null

View File

@ -1,7 +1,6 @@
#!/bin/bash
r_log "archive" "Check zcmp and zdiff"
BASEFILE="/var/tmp/gziptest"
trap '/bin/rm -f ${BASEFILE}*' EXIT
/bin/rm -f ${BASEFILE}
cat > ${BASEFILE}.1 <<EOF
@ -16,3 +15,5 @@ r_log "archive" "Check zcmp"
r_log "archive" "Check zdiff"
/bin/zdiff ${BASEFILE}.1.gz ${BASEFILE}.2.gz || r_checkExitStatus 1
/bin/rm -f ${BASEFILE}*

View File

@ -2,7 +2,6 @@
r_log "archive" "Testing zforce"
BASEFILE="/var/tmp/abcdefg"
trap '/bin/rm "$BASEFILE.gz"' EXIT
/bin/rm $BASEFILE* &>/dev/null
cat > $BASEFILE <<EOF
@ -15,3 +14,5 @@ mv $BASEFILE.gz $BASEFILE
zforce $BASEFILE || r_checkExitStatus 1
[ -e "$BASEFILE.gz" ]
r_checkExitStatus $?
/bin/rm "$BASEFILE.gz"

View File

@ -1,7 +1,6 @@
#!/bin/bash
r_log "archive" "Testing zgrep"
BASEFILE=/var/tmp/zgreptest
trap '/bin/rm $BASEFILE*' EXIT
/bin/rm $BASEFILE* &> /dev/null
cat > $BASEFILE <<EOF
@ -12,3 +11,5 @@ gzip $BASEFILE
zgrep -q 'Green Obsidian' $BASEFILE.gz
r_checkExitStatus $?
/bin/rm $BASEFILE*

func/core/pkg_archive/27-znew.sh Normal file → Executable file (0 lines changed)
View File

View File

@ -4,7 +4,6 @@ r_log "archive" "Test tar create and extract"
TARDIR="/var/tmp/tartest"
FILE1="$TARDIR/test.1.txt"
FILE2="$TARDIR/test.2.txt"
trap '/bin/rm -rf /var/tmp/tarfile.tar $TARDIR' EXIT
mkdir -p $TARDIR
cat > $FILE1 <<EOF
@ -33,3 +32,5 @@ if [ $RES1 == 0 ] && [ $RES2 == 0 ]; then
fi
r_checkExitStatus $ret_val
/bin/rm -rf /var/tmp/tarfile.tar $TARDIR

View File

@ -1,7 +1,6 @@
#!/bin/bash
r_log "archive" "Check xzcmp and xzdiff"
BASEFILE="/var/tmp/xztest"
trap '/bin/rm -f ${BASEFILE}*' EXIT
/bin/rm -f ${BASEFILE}
cat > ${BASEFILE}.1 <<EOF
@ -16,3 +15,5 @@ r_log "archive" "Check xzcmp"
r_log "archive" "Check xzdiff"
/bin/zdiff ${BASEFILE}.1.xz ${BASEFILE}.2.xz || r_checkExitStatus 1
/bin/rm -f ${BASEFILE}*

View File

@ -4,7 +4,6 @@ r_log "archive" "Test zip create and extract"
ZIPDIR="/var/tmp/ziptest"
FILE1="$ZIPDIR/test.1.txt"
FILE2="$ZIPDIR/test.2.txt"
trap '/bin/rm -rf /var/tmp/zipfile.zip $ZIPDIR' EXIT
mkdir -p $ZIPDIR
cat > $FILE1 <<EOF
@ -32,4 +31,6 @@ if [ $RES1 == 0 ] && [ $RES2 == 0 ]; then
ret_val=0
fi
r_checkExitStatus "$ret_val"
r_checkExitStatus $ret_val
/bin/rm -rf /var/tmp/zipfile.zip $ZIPDIR

View File

@ -2,7 +2,6 @@
r_log "archive" "Testing lzop compress and decompress"
LZOFILE=/var/tmp/obsidian.txt
trap '/bin/rm ${LZOFILE}' EXIT
echo 'Green Obsidian is the release name' > ${LZOFILE}
@ -14,5 +13,5 @@ lzop -d ${LZOFILE}.lzo -o ${LZOFILE}
/bin/rm ${LZOFILE}.lzo
grep -q 'Green Obsidian' ${LZOFILE}
ret_val="$?"
r_checkExitStatus "$ret_val"
/bin/rm ${LZOFILE}

View File

@ -2,8 +2,6 @@
ATTRTEST="/var/tmp/attrtest.img"
ATTRMNT="/mnt/attrtest"
trap 'umount /mnt/attrtest ; /bin/rm -f ${ATTRTEST} ; /bin/rm -rf ${ATTRMNT}' EXIT
r_log "attr" "Checking that *attr works"
dd if=/dev/zero of="${ATTRTEST}" bs=1024000 count=100 &>/dev/null
r_checkExitStatus $?
@ -16,3 +14,8 @@ setfattr -n user.test "${ATTRMNT}/testfile"
getfattr "${ATTRMNT}/testfile" | grep -oq "user.test"
r_checkExitStatus $?
# Cleanup
umount /mnt/attrtest
/bin/rm -f "${ATTRTEST}"
/bin/rm -rf "${ATTRMNT}"

View File

@ -1,6 +1,5 @@
#!/bin/bash
r_log "coreutils" "Testing cat"
trap "/bin/rm /var/tmp/cattest" EXIT
cat > /var/tmp/cattest <<EOF
Green Obsidian
@ -8,3 +7,5 @@ EOF
grep -q "Green Obsidian" /var/tmp/cattest
r_checkExitStatus $?
/bin/rm /var/tmp/cattest

View File

@ -1,6 +1,6 @@
#!/bin/bash
r_log "coreutils" "Testing readlink"
trap "/bin/rm /var/tmp/listen" EXIT
ln -s /var/tmp/talk /var/tmp/listen
readlink /var/tmp/listen | grep -q "/var/tmp/talk"
r_checkExitStatus $?
/bin/rm /var/tmp/listen

View File

@ -1,6 +1,5 @@
#!/bin/bash
r_log "coreutils" "Test hash sum tools"
trap '/bin/rm ${HASHFILE}' EXIT
HASHFILE=/var/tmp/obsidian
echo "Green Obsidian is our release name" > ${HASHFILE}
@ -23,3 +22,5 @@ r_checkExitStatus $?
r_log "coreutils" "Test sha512sum"
/usr/bin/sha512sum ${HASHFILE} | grep -q e50554c29a5cb7bd04279d3c0918e486024c79c4b305a2e360a97d4021dacf56ce0d17fa6e6a0e81ad03d5fb74fbe2d50cce6081c2c277f22b958cdae978a2f5
r_checkExitStatus $?
/bin/rm ${HASHFILE}

View File

@ -1,6 +1,5 @@
#!/bin/bash
r_log "coreutils" "Testing touch and ls"
trap '/bin/rm /tmp/touch-?' EXIT
r_log "coreutils" "Touch files with specific dates"
touch -t 199104230420 /tmp/touch-1
@ -11,3 +10,5 @@ r_log "coreutils" "Verify that the oldest file is last"
ls -lt /tmp/touch-? | tail -n 1 | grep -q 'touch-1'
r_checkExitStatus $?
/bin/rm /tmp/touch-?

View File

@ -1,6 +1,5 @@
#!/bin/bash
r_log "coreutils" "Ensure uniq works as expected"
trap '/bin/rm /var/tmp/uniq' EXIT
cat > /var/tmp/uniq <<EOF
Rocky
@ -15,3 +14,4 @@ EOF
uniq -d /var/tmp/uniq | wc -l | grep -q 2 && uniq -u /var/tmp/uniq | wc -l | grep -q 4
r_checkExitStatus $?
/bin/rm /var/tmp/uniq

View File

@ -2,7 +2,6 @@
r_log "coreutils" "Ensure wc works as expected"
r_log "coreutils" "This should have already been done with uniq"
# Context: we should probably test some switches...
trap "/bin/rm /var/tmp/wc" EXIT
cat > /var/tmp/wc <<EOF
Rocky
@ -22,3 +21,5 @@ wc -L /var/tmp/wc | grep -q 8 && \
wc -w /var/tmp/wc | grep -q 8
r_checkExitStatus $?
/bin/rm /var/tmp/wc

View File

@ -5,18 +5,14 @@ OUTTER=/var/tmp/cpio/out
INNER=/var/tmp/cpio/in
PASSER=/var/tmp/cpio/pass
trap '/bin/rm -rf /var/tmp/cpio' EXIT
# Nothing should be here. Clean up first.
[ -d /var/tmp/cpio ] && /bin/rm -rf /var/tmp/cpio
r_log "cpio" "Test basic copy out"
mkdir -p "$OUTTER" "$INNER" "$PASSER"
# Ensure at least one file exists in /tmp to prevent errors.
echo 1 > $(mktemp)
# shellcheck disable=2012
find /tmp -type f | cpio -o > "$OUTTER"/cpio.out 2> /dev/null
ls /tmp | cpio -o > "$OUTTER"/cpio.out
r_checkExitStatus $?
r_log "cpio" "Test basic copy in"
@ -27,7 +23,7 @@ popd || exit 1
r_log "cpio" "Test basic passthrough"
pushd "$INNER" || exit 1
find . | cpio -pd "$PASSER"
find /tmp | cpio -pd "$PASSER"
r_checkExitStatus $?
popd || exit 1

View File

@ -15,4 +15,3 @@ r_checkExitStatus $?
r_log "cracklib" "Testing a complicated password"
echo -e "2948_Obaym-" | cracklib-check | grep -q "OK"
r_checkExitStatus $?

View File

@ -1,8 +1,6 @@
#!/bin/bash
r_log "cron" "Testing hourly cron jobs"
trap '/bin/rm /etc/cron.{weekly,daily,hourly}/rocky.sh' EXIT
cat > /etc/cron.hourly/rocky.sh <<EOF
#!/bin/bash
echo "obsidian"
@ -36,3 +34,5 @@ chmod +x /etc/cron.weekly/rocky.sh
run-parts /etc/cron.weekly | grep -q "obsidian"
r_checkExitStatus $?
/bin/rm /etc/cron.{weekly,daily,hourly}/rocky.sh

View File

@ -1,8 +1,9 @@
#!/bin/bash
r_log "file" "Check that we can see a symlink"
FILE_PATH=/var/tmp/linktest
trap '/bin/rm ${FILE_PATH}' EXIT
MIME="inode/symlink"
ln -s /etc/issue $FILE_PATH
file -i $FILE_PATH | grep -q "${MIME}"
r_checkExitStatus $?
/bin/rm /var/tmp/linktest

View File

@ -2,7 +2,6 @@
r_log "findutils" "Testing basic find stuff"
TMPDIR=/var/tmp/find
trap '/bin/rm -rf $TMPDIR' EXIT
[ -e $TMPDIR ] && rm -rf "$TMPDIR"
@ -38,8 +37,8 @@ r_log "findutils" "Perform for xargs test: fails with spaces in the name"
# shellcheck disable=SC2038
find "$TMPDIR" -type f | xargs ls &> /dev/null && { r_log "findutils" "Why did this get a 0 exit?"; exit "$FAIL"; }
ret_val=$?
if [ "$ret_val" -ne "0" ]; then
r_checkExitStatus 0
else
r_checkExitStatus 1
if [ "$ret_val" -ne 0 ]; then
r_checkExitStatus $?
fi
rm -rf "$TMPDIR"

View File

@ -1,14 +1,7 @@
#!/bin/bash
function cleanup() {
cp /etc/raddb/users.backup /etc/raddb/users
rm -rf /etc/raddb/users.backup
systemctl stop radiusd.service
}
r_log "freeradius" "Test basic freeradius functionality"
r_log "freeradius" "Configure freeradius"
trap cleanup EXIT
r_log "freeradius" "Configure freeradius"
cp /etc/raddb/users /etc/raddb/users.backup
cat >> /etc/raddb/users << EOF
rocky Cleartext-Password := "rocky"
@ -20,3 +13,7 @@ systemctl start radiusd.service
sleep 1
echo "User-Name=rocky,User-Password=rocky " | radclient -x localhost:1812 auth testing123 | grep -q 'Access-Accept'
r_checkExitStatus $?
cp /etc/raddb/users.backup /etc/raddb/users
rm -rf /etc/raddb/users.backup
systemctl stop radiusd.service

View File

@ -1,6 +1,5 @@
#!/bin/bash
r_log "git" "Test basic git clones"
trap 'rm -rf $TMPREPO' EXIT
WORKDIR=$(pwd)
TMPREPO=/var/tmp/repo

View File

@ -1,6 +1,5 @@
#!/bin/bash
r_log "httpd" "Test basic authentication functionality"
trap "rm /etc/httpd/conf.d/test-basic-auth.conf ; m_serviceCycler httpd reload" EXIT
cat > /etc/httpd/conf.d/test-basic-auth.conf <<EOF
## Core basic auth test
@ -16,8 +15,9 @@ EOF
htpasswd -c -b /etc/httpd/htpasswd tester tester
mkdir -p /var/www/html/basic_auth
echo "Basic Auth Test" > /var/www/html/basic_auth/index.html
# This isn't normally needed, it should just work
restorecon -R /var/www/html
m_serviceCycler httpd cycle
curl -s -u tester:tester http://localhost/basic_auth/ | grep -q 'Basic Auth Test' > /dev/null 2>&1
r_checkExitStatus $?
rm /etc/httpd/conf.d/test-basic-auth.conf
m_serviceCycler httpd reload

View File

@ -1,12 +1,5 @@
#!/bin/bash
function cleanup() {
rm /etc/httpd/conf.d/vhost.conf
sed -i '/127.0.0.1 coretest/d' /etc/hosts
m_serviceCycler httpd reload
}
r_log "httpd" "Test basic vhost functionality"
trap cleanup EXIT
echo "127.0.0.1 coretest" >> /etc/hosts
cat > /etc/httpd/conf.d/vhost.conf << EOF
@ -21,10 +14,12 @@ EOF
mkdir -p /var/www/vhost/coretest
echo "core vhost test page" > /var/www/vhost/coretest/index.html
# This isn't normally needed, it should just work
restorecon -R /var/www/vhost
m_serviceCycler httpd cycle
curl -s http://coretest/ | grep -q 'core vhost test page' > /dev/null 2>&1
r_checkExitStatus $?
rm /etc/httpd/conf.d/vhost.conf
sed -i '/127.0.0.1 coretest/d' /etc/hosts
m_serviceCycler httpd reload

View File

@ -2,8 +2,6 @@
r_log "httpd" "Test basic php"
echo "<?php echo phpinfo(); ?>" > /var/www/html/test.php
# This isn't normally needed, it should just work
restorecon -R /var/www/html
curl -s http://localhost/test.php | grep -q 'PHP Version' > /dev/null 2>&1
r_checkExitStatus $?

func/core/pkg_network/30-test-arpwatch.sh Normal file → Executable file (0 lines changed)
View File

View File

@ -1,3 +0,0 @@
#!/bin/bash
r_log "openssh" "Install openssh"
p_installPackageNormal openssh-clients openssh-server sshpass

View File

@ -1,5 +0,0 @@
#!/bin/bash
r_log "openssh" "Ensure ssh is listening"
echo "" > /dev/tcp/localhost/22
r_checkExitStatus $?

View File

@ -1,16 +0,0 @@
#!/bin/bash
r_log "openssh" "Testing basic login (using sshpass)"
trap 'userdel -rf sshpasstest; unset SSHPASS' EXIT
if sshd -T | grep -q "passwordauthentication yes"; then
r_log "openssh" "Creating test user"
export SSHPASS="Blu30nyx!"
useradd sshpasstest
echo "${SSHPASS}" | passwd --stdin sshpasstest
r_log "openssh" "Testing login"
sshpass -e ssh sshpasstest@localhost echo 'hello'
r_checkExitStatus $?
else
r_log "openssh" "Skipping test"
exit 0
fi

View File

@ -1,32 +0,0 @@
#!/bin/bash
r_log "openssh" "Testing key login (using sshpass)"
case $RL_VER in
8)
KEYTYPES="rsa ecdsa ed25519"
;;
9)
KEYTYPES="rsa ecdsa ed25519"
;;
*)
KEYTYPES="ed25519"
;;
esac
r_log "openssh" "Creating test user"
useradd sshkeytest
echo "Blu30nyx!" | passwd --stdin sshkeytest
for KEYTYPE in $KEYTYPES; do
r_log "openssh" "Creating key: ${KEYTYPE}"
runuser -l sshkeytest -c "echo | ssh-keygen -q -t ${KEYTYPE} -b 4096 -f ~/.ssh/id_${KEYTYPE}" > /dev/null
runuser -l sshkeytest -c "cat ~/.ssh/*pub > ~/.ssh/authorized_keys && chmod 600 ~/.ssh/*keys" > /dev/null
STRINGTEST=$(mktemp -u)
echo "${STRINGTEST}" > /home/sshkeytest/test_file
r_log "openssh" "Testing key: ${KEYTYPE}"
runuser -l sshkeytest -c "ssh -i ~/.ssh/id_${KEYTYPE} localhost | grep -q ${STRINGTEST} /home/sshkeytest/test_file"
ret_val=$?
r_checkExitStatus $ret_val
done
userdel -rf sshkeytest

View File

@ -1,4 +0,0 @@
#!/bin/bash
#
r_log "podman" "Installing podman"
p_installPackageNormal podman

View File

@ -1,32 +0,0 @@
#!/bin/bash
r_log "podman" "Testing podman"
test_to_run=(
"podman version"
"podman info"
"podman run --rm quay.io/rockylinux/rockylinux:${RL_VER}"
"podman system service -t 1"
"touch ${HOME}/test.txt && \
podman run --rm --privileged -v ${HOME}/test.txt:/test.txt quay.io/rockylinux/rockylinux:${RL_VER} bash -c 'echo HELLO > /test.txt' && \
grep -qe 'HELLO' ${HOME}/test.txt && \
rm -f ${HOME}/test.txt"
"printf \"FROM quay.io/rockylinux/rockylinux:${RL_VER}\nCMD echo 'HELLO'\n\" > ${HOME}/Containerfile && \
podman build -t test:latest -f ${HOME}/Containerfile && \
podman image rm localhost/test:latest && \
rm -rf ${HOME}/Containerfile"
)
tmpoutput="$(mktemp)"
trap 'rm -f ${tmpoutput}' EXIT
for command in "${test_to_run[@]}"; do
r_log "podman" "Running $0: ${command}"
if ! eval "${command}" > "${tmpoutput}" 2>&1; then
r_log "podman" "${command} has failed."
cat "${tmpoutput}"
exit 1
else
r_checkExitStatus 0
fi
done

View File

@ -1,31 +0,0 @@
#!/bin/bash
r_log "podman" "Testing podman sockets"
useradd podman-remote
loginctl enable-linger podman-remote
tmpoutput="$(mktemp)"
trap 'loginctl terminate-user podman-remote && loginctl disable-linger podman-remote && sleep 1 && userdel -r podman-remote && rm -f ${tmpoutput}' EXIT
sleep 3
su -l podman-remote > "${tmpoutput}" 2>&1 <<EOF
set -e
export XDG_RUNTIME_DIR=/run/user/\$(id -u)
systemctl --user enable --now podman.socket
podman --url unix://run/user/\$(id -u)/podman/podman.sock run --name port-mapping-test -d -p 8080:80 docker.io/nginx
pid=\$(systemctl --user show --property MainPID --value podman.service)
while [ "\${pid}" -ne 0 ] && [ -d /proc/\${pid} ]; do sleep 1; echo "Waiting for podman to exit"; done
podman --url unix://run/user/\$(id -u)/podman/podman.sock ps | grep -q -e port-mapping-test
podman --url unix://run/user/\$(id -u)/podman/podman.sock container rm -f port-mapping-test
systemctl --user disable --now podman.socket
EOF
ret_val=$?
if [ "$ret_val" -ne 0 ]; then
cat "${tmpoutput}"
r_checkExitStatus 1
fi
r_checkExitStatus 0

View File

@ -1,6 +1,5 @@
#!/bin/bash
r_log "postfix" "Install postfix (requires stop of other pieces)"
# This is OK if it fails - This is also not logged except in stderr
m_serviceCycler sendmail stop
p_installPackageNormal postfix nc dovecot openssl
m_serviceCycler postfix enable

View File

@ -18,3 +18,6 @@ mv /etc/dovecot/dovecot.conf.backup /etc/dovecot/dovecot.conf
mv /etc/postfix/main.cf.backup /etc/postfix/main.cf
r_checkExitStatus $ret_val
cp -a /etc/postfix/main.cf.backup /etc/postfix/main.cf
cp -a /etc/dovecot/dovecot.conf.backup /etc/dovecot/dovecot.conf

View File

@ -2,17 +2,6 @@
r_log "postfix" "Test postfix with TLS"
DROPDIR=/var/tmp/postfix
function cleanup() {
mv /etc/postfix/main.cf.backup /etc/postfix/main.cf
mv /etc/dovecot/dovecot.conf.backup /etc/dovecot/dovecot.conf
rm /etc/pki/tls/certs/mail.crt
rm /etc/pki/tls/private/mail.key
rm -rf $DROPDIR/mail.*
rm -rf /var/tmp/postfix
}
trap cleanup EXIT
cp -a /etc/postfix/main.cf /etc/postfix/main.cf.backup
cp -a /etc/dovecot/dovecot.conf /etc/dovecot/dovecot.conf.backup
@ -70,4 +59,11 @@ r_log "postfix" "Testing that postfix offers STARTTLS"
echo "ehlo test" | nc -w 3 127.0.0.1 25 | grep -q "STARTTLS"
ret_val=$?
r_checkExitStatus $ret_val
mv /etc/postfix/main.cf.backup /etc/postfix/main.cf
mv /etc/dovecot/dovecot.conf.backup /etc/dovecot/dovecot.conf
rm /etc/pki/tls/certs/mail.crt
rm /etc/pki/tls/certs/mail.key
rm -rf $DROPDIR/mail.*
rm -rf /var/tmp/postfix
r_checkExitStatus $?

View File

@ -1,19 +0,0 @@
#!/bin/bash
# Check that the release package is 1.X
r_log "rocky release" "Checking that the package is at least X.Y-1.B"
RELEASE_VER="$(rpm -q rocky-release --qf '%{RELEASE}')"
RELNUM="${RELEASE_VER:0:1}"
if [ "${RELNUM}" -ge "1" ]; then
if [[ "${RELEASE_VER:0:3}" =~ ^${RELNUM}.[[:digit:]] ]]; then
ret_val="0"
else
r_log "rocky release" "FAIL: The release package is not in X.Y-A.B format"
ret_val="1"
fi
else
r_log "rocky release" "FAIL: The release package likely starts with 0 and is not considered production ready."
ret_val="1"
fi
r_checkExitStatus $ret_val

View File

@ -8,12 +8,7 @@ if [ ! -d /sys/firmware/efi ]; then
exit 0
fi
else
if [[ "$rl_arch" == "x86_64" ]]; then
p_installPackageNormal pesign
pesign --show-signature --in /boot/efi/EFI/rocky/shim.efi | grep -Eq "Microsoft Windows UEFI Driver Publisher"
r_checkExitStatus $?
else
r_log "secureboot" "x86_64 is the only supported secureboot arch at this time"
exit 0
fi
p_installPackageNormal pesign
pesign --show-signature --in /boot/efi/EFI/rocky/shim.efi | grep -Eq "Microsoft Windows UEFI Driver Publisher"
r_checkExitStatus $?
fi

View File

@ -1,11 +1,4 @@
#!/bin/bash
function cleanup() {
pwconv
rm -rf /var/tmp/pwunconv /var/tmp/pwconv
}
trap cleanup EXIT
r_log "shadow" "Check that pwck can use correct files"
pwck -rq ./common/files/correct-passwd ./common/files/correct-shadow
r_checkExitStatus $?
@ -15,11 +8,9 @@ pwck -rq ./common/files/incorrect-passwd ./common/files/incorrect-shadow
ret_val=$?
if [ "$ret_val" -eq 0 ]; then
r_log "shadow" "They're correct."
r_checkExitStatus 1
else
r_log "shadow" "They're incorrect."
r_checkExitStatus 0
exit 1
fi
r_checkExitStatus 0
r_log "shadow" "Check that pwconv is functional"
mkdir -p /var/tmp/pwconv

View File

@ -6,5 +6,5 @@ echo "rocky func" > /var/lib/tftpboot/tftptest
tftp 127.0.0.1 -c get tftptest
grep -q "rocky func" tftptest
r_checkExitStatus $?
r_checkExitStatus
/bin/rm tftptest

View File

@ -1,59 +0,0 @@
#!/bin/bash
# Release Engineering Core Functionality Testing
# Louis Abel <label@rockylinux.org> @nazunalika
################################################################################
# Settings and variables
# Exits on any non-zero exit status - Disabled for now.
#set -e
# Undefined variables will cause an exit
set -u
COMMON_EXPORTS='./common/exports.sh'
COMMON_IMPORTS='./common/imports.sh'
SELINUX=$(getenforce)
# End
################################################################################
# shellcheck source=/dev/null disable=SC2015
[ -f $COMMON_EXPORTS ] && source $COMMON_EXPORTS || { echo -e "\n[-] $(date): Variables cannot be sourced."; exit 1; }
# shellcheck source=/dev/null disable=SC2015
[ -f $COMMON_IMPORTS ] && source $COMMON_IMPORTS || { echo -e "\n[-] $(date): Functions cannot be sourced."; exit 1; }
# Init log
# shellcheck disable=SC2015
[ -e "$LOGFILE" ] && m_recycleLog || touch "$LOGFILE"
# SELinux check
if [ "$SELINUX" != "Enforcing" ]; then
echo -e "\n[-] $(date): SELinux is not enforcing."
exit 1
fi
r_log "internal" "Starting Release Engineering Core Tests"
################################################################################
# Script Work
# Skip tests in a list - some tests are already -x, so it won't be an issue
if [ -e skip.list ]; then
r_log "internal" "Disabling tests"
# shellcheck disable=SC2162
grep -E "^${RL_VER}" skip.list | while read line; do
# shellcheck disable=SC2086
testFile="$(echo $line | cut -d '|' -f 2)"
r_log "internal" "SKIP ${testFile}"
chmod -x "${testFile}"
done
r_log "internal" "WARNING: Tests above were disabled."
fi
# TODO: should we let $1 judge what directory is ran?
# TODO: get some stacks and lib in there
#r_processor <(/usr/bin/find ./core -type f | sort -t'/')
#r_processor <(/usr/bin/find ./lib -type f | sort -t'/')
r_processor <(/usr/bin/find ./stacks/ipa -type f | sort -t'/')
r_log "internal" "Core Tests completed"
exit 0
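
For clarity, the skip.list parsed above is a pipe-delimited file: the first field is the major release matched by `grep -E "^${RL_VER}"`, the second is the test path taken by `cut -d '|' -f 2`. Hypothetical entries (paths borrowed from tests elsewhere in this repository) would look like:

```
8|./core/pkg_archive/27-znew.sh
9|./stacks/ipa/23-ipa-sudo.sh
```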

View File

@ -1,3 +0,0 @@
#!/bin/bash
r_log "pdf" "Install enscript, ghostscript, and poppler"
p_installPackageNormal fontconfig @fonts enscript ghostscript poppler-utils

View File

@ -1,23 +0,0 @@
#!/bin/bash
r_log "pdf" "Create a PDF from postscript from text, convert it back to text and check"
trap 'rm -rf $PSFILE $PDFFILE $TESTFILE' EXIT
TOFIND="BlueOnyx"
PSFILE="/var/tmp/test.ps"
PDFFILE="/var/tmp/test.pdf"
TESTFILE="/var/tmp/psresult"
enscript -q -p $PSFILE /etc/rocky-release
r_log "pdf" "Check created file"
grep -q $TOFIND $PSFILE
pdf_ret_val=$?
r_checkExitStatus $pdf_ret_val
ps2pdf $PSFILE $PDFFILE
pdftotext -q $PDFFILE $TESTFILE
r_log "pdf" "Checking after conversion to text"
grep -q $TOFIND $TESTFILE
text_ret_val=$?
r_checkExitStatus $text_ret_val

View File

@ -30,16 +30,6 @@ if [ "$SELINUX" != "Enforcing" ]; then
exit 1
fi
r_checkEPELEnabled
is_epel=$?
if [[ "$is_epel" == "0" ]]; then
echo "EPEL enabled. Stop."
r_log "internal" "EPEL enabled. Stop."
exit 1
fi
r_checkTmpNoExec
r_log "internal" "Starting Release Engineering Core Tests"
################################################################################
@ -62,7 +52,7 @@ fi
# TODO: get some stacks and lib in there
r_processor <(/usr/bin/find ./core -type f | sort -t'/')
r_processor <(/usr/bin/find ./lib -type f | sort -t'/')
#r_processor <(/usr/bin/find ./lib -type f | sort -t'/')
#r_processor <(/usr/bin/find ./stacks -type f | sort -t'/')
r_log "internal" "Core Tests completed"

func/stacks/ipa/00-ipa-pregame.sh Executable file → Normal file (0 lines changed)
View File

func/stacks/ipa/10-install-ipa.sh Executable file → Normal file (2 lines changed)
View File

@ -11,4 +11,4 @@ if [ "$RL_VER" -eq 8 ]; then
p_enableModule idm:DL1/{client,common,dns,server}
fi
p_installPackageNormal ipa-server ipa-server-dns expect
p_installPackageNormal ipa-server ipa-server-dns

func/stacks/ipa/11-configure-ipa.sh Executable file → Normal file (0 lines changed)
View File

func/stacks/ipa/12-verify-ipa.sh Executable file → Normal file (0 lines changed)
View File

func/stacks/ipa/20-ipa-user.sh Executable file → Normal file (35 lines changed)
View File

@ -13,43 +13,42 @@ kdestroy &> /dev/null
klist 2>&1 | grep -E "(No credentials|Credentials cache .* not found)" &> /dev/null
r_checkExitStatus $?
echo "b1U3OnyX!" | kinit admin@RLIPA.LOCAL
expect -f - <<EOF
set send_human {.1 .3 1 .05 2}
spawn kinit admin
sleep 1
expect "Password for admin@RLIPA.LOCAL:"
send -h "b1U3OnyX!\r"
sleep 5
close
EOF
klist | grep "admin@RLIPA.LOCAL" &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Test adding a user"
ipa user-add --first=test --last=user --random ipatestuser > /tmp/ipatestuser
grep -q 'Added user "ipatestuser"' /tmp/ipatestuser
userDetails="$(ipa user-add --first=test --last=user --random ipatestuser)"
echo "$userDetails" | grep -q 'Added user "ipatestuser"'
r_checkExitStatus $?
ret_val=$?
if [ "$ret_val" -ne 0 ]; then
r_log "ipa" "User was not created, this is considered fatal"
r_checkExitStatus 1
exit 1
fi
sed -i 's|^ ||g' /tmp/ipatestuser
grep -q 'First name: test' /tmp/ipatestuser
echo "$userDetails" | grep -q 'First name: test'
r_checkExitStatus $?
grep -q 'Last name: user' /tmp/ipatestuser
echo "$userDetails" | grep -q 'Last name: user'
r_checkExitStatus $?
grep -q 'Full name: test user' /tmp/ipatestuser
echo "$userDetails" | grep -q 'Full name: test user'
r_checkExitStatus $?
grep -q 'Home directory: /home/ipatestuser' /tmp/ipatestuser
echo "$userDetails" | grep -q 'Home directory: /home/ipatestuser'
r_checkExitStatus $?
r_log "ipa" "Changing password of the user"
kdestroy &> /dev/null
userPassword="$(awk '/Random password/ { print $3 }' /tmp/ipatestuser)"
/bin/rm /tmp/ipatestuser
expect -f - <<EOF
set send_human {.1 .3 1 .05 2}
spawn kinit ipatestuser
sleep 1
expect "Password for ipatestuser@RLIPA.LOCAL: "
send -h -- "$(echo "$userPassword")\r"
send -h -- "$(echo "$userDetails" | awk '$0 ~ /Random password/ {print $3}')\r"
sleep 1
expect "Enter new password: "
send -h -- "gr@YAm3thy5st!\r"

func/stacks/ipa/21-ipa-service.sh Executable file → Normal file (18 lines changed)
View File

@ -13,21 +13,29 @@ kdestroy &> /dev/null
klist 2>&1 | grep -E "(No credentials|Credentials cache .* not found)" &> /dev/null
r_checkExitStatus $?
echo "b1U3OnyX!" | kinit admin@RLIPA.LOCAL
expect -f - <<EOF
set send_human {.1 .3 1 .05 2}
spawn kinit admin
sleep 1
expect "Password for admin@RLIPA.LOCAL:"
send -h "b1U3OnyX!\r"
sleep 5
close
EOF
klist | grep "admin@RLIPA.LOCAL" &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Adding test service"
ipa service-add testservice/onyxtest.rlipa.local &> /dev/null
ipa service-add testservice/rltest.rlipa.local &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Getting keytab for service"
ipa-getkeytab -s onyxtest.rlipa.local -p testservice/onyxtest.rlipa.local -k /tmp/testservice.keytab &> /dev/null
ipa-getkeytab -s rltest.rlipa.local -p testservice/rltest.rlipa.local -k /tmp/testservice.keytab &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Getting a certificate for service"
ipa-getcert request -K testservice/onyxtest.rlipa.local -D onyxtest.rlipa.local -f /etc/pki/tls/certs/testservice.crt -k /etc/pki/tls/private/testservice.key &> /dev/null
ipa-getcert request -K testservice/rltest.rlipa.local -D rltest.rlipa.local -f /etc/pki/tls/certs/testservice.crt -k /etc/pki/tls/private/testservice.key &> /dev/null
r_checkExitStatus $?
while true; do
@ -49,7 +57,7 @@ while ! stat /etc/pki/tls/certs/testservice.crt &> /dev/null; do
done
r_log "ipa" "Verifying keytab"
klist -k /tmp/testservice.keytab | grep "testservice/onyxtest.rlipa.local" &> /dev/null
klist -k /tmp/testservice.keytab | grep "testservice/rltest.rlipa.local" &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Verifying key matches the certificate"

func/stacks/ipa/22-ipa-dns.sh Executable file → Normal file (16 lines changed)
View File

@ -13,13 +13,21 @@ kdestroy &> /dev/null
klist 2>&1 | grep -qE "(No credentials|Credentials cache .* not found)" &> /dev/null
r_checkExitStatus $?
echo "b1U3OnyX!" | kinit admin@RLIPA.LOCAL
expect -f - <<EOF
set send_human {.1 .3 1 .05 2}
spawn kinit admin
sleep 1
expect "Password for admin@RLIPA.LOCAL:"
send -h "b1U3OnyX!\r"
sleep 5
close
EOF
klist | grep "admin@RLIPA.LOCAL" &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Adding testzone subdomain"
ipa dnszone-add --name-server=onyxtest.rlipa.local. --admin-email=hostmaster.testzone.rlipa.local. testzone.rlipa.local &> /dev/null
ipa dnszone-add --name-server=rltest.rlipa.local. --admin-email=hostmaster.testzone.rlipa.local. testzone.rlipa.local &> /dev/null
r_checkExitStatus $?
sleep 5
@ -28,7 +36,7 @@ dig @localhost SOA testzone.rlipa.local | grep -q "status: NOERROR" &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Adding a CNAME record to the primary domain"
ipa dnsrecord-add rlipa.local testrecord --cname-hostname=onyxtest &> /dev/null
ipa dnsrecord-add rlipa.local testrecord --cname-hostname=rltest &> /dev/null
r_checkExitStatus $?
sleep 5
@ -37,7 +45,7 @@ dig @localhost CNAME testrecord.rlipa.local | grep -q "status: NOERROR" &> /dev/
r_checkExitStatus $?
r_log "ipa" "Adding a CNAME to subdomain"
ipa dnsrecord-add testzone.rlipa.local testrecord --cname-hostname=onyxtest.rlipa.local. &> /dev/null
ipa dnsrecord-add testzone.rlipa.local testrecord --cname-hostname=rltest.rlipa.local. &> /dev/null
r_checkExitStatus $?
sleep 5

func/stacks/ipa/23-ipa-sudo.sh Executable file → Normal file (54 lines changed)
View File

@ -9,51 +9,19 @@ if [ "$IPAINSTALLED" -eq 1 ]; then
r_checkExitStatus 1
fi
kdestroy -A
klist 2>&1 | grep -E "(No credentials|Credentials cache .* not found)"
kdestroy &> /dev/null
klist 2>&1 | grep -E "(No credentials|Credentials cache .* not found)" &> /dev/null
r_checkExitStatus $?
echo "b1U3OnyX!" | kinit admin@RLIPA.LOCAL
klist | grep -q "admin@RLIPA.LOCAL"
r_checkExitStatus $?
r_log "ipa" "Creating a test sudo rule"
ipa sudorule-add testrule --desc="Test rule in IPA" --hostcat=all --cmdcat=all --runasusercat=all --runasgroupcat=all &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Adding user to test sudo rule"
ipa sudorule-add-user testrule --users="ipatestuser" &> /dev/null
r_checkExitStatus $?
r_log "ipa" "Verifying rule..."
ipa sudorule-show testrule > /tmp/testrule
grep -q 'Rule name: testrule' /tmp/testrule
r_checkExitStatus $?
grep -q 'Description: Test rule in IPA' /tmp/testrule
r_checkExitStatus $?
grep -q 'Enabled: TRUE' /tmp/testrule
r_checkExitStatus $?
grep -q 'Host category: all' /tmp/testrule
r_checkExitStatus $?
grep -q 'Command category: all' /tmp/testrule
r_checkExitStatus $?
grep -q 'RunAs User category: all' /tmp/testrule
r_checkExitStatus $?
grep -q 'RunAs Group category: all' /tmp/testrule
r_checkExitStatus $?
grep -q 'Users: ipatestuser' /tmp/testrule
r_checkExitStatus $?
m_serviceCycler sssd stop
rm -rf /var/lib/sss/db/*
m_serviceCycler sssd start
expect -f - <<EOF
set send_human {.1 .3 1 .05 2}
spawn kinit admin
sleep 1
expect "Password for admin@RLIPA.LOCAL:"
send -h "b1U3OnyX!\r"
sleep 5
close
EOF
r_log "ipa" "Verifying sudo abilities"
sudo -l -U ipatestuser > /tmp/sudooutput
grep -q 'ipatestuser may run the following commands' /tmp/sudooutput
r_checkExitStatus $?
grep -q 'ALL) ALL' /tmp/sudooutput
klist | grep "admin@RLIPA.LOCAL" &> /dev/null
r_checkExitStatus $?

func/stacks/ipa/50-cleanup-ipa.sh Executable file → Normal file (0 lines changed)
View File

View File

@ -1,2 +1,7 @@
While not considered a "stack", it's a combination of many things at once. So
it is being tested as a stack.
We will be testing mainly against EL8. It is not clear if EL9 will keep idm as
a module in 9. However, certain tests will be checking for the release just in
case that the modules will disappear. (I can only hope that it does and that
it just goes back to what Fedora is doing and what EL7 does). -label

func/stacks/lamp/00-install-lamp.sh Normal file → Executable file (0 lines changed)
View File

func/stacks/lamp/01-verification.sh Normal file → Executable file (0 lines changed)
View File

func/stacks/lamp/10-test-lamp.sh Normal file → Executable file (0 lines changed)
View File

View File

@ -2,4 +2,3 @@ __pycache__/
*.py[cod]
*$py.class
*.so
Containerfile*.devel

View File

@ -56,7 +56,7 @@ RUN rm -rf /etc/yum.repos.d/*.repo
RUN useradd -o -d /var/peridot -u 1002 peridotbuilder && usermod -a -G mock peridotbuilder
RUN chown peridotbuilder:mock /etc/yum.conf && chown -R peridotbuilder:mock /etc/dnf && chown -R peridotbuilder:mock /etc/rpm && chown -R peridotbuilder:mock /etc/yum.repos.d
RUN pip install 'git+https://git.resf.org/sig_core/toolkit.git@devel#egg=empanadas&subdirectory=iso/empanadas'
RUN pip install 'git+https://git.rockylinux.org/release-engineering/public/toolkit.git@devel#egg=empanadas&subdirectory=iso/empanadas'
RUN pip install awscli

View File

@ -1,68 +0,0 @@
FROM docker.io/fedora:36
ADD images/get_arch /get_arch
ENV TINI_VERSION v0.19.0
RUN curl -o /tini -L "https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-$(/get_arch)"
RUN chmod +x /tini
RUN dnf install -y \
bash \
bzip2 \
cpio \
diffutils \
findutils \
gawk \
gcc \
gcc-c++ \
git \
grep \
gzip \
info \
make \
patch \
python3 \
redhat-rpm-config \
rpm-build \
scl-utils-build \
sed \
shadow-utils \
tar \
unzip \
util-linux \
which \
xz \
dnf-plugins-core \
createrepo_c \
rpm-sign \
sudo \
mock \
python-pip \
imagefactory \
imagefactory-plugins*
RUN sed -i -e 's/# memory = 1024/memory = 2048/' /etc/oz/oz.cfg
COPY imagefactory.patch /
COPY oz.rpm /
RUN dnf -y install /oz.rpm
RUN (cd /usr/lib/python3.10/site-packages/; patch -p1 </imagefactory.patch)
RUN ssh-keygen -t rsa -q -f "$HOME/.ssh/id_rsa" -N ""
RUN dnf clean all
RUN rm -rf /etc/yum.repos.d/*.repo /get_arch
# RUN useradd -o -d /var/peridot -u 1002 peridotbuilder && usermod -a -G mock peridotbuilder
# RUN chown -R peridotbuilder:mock /etc/dnf && chown -R peridotbuilder:mock /etc/rpm && chown -R peridotbuilder:mock /etc/yum.repos.d && chown -R peridotbuilder:mock /var/lib/imagefactory/storage
RUN pip install awscli
RUN pip install 'git+https://git.resf.org/sig_core/toolkit.git@devel#egg=empanadas&subdirectory=iso/empanadas'
ENV LIBGUESTFS_BACKEND direct
COPY prep-azure.sh /prep-azure.sh
RUN chmod +x /prep-azure.sh
ENTRYPOINT ["/tini", "--"]

View File

@ -1,23 +1,12 @@
# iso
## Setup / Install
1. Install [Poetry](https://python-poetry.org/docs/)
2. Setup: `poetry install`
3. Install dependencies: `dnf install podman mock`
4. Have fun
3. Have fun
Deeper documentation can be found at the [SIG/Core Wiki](https://sig-core.rocky.page/documentation).
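
Condensed, the setup steps above amount to the following, assuming Poetry is already on PATH and that the scripts listed later are exposed as Poetry entry points:

```bash
# Install the system dependencies, then the Python project itself
sudo dnf install podman mock
cd iso/empanadas
poetry install
# e.g. invoke one of the entry points listed under "scripts" below
poetry run build-iso
```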
## Reliance on podman and mock
### Why podman?
Podman is a requirement for performing reposyncs. This was done because it was found to be easier to spin up several podman containers than several mock chroots and it was faster than doing one at a time in a loop. Podman is also used to parallelize ISO builds.
### Why mock?
There are cases where running `mock` is the preferred go-to: For example, building lorax images. Since you cannot build a lorax image for an architecture your system does not support, trying to "parallelize" it was out of the question. Adding this support in was not only for local testing without podman, it was also done so it can be run in our peridot kube cluster for each architecture.
## Updating dependencies
@ -27,8 +16,9 @@ Changes to the poetry.lock should be commited if dependencies are added or updat
## TODO
* Verbose mode should exist to output everything that's being called or ran.
* There should be additional logging regardless, not just to stdout, but also to a file.
Verbose mode should exist to output everything that's being called or ran.
There should be additional logging regardless, not just to stdout, but also to a file.
## scripts
@ -37,15 +27,8 @@ Changes to the poetry.lock should be commited if dependencies are added or updat
* sync_sig -> Syncs SIG repositories from Peridot
* build-iso -> Builds initial ISO's using Lorax
* build-iso-extra -> Builds DVD's and other images based on Lorax data
* build-iso-live -> Builds live images
* pull-unpack-tree -> Pulls the latest lorax data from an S3 bucket and configures treeinfo
* pull-cloud-image -> Pulls the latest cloud images from an S3 bucket
* finalize_compose -> Finalizes a compose with metadata and checksums, as well as copies images
* launch-builds -> Creates a kube config to run build-iso
* build-image -> Runs build-iso
* generate_compose -> Creates a compose directory right away and optionally links it as latest
(You should only use this if you are running into errors with images)
* peridot_repoclosure -> Runs repoclosure against a peridot instance
```
## wrappers
@ -85,7 +68,7 @@ from util import Checks
rlvars = rldict['9']
r = Checks(rlvars, arch)
r.check_validity()
r.check_valid_arch()
```
### script names and permissions

View File

@ -1 +1 @@
__version__ = '0.6.1'
__version__ = '0.1.0'

View File

@ -1,17 +1,17 @@
# All imports are here
import glob
import hashlib
import logging
import os
import platform
import time
from collections import defaultdict
from typing import Tuple
import glob
import rpm
import yaml
import logging
import hashlib
from collections import defaultdict
from typing import Tuple
# An implementation from the Fabric python library
class AttributeDict(defaultdict):
def __init__(self):
@ -26,31 +26,25 @@ class AttributeDict(defaultdict):
def __setattr__(self, key, value):
self[key] = value
# These are a bunch of colors we may use in terminal output
class Color:
RED = "\033[91m"
GREEN = "\033[92m"
PURPLE = "\033[95m"
CYAN = "\033[96m"
DARKCYAN = "\033[36m"
BLUE = "\033[94m"
YELLOW = "\033[93m"
UNDERLINE = "\033[4m"
BOLD = "\033[1m"
END = "\033[0m"
INFO = "[" + BOLD + GREEN + "INFO" + END + "] "
WARN = "[" + BOLD + YELLOW + "WARN" + END + "] "
FAIL = "[" + BOLD + RED + "FAIL" + END + "] "
STAT = "[" + BOLD + CYAN + "STAT" + END + "] "
RED = '\033[91m'
GREEN = '\033[92m'
PURPLE = '\033[95m'
CYAN = '\033[96m'
DARKCYAN = '\033[36m'
BLUE = '\033[94m'
YELLOW = '\033[93m'
UNDERLINE = '\033[4m'
BOLD = '\033[1m'
END = '\033[0m'
# vars and additional checks
rldict = AttributeDict()
sigdict = AttributeDict()
config = {
"rlmacro": rpm.expandMacro("%rhel"),
"dist": "el" + rpm.expandMacro("%rhel"),
"rlmacro": rpm.expandMacro('%rhel'),
"dist": 'el' + rpm.expandMacro('%rhel'),
"arch": platform.machine(),
"date_stamp": time.strftime("%Y%m%d.%H%M%S", time.localtime()),
"compose_root": "/mnt/compose",
@ -59,6 +53,7 @@ config = {
"category_stub": "mirror/pub/rocky",
"sig_category_stub": "mirror/pub/sig",
"repo_base_url": "https://yumrepofs.build.resf.org/v1/projects",
"rocky_staging": "https://dl.rockylinux.org/stg/rocky",
"mock_work_root": "/builddir",
"container": "centos:stream9",
"distname": "Rocky Linux",
@ -67,30 +62,28 @@ config = {
"x86_64": "amd64",
"aarch64": "arm64",
"ppc64le": "ppc64le",
"s390x": "s390x",
"i686": "386",
"s390x": "s390x"
},
"aws_region": "us-east-2",
"bucket": "resf-empanadas",
"bucket_url": "https://resf-empanadas.s3.us-east-2.amazonaws.com",
"bucket_url": "https://resf-empanadas.s3.us-east-2.amazonaws.com"
}
# Importing the config from yaml
import importlib_resources
_rootdir = importlib_resources.files("empanadas")
for conf in glob.iglob(f"{_rootdir}/configs/*.yaml"):
with open(conf, "r", encoding="utf-8") as file:
with open(conf, 'r', encoding="utf-8") as file:
rldict.update(yaml.safe_load(file))
# Import all SIG configs from yaml
for conf in glob.iglob(f"{_rootdir}/sig/*.yaml"):
with open(conf, "r", encoding="utf-8") as file:
with open(conf, 'r', encoding="utf-8") as file:
sigdict.update(yaml.safe_load(file))
# The system needs to be a RHEL-like system. It cannot be Fedora or SuSE.
# if "%rhel" in config['rlmacro']:
#if "%rhel" in config['rlmacro']:
# raise SystemExit(Color.BOLD + 'This is not a RHEL-like system.' + Color.END
# + '\n\nPlease verify you are running on a RHEL-like system that is '
# 'not Fedora nor SuSE. This means that the %rhel macro will be '
@ -99,56 +92,30 @@ for conf in glob.iglob(f"{_rootdir}/sig/*.yaml"):
# These will be set in their respective var files
# REVISION = rlvars['revision'] + '-' + rlvars['rclvl']
# rlvars = rldict[rlver]
# rlvars = rldict[rlmacro]
# COMPOSE_ISO_WORKDIR = COMPOSE_ROOT + "work/" + arch + "/" + date_stamp
#REVISION = rlvars['revision'] + '-' + rlvars['rclvl']
#rlvars = rldict[rlver]
#rlvars = rldict[rlmacro]
#COMPOSE_ISO_WORKDIR = COMPOSE_ROOT + "work/" + arch + "/" + date_stamp
ALLOWED_TYPE_VARIANTS = {
"Azure": ["Base", "LVM"],
"Container": ["Base", "Minimal", "UBI", "WSL"],
"EC2": ["Base", "LVM"],
"GenericCloud": ["Base", "LVM"],
"Vagrant": ["Libvirt", "Vbox", "VMware"],
"OCP": ["Base"],
"RPI": ["Base"],
"GenericArm": ["Minimal"],
}
def valid_type_variant(_type: str, variant: str="") -> Tuple[bool, str]:
ALLOWED_TYPE_VARIANTS = {
"Container": ["Base", "Minimal"],
"GenericCloud": [],
}
def valid_type_variant(_type: str, variant: str = "") -> bool:
if _type not in ALLOWED_TYPE_VARIANTS:
raise Exception(f"Type is invalid: ({_type}, {variant})")
if ALLOWED_TYPE_VARIANTS[_type] == None:
if variant is not None:
raise Exception(f"{_type} Type expects no variant type.")
return True
if variant not in ALLOWED_TYPE_VARIANTS[_type]:
if variant and variant.capitalize() in ALLOWED_TYPE_VARIANTS[_type]:
raise Exception(
f"Capitalization mismatch. Found: ({_type}, {variant}). Expected: ({_type}, {variant.capitalize()})"
)
raise Exception(
f"Type/Variant Combination is not allowed: ({_type}, {variant})"
)
return True
return False, f"Type is invalid: ({_type}, {variant})"
elif variant not in ALLOWED_TYPE_VARIANTS[_type]:
if variant.capitalize() in ALLOWED_TYPE_VARIANTS[_type]:
return False, f"Capitalization mismatch. Found: ({_type}, {variant}). Expected: ({_type}, {variant.capitalize()})"
return False, f"Type/Variant Combination is not allowed: ({_type}, {variant})"
return True, ""
from attrs import define, field
@define(kw_only=True)
class Architecture:
name: str = field()
version: str = field()
major: int = field(converter=int)
minor: int = field(converter=int)
@classmethod
def from_version(cls, architecture: str, version: str):
major, minor = str.split(version, ".")
if architecture not in rldict[major]["allowed_arches"]:
class Architecture(str):
@staticmethod
def New(architecture: str, version: int):
if architecture not in rldict[version]["allowed_arches"]:
print("Invalid architecture/version combo, skipping")
exit()
return cls(name=architecture, version=version, major=major, minor=minor)
return Architecture(architecture)

View File

@ -1,187 +0,0 @@
---
'10':
fullname: 'Rocky Linux 10.0'
revision: '10.0'
rclvl: 'RC1'
major: '10'
minor: '0'
profile: '10'
disttag: 'el10'
code: "Red Quartz"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
provide_multilib: True
project_id: 'e7b83c0a-b514-4903-b739-6943bbb307f7'
repo_symlinks:
NFV: 'nfv'
renames:
all: 'devel'
all_repos:
- 'all'
- 'BaseOS'
- 'AppStream'
- 'CRB'
- 'HighAvailability'
- 'ResilientStorage'
- 'RT'
- 'NFV'
- 'SAP'
- 'SAPHANA'
- 'extras'
- 'plus'
structure:
packages: 'os/Packages'
repodata: 'os/repodata'
iso_map:
xorrisofs: True
iso_level: False
images:
dvd:
disc: True
variant: 'AppStream'
repos:
- 'BaseOS'
- 'AppStream'
minimal:
disc: True
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
volname: 'dvd'
BaseOS:
disc: False
isoskip: True
variant: 'BaseOS'
repos:
- 'BaseOS'
- 'AppStream'
lorax:
noupgrade: False
squashfs_only: True
repos:
- 'BaseOS'
- 'AppStream'
variant: 'BaseOS'
lorax_removes:
- 'libreport-rhel-anaconda-bugzilla'
required_pkgs:
- 'lorax'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
RPI:
format: raw.xz
OCP:
format: qcow2
variants: [Base]
Vagrant:
format: box
variants: [Libvirt, Vbox, VMware]
livemap:
builder: "lorax"
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r10'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
MATE: rocky-live-mate.ks
Cinnamon: rocky-live-cinnamon.ks
allowed_arches:
- x86_64
- aarch64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
kiwimap:
git_repo: 'https://git.resf.org/sig_core/rocky-kiwi-descriptions.git'
branch: 'r9'
supported_builds:
- live
- cloud
- container
- vagrant
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
required_pkgs: []
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
aarch64: '--forcearch=aarch64 --arch=aarch64 --arch=noarch'
ppc64le: '--forcearch=ppc64le --arch=ppc64le --arch=noarch'
s390x: '--forcearch=s390x --arch=s390x --arch=noarch'
repos:
devel: []
BaseOS: []
AppStream:
- BaseOS
CRB:
- BaseOS
- AppStream
HighAvailability:
- BaseOS
- AppStream
ResilientStorage:
- BaseOS
- AppStream
RT:
- BaseOS
- AppStream
NFV:
- BaseOS
- AppStream
SAP:
- BaseOS
- AppStream
- HighAvailability
SAPHANA:
- BaseOS
- AppStream
- HighAvailability
extra_files:
git_repo: 'https://git.rockylinux.org/staging/src/rocky-release.git'
git_raw_path: 'https://git.rockylinux.org/staging/src/rocky-release/-/raw/r10/'
branch: 'r10'
gpg:
stable: 'SOURCES/RPM-GPG-KEY-Rocky-10'
testing: 'SOURCES/RPM-GPG-KEY-Rocky-10-Testing'
list:
- 'SOURCES/Contributors'
- 'SOURCES/COMMUNITY-CHARTER'
- 'SOURCES/EULA'
- 'SOURCES/LICENSE'
- 'SOURCES/RPM-GPG-KEY-Rocky-10'
- 'SOURCES/RPM-GPG-KEY-Rocky-10-Testing'
...

View File

@ -1,196 +0,0 @@
---
'10-lookahead':
fullname: 'Rocky Linux 10.0'
revision: '10.0'
rclvl: 'LH1'
major: '10'
minor: '0'
profile: '10-lookahead'
disttag: 'el10'
code: "Red Quartz"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
provide_multilib: True
project_id: 'e7b83c0a-b514-4903-b739-6943bbb307f7'
repo_symlinks:
NFV: 'nfv'
renames:
all: 'devel'
all_repos:
- 'all'
- 'BaseOS'
- 'AppStream'
- 'CRB'
- 'HighAvailability'
- 'ResilientStorage'
- 'RT'
- 'NFV'
- 'SAP'
- 'SAPHANA'
- 'extras'
- 'plus'
structure:
packages: 'os/Packages'
repodata: 'os/repodata'
iso_map:
xorrisofs: True
iso_level: False
images:
dvd:
disc: True
variant: 'AppStream'
repos:
- 'BaseOS'
- 'AppStream'
minimal:
disc: True
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
volname: 'dvd'
BaseOS:
disc: False
isoskip: True
variant: 'BaseOS'
repos:
- 'BaseOS'
- 'AppStream'
lorax:
noupgrade: False
squashfs_only: True
repos:
- 'BaseOS'
- 'AppStream'
variant: 'BaseOS'
lorax_removes:
- 'libreport-rhel-anaconda-bugzilla'
required_pkgs:
- 'lorax'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
RPI:
format: raw.xz
OCP:
format: qcow2
variants: [Base]
Vagrant:
format: box
variants: [Libvirt, Vbox, VMware]
livemap:
builder: "lorax"
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r10'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
MATE: rocky-live-mate.ks
Cinnamon: rocky-live-cinnamon.ks
allowed_arches:
- x86_64
- aarch64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
kiwimap:
git_repo: 'https://git.resf.org/sig_core/rocky-kiwi-descriptions.git'
branch: 'r9'
supported_builds:
- live
- cloud
- container
- vagrant
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
required_pkgs:
- dracut-kiwi-live
- git
- kiwi
- kiwi-cli
- kiwi-systemdeps-image-validation
required_project_id: '47e0b4a8-84ba-43a9-bb94-eb99dde4cf14'
required_project_repos:
- 'core-common'
- 'core-infra'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
aarch64: '--forcearch=aarch64 --arch=aarch64 --arch=noarch'
ppc64le: '--forcearch=ppc64le --arch=ppc64le --arch=noarch'
s390x: '--forcearch=s390x --arch=s390x --arch=noarch'
repos:
devel: []
BaseOS: []
AppStream:
- BaseOS
CRB:
- BaseOS
- AppStream
HighAvailability:
- BaseOS
- AppStream
ResilientStorage:
- BaseOS
- AppStream
RT:
- BaseOS
- AppStream
NFV:
- BaseOS
- AppStream
SAP:
- BaseOS
- AppStream
- HighAvailability
SAPHANA:
- BaseOS
- AppStream
- HighAvailability
extra_files:
git_repo: 'https://git.rockylinux.org/staging/src/rocky-release.git'
git_raw_path: 'https://git.rockylinux.org/staging/src/rocky-release/-/raw/r10s/'
branch: 'r10s'
gpg:
stable: 'SOURCES/RPM-GPG-KEY-Rocky-10'
testing: 'SOURCES/RPM-GPG-KEY-Rocky-10-Testing'
list:
- 'SOURCES/Contributors'
- 'SOURCES/COMMUNITY-CHARTER'
- 'SOURCES/EULA'
- 'SOURCES/LICENSE'
- 'SOURCES/RPM-GPG-KEY-Rocky-10'
- 'SOURCES/RPM-GPG-KEY-Rocky-10-Testing'
...
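
These profile documents are plain YAML; the following is a minimal sketch, assuming PyYAML and a hypothetical filename, of loading the '10-lookahead' profile and reading a few of the fields shown above. The empanadas scripts further down get the same data through rldict[release] rather than opening the file directly.

import yaml  # assumption: PyYAML is available

with open('el10-lookahead.yaml') as f:        # hypothetical filename
    rlvars = yaml.safe_load(f)['10-lookahead']

print(rlvars['revision'])                     # '10.0'
print(list(rlvars['livemap']['ksentry']))     # ['Workstation', 'Workstation-Lite', 'XFCE', ...]
print(rlvars['kiwimap']['supported_builds'])  # ['live', 'cloud', 'container', 'vagrant']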

View File

@ -1,26 +1,23 @@
---
'8-beta':
'8-cloud':
fullname: 'Rocky Linux 8'
revision: '8.7'
rclvl: 'RC1'
revision: '8.6'
rclvl: 'RC2'
major: '8'
minor: '7'
minor: '6'
profile: '8'
disttag: 'el8'
code: "Green Obsidian"
bugurl: 'https://bugs.rockylinux.org'
fedora_release: 28
checksum: 'sha256'
fedora_major: '20'
allowed_arches:
- x86_64
- aarch64
provide_multilib: False
project_id: '26694529-26cd-44bd-bc59-1c1195364322'
project_id: ''
repo_symlinks:
devel: 'Devel'
NFV: 'nfv'
renames:
all: 'devel'
renames: {}
all_repos:
- 'BaseOS'
- 'AppStream'
@ -32,11 +29,28 @@
- 'extras'
- 'devel'
- 'plus'
- 'rockyrpi'
no_comps_or_groups:
- 'extras'
- 'devel'
- 'plus'
- 'rockyrpi'
comps_or_groups:
- 'BaseOS'
- 'AppStream'
- 'PowerTools'
- 'HighAvailability'
- 'ResilientStorage'
- 'RT'
- 'NFV'
has_modules:
- 'AppStream'
- 'PowerTools'
structure:
packages: 'os/Packages'
repodata: 'os/repodata'
iso_map:
xorrisofs: False
xorrisofs: True
iso_level: False
images:
dvd:
@ -50,7 +64,6 @@
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
BaseOS:
disc: False
@ -66,54 +79,21 @@
variant: 'BaseOS'
lorax_removes:
- 'libreport-rhel-anaconda-bugzilla'
extra_repos:
- 'https://yumrepofs.build.resf.org/v1/projects/f91da90d-5bdb-4cf2-80ea-e07f8dae5a5c/repo/all/aarch64/'
required_pkgs:
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-rhel'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
OCP:
format: qcow2
variants: [Base]
livemap:
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r8'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
allowed_arches:
- x86_64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
aarch64: '--forcearch=aarch64 --arch=aarch64 --arch=noarch'
ppc64le: '--forcearch=ppc64le --arch=ppc64le --arch=noarch'
s390x: '--forcearch=s390x --arch=s390x --arch=noarch'
x86_64: '--arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
aarch64: '--arch=aarch64 --arch=noarch'
ppc64le: '--arch=ppc64le --arch=noarch'
s390x: '--arch=s390x --arch=noarch'
repos:
BaseOS: []
AppStream:

View File

@ -1,26 +1,22 @@
---
'8':
fullname: 'Rocky Linux 8'
revision: '8.9'
rclvl: 'RC1'
revision: '8.6'
rclvl: 'sig-cloud'
major: '8'
minor: '9'
minor: '6'
profile: '8'
disttag: 'el8'
code: "Green Obsidian"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
fedora_release: 28
allowed_arches:
- x86_64
- aarch64
provide_multilib: False
project_id: 'e9cfc87c-d2d2-42d5-a121-852101f1a966'
project_id: ''
repo_symlinks:
devel: 'Devel'
NFV: 'nfv'
renames:
all: 'devel'
renames: {}
all_repos:
- 'BaseOS'
- 'AppStream'
@ -32,93 +28,81 @@
- 'extras'
- 'devel'
- 'plus'
- 'rockyrpi'
no_comps_or_groups:
- 'extras'
- 'devel'
- 'plus'
- 'rockyrpi'
comps_or_groups:
- 'BaseOS'
- 'AppStream'
- 'PowerTools'
- 'HighAvailability'
- 'ResilientStorage'
- 'RT'
- 'NFV'
has_modules:
- 'AppStream'
- 'PowerTools'
iso_map:
hosts:
x86_64: ''
aarch64: ''
ppc64le: ''
s390x: ''
images:
- dvd1
- minimal
- boot
repos:
- 'BaseOS'
- 'AppStream'
variant: 'BaseOS'
lorax_removes:
- 'libreport-rhel-anaconda-bugzilla'
required_packages:
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rhel'
- 'lorax-templates-generic'
structure:
packages: 'os/Packages'
repodata: 'os/repodata'
iso_map:
xorrisofs: False
iso_level: False
hosts:
x86_64: ''
aarch64: ''
images:
dvd:
disc: True
variant: 'AppStream'
repos:
- 'BaseOS'
- 'AppStream'
minimal:
disc: True
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
BaseOS:
disc: False
isoskip: True
variant: 'BaseOS'
repos:
- 'BaseOS'
- 'AppStream'
lorax:
repos:
- 'BaseOS'
- 'AppStream'
variant: 'BaseOS'
lorax_removes:
- 'libreport-rhel-anaconda-bugzilla'
required_pkgs:
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI, WSL]
RPI:
format: raw.xz
OCP:
format: qcow2
variants: [Base]
Vagrant:
format: box
variants: [Libvirt, Vbox, VMware]
livemap:
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r8'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
allowed_arches:
- x86_64
lorax_variants:
- dvd
- minimal
- BaseOS
repos:
- 'BaseOS'
- 'AppStream'
variant: 'BaseOS'
lorax_removes:
- 'libreport-rhel-anaconda-bugzilla'
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rhel'
- 'lorax-templates-generic'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
aarch64: '--forcearch=aarch64 --arch=aarch64 --arch=noarch'
ppc64le: '--forcearch=ppc64le --arch=ppc64le --arch=noarch'
s390x: '--forcearch=s390x --arch=s390x --arch=noarch'
x86_64: '--arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
aarch64: '--arch=aarch64 --arch=noarch'
ppc64le: '--arch=ppc64le --arch=noarch'
s390x: '--arch=s390x --arch=noarch'
repos:
BaseOS: []
AppStream:

View File

@ -1,149 +0,0 @@
---
'8-lookahead':
fullname: 'Rocky Linux 8'
revision: '8.8'
rclvl: 'RC1'
major: '8'
minor: '8'
profile: '8'
disttag: 'el8'
code: "Green Obsidian"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
allowed_arches:
- x86_64
- aarch64
provide_multilib: False
project_id: '3b0e9ec7-0679-4176-b253-8528eb3255eb'
repo_symlinks:
devel: 'Devel'
NFV: 'nfv'
renames:
all: 'devel'
all_repos:
- 'BaseOS'
- 'AppStream'
- 'PowerTools'
- 'HighAvailability'
- 'ResilientStorage'
- 'RT'
- 'NFV'
- 'extras'
- 'devel'
- 'plus'
structure:
packages: 'os/Packages'
repodata: 'os/repodata'
iso_map:
xorrisofs: False
iso_level: False
images:
dvd:
disc: True
variant: 'AppStream'
repos:
- 'BaseOS'
- 'AppStream'
minimal:
disc: True
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
BaseOS:
disc: False
isoskip: True
variant: 'BaseOS'
repos:
- 'BaseOS'
- 'AppStream'
lorax:
repos:
- 'BaseOS'
- 'AppStream'
variant: 'BaseOS'
lorax_removes:
- 'libreport-rhel-anaconda-bugzilla'
required_pkgs:
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
OCP:
format: qcow2
variants: [Base]
livemap:
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r8'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
allowed_arches:
- x86_64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
aarch64: '--forcearch=aarch64 --arch=aarch64 --arch=noarch'
ppc64le: '--forcearch=ppc64le --arch=ppc64le --arch=noarch'
s390x: '--forcearch=s390x --arch=s390x --arch=noarch'
repos:
BaseOS: []
AppStream:
- BaseOS
PowerTools:
- BaseOS
- AppStream
HighAvailability:
- BaseOS
- AppStream
ResilientStorage:
- BaseOS
- AppStream
RT:
- BaseOS
- AppStream
NFV:
- BaseOS
- AppStream
extra_files:
git_repo: 'https://git.rockylinux.org/staging/src/rocky-release.git'
git_raw_path: 'https://git.rockylinux.org/staging/src/rocky-release/-/raw/r8/'
branch: 'r8'
gpg:
stable: 'SOURCES/RPM-GPG-KEY-rockyofficial'
testing: 'SOURCES/RPM-GPG-KEY-rockytesting'
list:
- 'SOURCES/COMMUNITY-CHARTER'
- 'SOURCES/EULA'
- 'SOURCES/LICENSE'
- 'SOURCES/RPM-GPG-KEY-rockyofficial'
- 'SOURCES/RPM-GPG-KEY-rockytesting'
...
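
The cloudimages map in this profile lists which image types and variants get built; the short sketch below mirrors the artifact naming later used by output_name() in build_image.py, with hypothetical date, release, and architecture values.

import datetime

cloudimages = {
    'EC2': {'format': 'qcow2', 'variants': ['Base', 'LVM']},
    'Container': {'format': 'tar.xz', 'variants': ['Base', 'Minimal', 'UBI']},
}
major, version, release, arch = '8', '8.8', 0, 'x86_64'      # hypothetical build inputs
stamp = datetime.datetime.utcnow().strftime('%Y%m%d')
for image, meta in cloudimages.items():
    for variant in meta.get('variants', [None]):
        type_variant = image if not variant else f'{image}-{variant}'
        directory = f'Rocky-{major}-{type_variant}-{version}-{stamp}.{release}'
        print(f'{directory}.{arch}')          # e.g. Rocky-8-EC2-Base-8.8-<date>.0.x86_64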

View File

@ -1,23 +1,21 @@
---
'9-beta':
fullname: 'Rocky Linux 9.4'
revision: '9.4'
fullname: 'Rocky Linux 9.1'
revision: '9.1'
rclvl: 'BETA1'
major: '9'
minor: '4'
minor: '1'
profile: '9-beta'
disttag: 'el9'
code: "Blue Onyx"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
fedora_release: 34
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
provide_multilib: True
project_id: 'df5bcbfc-ba83-4da8-84d6-ae0168921b4d'
project_id: ''
repo_symlinks:
NFV: 'nfv'
renames:
@ -53,9 +51,7 @@
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
volname: 'dvd'
BaseOS:
disc: False
isoskip: True
@ -64,8 +60,6 @@
- 'BaseOS'
- 'AppStream'
lorax:
noupgrade: False
squashfs_only: True
repos:
- 'BaseOS'
- 'AppStream'
@ -76,76 +70,9 @@
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-rhel'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
RPI:
format: raw.xz
OCP:
format: qcow2
variants: [Base]
Vagrant:
format: box
variants: [Libvirt, Vbox, VMware]
livemap:
builder: "lorax"
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r9'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
MATE: rocky-live-mate.ks
Cinnamon: rocky-live-cinnamon.ks
allowed_arches:
- x86_64
- aarch64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
kiwimap:
git_repo: 'https://git.resf.org/sig_core/rocky-kiwi-descriptions.git'
branch: 'r9'
supported_builds:
- live
- cloud
- container
- vagrant
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
required_pkgs:
- dracut-kiwi-live
- git
- kiwi
- kiwi-cli
- kiwi-systemdeps-image-validation
required_project_id: '47e0b4a8-84ba-43a9-bb94-eb99dde4cf14'
required_project_repos:
- 'core-common'
- 'core-infra'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
@ -183,7 +110,7 @@
extra_files:
git_repo: 'https://git.rockylinux.org/staging/src/rocky-release.git'
git_raw_path: 'https://git.rockylinux.org/staging/src/rocky-release/-/raw/r9/'
branch: 'r9-beta'
branch: 'r9'
gpg:
stable: 'SOURCES/RPM-GPG-KEY-Rocky-9'
testing: 'SOURCES/RPM-GPG-KEY-Rocky-9-Testing'

View File

@ -1,23 +1,21 @@
---
'9':
fullname: 'Rocky Linux 9.4'
revision: '9.4'
fullname: 'Rocky Linux 9.0'
revision: '9.0'
rclvl: 'RC1'
major: '9'
minor: '4'
minor: '0'
profile: '9'
disttag: 'el9'
code: "Blue Onyx"
fedora_release: 34
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
provide_multilib: True
project_id: '6202c09e-6252-4d3a-bcd3-9c7751682970'
project_id: '55b17281-bc54-4929-8aca-a8a11d628738'
repo_symlinks:
NFV: 'nfv'
renames:
@ -53,9 +51,7 @@
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
volname: 'dvd'
BaseOS:
disc: False
isoskip: True
@ -64,8 +60,6 @@
- 'BaseOS'
- 'AppStream'
lorax:
noupgrade: False
squashfs_only: True
repos:
- 'BaseOS'
- 'AppStream'
@ -76,76 +70,9 @@
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-rhel'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
RPI:
format: raw.xz
OCP:
format: qcow2
variants: [Base]
Vagrant:
format: box
variants: [Libvirt, Vbox, VMware]
livemap:
builder: "lorax"
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r9'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
MATE: rocky-live-mate.ks
Cinnamon: rocky-live-cinnamon.ks
allowed_arches:
- x86_64
- aarch64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
kiwimap:
git_repo: 'https://git.resf.org/sig_core/rocky-kiwi-descriptions.git'
branch: 'r9'
supported_builds:
- live
- cloud
- container
- vagrant
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
required_pkgs:
- dracut-kiwi-live
- git
- kiwi
- kiwi-cli
- kiwi-systemdeps-image-validation
required_project_id: '47e0b4a8-84ba-43a9-bb94-eb99dde4cf14'
required_project_repos:
- 'core-common'
- 'core-infra'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'

View File

@ -1,60 +0,0 @@
# This is specifically for secondary/tertiary architectures
---
'9altarch':
fullname: 'Rocky Linux 9.0'
revision: '9.0'
rclvl: 'RC2'
major: '9'
minor: '0'
profile: '9'
disttag: 'el9'
code: "Blue Onyx"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
allowed_arches:
- armv7hl
- riscv64
provide_multilib: False
project_id: ''
renames:
all: 'devel'
all_repos:
- 'all'
- 'BaseOS'
- 'AppStream'
- 'CRB'
- 'extras'
- 'plus'
structure:
packages: 'os/Packages'
repodata: 'os/repodata'
iso_map: {}
livemap: {}
repoclosure_map:
arches:
armv7hl: '--forcearch=armv7hl --arch=noarch'
riscv64: '--forcearch=riscv64 --arch=noarch'
repos:
devel: []
BaseOS: []
AppStream:
- BaseOS
CRB:
- BaseOS
- AppStream
extra_files:
git_repo: 'https://git.rockylinux.org/staging/src/rocky-release.git'
git_raw_path: 'https://git.rockylinux.org/staging/src/rocky-release/-/raw/r9/'
branch: 'r9'
gpg:
stable: 'SOURCES/RPM-GPG-KEY-Rocky-9'
testing: 'SOURCES/RPM-GPG-KEY-Rocky-9-Testing'
list:
- 'SOURCES/Contributors'
- 'SOURCES/COMMUNITY-CHARTER'
- 'SOURCES/EULA'
- 'SOURCES/LICENSE'
- 'SOURCES/RPM-GPG-KEY-Rocky-9'
- 'SOURCES/RPM-GPG-KEY-Rocky-9-Testing'
...
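
The repoclosure_map arch strings above are dnf arguments; the sketch below is a hedged illustration, not the actual RepoSync implementation, of how they might be combined into a repoclosure check of AppStream against BaseOS for armv7hl.

arch_flags = '--forcearch=armv7hl --arch=noarch'.split()
check_repo, parents = 'AppStream', ['BaseOS']
cmd = ['dnf', 'repoclosure', *arch_flags, '--check', check_repo]
for repo in (check_repo, *parents):
    cmd += ['--repo', repo]
print(' '.join(cmd))
# dnf repoclosure --forcearch=armv7hl --arch=noarch --check AppStream --repo AppStream --repo BaseOS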

View File

@ -1,23 +1,21 @@
---
'9-lookahead':
fullname: 'Rocky Linux 9.5'
revision: '9.5'
fullname: 'Rocky Linux 9.1'
revision: '9.1'
rclvl: 'LH1'
major: '9'
minor: '5'
minor: '1'
profile: '9-lookahead'
disttag: 'el9'
code: "Blue Onyx"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
fedora_release: 34
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
provide_multilib: True
project_id: '6794b5a8-290b-4d0d-ad5a-47164329cbb0'
project_id: ''
repo_symlinks:
NFV: 'nfv'
renames:
@ -53,9 +51,7 @@
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
volname: 'dvd'
BaseOS:
disc: False
isoskip: True
@ -64,8 +60,6 @@
- 'BaseOS'
- 'AppStream'
lorax:
noupgrade: False
squashfs_only: True
repos:
- 'BaseOS'
- 'AppStream'
@ -76,76 +70,9 @@
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-rhel'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
RPI:
format: raw.xz
OCP:
format: qcow2
variants: [Base]
Vagrant:
format: box
variants: [Libvirt, Vbox, VMware]
livemap:
builder: "lorax"
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'r9'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
MATE: rocky-live-mate.ks
Cinnamon: rocky-live-cinnamon.ks
allowed_arches:
- x86_64
- aarch64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
kiwimap:
git_repo: 'https://git.resf.org/sig_core/rocky-kiwi-descriptions.git'
branch: 'r9'
supported_builds:
- live
- cloud
- container
- vagrant
allowed_arches:
- x86_64
- aarch64
- ppc64le
- s390x
required_pkgs:
- dracut-kiwi-live
- git
- kiwi
- kiwi-cli
- kiwi-systemdeps-image-validation
required_project_id: '47e0b4a8-84ba-43a9-bb94-eb99dde4cf14'
required_project_repos:
- 'core-common'
- 'core-infra'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'
@ -183,7 +110,7 @@
extra_files:
git_repo: 'https://git.rockylinux.org/staging/src/rocky-release.git'
git_raw_path: 'https://git.rockylinux.org/staging/src/rocky-release/-/raw/r9/'
branch: 'r9s'
branch: 'r9lh'
gpg:
stable: 'SOURCES/RPM-GPG-KEY-Rocky-9'
testing: 'SOURCES/RPM-GPG-KEY-Rocky-9-Testing'

View File

@ -2,15 +2,12 @@
'rln':
fullname: 'Rocky Linux New'
revision: '10'
rclvl: 'RLN134'
rclvl: 'RLN120'
major: '10'
minor: '0'
profile: 'rln'
disttag: 'rln134'
code: "Silver Magma"
bugurl: 'https://bugs.rockylinux.org'
checksum: 'sha256'
fedora_major: '20'
allowed_arches:
- x86_64
- aarch64
@ -43,28 +40,24 @@
iso_level: False
images:
dvd:
disc: True
discnum: '1'
variant: 'AppStream'
repos:
- 'BaseOS'
- 'AppStream'
minimal:
disc: True
discnum: '1'
isoskip: True
repos:
- 'minimal'
- 'BaseOS'
variant: 'minimal'
volname: 'dvd'
BaseOS:
disc: False
isoskip: True
variant: 'BaseOS'
repos:
- 'BaseOS'
- 'AppStream'
lorax:
noupgrade: False
repos:
- 'BaseOS'
- 'AppStream'
@ -73,54 +66,10 @@
- 'libreport-rhel-anaconda-bugzilla'
required_pkgs:
- 'lorax'
- 'genisoimage'
- 'isomd5sum'
- 'lorax-templates-rocky'
- 'lorax-templates-rhel'
- 'lorax-templates-generic'
- 'xorriso'
cloudimages:
images:
Azure:
format: vhd
variants: [Base, LVM]
primary_variant: 'Base'
EC2:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
GenericCloud:
format: qcow2
variants: [Base, LVM]
primary_variant: 'Base'
Container:
format: tar.xz
variants: [Base, Minimal, UBI]
RPI:
format: raw.xz
OCP:
format: qcow2
variants: [Base]
Vagrant:
format: box
variants: [Libvirt, Vbox, VMware]
livemap:
git_repo: 'https://git.resf.org/sig_core/kickstarts.git'
branch: 'rln'
ksentry:
Workstation: rocky-live-workstation.ks
Workstation-Lite: rocky-live-workstation-lite.ks
XFCE: rocky-live-xfce.ks
KDE: rocky-live-kde.ks
MATE: rocky-live-mate.ks
Cinnamon: rocky-live-cinnamon.ks
allowed_arches:
- x86_64
- aarch64
required_pkgs:
- 'lorax-lmc-novirt'
- 'vim-minimal'
- 'pykickstart'
- 'git'
repoclosure_map:
arches:
x86_64: '--forcearch=x86_64 --arch=x86_64 --arch=athlon --arch=i686 --arch=i586 --arch=i486 --arch=i386 --arch=noarch'

View File

@ -3,20 +3,12 @@
import argparse
import datetime
import json
import logging
import os
import pathlib
import platform
import subprocess
import sys
import tempfile
import time
import pathlib
from attrs import define, Factory, field, asdict
from botocore import args
from jinja2 import Environment, FileSystemLoader, Template
from typing import Callable, List, NoReturn, Optional, Tuple, IO, Union
from typing import List, Tuple
from empanadas.common import Architecture, rldict, valid_type_variant
from empanadas.common import _rootdir
@ -30,457 +22,123 @@ parser.add_argument('--debug', action='store_true', help="debug?")
parser.add_argument('--type', type=str, help="Image type (container, genclo, azure, aws, vagrant)", required=True)
parser.add_argument('--variant', type=str, help="", required=False)
parser.add_argument('--release', type=str, help="Image release for subsequent builds with the same date stamp (rarely needed)", required=False)
parser.add_argument('--kube', action='store_true', help="output as a K8s job(s)", required=False)
parser.add_argument('--timeout', type=str, help="change timeout for imagefactory build process (default 3600)", required=False, default='3600')
results = parser.parse_args()
rlvars = rldict[results.version]
major = rlvars["major"]
debug = results.debug
log = logging.getLogger(__name__)
log.setLevel(logging.INFO if not debug else logging.DEBUG)
handler = logging.StreamHandler(sys.stdout)
handler.setLevel(logging.INFO if not debug else logging.DEBUG)
formatter = logging.Formatter(
'%(asctime)s :: %(name)s :: %(message)s',
'%Y-%m-%d %H:%M:%S'
)
handler.setFormatter(formatter)
log.addHandler(handler)
STORAGE_DIR = pathlib.Path("/var/lib/imagefactory/storage")
KICKSTART_PATH = pathlib.Path(os.environ.get("KICKSTART_PATH", "/kickstarts"))
BUILDTIME = datetime.datetime.utcnow()
CMD_PARAM_T = List[Union[str, Callable[..., str]]]
def render_icicle_template(template: Template, architecture: Architecture) -> str:
handle, output = tempfile.mkstemp()
if not handle:
exit(3)
with os.fdopen(handle, "wb") as tmp:
_template = template.render(
architecture=architecture,
fedora_version=rlvars["fedora_release"],
iso8601date=BUILDTIME.strftime("%Y%m%d"),
installdir="kickstart" if results.kickstartdir else "os",
major=major,
release=results.release if results.release else 0,
size="10G",
type=results.type.capitalize(),
utcnow=BUILDTIME,
version_variant=rlvars["revision"] if not results.variant else f"{rlvars['revision']}-{results.variant.capitalize()}",
)
tmp.write(_template.encode())
return output
@define(kw_only=True)
class ImageBuild:
architecture: Architecture = field()
base_uuid: Optional[str] = field(default="")
cli_args: argparse.Namespace = field()
command_args: List[str] = field(factory=list)
common_args: List[str] = field(factory=list)
debug: bool = field(default=False)
image_type: str = field()
job_template: Optional[Template] = field(init=False)
kickstart_arg: List[str] = field(factory=list)
kickstart_path: pathlib.Path = field(init=False)
metadata: pathlib.Path = field(init=False)
out_type: str = field(init=False)
outdir: pathlib.Path = field(init=False)
outname: str = field(init=False)
package_args: List[str] = field(factory=list)
release: int = field(default=0)
stage_commands: Optional[List[List[Union[str,Callable]]]] = field(init=False)
target_uuid: Optional[str] = field(default="")
tdl_path: pathlib.Path = field(init=False)
template: Template = field()
timeout: str = field(default='3600')
type_variant: str = field(init=False)
variant: Optional[str] = field()
def __attrs_post_init__(self):
self.tdl_path = self.render_icicle_template()
if not self.tdl_path:
def generate_kickstart_imagefactory_args(debug: bool = False) -> str:
type_variant = results.type if not results.variant else f"{results.type}-{results.variant}" # todo -cleanup
kickstart_path = pathlib.Path(f"{KICKSTART_PATH}/Rocky-{major}-{type_variant}.ks")
if not kickstart_path.is_file():
print(f"Kickstart file is not available: {kickstart_path}")
if not debug:
exit(2)
self.type_variant = self.type_variant_name()
self.outdir, self.outname = self.output_name()
self.out_type = self.image_format()
self.command_args = self._command_args()
self.package_args = self._package_args()
self.common_args = self._common_args()
self.metadata = pathlib.Path(self.outdir, ".imagefactory-metadata.json")
return f"--file-parameter install_script {kickstart_path}"
self.kickstart_path = pathlib.Path(f"{KICKSTART_PATH}/Rocky-{self.architecture.major}-{self.type_variant}.ks")
def get_image_format(_type: str) -> str:
mapping = {
"Container": "docker"
}
return mapping[_type] if _type in mapping.keys() else ''
self.checkout_kickstarts()
self.kickstart_arg = self.kickstart_imagefactory_args()
def generate_imagefactory_commands(tdl_template: Template, architecture: Architecture) -> List[List[str]]:
template_path = render_icicle_template(tdl_template, architecture)
if not template_path:
exit(2)
try:
os.mkdir(self.outdir)
except FileExistsError as e:
log.info("Directory already exists for this release. If possible, previously executed steps may be skipped")
except Exception as e:
log.exception("Some other exception occured while creating the output directory", e)
return 0
args_mapping = {
"debug": "--debug"
}
if os.path.exists(self.metadata):
with open(self.metadata, "r") as f:
try:
o = json.load(f)
self.base_uuid = o['base_uuid']
self.target_uuid = o['target_uuid']
except json.decoder.JSONDecodeError as e:
log.exception("Couldn't decode metadata file", e)
finally:
f.flush()
# only supports boolean flags right now?
args = [param for name, param in args_mapping.items() if getattr(results,name)]
package_args = []
# Yes, this is gross. I'll fix it later.
if self.image_type in ["Container"]:
self.stage_commands = [
["tar", "-C", f"{self.outdir}", "--strip-components=1", "-x", "-f", lambda: f"{STORAGE_DIR}/{self.target_uuid}.body", "*/layer.tar"],
["xz", f"{self.outdir}/layer.tar"]
]
if self.image_type in ["RPI"]:
self.stage_commands = [
["cp", lambda: f"{STORAGE_DIR}/{self.target_uuid}.body", f"{self.outdir}/{self.outname}.raw"],
["xz", f"{self.outdir}/{self.outname}.raw"]
]
if self.image_type in ["GenericCloud", "OCP", "GenericArm"]:
self.stage_commands = [
["qemu-img", "convert", "-c", "-f", "raw", "-O", "qcow2", lambda: f"{STORAGE_DIR}/{self.target_uuid}.body", f"{self.outdir}/{self.outname}.qcow2"]
]
if self.image_type in ["EC2"]:
self.stage_commands = [
["qemu-img", "convert", "-f", "raw", "-O", "qcow2", lambda: f"{STORAGE_DIR}/{self.target_uuid}.body", f"{self.outdir}/{self.outname}.qcow2"]
]
if self.image_type in ["Azure"]:
self.stage_commands = [
["/prep-azure.sh", lambda: f"{STORAGE_DIR}/{self.target_uuid}.body", f"{STORAGE_DIR}"],
["cp", lambda: f"{STORAGE_DIR}/{self.target_uuid}.vhd", f"{self.outdir}/{self.outname}.vhd"]
]
if self.image_type in ["Vagrant"]:
_map = {
"Vbox": {"format": "vmdk", "provider": "virtualbox"},
"Libvirt": {"format": "qcow2", "provider": "libvirt", "virtual_size": 10},
"VMware": {"format": "vmdk", "provider": "vmware_desktop"}
}
output = f"{_map[self.variant]['format']}" #type: ignore
provider = f"{_map[self.variant]['provider']}" # type: ignore
kickstart_arg = generate_kickstart_imagefactory_args(True) # REMOVE DEBUG ARG
# pop from the options map that will be passed to the vagrant metadata.json
convert_options = _map[self.variant].pop('convertOptions') if 'convertOptions' in _map[self.variant].keys() else '' #type: ignore
if results.type == "Container":
args += ["--parameter", "offline_icicle", "true"]
package_args += ["--parameter", "compress", "xz"]
tar_command = ["tar", "-Oxf", f"{STORAGE_DIR}/*.body" "./layer.tar"]
type_variant = results.type if not results.variant else f"{results.type}-{results.variant}" # todo -cleanup
outname = f"Rocky-{rlvars['major']}-{type_variant}.{BUILDTIME.strftime('%Y%m%d')}.{results.release if results.release else 0}.{architecture}"
self.stage_commands = [
["qemu-img", "convert", "-c", "-f", "raw", "-O", output, *convert_options, lambda: f"{STORAGE_DIR}/{self.target_uuid}.body", f"{self.outdir}/{self.outname}.{output}"],
["tar", "-C", self.outdir, "-czf", f"/tmp/{self.outname}.box", '.'],
["mv", f"/tmp/{self.outname}.box", self.outdir]
]
self.prepare_vagrant(_map[self.variant])
outdir = pathlib.Path(f"/tmp/{outname}")
if self.stage_commands:
self.stage_commands.append(["cp", "-v", lambda: f"{STORAGE_DIR}/{self.target_uuid}.meta", f"{self.outdir}/build.meta"])
def prepare_vagrant(self, options):
"""Setup the output directory for the Vagrant type variant, dropping templates as required"""
file_loader = FileSystemLoader(f"{_rootdir}/templates")
tmplenv = Environment(loader=file_loader)
templates = {}
templates['Vagrantfile'] = tmplenv.get_template(f"vagrant/Vagrantfile.{self.variant}")
templates['metadata.json'] = tmplenv.get_template('vagrant/metadata.tmpl.json')
templates['info.json'] = tmplenv.get_template('vagrant/info.tmpl.json')
if self.variant == "VMware":
templates[f"{self.outname}.vmx"] = tmplenv.get_template('vagrant/vmx.tmpl')
if self.variant == "Vbox":
templates['box.ovf'] = tmplenv.get_template('vagrant/box.tmpl.ovf')
if self.variant == "Libvirt":
# Libvirt vagrant driver expects the qcow2 file to be called box.img.
qemu_command_index = [i for i, d in enumerate(self.stage_commands) if d[0] == "qemu-img"][0]
self.stage_commands.insert(qemu_command_index+1, ["mv", f"{self.outdir}/{self.outname}.qcow2", f"{self.outdir}/box.img"])
for name, template in templates.items():
self.render_template(f"{self.outdir}/{name}", template,
name=self.outname,
arch=self.architecture.name,
options=options
build_command = (f"imagefactory base_image {kickstart_arg} {' '.join(args)} {template_path}"
f" | tee -a {outdir}/logs/base_image-{outname}.out"
f" | tail -n4 > {outdir}/base.meta || exit 2"
)
def checkout_kickstarts(self) -> int:
cmd = ["git", "clone", "--branch", f"r{self.architecture.major}", rlvars['livemap']['git_repo'], f"{KICKSTART_PATH}"]
ret, out, err, _ = self.runCmd(cmd, search=False)
log.debug(out)
log.debug(err)
if ret > 0:
ret = self.pull_kickstarts()
return ret
def pull_kickstarts(self) -> int:
cmd: CMD_PARAM_T = ["git", "-C", f"{KICKSTART_PATH}", "reset", "--hard", "HEAD"]
ret, out, err, _ = self.runCmd(cmd, search=False)
log.debug(out)
log.debug(err)
if ret == 0:
cmd = ["git", "-C", f"{KICKSTART_PATH}", "pull"]
ret, out, err, _ = self.runCmd(cmd, search=False)
log.debug(out)
log.debug(err)
return ret
out_type = get_image_format(results.type)
package_command = ["imagefactory", "target_image", *args, template_path,
"--id", "$(awk '$1==\"UUID:\"{print $NF}'"+f" /tmp/{outname}/base.meta)",
*package_args,
"--parameter", "repository", outname, out_type,
"|", "tee", "-a", f"{outdir}/base_image-{outname}.out",
"|", "tail", "-n4", ">", f"{outdir}/target.meta", "||", "exit", "3"
]
def output_name(self) -> Tuple[pathlib.Path, str]:
directory = f"Rocky-{self.architecture.major}-{self.type_variant}-{self.architecture.version}-{BUILDTIME.strftime('%Y%m%d')}.{self.release}"
name = f"{directory}.{self.architecture.name}"
outdir = pathlib.Path(f"/tmp/", directory)
return outdir, name
copy_command = (f"aws s3 cp --recursive {outdir}/* s3://resf-empanadas/buildimage-{ outname }/{ BUILDTIME.strftime('%s') }/")
commands = [build_command, package_command, copy_command]
return commands
def type_variant_name(self):
return self.image_type if not self.variant else f"{self.image_type}-{self.variant}"
def _command_args(self):
args_mapping = {
"debug": "--debug",
}
return [param for name, param in args_mapping.items() if getattr(self.cli_args, name)]
def _package_args(self) -> List[str]:
if self.image_type in ["Container"]:
return ["--parameter", "compress", "xz"]
return [""]
def _common_args(self) -> List[str]:
args = []
if self.image_type in ["Container"]:
args = ["--parameter", "offline_icicle", "true"]
if self.image_type in ["GenericCloud", "EC2", "Vagrant", "Azure", "OCP", "RPI", "GenericArm"]:
args = ["--parameter", "generate_icicle", "false"]
return args
def image_format(self) -> str:
mapping = {
"Container": "docker"
}
return mapping[self.image_type] if self.image_type in mapping.keys() else ''
def kickstart_imagefactory_args(self) -> List[str]:
if not self.kickstart_path.is_file():
log.warning(f"Kickstart file is not available: {self.kickstart_path}")
if not debug:
log.warning("Exiting because debug mode is not enabled.")
exit(2)
return ["--file-parameter", "install_script", str(self.kickstart_path)]
def render_template(self, path, template, **kwargs) -> pathlib.Path:
with open(path, "wb") as f:
_template = template.render(**kwargs)
f.write(_template.encode())
f.flush()
output = pathlib.Path(path)
if not output.exists():
log.error("Failed to write template")
raise Exception("Failed to template")
return output
def render_icicle_template(self) -> pathlib.Path:
output = tempfile.NamedTemporaryFile(delete=False).name
return self.render_template(output, self.template,
architecture=self.architecture.name,
iso8601date=BUILDTIME.strftime("%Y%m%d"),
installdir="kickstart" if self.cli_args.kickstartdir else "os",
major=self.architecture.major,
minor=self.architecture.minor,
release=self.release,
size="10G",
type=self.image_type,
utcnow=BUILDTIME,
version_variant=self.architecture.version if not self.variant else f"{self.architecture.version}-{self.variant}",
)
def build_command(self) -> List[str]:
build_command = ["imagefactory", "--timeout", self.timeout, *self.command_args, "base_image", *self.common_args, *self.kickstart_arg, self.tdl_path]
return build_command
def package_command(self) -> List[str]:
package_command = ["imagefactory", *self.command_args, "target_image", self.out_type, *self.common_args,
"--id", f"{self.base_uuid}",
*self.package_args,
"--parameter", "repository", self.outname,
]
return package_command
def copy_command(self) -> List[str]:
copy_command = ["aws", "s3", "cp", "--recursive", f"{self.outdir}/",
f"s3://resf-empanadas/buildimage-{self.architecture.version}-{self.architecture.name}/{ self.outname }/{ BUILDTIME.strftime('%s') }/"
]
return copy_command
def build(self) -> int:
if self.base_uuid:
return 0
self.fix_ks()
ret, out, err, uuid = self.runCmd(self.build_command())
if uuid:
self.base_uuid = uuid.rstrip()
self.save()
return ret
def package(self) -> int:
# Some build types don't need to be packaged by imagefactory
# @TODO remove business logic if possible
if self.image_type in ["GenericCloud", "EC2", "Azure", "Vagrant", "OCP", "RPI", "GenericArm"]:
self.target_uuid = self.base_uuid if hasattr(self, 'base_uuid') else ""
if self.target_uuid:
return 0
ret, out, err, uuid = self.runCmd(self.package_command())
if uuid:
self.target_uuid = uuid.rstrip()
self.save()
return ret
def stage(self) -> int:
""" Stage the artifacst from wherever they are (unpacking and converting if needed)"""
if not hasattr(self,'stage_commands'):
return 0
returns = []
for command in self.stage_commands: #type: ignore
ret, out, err, _ = self.runCmd(command, search=False)
returns.append(ret)
return all(ret > 0 for ret in returns)
def copy(self, skip=False) -> int:
# move or unpack if necessary
log.info("Executing staging commands")
if (stage := self.stage() > 0):
raise Exception(stage)
if not skip:
log.info("Copying files to output directory")
ret, out, err, _ = self.runCmd(self.copy_command(), search=False)
return ret
log.info(f"Build complete! Output available in {self.outdir}/")
return 0
def runCmd(self, command: CMD_PARAM_T, search: bool = True) -> Tuple[int, Union[bytes,None], Union[bytes,None], Union[str,None]]:
prepared, _ = self.prepare_command(command)
log.info(f"Running command: {' '.join(prepared)}")
kwargs = {
"stderr": subprocess.PIPE,
"stdout": subprocess.PIPE
}
if debug: del kwargs["stderr"]
with subprocess.Popen(prepared, **kwargs) as p:
uuid = None
# @TODO implement this as a callback?
if search:
for _, line in enumerate(p.stdout): # type: ignore
ln = line.decode()
if ln.startswith("UUID: "):
uuid = ln.split(" ")[-1]
log.debug(f"found uuid: {uuid}")
out, err = p.communicate()
res = p.wait(), out, err, uuid
if res[0] > 0:
log.error(f"Problem while executing command: '{prepared}'")
if search and not res[3]:
log.error("UUID not found in stdout. Dumping stdout and stderr")
self.log_subprocess(res)
return res
def prepare_command(self, command_list: CMD_PARAM_T) -> Tuple[List[str],List[None]]:
"""
Elements of a command may be callables (typically lambdas) that are evaluated
at preparation time with the available locals. This allows lazy evaluation of
f-strings whose values are not known at assignment time, e.g. filling in a
later command with a value extracted from a previous step or command.
"""
r = []
return r, [r.append(c()) if (callable(c) and c.__name__ == '<lambda>') else r.append(str(c)) for c in command_list]
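To make the lazy-lambda convention concrete: a callable element is resolved only when the command is prepared, so it can reference state (such as a UUID) produced by an earlier step. A minimal, hypothetical illustration follows; it is not part of build_image.py.
# Illustrative sketch only; hypothetical values.
target_uuid = None
command = ["cp", lambda: f"/var/lib/imagefactory/storage/{target_uuid}.body", "/tmp/out.raw"]
target_uuid = "1234-abcd"   # becomes known once the packaging step reports a UUID
prepared = [c() if callable(c) else str(c) for c in command]
# prepared == ['cp', '/var/lib/imagefactory/storage/1234-abcd.body', '/tmp/out.raw']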
def log_subprocess(self, result: Tuple[int, Union[bytes, None], Union[bytes, None], Union[str, None]]):
def log_lines(title, lines):
log.info(f"====={title}=====")
log.info(lines.decode())
log.info(f"Command return code: {result[0]}")
stdout = result[1]
stderr = result[2]
if stdout:
log_lines("Command STDOUT", stdout)
if stderr:
log_lines("Command STDERR", stderr)
def fix_ks(self):
cmd: CMD_PARAM_T = ["sed", "-i", f"s,$basearch,{self.architecture.name},", str(self.kickstart_path)]
self.runCmd(cmd, search=False)
def render_kubernetes_job(self):
commands = [self.build_command(), self.package_command(), self.copy_command()]
if not self.job_template:
return None
template = self.job_template.render(
architecture=self.architecture.name,
backoffLimit=4,
buildTime=BUILDTIME.strftime("%s"),
command=commands,
imageName="ghcr.io/rockylinux/sig-core-toolkit:latest",
jobname="buildimage",
namespace="empanadas",
major=major,
restartPolicy="Never",
)
return template
def save(self):
with open(self.metadata, "w") as f:
try:
o = { name: getattr(self, name) for name in ["base_uuid", "target_uuid"] }
log.debug(o)
json.dump(o, f)
except AttributeError as e:
log.error("Couldn't find attribute in object. Something is probably wrong", e)
except Exception as e:
log.exception(e)
finally:
f.flush()
def run():
try:
valid_type_variant(results.type, results.variant)
except Exception as e:
log.exception(e)
result, error = valid_type_variant(results.type, results.variant)
if not result:
print(error)
exit(2)
file_loader = FileSystemLoader(f"{_rootdir}/templates")
tmplenv = Environment(loader=file_loader)
tdl_template = tmplenv.get_template('icicle/tdl.xml.tmpl')
job_template = tmplenv.get_template('kube/Job.tmpl')
arches = rlvars['allowed_arches'] if results.kube else [platform.uname().machine]
for architecture in rlvars["allowed_arches"]:
architecture = Architecture.New(architecture, major)
for architecture in arches:
IB = ImageBuild(
architecture=Architecture.from_version(architecture, rlvars['revision']),
cli_args=results,
debug=results.debug,
image_type=results.type,
release=results.release if results.release else 0,
template=tdl_template,
variant=results.variant,
)
if results.kube:
IB.job_template = tmplenv.get_template('kube/Job.tmpl')
#commands = IB.kube_commands()
print(IB.render_kubernetes_job())
else:
ret = IB.build()
ret = IB.package()
ret = IB.copy()
commands = generate_imagefactory_commands(tdl_template, architecture)
print(job_template.render(
architecture=architecture,
backoffLimit=4,
buildTime=datetime.datetime.utcnow().strftime("%s"),
command=commands,
imageName="ghcr.io/rockylinux/sig-core-toolkit:latest",
jobname="buildimage",
namespace="empanadas",
major=major,
restartPolicy="Never",
))

View File

@ -13,7 +13,6 @@ parser.add_argument('--isolation', type=str, help="mock isolation mode")
parser.add_argument('--rc', action='store_true', help="Release Candidate, Beta, RLN")
parser.add_argument('--local-compose', action='store_true', help="Compose Directory is Here")
parser.add_argument('--logger', type=str)
parser.add_argument('--hashed', action='store_true')
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
@ -25,7 +24,6 @@ a = IsoBuild(
rc=results.rc,
isolation=results.isolation,
compose_dir_is_here=results.local_compose,
hashed=results.hashed,
logger=results.logger,
)

View File

@ -16,9 +16,6 @@ parser.add_argument('--local-compose', action='store_true', help="Compose Direct
parser.add_argument('--logger', type=str)
parser.add_argument('--extra-iso', type=str, help="Granular choice of which ISO is built")
parser.add_argument('--extra-iso-mode', type=str, default='local')
parser.add_argument('--hashed', action='store_true')
parser.add_argument('--updated-image', action='store_true')
parser.add_argument('--image-increment',type=str, default='0')
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
@ -33,10 +30,7 @@ a = IsoBuild(
extra_iso=results.extra_iso,
extra_iso_mode=results.extra_iso_mode,
compose_dir_is_here=results.local_compose,
hashed=results.hashed,
logger=results.logger,
updated_image=results.updated_image,
image_increment=results.image_increment
logger=results.logger
)
def run():

View File

@ -1,43 +0,0 @@
# Builds ISOs
import argparse
from empanadas.common import *
from empanadas.util import Checks
from empanadas.util import LiveBuild
parser = argparse.ArgumentParser(description="Live ISO Compose")
parser.add_argument('--release', type=str, help="Major Release Version or major-type (eg 9-beta)", required=True)
parser.add_argument('--isolation', type=str, help="Mock Isolation")
parser.add_argument('--local-compose', action='store_true', help="Compose Directory is Here")
parser.add_argument('--peridot', action='store_true', help="Use peridot repos")
parser.add_argument('--image', type=str, help="Granular choice of which live image is built")
parser.add_argument('--logger', type=str)
parser.add_argument('--live-iso-mode', type=str, default='local')
parser.add_argument('--hashed', action='store_true')
parser.add_argument('--just-copy-it', action='store_true', help="Just copy the images to the compose dir")
parser.add_argument('--force-build', action='store_true', help="Just build and overwrite the images")
parser.add_argument('--builder', type=str, help="Choose a builder type and override the set value in the configs")
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
a = LiveBuild(
rlvars,
config,
major=major,
isolation=results.isolation,
live_iso_mode=results.live_iso_mode,
image=results.image,
compose_dir_is_here=results.local_compose,
peridot=results.peridot,
hashed=results.hashed,
justcopyit=results.just_copy_it,
force_build=results.force_build,
builder=results.builder,
logger=results.logger
)
def run():
a.run_build_live_iso()

View File

@ -1,34 +0,0 @@
# Builds ISOs
import argparse
from empanadas.common import *
from empanadas.util import Checks
from empanadas.util import IsoBuild
parser = argparse.ArgumentParser(description="Live ISO Compose")
parser.add_argument('--release', type=str, help="Major Release Version or major-type (eg 9-beta)", required=True)
parser.add_argument('--isolation', type=str, help="Mock Isolation")
parser.add_argument('--local-compose', action='store_true', help="Compose Directory is Here")
parser.add_argument('--image', action='store_true', help="Live image name")
parser.add_argument('--logger', type=str)
parser.add_argument('--live-iso-mode', type=str, default='local')
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
a = LiveBuild(
rlvars,
config,
major=major,
isolation=results.isolation,
extra_iso_mode=results.live_iso_mode,
image=results.image,
compose_dir_is_here=results.local_compose,
logger=results.logger
)
def run():
print(a.livemap['ksentry'])
print(a.livemap['ksentry'].keys())

View File

@ -1,37 +0,0 @@
# This script can be called to do single syncs or full on syncs.
import argparse
from empanadas.common import *
from empanadas.util import Checks
from empanadas.util import RepoSync
# Start up the parser baby
parser = argparse.ArgumentParser(description="Peridot Sync and Compose")
# All of our options
parser.add_argument('--release', type=str, help="Major Release Version or major-type (eg 9-beta)", required=True)
parser.add_argument('--arch', type=str, help="Architecture")
parser.add_argument('--fpsync', type=str, help="Use fpsync instead of rsync")
parser.add_argument('--logger', type=str)
# Parse them
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
r = Checks(rlvars, config['arch'])
r.check_validity()
# Send them and do whatever I guess
a = RepoSync(
rlvars,
config,
major=major,
arch=results.arch,
fpsync=results.fpsync,
logger=results.logger,
)
def run():
a.run_compose_closeout()

View File

@ -1,84 +0,0 @@
# This script can be called to do single syncs or full on syncs.
import os
import argparse
import logging
import sys
from empanadas.common import *
from empanadas.util import Checks
from empanadas.util import RepoSync
from empanadas.util import Shared
# Start up the parser baby
parser = argparse.ArgumentParser(description="Peridot Sync and Compose")
# All of our options
parser.add_argument('--release', type=str, help="Major Release Version or major-type (eg 9-beta)", required=True)
parser.add_argument('--sig', type=str, help="SIG Name if applicable")
parser.add_argument('--symlink', action='store_true', help="symlink to latest")
parser.add_argument('--copy-old-compose', action='store_true', help="Runs an rsync from previous compose")
parser.add_argument('--logger', type=str)
# Parse them
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
r = Checks(rlvars, config['arch'])
r.check_validity()
# Send them and do whatever I guess
def run():
if results.logger is None:
log = logging.getLogger("generate")
log.setLevel(logging.INFO)
handler = logging.StreamHandler(sys.stdout)
handler.setLevel(logging.INFO)
formatter = logging.Formatter(
'%(asctime)s :: %(name)s :: %(message)s',
'%Y-%m-%d %H:%M:%S'
)
handler.setFormatter(formatter)
log.addHandler(handler)
else:
log = results.logger
compose_base = config['compose_root'] + "/" + major
shortname = config['shortname']
version = rlvars['revision']
date_stamp = config['date_stamp']
profile = rlvars['profile']
logger = log
if results.sig is not None:
shortname = 'SIG-' + results.sig
generated_dir = Shared.generate_compose_dirs(
compose_base,
shortname,
version,
date_stamp,
logger
)
compose_latest_dir = os.path.join(
config['compose_root'],
major,
"latest-{}-{}".format(
shortname,
profile,
)
)
if results.copy_old_compose:
if os.path.exists(compose_latest_dir):
previous_compose_path = os.path.realpath(compose_latest_dir)
else:
log.warning('No symlink exists; we cannot copy from the old compose')
if results.symlink:
if os.path.exists(compose_latest_dir):
os.remove(compose_latest_dir)
os.symlink(generated_dir, compose_latest_dir)
log.info('Generated compose dirs.')

View File

@ -31,7 +31,7 @@ def run():
elif results.env == "all":
arches = EKSARCH+EXTARCH
command = ["build-iso", "--release", f"{results.release}", "--isolation", "simple", "--hashed"]
command = ["build-iso", "--release", f"{results.release}", "--isolation", "simple"]
if results.rc:
command += ["--rc"]

View File

@ -1,38 +0,0 @@
# This is for doing repoclosures upstream
import argparse
from empanadas.common import *
from empanadas.util import Checks
from empanadas.util import RepoSync
# Start up the parser baby
parser = argparse.ArgumentParser(description="Peridot Upstream Repoclosure")
# All of our options
parser.add_argument('--release', type=str, help="Major Release Version or major-type (eg 9-beta)", required=True)
parser.add_argument('--simple', action='store_false')
parser.add_argument('--enable-repo-gpg-check', action='store_true')
parser.add_argument('--hashed', action='store_true')
parser.add_argument('--logger', type=str)
# Parse them
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
r = Checks(rlvars, config['arch'])
r.check_validity()
a = RepoSync(
rlvars,
config,
major=major,
hashed=results.hashed,
parallel=results.simple,
repo_gpg_check=results.enable_repo_gpg_check,
logger=results.logger,
)
def run():
a.run_upstream_repoclosure()

View File

@ -1,39 +0,0 @@
# Pulls generic images/artifacts
import argparse
from empanadas.common import *
from empanadas.util import Checks
from empanadas.util import IsoBuild
parser = argparse.ArgumentParser(description="ISO Artifact Builder")
parser.add_argument('--release', type=str, help="Major Release Version", required=True)
parser.add_argument('--s3', action='store_true', help="S3")
parser.add_argument('--arch', type=str, help="Architecture")
parser.add_argument('--local-compose', action='store_true', help="Compose Directory is Here")
parser.add_argument('--force-download', action='store_true', help="Force a download")
parser.add_argument('--s3-region', type=str, help="S3 region (overrides defaults)")
parser.add_argument('--s3-bucket', type=str, help="S3 bucket name (overrides defaults)")
parser.add_argument('--s3-bucket-url', type=str, help="S3 bucket url (overrides defaults)")
parser.add_argument('--logger', type=str)
results = parser.parse_args()
rlvars = rldict[results.release]
major = rlvars['major']
a = IsoBuild(
rlvars,
config,
major=major,
s3=results.s3,
arch=results.arch,
force_download=results.force_download,
compose_dir_is_here=results.local_compose,
s3_region=results.s3_region,
s3_bucket=results.s3_bucket,
s3_bucket_url=results.s3_bucket_url,
logger=results.logger,
)
def run():
a.run_pull_generic_images()

Some files were not shown because too many files have changed in this diff