kernelcollector: Add webhook support
darktohka committed Sep 17, 2020
1 parent 6f42f79 commit 36ffee5
Showing 10 changed files with 256 additions and 96 deletions.
2 changes: 1 addition & 1 deletion Dockerfile
@@ -25,4 +25,4 @@ RUN \
&& rm -rf /tmp/* /var/cache/apk/*

USER kernelcollector
ENTRYPOINT ["python", "-m", "kernelcollector.Main"]
ENTRYPOINT ["/bin/sh", "/srv/entrypoint.sh"]
10 changes: 5 additions & 5 deletions README.md
@@ -5,15 +5,15 @@ KernelCollector is a small Python script that handles the upkeep of a Linux kern
It keeps track of header, image and module packages for the `amd64`, `i386`, `armhf`, `arm64`, `ppc64el` and `s390x` architectures.

There are three kinds of kernel images that KernelCollector collects:
* `linux-current`: The newest stable version of the Linux kernel, for example: `v5.0.7`
* `linux-beta`: The newest release candidate of the Linux kernel, for example: `v5.1-rc5`
* `linux-devel`: The newest trunk build of the Linux kernel, for example: `v2019-04-16`
* `linux-current`: The newest stable version of the Linux kernel, for example: `v5.8.10`
* `linux-beta`: The newest release candidate of the Linux kernel, for example: `v5.9-rc5`
* `linux-devel`: The newest trunk build of the Linux kernel, for example: `v2019-09-17`

Using a cronjob, KernelCollector can always keep these packages updated in the Debian package repository.
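For example, a daily crontab entry along these lines would keep the repository current (the install path, schedule, and log file are illustrative assumptions, not taken from the project):

```
# m h dom mon dow  command
# Run KernelCollector once a day at 03:00, from the directory that
# holds settings.json (the /opt path below is an assumption).
0 3 * * * cd /opt/kernelcollector && python -m kernelcollector.Main >> /var/log/kernelcollector.log 2>&1
```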

This is useful because it allows users to automatically upgrade their Linux kernels to the latest version from their chosen update channel, without any user input. The channels stay separate: for example, you will not receive beta or devel versions while on the current release channel.

Older kernel versions will disappear once the newest kernel is installed. If kernel version `5.0.8` is released, everybody using the KernelCollector repository will automatically be upgraded to version `5.0.8`, straight from `5.0.7` - and so on.
Older kernel versions will disappear once the newest kernel is installed. If kernel version `5.8.10` is released, everybody using the KernelCollector repository will automatically be upgraded to version `5.8.10`, straight from `5.8.9` - and so on.

This kind of setup might not be useful for some people, or might be too risky; in that case, you are welcome to handle your own kernel installations.

@@ -71,7 +71,7 @@ Next, edit the `settings.json` file to your liking:
* `distribution`: Defaults to `sid`. This really doesn't matter, as the packages require a newer version of Debian or Ubuntu, and this is just a matter of preference.
* `gpgKey`: Defaults to `ABCD`. Obviously, this isn't a real GPG key. Repositories maintained by KernelCollector are GPG signed. You will have to create your own GPG key, which can be password protected if needed.
* `gpgPassword`: Defaults to `none`. If you don't have a GPG password, please set the password to `none`. If you have one, specify it here.
* `repoPath`: Defaults to `/var/www/debian`. This is the filesystem path of your repository, where the artifacts will be published to.
* `repoPath`: Defaults to `/srv/packages`. This is the filesystem path of your repository, where the artifacts will be published to.
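Putting the defaults above together, a minimal `settings.json` might look like this sketch (the GPG key ID is a placeholder, and `webhook` — the error-report webhook URL introduced in this commit — can be left as `null` to disable it):

```json
{
    "architectures": ["amd64"],
    "description": "Package repository for newest Linux kernels",
    "distribution": "sid",
    "gpgKey": "ABCDEF0123456789",
    "gpgPassword": "none",
    "repoPath": "/srv/packages",
    "webhook": null
}
```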

You might notice that you need a GPG key to sign the kernel packages. Creating one is out of scope for this tutorial, and Google is your friend in this regard, though `gpg --full-generate-key` might be a good place to start.
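A rough, interactive command sequence might look like the following (the e-mail is a placeholder; exporting to a `gpg.key` file matches what `entrypoint.sh` imports in this commit):

```
# Generate a signing key interactively
gpg --full-generate-key

# Look up the long key ID to use as "gpgKey" in settings.json
gpg --list-secret-keys --keyid-format long

# Export the secret key so it can be imported where the collector runs
gpg --export-secret-keys --armor builder@example.com > gpg.key
```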

7 changes: 7 additions & 0 deletions entrypoint.sh
@@ -0,0 +1,7 @@
#!/bin/sh

# Import our GPG key
gpg --import gpg.key

# Run the actual program
python -m kernelcollector.Main
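Since the new entrypoint imports `gpg.key` before starting the collector, a hypothetical container invocation could mount the key, settings, and repository directory like so (the image name, host paths, and `/srv` working directory are assumptions, not defined by this commit):

```
docker run -d \
  -v /path/to/settings.json:/srv/settings.json \
  -v /path/to/gpg.key:/srv/gpg.key \
  -v /srv/packages:/srv/packages \
  kernelcollector:latest
```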
35 changes: 16 additions & 19 deletions kernelcollector/Main.py
@@ -1,18 +1,17 @@
from .PackageCollector import PackageCollector
from .PackageList import PackageList
from .PackageDistribution import PackageDistribution
import traceback, json, socket, os, sys, time
from .WebhookEmitter import WebhookEmitter
import traceback, json, logging, os, sys

class Main(object):

def __init__(self):
try:
if os.path.exists('settings.json'):
with open('settings.json', 'r') as file:
self.settings = json.load(file)
except:
self.settings = {}

defaultValues = {'repoPath': '/var/www/debian', 'gpgKey': 'ABCDEF', 'gpgPassword': 'none', 'distribution': 'sid', 'description': 'Package repository for newest Linux kernels', 'architectures': ['amd64']}
defaultValues = {'repoPath': '/srv/packages', 'gpgKey': 'ABCDEF', 'gpgPassword': 'none', 'distribution': 'sid', 'description': 'Package repository for newest Linux kernels', 'architectures': ['amd64'], 'webhook': None}
edited = False

for key, value in defaultValues.items():
@@ -25,34 +24,32 @@ def __init__(self):
self.saveSettings()
sys.exit()

self.packageList = PackageList(self.settings['repoPath'].rstrip('/'), self.settings['gpgKey'], self.settings['gpgPassword'], verbose=True)
self.packageDist = PackageDistribution(self.settings['distribution'], self.settings['architectures'], self.settings['description'], verbose=True)
self.logger = WebhookEmitter(self.settings['webhook'])

self.packageList = PackageList(self.logger, self.settings['repoPath'].rstrip('/'), self.settings['gpgKey'], self.settings['gpgPassword'])
self.packageDist = PackageDistribution(self.logger, self.settings['distribution'], self.settings['architectures'], self.settings['description'])
self.packageList.addDistribution(self.packageDist)

self.packageCollector = PackageCollector(self.settings['architectures'], self.packageList, verbose=True)
self.logFolder = os.path.join(os.getcwd(), 'logs')
self.packageCollector = PackageCollector(self.logger, self.settings['architectures'], self.packageList)

def runAllBuilds(self):
# Attempt to run all builds.
# If something goes wrong, a log file will be created with the error.
# If something goes wrong, a webhook message will be sent.

try:
self.packageCollector.runAllBuilds()
except:
log = traceback.format_exc()

if not os.path.exists(self.logFolder):
os.makedirs(self.logFolder)

logFilename = os.path.join(self.logFolder, 'crash-{0}.log'.format(int(time.time())))

with open(logFilename, 'w') as file:
file.write(log)
self.logger.add('Something went wrong while building packages!', alert=True)
self.logger.add(traceback.format_exc(), pre=True)
self.logger.send_all()

def saveSettings(self):
with open('settings.json', 'w') as file:
json.dump(self.settings, file, sort_keys=True, indent=4, separators=(',', ': '))

if __name__ == '__main__':
logging.basicConfig(format='[%(asctime)s] %(message)s', datefmt='%Y/%m/%d %I:%M:%S %p')
logging.root.setLevel(logging.INFO)

main = Main()
main.runAllBuilds()
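The `WebhookEmitter` class itself is not shown in this diff; judging from the call sites above (`add(message, alert=..., pre=...)` followed by `send_all()`), it buffers messages and posts them to a configured webhook URL. The sketch below is a guess at such an interface — the payload shape, the 2000-character truncation, and the formatting markers are all assumptions, not the project's actual implementation:

```python
import json
import urllib.request

class WebhookEmitter:
    """Buffers log messages and flushes them to a webhook URL, if one is set."""

    def __init__(self, url):
        self.url = url          # None disables sending entirely
        self.messages = []

    def add(self, message, alert=False, pre=False):
        # pre=True wraps the message in a code block; alert=True adds a marker
        if pre:
            message = '```\n{}\n```'.format(message)
        if alert:
            message = ':warning: ' + message
        self.messages.append(message)

    def send_all(self):
        # Join and clear the buffer, then POST it if a URL is configured
        payload = '\n'.join(self.messages)
        self.messages = []
        if not self.url or not payload:
            return
        body = json.dumps({'content': payload[:2000]}).encode('utf-8')
        req = urllib.request.Request(
            self.url, data=body,
            headers={'Content-Type': 'application/json'})
        urllib.request.urlopen(req).close()
```

With `url=None`, `send_all()` simply drops the buffered messages, which keeps the collector usable without a webhook configured.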
70 changes: 42 additions & 28 deletions kernelcollector/PackageCollector.py
@@ -1,29 +1,26 @@
from bs4 import BeautifulSoup
from . import Utils
import json, requests, tempfile, shutil, os, time, uuid
import json, logging, tempfile, shutil, os, time, uuid
import requests

FIND_IMAGE_RM = 'rm -f /lib/modules/$version/.fresh-install'
NEW_FIND_IMAGE_RM = 'rm -rf /lib/modules/$version'
INITRD_IMAGE_RMS = ['rm -f /boot/initrd.img-$version', 'rm -f /var/lib/initramfs-tools/$version']

class PackageCollector(object):

def __init__(self, architectures, pkgList, verbose=True):
def __init__(self, logger, architectures, pkgList):
self.logger = logger
self.architectures = architectures
self.pkgList = pkgList
self.tmpDir = os.path.join(tempfile.gettempdir(), uuid.uuid4().hex)
self.verbose = verbose
self.currentDir = os.getcwd()
self.reloadCache()

def log(self, message):
if self.verbose:
print(message)

def runAllBuilds(self):
# Get all releases and prereleases
self.log('Current directory is {0}'.format(self.currentDir))
self.log('Checking latest versions of the kernel...')
logging.info(f'Current directory is {self.currentDir}')
logging.info('Checking latest versions of the kernel...')
releases, prereleases = self.getAllReleases()

# The newest release is always the last in the list
@@ -38,9 +35,9 @@ def runAllBuilds(self):
dailyRelease = self.getNewestDailyRelease()
downloaded = False

self.log('Current release: {0}'.format(release))
self.log('Current release candidate: {0}'.format(prerelease))
self.log('Current daily build: v{0}'.format(dailyRelease))
logging.info(f'Current release: {release}')
logging.info(f'Current release candidate: {prerelease}')
logging.info(f'Current daily build: v{dailyRelease}')

# Create the temporary folder
if os.path.exists(self.tmpDir):
@@ -57,7 +54,7 @@ downloaded = True
downloaded = True

# Redownload devel build if necessary
if self.downloadAndRepackAll('daily/{0}'.format(dailyRelease), dailyRelease, 'linux-devel'):
if self.downloadAndRepackAll(f'daily/{dailyRelease}', dailyRelease, 'linux-devel'):
downloaded = True

# Update cache and publish repository
@@ -138,7 +135,7 @@ def getNewestDailyRelease(self):
return max(versions)

def getFiles(self, releaseLink, releaseType):
with requests.get('https://kernel.ubuntu.com/~kernel-ppa/mainline/{0}'.format(releaseLink)) as site:
with requests.get(f'https://kernel.ubuntu.com/~kernel-ppa/mainline/{releaseLink}') as site:
data = site.content

files = {}
@@ -171,17 +168,17 @@ def getFiles(self, releaseLink, releaseType):
# and they can be either generic, low latency or snapdragon (the processor)
# The only package that doesn't have a sub type is headers-all, which is archless
for type in ('image', 'modules', 'headers'):
if '-{0}-'.format(type) not in text:
if f'-{type}-' not in text:
continue

for subType in ('generic', 'lowlatency', 'snapdragon'):
if '-{0}'.format(subType) in text:
files['{0}-{1}-{2}-{3}'.format(releaseType, type, subType, arch)] = text
if f'-{subType}' in text:
files[f'{releaseType}-{type}-{subType}-{arch}'] = text
foundCurrent = True
break

if (not foundCurrent) and '-headers-' in text:
files['{0}-headers-all'.format(releaseType)] = text
files[f'{releaseType}-headers-all'] = text

return files

@@ -190,7 +187,7 @@ def downloadAndRepack(self, releaseLink, releaseName, releaseType, pkgName, file
extractFolder = os.path.join(self.tmpDir, uuid.uuid4().hex)
controlFilename = os.path.join(extractFolder, 'DEBIAN', 'control')
postrmFilename = os.path.join(extractFolder, 'DEBIAN', 'postrm')
link = 'https://kernel.ubuntu.com/~kernel-ppa/mainline/{0}/{1}'.format(releaseLink, filename)
link = f'https://kernel.ubuntu.com/~kernel-ppa/mainline/{releaseLink}/{filename}'

# Create a temporary folder for the repackaging
if os.path.exists(extractFolder):
@@ -211,15 +208,23 @@ def downloadAndRepack(self, releaseLink, releaseName, releaseType, pkgName, file
releaseName = '-'.join(names)

# Download the .deb
self.log('Downloading package {0} (release v{1})'.format(pkgName, releaseName))
logging.info(f'Downloading package {pkgName} (release v{releaseName})')
Utils.downloadFile(link, debFilename)

# Extract the .deb file
os.system('dpkg-deb -R {0} {1}'.format(debFilename, extractFolder))
result = Utils.run_process(['dpkg-deb', '-R', debFilename, extractFolder])

if result.failed:
self.logger.add(f'Could not extract {os.path.basename(debFilename)} (error code {result.exit_code})!', alert=True)
self.logger.add(result.get_output(), pre=True)
self.logger.send_all()
return

os.remove(debFilename)

if not os.path.exists(controlFilename):
self.log('No control file for {0}...'.format(pkgName))
self.logger.add(f'No control file for {pkgName}...', alert=True)
self.logger.send_all()
return

# Rewrite the control file
@@ -232,9 +237,9 @@ def downloadAndRepack(self, releaseLink, releaseName, releaseType, pkgName, file
# For example, generic packages will conflict with lowlatency and snapdragon packages
for i, line in enumerate(controlLines):
if line.startswith('Package:'):
controlLines[i] = 'Package: {0}'.format(pkgName)
controlLines[i] = f'Package: {pkgName}'
elif line.startswith('Version:'):
controlLines[i] = 'Version: {0}'.format(releaseName)
controlLines[i] = f'Version: {releaseName}'
elif line.startswith('Depends: '):
dependencies = [dep for dep in line[len('Depends: '):].split(', ') if not dep.startswith('linux-')]

@@ -276,7 +281,14 @@ def downloadAndRepack(self, releaseLink, releaseName, releaseType, pkgName, file
f.write('\n'.join(postrmLines))

# Repack the .deb file
os.system('dpkg-deb -b {0} {1}'.format(extractFolder, debFilename))
result = Utils.run_process(['dpkg-deb', '-b', extractFolder, debFilename])

if result.failed:
self.logger.add(f'Could not pack {os.path.basename(debFilename)} (error code {result.exit_code})!', alert=True)
self.logger.add(result.get_output(), pre=True)
self.logger.send_all()
return

self.pkgList.addDebToPool(debFilename)

# Remove the temporary extract folder
@@ -285,7 +297,7 @@ def downloadAndRepack(self, releaseLink, releaseName, releaseType, pkgName, file

def downloadAndRepackAll(self, releaseLink, releaseName, releaseType):
# Download the file list for this release
self.log('Downloading release: {0}'.format(releaseType))
logging.info(f'Downloading release: {releaseType}')

files = self.getFiles(releaseLink, releaseType)
requiredTypes = ['image', 'modules', 'headers']
Expand All @@ -303,7 +315,8 @@ def downloadAndRepackAll(self, releaseLink, releaseName, releaseType):
currentTypes.append(type)

if len(currentTypes) != len(requiredTypes):
self.log('Release is not yet ready: {0}'.format(releaseType))
self.logger.add(f'Release is not yet ready: {releaseType}')
self.logger.send_all()
return False

downloaded = False
Expand All @@ -312,7 +325,7 @@ def downloadAndRepackAll(self, releaseLink, releaseName, releaseType):
for pkgName, filename in files.items():
# Check our cache
if self.fileCache.get(pkgName, None) == filename:
self.log('Skipping package {0}.'.format(pkgName))
logging.info(f'Skipping package {pkgName}.')
continue

# Download and repack
@@ -347,3 +360,4 @@ def updateCache(self):
def publishRepository(self):
# If temporary directory doesn't exist, nothing matters
self.pkgList.saveAllDistributions(['l', 'custom'])
self.pkgList.sendEmbeddedReport()
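The diff replaces `os.system(...)` calls with `Utils.run_process(...)` and reads `result.failed`, `result.exit_code`, and `result.get_output()`, but that helper is not part of this excerpt. A plausible sketch built on `subprocess` follows — the names match the call sites, while the implementation details are guessed:

```python
import subprocess

class ProcessResult:
    """Holds the exit code and combined output of a finished process."""

    def __init__(self, exit_code, output):
        self.exit_code = exit_code
        self.output = output

    @property
    def failed(self):
        return self.exit_code != 0

    def get_output(self):
        return self.output

def run_process(args):
    # Capture stdout and stderr together so error reports contain both streams.
    proc = subprocess.run(
        args, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    return ProcessResult(proc.returncode, proc.stdout.decode('utf-8', 'replace'))
```

Unlike `os.system`, this keeps the command's output available for the webhook error report instead of letting it go straight to the console.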
46 changes: 21 additions & 25 deletions kernelcollector/PackageDistribution.py
@@ -1,19 +1,19 @@
from deb_pkg_tools.control import unparse_control_fields
from datetime import datetime
from . import Utils
import traceback, logging, gzip, os
import gnupg
import gzip, os

gpg = gnupg.GPG()
gpg.encoding = 'utf-8'

class PackageDistribution(object):

def __init__(self, name, architectures, description, verbose=True):
def __init__(self, logger, name, architectures, description):
self.logger = logger
self.name = name
self.architectures = architectures
self.description = description
self.verbose = verbose

def getName(self):
return self.name
@@ -48,17 +48,22 @@ def setPackageList(self, pkgList):
os.makedirs(self.folder)

def getArchDir(self, arch):
return os.path.join(self.folder, 'main', 'binary-{0}'.format(arch))
return os.path.join(self.folder, 'main', f'binary-{arch}')

def log(self, message):
if self.verbose:
print(message)
def signFile(self, filename, content, detach=False):
with open(filename, 'w') as file:
try:
file.write(str(gpg.sign(content, detach=detach, keyid=self.pkgList.gpgKey, passphrase=self.pkgList.gpgPassword)))
except:
self.logger.add(f'Could not sign {filename}! Please check your GPG keys!', alert=True)
self.logger.add(traceback.format_exc(), pre=True)
self.logger.send_all()

def save(self, releases):
mainDir = os.path.join(self.folder, 'main')
archToPackages = {arch: [] for arch in self.architectures}

self.log("Writing package list to disk...")
logging.info('Writing package list to disk...')

# Associate our packages with architectures.
for release in releases:
@@ -82,7 +87,7 @@ def save(self, releases):
with open(os.path.join(archDir, 'Release'), 'w') as file:
file.write('\n'.join([
'Component: main', 'Origin: linux-kernel', 'Label: linux-kernel',
'Architecture: {0}'.format(arch), 'Description: {0}'.format(self.description)
f'Architecture: {arch}', f'Description: {self.description}'
]))

packages = '\n'.join(archToPackages[arch])
@@ -107,28 +112,19 @@

md5, sha1, sha256 = Utils.getAllHashes(fullPath)
size = str(os.path.getsize(fullPath))
md5s.append(' {0} {1} {2}'.format(md5, size, displayPath))
sha1s.append(' {0} {1} {2}'.format(sha1, size, displayPath))
sha256s.append(' {0} {1} {2}'.format(sha256, size, displayPath))
md5s.append(f' {md5} {size} {displayPath}')
sha1s.append(f' {sha1} {size} {displayPath}')
sha256s.append(f' {sha256} {size} {displayPath}')

# Save the final package list, signing
release = '\n'.join([
'Origin: linux-kernel', 'Label: linux-kernel', 'Suite: {0}'.format(self.name), 'Codename: {0}'.format(self.name), 'Date: {0}'.format(date),
'Architectures: {0}'.format(' '.join(self.architectures)), 'Components: main', 'Description: {0}'.format(self.description),
'Origin: linux-kernel', 'Label: linux-kernel', f'Suite: {self.name}', f'Codename: {self.name}', f'Date: {date}',
'Architectures: {0}'.format(' '.join(self.architectures)), 'Components: main', f'Description: {self.description}',
'MD5Sum:\n{0}'.format('\n'.join(md5s)), 'SHA1:\n{0}'.format('\n'.join(sha1s)), 'SHA256:\n{0}'.format('\n'.join(sha256s))
])

with open(os.path.join(self.folder, 'Release'), 'w') as file:
file.write(release)

with open(os.path.join(self.folder, 'InRelease'), 'w') as file:
try:
file.write(str(gpg.sign(release, keyid=self.pkgList.gpgKey, passphrase=self.pkgList.gpgPassword)))
except:
self.log("Couldn't sign InRelease :( Check your GPG keys!")

with open(os.path.join(self.folder, 'Release.gpg'), 'w') as file:
try:
file.write(str(gpg.sign(release, detach=True, keyid=self.pkgList.gpgKey, passphrase=self.pkgList.gpgPassword)))
except:
self.log("Couldn't sign Release.gpg :( Check your GPG keys!")
self.signFile(os.path.join(self.folder, 'InRelease'), release, detach=False)
self.signFile(os.path.join(self.folder, 'Release.gpg'), release, detach=True)
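On the client side, a repository signed this way is typically consumed with an APT source entry along these lines (the domain and keyring path are placeholders; `sid` and `main` match this commit's defaults):

```
deb [signed-by=/usr/share/keyrings/kernelcollector.gpg] https://repo.example.com/ sid main
```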