
Jofthe v patch monitoring 1.0.1 #1

Workflow file for this run

# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL Advanced"
on:
push:
branches: [ "master" ]
pull_request:
branches: [ "master" ]
schedule:
- cron: '34 20 * * 1'
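# For reference, '34 20 * * 1' fires every Monday at 20:34 UTC. A daily
# scan instead would look like this (hypothetical alternative, commented out):
# schedule:
# - cron: '0 6 * * *'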
jobs:
analyze:
name: Analyze (${{ matrix.language }})
# Runner size impacts CodeQL analysis time. To learn more, please see:
# - https://gh.io/recommended-hardware-resources-for-running-codeql
# - https://gh.io/supported-runners-and-hardware-resources
# - https://gh.io/using-larger-runners (GitHub.com only)
# Consider using larger runners or machines with greater resources for possible analysis time improvements.
runs-on: ${{ (matrix.language == 'swift' && 'macos-latest') || 'ubuntu-latest' }}
permissions:
# required for all workflows
security-events: write
# required to fetch internal or private CodeQL packs
packages: read
# only required for workflows in private repositories
actions: read
contents: read
strategy:
fail-fast: false
matrix:
include:
- language: c-cpp
build-mode: autobuild
# CodeQL supports the following values for 'language': 'c-cpp', 'csharp', 'go', 'java-kotlin', 'javascript-typescript', 'python', 'ruby', 'swift'
# Use 'c-cpp' to analyze code written in C, C++ or both
# Use 'java-kotlin' to analyze code written in Java, Kotlin or both
# Use 'javascript-typescript' to analyze code written in JavaScript, TypeScript or both
# To learn more about changing the languages that are analyzed or customizing the build mode for your analysis,
# see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/customizing-your-advanced-setup-for-code-scanning.
# If you are analyzing a compiled language, you can modify the 'build-mode' for that language to customize how
# your codebase is analyzed, see https://docs.github.com/en/code-security/code-scanning/creating-an-advanced-setup-for-code-scanning/codeql-code-scanning-for-compiled-languages
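# For example, to also scan JavaScript/TypeScript with the dependency-free
# 'none' build mode, a second matrix entry could look like this (a sketch;
# commented out here, adjust to the languages actually in this repository):
# - language: javascript-typescript
# build-mode: none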
steps:
- name: Checkout repository
uses: actions/checkout@v4
# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v3
with:
languages: ${{ matrix.language }}
build-mode: ${{ matrix.build-mode }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.
# For more details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality
# If the analyze step fails for one of the languages you are analyzing with
# "We were unable to automatically build your code", modify the matrix above
# to set the build mode to "manual" for that language. Then modify this step
# to build your code.
# ℹ️ Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun
- if: matrix.build-mode == 'manual'
shell: bash
run: |
echo 'If you are using a "manual" build mode for one or more of the' \
'languages you are analyzing, replace this with the commands to build' \
'your code, for example:'
echo ' make bootstrap'
echo ' make release'
exit 1
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v3
with:
category: "/language:${{matrix.language}}"
- name: Lightstep Pre-Deploy Check
# You may pin to the exact commit or the version.
# uses: lightstep/lightstep-action-predeploy@22bec553a6d0fb3de5026acb1159085601f76408
uses: lightstep/lightstep-action-predeploy@22bec553a6d0fb3de5026acb1159085601f76408
with:
# The organization associated with your Lightstep account (usually your company name)
lightstep_organization: # optional
# The Lightstep project associated with this repository
lightstep_project: # optional
# The key to access the Lightstep Public API
lightstep_api_key: # optional
# The token to access the Rollbar API
rollbar_api_token: # optional
# The token to access the PagerDuty API
pagerduty_api_token: # optional
# If set to true, collapse all conditions to a single table row
rollup_conditions: # optional
# If set to true, will not add a comment to pull-requests
disable_comment: # optional
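# A filled-in sketch, assuming hypothetical secret names; keep API keys in
# encrypted repository secrets rather than plain text:
# lightstep_organization: my-org
# lightstep_project: my-project
# lightstep_api_key: ${{ secrets.LIGHTSTEP_API_KEY }}
# rollbar_api_token: ${{ secrets.ROLLBAR_API_TOKEN }}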
- name: elmah.io Upload Source Map Action
# You may pin to the exact commit or the version.
# uses: elmahio/github-upload-source-map-action@02d3e0a71cb7741a98860405f287c4abf95f62bc
uses: elmahio/github-upload-source-map-action@v1
with:
# An API key with permission to upload source maps.
apiKey:
# The ID of the log which should contain the minified JavaScript and source map.
logId:
# A URL to the online minified JavaScript file. The URL can be absolute or relative but will always be converted to a relative path (no protocol, domain, or query parameters). elmah.io uses this path to look up any lines in a JS stack trace that need de-minification.
path:
# A path to the source map file. Only files with an extension of .map and content type of application/json will be accepted.
sourceMap:
# A path to the minified JavaScript file. Only files with an extension of .js and content type of text/javascript will be accepted.
minifiedJavaScript:
- name: Publish event
# You may pin to the exact commit or the version.
# uses: fiberplane/publish-event@63e125ddca44bfb308eec949bcb22f80230394b0
uses: fiberplane/publish-event@63e125ddca44bfb308eec949bcb22f80230394b0
with:
# API token used to access the Fiberplane API with
api-token:
# Title of the newly created event
title: # default is GitHub Action
# Labels to add to the event. Format: key=value|key=value|key=value
labels:
# Time at which the event occurred. Defaults to current time. Format should be a RFC 3339 formatted string
time: # optional
# ID of the workspace to which the event should be posted
workspace-id:
# Base URL of the Fiberplane API
fp-base-url: # default is https://studio.fiberplane.com
- name: rootly-pulse
# You may pin to the exact commit or the version.
# uses: rootlyhq/pulse-action@7aa3a8baf889ff8b37a489dde53edece73b24a64
uses: rootlyhq/pulse-action@7aa3a8baf889ff8b37a489dde53edece73b24a64
with:
# Summary of the pulse
summary:
# An API key for Rootly
api_key:
# Environments associated with the pulse. Separate with commas.
environments: # optional
# Services associated with the pulse. Separate with commas.
services: # optional
# Labels associated with the pulse. Separate with commas and separate key-value pair with = (no spaces before or after =).
labels: # optional
# Source of the pulse
source: # optional
# Refs associated with the pulse. Separate with commas and separate key-value pair with = (no spaces before or after =).
refs: # optional
- name: Sync Templates
# You may pin to the exact commit or the version.
# uses: fiberplane/sync-templates@e35786a91f4d6ec8f9b4df0ccfa66770cff78083
uses: fiberplane/sync-templates@v1
with:
# API token used to access the Fiberplane API with
api-token:
# ID of the workspace to which the templates should be uploaded to
workspace-id:
# Base URL of the Fiberplane API
fp-base-url: # optional, default is https://studio.fiberplane.com
# Custom directory that should be monitored for Template JSONNET files (default: .fiberplane/templates/)
templates-directory: # optional, default is .fiberplane/templates/
# Version of the Fiberplane CLI to use (latest by default)
fp-version: # optional, default is latest
- name: Deploy Prometheus and Grafana
# You may pin to the exact commit or the version.
# uses: bitovi/github-actions-deploy-prometheus@60abab51796e327667fc11d63a8ac75b9e2834b9
uses: bitovi/github-actions-deploy-prometheus@60abab51796e327667fc11d63a8ac75b9e2834b9
with:
# Specifies if this action should checkout the code
checkout: # optional, default is true
# AWS access key ID
aws_access_key_id:
# AWS secret access key
aws_secret_access_key:
# AWS session token, if you're using temporary credentials
aws_session_token: # optional
# AWS default region
aws_default_region: # default is us-east-1
# The unique AWS resource identifier for the deployment. Auto-generated by default so it's unique per org/repo/branch; set to override with a custom name. Defaults to `${org}-${repo}-${branch}`.
aws_resource_identifier: # optional
# A list of additional tags that will be included on created resources. Example: `{"key1": "value1", "key2": "value2"}`
aws_extra_tags: # optional, default is {}
# Secret name to pull env variables from AWS Secret Manager, could be a comma separated list, read in order. Expected JSON content.
env_aws_secret: # optional
# File containing environment variables to be used with the app
env_repo: # optional
# `.env` file to be used with the app from Github secrets
env_ghs: # optional
# `.env` file to be used with the app from Github variables
env_ghv: # optional
# The AWS EC2 instance type
aws_ec2_instance_type: # optional, default is t2.medium
# The AWS IAM instance profile to use for the EC2 instance. Use if you want to pass an AWS role with specific permissions granted to the instance
aws_ec2_instance_profile: # optional
# Creates a secret in AWS Secrets Manager to store a keypair
aws_ec2_create_keypair_sm: # optional
# Root disk size for the EC2 instance
aws_ec2_instance_vol_size: # optional, default is 10
# A JSON object of additional tags that will be included on created resources. Example: `{"key1": "value1", "key2": "value2"}`
aws_ec2_additional_tags: # optional
# AMI filter to use when searching for an AMI to use for the EC2 instance. Defaults to `ubuntu/images/hvm-ssd/ubuntu-focal-20.04-amd64-server-*`
aws_ec2_ami_filter: # optional, default is ubuntu/images/hvm-ssd/ubuntu-focal-20.04-amd64-server-*
# Set to true to provision infrastructure (with Terraform) but skip the app deployment (with ansible)
infrastructure_only: # optional, default is false
# Path to the grafana datasource directory
grafana_datasource_dir: # optional, default is observability/grafana/datasources
# Path to the prometheus config file
prometheus_config: # optional, default is observability/prometheus/prometheus.yml
# Set to "true" to Destroy the created AWS infrastructure for this instance
tf_stack_destroy: # optional, default is false
# Change this to anything you want, but be careful to stay consistent. A missing file could trigger recreation, or step over destruction of non-defined objects.
tf_state_file_name: # optional
# Append a string to the tf-state-file. Setting this to `unique` will generate `tf-state-aws-unique`. Can co-exist with the tf_state_file_name variable.
tf_state_file_name_append: # optional
# AWS S3 bucket to use for Terraform state. Defaults to `${org}-${repo}-${branch}-tf-state-aws`
tf_state_bucket: # optional
# Force purge and deletion of S3 tf_state_bucket defined. Any file contained there will be destroyed. `tf_stack_destroy` must also be `true`
tf_state_bucket_destroy: # optional
# Define the root domain name for the application. e.g. bitovi.com. If empty, ELB URL will be provided.
aws_domain_name: # optional
# Define the sub-domain part of the URL. Defaults to `${org}-${repo}-${branch}`
aws_sub_domain: # optional
# Deploy application to root domain. Will create root and www DNS records. Domain must exist in Route53.
aws_root_domain: # optional
# Existing certificate ARN to be used in the ELB. Use if you manage a certificate outside of this action. See https://docs.aws.amazon.com/acm/latest/userguide/gs-acm-list.html for how to find the certificate ARN.
aws_cert_arn: # optional
# Generates and manages the root certificate for the application to be used in the ELB.
aws_create_root_cert: # optional
# Generates and manages the sub-domain certificate for the application to be used in the ELB.
aws_create_sub_cert: # optional
# Set this to true if you do not want to use a certificate in the ELB.
aws_no_cert: # optional
# Define if a VPC should be created
aws_vpc_create: # optional
# Set a specific name for the VPC
aws_vpc_name: # optional
# Define Base CIDR block which is divided into subnet CIDR blocks. Defaults to 10.0.0.0/16.
aws_vpc_cidr_block: # optional
# Comma separated list of public subnets. Defaults to 10.10.110.0/24
aws_vpc_public_subnets: # optional
# Comma separated list of private subnets. If none, none will be created.
aws_vpc_private_subnets: # optional
# Comma separated list of availability zones. Defaults to `aws_default_region`.
aws_vpc_availability_zones: # optional
# AWS VPC ID. Accepts `vpc-###` values.
aws_vpc_id: # optional
# Specify a Subnet to be used with the instance. If none provided, will pick one.
aws_vpc_subnet_id: # optional
# A JSON object of additional tags that will be included on created resources. Example: `{"key1": "value1", "key2": "value2"}`
aws_vpc_additional_tags: # optional
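# A minimal invocation sketch: only the AWS credentials are strictly
# required, and they should come from repository secrets (hypothetical
# secret names shown):
# aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
# aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
# aws_default_region: us-east-1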
- name: Annotate Nobl9 SLO
# You may pin to the exact commit or the version.
# uses: nobl9/action-annotate-slo@1b79aa07d76525d32ec2d468f8afd1858e160cc1
uses: nobl9/action-annotate-slo@1b79aa07d76525d32ec2d468f8afd1858e160cc1
with:
# Annotation to add
annotation:
# SLO to annotate
slo: # optional
# Labels to annotate
labels: # optional
# Project containing SLO to annotate
project:
# sloctl version to use
sloctl_version: # optional, default is 0.0.99
# Nobl9 client id
nobl9_client_id:
# Nobl9 client secret
nobl9_client_secret:
# Nobl9 okta org url
nobl9_okta_org_url: # optional, default is https://accounts.nobl9.com
# Nobl9 okta auth server
nobl9_okta_auth_server: # optional, default is auseg9kiegWKEtJZC416
# Nobl9 URL
nobl9_url: # optional, default is https://app.nobl9.com/api
- name: Autometrics Report
# You may pin to the exact commit or the version.
# uses: autometrics-dev/diff-metrics@778b3281f8446790af0afd766d5bb236defb5dde
uses: autometrics-dev/diff-metrics@778b3281f8446790af0afd766d5bb236defb5dde
with:
# Github token to use
gh-token:
# The list of rust project roots to check. One path per line
rs-roots: # optional
# The list of typescript project roots to check. One path per line
ts-roots: # optional
# The list of golang project roots to check. One path per line
go-roots: # optional
# The list of python project roots to check. One path per line
py-roots: # optional
# The number of days to keep the artifacts for. Defaults to 0 (inherits the policy from the repository)
retention-days: # optional, default is 0
# The version of am to download, skip patch or minor to act as a wildcard. "0.2" means ">=0.2.0 && <0.3.0", "1" means ">=1.0.0 && <2.0.0", etc.
am-version: # optional
- name: Instrument pipeline
# You may pin to the exact commit or the version.
# uses: autometrics-dev/instrument-pipeline@142e4e6cbc109bb37c705daa856b6462689b3ef8
uses: autometrics-dev/instrument-pipeline@142e4e6cbc109bb37c705daa856b6462689b3ef8
with:
# URL to the aggregation gateway, for example `http://localhost:9091`
pushgateway:
# Type of the aggregation gateway, one of `prometheus`, `gravel`, or `zapier`. Currently only changes the url format in case of prometheus
gatewaytype: # optional
# Comma separated list of buckets for duration histogram, with or without the brackets []
buckets: # optional
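# A sketch of plausible values (hypothetical gateway endpoint; bucket
# boundaries are histogram edges in seconds):
# pushgateway: http://localhost:9091
# gatewaytype: prometheus
# buckets: "[0.05, 0.1, 0.5, 1, 5, 10]"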
- name: Lightstep Services Change Report
# You may pin to the exact commit or the version.
# uses: lightstep/lightstep-action-snapshot@166ec5f31d611858ebe9ed3437848e8fe675fb89
uses: lightstep/lightstep-action-snapshot@166ec5f31d611858ebe9ed3437848e8fe675fb89
with:
# The organization associated with your Lightstep account (usually your company name)
lightstep_organization: # optional
# The Lightstep project associated with this repository
lightstep_project: # optional
# Only show services in the snapshot from this comma-separated list
lightstep_service_filter: # optional
# The query to use when taking a snapshot
lightstep_snapshot_query: # optional
# The Lightstep snapshot id to summarize
lightstep_snapshot_id: # optional
# The Lightstep snapshot id to compare with lightstep_snapshot_id
lightstep_snapshot_compare_id: # optional
# The key to access the Lightstep Public API
lightstep_api_key: # optional
# Github API Token
github_token: # optional
# If set to true, will not add a comment to pull-requests
disable_comment: # optional
- name: Setup Nsolid environment
# You may pin to the exact commit or the version.
# uses: nodesource/setup-nsolid@1ca68d2589d3d56ecd3881dfe6ffa87eeda9c939
uses: nodesource/setup-nsolid@1ca68d2589d3d56ecd3881dfe6ffa87eeda9c939
with:
# Node version to use. If no value is specified, the major version matching the requested nsolid-version is set up. E.g: 18.x, 20.x.
node-version: # optional
# Nsolid version to use. E.g: 5.0.5, 4.10.0, latest.
nsolid-version:
# Target operating system for Nsolid to use. E.g: linux, darwin, win32. Will use linux by default.
platform: # optional
# Target architecture for Node to use.
arch: # optional
- name: elmah.io Create Deployment Action
# You may pin to the exact commit or the version.
# uses: elmahio/github-create-deployment-action@132611db9161ecebb1b07db6510d4ea5e0d2d415
uses: elmahio/github-create-deployment-action@v1
with:
# An API key with permission to create deployments.
apiKey:
# The version number of this deployment. The value of version can be a SemVer compliant string or any other syntax that you are using as your version numbering scheme.
version:
# Optional description of this deployment. Can be markdown or clear text.
description: # optional
# The name of the person responsible for creating this deployment. This can be the name taken from your deployment server.
userName: # optional
# The email of the person responsible for creating this deployment. This can be the email taken from your deployment server.
userEmail: # optional
# As default, deployments are attached all logs of the organization. If you want a deployment to attach to a single log only, set this to the ID of that log.
logId: # optional
- name: Nobl9 sloctl action
# You may pin to the exact commit or the version.
# uses: nobl9/nobl9-action@b921770c1ed8d80a3dc04924074717683ea0ffa8
uses: nobl9/nobl9-action@b921770c1ed8d80a3dc04924074717683ea0ffa8
with:
# Client ID
client_id:
# Client Secret
client_secret:
# The path or glob pattern to the configuration in YAML format
sloctl_yml:
# Submits server-side request without persisting the configured resources
dry_run: # optional, default is false
- name: Datadog JUnitXML Upload
# You may pin to the exact commit or the version.
# uses: DataDog/junit-upload-github-action@c4b57b587ae0e3ed618a1f0e7a7d260cfde53032
uses: DataDog/junit-upload-github-action@c4b57b587ae0e3ed618a1f0e7a7d260cfde53032
with:
# (Deprecated) Datadog API key to use to upload the junit files.
api-key: # optional
# Datadog API key to use to upload the junit files.
api_key: # optional
# Service name to use with the uploaded test results.
service:
# (Deprecated) The Datadog site to upload the files to.
datadog-site: # optional, default is datadoghq.com
# The Datadog site to upload the files to.
site: # optional, default is datadoghq.com
# JUnit files to upload.
files: # default is .
# Controls the maximum number of concurrent file uploads.
concurrency: # default is 20
# The node version used to install datadog-ci
node-version: # default is 20
# Datadog tags to associate with the uploaded test results.
tags: # optional
# Datadog env to use for the tests.
env: # optional
# Set to "true" to enable forwarding content from XML reports as logs.
logs: # optional
# The version of the @datadog/datadog-ci package to use. It defaults to the latest release (`latest`).
datadog-ci-version: # optional, default is latest
# Extra args to be passed to the datadog-ci cli.
extra-args: # optional, default is empty
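# Example values (hypothetical service and tags); the API key should be a
# repository secret, and tags use the Datadog key:value form:
# api_key: ${{ secrets.DATADOG_API_KEY }}
# service: my-service
# files: ./reports/junit
# tags: team:backend,ci:github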
- name: Push Workflow Data to Tinybird
# You may pin to the exact commit or the version.
# uses: localstack/tinybird-workflow-push@518790fd8ad2665b06419c2588c744da355bb970
uses: localstack/tinybird-workflow-push@518790fd8ad2665b06419c2588c744da355bb970
with:
# Github token for receiving start and end time of the workflow
github_token:
# The token to authenticate with Tinybird
tinybird_token:
# The Tinybird datasource to push the data to
tinybird_datasource: # optional, default is ci_workflows
# The id of the workflow
workflow_id: # optional
# Optional input to manually override the outcome reported to Tinybird. By default the outcome is calculated using the worst outcome of all jobs in the current workflow run attempt.
outcome: # optional
- name: PagerDuty Change Events
# You may pin to the exact commit or the version.
# uses: PagerDuty/pagerduty-change-events-action@ec2c5d5cff79059924d663a7427733785626c3bf
uses: PagerDuty/pagerduty-change-events-action@ec2c5d5cff79059924d663a7427733785626c3bf
with:
# The integration key that identifies the service the change was made to.
integration-key:
# Custom event summary. If provided the GitHub event type is ignored and the given summary used. A link to the run is included in the event.
custom-event: # optional
- name: rss-to-issues
# You may pin to the exact commit or the version.
# uses: git-for-windows/rss-to-issues@60a6a47582d79d434bd18c6d3af3d9ab7356cf56
uses: git-for-windows/rss-to-issues@60a6a47582d79d434bd18c6d3af3d9ab7356cf56
with:
# The GITHUB_TOKEN secret
github-token:
# URL of the RSS/Atom feed
feed:
# Only look at feed items younger than this
max-age:
# Prefix added to the created issues' titles
prefix: # optional
# Labels to add, comma separated
labels: # optional
# Log issue creation but do nothing
dry-run: # optional
# Aggregate all items in a single issue
aggregate: # optional
# Limit the issue contents' size
character-limit: # optional
# Limit to feed items whose titles match this regular expression
title-pattern: # optional
# Limit to feed items whose contents match this regular expression
content-pattern: # optional
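# Example values (hypothetical feed URL; check the action's docs for the
# exact max-age format):
# github-token: ${{ secrets.GITHUB_TOKEN }}
# feed: https://example.com/feed.xml
# max-age: 48h
# labels: news,rss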
- name: Yor GitHub Action
# You may pin to the exact commit or the version.
# uses: bridgecrewio/yor-action@04bf3da0c4e8619a307c023ce8f0d196a2d8a4ee
uses: bridgecrewio/yor-action@04bf3da0c4e8619a307c023ce8f0d196a2d8a4ee
with:
# directory with infrastructure code to scan
directory: # optional, default is .
# comma delimited list of tag groups to apply
tag_groups: # optional
# comma delimited list of yor tags to apply
tag: # optional
# comma delimited list of yor tags to not apply
skip_tags: # optional
# comma delimited list of paths for yor to skip tagging of
skip_dirs: # optional
# comma delimited list of paths to external (custom) tags & tag groups plugins
custom_tags: # optional
# The format of the output. cli, json
output_format: # optional
# log level
log_level: # optional
# Choose whether the action will commit changes. Changes will be committed only if this is exactly "YES"
commit_changes: # optional, default is true
- name: Honeycomb Buildevents
# You may pin to the exact commit or the version.
# uses: honeycombio/gha-buildevents@e891e91ad0fcd80b71430a97ebf5a9baecac388a
uses: honeycombio/gha-buildevents@e891e91ad0fcd80b71430a97ebf5a9baecac388a
with:
# A Honeycomb API key - needed to send traces.
apikey:
# Defaults to https://api.honeycomb.io
apihost: # optional, default is https://api.honeycomb.io
# The Honeycomb dataset to send traces to.
dataset:
# Status of the job or workflow. Setting this signals when to end the trace.
status: # optional
# Unix timestamp to represent when the trace started. Not necessary for single job workflows. Send in final use of the action for multi-job workflows.
trace-start: # optional
# Set this to a key unique for this matrix cell, only useful when using a build matrix.
matrix-key: # optional
# Deprecated value - please use status instead
job-status: # optional
# (true/false) Whether to send an event representing the setup of this action.
send-init-event: # optional, default is true
- name: Load runner information
# You may pin to the exact commit or the version.
# uses: devops-actions/load-runner-info@7f8c07227aa6176e94e4eeb912016bb0a9d33796
uses: devops-actions/load-runner-info@7f8c07227aa6176e94e4eeb912016bb0a9d33796
with:
# Slug of the organization to analyze.
organization:
# Slug of the repository to analyze.
repo: # optional
# Access token to use for the analysis: admin:org scope for an organization, or repo scope when running against a single repository
accessToken:
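# Example values (hypothetical organization and secret name; per the note
# above, the token needs admin:org scope for organization-level analysis):
# organization: my-org
# accessToken: ${{ secrets.RUNNER_INFO_PAT }}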