Changes from all commits
67 commits
550cc8c
added bert as a test inference benchmark
McLavish Oct 30, 2025
f9c3817
hotfix to enable gpu capabilities
McLavish Nov 3, 2025
0f93b66
added pre-commit hooks for linting and formatting
McLavish Nov 4, 2025
b965d7b
linting and formatting setting for whoever uses vscode + black + flak…
McLavish Nov 4, 2025
e9916db
reformatted local file so it passes linting/format
McLavish Nov 4, 2025
2b75311
Merge branch 'development' into feature/bert-inference
McLavish Nov 4, 2025
813af03
bert now uses gpu
McLavish Nov 4, 2025
3a96f04
changed data repo to be OUR forked data repo
McLavish Nov 4, 2025
d4d5d30
change data loading path to own forked repo
Russellpang Nov 5, 2025
1b7deb7
change data loading path to own forked repo
Russellpang Nov 5, 2025
668652c
update benchmark function
Russellpang Nov 5, 2025
aae1023
fix: replaced onnxruntime requirement from CPU to GPU. now it actuall…
McLavish Nov 5, 2025
25fd1d9
circleci mypy fix?
McLavish Nov 5, 2025
c478c91
Merge pull request #2 from McLavish/feature/bert-inference
McLavish Nov 5, 2025
d6c4227
benchmarks is now flake8/black compliant. pre-commit hooks also check…
McLavish Nov 5, 2025
27b14d6
add linalg benchmarks
Russellpang Nov 5, 2025
ace2335
add linalg benchmarks
Russellpang Nov 5, 2025
ad3023d
changed CI/CD to run linting on the benchmarks folder ONLY. disabled …
McLavish Nov 5, 2025
52f30c0
fix typo
Russellpang Nov 6, 2025
4efff4d
update code
Russellpang Nov 6, 2025
adf54a5
migrated from CircleCI to Github Actions
McLavish Nov 6, 2025
67772e2
fixed workflow directory
McLavish Nov 6, 2025
8f02b66
pip dependencies take too long
McLavish Nov 6, 2025
ae61e4b
Merge pull request #8 from McLavish/hotfix/code-quality-on-benchmarks
McLavish Nov 6, 2025
e06985c
new benchmark data
McLavish Nov 10, 2025
037f6c3
Bring folder from other-branch
Nov 12, 2025
377d949
update code
Nov 12, 2025
8dd8a6e
modify code and requirements
Nov 12, 2025
de15075
unfinished new fuc
Russellpang Nov 12, 2025
3006879
add new functions
Russellpang Nov 12, 2025
d224ddc
add new functions
Russellpang Nov 13, 2025
921f321
added recommender benchmark
McLavish Nov 13, 2025
dd840d1
Merge branch 'development' into feature/russell
YuxuanLiu-kayla Nov 13, 2025
4fca4aa
changed data submodule to use ssh and not https
McLavish Nov 13, 2025
26dfcf4
add channel_flow, compute, fft, and resnet of jax_npbench
down-street Nov 15, 2025
fad77da
reset the config
down-street Nov 15, 2025
e995e6a
Merge pull request #13 from McLavish/jiahao/npbenchs
down-street Nov 15, 2025
7e0d13f
microbenchmark example
Nov 16, 2025
942f5a1
Remove SSH public key from eval command
Russellpang Nov 16, 2025
6bc1dd7
Remove local_deployment.json configuration
Russellpang Nov 16, 2025
460ea1f
Delete out_storage.json configuration file
Russellpang Nov 16, 2025
de41ab6
Remove SSH private key from eval command
Russellpang Nov 16, 2025
ded520f
remove garbage
Nov 16, 2025
5c85980
test
Russellpang Nov 17, 2025
c5782dd
test
Russellpang Nov 17, 2025
2b52ced
test
Russellpang Nov 17, 2025
e5cb20c
Merge branch 'development' into feature/russell
Russellpang Nov 17, 2025
6488d6d
remove unnecessay files
Russellpang Nov 17, 2025
55c4ac4
fuck you
Russellpang Nov 17, 2025
b97b7a5
Refactor argument parsing for cleaner syntax
Russellpang Nov 17, 2025
1998b6b
Change 'reps' to 'iters' in jacobi2d function
Russellpang Nov 17, 2025
2cbd768
Delete benchmarks/000.microbenchmarks/050.matmul directory
Russellpang Nov 17, 2025
074d4b7
Merge pull request #6 from McLavish/feature/russell
McLavish Nov 17, 2025
efced9c
Revert "changed data submodule to use ssh and not https"
McLavish Nov 17, 2025
bc48b5e
fix: missing config.json
McLavish Nov 17, 2025
e154ba0
Merge branch 'development' into feature/inference-recommender
McLavish Nov 17, 2025
d9ed506
Merge pull request #11 from McLavish/feature/inference-recommender
McLavish Nov 17, 2025
e120167
add microbenchmark gpu latency
JessieeeNotLi Nov 19, 2025
cedcf15
ran benchmark on gpu
Nov 19, 2025
6d28eef
Bump Black to support Python 3.12
Nov 19, 2025
55960b7
Add benchmark documentation for GPU Cache Latency
JessieeeNotLi Nov 19, 2025
ba8f800
Add readme for GPU Cache Latency benchmark
JessieeeNotLi Nov 19, 2025
9632860
changed to match repo structure
JessieeeNotLi Nov 21, 2025
a1caa5a
removed unnecesary commits
JessieeeNotLi Nov 22, 2025
38e0022
added wrap divergence
JessieeeNotLi Nov 25, 2025
947eddd
update black
JessieeeNotLi Dec 15, 2025
59e8efd
fixed formatting
JessieeeNotLi Dec 15, 2025
81 changes: 0 additions & 81 deletions .circleci/config.yml

This file was deleted.

55 changes: 55 additions & 0 deletions .github/workflows/lint.yml
@@ -0,0 +1,55 @@
name: Lint

on:
  push:
  pull_request:

jobs:
  linting:
    runs-on: ubuntu-latest

    steps:
      - name: Check out code
        uses: actions/checkout@v4

      - name: Set up Python
        id: setup-python
        uses: actions/setup-python@v5
        with:
          python-version: '3.x'

      - name: Cache virtualenv
        uses: actions/cache@v4
        with:
          path: python-venv
          key: venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('requirements.txt') }}-${{ github.ref_name }}
          restore-keys: |
            venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-${{ hashFiles('requirements.txt') }}-
            venv-${{ runner.os }}-${{ steps.setup-python.outputs.python-version }}-

      - name: Install system packages
        run: |
          sudo apt-get update
          sudo apt-get install -y libcurl4-openssl-dev

      - name: Install Python dependencies (via install.py)
        run: |
          python3 install.py --no-aws --no-azure --no-gcp --no-openwhisk --no-local

      - name: Black (check)
        run: |
          . python-venv/bin/activate
          black benchmarks --check --config .black.toml

      - name: Flake8 (lint)
        run: |
          . python-venv/bin/activate
          # write to file and echo to stdout (requires flake8 with --tee support)
          flake8 benchmarks --config=.flake8.cfg --tee --output-file flake-reports

      - name: Upload flake report
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: flake-reports
          path: flake-reports
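Note: the two quality gates above can be reproduced locally before pushing; from the repository root, inside the project virtualenv, the same commands apply: black benchmarks --check --config .black.toml and flake8 benchmarks --config=.flake8.cfg.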
2 changes: 1 addition & 1 deletion .gitmodules
@@ -3,4 +3,4 @@
url = https://github.com/mcopik/pypapi.git
[submodule "benchmarks-data"]
path = benchmarks-data
url = https://github.com/spcl/serverless-benchmarks-data.git
url = https://github.com/McLavish/serverless-benchmarks-data-dphpc.git
3 changes: 3 additions & 0 deletions .mypy.ini
@@ -3,6 +3,9 @@
[mypy-docker]
ignore_missing_imports = True

[mypy-docker.*]
ignore_missing_imports = True

[mypy-tzlocal]
ignore_missing_imports = True

30 changes: 30 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,30 @@
# .pre-commit-config.yaml
repos:
  - repo: local
    hooks:
      - id: flake8-local
        name: flake8 (project env)
        language: python
        additional_dependencies: ["flake8==7.1.1"]
        entry: flake8
        args: ["--config=.flake8.cfg"]
        types: [python]
        files: ^(sebs/|benchmarks/)
  - repo: local
    hooks:
      - id: black-check-local
        name: black --check (project env)
        language: python
        additional_dependencies: ["black==24.4.2"]
        entry: black
        args: ["--config=.black.toml", "--check", "--diff"]
        types: [python]
        files: ^(sebs/|benchmarks/)
  # - repo: local
  #   hooks:
  #     - id: mypy-local
  #       name: mypy (project venv)
  #       language: system
  #       entry: bash -lc 'python -m mypy --config-file=.mypy.ini sebs'
  #       types: [python]
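Once this configuration is in place, the hooks are enabled per clone with the standard pre-commit commands (pre-commit install to register the git hook, pre-commit run --all-files to check the whole tree); these commands come from the pre-commit tool itself and are not added by this pull request.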

15 changes: 15 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,15 @@
{
  "[python]": {
    "editor.defaultFormatter": "ms-python.black-formatter",
    "editor.formatOnSave": true
  },

  "black-formatter.importStrategy": "fromEnvironment",
  "black-formatter.path": [],
  "black-formatter.args": ["--config=.black.toml"],

  "flake8.importStrategy": "fromEnvironment",
  "flake8.path": [],
  "flake8.args": ["--config=.flake8.cfg"],
  "flake8.enabled": true
}
13 changes: 6 additions & 7 deletions benchmarks/000.microbenchmarks/010.sleep/input.py
@@ -1,12 +1,11 @@
size_generators = {"test": 1, "small": 100, "large": 1000}

size_generators = {
    'test' : 1,
    'small' : 100,
    'large': 1000
}

def buckets_count():
    return (0, 0)

def generate_input(data_dir, size, benchmarks_bucket, input_paths, output_paths, upload_func, nosql_func):
    return { 'sleep': size_generators[size] }

def generate_input(
    data_dir, size, benchmarks_bucket, input_paths, output_paths, upload_func, nosql_func
):
    return {"sleep": size_generators[size]}
6 changes: 3 additions & 3 deletions benchmarks/000.microbenchmarks/010.sleep/python/function.py
@@ -1,9 +1,9 @@

from time import sleep


def handler(event):

    # start timing
    sleep_time = event.get('sleep')
    sleep_time = event.get("sleep")
    sleep(sleep_time)
    return { 'result': sleep_time }
    return {"result": sleep_time}
10 changes: 6 additions & 4 deletions benchmarks/000.microbenchmarks/020.network-benchmark/input.py
@@ -2,10 +2,12 @@ def buckets_count():
    return 0, 1


def generate_input(data_dir, size, benchmarks_bucket, input_paths, output_paths, upload_func, nosql_func):
def generate_input(
    data_dir, size, benchmarks_bucket, input_paths, output_paths, upload_func, nosql_func
):
    return {
        'bucket': {
            'bucket': benchmarks_bucket,
            'output': output_paths[0],
        "bucket": {
            "bucket": benchmarks_bucket,
            "output": output_paths[0],
        },
    }
@@ -1,27 +1,26 @@
import csv
import json
import os.path
import socket
from datetime import datetime
from time import sleep

from . import storage


def handler(event):

    request_id = event['request-id']
    address = event['server-address']
    port = event['server-port']
    repetitions = event['repetitions']
    output_bucket = event.get('bucket').get('bucket')
    output_prefix = event.get('bucket').get('output')
    request_id = event["request-id"]
    address = event["server-address"]
    port = event["server-port"]
    repetitions = event["repetitions"]
    output_bucket = event.get("bucket").get("bucket")
    output_prefix = event.get("bucket").get("output")
    times = []
    i = 0
    socket.setdefaulttimeout(3)
    server_socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server_socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server_socket.bind(('', 0))
    message = request_id.encode('utf-8')
    server_socket.bind(("", 0))
    message = request_id.encode("utf-8")
    adr = (address, port)
    consecutive_failures = 0
    while i < repetitions + 1:
@@ -43,16 +42,16 @@ def handler(event):
        consecutive_failures = 0
        server_socket.settimeout(2)
    server_socket.close()

    if consecutive_failures != 5:
        with open('/tmp/data.csv', 'w', newline='') as csvfile:
            writer = csv.writer(csvfile, delimiter=',')
            writer.writerow(["id", "client_send", "client_rcv"])
        with open("/tmp/data.csv", "w", newline="") as csvfile:
            writer = csv.writer(csvfile, delimiter=",")
            writer.writerow(["id", "client_send", "client_rcv"])
            for row in times:
                writer.writerow(row)

        client = storage.storage.get_instance()
        filename = 'results-{}.csv'.format(request_id)
        key = client.upload(output_bucket, os.path.join(output_prefix, filename), '/tmp/data.csv')
        filename = "results-{}.csv".format(request_id)
        key = client.upload(output_bucket, os.path.join(output_prefix, filename), "/tmp/data.csv")

    return { 'result': key }
    return {"result": key}
@@ -1,12 +1,13 @@


def buckets_count():
    return 0, 1

def generate_input(data_dir, size, benchmarks_bucket, input_paths, output_paths, upload_func, nosql_func):

def generate_input(
    data_dir, size, benchmarks_bucket, input_paths, output_paths, upload_func, nosql_func
):
    return {
        'bucket': {
            'bucket': benchmarks_bucket,
            'output': output_paths[0],
        "bucket": {
            "bucket": benchmarks_bucket,
            "output": output_paths[0],
        },
    }