
Commit c433d64

Merge pull request #101 from ezmsg-org/dev
Bulk update dev -> main for next release
2 parents 378d669 + d5ea943

File tree: 11 files changed, +535 / -787 lines


.github/workflows/python-publish-ezmsg-sigproc.yml

Lines changed: 2 additions & 2 deletions
@@ -17,8 +17,8 @@ jobs:
     steps:
     - uses: actions/checkout@v4

-    - name: Install uv
-      uses: astral-sh/setup-uv@v2
+    - name: Install the latest version of uv
+      uses: astral-sh/setup-uv@v6

     - name: Build Package
       run: uv build

.github/workflows/python-tests.yml

Lines changed: 4 additions & 10 deletions
@@ -13,7 +13,7 @@ jobs:
   build:
     strategy:
       matrix:
-        python-version: ["3.10", "3.11", "3.12"]
+        python-version: ["3.10.15", "3.11", "3.12", "3.13"]
         os:
           - "ubuntu-latest"
           - "windows-latest"
@@ -23,17 +23,11 @@
     steps:
     - uses: actions/checkout@v4

-    - name: Install uv
-      uses: astral-sh/setup-uv@v2
-      with:
-        enable-cache: true
-        cache-dependency-glob: "uv.lock"
-
-    - name: Set up Python ${{ matrix.python-version }}
-      run: uv python install ${{ matrix.python-version }}
+    - name: Install the latest version of uv
+      uses: astral-sh/setup-uv@v6

     - name: Install the project
-      run: uv sync --all-extras --dev
+      run: uv sync --python ${{ matrix.python-version }}

     - name: Lint
       run:

.gitignore

Lines changed: 2 additions & 1 deletion
@@ -142,4 +142,5 @@ cython_debug/
 # JetBrains
 .idea/

-src/ezmsg/sigproc/__version__.py
+src/ezmsg/sigproc/__version__.py
+uv.lock

README.md

Lines changed: 34 additions & 18 deletions
@@ -1,33 +1,48 @@
 # ezmsg.sigproc

-Timeseries signal processing implementations for ezmsg
+## Overview

-## Dependencies
-
-* `ezmsg`
-* `numpy`
-* `scipy`
-* `pywavelets`
+ezmsg-sigproc offers timeseries signal-processing primitives built atop the ezmsg message-passing framework. Core dependencies include ezmsg, numpy, scipy, pywavelets, and sparse; the project itself is managed through hatchling and uses VCS hooks to populate __version__.py.

 ## Installation

-### Release
-
 Install the latest release from pypi with: `pip install ezmsg-sigproc` (or `uv add ...` or `poetry add ...`).

-### Development Version
+You can install pre-release versions directly from GitHub:

-You can add the development version of `ezmsg-sigproc` to your project's dependencies in one of several ways.
+* Using `pip`: `pip install git+https://github.com/ezmsg-org/ezmsg-sigproc.git@dev`
+* Using `uv`: `uv add git+https://github.com/ezmsg-org/ezmsg-sigproc --branch dev`
+* Using `poetry`: `poetry add "git+https://github.com/ezmsg-org/ezmsg-sigproc.git@dev"`

-You can clone it and add its path to your project dependencies. You may wish to do this if you intend to edit `ezmsg-sigproc`. If so, please refer to the [Developers](#developers) section below.
+> See the [Development](#development) section below for installing with the intention of developing.

-You can also add it directly from GitHub:
+## Source layout & key modules
+* All source resides under src/ezmsg/sigproc, which contains a suite of processors (for example, filter.py, spectrogram.py, spectrum.py, sampler.py) and math and util subpackages.
+* The framework's backbone is base.py, defining standard protocols (Processor, Producer, Consumer, and Transformer) that enable both stateless and stateful processing chains.
+* Filtering is implemented in filter.py, providing settings dataclasses and a stateful transformer that applies supplied coefficients to incoming data.
+* Spectral analysis uses a composite spectrogram transformer chaining windowing, spectrum computation, and axis adjustments.

-* Using `pip`: `pip install git+https://github.com/ezmsg-org/ezmsg-sigproc.git@dev`
-* Using `poetry`: `poetry add "git+https://github.com/ezmsg-org/ezmsg-sigproc.git@dev"`
-* Using `uv`: `uv add git+https://github.com/ezmsg-org/ezmsg-sigproc --branch dev`
+## Operating styles: Standalone processors vs. ezmsg pipelines
+While each processor is designed to be assembled into an ezmsg pipeline, the components are also well suited for offline, ad-hoc analysis. You can instantiate processors directly in scripts or notebooks for quick prototyping or to validate results from other code. The companion Unit wrappers, however, are meant for assembling processors into a full ezmsg pipeline.
+
+A fully defined ezmsg pipeline shines in online streaming scenarios where message routing, scheduling, and latency handling are crucial. Nevertheless, you can run the same pipeline offline (say, within a Jupyter notebook) if your analysis benefits from ezmsg's structured execution model. Deciding between a standalone processor and a full pipeline comes down to the trade-off between simplicity and the operational overhead of the pipeline:
+
+* Standalone processors: Low overhead, ideal for one-off or exploratory offline tasks.
+* Pipeline + Unit wrappers: Additional setup cost but bring concurrency, standardized interfaces, and automatic message flow; useful when your offline experiment mirrors a live system or when you require fine-grained pipeline behavior.
+
+## Documentation & tests
+* `docs/ProcessorsBase.md` details the processor hierarchy and generic type patterns, providing a solid foundation for custom components.
+* Unit tests (e.g., `tests/unit/test_sampler.py`) offer concrete examples of usage, showcasing sampler generation, windowing, and message handling.
+
+## Where to learn next
+* Study docs/ProcessorsBase.md to master the processor architecture.
+* Explore unit tests for hands-on examples of composing processors and Units.
+* Review the ezmsg framework in pyproject.toml to understand the surrounding ecosystem.
+* Experiment with the code: try running processors standalone and then integrate them into a small pipeline to observe the trade-offs firsthand.
+
+This approach equips newcomers to choose the right level of abstraction (raw processor, Unit wrapper, or full pipeline) based on the demands of their analysis or streaming application.

-## Developers
+## Development

 We use [`uv`](https://docs.astral.sh/uv/getting-started/installation/) for development. It is not strictly required, but if you intend to contribute to ezmsg-sigproc then using `uv` will lead to the smoothest collaboration.

@@ -36,4 +51,5 @@ We use [`uv`](https://docs.astral.sh/uv/getting-started/installation/) for devel
 3. Open a terminal and `cd` to the cloned folder.
 4. `uv sync` to create a .venv and install dependencies.
 5. `uv run pre-commit install` to install pre-commit hooks to do linting and formatting.
-6. After editing code and making commits, Run the test suite before making a PR: `uv run pytest tests`
+6. Run the test suite before finalizing your edits: `uv run pytest tests`
+7. Make a PR against the `dev` branch of the main repo.
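
The README's new "Operating styles" section contrasts standalone, offline use of the processors with a full ezmsg pipeline. As a rough, self-contained illustration of the standalone style it describes, the sketch below designs a normalized Gaussian FIR kernel (the same recipe as the Gaussian-smoothing module added later in this commit) and applies it to a made-up signal with plain scipy; the sampling rate, test signal, and sigma/width values are all invented for the example.

```python
# Standalone/offline sketch: design FIR coefficients and apply them with scipy,
# no ezmsg Unit or pipeline involved. All numbers here are example values.
import numpy as np
import scipy.signal
from scipy.signal.windows import gaussian

fs = 500.0                                   # example sampling rate
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # noisy test signal

# Normalized Gaussian window as FIR numerator, denominator a = [1.0].
sigma, width = 5.0, 4
kernel_size = int(2 * width * sigma + 1)
b = gaussian(kernel_size, std=sigma)
b /= b.sum()
a = np.array([1.0])

y = scipy.signal.lfilter(b, a, x)            # smoothed copy of x
print(x.shape, y.shape)
```

In the pipeline style, the same coefficients would instead be supplied to the filter transformer and its Unit wrapper so that smoothing happens as messages flow through the graph.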

pyproject.toml

Lines changed: 11 additions & 5 deletions
@@ -20,9 +20,18 @@ dependencies = [
     "sparse>=0.15.4",
 ]

-[project.optional-dependencies]
+[dependency-groups]
+dev = [
+    "typer>=0.12.5",
+    "pre-commit>=4.2.0",
+    "jupyter>=1.1.1",
+    {include-group = "lint"},
+    {include-group = "test"},
+]
+lint = [
+    "ruff"
+]
 test = [
-    "flake8>=7.1.1",
     "frozendict>=2.4.4",
     "pytest-asyncio>=0.24.0",
     "pytest-cov>=5.0.0",
@@ -42,9 +51,6 @@ version-file = "src/ezmsg/sigproc/__version__.py"
 [tool.hatch.build.targets.wheel]
 packages = ["src/ezmsg"]

-[tool.uv]
-dev-dependencies = ["pre-commit>=3.8.0", "ruff>=0.6.7"]
-
 [tool.pytest.ini_options]
 norecursedirs = "tests/helpers"
 addopts = "-p no:warnings"

src/ezmsg/sigproc/filter.py

Lines changed: 21 additions & 20 deletions
@@ -32,17 +32,19 @@ class FilterCoefficients:


 def _normalize_coefs(
-    coefs: FilterCoefficients | tuple[npt.NDArray, npt.NDArray] | npt.NDArray,
-) -> tuple[str, tuple[npt.NDArray, ...]]:
+    coefs: FilterCoefficients | tuple[npt.NDArray, npt.NDArray] | npt.NDArray | None,
+) -> tuple[str, tuple[npt.NDArray, ...] | None]:
     coef_type = "ba"
     if coefs is not None:
         # scipy.signal functions called with first arg `*coefs`.
         # Make sure we have a tuple of coefficients.
-        if isinstance(coefs, npt.NDArray):
+        if isinstance(coefs, np.ndarray):
             coef_type = "sos"
             coefs = (coefs,)  # sos funcs just want a single ndarray.
         elif isinstance(coefs, FilterCoefficients):
-            coefs = (FilterCoefficients.b, FilterCoefficients.a)
+            coefs = (coefs.b, coefs.a)
+        elif not isinstance(coefs, tuple):
+            coefs = (coefs,)
     return coef_type, coefs


@@ -91,16 +93,20 @@ def _reset_state(self, message: AxisArray) -> None:
         axis = message.dims[0] if self.settings.axis is None else self.settings.axis
         axis_idx = message.get_axis_idx(axis)
         n_tail = message.data.ndim - axis_idx - 1
-        coefs = (
-            (self.settings.coefs,)
-            if self.settings.coefs is not None
-            and not isinstance(self.settings.coefs, tuple)
-            else self.settings.coefs
-        )
-        zi_func = {"ba": scipy.signal.lfilter_zi, "sos": scipy.signal.sosfilt_zi}[
-            self.settings.coef_type
-        ]
-        zi = zi_func(*coefs)
+        _, coefs = _normalize_coefs(self.settings.coefs)
+
+        if self.settings.coef_type == "ba":
+            b, a = coefs
+            if len(a) == 1 or np.allclose(a[1:], 0):
+                # For FIR filters, use lfiltic with zero initial conditions
+                zi = scipy.signal.lfiltic(b, a, [])
+            else:
+                # For IIR filters...
+                zi = scipy.signal.lfilter_zi(b, a)
+        else:
+            # For second-order sections (SOS) filters, use sosfilt_zi
+            zi = scipy.signal.sosfilt_zi(*coefs)
+
         zi_expand = (None,) * axis_idx + (slice(None),) + (None,) * n_tail
         n_tile = (
             message.data.shape[:axis_idx] + (1,) + message.data.shape[axis_idx + 1 :]
@@ -166,12 +172,7 @@ def _process(self, message: AxisArray) -> AxisArray:
         if message.data.size > 0:
             axis = message.dims[0] if self.settings.axis is None else self.settings.axis
             axis_idx = message.get_axis_idx(axis)
-            coefs = (
-                (self.settings.coefs,)
-                if self.settings.coefs is not None
-                and not isinstance(self.settings.coefs, tuple)
-                else self.settings.coefs
-            )
+            _, coefs = _normalize_coefs(self.settings.coefs)
             filt_func = {"ba": scipy.signal.lfilter, "sos": scipy.signal.sosfilt}[
                 self.settings.coef_type
             ]
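
The rewritten `_reset_state` above now branches on the filter type before building initial conditions: FIR "ba" coefficients (denominator effectively `[1]`) get a zeroed state via `lfiltic`, IIR "ba" coefficients get steady-state conditions from `lfilter_zi`, and "sos" coefficients go through `sosfilt_zi`. Below is a minimal sketch that reproduces this branch logic outside of ezmsg with the same `scipy.signal` calls; the helper name and the example coefficients are invented for illustration.

```python
# Sketch of the initial-condition logic added in _reset_state, reproduced
# outside ezmsg. make_zi is a hypothetical helper; the scipy calls mirror the hunk.
import numpy as np
import scipy.signal


def make_zi(coef_type: str, coefs) -> np.ndarray:
    if coef_type == "ba":
        b, a = coefs
        if len(a) == 1 or np.allclose(a[1:], 0):
            # FIR: denominator is effectively [1], so start from an all-zero state.
            return scipy.signal.lfiltic(b, a, [])
        # IIR: steady-state initial conditions (as for a unit step input).
        return scipy.signal.lfilter_zi(b, a)
    # "sos": second-order sections have their own zi helper.
    return scipy.signal.sosfilt_zi(*coefs)


zi_fir = make_zi("ba", (np.ones(5) / 5, np.array([1.0])))              # 5-tap moving average
zi_iir = make_zi("ba", scipy.signal.butter(4, 0.2))                    # 4th-order Butterworth (b, a)
zi_sos = make_zi("sos", (scipy.signal.butter(4, 0.2, output="sos"),))  # same design as SOS
print(zi_fir.shape, zi_iir.shape, zi_sos.shape)
```

With `lfiltic(b, a, [])` the FIR path starts from rest (an all-zero state) rather than the step-response steady state that `lfilter_zi` would assume.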
New file: Lines changed: 93 additions & 0 deletions
@@ -0,0 +1,93 @@
+from typing import Callable
+import warnings
+
+import numpy as np
+
+from .filter import (
+    FilterBaseSettings,
+    BACoeffs,
+    FilterByDesignTransformer,
+    BaseFilterByDesignTransformerUnit,
+)
+
+
+class GaussianSmoothingSettings(FilterBaseSettings):
+    sigma: float | None = 1.0
+    """
+    sigma : float
+        Standard deviation of the Gaussian kernel.
+    """
+
+    width: int | None = 4
+    """
+    width : int
+        Number of standard deviations covered by the kernel window if kernel_size is not provided.
+    """
+
+    kernel_size: int | None = None
+    """
+    kernel_size : int | None
+        Length of the kernel in samples. If provided, overrides automatic calculation.
+    """
+
+
+def gaussian_smoothing_filter_design(
+    sigma: float = 1.0,
+    width: int = 4,
+    kernel_size: int | None = None,
+) -> BACoeffs | None:
+    # Parameter checks
+    if sigma <= 0:
+        raise ValueError(f"sigma must be positive. Received: {sigma}")
+
+    if width <= 0:
+        raise ValueError(f"width must be positive. Received: {width}")
+
+    if kernel_size is not None:
+        if kernel_size < 1:
+            raise ValueError(f"kernel_size must be >= 1. Received: {kernel_size}")
+    else:
+        kernel_size = int(2 * width * sigma + 1)
+
+    # Warn if kernel_size is smaller than recommended but don't fail
+    expected_kernel_size = int(2 * width * sigma + 1)
+    if kernel_size < expected_kernel_size:
+        ## TODO: Either add a warning or determine appropriate kernel size and raise an error
+        warnings.warn(
+            f"Provided kernel_size {kernel_size} is smaller than recommended "
+            f"size {expected_kernel_size} for sigma={sigma} and width={width}. "
+            "The kernel may be truncated."
+        )
+
+    from scipy.signal.windows import gaussian
+
+    b = gaussian(kernel_size, std=sigma)
+    b /= np.sum(b)  # Ensure normalization
+    a = np.array([1.0])
+
+    return b, a
+
+
+class GaussianSmoothingFilterTransformer(
+    FilterByDesignTransformer[GaussianSmoothingSettings, BACoeffs]
+):
+    def get_design_function(
+        self,
+    ) -> Callable[[float], BACoeffs]:
+        # Create a wrapper function that ignores fs parameter since gaussian smoothing doesn't need it
+        def design_wrapper(fs: float) -> BACoeffs:
+            return gaussian_smoothing_filter_design(
+                sigma=self.settings.sigma,
+                width=self.settings.width,
+                kernel_size=self.settings.kernel_size,
+            )
+
+        return design_wrapper
+
+
+class GaussianSmoothingFilter(
+    BaseFilterByDesignTransformerUnit[
+        GaussianSmoothingSettings, GaussianSmoothingFilterTransformer
+    ]
+):
+    SETTINGS = GaussianSmoothingSettings