Changes from all commits (77 commits)
4d263eb
add ApsParameters to objects.py
mahmud1 Aug 2, 2024
d35a302
fix lint
mahmud1 Aug 2, 2024
3a36c6e
fix bug in ApsParameters when model_params is None
mahmud1 Aug 2, 2024
8028550
implement extractModelParams and applyModelParams in filtering.
mahmud1 Aug 2, 2024
7d084a1
fix bug in params size check in extractModelParams
mahmud1 Aug 2, 2024
867cb12
remove logger from applyModelParams to avoid overlogging in loops.
mahmud1 Aug 2, 2024
654756c
implement applySpatialFilteringToP2
mahmud1 Aug 2, 2024
b0a508c
small debug message change
mahmud1 Aug 2, 2024
d57e3d3
implement applySimpleInterpolationToP2
mahmud1 Aug 2, 2024
2199db3
implement selectP2 in densification
mahmud1 Aug 2, 2024
dae8193
update logger message in selectP2
mahmud1 Aug 4, 2024
655fb83
save aps parameters in step3 and select p2 points and apply aps to th…
mahmud1 Aug 4, 2024
0c6e182
fix lint
mahmud1 Aug 4, 2024
3ce7944
fix lint
mahmud1 Aug 4, 2024
1af045c
update docs
mahmud1 Aug 4, 2024
81b5a6c
update tests
mahmud1 Aug 4, 2024
aefff82
move filtering.coherence_p2 to densification.coherence_p2
mahmud1 Aug 4, 2024
2ee3857
update test
mahmud1 Aug 4, 2024
6a9d237
update savefig filename step_3 to step_4
mahmud1 Aug 4, 2024
b84d47f
update docs processing.rst
mahmud1 Aug 4, 2024
466e64d
update docs demo
mahmud1 Aug 4, 2024
d8f6505
Change type hint from dict to Config.
Andreas-Piter Aug 6, 2024
27253dc
Fix lint.
Andreas-Piter Aug 6, 2024
36d60ed
store APS method to aps_parameters.h5 in step 3 and apply filter in s…
mahmud1 Aug 6, 2024
932e664
Merge branch 'release/next_release' into release/save-aps
mahmud1 Oct 17, 2024
2a42710
merge next_release into save-aps
mahmud1 Oct 17, 2024
022e6d1
do not save aps2_obj in step 4
mahmud1 Oct 18, 2024
8f5875f
Merge branch 'refs/heads/release/next_release' into release/save-aps
Andreas-Piter Oct 22, 2024
4d48c7e
fix remaining utm to map coordinate after merging main into save-aps
mahmud1 Oct 26, 2024
17e40b3
remove p2_coh*_aps.h5 in test.
mahmud1 Oct 26, 2024
3b7eba3
Update ci.yml
mahmud1 Feb 18, 2025
3dac330
add pip installation in docs
mahmud1 Feb 18, 2025
798bd7d
add gdal to requirements in setup.py
mahmud1 Feb 18, 2025
9bcb47e
Update installation.rst to make mamba the recommended installation me…
mahmud1 Feb 18, 2025
7e681ca
Update installation.rst to fix typo
mahmud1 Feb 18, 2025
ce4e62c
Update installation.rst
mahmud1 Feb 18, 2025
e4bc4fb
add gdal to extra_req
mahmud1 Feb 18, 2025
fe51bed
numpy<2 in setup.py to avoid kaumi conflict
mahmud1 Feb 18, 2025
6cd6fbf
Update setup.py to use numpy<=1.26
mahmud1 Feb 18, 2025
c8234da
Merge pull request #5 from mahmud1/update-runner2
mahmud1 Feb 18, 2025
81bffa5
Update setup.py
mahmud1 Feb 18, 2025
5af61cc
Update setup.py
mahmud1 Feb 18, 2025
d053a31
use ci_env directly and skip cloning to ci_tmp_env
mahmud1 Feb 18, 2025
a7f23b1
Update ci.yml
mahmud1 Feb 18, 2025
47acefe
fix typo
mahmud1 Feb 18, 2025
e3fca85
remove before_script job
mahmud1 Feb 18, 2025
2c5eea8
update test_install to replace conda with pip
mahmud1 Feb 18, 2025
05ac883
update docker file. move to root and rename environment.yml
mahmud1 Feb 18, 2025
855d284
update installation documentation with environment.yml in root.
mahmud1 Feb 18, 2025
028e4cd
Update installation.rst
mahmud1 Feb 19, 2025
a370ae9
update pip installation in docs
mahmud1 Feb 20, 2025
3d194fe
Fix numerical problems when computing the grid size.
Andreas-Piter Feb 24, 2025
4f968d7
Merge branch 'luhipi:main' into main
mahmud1 Feb 27, 2025
462de88
Create .readthedocs.yaml
mahmud1 Feb 27, 2025
3e650a3
Update history.
Andreas-Piter Feb 28, 2025
f1b4a26
Merge pull request #62 from luhipi/check-grid-size
Andreas-Piter Feb 28, 2025
a9313b2
Update README.rst
mahmud1 Mar 3, 2025
41b430e
skip deployment in ci.yml
mahmud1 Mar 3, 2025
05ff3cb
update link to docs in README.rst
mahmud1 Mar 4, 2025
cee1ca7
Revert "Create .readthedocs.yaml"
mahmud1 Mar 4, 2025
acaf276
Merge pull request #7 from luhipi/main
mahmud1 Mar 4, 2025
7256d58
Merge pull request #8 from mahmud1/main
mahmud1 Mar 4, 2025
73696a0
Merge pull request #6 from mahmud1/update-runner2
mahmud1 Mar 4, 2025
c4b3f96
update dev installation in docs
mahmud1 Mar 4, 2025
f1ce499
Update ci.yml to test install via pip instead of mamba
mahmud1 Mar 4, 2025
fe52239
Merge pull request #71 from luhipi/update-readme
mahmud1 Mar 4, 2025
21e7bc7
Merge branch 'update-runner-test-installation' into main-to-merge
mahmud1 Mar 4, 2025
1362efa
Merge pull request #13 from mahmud1/main-to-merge
mahmud1 Mar 4, 2025
33510e5
Merge pull request #14 from mahmud1/update-docs
mahmud1 Mar 4, 2025
ea77b34
Update installation.rst
mahmud1 Mar 4, 2025
20e9faf
Merge pull request #72 from mahmud1/update-runner-test-installation
mahmud1 Mar 4, 2025
77e5f3c
Update HISTORY.rst
mahmud1 Mar 5, 2025
7a2082d
Merge pull request #73 from luhipi/update-runner
mahmud1 Mar 6, 2025
05cd267
Merge pull request #12 from luhipi/main
mahmud1 Mar 6, 2025
ff8f174
Merge branch 'main' into next_version/save-aps
mahmud1 Mar 31, 2025
022053a
merge main into save-aps
mahmud1 Mar 31, 2025
5996f99
merge main into save-aps
mahmud1 Mar 31, 2025
224 changes: 110 additions & 114 deletions .github/workflows/ci.yml
@@ -4,6 +4,8 @@ on:
pull_request:
branches:
- main
- 'v?*b'
- 'v?*beta'
push:
branches:
- main
@@ -15,40 +17,13 @@ permissions:
contents: write

env:
SKIP_DEPLOY: false
SKIP_INSTALL: true
SKIP_DEPLOY: true
SKIP_INSTALL: false
SKIP_TEST: false
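For context, these workflow-level flags are consumed by step-level `if:` conditions elsewhere in this file (the deploy step checks `env.SKIP_DEPLOY == 'false'`). A minimal sketch of the pattern — the job and step names here are illustrative, not part of this workflow:

```yaml
# Sketch: gating a step on a workflow-level env flag.
# The real workflow uses the same `env.SKIP_DEPLOY == 'false'`
# comparison (the env value is a string, so quote 'false').
jobs:
  example_job:
    runs-on: self-hosted
    steps:
      - name: Deploy docs
        if: env.SKIP_DEPLOY == 'false'
        run: echo "deploying"
```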

jobs:
before_script:
runs-on: self-hosted

strategy:
matrix:
node-version: [ 20.x ]

steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.x'

- name: Install dependencies
run: |
source /opt/conda/etc/profile.d/conda.sh
conda init bash
source ~/.bashrc
conda env remove --name ci_temp_env --yes || echo "Environment ci_temp_env does not exist"
conda create --prefix /home/runneruser/.conda/envs/ci_temp_env --clone ci_env
conda activate ci_temp_env
shell: bash

test_sarvey:
runs-on: self-hosted
needs: before_script
steps:
- name: Checkout code
uses: actions/checkout@v4
@@ -64,11 +39,12 @@ jobs:
source /opt/conda/etc/profile.d/conda.sh
conda init bash
source ~/.bashrc
conda activate ci_temp_env
conda activate ci_env
rm -rf tests/testdata
# wget -nv -c -O testdata.zip https://seafile.projekt.uni-hannover.de/f/4b3be399dffa488e98db/?dl=1
wget -nv -c -O testdata.zip https://seafile.projekt.uni-hannover.de/f/104b499f6f7e4360877d/?dl=1
unzip testdata.zip
mkdir -p test
mv testdata tests/
mamba list
make pytest
@@ -79,7 +55,7 @@
run: |
conda init bash
source ~/.bashrc
source activate ci_temp_env
source activate ci_env

# replace documentation address for tags before make docs
IFS='/' read -r OWNER REPO <<< "${GITHUB_REPOSITORY}"
@@ -165,7 +141,7 @@ jobs:
source /opt/conda/etc/profile.d/conda.sh
conda init bash
source ~/.bashrc
conda activate ci_temp_env
conda activate ci_env
make lint
shell: bash

@@ -208,7 +184,7 @@ jobs:
source /opt/conda/etc/profile.d/conda.sh
conda init bash
source ~/.bashrc
source activate ci_temp_env
source activate ci_env
make urlcheck
shell: bash

@@ -228,92 +204,112 @@
run: |
source /opt/conda/etc/profile.d/conda.sh
conda activate base
mamba env remove --name sarvey_testinstall --yes || echo "Environment sarvey_testinstall does not exist"
mamba clean --index-cache --tarballs --packages -y
pip install conda-merge
export PATH=$HOME/.local/bin:$PATH # workaround conda-merge not found!
wget https://raw.githubusercontent.com/insarlab/MiaplPy/main/conda-env.yml
conda-merge conda-env.yml tests/CI_docker/context/environment_sarvey.yml > env.yml
mamba env create --name sarvey_testinstall -f env.yml

conda create -n sarvey_testinstall python=3.10 pip -y
conda activate sarvey_testinstall
conda install conda-forge::pysolid -y
conda install conda-forge::gdal
pip install git+https://github.com/insarlab/MiaplPy.git
pip install .

# get current branch for installation
IFS='/' read -r OWNER REPO <<< "${GITHUB_REPOSITORY}"

URL="https://github.com/${OWNER}/${REPO}"
URL_IO="https://${OWNER}.github.io/${REPO}"

echo "Repository URL: ${URL}"
echo "Repository Documentation URL: ${URL_IO}"

if [[ "$GITHUB_REF" == refs/tags/* ]]; then
current_ref="${GITHUB_REF##*/}"
elif [[ "$GITHUB_REF" == refs/heads/* ]]; then
current_ref="${GITHUB_REF#refs/heads/}"
elif [[ "$GITHUB_REF" == refs/pull/* ]]; then
current_ref="${GITHUB_HEAD_REF}"
else
current_ref="${GITHUB_REF}"
fi

echo "Current branch/tag: ${URL}.git@$current_ref"
pip install git+${URL}.git@$current_ref
pip install sarvey[dev]

OUTPUT=$(pip check) || { echo "$OUTPUT"; true; }
cd ..
conda list
python -c "import sarvey; print(sarvey)"
shell: bash
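The ref-resolution logic added to the `test_install` job above can be exercised on its own. This is a hypothetical standalone helper mirroring that logic (the function name `resolve_ref` is mine, not from the workflow); it maps `GITHUB_REF` — and `GITHUB_HEAD_REF` for pull requests — to the branch or tag name that gets pip-installed:

```shell
#!/bin/sh
# Sketch of the ref-resolution logic from the test_install step.
# resolve_ref GITHUB_REF GITHUB_HEAD_REF -> branch/tag name to install.
resolve_ref() {
  github_ref="$1"
  head_ref="$2"
  case "$github_ref" in
    refs/tags/*)  echo "${github_ref##*/}" ;;          # tag push: keep tag name
    refs/heads/*) echo "${github_ref#refs/heads/}" ;;  # branch push: strip prefix
    refs/pull/*)  echo "$head_ref" ;;                  # PR: use the head branch
    *)            echo "$github_ref" ;;                # fallback: pass through
  esac
}

resolve_ref "refs/tags/v1.2.0" ""           # -> v1.2.0
resolve_ref "refs/heads/main" ""            # -> main
resolve_ref "refs/pull/72/merge" "feature"  # -> feature
```

Note the branch case strips the `refs/heads/` prefix rather than taking the last `/` component, so branch names containing slashes (e.g. `release/save-aps`) survive intact.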

deploy_pages:
runs-on: self-hosted

needs:
- test_sarvey
- test_urls
- test_styles
if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/')
steps:
- name: Checkout code
uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: '3.x'

- name: Download docs
uses: actions/download-artifact@v4
with:
name: docs
path: docs/_build/html/

- name: Download coverage report
uses: actions/download-artifact@v4
with:
name: coverage-report
path: htmlcov/

- name: Download report.html
uses: actions/download-artifact@v4
with:
name: test-report

- name: Deploy to GitHub Pages
# trigger if merged into the main branch || published new tag
if: env.SKIP_DEPLOY == 'false' && github.event_name != 'pull_request' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/'))
run: |
rm -rf public
# deploy_pages:
# runs-on: self-hosted

# needs:
# - test_sarvey
# - test_urls
# - test_styles
# if: github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/')
# steps:
# - name: Checkout code
# uses: actions/checkout@v4

# - name: Set up Python
# uses: actions/setup-python@v5
# with:
# python-version: '3.x'

# - name: Download docs
# uses: actions/download-artifact@v4
# with:
# name: docs
# path: docs/_build/html/

# - name: Download coverage report
# uses: actions/download-artifact@v4
# with:
# name: coverage-report
# path: htmlcov/

# - name: Download report.html
# uses: actions/download-artifact@v4
# with:
# name: test-report

# - name: Deploy to GitHub Pages
# # trigger if merged into the main branch || published new tag
# if: env.SKIP_DEPLOY == 'false' && github.event_name != 'pull_request' && (github.ref == 'refs/heads/main' || startsWith(github.ref, 'refs/tags/'))
# run: |
# rm -rf public

git clone --branch gh-pages https://github.com/${{ github.repository }} public

if [[ "${GITHUB_REF}" == refs/tags/* ]]; then
TAG_NAME=${GITHUB_REF##*/}
echo "Deploying to GitHub Pages for version tag: $TAG_NAME"
DOCS_PATH=public/tags/$TAG_NAME
else
echo "Deploying to GitHub Pages for main branch"
DOCS_PATH=public/main
fi

rm -rf $DOCS_PATH
mkdir -p $DOCS_PATH/docs
mkdir -p $DOCS_PATH/images
mkdir -p $DOCS_PATH/coverage
mkdir -p $DOCS_PATH/test_reports
# git clone --branch gh-pages https://github.com/${{ github.repository }} public

# if [[ "${GITHUB_REF}" == refs/tags/* ]]; then
# TAG_NAME=${GITHUB_REF##*/}
# echo "Deploying to GitHub Pages for version tag: $TAG_NAME"
# DOCS_PATH=public/tags/$TAG_NAME
# else
# echo "Deploying to GitHub Pages for main branch"
# DOCS_PATH=public/main
# fi

# rm -rf $DOCS_PATH
# mkdir -p $DOCS_PATH/docs
# mkdir -p $DOCS_PATH/images
# mkdir -p $DOCS_PATH/coverage
# mkdir -p $DOCS_PATH/test_reports

cp -r docs/_build/html/* $DOCS_PATH/docs
cp -r htmlcov/* $DOCS_PATH/coverage/
cp report.html $DOCS_PATH/test_reports/

ls -al $DOCS_PATH
ls -al $DOCS_PATH/docs
ls -al $DOCS_PATH/coverage
ls -al $DOCS_PATH/test_reports

shell: bash

- name: Upload to GitHub Pages
if: env.SKIP_DEPLOY == 'false'
uses: peaceiris/actions-gh-pages@v4
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: ./public
# cp -r docs/_build/html/* $DOCS_PATH/docs
# cp -r htmlcov/* $DOCS_PATH/coverage/
# cp report.html $DOCS_PATH/test_reports/

# ls -al $DOCS_PATH
# ls -al $DOCS_PATH/docs
# ls -al $DOCS_PATH/coverage
# ls -al $DOCS_PATH/test_reports

# shell: bash

# - name: Upload to GitHub Pages
# if: env.SKIP_DEPLOY == 'false'
# uses: peaceiris/actions-gh-pages@v4
# with:
# github_token: ${{ secrets.GITHUB_TOKEN }}
# publish_dir: ./public
11 changes: 11 additions & 0 deletions HISTORY.rst
@@ -2,10 +2,21 @@
History
=======

Future major version
--------------------

* Export data to GIS format in WGS84 coordinates.
* Change file name coordinates_UTM.h5 to coordinates_map.h5.
* Use Transverse Mercator instead of UTM as map coordinates.


Future minor version (release soon)
-----------------------------------

* Update CI Docker builder.
* Update runner to test installation.
* Update documentation with new installation instructions, including pip.
* Fix numerical problems when computing grid size.

1.2.0 (2025-02-19)
------------------
24 changes: 9 additions & 15 deletions README.rst
@@ -10,7 +10,7 @@ Open-source InSAR time series analysis software developed within the project SAR
Documentation
-------------
The documentation with installation instructions, processing steps, and examples with a demo dataset can be found at:
https://luhipi.github.io/sarvey/main/docs/
https://luhipi.github.io/sarvey/main

Discussion
----------
@@ -26,16 +26,12 @@ Status
:target: https://github.com/luhipi/sarvey/actions
:alt: Pipelines
.. image:: https://img.shields.io/static/v1?label=Documentation&message=GitHub%20Pages&color=blue
:target: https://luhipi.github.io/sarvey/main/docs/
:target: https://luhipi.github.io/sarvey/main
:alt: Documentation
.. image:: https://zenodo.org/badge/DOI/10.5281/zenodo.12544130.svg
:target: https://doi.org/10.5281/zenodo.12544130
:alt: DOI


See also the latest coverage_ report and the pytest_ HTML report.


License
-------

@@ -191,15 +187,13 @@ This package was created with Cookiecutter_ and the `fernlab/cookiecutter-python

.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _`fernlab/cookiecutter-python-package`: https://git.gfz-potsdam.de/fernlab/products/cookiecutters/cookiecutter-python-package
.. _coverage: https://luhipi.github.io/sarvey/main/coverage/
.. _pytest: https://luhipi.github.io/sarvey/main/test_reports/report.html
.. _processing: https://luhipi.github.io/sarvey/main/docs/processing.html
.. _`processing steps`: https://luhipi.github.io/sarvey/main/docs/processing.html#processing-steps-for-two-step-unwrapping-workflow
.. _preparation: https://luhipi.github.io/sarvey/main/docs/preparation.html
.. _`Two-step unwrapping`: https://luhipi.github.io/sarvey/main/docs/processing.html#processing-steps-for-two-step-unwrapping-workflow
.. _`One-step unwrapping`: https://luhipi.github.io/sarvey/main/docs/processing.html#processing-steps-for-one-step-unwrapping-workflow
.. _`installation instruction`: https://luhipi.github.io/sarvey/main/docs/installation.html
.. _`history`: https://luhipi.github.io/sarvey/main/docs/history.html
.. _processing: https://luhipi.github.io/sarvey/main/processing.html
.. _`processing steps`: https://luhipi.github.io/sarvey/main/processing.html#processing-steps-for-two-step-unwrapping-workflow
.. _preparation: https://luhipi.github.io/sarvey/main/preparation.html
.. _`Two-step unwrapping`: https://luhipi.github.io/sarvey/main/processing.html#processing-steps-for-two-step-unwrapping-workflow
.. _`One-step unwrapping`: https://luhipi.github.io/sarvey/main/processing.html#processing-steps-for-one-step-unwrapping-workflow
.. _`installation instruction`: https://luhipi.github.io/sarvey/main/installation.html
.. _`history`: https://luhipi.github.io/sarvey/main/history.html
.. _MiaplPy: https://github.com/insarlab/MiaplPy
.. _MintPy: https://github.com/insarlab/MintPy
.. _ISCE: https://github.com/isce-framework/isce2