Goldstein filtering step for unwrapping workflow #247
Conversation
- I'll add an issue about the NaN-block problem to fix in a later PR
- let's add a simple test like this (from my git diff):

```diff
diff --git a/tests/test_unwrap.py b/tests/test_unwrap.py
index 8789820..10f16c4 100644
--- a/tests/test_unwrap.py
+++ b/tests/test_unwrap.py
@@ -97,6 +97,21 @@ class TestUnwrapSingle:
         assert u_path.exists()
         assert c_path.exists()
+    @pytest.mark.parametrize("method", [UnwrapMethod.SNAPHU, UnwrapMethod.PHASS])
+    def test_goldstein(self, tmp_path, list_of_gtiff_ifgs, corr_raster, method):
+        # test other init_method
+        unw_filename = tmp_path / "unwrapped.unw.tif"
+        unw_path, conncomp_path = dolphin.unwrap.unwrap(
+            ifg_filename=list_of_gtiff_ifgs[0],
+            corr_filename=corr_raster,
+            unw_filename=unw_filename,
+            nlooks=1,
+            unwrap_method=method,
+            run_goldstein=True,
+        )
+        assert unw_path.exists()
+        assert conncomp_path.exists()
+
```
```python
    description="Statistical cost mode method for SNAPHU.",
)

alpha: float = Field(
```
We might end up changing this name slightly to avoid confusion with the SHP alpha parameter, but for now it's fine since the two are distinct.
```python
unwrapper_unw_filename = Path(unw_filename)

if unwrap_method == UnwrapMethod.SNAPHU:
    from ._snaphu_py import unwrap_snaphu_py
```
Was it causing any problems to have this imported at the top of the file? Or was this a mistake to move the import?
Oops, yeah, I moved this here because I was testing in an environment without snaphu. It was very convenient not to have to install snaphu simply by moving the import to where it's needed, but I can definitely move it back if that's an antipattern.
I suppose we can leave it for now in case someone really wants to use another unwrapper that's not snaphu-py; we're not currently putting snaphu-py in the environment file.
```python
    unwrapper_ifg_filename = filt_ifg_filename
    unwrapper_unw_filename = scratch_unw_filename
else:
    unwrapper_ifg_filename = Path(ifg_filename)
    unwrapper_unw_filename = Path(unw_filename)
```
Note for @mirzaees and @taliboliver - this is where it could make sense to add the interpolation logic.
If this section gets unwieldy, perhaps we could break it into a helper function that returns the unwrapper ifg_filename and unw_filename
Sorry for the late review. I started taking a look yesterday but was too slow to finish editing my comments. I'll post the comments for posterity but overall I think the code looks good and mostly just want to complain about the lack of unit testing.
I posted some suggestions for unit tests below without providing much detail. If you ever want to talk through the details of how we might implement these, I'd be happy to tag up.
```python
from numpy.typing import ArrayLike


def goldstein(phase: ArrayLike, alpha: float, psize: int = 32) -> np.ndarray:
```
Naming the input argument "phase" might be a little misleading. The Goldstein filter is intended to operate on complex-valued interferogram data -- not on the wrapped phase directly.
I typically name this parameter "igram".
```diff
-def goldstein(phase: ArrayLike, alpha: float, psize: int = 32) -> np.ndarray:
+def goldstein(igram: ArrayLike, alpha: float, psize: int = 32) -> np.ndarray:
```
Ah, I lost this same comment in a closed window. Let's definitely change it for the next PR because it confused me several times.
```python
def goldstein(phase: ArrayLike, alpha: float, psize: int = 32) -> np.ndarray:
    """Apply the Goldstein adaptive filter to the given data.
```
Let's add a reference to the Goldstein/Werner 1998 paper here.
@scottstanie has made it really easy to cite references in docstrings in this repo. You could add a BibTeX entry like this:

```bibtex
@article{Goldstein1998RadarInterferogramFiltering,
    title = {Radar interferogram filtering for geophysical applications},
    author = {Goldstein, Richard M. and Werner, Charles L.},
    year = {1998},
    month = nov,
    journal = {Geophysical Research Letters},
    volume = {25},
    number = {21},
    pages = {4035--4038}
}
```

to this file: `docs/references.bib`
Then you could cite the paper in a docstring like this:
| """Apply the Goldstein adaptive filter to the given data. | |
| """Apply the Goldstein adaptive filter [@Goldstein1998RadarInterferogramFiltering] to the given data. | 
```
Parameters
----------
phase : np.ndarray
```
The annotation above says this argument is of type ArrayLike. Parameters of this type are usually documented as follows in NumPy-style docs:
```diff
-phase : np.ndarray
+phase : array_like
```
where "array_like" is a keyword in the NumPy glossary: https://numpy.org/doc/stable/glossary.html#term-array_like
```
    Filtering parameter for Goldstein algorithm
    Must be between 0 (no filtering) and 1 (maximum filtering)
psize : int, optional
    edge length of square patch
    Default = 32
```
I prefer to use full sentences in docstring descriptions (with periods and proper capitalization), since I think it looks more professional.
```diff
-    Filtering parameter for Goldstein algorithm
-    Must be between 0 (no filtering) and 1 (maximum filtering)
-psize : int, optional
-    edge length of square patch
-    Default = 32
+    Filtering parameter for the Goldstein algorithm.
+    Must be between 0 (no filtering) and 1 (maximum filtering).
+psize : int, optional
+    Edge length of square patch. Defaults to 32.
```
```
    2D complex array containing the data to be filtered.
alpha : float
    Filtering parameter for Goldstein algorithm
    Must be between 0 (no filtering) and 1 (maximum filtering)
```
> Must be between 0 (no filtering) and 1 (maximum filtering)

Should we enforce this in the function body? E.g. something like this:

```python
if (alpha < 0.0) or (alpha > 1.0):
    raise ValueError(f"alpha must be between 0 and 1, instead got {alpha}")
```
Similarly, you could consider adding a check that psize has a valid value.
I think psize should be >= 2, since if psize is 1, the block stride (psize // 2) will be 0.
```python
# ignore processing for empty chunks
if np.all(np.isnan(data)):
    return data
```
Do we really need any special handling for this case?
Let's say hypothetically we remove lines 77 & 78. In that case, if the input data contained all NaN values, I expect that the output filtered data will still contain all NaNs. So lines 77 & 78 are just optimizing that case -- the results should still be the same but we get to skip some unnecessary computation.
But should we really be optimizing for the case where the input data is all NaN-valued? Note that this is (slightly) pessimizing the usual case where the input contains at least one non-NaN value, since np.all(np.isnan(data)) takes relatively small but nonzero time.
I would suggest removing these lines since I think encountering all-NaN-valued datasets seems unlikely in normal operation so, on net, this is probably a pessimization.
```diff
-# ignore processing for empty chunks
-if np.all(np.isnan(data)):
-    return data
```
This could be a good thing to test. For the stitched, geocoded interferograms, the tilt means it's about 30% nodata. For single bursts, it's about 60% nodata (which is why adding a check like this speeds up the wrapped phase part by ~2x).
On the other hand, having this hardcoded is very geocoded-input-centric: an RIFG has no need for this and only gets slowed down by it.
Yeah, I'll remove this. I think it's a remnant from when I was NaN-ing out zero values, which can't be done with tophu at the moment anyway.
> For the stitched, geocoded interferograms, the tilt means it's about 30% nodata. For single bursts, it's about 60% nodata (which is why adding a check like this speeds up the wrapped phase part by ~2x).
Right now, the check for all NaN values is outside the loop over patches. So this shortcut will only trigger if the dataset contains 100% nodata. IIUC, there shouldn't be any speed up for interferograms that contain 60% nodata.
Maybe we should consider adding a similar check inside the loop over patches? If an individual patch contains all NaNs, there's no need to compute & apply the filter there.
Oh right, yes, this should be removed then; the comment `# ignore processing for empty chunks` fooled me.
```python
for i in range(0, data.shape[0] - psize, psize // 2):
    for j in range(0, data.shape[1] - psize, psize // 2):
```
Noting another potential issue with odd-valued psize here:
If psize is even-valued, then each pixel will be covered by exactly 4 overlapping blocks (except for pixels near the borders of the data array).
But if psize is odd-valued, then the last row & column of pixels in each block will each be covered by an additional block.
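This is easy to see numerically. A small sketch that counts per-pixel block coverage along one axis, using the same loop bounds and stride as the code above:

```python
import numpy as np


def block_coverage(n: int, psize: int) -> np.ndarray:
    """Count how many overlapping blocks cover each pixel along one axis."""
    coverage = np.zeros(n, dtype=int)
    for i in range(0, n - psize, psize // 2):
        coverage[i : i + psize] += 1
    return coverage


# Even psize: interior pixels are each covered by exactly 2 blocks per
# axis (so 4 in 2D); odd psize: some pixels pick up an extra block.
even = set(block_coverage(200, 32)[40:140].tolist())
odd = set(block_coverage(200, 33)[40:140].tolist())
```

(The interior slice `[40:140]` avoids the under-covered border pixels, which are a separate issue.)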
```python
def apply_pspec(data: ArrayLike):
    # NaN is allowed value
    assert not (alpha < 0), f"Invalid parameter value {alpha} < 0"
    wgt = np.power(np.abs(data) ** 2, alpha / 2)
```
Do we care about the amplitude values of the filtered data at all? Maybe a question for @scottstanie.
Currently, the filter coefficients and block weights are not normalized, so the processing is effectively applying a scaling factor to the output. It shouldn't affect the phase, but the amplitude values will not be comparable to the input amplitudes.
I don't believe so. SNAPHU just ignores the amplitude if you also pass the correlation, right?
Yeah, I believe you're right about SNAPHU. The ICU algorithm does care about interferogram amplitudes a bit (for computing "intensity neutrons"), but I don't think we're seriously considering ICU in our comparison of unwrappers. I'm not aware of other unwrapping algorithms that use amplitude information, so it seems reasonably safe not to worry about normalizing amplitudes.
```python
    )
    return wgt


def patch_goldstein_filter(
```
No unit tests!
If you make this a standalone function (so that it's importable from the test suite), it should be pretty trivial to implement a test that checks that the phase values are unchanged by filtering with alpha equal to 0. I think that would be a nice test.
Another good test would be to simulate a signal with a known magnitude spectrum (e.g. by constructing the signal in the frequency domain and then taking the IFFT) and then applying patch_goldstein_filter() to it. Then you could compute the magnitude spectrum of the filtered data and compare it to the expected spectrum (which should be the original spectrum to the power 1+alpha).
It might also be worthwhile to exercise the blocking logic in a test. Common issues in block processing include off-by-one indexing errors and mishandling the remainder samples past the end of the last full-sized block. I think you could guard against most of these common issues in a test by passing an array of ones to the goldstein() function and checking that the output array values are constant (i.e. are all the same) to within some tolerance.
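A minimal sketch of the alpha-equal-to-0 test idea. Note that `patch_goldstein_filter_stub` below is a simplified stand-in (FFT, weight each bin by |spectrum|**alpha, inverse FFT) written just to make the sketch self-contained; the real test would import the repo's `patch_goldstein_filter` instead:

```python
import numpy as np


def patch_goldstein_filter_stub(patch: np.ndarray, alpha: float) -> np.ndarray:
    """Simplified stand-in: weight each frequency bin by |spectrum|**alpha."""
    spec = np.fft.fft2(patch)
    weight = np.abs(spec) ** alpha
    return np.fft.ifft2(weight * spec)


def test_alpha_zero_preserves_phase():
    # With alpha == 0 the spectral weights are all ones, so the filter
    # should be an identity up to floating-point error.
    rng = np.random.default_rng(42)
    igram = np.exp(1j * rng.uniform(-np.pi, np.pi, (32, 32)))
    filtered = patch_goldstein_filter_stub(igram, alpha=0.0)
    np.testing.assert_allclose(np.angle(filtered), np.angle(igram), atol=1e-6)
```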
Related to the future NaN change #261, we could check that the number of input nodata values is the same as (or similar to) the number of output nodata values.
```python
if suf == ".tif":
    driver = "GTiff"
    opts = list(io.DEFAULT_TIFF_OPTIONS)
else:
    driver = "ENVI"
    opts = list(io.DEFAULT_ENVI_OPTIONS)
```
Looks like this is baking in the assumption that we will only ever want to process GeoTIFF files or ENVI files.
It'd be nice if dolphin had a function like

```python
def get_raster_driver(filename: str | os.PathLike[str]) -> str: ...
```

that abstracted the logic for inferring the driver name of a particular raster, so that any future changes to this logic could be contained in a single function.
Similarly, it'd be nice to see a function like

```python
def get_default_driver_options(driver: str) -> list[str]: ...
```

as the single source of truth for getting our default driver-specific raster creation options.
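A sketch of the two helpers. The option tuples here are placeholders standing in for `io.DEFAULT_TIFF_OPTIONS` / `io.DEFAULT_ENVI_OPTIONS`, and the extension-to-driver mapping just mirrors the existing `if suf == ".tif"` check:

```python
from pathlib import Path

# Placeholder creation options standing in for the constants in dolphin.io
DEFAULT_TIFF_OPTIONS = ("COMPRESS=DEFLATE", "TILED=YES")
DEFAULT_ENVI_OPTIONS = ("SUFFIX=ADD",)


def get_raster_driver(filename) -> str:
    """Infer the GDAL driver name for a raster from its file extension."""
    return "GTiff" if Path(filename).suffix == ".tif" else "ENVI"


def get_default_driver_options(driver: str) -> list[str]:
    """Single source of truth for default driver-specific creation options."""
    options = {
        "GTiff": list(DEFAULT_TIFF_OPTIONS),
        "ENVI": list(DEFAULT_ENVI_OPTIONS),
    }
    return options[driver]
```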
I also meant to comment here: get_raster_driver is a good one to add to the other similar io functions, but here I might actually move to cut the ENVI one and just produce GeoTIFF outputs at this stage. I think Ryan got this ENVI check back when I was halfway out of only setting up ENVI files, way before the wrapper/tophu was set up.
If we do want to accommodate more than GTiff, that would probably make more sense as a larger refactor to pass around DatasetWriters instead of filenames. It would just require more thought about how you would swap the user's DatasetWriter for a temporary one to hold the goldstein output.
* Implement goldstein filtering step for unwrapping workflow
* Fix linter checks, reformat
* Default goldstein filtering to false
* [pre-commit.ci] auto fixes from pre-commit.com hooks (for more information, see https://pre-commit.ci)
* Fix mypy/ruff issues, add type annotations
* Fix numpy typing annotation
* Add goldstein pytest

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
No description provided.