
Conversation

@lacoak21
Contributor

Change Summary

Overview

Update the full/reduced mode calculation when getting the geometric factors at L2.

Summary from Tenzin

Old way: Look at rgfo; if its value is 3, any value of 4 or above should use the reduced gain.

New way: rgfo now always triggers at step 0. Physically, we are not reducing the gain anymore, so we should use the geometric factor as-is for all Lo intensity products. Any place that checks the rgfo value to decide whether to use the reduced gain no longer needs to do so.

Which data are affected? All data since the Nov 24 SCI-LUT version was uploaded.
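The before/after-cutoff selection described above could be sketched roughly like this. All names here are illustrative, not the actual `codice_l2.py` implementation; the cutoff date and the half-spin comparison are inferred from the summary and the test expectations in this PR:

```python
import numpy as np
from datetime import datetime

# Assumed cutoff: the Nov 24 SCI-LUT upload date mentioned in the summary.
CUTOFF = datetime(2024, 11, 24)


def select_geometric_factor(file_date, rgfo_half_spin, half_spin_per_esa_step, lut):
    """Hypothetical sketch of the date-based full/reduced mode selection.

    lut maps "full" and "reduced" to (n_esa_step, n_spin_sector) arrays;
    half_spin_per_esa_step is a (n_esa_step,) array of half-spin counts.
    """
    if file_date >= CUTOFF:
        # New way: rgfo always triggers at step 0 and the gain is never
        # physically reduced, so always use the full geometric factor.
        return lut["full"]
    # Old way: steps whose half-spin count reaches rgfo_half_spin use
    # the reduced-gain geometric factor.
    use_reduced = half_spin_per_esa_step >= rgfo_half_spin
    return np.where(use_reduced[:, np.newaxis], lut["reduced"], lut["full"])
```

With the test-style LUT (full = zeros, reduced = ones), a pre-cutoff date and rgfo_half_spin = 2, steps with half_spin = 1 stay full and steps with half_spin = 2 go reduced; a post-cutoff date returns the full LUT unconditionally.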

Updated Files

  • imap_processing/codice/codice_l2.py
    • Update mode calculation

Testing

Fix tests.


Copilot AI left a comment


Pull request overview

This PR updates the mode calculation logic for geometric factors in CoDICE L2 processing. The change implements date-based conditional logic to handle a transition in how reduced/full gain modes are determined, splitting behavior before and after November 24, 2024 (though the code currently says 2025).

Changes:

  • Added date-based conditional logic for computing geometric factors
  • Updated mode calculation to use different algorithms before/after a cutoff date
  • Modified test fixtures and values to align with the new logic

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 10 comments.

  • imap_processing/codice/codice_l2.py — Added datetime import and implemented date-based conditional logic in the compute_geometric_factors function
  • imap_processing/tests/codice/test_codice_l2.py — Updated test fixture values and test parameters to work with the new mode calculation logic
Comments suppressed due to low confidence (1)

imap_processing/tests/codice/test_codice_l2.py:155

  • All test cases use dates before the date switch (20250101), so the new mode calculation logic that applies after November 24th, 2024 (or 2025 if not corrected) is not being tested. Add test cases with dates after the switch date to verify the new behavior works correctly.
def test_compute_geometric_factors_all_full_mode(mock_half_spin_per_esa_step):
    # rgfo_half_spin = 4 means all half_spin values (1 or 2) are < rgfo_half_spin
    dataset = xr.Dataset(
        {
            "rgfo_half_spin": (("epoch",), np.array([4, 4])),
            "half_spin_per_esa_step": (("esa_step",), mock_half_spin_per_esa_step),
        },
        attrs={"Logical_file_id": "imap_codice_l1b_lo-sw-angular_20250101_v001"},
    )
    geometric_factor_lut = {
        "full": np.zeros((128, 24)),
        "reduced": np.ones((128, 24)),
    }
    result = compute_geometric_factors(dataset, geometric_factor_lut)

    # Expect "full" values everywhere
    expected = np.full((2, 128, 24), 0)
    np.testing.assert_array_equal(result, expected)


def test_compute_geometric_factors_all_reduced_mode(mock_half_spin_per_esa_step):
    # rgfo_half_spin = 1 means all half_spin values (>=1) are >= rgfo_half_spin
    dataset = xr.Dataset(
        {
            "rgfo_half_spin": (("epoch",), np.array([1])),
            "half_spin_per_esa_step": (("esa_step",), mock_half_spin_per_esa_step),
        },
        attrs={"Logical_file_id": "imap_codice_l1b_lo-sw-angular_20250101_v001"},
    )
    geometric_factor_lut = {
        "full": np.zeros((128, 24)),
        "reduced": np.ones((128, 24)),
    }
    result = compute_geometric_factors(dataset, geometric_factor_lut)

    # Expect "reduced" values everywhere
    expected = np.full((1, 128, 24), 1)
    np.testing.assert_array_equal(result, expected)


def test_compute_geometric_factors_mixed(mock_half_spin_per_esa_step):
    # rgfo_half_spin = 2: half_spin=1 steps stay full, half_spin=2 steps go reduced
    dataset = xr.Dataset(
        {
            "rgfo_half_spin": (("epoch",), np.array([2])),
            "half_spin_per_esa_step": (("esa_step",), mock_half_spin_per_esa_step),
        },
        attrs={"Logical_file_id": "imap_codice_l1b_lo-sw-angular_20250101_v001"},
    )
    geometric_factor_lut = {
        "full": np.zeros((128, 24)),
        "reduced": np.ones((128, 24)),
    }
    result = compute_geometric_factors(dataset, geometric_factor_lut)

    # ESA steps 0-63 (half_spin=1): 1 < 2 → mode=full → 0
    # ESA steps 64-127 (half_spin=2): 2 >= 2 → mode=reduced → 1
    expected = np.repeat(np.array([[[0]] * 64 + [[1]] * 64]), 24, -1)
    np.testing.assert_array_equal(result, expected)


@laspsandoval
Contributor

laspsandoval commented Jan 21, 2026

I also saw this error:

Using the new science lut:
imap_codice_l1a-sci-lut_20251218_v001.json

I am getting an error on line 224 in codice_l1a_lo_species.py:
l1a_dataset["acquisition_time_per_step"] = xr.DataArray(
    calculate_acq_time_per_step(sci_lut_data["lo_stepping_tab"]),
    dims=("esa_step",),
    attrs=cdf_attrs.get_variable_attributes(
        "acquisition_time_per_step", check_schema=False
    ),
)

E xarray.structure.alignment.AlignmentError: cannot reindex or align along dimension 'esa_step' because of conflicting dimension sizes: {104, 128} (note: an index is found along that dimension with size=128)

venv/lib/python3.12/site-packages/xarray/structure/alignment.py:451: AlignmentError
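The size mismatch (104 vs 128) means calculate_acq_time_per_step is returning fewer values than the existing esa_step index has. A defensive check along these lines (purely illustrative, not the project's code) would surface the mismatch with a clearer message before xarray raises its AlignmentError:

```python
import numpy as np


def check_esa_step_length(values, expected_len=128):
    # Illustrative guard: xarray aligns new variables against existing
    # dimension indexes, so assigning a 104-long array to a 128-long
    # 'esa_step' dimension raises an AlignmentError. Failing early with
    # an explicit message makes the LUT/dataset mismatch obvious.
    values = np.asarray(values)
    if values.shape[0] != expected_len:
        raise ValueError(
            f"acquisition_time_per_step has length {values.shape[0]}, "
            f"but the 'esa_step' dimension has length {expected_len}"
        )
    return values
```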

But maybe this is for another PR.


@laspsandoval left a comment


Looks good to me. If you are uncertain then wait for Tenzin's review, though, since she knows the code better. But this is what Michael was talking about at the previous meeting so from my perspective it looks good.

@lacoak21
Contributor Author

> Looks good to me. If you are uncertain then wait for Tenzin's review, though, since she knows the code better. But this is what Michael was talking about at the previous meeting so from my perspective it looks good.

Michael, Tenzin and Joey looked at it this morning, so I feel pretty confident that this is the correct way. The only issue I'm struggling with is the failing I-ALiRT test. I am using the Logical_file_id attribute to get the start date of the file so I can check which method to use, but the I-ALiRT dataset doesn't have it. I'm hesitant to just add it to the test dataset in case it will not be there in production.
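For reference, pulling the start date out of a Logical_file_id like the ones in the test fixtures could look like this. This is a hypothetical helper, assuming the date is always the second-to-last underscore-separated token of the file id:

```python
from datetime import datetime


def file_start_date(logical_file_id):
    """Hypothetical helper: extract the YYYYMMDD start date from a
    Logical_file_id such as 'imap_codice_l1b_lo-sw-angular_20250101_v001'.
    """
    # Second-to-last underscore token is assumed to be the date stamp.
    date_token = logical_file_id.split("_")[-2]
    return datetime.strptime(date_token, "%Y%m%d")
```

This is exactly the piece that breaks for I-ALiRT, since that dataset carries no Logical_file_id attribute at all.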

@laspsandoval
Contributor

> Looks good to me. If you are uncertain then wait for Tenzin's review, though, since she knows the code better. But this is what Michael was talking about at the previous meeting so from my perspective it looks good.
>
> Michael, Tenzin and Joey looked at it this morning, so I feel pretty confident that this is the correct way. The only issue I'm struggling with is the failing I-ALiRT test. I am using the Logical_file_id attribute to get the start date of the file so I can check which method to use, but the I-ALiRT dataset doesn't have it. I'm hesitant to just add it to the test dataset in case it will not be there in production.

You are correct. I-ALiRT will not have a logical_file_id. Would it be possible to use the epoch midpoint in that case?
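A rough sketch of the epoch-midpoint approach could look like the following. It assumes epoch values are TT2000-style nanoseconds since the J2000 epoch (2000-01-01 12:00:00) and ignores leap-second corrections, which don't matter for a day-level cutoff; none of these names come from the actual codebase:

```python
import numpy as np
from datetime import datetime, timedelta

# Assumed epoch reference: J2000 (2000-01-01 12:00:00), per TT2000 convention.
J2000 = datetime(2000, 1, 1, 12)


def epoch_midpoint(epoch_ns):
    """Approximate datetime at the midpoint of an epoch array (ns since J2000)."""
    mid_ns = (int(epoch_ns[0]) + int(epoch_ns[-1])) // 2
    return J2000 + timedelta(seconds=mid_ns / 1e9)


def use_new_mode(epoch_ns, cutoff=datetime(2024, 11, 24)):
    """True if the dataset's midpoint falls on or after the assumed cutoff."""
    return epoch_midpoint(epoch_ns) >= cutoff
```

Using the midpoint rather than the first epoch sample guards against a stray early timestamp at the start of the file flipping the decision.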

@lacoak21
Contributor Author

> You are correct. I-ALiRT will not have a logical_file_id. Would it be possible to use the epoch midpoint in that case?

Sure, I'll just do that for all of them.
