
ASTM F3548 flight planners preparation does not verify USSs are clearing their area in the correct Interoperability Ecosystem/DSS #902

Open
wing-utm-sharing-airspace opened this issue Jan 23, 2025 · 1 comment
Labels: automated-testing (Related to automated testing tools), test-scenario-behavior

Comments

@wing-utm-sharing-airspace

Observed behavior
A USS can pass the PrepareFlightPlanners Clear Area test case even if they are talking to the wrong DSS deployment when clearing their area of operational intents.

Example: DroneUp had a misconfiguration where they were pointing to an old, non-pooled instance of the Wing DSS after all participants had been instructed to point to the new pooled DSS deployment.

This issue went undetected for about a week by InterUSS Automated Tests until Wing turned down the old DSS deployment completely, which resulted in a test failure.

Passing test report from 1/22, after DSS polling was launched in the qual environment: e5390b3c-cd7d-4ab4-a3c0-12071f34293b.zip
Test report where DroneUp began causing a failure after the old Wing DSS instance was turned down: 272a4dfe-a322-4f74-88a6-1b95f7f8f9bd.zip

Test check
Prepare Flight Planners - Clear Area - step 34 in test report e5390b3c-cd7d-4ab4-a3c0-12071f34293b attached above

Difference from expected behavior
If a USS is not configured to talk to the same DSS deployment as the other participants in the test, it would be unable to actually clear its area. Can test coverage be added to check that all USSs are talking to the appropriate DSS for the test configuration?

@BenjaminPelletier (Member)

In e539, droneup_uss is only in flight_planners_to_clear, not flight_planners. flight_planners_to_clear are not primary subjects of the test -- they are participants who may interfere with the test, so the PrepareFlightPlanners scenario attempts to mitigate that interference by asking them to remove any of their flights that may be in the area, but those participants are not actually intended to be tested. This is likely why there is no tested-requirements output for DroneUp in e539 (because they are not under test with that configuration).

The way to catch DroneUp's configuration error at the time of e539 would be to test DroneUp -- specifically, to include droneup_uss in flight_planners. Existing tests are already capable of detecting that misconfiguration, as the nominal planning scenario (and other planning scenarios) verify that operational intents are managed correctly in the DSS when flights are planned.
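Roughly (and illustratively -- the key names, URLs, and participant IDs below are placeholders for this sketch, not excerpts from the actual e539 configuration), that distinction corresponds to which resource a participant is declared in within a uss_qualifier YAML configuration; moving droneup_uss into the flight_planners resource is what would make it a tested participant:

```yaml
# Illustrative sketch only; exact schema details may differ from the real configuration.
resources:
  resource_declarations:
    flight_planners:
      # Primary test subjects: planning scenarios run against these participants
      # and would detect a USS pointed at the wrong DSS.
      specification:
        flight_planners:
          - participant_id: wing_uss
            v1_base_url: https://wing.example.com/flight_planning/v1
          - participant_id: droneup_uss   # adding DroneUp here makes it a tested participant
            v1_base_url: https://droneup.example.com/flight_planning/v1
    flight_planners_to_clear:
      # Clear-only participants: asked to remove flights from the area as a courtesy,
      # but no requirements are verified against them.
      specification:
        flight_planners: []
```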

In 272a, a failure is detected in the PrepareFlightPlanners scenario, and the current prescription when that happens is to abort the test run. The simple solution here would likely be to recognize that droneup_uss is not functioning properly and remove it from the set of participants whose flights uss_qualifier attempts to clear (i.e., remove it from flight_planners_to_clear). It is already not a primary subject of the test -- this removal merely tells uss_qualifier not to perform the courtesy attempt to clear airspace (which should already be clear if everything is working properly) for that participant.

on_failure for that scenario could be changed to Continue, but that would mean normal tests would proceed even when there is an existing operational intent known to be blocking the airspace -- that is generally contrary to the InterUSS practice of not performing tests that are expected to fail due to an earlier failure, as that makes the root problem harder to identify.
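For 272a, the corresponding configuration change goes the other way: drop droneup_uss from the clear-only list so the preparation step no longer depends on it. Again a sketch under the same assumptions about key names:

```yaml
# Illustrative sketch only.
resources:
  resource_declarations:
    flight_planners_to_clear:
      specification:
        flight_planners: []   # droneup_uss removed; uss_qualifier no longer asks it to clear the area
```

Keeping the scenario's current abort-on-failure behavior rather than switching to Continue preserves the practice of stopping when the airspace cannot be confirmed clear, for the reasons above.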

> This issue went undetected for about a week by InterUSS Automated Tests until Wing turned down the old DSS deployment completely, which resulted in a test failure.

The non-detection was due to the test configuration indicating not to test the misconfigured participant; if the test configuration had indicated that uss_qualifier (as-is) should test the misconfigured participant, the problem would have been detected. The later test failure occurred because the test configuration indicated that the misconfigured participant is someone who should have their flights cleared, and then they started failing to clear their flights. This could be resolved by instead indicating in the test configuration that uss_qualifier should not attempt to clear the misconfigured participant's flights.

> If a USS is not configured to talk to the same DSS deployment as the other participants in the test, it would be unable to actually clear its area.

The problem here is that even a misconfigured participant can achieve the measurable outcome of a clear area in many cases, simply because there are no flights to clear. The purpose of this scenario is not to detect a failure of the requirement to be connected to the correct DSS; the purpose is to ensure a clear area for testing. The requirement to be connected to the correct DSS is tested in the various planning scenarios.

> Can test coverage be added to check that all USSs are talking to the appropriate DSS for the test configuration?

There is no need to add coverage in general, as it already exists -- in test scenarios that were simply not performed on droneup_uss because it was not indicated as a flight_planner. I do not think such a check could reasonably be performed in the PrepareFlightPlanners scenario, but we're open to suggestions for how that could be accomplished within the scope of that scenario.

As a historical side note, the separate flight_planners_to_clear was added in response to a request for the ability to ask some participants only to clear airspace rather than participate in the full test. Users can choose not to use this feature and instead have participants either be fully tested or not present.
