
Add metrics for successful/failed Spark index creation #837

Draft
wants to merge 10 commits into main

Conversation


@Swiddis Swiddis commented Oct 30, 2024

Description

Success and failure metrics for spark index creation -- note that "failure" here means an execution failure (e.g. Spark couldn't execute the query), not validation failure (e.g. the index already exists).

E.g., the success metric emitted after creating a skipping index (screenshot omitted).
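The success/failure distinction in the description might look roughly like the following. This is a minimal sketch, not the PR's actual implementation: `Metadata`, the counter map, and the metric names are illustrative stand-ins for the real Flint types and metrics backend.

```scala
import scala.collection.mutable

// Stand-in for the real metrics backend: a plain in-memory counter map.
object Metrics {
  val counters = mutable.Map.empty[String, Long].withDefaultValue(0L)
  def inc(name: String): Unit = counters(name) = counters(name) + 1
}

// Illustrative stand-in for the real Flint index metadata.
case class Metadata(kind: String)

def createIndex(metadata: Metadata, alreadyExists: Boolean, doCreate: () => String): Option[String] = {
  if (alreadyExists) {
    // Validation failure (index already exists): no metric is emitted,
    // per the PR description.
    None
  } else {
    try {
      val result = doCreate()
      Metrics.inc(s"${metadata.kind}.create.success")
      Some(result)
    } catch {
      case e: Exception =>
        // Execution failure (e.g. Spark could not run the query):
        // count it, then propagate so callers still see the error.
        Metrics.inc(s"${metadata.kind}.create.failure")
        throw e
    }
  }
}
```

Only the try/catch around the actual creation call counts toward the failure metric; the early validation return never touches the counters.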

Related Issues

N/A

Check List

  • Updated documentation (docs/ppl-lang/README.md)
  • Implemented unit tests
  • Implemented tests for combination with other commands
  • New added source code should include a copyright header
  • Commits are signed per the DCO using --signoff

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
For more information on following Developer Certificate of Origin and signing off your commits, please check here.

Signed-off-by: Simeon Widdis <sawiddis@amazon.com>
Comment on lines 50 to 55
val result = doCreate(spark, metadata)
emitIndexCreationStatusMetric(metadata, success = true)
Some(result)
} catch {
  case e: Exception =>
    emitIndexCreationStatusMetric(metadata, success = false)
Collaborator
Could you double-check that this is the right place? I recall this factory is actually a deserializer from Flint metadata to the index class; you may want to intercept the API call in FlintSpark instead. Also, for all metrics-related logic, please consider an AOP style rather than inserting the logic in many places.

Author

Moved the metric collection to the new location. I experimented with an AOP-like approach, but it was cumbersome: wrapping the existing logic in spark/refresh introduces a circular dependency, and because the callable implicitly throws Exception, error handling becomes awkward when trying to build a general-purpose equivalent of what exists on the Scala side.
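The AOP-style alternative discussed above could be sketched as a single higher-order wrapper, so the metric logic lives in one place rather than at every call site. This is purely illustrative (the names `MetricAspect`, `withStatusMetrics`, and the `emit` callback are hypothetical); per the reply above, the PR did not ultimately take this route.

```scala
// Hypothetical AOP-style wrapper: one function owns the
// success/failure accounting for any body passed to it.
object MetricAspect {
  def withStatusMetrics[T](successMetric: String,
                           failureMetric: String,
                           emit: String => Unit)(body: => T): T =
    try {
      val result = body
      emit(successMetric)
      result
    } catch {
      case e: Exception =>
        // Count the execution failure, then rethrow unchanged.
        emit(failureMetric)
        throw e
    }
}
```

A call site would then shrink to something like `MetricAspect.withStatusMetrics("create.success", "create.failure", emitter) { doCreate(spark, metadata) }`, at the cost of the dependency and checked-exception friction noted above.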

@Swiddis Swiddis marked this pull request as draft November 7, 2024 18:41