
Conversation

@jaminthorns
Contributor

@jaminthorns jaminthorns commented Feb 21, 2025

Upon re-reading #77, I'm pretty sure that I implemented the measurement filtering functionality in `Telemetry.Metrics.ConsoleReporter` incorrectly by passing the entire measurements map instead of just the measurement for the current metric. The typespec for `keep` and `drop` aligned with this implementation, but this test did not:

test "using event filter that evaluates both metadata and measurement" do
  metric =
    apply(Metrics, unquote(metric_type), [
      "my.repo.query",
      [keep: &(match?(%{repo: :my_app_read_only_repo}, &1) and &2 > 100)]
    ])

  assert metric.keep.(%{repo: :my_app_read_only_repo}, 200)
  refute metric.keep.(%{repo: :my_app_read_only_repo}, 50)
end

There was never a test for the reporter implementation, which is why this discrepancy wasn't caught. This PR makes it so that the `keep`/`drop` functions receive only the measurement value specified for the metric.
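The corrected behavior amounts to extracting the metric's own measurement from the measurements map before invoking the filter. A minimal sketch of that logic, assuming a plain map stands in for the metric struct (the `keep_event?/3` and `extract_measurement/2` helpers are hypothetical, not the actual reporter code):

```elixir
defmodule FilterSketch do
  # Hypothetical sketch: pull out the single measurement for this
  # metric, then pass just that value (not the whole map) to keep/2.
  def keep_event?(metric, measurements, metadata) do
    measurement = extract_measurement(metric, measurements)
    metric.keep.(metadata, measurement)
  end

  # A metric's :measurement may be a key into the measurements map
  # or a function deriving a value from the whole map.
  defp extract_measurement(metric, measurements) do
    case metric.measurement do
      fun when is_function(fun, 1) -> fun.(measurements)
      key -> measurements[key]
    end
  end
end

metric = %{measurement: :duration, keep: fn _meta, value -> value > 100 end}
FilterSketch.keep_event?(metric, %{duration: 200, queue_time: 5}, %{})  # → true
FilterSketch.keep_event?(metric, %{duration: 50}, %{})                  # → false
```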

@codecov-commenter

codecov-commenter commented Feb 21, 2025

⚠️ Please install the Codecov GitHub app to ensure uploads and comments are reliably processed by Codecov.

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 99.32%. Comparing base (ab86164) to head (d13d4df).
⚠️ Report is 10 commits behind head on main.
❗ Your organization needs to install the Codecov GitHub app to enable full functionality.

Additional details and impacted files


@@             Coverage Diff             @@
##              main     #118      +/-   ##
===========================================
- Coverage   100.00%   99.32%   -0.68%     
===========================================
  Files            2        2              
  Lines          145      149       +4     
===========================================
+ Hits           145      148       +3     
- Misses           0        1       +1     
Files with missing lines                     Coverage            Δ
lib/telemetry_metrics.ex                     98.95% <ø>          (-1.05%) ⬇️
lib/telemetry_metrics/console_reporter.ex    100.00% <100.00%>   (ø)

Continue to review full report in Codecov by Sentry.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 74d186e...d13d4df. Read the comment docs.


I was incorrectly passing the entire `measurements` map instead of the
specific `measurement` for a metric to the `keep`/`drop` callbacks. This
fixes that issue and adds a test to verify.
@josevalim
Contributor

I wonder if we should pass all of the measurements as-is and update the docs instead. Passing all measurements definitely provides more context and would make for a richer API after all.

@jaminthorns
Contributor Author

Yeah, the only place I thought it might get clumsy is when a user provides a `:measurement` function: they'd have to repeat the logic in `:keep`/`:drop`, like this:

summary("http.request.stop.rate",
  measurement: fn %{duration: duration, response_size: response_size} -> response_size / duration end,
  keep: fn %{duration: duration, response_size: response_size} -> (response_size / duration) > 1000 end
)

Which could of course be factored out like:

rate = fn %{duration: duration, response_size: response_size} -> response_size / duration end

summary("http.request.stop.rate",
  measurement: rate,
  keep: fn measurements -> rate.(measurements) > 1000 end
)

Versus what I was going for:

summary("http.request.stop.rate",
  measurement: fn %{duration: duration, response_size: response_size} -> response_size / duration end,
  keep: fn rate -> rate > 1000 end
)

But I think your suggestion makes more sense, since there could still be cases where other measurements factor into whether a user would want to filter an event out. We should probably err on the side of making the API more powerful (as it is now).
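That last point can be made concrete: with the whole measurements map available, a filter can consult measurements other than the one the metric summarizes. A hypothetical sketch (the measurement names and threshold are invented for illustration):

```elixir
# Hypothetical filter: keep only slow requests that actually carried
# a response body. Deciding this needs :response_size, a measurement
# other than the :duration this metric would summarize - which is only
# possible when keep/1 receives the full measurements map.
keep = fn %{duration: duration, response_size: response_size} ->
  duration > 1_000 and response_size > 0
end

keep.(%{duration: 2_000, response_size: 512})  # → true
keep.(%{duration: 2_000, response_size: 0})    # → false
```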

I'll make another PR to update the docs. Thanks for the input!

josevalim pushed a commit that referenced this pull request Aug 12, 2025
The documentation and tests for the `:keep` and `:drop` metric options
incorrectly suggested that the optional 2nd argument for both options
(introduced in 3fde830) was a single `measurement` value instead of a
`measurements` map.

See #118
