Metrics calculated with filters end up in having conflicts among them #25455
Comments
On further inspection, the drill-down to row-level is also not working.
@marcoruggine There's a separate issue for the drill-through: #6010
@marcoruggine - we're running into the same issue. The issue @flamber mentioned is corrected and merged to master. Upon inspecting the generated SQL, it seems the filters specified on the metric are simply applied to the WHERE clause of the query. I was a little surprised it would do that; I assumed it would generate some sort of CASE statement for the column being aggregated and then, when drilled, actually apply the filter. @flamber - is this limitation known?
@rcronin The idea was to completely revamp Metrics https://www.metabase.com/roadmap - I cannot give a timeline. |
Really could use this fix. Can someone please share a timeline? |
@ccaldicott1 I would suggest creating metrics in a model or a view in the database instead.
Paused as per this comment. |
In its current state, Metrics aren't supposed to be composed or combined. Users shouldn't be able to create metrics from other metrics (composition) or add multiple metrics to the same query (combination). We run the aggregation pipeline in a single stage, meaning that we have only a single WHERE clause, which leads to conflicts. Given our approach to Metrics and the desire to rework the feature into something much more powerful, the right fix today is to disable Metric combination and composition for end users. As we work on Metrics v2, supporting these cases will be a requirement.
Unassigning myself. Context in this slack thread. |
Removing this issue from the board as there's nothing left to do on the BE, but not closing since it will exist until #36422 is fixed |
Well then, should we be expecting a new, more powerful version of metrics? |
@marcoruggine Metrics 2.0 is on the public roadmap, yes! |
Describe the bug
We are currently introducing Metrics in our environment and have discovered that when more than one Metric is applied to a question (e.g. a pivot with 2+ Metrics based on the same model), they end up conflicting and giving wrong results.
From some manual analysis, it seems to us that the conflict arises between the filters applied in the calculated Metrics.
Example:
Completed Payments = distinct(payment_id) filtered by is_refunded = false
Refunded Payments = distinct(payment_id) filtered by is_refunded = true
If we add both to a chart/table to see the evolution of both over time, we end up with no results, because no payment can satisfy both is_refunded = false and is_refunded = true at the same time.
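The conflict described above can be reproduced directly in SQL. The following is a minimal sketch, assuming a hypothetical `payments` table; it is not Metabase's actual generated SQL, only an illustration of what merging both metric filters into a single WHERE clause produces:

```python
import sqlite3

# Hypothetical payments table for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (payment_id INTEGER, is_refunded BOOLEAN);
    INSERT INTO payments VALUES (1, 0), (2, 0), (3, 1);
""")

# Both metric filters land in the same WHERE clause, so the
# predicates contradict each other and no row can match.
row = conn.execute("""
    SELECT COUNT(DISTINCT payment_id)
    FROM payments
    WHERE is_refunded = 0 AND is_refunded = 1
""").fetchone()
print(row[0])  # 0 -- no payment is both refunded and not refunded
```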
To Reproduce
Steps to reproduce the behavior:
Expected behavior
What we would expect is that filters explicitly applied to a Metric stay within that Metric and aren't applied at the whole-query level.
One of the most important points of having Metrics is that we can calculate, in the same question from the Notebook, different things that would normally require SQL because of these case-when filters.
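The case-when approach mentioned above can be sketched as conditional aggregation: each metric's filter stays local to its own aggregate, so both metrics can be computed in one query. The table and data are made up for illustration:

```python
import sqlite3

# Hypothetical payments table for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE payments (payment_id INTEGER, is_refunded BOOLEAN);
    INSERT INTO payments VALUES (1, 0), (2, 0), (3, 1);
""")

# CASE WHEN returns NULL for non-matching rows, and COUNT(DISTINCT ...)
# ignores NULLs, so each count only sees its own metric's rows.
completed, refunded = conn.execute("""
    SELECT
        COUNT(DISTINCT CASE WHEN is_refunded = 0 THEN payment_id END),
        COUNT(DISTINCT CASE WHEN is_refunded = 1 THEN payment_id END)
    FROM payments
""").fetchone()
print(completed, refunded)  # 2 1
```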
Screenshots
Information about your Metabase Installation:
v0.44.3
Severity
This is actually blocking the whole process of introducing Metabase to the company.
We need a single source of truth; Metrics are what we need to avoid endless repetition (and mistakes) from people adding the same Metric to a dashboard or analysis.