Hi there, Community
I am encountering an issue with Quiver time-series aggregations.
Steps to reproduce:
- Create a time-series plot
- Create an event set (e.g. via a Time-series search with a condition such as time series value > 0)
- Create an ‘event statistics’ plot from the event set
Of the time-series data points falling within the event, either the first or the last point is omitted from the aggregation. This can be verified by setting either ‘first point’ or ‘last point’ as the aggregation method.
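To make concrete what I'm checking, here's a toy Python sketch of what I'd expect those two aggregation methods to return for one event's points (the data and names are made up, this is not Quiver internals):

```python
# Toy sketch (plain Python, not Quiver internals) of the expected behavior
# of the two aggregation methods over the points inside a single event.
event_points = [(1, 2.5), (2, 3.1), (3, 1.8)]  # hypothetical (timestamp, value) pairs

first_point = event_points[0][1]   # expected ‘first point’ aggregation -> 2.5
last_point = event_points[-1][1]   # expected ‘last point’ aggregation -> 1.8
print(first_point, last_point)
# What I observe instead is that one of these endpoint values goes missing,
# as if the aggregation only saw event_points[1:] or event_points[:-1].
```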
Hints appreciated! 
Thank you,
Hi @n0derat,
Welcome to Quiver! I investigated the issue you flagged and didn’t find a problem with the ‘first point’ or ‘last point’ aggregations for the event statistics card. I think there may be some confusion around the behavior of the events plot visualization. I would suggest setting interpolation to NONE for the input time series and then comparing what you’d expect the first and last points of an event to be.
It might help to think about the events plot visualization this way: when a point in the input time series is found where the search condition is true, the event “opens”. The event “closes” at a subsequent point for which the search condition is false. This means that what might appear at first glance to be the last point in an event may not actually be part of the event (check whether it satisfies the search condition). The following screenshot shows what I mean.
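If it helps, here's a rough Python sketch of these open/close semantics. It's only an illustration of how I think about it, not Quiver's actual implementation, and the series data is made up:

```python
# A rough sketch of the open/close semantics described above, in plain Python.
# Illustration only, not Quiver's implementation.

def extract_events(series, condition):
    """Group consecutive points that satisfy `condition` into events."""
    events = []
    current = None
    for ts, value in series:
        if condition(value):
            if current is None:
                current = []          # the event "opens" at this point
            current.append((ts, value))
        elif current is not None:
            events.append(current)    # the event "closes" at this point,
            current = None            # which is excluded from the event
    if current is not None:
        events.append(current)        # series ended while an event was open
    return events

# Hypothetical series with interpolation NONE; value > 0 defines the event.
series = [(0, 0.0), (1, 2.5), (2, 3.1), (3, 0.0), (4, 1.2)]
for event in extract_events(series, lambda v: v > 0):
    print("first point:", event[0], "last point:", event[-1])
# Output:
# first point: (1, 2.5) last point: (2, 3.1)
# first point: (4, 1.2) last point: (4, 1.2)
```

Note that the point (3, 0.0) closes the first event but is not part of it, so a ‘last point’ aggregation over that event returns 3.1 rather than 0.0, even though 0.0 might look like the event's last point on the plot.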
I acknowledge that this behavior can be confusing and have flagged it to the team. If you’re noticing other issues with event statistics, I’d be happy to investigate further if you can provide more information.
Thank you for investigating. I understand the event opening and closing exactly as you described. It's strange: I had interpolation set to None and the same filter-based event classification as you, yet the first or last point was still lost in the statistics. The points did satisfy the event classification criteria, and whether it was the first or the last point that went missing seemed random.