
Commit 309fa5e

fill out consolidation page
1 parent db26660 commit 309fa5e

1 file changed: +39 -1 lines changed


docs/concepts/consolidation.md

Lines changed: 39 additions & 1 deletion
## Overview

* Consolidation is automatically applied based upon the size of the query window.
* There is a hard maximum of 1,440 datapoints per line, which is one day of data with a 1-minute step size.
* There must be at least 1 pixel per datapoint.
* The unit is preserved across consolidation.
* The consolidation function will match the aggregation function used (i.e. avg for avg/sum, max for max).
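
To make these limits concrete, here is a minimal sketch, not the actual rendering code, of how a step size could be chosen from the query window and the graph width. The step ladder and pixel widths below are illustrative assumptions:

```python
from datetime import timedelta

# Illustrative ladder of allowed step sizes; the real set may differ.
STEP_LADDER = [timedelta(minutes=m) for m in (1, 2, 5, 10, 20, 30, 60)]

MAX_DATAPOINTS = 1_440  # hard maximum per line (one day at a 1m step)

def choose_step(window: timedelta, graph_width_px: int) -> timedelta:
    """Pick the smallest allowed step that satisfies both limits:
    at most 1,440 datapoints and at least 1 pixel per datapoint."""
    max_points = min(MAX_DATAPOINTS, graph_width_px)
    for step in STEP_LADDER:
        if window / step <= max_points:
            return step
    return STEP_LADDER[-1]

# A 3h window on a default-width canvas fits at a 1m step (180 points),
# while a 24h window on a small canvas is pushed up to a 5m step (288 points).
print(choose_step(timedelta(hours=3), graph_width_px=800))   # 0:01:00
print(choose_step(timedelta(hours=24), graph_width_px=400))  # 0:05:00
```

Rounding up to the next allowed step keeps a line under both limits while staying on familiar boundaries such as 1m, 5m, and 10m.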
## Further Explanation

* Step size will change based on the amount of data requested.
    * For short time windows, with graphs at the default width, you will get a 1m step size up to roughly 12 hours.
    * For longer time windows, such as multiple days, the step size can increase to 2m, 5m, 10m, or more.
    * The step size can also change as the graph width changes; this is most often noticed on very wide monitors.
* One pixel per datapoint.
    * In addition to the hard limit on datapoints per line, the pixel width is also considered, in order to be able to render one pixel per datapoint.
    * The maximum number of datapoints for a line is the smaller of 1,440 and the number of pixels that can be visualized.
    * This can be demonstrated by viewing a metric for the last day:
        * If you maximize the graph canvas, the step size switches to 1m.
        * If you minimize the graph canvas, the step size switches to 5m.
* Unit is preserved.
    * The y-axis unit is preserved across any step size changes.
* Consolidation function matches aggregation function (a worked sketch follows this list).
    * Let's say you are looking at a graph of Stream Starts per Second (SPS), which is reported by thousands of instances.
    * This is typically done with a sum aggregation, because you want to see the total value across the fleet.
    * If you look at this data across a three-hour window with a default canvas size, then it will use a 1-minute step size, and no consolidation is applied.
    * If you expand the time range to 24 hours, then the amount of data selected is large enough that it must be combined in order to graph it.
    * The step size changes to 5 minutes, and a consolidation function is applied to combine multiple minutes of data into a single point on the graph.
    * Since the aggregation is a sum in this case, the consolidation is an average. This means that each point on the graph is now the average of the five minutes of data from the previous graph across the same time range.
    * If you had used a max aggregation, then a max consolidation would be applied. The consolidation function is chosen automatically on your behalf, to provide the most correct value on the graph.
    * As long as your graph lines are continuous, you should not notice any significant changes from the application of the consolidation function.
    * If your graph lines are sparsely populated, with many zero or NaN values, then you will see significant changes due to consolidation, because zero or low values will pull down an average.
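
As a rough illustration of the SPS example above, the following sketch consolidates 1-minute values into 5-minute buckets, using avg for a sum-aggregated line and max for a max-aggregated line. The function and the NaN handling are illustrative assumptions, not part of any API:

```python
import math

def consolidate(values, factor, how):
    """Combine `factor` consecutive 1m datapoints into one consolidated point."""
    out = []
    for i in range(0, len(values), factor):
        # Assumption for this sketch: NaN gaps are skipped rather than counted.
        bucket = [v for v in values[i:i + factor] if not math.isnan(v)]
        if not bucket:
            out.append(math.nan)
        elif how == "avg":
            out.append(sum(bucket) / len(bucket))
        elif how == "max":
            out.append(max(bucket))
        else:
            raise ValueError(f"unsupported consolidation function: {how}")
    return out

# A continuous sum-aggregated line: the 5m average stays close to the 1m values.
continuous = [1000.0, 1010.0, 990.0, 1005.0, 995.0]
print(consolidate(continuous, 5, "avg"))  # [1000.0]

# A sparse line that is mostly zeros: a single 1m spike of 1000 is pulled down
# to 200 by avg consolidation, while max consolidation keeps the spike visible.
sparse = [0.0, 0.0, 1000.0, 0.0, 0.0]
print(consolidate(sparse, 5, "avg"))      # [200.0]
print(consolidate(sparse, 5, "max"))      # [1000.0]
```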
