How can one define a variable interval on a time series?


#1

This question stems from the January 2018 C3 IoT Academy.

In my use case, I have a source system that generally outputs data at 30-minute intervals. However, when a certain type of behavior occurs in the system, it switches to sending data at 1-minute intervals across the same channel.

How does the platform handle this? Is there a way to have a variable normalization process for the same data stream, so that the 1-minute data is normalized at 1-minute intervals, the 30-minute data is normalized at 30-minute intervals, and the two results are combined into a single normalized timeseries?
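To make concrete what I'm imagining, here is a minimal sketch in plain pandas. This is not the C3 normalization API; the function name `normalize_variable`, the segment-detection logic, and the toy data are all my own illustration of the behavior I'd like the platform to produce:

```python
import pandas as pd

def normalize_variable(raw: pd.Series) -> pd.Series:
    """Resample each run of the series at its own observed cadence."""
    # Spacing between consecutive raw points (backfill the first point's gap).
    gaps = raw.index.to_series().diff().bfill()
    # A new segment starts wherever the spacing changes. The boundary point
    # at the transition lands in the preceding segment; good enough here.
    segment_id = (gaps != gaps.shift()).cumsum()

    pieces = []
    for _, seg in raw.groupby(segment_id):
        interval = seg.index.to_series().diff().min()  # native cadence of this run
        pieces.append(seg.resample(interval).mean().interpolate())
    return pd.concat(pieces)

# Toy raw stream: 30-minute data, then a burst of 1-minute data.
idx = (pd.date_range("2018-01-01 00:00", periods=4, freq="30min")
         .append(pd.date_range("2018-01-01 02:00", periods=5, freq="1min")))
raw = pd.Series(range(9), index=idx)
print(normalize_variable(raw))
```

The 30-minute stretch comes out normalized at 30 minutes, the 1-minute burst at 1 minute, and the concatenation is the single combined timeseries I described.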

To start answering part of my own question: I know that the raw data is stored exactly as it comes in, and that the normalization process simply generates a normalized timeseries according to my normalization logic and stores it separately from the raw data. I also know that the simple way would be to set the interval of the timeseries to 1 minute for the whole series, but that seems wasteful and clunky (see the sketch below for why).
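For contrast, here is what that "simple way" looks like; again this is plain pandas for illustration, not the platform API, and the toy data matches the sketch above. Every 30-minute stretch balloons into ~30 interpolated points per raw point, which is the waste I'd like to avoid:

```python
import pandas as pd

# Same toy stream: 30-minute data, then a burst of 1-minute data.
idx = (pd.date_range("2018-01-01 00:00", periods=4, freq="30min")
         .append(pd.date_range("2018-01-01 02:00", periods=5, freq="1min")))
raw = pd.Series(range(9), index=idx)

# Force the whole stream onto a uniform 1-minute grid.
uniform = raw.resample("1min").mean().interpolate()
print(len(raw), "raw points ->", len(uniform), "normalized points")
```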

Is this something a custom normalization process that I build myself could handle?

Lastly, I visited the In Depth article on the normalization process at the link below, and the link it contains to the custom normalization documentation was broken (see the screenshot of the article below):
https://[[host_url]].com/api/1/coned/prod/documentation/topic/normalization