MATLAB Answers

How do I extract data from a single field at 12-hour intervals without hitting the 8000-data-point limit?

Asked by allan mitchell on 10 Sep 2018
Latest activity: Commented on by Vinod on 11 Sep 2018
I'm trying to extract readings from a single field on a feed at 12-hour intervals, in CSV format, using a RESTful API GET method.
If I use the following call: http://api.thingspeak.com/channels/XXXXX9/fields/4.csv?start=2017-01-01%2000:00&end=2018-01-01%2023:59&median=daily the returned file has a year's worth of entries, with one reading for each day.
I had expected that if I used http://api.thingspeak.com/channels/XXXXX9/fields/4.csv?start=2017-01-01%2000:00&end=2018-01-01%2023:59&median=720 I would get a year's worth of entries with two readings for each day, but what I actually get is about a week's worth of entries with two readings per day.
I guess that a week's worth of entries at one-minute intervals corresponds to about 8000 data points, which is the download limit. The "daily" parameter seems to overcome this restriction, but I would like to know if there is a way to overcome it using some kind of half-day or quarter-day interval.
Thanks in anticipation.


1 Answer

Answer by Vinod
on 11 Sep 2018
Edited by Vinod
on 11 Sep 2018

Allan,
Can you describe your use case and application in further detail? How will the data that is older than the most recent 8000 points be used? Do the recent points not matter?
Note that when you use median=daily, you are getting downsampled data, not the raw data. Is that sufficient for your application?
Depending on your application I can provide some useful ways to analyze the data.
-Vinod

  4 Comments

Sounds like a reasonable request.
Is this being done "live" or for an offline analysis that is done on demand? If it is offline analysis, could exporting the entire CSV data from the Data Import/Export tab meet your needs?
For online analysis, you could do something like
https://api.thingspeak.com/channels/_CHANNEL_ID_/fields/1.csv?api_key=_YOUR_API_KEY_&start=2017-01-20&results=8000
This will get you 8000 points starting from the start time. Then take the last entry, look at its timestamp, and decide whether you need to request more data (if so, the next request's start would be the last timestamp in the previous response), and so on.
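In Python, that paging loop might be sketched as follows. This is only a sketch: `fetch_page` is a hypothetical stand-in for the HTTP GET against the URL above, assumed to return parsed CSV rows as (timestamp, value) tuples, oldest first:

```python
def fetch_all(fetch_page, start, page_size=8000):
    """Collect a full feed by paging: request up to page_size rows from
    `start`, then continue from the last row's timestamp until a short
    page signals that no more data is available.

    fetch_page(start, page_size) is a hypothetical helper that would
    wrap the GET against the channel's CSV endpoint and return rows as
    (timestamp, value) tuples, oldest first.
    """
    rows = []
    while True:
        page = fetch_page(start, page_size)
        done = len(page) < page_size        # short page => feed exhausted
        # The next request starts at the previous page's last timestamp,
        # so that boundary row comes back twice; drop the duplicate.
        if rows and page and page[0][0] == rows[-1][0]:
            page = page[1:]
        rows.extend(page)
        if done or not page:
            break
        start = rows[-1][0]
    return rows
```

For real use, `fetch_page` would build the URL with the `start` and `results` parameters and parse the returned CSV.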
Another option is
https://api.thingspeak.com/channels/_CHANNEL_ID_/fields/1.csv?api_key=_YOUR_API_KEY_&start=2017-01-20&end=2017-01-21
This should give you the data within the date range (assuming there are fewer than 8000 points in the range).
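For a long range, one way to stay under the limit is to split it into windows that each hold fewer than 8000 points. A sketch, assuming one reading per minute (so a 5-day window is 5 × 1440 = 7200 rows):

```python
from datetime import date, timedelta

def date_windows(first, last, days=5):
    """Split the inclusive date range [first, last] into consecutive
    windows of at most `days` whole days each. At one reading per
    minute, 5 days is 5 * 1440 = 7200 rows, under the 8000-row cap.
    Each (start, end) pair maps onto the start=/end= URL parameters."""
    windows = []
    cur = first
    while cur <= last:
        stop = min(cur + timedelta(days=days - 1), last)
        windows.append((cur, stop))
        cur = stop + timedelta(days=1)
    return windows
```

Each window would then become one request, and the responses are concatenated.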
Finally, what language are you using to do this analysis?
Thanks for this. I think I'm getting the idea that there is not really a way to do this simply.
I'm analysing the data offline so, yes, I can "just" export the entire CSV, but the file is already over 100 MB. As I said in my original question, I was hoping to use https://api.thingspeak.com/channels/_CHANNEL_ID_/fields/1.csv?api_key=_YOUR_API_KEY_&start=2017-01-20&end=2018-01-19 plus another parameter to get the data points that I want.
I guess I'll just write a Lua script to iterate over the date range and extract the data points that I want. It's not hard to do; it's just not a very elegant solution to the problem.
Perhaps for the future I'll copy the 08:00 and 20:00 data points to a separate channel so I can export them more easily. The problem with that is that you don't always know in advance which data points you might want in the future.
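Filtering the full export down to the 08:00 and 20:00 readings could be sketched like this (a Python sketch, assuming the exported CSV has a header row and created_at timestamps like 2017-01-01 08:00:00 UTC in the first column):

```python
import csv
import io

def keep_times(csv_text, times=("08:00", "20:00")):
    """Filter an exported channel CSV down to rows whose created_at
    time-of-day matches one of `times`. Assumes the export layout:
    header row first, created_at like '2017-01-01 08:00:00 UTC'
    in column 0."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    # Characters 11-15 of the timestamp are the HH:MM part.
    kept = [row for row in reader if row[0][11:16] in times]
    return header, kept
```

This works on the exported file after the fact, so there is no need to decide in advance which data points to keep.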
If you are doing offline analysis, CSV export is what I would recommend. If you send me an email at vcherian (at) mathworks (dot) com, I could potentially show you a way to get all your data programmatically - the equivalent of CSV export using the UI.
