I have another thought regarding the Get Logs feature that I'd like to throw out for discussion: I believe there are GSAK users who would be happy to retrieve just the latest logs they don't already have. So one could write a macro which, for a particular geocache:
1. Checks the cache logs to see the date on the most recent log in the database and uses the prior day as a target date
2. Incrementally gets the logs a few at a time until it hits the target log date (a sketch of this loop follows below)
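To make the idea concrete, here is a minimal sketch in Python of the loop such a macro would perform. The `fetch_batch` callable is a hypothetical stand-in for a GcGetLogs-style call; its name, signature, and the shape of the log records are my own assumptions for illustration, not the actual GSAK API:

```python
from datetime import date, timedelta

def get_new_logs(fetch_batch, latest_date_in_db, batch_size=5):
    """Pull logs newest-first in small batches, stopping at the first log
    dated on or before the target date (the day before the newest log we
    already hold, per step 1 above).

    fetch_batch(offset, count) is a hypothetical stand-in for a
    GcGetLogs-style call returning up to `count` logs, newest first."""
    target = latest_date_in_db - timedelta(days=1)  # step 1: prior day
    new_logs, offset = [], 0
    while True:
        batch = fetch_batch(offset, batch_size)
        if not batch:
            return new_logs            # ran out of logs entirely
        for log in batch:
            if log["date"] <= target:
                return new_logs        # step 2: hit the target date, stop
            new_logs.append(log)
        offset += batch_size

# Fake server-side log list (newest first) just to exercise the loop.
logs = [{"date": date(2011, 3, d)} for d in (20, 18, 15, 10, 9, 7, 2)]
fetch = lambda offset, count: logs[offset:offset + count]
print(len(get_new_logs(fetch, latest_date_in_db=date(2011, 3, 10))))  # -> 4
```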
While this is now programmable in a macro using the new feature of the GcGetLogs command, I wonder whether it might be a compelling element to build into the Get Logs feature itself, since that feature has been implicated in GSAK users hammering the API server. It could appear on the dialog as a checkbox: "Stop when most recent log date for cache in database is exceeded".
My sense is that if the logs were retrieved one by one, this would not save anything; but if they were retrieved iteratively in batches of 4 or 5, then compared with retrieving logs with Max logs set to 30 (or bigger), that could generate quite a bit of savings in traffic, particularly for users whose databases are already fairly up to date.
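To put rough numbers on that claim, here is a back-of-the-envelope comparison. It assumes newest-first retrieval, a fixed batch size, and that traffic scales with the number of logs pulled; all of these are simplifying assumptions on my part, not how GSAK or the API necessarily accounts for traffic:

```python
def logs_pulled_batched(new_logs: int, batch_size: int = 5) -> int:
    """Logs pulled when fetching newest-first in fixed-size batches and
    stopping at the first log on/before the target date: every batch that
    is entirely new, plus the one batch that overshoots into known logs."""
    return (new_logs // batch_size + 1) * batch_size

# Database already nearly current: 2 new logs since the last refresh.
print(logs_pulled_batched(2))    # -> 5 logs pulled, vs. 30 with Max logs = 30

# A dozen new logs: still cheaper than a fixed 30-log pull.
print(logs_pulled_batched(12))   # -> 15 logs pulled
```

The savings shrink as the number of new logs approaches the Max logs setting, which is why the benefit would be greatest for users who refresh often.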