Is there a known issue or limit associated with refreshing cache data for large filter sets? It seems to fail consistently when refreshing more than, say, 5,000 caches. For example, I have a database of all caches in Kansas, and when I try to refresh them, with plenty of download balance available, the program makes the full sequence of API calls and then fails when it begins to process the results. The error says something about Babel parsing, then presents a message box and quits.

I'm guessing there isn't a known limit, or the program would simply tell you to reduce the filter set before running (the refresh takes a while). If it isn't a known issue, I can reproduce it and send more info, but I thought I would check here first. I tried replicating it just now, but the geocaching.com site is having issues.
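In the meantime, if it does turn out to be a size limit, I assume I could work around it by splitting the refresh into smaller batches. A rough Python-style sketch of what I mean (the refresh_caches call and the batch size are hypothetical, just to illustrate the chunking idea, not the program's actual API):

```python
# Hypothetical sketch: refresh a large cache list in chunks to stay
# under a suspected per-run limit. refresh_caches() stands in for
# whatever call the program actually makes per filter set.

BATCH_SIZE = 4000  # stay safely below the ~5,000 point where it fails for me

def refresh_in_batches(cache_ids, refresh_caches, batch_size=BATCH_SIZE):
    """Refresh cache_ids one batch at a time instead of all at once."""
    for start in range(0, len(cache_ids), batch_size):
        batch = cache_ids[start:start + batch_size]
        refresh_caches(batch)  # each batch acts like a smaller filter set
```

Doing that by hand (splitting the Kansas database into several smaller filter sets) would presumably sidestep the failure, but it would be good to know whether the limit is real before I reorganize everything.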