* SQLiteSpy
Like the others I mentioned, SQLiteSpy is quick, lightweight and handy. Unfortunately, I could not get the extension to load - I got a "The specified module could not be found" message. I moved the DLL into the program folder, but got the same result.
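For what it's worth, that Windows message often points to a missing dependency of the extension DLL rather than the DLL itself, which would explain why moving the file changes nothing. The load can also be tested outside SQLiteSpy in the plain sqlite3 command-line shell - a minimal sketch, with the database path and extension name as placeholders:
CODE |
.open C:\GSAK\Data\sqlite.db3
.load myextension.dll
-- a missing dependency produces the same
-- "specified module could not be found" error |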
* not sure I completely understand
I think you do, or at least you're headed in the right direction.
* Refreshing the cache will not change anything because...
This is not entirely correct.
When importing a waypoint from a CSV file (or from a GPX file generated from a CSV that has not been refreshed), the coordinate information in the file is written into the waypoint record, but the HasCorrected flag is not set. So...
A - Refresh Cache Data (without HasCorrected). All coordinate information that was imported from the file is replaced by online data. The imported data is lost. Not surprising.
B - Refresh Cache Data (with HasCorrected). Prior to refreshing, all coordinate fields are the same. After refreshing, the cache coordinates in the main window remain as imported, but in the Corrected Coordinate dialog box the upper text field shows the imported data while the lower text field is 'updated' to the 'original' online coordinates.
C - I am not fully aware of the details (yet), but there is a situation in which corrected coordinates are lost during import from a file. A query for inspecting this state directly is sketched after this list.
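This state can be inspected straight from the database. A minimal sketch, assuming the field names used in the GSAK filter below correspond one-to-one to columns of the Caches table in GSAK's sqlite.db3, with a placeholder GC code:
CODE |
-- show the visible and original coordinate pairs plus the flag
-- for a single waypoint (GC12345 is a placeholder)
SELECT Code, Latitude, Longitude,
       LatOriginal, LonOriginal, HasCorrected
FROM Caches
WHERE Code = 'GC12345'; |
Running it before and after a refresh makes scenarios A and B visible directly, without relying on the dialog boxes.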
* When things get this messy there often is no reliable way to sort it out.
It's not as messy as it at first appears. If 5% of the waypoints have reverted to the original coordinates, the problem is more about locating them than about re-setting them. In a list of 1000 waypoints, that's only 50 gone wrong; the hassle is finding the errant rascals. To be clear, the problem is not data integrity but quality assurance and accuracy - something we'd prefer to have if at all possible, and in large databases these sorts of problems are very difficult to detect.
QUOTE |
compare the coordinates in GSAK to the coordinates at Geocaching.com and assume the coordinates in GSAK are corrected if they differ |
That's the direction we're heading in. Only rather than assuming the coordinates are corrected if they differ, we can know for a fact that 'corrected' coordinates in GSAK are mistaken if they match the original. And even then, 'some' mystery caches really are at the posted coordinates, so the matches still need to be manually verified.
The task is to find a way of short-listing a group of candidates from the larger database and only run the comparison on those. To that end, I was given this filter suggestion for GSAK...
CODE |
LatOriginal=Latitude AND lonOriginal=Longitude AND HasCorrected |
... but the results are much different from those of a direct SQL SELECT query.
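For reference, this is roughly the direct query I mean, under the same assumption that the filter's field names map straight onto Caches columns. One candidate explanation for the difference: SQLite compares these fields exactly as stored, so case, formatting or whitespace differences between the coordinate pairs affect the SQL result in a way GSAK's filter may not.
CODE |
-- candidates whose 'corrected' coordinates still match the originals
SELECT Code, Latitude, Longitude
FROM Caches
WHERE Latitude = LatOriginal
  AND Longitude = LonOriginal
  AND HasCorrected = 1;  -- flag storage (1/0 or 'Y'/'N') is an assumption; adjust to suit |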