Edit GPX tracks before upload

Well, my motto is: better bad data than no data, and best of all would be data tagged as bad.

First off, there is bad data that might be useful: data created by bad fixes, device bugs, or signal noise. This kind of data is good to have because it can tell you what error level the GPS might be giving you. So I can say that even though I was on this road, the GPS told me I was on this other road; if you have these sorts of observations, you could theoretically improve your software to give a better location.

Second, there is bad data created by human error: taking a break, going into a house, etc. This kind of data would be wonderful to have in a central repository because you would be able to find places where people stopped for a break. And if you tagged it, you should be able to recognise the same pattern in newly uploaded data and semi-automatically remove it from the DB.

Tired… :wink:

I suppose the big problem is knowing when data is bad.

Strictly, data is data. Period. The thing we are most interested in is accuracy (and its associated partner, precision - but leave that for another day).

In a scientific experiment we design the experiment to focus on the phenomenon we are interested in. The design, the execution, the construction of the apparatus all have a direct bearing on accuracy even though we have previously defined the parameters we are interested in. Statistical methods are therefore employed to be able to state (with an attached degree of confidence) that this hypothesis or this outcome seems to be a satisfactory explanation of the phenomenon under investigation.

In mapping we are at the mercy of trees, rain, snow, the gps equipment itself and so on for obtaining raw data. There is no way there is any parallel with the design of an experiment and so we cannot say that any particular measurement of a position is within a particular degree of accuracy. We are at the receiving end, not the defining end.

So how can we tell if this data is of any use whatsoever? We cannot say that the median of multiple tracks is probably the “correct” route, because everyone who passes that point is shadowed from a part of the sky by this forest, those trees, that building, or whatever; the result is that the trace can be consistently out in one particular direction. So the median isn’t a guide to “best fit”.

Only when we have recourse to an alternative source of verification can we even begin to attach any kind of accuracy to the raw data. Satellite imagery is the obvious one; other maps are another; sketches made during a personal survey are a third (though in the latter case, features can only ever be positioned relative to one another).

Even if I have a very clear idea of the GPS trace in relation to the track I have followed, I cannot say I am on solid ground: the zig-zag trace I obtain clearly does not follow the straight road I walked, so the data is already suspect. But even if I draw the line of best fit through the data points, how do I know whether that whole line should be shifted 10 m north, south, east or west? I don’t.

So we do the best we can, we clean up the worst of the most obvious exceptions and we consign to the future the job of sorting the wheat from the chaff.

Tools that would help: a pre-upload analysis of the GPS trace compared with those already uploaded. A bit like astronomical image-stacking software: we know roughly where it is, and “roughly” will have to do for now.
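To make the image-stacking analogy concrete, here is a rough sketch (the function names and the brute-force grid search are my own invention, not an existing tool): slide the new trace over a reference trace and keep the offset that minimises the mean distance between the traces.

```python
# Sketch of "stacking" a new GPS trace against already-uploaded ones:
# try small (dx, dy) shifts and keep the one that best aligns the traces.
# Coordinates are treated as plain x/y pairs for simplicity.

def mean_gap(track, reference):
    """Mean nearest-neighbour distance from track points to reference points."""
    total = 0.0
    for x, y in track:
        total += min(((x - rx) ** 2 + (y - ry) ** 2) ** 0.5
                     for rx, ry in reference)
    return total / len(track)

def best_offset(track, reference, span=0.001, step=0.0002):
    """Brute-force search for the (dx, dy) shift that best aligns the tracks."""
    best = (0.0, 0.0)
    best_score = mean_gap(track, reference)
    steps = int(span / step)
    for i in range(-steps, steps + 1):
        for j in range(-steps, steps + 1):
            dx, dy = i * step, j * step
            shifted = [(x + dx, y + dy) for x, y in track]
            score = mean_gap(shifted, reference)
            if score < best_score:
                best_score, best = score, (dx, dy)
    return best
```

A real tool would of course be smarter (weighting by point density, sub-step refinement), but even this crude search would flag a trace that is systematically offset from everything uploaded before it.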

The ultimate arbiter? Probably satellite - but we don’t have the rights to use that data freely. At least not yet and not to the precision we’d probably like.


The mean of all the tracks may not be correct, as you said, but it is going to be correct for people using a GPS, since it is the average GPS track. Although the intention should be to get it “really correct” rather than just suitable for GPS use. You would also have to weight each track, though, so that tracks recorded from a slow-moving source were more influential in the average.

But… What I tend to find, as I go through an area of bad reception, is that the path becomes more and more off but retains the general shapes/curves of the route. If I did the route many times from the same point, the tracks would gradually fan out.

Then, if it is particularly bad, I will take that route from the other direction. This gives good reception at both ends. I can draw the shape of the routes from each direction, rotate them to line up with the accurate readings at both ends, and draw the mean. (I wonder if an electronic compass would help here?)
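That “rotate to line up the good endpoints” trick is essentially a similarity transform. A minimal sketch, assuming the first and last fixes of the trace are the ones you trust (complex numbers make the shift-rotate-scale a one-liner):

```python
# Map a trace's first and last points onto two trusted fixes with a
# single shift + rotation + scaling, carrying the shape in between along.

def fit_endpoints(track, good_start, good_end):
    """track: list of (x, y); good_start/good_end: trusted endpoint fixes."""
    a = complex(*track[0])
    b = complex(*track[-1])
    # One complex factor encodes both the rotation and the scaling.
    scale_rot = (complex(*good_end) - complex(*good_start)) / (b - a)
    moved = [complex(*good_start) + scale_rot * (complex(x, y) - a)
             for x, y in track]
    return [(z.real, z.imag) for z in moved]
```

Doing this once per direction of travel and then averaging the two fitted traces is, as far as I can tell, exactly the procedure described above.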

Another giveaway is the spacing of the dots. Firstly, how frequent they are all the way along, i.e. in a car or walking; walking will give better data. But secondly, looking back to see whether the receiver kept a steady fix. So if you have 10 routes scattered around where the road goes, stick with the route that consistently retains its usual frequency.
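That consistency check is easy to automate. A rough sketch (the threshold factor is an arbitrary guess): compare every interval between fixes to the median interval and flag traces with large gaps.

```python
# Sketch of the "consistent fix frequency" check: a trace whose points
# keep arriving at the usual interval probably held its lock; intervals
# much longer than the median hint at dropped fixes.

def kept_lock(timestamps, max_gap_factor=2.5):
    """timestamps: fix times in seconds, in order. True if no interval is
    more than max_gap_factor times the median interval."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    median = sorted(gaps)[len(gaps) // 2]
    return all(g <= max_gap_factor * median for g in gaps)
```

Run over 10 candidate traces for the same road, this would pick out the ones that never lost signal, which are presumably the ones to trust.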

This is all guesswork; it’s not guaranteed, nor the perfect formula which your (ric) post seems to suggest you’re looking for, but using this method I think most roads come out relatively accurate.

For areas like woods, though, which I have found to be the worst, I think the best option is just to map them around this time of year. Avoid having to make sense of a splodge of dots by getting the data before the leaves come out.

I agree, but we don’t know that… Since we have so few tracks, you can only say that it’s correct for people passing at the times when the tracks were recorded. But in some way almost all of my tracks have been OK; I’ve been able to draw good, accurate maps even though the tracks weren’t good. That’s possible to judge now that we have satellite images.

If there were more bad data though… :wink:

I’m new to all this and had a look around on the gpspassion forums for an editor. The best one I’ve found is GPX Editor (http://www.knackes.com/blog/index.php?2008/04/08/174-gpx-editor-0046). There are others listed but it certainly meets your requirement of a decent GUI in Windows. It also has some nice algorithms to clean up the data using various filters (if you decide that’s what you want to do, after reading the discussion on this thread!).


Now I feel a bit bad for imposing my views on people… :wink:

Thanks OliverLondon,
I downloaded the editor. It is what I was looking for. I have to get used to the way it handles the cutting, but I will try to use it.
Thanks again

Take a look at Viking http://viking.sourceforge.net. It works in Linux and Windows (sometimes) and supports three flavors of OSM maps (osmarender, mapnik, maplint) as a background, as well as Google, terraserver, bluemarble, your own georeferenced image, etc. It also supports uploading your GPX track to OSM from within the program!