Merge/correct GPX files


I’ve been walking and driving around the same area several times and therefore have several GPX files of it. Due to GPS errors the tracks don’t all follow the same line, although they overlap at several points, and even coincide along parts of the route.

I wonder if there is software that would merge these tracks. I don’t mean merging them so that I see multiple tracks, but merging them in the sense that overlapping positions (within a metre or so of error) become a single track. This would be really useful for mapping, as it would correct my GPX tracks based on multiple recordings of the same route, giving a precisely positioned track.

Is there any software that will do this for me?

Best Regards

try any app from


Not really…

There are some applications that are really useful if you’ve got a single track, such as filters that remove points with too high a SOG (Speed Over Ground)… Some seem to look at the DOP, which is nice, but my iPhone application won’t save DOP. My intention is therefore to drive/walk/whatever over the same track several times; the positions should be almost identical. So, more or less, what I’m looking for is:

Filter out incorrect data from the GPX file… This is easy with some filters. Knowing two points and the times they were recorded (present in the GPX file), I know the distance and the time taken. For example, 10 km/h would probably be an excessive speed if I tracked the path walking, so such points can be discarded automatically. Knowing two points I can also get my bearing. If I know my sharpest turn was around 90 degrees, I could safely state that any bearing change of 110 degrees or more is a false location.
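The speed filter described above can be sketched in a few lines of Python. This is only a rough sketch, not any particular tool’s implementation; the 10 km/h walking threshold and the (lat, lon, time) point format are my own assumptions:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def filter_by_speed(points, max_speed_kmh=10.0):
    """Drop points whose implied speed from the previous kept point
    exceeds max_speed_kmh. Each point is (lat, lon, t) with t in seconds."""
    if not points:
        return []
    kept = [points[0]]
    for lat, lon, t in points[1:]:
        plat, plon, pt = kept[-1]
        dt = t - pt
        if dt <= 0:
            continue  # duplicate or out-of-order timestamp
        speed_kmh = haversine_m(plat, plon, lat, lon) / dt * 3.6
        if speed_kmh <= max_speed_kmh:
            kept.append((lat, lon, t))
    return kept
```

A bearing-change filter would work the same way, comparing the heading between consecutive kept points against a maximum turn angle.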

All these things can be done with the software on that page and surely help… but the GPX files of the same track won’t overlap all the way (although they will at some points). This is reasonable, as the GPS can have errors of a few metres, so none of the GPX files is 100% precise. So, how do I tell which one is closest, or has greater precision?

Well I guess it could be done as the GPS receiver does internally…

I could take a point from a GPX file and draw a fictional circle of X metres or centimetres around it (whatever precision you wish). Then I would do exactly the same for all the other GPX files of the same track. Whenever three or more of these fictional circles intersect, I would consider that I’ve found a precise location. When three circles intersect, their common area is bounded by three points… If I draw lines between these points I get a triangle. The centre of that triangle would be a good estimate, where the maximum error should be less than the radius of the circles I’ve described.

Using these points I would obtain a much more precise track… If I’ve got four tracks I could run this process on every combination of three files. Once I have results, I could repeat the process on the results (three at a time, sequentially)… and get an even tighter estimate of the track.
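A rough sketch of the circle idea, assuming points from the different tracks have already been matched up to the same spot. I simplify the “three circles intersect” test to checking that three candidates are pairwise within twice the circle radius, and then take their centroid; this is my own simplification, not anyone’s published algorithm:

```python
import itertools
import math

def consensus_point(candidates, radius_m=2.0):
    """candidates: list of (lat, lon) points, one per track, all meant to
    be the same real-world spot. Circles of radius_m around two points
    intersect when the points are at most 2*radius_m apart; if three or
    more candidates pairwise satisfy that, return their centroid as the
    consensus fix, otherwise None."""
    m_per_deg = 111320.0  # metres per degree of latitude (approx.)

    def dist_m(a, b):
        # Flat-earth approximation in metres; fine over a few metres.
        dlat = (a[0] - b[0]) * m_per_deg
        dlon = (a[1] - b[1]) * m_per_deg * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    for group in itertools.combinations(candidates, 3):
        if all(dist_m(p, q) <= 2 * radius_m
               for p, q in itertools.combinations(group, 2)):
            lat = sum(p[0] for p in group) / 3
            lon = sum(p[1] for p in group) / 3
            return (lat, lon)
    return None
```

The hard part in practice is the matching step itself: deciding which point on track B corresponds to a given point on track A before this consensus test can run.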

What I’m looking for is an application that would do this automatically for me…

The more times you go over the same track, the more precise the result will be.

There is a program called TopoFusion that does exactly what you want. Unfortunately it is neither open source nor free. A demo version is available, with the limitation that only three files can be opened at the same time.

I’m trying to write my own software that will do the same thing. I’m trying to use the same logic as they do, which is available as a paper on their website. I also have another way of doing it, which I can switch to. But I don’t have much time to spare to work on it, so you’ll have to wait. Any help is welcome if you want to contribute.

Actually, I already have (or had; I’ll have to look it up) finished code that does the averaging. See also (in Dutch) but with the code. It runs in R (the R/CRAN project).

If you want more code and examples, let me know.

I’ve made a user guide and home for Michiel’s script.

Thanks a lot.

Would it be possible to have a version of the script in, say, Perl or Python?
R is open source but not very common among “normal” mappers.


I think the script is quite dependent on R functionality, so I don’t foresee it being freed from R anytime soon. It should be possible to wrap and package it so that all the interaction with R happens in the background, though. But that would take quite a bit more effort, and help from others, to happen.

I looked at the script, and, this being 9 pm on a workday, I didn’t really see any R wizardry aside from MATLAB-style matrix accesses. It just picks the track with the most points as the reference track, then picks the closest points on all the other tracks and averages the locations. Essentially a least-squares-error algorithm, which is what I would choose. Not a very difficult approach to replicate in Python or Perl, or even C, if one has the time, of course. If I had an extra hour, I could throw a quick Qt GUI on top of it. Parsing the GPX format would be the most difficult part (GPSBabel would probably help).
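The approach described above (reference track plus nearest-point averaging) is easy to sketch in Python. This is my reading of the logic, not a port of the R script, and the brute-force nearest-neighbour search is only reasonable for short tracks:

```python
def average_tracks(tracks):
    """Pick the track with the most points as the reference; for each
    reference point, find the nearest point on every other track and
    average all those positions. tracks: list of lists of (lat, lon).
    Uses squared lat/lon differences as the distance metric, which is
    fine for comparing nearby points."""
    ref = max(tracks, key=len)
    others = [t for t in tracks if t is not ref]

    def nearest(p, track):
        return min(track, key=lambda q: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2)

    merged = []
    for p in ref:
        matches = [p] + [nearest(p, t) for t in others]
        merged.append((sum(q[0] for q in matches) / len(matches),
                       sum(q[1] for q in matches) / len(matches)))
    return merged
```

A real implementation would weight or reject far-away matches, since a nearest point on another track can still be a bad correspondence where the tracks diverge.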

I would, however, first filter the source GPX tracks to remove jitter, in case you spent some time in one spot. comes to mind.
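A minimal sketch of such a jitter filter, assuming a simple distance threshold is enough (the 3 m default is an arbitrary choice of mine):

```python
import math

def remove_jitter(points, min_move_m=3.0):
    """Collapse clusters of points recorded while standing still:
    keep a point only if it is at least min_move_m from the last kept
    point. points: list of (lat, lon)."""
    m_per_deg = 111320.0  # metres per degree of latitude (approx.)

    def dist_m(a, b):
        # Flat-earth approximation in metres; fine at this scale.
        dlat = (a[0] - b[0]) * m_per_deg
        dlon = (a[1] - b[1]) * m_per_deg * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    if not points:
        return []
    kept = [points[0]]
    for p in points[1:]:
        if dist_m(kept[-1], p) >= min_move_m:
            kept.append(p)
    return kept
```

GPSBabel’s simplify filter does something more sophisticated, but the effect on a stationary cluster is similar.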

That’s great news! I don’t really program, so I don’t really have a clue :confused: Michiel should be able to say why he used R, and whether he considers his use wizardry. Have you looked at the latest testing code?

Yes, when using the script I’ve simplified the source tracks with GPSBabel first.