I took quite a few tracks when I was on vacation.
But some parts of some tracks are inaccurate, sometimes because the receiver didn’t have good reception, sometimes because I started driving before the receiver had a fix. Now I would like to delete these parts of the tracks before I upload the GPX tracks to OSM. How should I do this?
I looked at several GPX editors, but didn’t find one for Windows that would do this nicely. By nicely, I mean with a GUI where I can select the part to be deleted with the mouse.
The only option I see is to import the tracks into JOSM, convert the GPX layer into a data layer, delete the bad segments and export them as GPX again.
Is there a more elegant approach?
I just tested JOSM. The conversion from GPX → data, manipulation of the data and conversion back from data → GPX does work. However, the timestamp of each track point gets lost on the way.
I also tried simply importing a GPX track and exporting it again without changing anything – same result. JOSM seems to forget the timestamps. Thus using JOSM is not an option.
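One way to verify this kind of timestamp loss is to count how many trackpoints still carry a `<time>` child before and after the round-trip. A minimal sketch (the inline sample document and the GPX 1.1 namespace usage are my own illustration, not JOSM output):

```python
# Count trackpoints in a GPX document and how many of them still carry a
# <time> child, so a file can be compared before and after an editor
# round-trip to see whether timestamps were dropped.
import xml.etree.ElementTree as ET

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def timestamp_stats(gpx_xml):
    """Return (total trackpoints, trackpoints that have a <time> child)."""
    root = ET.fromstring(gpx_xml)
    pts = root.findall(".//gpx:trkpt", NS)
    timed = sum(1 for p in pts if p.find("gpx:time", NS) is not None)
    return len(pts), timed

# Tiny inline example: two points, only the first one timed.
SAMPLE = """<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1">
 <trk><trkseg>
  <trkpt lat="52.0" lon="5.0"><time>2008-06-01T10:00:00Z</time></trkpt>
  <trkpt lat="52.1" lon="5.1"/>
 </trkseg></trk>
</gpx>"""

print(timestamp_stats(SAMPLE))  # (2, 1): one of two points kept its timestamp
```

Running this on the file before import and on the exported file makes the loss obvious without eyeballing the XML.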
How do you edit your tracks?
Personally I use Garmin MapSource for cleaning up my tracks, but if you just want to remove the first or last few trackpoints you could also use an ordinary text editor…
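For the “ordinary text editor” route, a tiny script is less error-prone than hand-deleting XML. A minimal sketch of trimming the first and last few trackpoints of every segment (the trim count and the inline sample are arbitrary examples; real files would be loaded with `ET.parse`):

```python
# Drop the first and last n trackpoints from every <trkseg> in a GPX
# document -- the scripted equivalent of deleting them in a text editor.
import xml.etree.ElementTree as ET

GPX_NS = "http://www.topografix.com/GPX/1/1"

def trim_segments(root, n=2):
    """Remove the first and last n trackpoints of each segment in place."""
    for seg in root.iter(f"{{{GPX_NS}}}trkseg"):
        pts = seg.findall(f"{{{GPX_NS}}}trkpt")
        if len(pts) <= 2 * n:
            continue  # segment too short to trim safely
        for p in pts[:n] + pts[-n:]:
            seg.remove(p)

# Inline example with 7 points; trimming 2 from each end leaves 3.
SAMPLE = ('<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1">'
          '<trk><trkseg>'
          + "".join(f'<trkpt lat="52.{i}" lon="5.0"/>' for i in range(7))
          + '</trkseg></trk></gpx>')
root = ET.fromstring(SAMPLE)
trim_segments(root, n=2)
remaining = root.findall(f".//{{{GPX_NS}}}trkpt")
print(len(remaining))  # 3
```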
Personally I use Memory-Map for this, but that’s because it’s what I’m using for logging… it has a reasonable GUI and lets you split tracks etc.
I’m new to this OSM thing, and have been asking myself the same question – I have files with 2500+ points, and need to identify the bits where I was indoors and remove them.
The long-term solution, I suppose, is to turn the kit off when inside, but it would make the system more consumer-friendly and “open” if there were a more elegant solution to this.
I spent half a day yesterday in ex$el doing statistical analysis on the data, and was still unable to identify the cloud of values around the building (without pulling the coordinates out and cheating) – note that for this trace specifically I only have GPX data to go on (long story).
One idea that did strike me: when using NMEA logs, which I assume most people are, the HDOP/VDOP, number of satellites and some function of SNR should give some good indicators of signal quality. It may be worth investigating a way to store some of this data in the uploaded tracks, then give an option in JOSM to only plot points that are over a certain quality, or to show the rest in a lighter colour.
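As a rough sketch of that quality filter, the `$GPGGA` sentence in an NMEA log carries both the satellite count (field 7) and HDOP (field 8), so a pre-upload pass could drop weak fixes. The thresholds below (HDOP ≤ 2.5, ≥ 5 satellites) are illustrative assumptions, not recommended values, and checksums are not validated:

```python
# Keep only NMEA $GPGGA fixes whose satellite count and HDOP pass a
# threshold -- a crude version of the signal-quality filter suggested above.

def gga_quality(sentence):
    """Return (num_satellites, hdop) from a $GPGGA sentence, or None."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or len(fields) < 9:
        return None
    try:
        return int(fields[7]), float(fields[8])
    except ValueError:
        return None

def keep_fix(sentence, max_hdop=2.5, min_sats=5):
    """True if the fix meets both quality thresholds."""
    q = gga_quality(sentence)
    return q is not None and q[0] >= min_sats and q[1] <= max_hdop

LOG = [
    "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47",
    "$GPGGA,123520,4807.038,N,01131.000,E,1,03,6.1,545.4,M,46.9,M,,*4B",
]
good = [s for s in LOG if keep_fix(s)]
print(len(good))  # 1: the second fix fails both thresholds
```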
I don’t know how important this is, but some tracks are going to have some trash in them somewhere.
What can consistently be said about the pointless bits of data, that could be used to filter it?
Are the majority of people storing NMEA logs? I’m not, and never have.
I have thought the same in the past about lighter/bad data: points computed from a low number of satellite signals could be drawn lighter or more transparent.
In the future I wonder if the number of GPX points in small areas will become so high that it’s easier to generate images from the data and update them when a new route is uploaded, which people then download into programs like JOSM. Removing rubbish from these, as a moderator tool, would then be as simple as the rubber tool in the most basic graphics programs.
I upload everything I have, this has its drawbacks but I think it’s ok.
The easiest way is to find a GPX editor that can show a map on which the GPX points are plotted (like the MapSource application that comes with Garmin GPS units, with OSM background maps, but there are many other applications around). By zooming in on the GPX traces I can easily determine where I stayed in one place for a long time, where I was indoors or where the tracks had poor reception. I usually try to filter all those ‘bad spots’ out before uploading, and it doesn’t take long when done every time after tracking.
Am I the only one who thinks this “bad” data actually should be uploaded? It’s always a good idea to have more data.
I don’t see the ‘good idea’ on data that is clearly wrong.
E.g. for some reason my track is off (compared to at least 5 other traces) by about 20 metres, crisscrossing around while the road is perfectly straight… that track is only usable for saying ‘that track is apparently bad’, which doesn’t add anything to the project other than space requirements on the DB server.
Well if you have two GPX tracks on the same spot and both are bad, then you can probably guess something about the environment around it. Of course I understand your problem with it, it would be good to be able to hide this kind of data, but deleting is so final.
That’s why I have two directories to store my tracks in: ‘raw’ and ‘sanitized’.
Bad data is pointless - it adds to server-space requirements, adds clutter and can mask perfectly good data. I have a track from last year I’d seriously like to delete (somehow it’s become orphaned - no longer appearing as one of my tracks in my list) and all it does is mask the better and more recent data.
First (step 1): if you know data is poor - for whatever reason - it should be filtered out. MapSource and Google Earth can both import GPS tracks, and it’s easy to see whether a track is any good or not. If it’s bad, don’t upload it.
Second (step 2): how bad is bad? More to the point, is what you have better than what is there already? OK, so you can filter out the worst offenders in your data set (step 1), but the next stage might be to see what (public) tracks have already been uploaded to OSM. If there are tracks in OSM that appear nice and tight and yours are all over the place (zig-zags on a perfectly straight road, for example), then I see no point in adding noise which can only mask those perfectly good tracks. So keep them private, or edit them further before making them public.
Third, it would be useful to be able to switch tracks on and off selectively and individually in the Potlatch editor, because the tracks are so thick. Multiple tracks from multiple contributors can create an area of pale blue some 20m wide in places. Here JOSM wins - neat little grey dots - much better. But the ability to turn the tracks in an area on and off would be useful in ANY editor.
I use NoniGPSPlot (free) to capture data, then use GPSTrackMaker (free version) to delete data that is clearly wrong or just plain flaky. I never edit points, never run track-reduction or optimisation, I only ever delete trackpoints. The cleaned up gpx file(s) are then uploaded.
Lambertus gets my vote.
Shift+click the GPS icon to show only your tracks.
Or, alternatively, use the ‘edit’ link by each trace (in the GPS traces display) to preload the whole of that trace.
emj: I don’t see any need for bad data. I remember seeing some town in the Cotswolds in England where somebody seems to have uploaded what looks like them playing a football match with a GPS in their pocket, and it’s just a pain, with no benefit.
I use MapSource as Lambertus mentioned above and it’s very easy to filter off the rubbish which isn’t beneficial to anyone/OSM. It takes seconds to do, so I don’t see much of an issue with it. It’s also rather easy to turn a GPS on and off, so when I stand around for a while it goes off… likewise for when I start/finish a journey. Not really hard to do.
sparky_lad: In josm you can just turn tracks on and off.
At least Lambertus sees why I want to save those logs, and I am slowly acknowledging why it’s a good idea to edit before upload. So I would like to ask everyone: don’t delete the raw logs - they can be useful.
What data people have on their own computers is completely up to them, obviously, so of course I don’t object to them keeping hold of their raw data. I do as you said, and have the files saved from the GPS, then a folder of files which are sorted ready for OSM.
Yes, but that is my point: if you only store it on your own computer it’s completely useless. The “bad” data is only useful if you have it in large quantities - one track isn’t enough.
So what would it take to get you people to upload your GPX logs unedited?
Of course there are bits of road that I frequently visit while tracking, which results in dozens of traces for that road. When displaying all those traces, a thick ‘white band’ about 10m wide appears where the road approximately is. Normally one would interpret the tracks and put the road in the middle of the white band, right?
Now, please tell me what use there is in uploading (part of) a track that has an offset of e.g. >20m (thus falling at least 10m outside the so-called white band). The track appears to go through houses, or perhaps almost becomes part of the white band that indicates another road. Why and how would the project benefit from such bad data?
emj: I’m not talking about bad data in the sense of data generated when driving fast through a forest in a gorge in September. I’m referring to data accumulated when people sit down on a park bench and doze for 20 minutes, leaving a dense collection of dots with no advantage to OSM.
In fact, if it is woodland in a gorge (as in the first example) then I would encourage more data than usual, since most people would get bad reception there, so you need the entire range of bad data to create a good map from the mean route.
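The “park bench” clusters described above are easy to thin automatically: drop any point that hasn’t moved a few metres from the previously kept point. A minimal sketch, assuming a plain list of (lat, lon) pairs and an illustrative 5 m threshold:

```python
# Thin out near-stationary clusters: keep a point only if it is at least
# min_move_m metres from the last point we kept.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def drop_stationary(points, min_move_m=5.0):
    """Return the subsequence of points that actually moved."""
    kept = points[:1]
    for lat, lon in points[1:]:
        if haversine_m(kept[-1][0], kept[-1][1], lat, lon) >= min_move_m:
            kept.append((lat, lon))
    return kept

# 0.0001 deg of latitude is ~11 m, so the duplicates are dropped.
track = [(52.0, 5.0), (52.0, 5.0), (52.0001, 5.0),
         (52.0001, 5.0), (52.0002, 5.0)]
print(len(drop_stationary(track)))  # 3
```

This deliberately never averages or moves points, in the spirit of the posters above who only ever delete trackpoints.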