Bad data is pointless - it increases server-space requirements, adds clutter, and can mask perfectly good data. I have a track from last year I'd seriously like to delete (somehow it's become orphaned - it no longer appears in my list of tracks) and all it does is mask the better, more recent data.

Three things:

First (step 1), if you know data is poor - for whatever reason - it should be filtered out. MapSource and Google Earth can both import GPS tracks, making it easy to see whether a track is any good or not. If it's bad, don't upload it.
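If you'd rather automate that eyeball check, a quick quality test is to look at the implied point-to-point speeds in the track - a walking or driving trace with 800 km/h jumps in it is clearly flaky. Here's a minimal sketch using only the Python standard library; it assumes a GPX 1.1 file with `<time>` stamps on the trackpoints, and the function names and the "suspicious speed" idea are mine, not a feature of any tool mentioned above:

```python
import math
import xml.etree.ElementTree as ET
from datetime import datetime

NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def max_speed_kmh(gpx_xml):
    """Return the fastest point-to-point speed found in a GPX string, in km/h.

    A huge value here (relative to how the track was recorded) suggests
    the GPS wandered and the track needs cleaning before upload.
    """
    root = ET.fromstring(gpx_xml)
    pts = []
    for tp in root.iter("{http://www.topografix.com/GPX/1/1}trkpt"):
        t = tp.find("gpx:time", NS)
        pts.append((float(tp.get("lat")), float(tp.get("lon")),
                    datetime.strptime(t.text, "%Y-%m-%dT%H:%M:%SZ")))
    worst = 0.0
    for (la1, lo1, t1), (la2, lo2, t2) in zip(pts, pts[1:]):
        dt = (t2 - t1).total_seconds()
        if dt > 0:
            worst = max(worst, haversine_m(la1, lo1, la2, lo2) / dt * 3.6)
    return worst
```

Pick the threshold to suit how the track was recorded - anything much over, say, 150 km/h in a driving trace, or 15 km/h in a walking trace, points at noise.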

Second (step 2), how bad is bad? More to the point, is what you have better than what is already there? OK, so you can filter out the worst offenders in your data set (step 1), but the next stage might be to see what (public) tracks have been uploaded to OSM already. If there are tracks in OSM that appear nice and tight and yours are all over the place (zig-zags on a perfectly straight road, for example), then I see no point in adding noise which can only mask those perfectly good tracks. So keep them private, or edit them further before making them public.

Third, it would be useful to be able to switch tracks on and off selectively and individually in the Potlatch editor, because the tracks are drawn so thick. Multiple tracks from multiple contributors can create an area of pale blue some 20m wide in places. Here JOSM wins - neat little grey dots - much better. But the ability to turn the tracks in an area on and off would be useful in ANY editor.

I use NoniGPSPlot (free) to capture data, then GPSTrackMaker (the free version) to delete data that is clearly wrong or just plain flaky. I never edit points and never run track-reduction or optimisation - I only ever delete trackpoints. The cleaned-up GPX file(s) are then uploaded.
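That delete-only policy (never smooth, never move points, just drop the bad ones) is easy to sketch in code. This is my own illustration, not how GPSTrackMaker works internally; the `200 km/h` threshold and the `haversine_m` helper below are assumptions you'd tune to your own data:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def drop_flaky_points(points, max_kmh=200.0):
    """Delete trackpoints that imply an implausible speed.

    `points` is a list of (lat, lon, seconds) tuples in time order.
    Points are only ever dropped - never averaged or moved - so the
    surviving points are all genuine fixes from the receiver.
    """
    if not points:
        return []
    kept = [points[0]]
    for lat, lon, t in points[1:]:
        plat, plon, pt = kept[-1]
        dt = t - pt
        if dt <= 0:
            continue  # duplicate or out-of-order timestamp: drop it
        kmh = haversine_m(plat, plon, lat, lon) / dt * 3.6
        if kmh <= max_kmh:
            kept.append((lat, lon, t))
    return kept
```

Because each point is compared against the last *kept* point, a single wild fix gets deleted and the track rejoins itself cleanly afterwards - which matches the "only ever delete trackpoints" workflow above.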

Lambertus gets my vote.