Hi,
I would like to ask for early community feedback on a proposed QA project
related to post boxes in Switzerland (amenity=post_box).
Important upfront:
- This is not intended as an automated edit, import, or a “single source of truth”.
- The goal is to support mappers with structured review hints, not to overwrite OSM data.
Background
In Switzerland, Swiss Post publishes a public Standortsuche (location search)
which currently lists about 13’850 letterboxes (Briefeinwurf), including:
- locations of letterboxes,
- emptying deadlines (collection times).
From an initial analysis, these data appear relatively stable, but changes to
locations and collection_times do occur over time.
At the same time, a first snapshot of OSM data for Switzerland suggests that
post_box coverage and metadata completeness vary, for example:
- about 7’400 OSM objects tagged with amenity=post_box,
- a significant share without collection_times,
- many objects without any check_date or check_date:collection_times tag.
These figures are intended as indicative observations only, not as a judgement
on data quality.
Project idea (QA-focused)
The idea is to build an open, transparent QA tool that:
- Periodically fetches public Swiss Post letterbox data.
- Compares them spatially with existing OSM amenity=post_box objects.
- Produces review artefacts only, for example:
- GeoJSON layers for visual inspection (e.g. uMap / JOSM),
- CSV lists of potential review candidates,
- optional MapRoulette-style tasks (later, if appropriate).
→ No automatic uploads to OSM are planned.
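To illustrate the spatial comparison step, here is a minimal sketch of how Swiss Post letterboxes might be matched against OSM post_boxes within a distance threshold. All names, record shapes, and the 50 m default are my own illustration, not the project's actual code; a real run would use a spatial index rather than this brute-force loop:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in metres."""
    R = 6371000.0
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def match_letterboxes(post_boxes, osm_boxes, max_dist_m=50):
    """For each Swiss Post letterbox, find the nearest OSM post_box.

    Entries with no OSM object within max_dist_m are flagged as
    review candidates ("no_osm_match"); everything else is "matched".
    """
    candidates = []
    for box in post_boxes:
        dist, osm = min(
            ((haversine_m(box["lat"], box["lon"], o["lat"], o["lon"]), o)
             for o in osm_boxes),
            default=(float("inf"), None),
        )
        if osm is None or dist > max_dist_m:
            candidates.append({"source": box, "status": "no_osm_match",
                               "nearest_m": round(dist, 1) if osm else None})
        else:
            candidates.append({"source": box, "osm": osm,
                               "status": "matched", "nearest_m": round(dist, 1)})
    return candidates
```

The output of such a pass would feed the GeoJSON/CSV review artefacts mentioned above; nothing here touches the OSM API.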
Project repository (early prototype):
GitHub - swisspost-postbox-qa/worker: Tools to compare postbox objects in OSM and «Swiss Post Standortsuche»
Minimal project status page:
Swiss Post – Postbox QA (Prototype)
A small set of example QA outputs (limited test area) is available in the GitHub repository for illustration.
Intended outputs (examples)
- Possible update:
  “OSM post_box at location X has collection_times missing or older than Swiss Post information (checked YYYY-MM-DD).”
- Possible discrepancy:
  “Swiss Post lists a letterbox near location X, but no corresponding OSM object was found within N meters.”
- Possible review case:
  “An OSM post_box exists, but Swiss Post no longer lists it — please verify on the ground.”

In such cases, the tool would suggest review-oriented tags such as:
- check_date:collection_times=*
- fixme=* (sparingly, and only to prompt human review)
→ No automatic deletions are intended.
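A review candidate like the ones above could be exported as a GeoJSON feature for inspection in uMap or JOSM. A minimal sketch; the property names are my own illustration, not the project's actual schema:

```python
import json

def candidate_to_feature(lat, lon, status, note, checked):
    """Wrap one review candidate as a GeoJSON Feature (note lon/lat order)."""
    return {
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {
            "qa:status": status,    # e.g. "no_osm_match" (hypothetical value)
            "qa:note": note,        # human-readable review hint
            "qa:checked": checked,  # date of the Swiss Post snapshot
        },
    }

def write_review_layer(features, path):
    """Write candidates as a FeatureCollection, loadable in uMap/JOSM."""
    collection = {"type": "FeatureCollection", "features": features}
    with open(path, "w", encoding="utf-8") as fh:
        json.dump(collection, fh, ensure_ascii=False, indent=2)
```

Such a layer stays entirely on the review side: mappers open it alongside OSM data and decide for themselves what, if anything, to change.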
Data philosophy
- Swiss Post is not considered a single source of truth.
- OSM ground truth and local knowledge always take precedence.
- The tool is meant to highlight differences, not to decide which data are “correct”.
Overall, this project is closer in spirit to Osmose, MapRoulette, or other QA layers
than to an import or bot.
Operational model (high level)
- Multiple independent volunteers may run the crawler (for redundancy).
- Runs are intentionally infrequent (e.g. weekly).
- Outputs are published openly (GitHub / project website), with timestamps and sources.
- Human mappers decide if and how any changes are applied to OSM.
Questions for the community
I would very much appreciate feedback on the following points:
- Does this QA-oriented approach align with OSM best practices?
- Are there known pitfalls when using operator-published data in this way?
- Is check_date:collection_times=* a reasonable tagging approach in this context?
- Would you recommend treating this as an “Automated Edit”, an “Import”, or purely as a QA tool?
- Are there similar projects (in other countries) that would be useful to study?
I am intentionally asking for feedback before scaling this beyond a prototype.
Thank you very much for your time and for any guidance you are willing to share.
Best regards,
Daniel