You mean a general-purpose matching tool? I know of several tools being built for the import and conflation of specific features, such as buildings, several kinds of POIs (mostly nodes), trees, and the special Nodes of Node Networks.
Several mappers (usually using Python scripts or QGIS) compare data files from data providers with OSM and produce lists such as: matches (identical = verified), near-matches (to be checked/modified), new items (to be added), and removed items (to be removed).
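The match/near-match/new/removed classification could be sketched roughly like this. This is a minimal illustration, not any specific tool's logic; the record structure (`name`/`lat`/`lon` dicts) and the distance thresholds are assumptions:

```python
import math

def distance_m(a, b):
    # Equirectangular approximation; good enough for short distances.
    x = math.radians(b["lon"] - a["lon"]) * math.cos(math.radians((a["lat"] + b["lat"]) / 2))
    y = math.radians(b["lat"] - a["lat"])
    return math.hypot(x, y) * 6371000

def classify(external, osm, match_m=10, near_m=100):
    """Split an external dataset and an OSM extract into the four review lists."""
    matches, near, new = [], [], []
    unmatched_osm = list(osm)
    for ext in external:
        best = min(unmatched_osm, key=lambda o: distance_m(ext, o), default=None)
        if best is None:
            new.append(ext)
            continue
        d = distance_m(ext, best)
        if d <= match_m and ext["name"] == best["name"]:
            matches.append((ext, best))        # identical = verified
            unmatched_osm.remove(best)
        elif d <= near_m:
            near.append((ext, best, round(d))) # to be checked/modified
            unmatched_osm.remove(best)
        else:
            new.append(ext)                    # to be added
    removed = unmatched_osm                    # to be removed
    return matches, near, new, removed
```

In practice the review lists would then be exported (e.g. as GeoJSON) rather than kept in memory, but the four buckets are the core of it.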
The conflation itself is usually not so easy; often it can only be done by hand, one item at a time, e.g. by creating a MapRoulette challenge.
I mean a tool where you can pass an external GeoJSON data file and OSM data (or an Overpass filter), and it generates a list of matches that can be published as a website and reviewed without advanced tools like Python scripts or QGIS.
I meant it for reviewing the output of conflation, which in many cases can reasonably be done automatically.
For example, matching supermarkets in order to import opening hours.
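The opening-hours case could look something like this: given already-matched (external, OSM) supermarket pairs from some prior matching step, propose additions only where OSM has no value yet and flag disagreements for manual review. The pair structure and field names here are assumptions for illustration:

```python
def propose_opening_hours(pairs):
    """pairs: list of (external_record, osm_feature) matches.

    Returns (additions, conflicts):
      additions: (osm_id, opening_hours) where OSM has no value yet
      conflicts: (osm_id, osm_value, external_value) needing a human decision
    """
    additions, conflicts = [], []
    for ext, osm in pairs:
        theirs = ext.get("opening_hours")
        ours = osm.get("tags", {}).get("opening_hours")
        if not theirs:
            continue  # external source has nothing to offer
        if ours is None:
            additions.append((osm["id"], theirs))
        elif ours != theirs:
            conflicts.append((osm["id"], ours, theirs))
    return additions, conflicts
```

Splitting "safe to apply" from "needs a human" is exactly the part that blind yes-clicking on a challenge platform would skip.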
And MapRoulette is quite a bad tool here, as too many of the people active there will blindly click “yes” (edits that require more manual action work better).
This is exactly what I see people do with QGIS. Does it matter much that it is an advanced tool? From my non-tech viewpoint, any tool that does what you describe is an advanced tool.
One genius sets it up, publishes the dynamic map, and all others can access and review the result map(s).
Maybe I’m still not getting what you actually want, so I’ll be quiet and see what others come up with!
I do know that conflating OSM and external sources in a maintainable way is not an easy task. You have the problem of the often-missing merge key; the geolocation in other systems is never exactly right and often way off; and existing features in OSM often have other structures attached, e.g. entrances on building outlines.
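When there is no shared merge key and the geolocation is off, a common workaround is to score candidate pairs on a mix of proximity and name similarity, and only auto-accept high scores. A minimal sketch, with an assumed record structure and arbitrary weights/threshold:

```python
import difflib
import math

def dist_m(lat1, lon1, lat2, lon2):
    # Equirectangular approximation; fine at matching distances.
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return math.hypot(x, y) * 6371000

def pair_score(ext, osm, max_dist_m=250.0):
    """Return a 0..1 score; 0 means 'do not even consider this pair'."""
    d = dist_m(ext["lat"], ext["lon"], osm["lat"], osm["lon"])
    if d > max_dist_m:
        return 0.0
    proximity = 1.0 - d / max_dist_m
    names = difflib.SequenceMatcher(
        None, ext["name"].lower(), osm["name"].lower()).ratio()
    # Equal weights are an arbitrary choice; tune per dataset.
    return 0.5 * proximity + 0.5 * names
```

Pairs above some threshold (say 0.9) could go to the "match" list, the middle band to manual review, and the rest would be treated as unmatched.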
E.g. with node networks we have a very good source which provides us with shapefiles of all Nodes and Routes, but the routes are unusable because they are basically a set of GPX files, whereas our routes are lists of ways. We map the junction nodes of the ways as Network Nodes, while they more or less try to get the geolocation of the guideposts right; those are not actually on the ways and often come in multiples (e.g. for cycling, around a sizeable roundabout you’ll find 4-8 guideposts indicating one logical node). And we actually attach the Routes to the Nodes, which is completely logical to me but seems to be totally incomprehensible to the operators of the networks.
So I’m probably biased toward the pre-conflation phase (analysis and matching), where these kinds of issues should be solved before we even try to merge the data. And after the conflation, we could ideally run the process again at any given time without having to check it all over again, knowing that the external dataset still has the same (for us) erroneous details.
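Making re-runs skip already-reviewed items could be done by persisting each decision under a stable external ID together with a hash of the external record, so a later run only resurfaces an item if its source data actually changed. A sketch under those assumptions (the `id` field and the decision store are hypothetical):

```python
import hashlib
import json

def record_hash(record):
    # Stable fingerprint of one external record.
    return hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()

def needs_review(record, decisions):
    """decisions: dict mapping external id -> record hash at review time."""
    return decisions.get(record["id"]) != record_hash(record)

def mark_reviewed(record, decisions):
    decisions[record["id"]] = record_hash(record)
```

The `decisions` dict would be saved to disk (e.g. as JSON) between runs; known-erroneous external details then stay silenced until the provider changes them.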
So far, AFAIK, only the simplest conflation imports/updates (standalone POIs) have succeeded, and even those required lots of correction work afterwards and were not repeatable without going over all of the items again.
No, as I see it, it is not a custom OpenStreetMap instance. It fires up a web server as a UI, but that is not what I would call an OpenStreetMap instance.
I think you are right that it is not possible to review POI changes like adding opening hours to supermarkets, but one would have to look more closely to be sure.
Just to get a better definition of what you want: suppose you did the conflation and committed the changes. Then you could review them with the standard review tools like Achavi, OSMCha, etc. Is this a “format” that could work for you?