I’ve set up a local OpenStreetMap stack for a project I’m currently working on, and we need to load a very large amount of data into our OpenStreetMap database.
The original data is stored in a FileGDB of around 40 GB. It has been converted into the OSM structure (with negative ids and no version or changeset attributes).
My question is: what would be the ideal way to proceed with loading this initial set of data?
I’ve tried bulk_upload.py, which takes about 1 hour to process 1/30000 of the data. Either my local API 0.6 needs to be tuned, or it simply isn’t made to handle data loads this large.
Osmosis requires that there be changeset and version attributes on every element. Writing directly to the database, bypassing the API, would possibly speed up the process.
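For what it’s worth, since my data already lacks version/changeset attributes, I considered just injecting placeholder values so Osmosis will accept the file. A minimal sketch of that idea (the attribute values `version="1"` and `changeset="1"` are placeholders I chose, not anything Osmosis mandates):

```python
import xml.etree.ElementTree as ET

def add_required_attrs(osm_xml: str) -> str:
    """Add placeholder version/changeset attributes to every node,
    way, and relation in an OSM XML string so Osmosis can read it.
    Existing attributes are left untouched."""
    root = ET.fromstring(osm_xml)
    for elem in root.iter():
        if elem.tag in ("node", "way", "relation"):
            # setdefault only fills the attribute if it is missing
            elem.attrib.setdefault("version", "1")
            elem.attrib.setdefault("changeset", "1")
    return ET.tostring(root, encoding="unicode")

sample = '<osm version="0.6"><node id="-1" lat="0.0" lon="0.0"/></osm>'
print(add_required_attrs(sample))
```

For a 40 GB source you would obviously not load the whole document into memory like this; a streaming approach (e.g. `ET.iterparse` or a SAX handler) applying the same per-element fix would be needed.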
Thank you in advance for your advice.