As mentioned in the June 8th edition of OSMUS State of Charge, supercharge.info (and specifically @Rovastar) has granted permission for OSM to import their data. The permission is worldwide, but I am focusing on the United States. As a test, I manually imported and edited the Georgia locations. After some discussion on the supercharge.info forums, I have refined my approach. For my next (and hopefully final) test, I did the state of Florida. I have shared the Python script I used, the pre-formatted file, and the cleaned, filtered, OSM-import-ready file on my Codeberg in the /US-FL folder. That output is the result of nothing other than running the script and uploading the file; no manual touches have been made.
From my review, the Florida run was resoundingly successful, and the data is ready to be imported into JOSM, run through the conflation tool, and uploaded to OSM. However, in the spirit of collaboration, I want to involve everyone here. If you can, please review the filtered CSV in the Codeberg repo and let me know if anything needs updating. I am happy to incorporate community suggestions before I change the filter from (df['address.state'] == 'FL') to (df['address.country'] == 'USA'). The values used in the script come from the allSites endpoint, and the definitions come from this forum post. I have also added some OSM-specific tags, such as amenity=charging_station, frequency=0 (to indicate DC charging), and brand/branch information. These are static values and would apply to any station imported into OSM from supercharge.info anywhere in the world.
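For anyone who wants to follow along without opening the repo, the scope change is a one-line filter swap. A minimal sketch, assuming pandas and the dotted column names that json_normalize produces (the endpoint URL here is from memory, so defer to the script in the repo):

```python
import requests
import pandas as pd

# Fetch and flatten the allSites endpoint (check the repo for the exact URL)
sites = requests.get('https://supercharge.info/service/supercharge/allSites').json()
df = pd.json_normalize(sites)  # produces dotted columns like 'address.state'

# Current test scope: Florida only
df_fl = df[df['address.state'] == 'FL']

# Planned scope once the review is done: the whole US
df_us = df[df['address.country'] == 'USA']
```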
Thank you all for your review and I look forward to the feedback! If there is no large objection, I will update the Florida stations in a few days, and likely the entire US after SotM next week (come say hi! I'll talk your ear off about this stuff!)
To improve the data quality here, you can treat the plugs/tpc values the same as the socket:nacs values.
They really mean the same thing, plugs/nacs and plugs/tpc, for the socket type in OSM.
TPC is the "Tesla proprietary connector", i.e. the Tesla plug/NACS connector. We keep the distinction because that is how it was initially known, and it is mostly legacy in our data. There is some question whether TPC runs the same protocols as NACS, as some hardware upgrades were required, but generally they are used for Tesla only.
Bottom line: for OSM they are the same thing.
Please include all of these.
Also, you could add: if solarCanopy=true, then covered=yes.
I also would not include values for socket:type1_combo:output if there are no corresponding stalls. There is only one station in Florida with them.
What is frequency in the CSV? Is that imported? If so, what does it mean?
Thanks for this! I wasn't sure if TPC was the same hardware as NACS. That answers that. So we will have the following:
| OSM Values | Meaning |
| --- | --- |
| access=yes socket:nacs | All Tesla partner vehicles can charge here with a personal adapter. "Open to other EVs" |
| access=yes socket:nacs socket:type1_combo | All Tesla partner vehicles can charge here and an adapter is provided. "Magic Dock" |
| access=customers socket:nacs | Only Tesla vehicles may charge here. "V2 Station" |
Then of course I agree: solarCanopy=true → covered=yes.
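Putting the table and the solarCanopy rule together, a rough sketch of the mapping (the field names plugs.nacs, plugs.tpc, plugs.ccs1, otherEVs, and solarCanopy are my guesses at the flattened allSites columns, so rename as needed):

```python
import pandas as pd

def stalls(row, col):
    """Stall count in a column, treating missing/NaN/blank as 0."""
    v = row.get(col)
    return int(v) if pd.notna(v) and v != '' else 0

def map_sockets(row):
    tags = {}

    # plugs/tpc and plugs/nacs are the same socket as far as OSM is concerned
    nacs = stalls(row, 'plugs.nacs') + stalls(row, 'plugs.tpc')
    if nacs:
        tags['socket:nacs'] = nacs

    # "Magic Dock" sites provide a built-in CCS1 (type1_combo) adapter
    ccs1 = stalls(row, 'plugs.ccs1')
    if ccs1:
        tags['socket:type1_combo'] = ccs1

    # "Open to other EVs" => access=yes; Tesla-only (V2) => access=customers
    tags['access'] = 'yes' if row.get('otherEVs') else 'customers'

    if row.get('solarCanopy'):
        tags['covered'] = 'yes'

    return tags
```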
If a cell is blank, it won't be tagged; so even though only one station in Florida is a Magic Dock station, every row gets the column, but only that one will get socket:type1_combo tagged. frequency=0 signifies that the amenity=charging_station is a DC station. All public Superchargers are DC, so it will be a static value, but it is still useful to distinguish, since a station or man_made=charge_point in OSM can also be AC. frequency is added by the Python script, not taken from the API endpoint.
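In the script, those statics are just constant columns applied to every row, roughly:

```python
# Static OSM tags added by the script for every station; not from the API
df['amenity'] = 'charging_station'
df['frequency'] = 0  # 0 Hz, i.e. DC charging, as opposed to an AC station
```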
Well, that's embarrassing. Yeah, if there is no socket:type1_combo then there should be no socket:type1_combo:output. It should work the same way as the part of the script where I remove the street if there is no house number (as those rows often describe intersections or mile markers).
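Something like this should cover both cleanups, in the same pandas style as the rest of the script (the addr:housenumber/addr:street column names are my assumption about the export headers):

```python
# Drop the output rating wherever there is no corresponding socket
no_combo = df['socket:type1_combo'].isna() | (df['socket:type1_combo'] == '')
df.loc[no_combo, 'socket:type1_combo:output'] = ''

# Drop the street when there is no house number, since those rows usually
# describe intersections or mile markers rather than real addresses
no_number = df['addr:housenumber'].isna() | (df['addr:housenumber'] == '')
df.loc[no_number, 'addr:street'] = ''
```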
You have opening hours as 24/7 for all of these.
Supercharge.info does not. Only some of them are marked 24/7; for others we don't know yet (these are null); others have a description when availability is limited.
On the dictionary post, it states the following for the field hours:
For sites that are not open 24/7, text indicating hours of availability for charging. (Note this is separate from the "facilityHours" field.) If this field is missing/null/blank, the site is available for charging 24/7.
From the script:
```python
# Per the dictionary: a blank/null hours field means the site is open 24/7
def extract_opening_hours(row):
    hours = row.get('availability.hours', '')
    return hours.strip() if isinstance(hours, str) and hours.strip() else '24/7'

df_fl.loc[:, 'opening_hours'] = df_fl.apply(extract_opening_hours, axis=1)
```
Are you sure that's not facilityHours? I specifically had it check for IF blank THEN 24/7 ELSE print the raw value... then I'll go through and update the few stations to OSM opening_hours syntax by hand.
Updated the Python script and uploaded the new FL export CSV. Edit: I also added a change to the script that removes the decimal from socket:type1_combo, but that was after the FL export.
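For anyone hitting the same thing: the decimal appears because pandas reads a numeric column that contains blanks as float (1 becomes 1.0). A sketch of the kind of fix I mean, not the exact lines from the script:

```python
# Cast to pandas' nullable integer dtype: blanks stay blank (pd.NA)
# and the stall counts lose the trailing ".0"
df['socket:type1_combo'] = df['socket:type1_combo'].astype('Int64')
```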
OK, I've spent an hour trying to automate it, but for some reason I can't get a breakthrough. I will manually remove the decimal from the PlugShare ID with the 2 clicks it takes in LibreOffice Calc instead of another hour of banging my head on this keyboard. 1 manual change isn't bad!
Tesla Superchargers in Florida have been imported from Supercharge.info. Please see the changeset here:
Thank you to the Supercharge.info team (and specifically @Rovastar).
There were 4 stations present in OSM that are not present in SC.info. (Edit: I suspect these are really destination chargers incorrectly labelled as Superchargers, but only a survey would confirm that.)
Excellent work.
I cross-referenced your import back to supercharge.info and noticed a couple of things. As you know, we store an OSM nodeID, and I'm going to update our database with these values. For 2 of these, we have values that differ from your import/merge of the data.
Your matching script missed 2 sites that were already in OSM and added an additional new node. These happen to be in the same place, so we have them pretty much directly on top of each other.
Here are the old nodes.
Looks like a brand mismatch.
Ideally we want to minimise adding duplicate data to OSM.
Maybe you could look to see if our values differ, or, if that is too difficult, I could run some cleanup on these after the import.
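If it helps, a crude proximity check against the existing OSM nodes would flag stacked duplicates like these before upload. A sketch, assuming both datasets are plain lists of (lat, lon) pairs; the 50 m threshold is arbitrary:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def near_duplicates(import_pts, osm_pts, threshold_m=50):
    """Yield (import_index, osm_index) pairs closer than the threshold."""
    for i, (lat, lon) in enumerate(import_pts):
        for j, (olat, olon) in enumerate(osm_pts):
            if haversine_m(lat, lon, olat, olon) < threshold_m:
                yield i, j
```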