Bulk import of Tesla Superchargers in Florida

As mentioned in the June 8th edition of OSMUS State of Charge, supercharge.info (and specifically @Rovastar) has granted permission for OSM to import their data. This can be worldwide but I am focusing on the United States. As a test, I manually imported and edited Georgia locations. After some discussion on the supercharge.info forums I have refined my approach. For my next (and hopefully final) test, I did the state of Florida. I have shared the python script I used, the pre-formatted file, and the cleaned and filtered OSM-import ready file on my Codeberg in the /US-FL folder. This is the result of nothing other than running the script then uploading the file. No manual touches have been done.

From my review, it seems like it was resoundingly successful in Florida, and is ready to be imported into JOSM, run through the conflation tool, then uploaded to OSM. However, in the spirit of collaboration, I want to involve everyone here. If you can, please review the filtered CSV on the Codeberg repo, and let me know if anything needs updating. I am happy to oblige community suggestions before I change the filter from (df['address.state'] == 'FL') to (df['address.country'] == 'USA'). The values used in the script come from the allSites endpoint and definitions come from this forum post. I have also added some OSM specific tags, such as amenity=charging_station, frequency=0 (to indicate DC charging), and brand/branch information. These are static values and would apply to any station imported into OSM from supercharge.info anywhere worldwide.
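For anyone following along, the filter change mentioned above is a one-line swap in the pandas script. This is only a sketch: the column names `address.state` and `address.country` follow the flattened allSites JSON, and the sample rows are made up for illustration.

```python
import pandas as pd

# Hypothetical rows shaped like the flattened allSites data
# (pd.json_normalize produces dotted column names like these).
sites = pd.DataFrame([
    {"name": "Ocala, FL", "address.state": "FL", "address.country": "USA"},
    {"name": "Macon, GA", "address.state": "GA", "address.country": "USA"},
    {"name": "Toronto, ON", "address.state": "ON", "address.country": "Canada"},
])

df_fl = sites[sites["address.state"] == "FL"]      # single-state test run
df_us = sites[sites["address.country"] == "USA"]   # nationwide import

print(len(df_fl), len(df_us))
```

Everything downstream of the filter stays the same, so a state-level dry run and the full US import share one code path.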

Thank you all for your review and I look forward to the feedback! If there is no large objection, I will update the Florida stations in a few days, and likely the entire US after SotM next week (come say hi! I’ll talk your ear off about this stuff!)

Kevin

Just looked through this.

To improve the data quality here, you can use the plugs/tpc values as the socket:nacs values.

They mean the same thing really, plugs/nacs and plugs/tpc, for the socket type in OSM.
TPC is the 'Tesla proprietary connector', i.e. the Tesla plug/NACS connector. We keep the distinction because that is what it was initially known as, and it is mostly legacy in our data. There is some question whether TPC runs the same protocols as NACS, as some hardware upgrades were required, but generally they are used for Tesla only.

Bottom line for OSM they are the same thing.

Please include all of these.

Also you could add if solarCanopy=true then covered=yes
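Both of those suggestions reduce to simple column mappings. A sketch of what they might look like in the script; the field names `plugs.tpc` and `solarCanopy` come from the allSites data, and the sample rows are invented.

```python
import pandas as pd

# Hypothetical slice of the flattened allSites data.
df = pd.DataFrame([
    {"plugs.tpc": 12, "solarCanopy": True},
    {"plugs.tpc": 8, "solarCanopy": False},
])

# plugs/tpc and plugs/nacs are the same connector for OSM purposes,
# so the stall count maps straight onto socket:nacs.
df["socket:nacs"] = df["plugs.tpc"]

# solarCanopy=true -> covered=yes; leave the tag off otherwise.
df["covered"] = df["solarCanopy"].map({True: "yes", False: None})

print(df[["socket:nacs", "covered"]])
```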


I also would not include values for socket:type1_combo:output if there are no stalls. There is only 1 station in Florida with such stalls.
What is frequency in the csv? Is that imported? If so, what does it mean?

Thanks for this! I wasn’t sure if tpc was the same hardware as NACS. That answers that. So we will have the following:

| OSM Values | Meaning |
| --- | --- |
| access=yes<br>socket:nacs | All Tesla partner vehicles can charge here with a personal adapter. "Open to other EVs" |
| access=yes<br>socket:nacs<br>socket:type1_combo | All Tesla partner vehicles can charge here and an adapter is provided. "Magic Dock" |
| access=customers<br>socket:nacs | Only Tesla vehicles may charge here. "V2 Station" |

Then of course I agree solarCanopy=True → covered=yes

If a cell is blank, it won't be tagged, so even though only 1 station in Florida is a Magic Dock station, all get the column, but only that 1 will get the socket:type1_combo tag. frequency=0 signifies the amenity=charging_station is a DC station. All public Superchargers are DC, so it will be a static value, but overall in OSM a charging station or man_made=charge_point can be AC, so it's useful to distinguish. Frequency is added by the python script, not from the API endpoint.
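The blank-cell behaviour described above can be sketched as a per-row tag builder that simply drops empty values. The column names and rows here are illustrative, not the real export.

```python
import pandas as pd

def row_to_tags(row):
    # Only emit tags for non-blank cells, so e.g. socket:type1_combo
    # appears on the one Magic Dock station and nowhere else.
    return {k: v for k, v in row.items() if pd.notna(v) and str(v).strip() != ""}

df = pd.DataFrame([
    {"amenity": "charging_station", "socket:nacs": 16, "socket:type1_combo": None},
    {"amenity": "charging_station", "socket:nacs": 8, "socket:type1_combo": 2},
])

tags = df.apply(row_to_tags, axis=1).tolist()
print(tags)
```

Every row keeps the full set of columns in the CSV; the tag dictionary only materialises the non-blank ones.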

Also you could add these too:

operator=Tesla
operator:wikidata=Q478214
brand=Tesla Supercharger
brand:wikidata=Q17089620

This is what we have in the NSI for the Tesla Superchargers.
I believe NSI will try to match these as Tesla destination chargers if the brand is just Tesla.


Completely agree, brand/operator should reflect NSI values. I’ll add those as static columns as they’ll be the same for all stations in the US.

I’m talking about socket:type1_combo:output, not socket:type1_combo. You have values in the csv for all of them.

Well that’s embarrassing. Yeah, if there is no socket:type1_combo then there should be no socket:type1_combo:output. It should work the same as in the script where I remove the street if there is no house number (as these often describe intersections or mile markers).
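Both cleanups follow the same dependent-field pattern. A sketch, with column names assumed from the thread and sample rows invented:

```python
import pandas as pd

df = pd.DataFrame([
    {"socket:type1_combo": None, "socket:type1_combo:output": "250 kW",
     "addr:housenumber": None, "addr:street": "Mile Marker 12"},
    {"socket:type1_combo": 1, "socket:type1_combo:output": "250 kW",
     "addr:housenumber": "5461", "addr:street": "Factory Shops Boulevard"},
])

# No plug -> no output rating for that plug.
df.loc[df["socket:type1_combo"].isna(), "socket:type1_combo:output"] = None

# No housenumber -> the street is likely an intersection or mile
# marker description, so drop it too.
df.loc[df["addr:housenumber"].isna(), "addr:street"] = None

print(df)
```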


Just noticed another: socket:nacs has a decimal value from the source… that should probably just be a whole number.


You can add website=https://www.tesla.com/findus/location/supercharger/locationID
The location ID is the ref that Tesla uses and can be linked to a webpage for each location.
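Building that website tag is a string concatenation per row. A sketch, assuming the flattened allSites field is named `locationId` (check the actual field name in the dump) and using made-up IDs:

```python
import pandas as pd

# Hypothetical location IDs; the real ones come from allSites.
df = pd.DataFrame({"locationId": ["ellentonfl", "ocalafl"]})

# Tesla's find-us pages follow a stable per-location URL pattern.
df["website"] = (
    "https://www.tesla.com/findus/location/supercharger/" + df["locationId"]
)

print(df["website"].tolist())
```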

When you have tweaked it, if you can share the csv again, I can check it again.

You have opening hours as 24/7 for all of these.
Supercharge.info does not. Only some of them are marked 24/7; others we don’t know yet (these are null), and others have a description when availability is limited.

On the dictionary post, it states the following for the field hours:

For sites that are not open 24/7, text indicating hours of availability for charging. (Note this is separate from the ā€œfacilityHoursā€ field.) If this field is missing/null/blank, the site is available for charging 24/7.

From the script:

    def extract_opening_hours(row):
        hours = row.get('availability.hours', '')
        return hours.strip() if isinstance(hours, str) and hours.strip() else '24/7'

    df_fl.loc[:, 'opening_hours'] = df_fl.apply(extract_opening_hours, axis=1)

Sorry yes null is 24/7. I was too quick to post that!

You did miss one though.

4507,2022-12-28,16,5,250,16.0,250,yes,Ellenton,FL,34222,27.53793,-82.50736,5461,Factory Shops Boulevard,0,US,Tesla,Tesla Supercharger,charging_station,24/7

Has 5:00AM - 11:00PM in supercharge.info
Maybe for simplicity, if it contains any value, don’t mark it 24/7.

Are you sure that’s not facilityHours? I specifically had it check IF blank THEN 24/7 ELSE print the raw format… then I’ll go through and update the few stations to OSM opening_hours syntax by hand.

Updated the python script and uploaded the new FL export CSV. Edit: added the following to the script, which removes the decimal from socket:type1_combo too, but this was after the FL export:

    # Remove decimals from socket counts; the nullable Int64 dtype keeps
    # blank cells (stations with no such stalls) from raising on the cast
    df_fl['socket:nacs'] = df_fl['socket:nacs'].astype('Int64')
    df_fl['socket:type1_combo'] = df_fl['socket:type1_combo'].astype('Int64')

OK, I’ve spent an hour trying to automate it, but for some reason I can’t get a breakthrough. I will manually remove the decimal from the plugshare ID with the 2 buttons it takes in LibreCalc instead of another hour of banging my head on this keyboard. 1 manual change isn’t bad!
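For what it’s worth, the stray decimal usually appears because pandas parses a column containing blanks as float; the nullable Int64 dtype handles that case in one cast. A sketch, assuming the column is named `ref:plugshare` (the real column name may differ):

```python
import pandas as pd

# Blank cells force the column to float, which is where the ".0" comes from.
df = pd.DataFrame({"ref:plugshare": [123456.0, None, 98765.0]})

# Nullable Int64 keeps the blanks as <NA> while dropping the decimal.
df["ref:plugshare"] = df["ref:plugshare"].astype("Int64")

print(df["ref:plugshare"].tolist())
```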

If no objection, I will upload these stations tomorrow (June 18), and begin work on the overall US import.

Tesla Superchargers in Florida have been imported from Supercharge.info. Please see the changeset here:

Thank you to the Supercharge.info team (and specifically @Rovastar).

There were 4 stations present in OSM not present in SC.info. (Edit: I suspect these are really destination chargers incorrectly labelled as Superchargers, but only a survey would confirm that.)

Excellent work.
I cross-referenced your import back to supercharge.info and noticed a couple of things. As you know, we have an OSM nodeID field, and I’m going to update our database with these values. For 2 of these we have different values to your import/merge of data.

Your matching script missed 2 sites that were already in OSM and added additional new nodes. These happen to be in the same place, so we have them pretty much directly on top of each other.

Here are the old nodes.

Looks like a brand mismatch.
Ideally we want to minimise adding duplicate data to OSM.

Maybe you could look to see if our values differ or if that is too difficult I could run some cleanup on these after the import.
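One possible way to prevent those duplicates on the nationwide run is a plain proximity check before creating a new node: if an existing charging station already sits within a few tens of metres, treat it as a match regardless of brand spelling. This is a sketch with a hand-rolled haversine, not the conflation tool’s actual logic, and the threshold and sample coordinates are assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in metres on a spherical Earth.
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def find_match(site, osm_nodes, max_m=75):
    # Return the nearest existing node within max_m metres, else None,
    # ignoring brand/name so spelling mismatches don't create duplicates.
    best = None
    for node in osm_nodes:
        d = haversine_m(site["lat"], site["lon"], node["lat"], node["lon"])
        if d <= max_m and (best is None or d < best[1]):
            best = (node, d)
    return best[0] if best else None

existing = [{"id": 111, "lat": 27.53793, "lon": -82.50736}]
new_site = {"lat": 27.53801, "lon": -82.50740}  # roughly 10 m away
print(find_match(new_site, existing))
```

A check like this would flag the two stacked pairs before upload rather than needing cleanup afterwards.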