Value too long for a website


A website link is longer than 255 characters, so it is impossible to add it in the website tag.
Do you know if there is a tag that allows more than 255 characters, or does the limit apply to all tags?

Best regards

That applies to all tags.


Hey Krako73,

welcome to the OpenStreetMap community forum.

If the website is just a domain: “The full domain name may not exceed the length of 253 characters in its textual representation.” That limit excludes the http:// scheme (the .com part counts toward it).
A quick search suggests each individual label of a domain (the part between the dots) is limited to 63 characters.
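To make the two limits above concrete, here is a minimal sketch of a hostname length check. The function name and structure are my own; the 253/63 numbers are the DNS limits quoted above.

```python
# Sketch: validate a hostname against the DNS length limits mentioned above
# (253 characters for the whole name, 63 per label between dots).

def hostname_ok(hostname: str) -> bool:
    """Return True if the hostname respects the 253/63 length limits."""
    if len(hostname) > 253:
        return False
    labels = hostname.split(".")
    return all(0 < len(label) <= 63 for label in labels)

print(hostname_ok("example.com"))      # True
print(hostname_ok("a" * 64 + ".com"))  # False: one label exceeds 63 chars
```

So a bare domain can never hit the 255-character tag limit; only the path and query string can push a URL past it.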

If you want to add a URL, we also have a “url” tag.
For more than 255 characters you could use a URL shortener.

But a URL with parameters, filters, or whatever else makes it that long is probably not what the tag was intended for. Maybe the tag is not the right place for it?!


@Krako73, in addition to what’s said above, also check if there are not only irrelevant query elements that you can remove without breaking the reference, but specifically tracker elements. They must be removed. This can dramatically reduce the length of the URL.
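Stripping tracker elements can be automated. Below is a minimal sketch using only the Python standard library; the list of tracker parameter names is illustrative, not exhaustive.

```python
# Sketch: drop common tracker parameters (utm_*, fbclid, gclid, ...) from a
# URL's query string. The TRACKERS set is a small illustrative sample.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKERS = {"fbclid", "gclid", "msclkid", "mc_eid"}

def strip_trackers(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKERS and not k.startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_trackers("https://example.com/page?id=7&utm_source=x&fbclid=abc"))
# https://example.com/page?id=7
```

After stripping, it is still worth opening the shortened URL once to confirm the page loads the same content.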


I believe it’s best to stay away from using URL shorteners due to security concerns.


Well, that is only because they can lead to a website you didn’t intend to visit. That is the same security reason why you shouldn’t scan QR codes, and it boils down to the same reason why you shouldn’t click on unknown website links.

That being said, since the whole purpose of the website=* key in OSM is to allow people to click on unknown links (possibly placed there by malicious actors), the point is kind of moot 🙂

Can you give us an example of such a website? While there exists some extremely small chance that someone uses a website which cannot be entered in the website=* key, I find that highly unlikely.

It is much more likely that you’re trying to share a link from some privacy-hating platform (like Facebook etc.) that adds a ton of redirects and tracking information (making the URL long) instead of giving you the REAL website of the amenity.

Indeed! As you wrote yourself, you shouldn’t click on unknown website links. When a URL isn’t shortened, you can see it clearly and decide whether to click or not. But with a shortened URL, you’re left with no choice but to click and find out what’s on the other end.

Moreover, reviewing changesets that include shortened URLs becomes harder and riskier. That’s not a good situation.

To me, it’s obvious: try to be more transparent instead of being more obscure.

By the way, Key:website - OpenStreetMap Wiki advises against using URL shorteners.

Also, it was discussed briefly here: URL Shortening.


There’s more to it than that. Some link shortening services record information about you before sending you to the intended destination. Any privacy risk would be hidden behind an innocuous-looking URL, so usual signals like looking for trustworthy domains no longer apply.

But taking my tinfoil hat off, another reason for minimizing the use of link shorteners in OSM is that the redirect might go away before the destination does. This is not a big risk with link shorteners provided by the same domain as the destination, but third-party link shorteners introduce more potential uncertainty. Archiving services such as the Wayback Machine won’t necessarily archive these redirects, so it would be more difficult to track down the original intended website.

Having said that, I’ve previously used shortlinks quite a bit when linking to Sophox queries on the wiki, because that’s the link shortener built into Sophox, and its URLs would otherwise be monstrously long. (Sophox is a fork of the Wikidata Query Service, which uses a more trustworthy link shortener, but that service is limited to Wikimedia sites.)


The value of the website tag (now a url tag, since it was changed by another mapper) is visible here: Way: ‪Neualbenreuther Maar‬ (‪351201691‬) | OpenStreetMap.

I opened this topic because I wanted to know if someone had a better idea than splitting the URL into 2 parts.

Best regards

It looks like the download, ids, dsurl, layerfieldname, and additionallayerfieldname parameters are currently unnecessary, but the additionallayerfieldvalue parameter is necessary to get the same PDF. This is 131 characters long:

Note that I don’t speak German and am unfamiliar with this tool. Sometimes removing parameters like this can turn a permalink into a not-quite-permanent link that breaks in the future.
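The approach described above, keeping only the parameters the page actually needs, can be sketched like this. The example URL and the `keep_params` helper are hypothetical; the parameter names come from the post.

```python
# Sketch: keep only a needed set of query parameters, dropping the rest.
# This is the inverse of a tracker drop-list: a keep-list.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def keep_params(url: str, needed: set) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k in needed]
    return urlunsplit(parts._replace(query=urlencode(kept)))

short = keep_params(
    "https://example.com/map?ids=1&dsurl=x&additionallayerfieldvalue=42",
    {"additionallayerfieldvalue"},
)
print(short)  # https://example.com/map?additionallayerfieldvalue=42
```

As noted, always re-test the trimmed URL: some sites silently depend on parameters that look redundant.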


Great ! Thanks !

For example…

I’d hazard a guess that most of them do (as does almost any search engine, or basically any website). Yet billions of people don’t seem to be bothered by typing something into, say, Google (even though it has an even worse track record regarding privacy than any link shortener I know of)…

Any privacy risk would be hidden behind an innocuous-looking URL, so usual signals like looking for trustworthy domains no longer apply.

Errr, I’m pretty sure everybody should consider any link shortener exactly the opposite of an innocuous-looking URL. The same goes for QR codes - they should be treated as unknown, suspicious sites. Because that is where they will land.

On the user side, there exist web services and apps that decode a shortened URL and ask the user to confirm, of course. But really, there are much more serious security problems with other things related to clicking on links - like all the issues with IDNs, domains like .zip, cyber-squatted previously-legitimate domains, XSS exploits on legitimate sites, CSRFs, spelling-doppelganger domains (or ones using a different TLD like .com/.org/.net…), etc., to name just a few. And of course, the site might be malicious as-is 😉

But taking my tinfoil hat off, another reason for minimizing the use of link shorteners in OSM is that the redirect might go away before the destination does.

They might, but in my experience, much more often the website URL schema changes or the website goes away (or is domain-squatted). helps a little if one is proactive about it. But you are correct that a link shortener does add an additional point of failure.

Anyway, my point was that a normal, legitimate website=* value should almost always fit into 255 chars. I also dislike URL shorteners, to be clear.

Apart from trying to strip unneeded parameters from the URL (which may be problematic, as noted), another alternative to link shorteners is creating wikidata=* for that OSM object and adding the URL there. That should not only solve this particular problem, but also provide other benefits. Of course, using a third party (even one as stable and as reputable as Wikidata/Wikipedia/Wikimedia Commons) has some disadvantages too.


Ideally, yes. However, a whole generation of Internet users has been trained to blindly click on links at,, et al. because of how widely these services have been used. Do we know anything about each of these couple-dozen websites other than what they purport to be?

I don’t think we could assume, for example, that every data consumer would display a clickable link based on website that is labeled with the domain name; it might be just an icon in a mobile application that the user taps while holding their breath. The user, in other words, trusts that OSM has vetted the website to some extent.

This may be a wholly unreasonable expectation, given that websites get squatted all the time, but at least we have some alternatives that will avoid exacerbating the risk. Your Wikidata suggestion is a good one that also applies to other keys, such as inscription, that tend to run over the 255-character limit.

In my opinion, it is an elegant alternative solution indeed.

We now know that at least a few of them I’ve clicked lead to some link-affiliation-monetization VigLink site (which uBlock Origin promptly blocked), so they should be fixed (expanded and trackers removed), their authors contacted via changeset comments, and maybe the DWG involved. Any takers?

Here is MapRoulette to find those URL shorteners (and fix them)…
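A challenge like that needs a way to flag candidate tags in the first place. Here is a minimal sketch of such a check; the shortener list is a small illustrative sample, not the list any real challenge uses.

```python
# Sketch: flag website=* values whose host is a well-known URL shortener.
# SHORTENERS is a tiny illustrative sample of shortener domains.
from urllib.parse import urlsplit

SHORTENERS = {"bit.ly", "t.co", "goo.gl", "tinyurl.com", "ow.ly"}

def is_shortened(url: str) -> bool:
    host = (urlsplit(url).hostname or "").lower()
    return host in SHORTENERS

tags = {"website": "https://bit.ly/3abcdef"}  # hypothetical OSM tag value
print(is_shortened(tags["website"]))  # True
```

A real scan would also want to follow the redirect (e.g. via an HTTP HEAD request) to recover the destination URL before editing the tag.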

(also ref. discussion about more problems with website URLs: Is there a procedure to prevent link rot?)