Opinion

Tom MacWright, after joining val.town, reflects on building Placemark. It’s an honest account of what it’s like to build and grow a business—something we don’t see very often.

Placemark will live, but in what form isn’t entirely clear:

I’ve envisioned it as a tool that you can use for simple things but can grow into a tool you use professionally or semi-professionally, but maybe that’s not the future: the future is Canva, not Illustrator.

I’ve been wondering how the announcement of Felt, which happened around the same time as Placemark’s, would affect Placemark’s future. Felt has venture capital, a team of smart people, and a lot of buzz, whilst Placemark is a bootstrapped one-man show.

How did we get to the point where there’s a need for a consortium aiming to standardise road map data? And what motivates big names like AWS, Microsoft, Meta, and TomTom to join forces?

James Killick has answers: Several providers are each building their own version of the same product, leading to a fractured landscape for map data. Some providers are now looking to lower the cost of producing their data, assert control over how open data is created, and improve interoperability between data sources.

James Killick on the problem of geodata standardisation:

The lack of common, broadly adopted geospatial data exchange standards is crippling the geospatial industry. It’s a bit like going to an EV charger with your shiny new electric vehicle and discovering you can’t charge it because your car has a different connector to the one used by the EV charger. The electricity is there and ready to be sucked up and used, but, sorry — your vehicle can’t consume it unless you miraculously come up with a magical adaptor that allows the energy to flow.

Standards exist for public-transport information but are missing for many other types of geodata. The commercial premise for these domains is different.

For public-transport organisations, their data is not the product. Trains and buses moving through a city are. Network and schedule data is a means to get more people to use public transport, so you want to get this information in front of as many people as possible: through displays at stations, on a website, or in third-party applications. And you want to integrate with other transport authorities’ data to provide a seamless service. All this is best accomplished through shared interfaces and data models.
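
The canonical instance is GTFS, the de-facto standard for publishing network and schedule data: thousands of agencies publish the same CSV-in-a-zip format, so one consumer works against all of them. A minimal sketch in Python (the feed file name is a placeholder):

    # Reading stops from a GTFS feed: a zip of CSV files with
    # standardised names and columns, identical across agencies.
    import csv
    import io
    import zipfile

    with zipfile.ZipFile("agency_feed.zip") as feed:  # placeholder file name
        with feed.open("stops.txt") as f:
            reader = csv.DictReader(io.TextIOWrapper(f, "utf-8-sig"))
            for stop in reader:
                # stop_id, stop_name, stop_lat and stop_lon are standard GTFS fields
                print(stop["stop_name"], stop["stop_lat"], stop["stop_lon"])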

On the other hand, road-network and address data isn’t a vehicle to sell a product; it usually is the product. You license it because you offer a service (delivery, navigation) that requires this information. The companies providing that data often survey and maintain it themselves. The idea that you could swap out or merge their data with someone else’s using the routines and data models you have already built is a threat to that business model. They don’t want interoperability; they want lock-in, so you keep paying them, not somebody else.

Jonathan Crowe, writing on The Map Room, has a better understanding of TomTom’s new Maps platform than I had:

TomTom plans to do so by combining map data from its own data, third-party sources, sensor data, and OpenStreetMap. I’ve been around long enough to know that combining disparate map data sources is neither trivial nor easy. It’s also very labour intensive. TomTom says they’ll be using AI and machine learning to automate that process. It’ll be a real accomplishment if they can make it work. It may actually be a very big deal. I suspect it may also be the only way to make this platform remotely any good and financially viable at the same time.

This sounds very ambitious. Automated data fusion has been a popular research topic amongst PhD students for years. Maybe TomTom will be the first organisation to create a viable product this way; who knows?
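
To get a feel for why it’s hard, consider the simplest conflation step: deciding whether two points from different datasets describe the same place. A toy sketch (my illustration, certainly not TomTom’s method); even this naive matcher needs arbitrary thresholds, and the edge cases multiply from here:

    # Toy conflation: match POIs from two sources by distance and name
    # similarity. Real-world fusion (abbreviations, translations, moved
    # or closed places, one-to-many matches) is far messier than this.
    from difflib import SequenceMatcher
    from math import asin, cos, radians, sin, sqrt

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in metres.
        dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
        a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
        return 2 * 6371000 * asin(sqrt(a))

    def same_place(a, b, max_dist_m=50, min_name_sim=0.8):
        close = haversine_m(a["lat"], a["lon"], b["lat"], b["lon"]) <= max_dist_m
        alike = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio() >= min_name_sim
        return close and alike

    a = {"name": "Cafe Central", "lat": 52.5200, "lon": 13.4050}
    b = {"name": "Café Central", "lat": 52.5201, "lon": 13.4049}
    print(same_place(a, b))  # True, but "St." vs "Saint" already breaks it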

Related to yesterday’s post: Lat × Long reader DoudouOSM pointed to an interview with OpenStreetMap founder Steve Coast on the Minds Behind Maps podcast. Steve talks about the future of maps, anticipating that they will disappear from our apps into the background and that we will interact with geographic information much less.

The whole interview is worth watching, but beware: it’s three hours long! The relevant bits here start about 20 minutes in.

James Killick, over at Map Happenings, contemplates whether we’re witnessing the end of consumer maps:

It’s all part of a trend, a downward trend in my opinion, that will result in the demise of consumer maps. Contrary to Beck’s approach to distill reality into its essential essence, we’re moving in the opposite direction.

We are instead on a path to the dreaded metaverse, a virtual world where we should all be thankful and glad to wander around as legless avatars with the aspirational goal of reaching social media nirvana. I don’t know about you, but, ugh.

Sure, Zuck wants us all to stay home and spend all our money inside his multi-player game instead of going on holidays and exploring places.

But no matter what, we’ll continue to go places, and navigating unfamiliar territory will always involve maps. These maps will look different from what we’re using today: they will involve more real-time information and more data capturing sentiment, and our phone cameras will play a vital role.

Is it really that bad if future maps don’t resemble those made by Harry Beck or the Ordnance Survey in the olden days? I don’t think so; it’s called progress. I remember arriving in London almost ten years ago. Citymapper was a godsend. Even though you rarely looked at a map, it made this humongous city approachable to a boy from a small-ish town in East Germany.

Whether future solutions can be called maps as defined by the National Geographic Society doesn’t matter. Whether we old people like the look of digital way-finding tools doesn’t matter either. What matters is that they make cities easier to explore and navigate for the majority of people.

Greg Miller, writing for Wired Magazine, in a portrait of Cynthia Brewer, of ColorBrewer fame:

Brewer’s influence on cartography is far-ranging. Others have imitated her approach, developing a TypeBrewer and a Map Symbol Brewer. She’s seen her color schemes in everything from financial charts to brain imaging studies.

It’s a portrait of a university professor working on a rather niche subject, published in one of the most renowned technology publications. It goes to show how much influence Brewer’s work has on our craft.
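
Chances are you have used her work without realising it. Matplotlib, for one, ships the ColorBrewer schemes as named colormaps, as this quick sketch shows:

    # ColorBrewer schemes are available in Matplotlib as named colormaps,
    # e.g. the diverging "RdYlBu" or the qualitative "Set2".
    import matplotlib.pyplot as plt
    import numpy as np

    data = np.random.rand(10, 10)
    plt.imshow(data, cmap="RdYlBu")  # Brewer's diverging red-yellow-blue scheme
    plt.colorbar()
    plt.show()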

Jacob Hall wrote a recap about how he mapped his campus at William & Mary:

The most rewarding part of this project was getting to engage with community members in and around campus that I otherwise would have never met.

One of the positive effects of going out and mapping an area, especially when done with such determination, is that you get to know your neighbourhood and its community in intricate detail. Right now, there is probably no one in the world who knows the William & Mary campus better than Jacob Hall.

Bill Dollins reflects on the value of industry standards after working with proprietary product APIs:

In the geospatial field, the work of OGC gives us a bit more shared understanding. Because of the Simple Features Specification, we have GeoJSON, GML, GeoPackage, and various similar implementations across multiple open-source and proprietary database systems and data warehouses. Each of those implementations has benefits and shortcomings, but their common root shortens the time to productivity with each. The same can be said of interfaces, such as WxS. I have often been critical of WxS, but, for all the inefficiencies across the various specs, they do provide a level of predictability across implementing technologies which frees a developer to focus on higher-level issues.

OGC’s W*S specifications (e.g., WMS, WFS, and WCS) share similar features. Each provides a GetCapabilities operation advertising the service’s, well, capabilities, plus operations to access the service’s items (GetMap, GetFeature, or GetCoverage). The precise parameters required to execute the requests vary, and so do the server responses, but a good understanding of one specification transfers to the other, similar specifications.
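
That transfer shows up directly in client code. A sketch using OWSLib, with placeholder endpoints and layer names; the same pattern of connecting, inspecting the capabilities, and fetching items covers WMS and WFS alike:

    # The shared access pattern across two W*S specs, via OWSLib.
    from owslib.wfs import WebFeatureService
    from owslib.wms import WebMapService

    # Placeholder endpoints; substitute any WMS/WFS server.
    wms = WebMapService("https://example.com/wms", version="1.3.0")
    wfs = WebFeatureService("https://example.com/wfs", version="2.0.0")

    # OWSLib has already issued GetCapabilities under the hood:
    print(wms.identification.title, list(wms.contents))
    print(wfs.identification.title, list(wfs.contents))

    # Fetching items differs in parameters, not in overall shape.
    image = wms.getmap(layers=["roads"], styles=[""],  # "" = server default style
                       srs="EPSG:4326", bbox=(-180, -90, 180, 90),
                       size=(600, 300), format="image/png")
    features = wfs.getfeature(typename=["roads"])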

The same flexibility and predictability is built into newer standards, like OGC API - Features, and community specifications, like STAC; both share the same foundation. OGC’s processes may be slow, and the specifications may not make for an entertaining read, but the diligence pays off in predictable API design, enabling service and client developers to implement applications consistently.
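
The newer APIs keep that predictable shape while swapping XML capabilities documents for JSON over plain HTTP. A sketch against a hypothetical OGC API - Features endpoint:

    # OGC API - Features: the same discover-then-fetch pattern, now in JSON.
    import requests

    base = "https://example.com/ogcapi"  # placeholder endpoint

    # Discovery: the collections document takes the role of GetCapabilities.
    collections = requests.get(f"{base}/collections").json()["collections"]
    first = collections[0]["id"]

    # Items come back as a standard GeoJSON FeatureCollection.
    items = requests.get(f"{base}/collections/{first}/items",
                         params={"limit": 10}).json()
    print(items["type"], len(items["features"]))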

You appreciate that even more once you’ve had the pleasure of building a service against the Salesforce API.

Saman Bemel Benrud, previously a designer at Mapbox, reflects on his time at the company. It’s a tale of what happens when a company accepts big VC money. The priorities shift from solving relevant problems to making money.

Even if you’re a lowly designer or engineer, you must understand what your company needs to do to be sustainable. It is very likely different from what the company is doing now, and it may come with unexpected ethical compromises.

What other choices do companies have when they build geo-data products and compete with Google? Maybe they can grow more slowly, not sell solutions that aren’t yet available, involve employees in decisions, or accept and support unionisation efforts. The company still has to make money, but it might feel different to the people building the product. There must be a way to build a sustainable business that doesn’t involve VC funding.

Tom MacWright explores whether newer geo-data formats, like FlatGeobuf, Zarr, GeoParquet, Arrow, or COGs, are useful for applications making frequent updates to the data.

The post dives deep into some of the characteristics of these data formats, including compression, random access, and random writes, and concludes that they are optimised for reading data and that the benefits for writes are limited:

I like these new formats and I’ll support them, but do they benefit a use case like Placemark? If you’re on the web, and have data that you expect to update pretty often, are there wins to be had with new geospatial formats? I’m not sure.
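
To illustrate the asymmetry (my example, not from the post): GeoParquet reads fast, even partially, because Parquet is columnar and compressed; but those same properties make the files effectively immutable, so a one-row edit means serialising everything again. A sketch with GeoPandas:

    # GeoParquet is great to read, awkward to update: there is no
    # in-place write, so one edited feature means rewriting the file.
    import geopandas as gpd

    gdf = gpd.read_parquet("places.parquet")  # placeholder file; fast, columnar read
    gdf.loc[gdf["name"] == "Old Town Hall", "name"] = "Town Hall"
    gdf.to_parquet("places.parquet")  # full rewrite for one changed row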