Tom MacWright, after joining val.town, reflects on building Placemark. It’s an honest account of what it’s like to build and grow a business—something we don’t see very often.
Placemark will live, but in what form isn’t entirely clear:
I’ve envisioned it as a tool that you can use for simple things but can grow into a tool you use professionally or semi-professionally, but maybe that’s not the future: the future is Canva, not Illustrator.
I’ve been wondering how the announcement of Felt, which happened around the same time as Placemark’s, would affect Placemark’s future. Felt has venture capital, a team of smart people, and a lot of buzz, whilst Placemark is a bootstrapped one-man show.
Seek-Optimized ZIP (SOZip), a new profile for the ZIP format, allows random access and selective decompression. With a standard ZIP file, you generally have to download the whole archive before you can access its contents. SOZip archives remain fully compatible with standard ZIP tools, but SOZip-aware readers can selectively access individual files within the archive, so you won’t have to download the full archive if you want to access just one file.
Currently, there are two implementations of SOZip: it’s available in the development branch of GDAL and as a Python module. MapServer (also on its development branch) and QGIS, both of which depend on GDAL, support SOZip too.
Seek-Optimized ZIP adds to a growing suite of cloud-native data formats and APIs, such as COG, Zarr, or GeoParquet, which allow developers and applications to selectively access and process large datasets without downloading them in full.
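The selective-access idea is easy to demonstrate with Python’s standard zipfile module, which can already read a single member of a local archive via the ZIP central directory. This is only an illustration of the concept, not SOZip itself: SOZip adds a hidden index that makes seeking within large compressed members efficient, so the same selective reads work over HTTP range requests without fetching the whole archive. The file names below are made up.

```python
import io
import zipfile

# Build a small DEFLATE-compressed archive in memory
# (a stand-in for a large hosted archive; names are made up).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.writestr("roads.geojson", '{"type": "FeatureCollection", "features": []}')
    zf.writestr("readme.txt", "Worldwide road data, v1")

# Read just one member via the central directory; the other
# members are never decompressed.
with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()
    text = zf.read("readme.txt").decode()

print(names)   # ['roads.geojson', 'readme.txt']
print(text)    # Worldwide road data, v1
```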
The FlatGeobuf community has started the process to formally adopt the specification as an OGC Community Standard. (A document linked by Bert Temme on Twitter, providing reasoning for the spec to become a standard, is not publicly available anymore.)
FlatGeobuf provides binary encoding for geospatial vector data. It is lossless, streamable, enables random feature access, and is supported by a wide range of geospatial tools and libraries, including QGIS, GDAL, Fiona, and PostGIS.
OGC Community Standards are developed outside the more formal OGC standardisation process, usually by a group of individuals who also implement reference solutions (instead of panels of representatives from large organisations).
A quick tutorial by Bert Temme about how to turn a shapefile into PMTiles using Tippecanoe:
In this blog we created in a few easy steps vector tiles from shapefile of worldwide railroads in PMTile format using Tippecanoe, and deployed to a standard webserver. No complicated backend WMS/WFS mapservers are needed anymore to get this working.
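For reference, the two commands at the heart of such a pipeline can be assembled as follows. This is a sketch, not the tutorial’s exact invocation: the file names are placeholders, and it assumes Tippecanoe v2+, which writes PMTiles directly when the output name ends in .pmtiles.

```python
# Sketch of the two-step pipeline: shapefile -> GeoJSON -> PMTiles.
# File names are placeholders; ogr2ogr and tippecanoe must be installed.
shapefile = "railroads.shp"
geojson = "railroads.geojson"
pmtiles = "railroads.pmtiles"

# Step 1: convert the shapefile to GeoJSON with GDAL's ogr2ogr.
convert_cmd = ["ogr2ogr", "-f", "GeoJSON", geojson, shapefile]

# Step 2: build vector tiles with Tippecanoe; -zg guesses a sensible
# maximum zoom, and the .pmtiles suffix selects PMTiles output.
tiles_cmd = ["tippecanoe", "-zg", "-o", pmtiles, geojson]

# Run each step with subprocess.run(cmd, check=True) once the
# tools and input data are in place; here we only print the commands.
for cmd in (convert_cmd, tiles_cmd):
    print(" ".join(cmd))
```

The resulting .pmtiles file is a single static asset, which is why a plain web server is all the hosting the tutorial needs.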
How did we get to the point where there’s a need for a consortium aiming to standardise road map data? And what motivates big names like AWS, Microsoft, Meta, and TomTom to join forces?
James Killick has answers: Several providers are each building their own version of the same product, leading to a fractured landscape for map data. Some providers are now looking to lower the cost of producing their data, assert control over how open data is created, and improve interoperability between data sources.
Registration for FOSSGIS 2023, the German-speaking FOSS4G event, is now open. The conference will be held from 15 to 18 March 2023 in Berlin, Germany.
If you need convincing to attend, the conference schedule is also available. The program features a mix of developer updates, case studies, and practical applications of open-source software and OpenStreetMap, as well as a series of hands-on workshops diving into the latest and greatest open-source software for geospatial.
James Killick on the problem of geodata standardisation:
The lack of common, broadly adopted geospatial data exchange standards is crippling the geospatial industry. It’s a bit like going to an EV charger with your shiny new electric vehicle and discovering you can’t charge it because your car has a different connector to the one used by the EV charger. The electricity is there and ready to be sucked up and used, but, sorry — your vehicle can’t consume it unless you miraculously come up with a magical adaptor that allows the energy to flow.
Standards exist for public-transport information but are missing for many other types of geodata. The commercial premise for these domains is different.
For public-transport organisations, their data is not the product. Trains and buses moving through a city are. Network and schedule data is a means to get more people to use public transport, so you want to get this information in front of as many people as possible—through displays on stations, a website or third-party applications. And you want to integrate with other transport authorities’ data to provide a seamless service. All this is best accomplished through shared interfaces and data models.
On the other hand, road-network and address data isn’t a vehicle to sell a product; it usually is the product. You license it because you offer a service (delivery, navigation) that requires this information. The companies providing that data often survey and maintain it themselves. The idea that you could swap out or merge their data with someone else’s using the routines and data models you have already built is a threat to that business model. They don’t want interoperability; they want lock-in, so you keep paying them, not somebody else.
Iván Sánchez Ortega reporting from his activities during the latest OGC code sprint:
when pygeoapi is requested a coverage from GIS client (preferring image/tiff or application/ld+json or the like), the raw data is returned. But when it’s a web browser (preferring text/html), then a webpage with a small viewer is returned.
It’s an interesting deep dive into HTTP content negotiation, how it relates to geodata problems, and what OGC API implementations could do better.
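Content negotiation boils down to ranking the media types in the client’s Accept header by their quality values and picking the first one the server can produce. A minimal sketch of that mechanism (not pygeoapi’s actual code; `parse_accept` and `negotiate` are illustrative helpers, and wildcard subtypes like `image/*` are ignored):

```python
def parse_accept(header):
    """Parse an Accept header into (media_type, q) pairs, highest q first."""
    offers = []
    for item in header.split(","):
        parts = [p.strip() for p in item.split(";")]
        q = 1.0
        for p in parts[1:]:
            if p.startswith("q="):
                q = float(p[2:])
        offers.append((parts[0], q))
    return sorted(offers, key=lambda t: -t[1])

def negotiate(header, supported):
    """Return the first acceptable media type the server supports, or None."""
    for media, q in parse_accept(header):
        if q == 0:
            continue
        if media in supported:
            return media
        if media == "*/*":
            return supported[0]
    return None

# A GIS client asking for raw data gets the coverage itself:
print(negotiate("image/tiff;q=1.0, text/html;q=0.5",
                ["text/html", "image/tiff"]))   # image/tiff
# A browser preferring HTML gets the small viewer page instead:
print(negotiate("text/html,application/xhtml+xml;q=0.9,*/*;q=0.8",
                ["text/html", "image/tiff"]))   # text/html
```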
A new book documents the history and activities of YouthMappers, a global movement that engages university students in local mapping, using modern technology to collect data and organise activities. Open Mapping towards Sustainable Development Goals looks at YouthMappers chapters worldwide and how their work relates and contributes to achieving the Sustainable Development Goals.
But this isn’t your average Springer publication with contributions from tenured university professors who only leave their offices to spend the summer uninterrupted in their holiday homes to write. All chapters were written by YouthMappers activists:
[T]his book aims to document and share insights about this movement’s emergence from the first-person voices of the very students themselves who are among those at the forefront of creating our new people’s map of the world. […] Each chapter puts forward the voices of students and recent graduates in countries where YouthMappers works, all over the world. Many of them hail from countries where expertise in geospatial technologies for the SDGs is nascent and needed.
Each chapter is written in the context of a primary and a secondary SDG, identified by each chapter’s authors. The topics covered are as wide-ranging as the SDGs and as diverse as the book’s contributors, including city planning, agriculture, gender equality, and ethics.
The ebook version is available for free as a PDF or ePub; the paperback is EUR 39.99, and the hardcover EUR 49.99.
Google Maps is no longer served from the subdomain maps.google.com but from a path on Google’s main domain (google.com/maps). If you’re allowing Google Maps to use your location, Google can now use it on all sites under google.com.
Jonathan Crowe, writing on The Map Room, has a better understanding than I had of TomTom’s new Map platform:
TomTom plans to do so by combining map data from its own data, third-party sources, sensor data, and OpenStreetMap. I’ve been around long enough to know that combining disparate map data sources is neither trivial nor easy. It’s also very labour intensive. TomTom says they’ll be using AI and machine learning to automate that process. It’ll be a real accomplishment if they can make it work. It may actually be a very big deal. I suspect it may also be the only way to make this platform remotely any good and financially viable at the same time.
This sounds very ambitious. Automated data fusion has been a popular research topic amongst PhD students for years. Maybe TomTom will be the first organisation to create a viable product this way; who knows?
Development Seed have published a summary of two talks from this year’s PostGIS Day. One talk introduces and compares two projects, TiMVT and TiFeatures, which simplify creating tile and OGC-Features services using data from a PostGIS database. The other talk covers PgSTAC, a set of SQL functions and triggers to stand up Postgres databases to host STAC catalogs.
Thanks to the generous length of this year’s PostGIS Day schedule, I could catch a couple of hours of talks on Friday morning. Here’s a quick summary of some of the talks I saw.
Ryan Lambert of Rust Proof Labs took a deep dive into some fantastic PostGIS wizardry for routing outside of roads, such as on waterways, indoors, or on access-restricted private roads. A considerable part of the solution to these complex problems comes down not only to deep knowledge of PostGIS’ functionality but also to basic practices: understanding your data, cleaning it, understanding and documenting edge cases, and deciding which problems you don’t want to solve. Ryan also recently published a book on PostGIS and OpenStreetMap.
Martin also gave a sneak peek at features landing in future versions of PostGIS, such as validating polygon coverage, simplifying boundaries of coverage polygons, and simplifying inner boundaries while keeping outer boundaries unchanged.
Brendan Farrell presented db2vector, which creates bespoke vector-tile APIs from data in PostGIS. Db2vector lets you define an SQL query for each API endpoint, so you can quickly create different web maps from a single data source with great flexibility. (Unfortunately, I couldn’t find any page to link to with detailed information about the service.)
And finally, Paul Ramsey talked about Moving Objects, a proof of concept he has built to demonstrate how updates to records in a Postgres database can be propagated in near real-time to clients. It uses a mix of triggers, Postgres notifications, and pg_eventserv to push notifications to web clients via WebSockets.