Lat × Long Homepage

react-google-maps is a library containing React components and hooks for building Google Maps user interfaces. It includes components to render maps, customisable markers, info windows and control panels. The hooks give developers access to underlying object instances, such as the Map object, and load additional APIs, like the geocoding or directions services. If you work in React and use the Google Maps JavaScript API, this library will save you a couple of lines.

Estimating the Cost of Hosting a Global PMTiles Dataset

In his NACIS conference talk, Brandon Liu positions Protomaps as an alternative to what he calls scarcity maps: tile services offered by commercial companies that cost a small fortune once your project becomes popular and exceeds the number of tile requests in the free tier.

Nothing in this world is free, and that includes hosting PMTiles yourself. If you want to convince someone that hosting Protomaps is a financially viable alternative, you need to compare numbers.

So let’s do some quick math and compare a rough estimate of the costs for hosting PMTiles on S3 to the monthly costs of Mapbox Vector tiles.

For the sake of simplicity, let’s assume that clients make 1.5 million tile requests per month. The costs incurred on S3 fall into two categories: data storage and data transfer.

On 3 November, the size of a PMTiles dataset based on OpenStreetMap covering the whole world was 107.62 GB. AWS charges $0.023 per GB per month to store data in S3, so the cost to store a global map is about $2.47.

To estimate the transfer costs, we need to know the average size of a tile delivered over the network. The Protomaps website conveniently has an example that shows the size of each tile response. I zoomed and panned around on the map and logged the individual sizes of about two hundred requests. The average size per tile in my sample was 68.88 KB. 1.5 million tile requests at 68.88 KB each rack up about 103 GB in transferred data. AWS charges $0.09 per GB transferred from S3 to the internet, so the overall data-transfer cost is $9.27.

The cost to host and serve a world-wide map dataset is about $12. But here’s the catch: if you put a CloudFront CDN in front of your S3 bucket (which you probably want to do), data transfer from S3 to CloudFront is free, and so is the first terabyte from CloudFront to the internet. Chances are you can host your PMTiles for less than $5.
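If you want to redo the numbers with your own traffic, here is the same back-of-the-envelope calculation as a small Python sketch. The prices and the average tile size are the assumptions from above; your figures will differ by region and dataset.

```python
# Back-of-the-envelope S3 costs for hosting a global PMTiles dataset.
# All inputs are the assumptions from the text above, not live AWS prices.

dataset_size_gb = 107.62      # global OSM-based PMTiles dataset
storage_price = 0.023         # USD per GB per month, S3 Standard

requests_per_month = 1_500_000
avg_tile_kb = 68.88           # average tile response size from my sample
transfer_price = 0.09         # USD per GB from S3 to the internet

storage_cost = dataset_size_gb * storage_price
transferred_gb = requests_per_month * avg_tile_kb / 1_000_000
transfer_cost = transferred_gb * transfer_price

print(f"storage:  ${storage_cost:.2f} per month")
print(f"transfer: {transferred_gb:.0f} GB -> ${transfer_cost:.2f} per month")
print(f"total:    ${storage_cost + transfer_cost:.2f} per month")

# With CloudFront in front of the bucket, the transfer line largely
# disappears: S3 to CloudFront is free, and so is the first terabyte
# from CloudFront to the internet.
```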

The same 1.5 million vector-tile requests on Mapbox will cost you $325, a significant difference. Even considering the labour costs of setting up the infrastructure and data on AWS, and making the occasional update, PMTiles will save money. Like a lot of money.

Disclaimer: This is an informed estimate, not a scientific study. I literally did this on the back of an envelope. It’s not my fault if you take these numbers to your boss to convince them to adopt Protomaps and it turns out you’re paying $25 per month.

Overture's Global Entity Reference System Hopes to Improve Data Interoperability. It's Not There Yet.

Overture Maps have released an update to their building dataset, which now includes a unique ID for each building: the Global Entity Reference System (GERS) ID.

One of the important new features in this release is the introduction of Global Entity Reference System (GERS) IDs. GERS IDs have been assigned to over 1.6 million building footprints across several cities in North America, South America, and Europe.

GERS is a system of encoding map data to a shared universal reference, which provides an easy mechanism to conflate data from different data providers based on a specific GERS ID.

Overture introduced GERS recently; it aims to provide a globally unique reference for every entity that can be mapped. The idea, it appears, is that data providers outside of Overture can enrich their datasets with GERS IDs to increase interoperability and reduce the effort required to fuse data. An enticing idea, sure, but it seems GERS is of little use outside of Overture’s ecosystem.

Let me explain.

To enrich a dataset with GERS IDs, you have to match your data to Overture’s using geometry intersections. This works, but it’s not a novel approach: we were able to do spatial joins before, and IDs also existed before. If a feature is not yet part of Overture’s data, the only way to create a GERS ID is to add the feature to Overture’s dataset. GERS practically doesn’t exist outside of Overture data.
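To make that point concrete, here is a minimal sketch of the matching step with GeoPandas. The file names and the gers_id column are hypothetical, and real conflation would have to deal with overlapping footprints and ambiguous matches.

```python
import geopandas as gpd

# Hypothetical inputs: my own building footprints and an Overture
# buildings extract that carries a GERS ID for every footprint.
my_buildings = gpd.read_file("my_buildings.geojson")
overture = gpd.read_parquet("overture_buildings.parquet")[["id", "geometry"]]
overture = overture.rename(columns={"id": "gers_id"})

# Match by geometry intersection: an ordinary spatial join, nothing
# that wasn't possible before GERS existed.
matched = gpd.sjoin(
    my_buildings,
    overture.to_crs(my_buildings.crs),
    how="left",
    predicate="intersects",
)

# Features without a match get no GERS ID; the only way to mint one is
# to contribute the feature to Overture's dataset.
print(matched["gers_id"].isna().sum(), "buildings without a GERS ID")
```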

And questions remain about how GERS keeps up with changing data. What if I knock down my house and build a new one in the same place? Does the original GERS ID carry over, or is this a new one? What if I subdivide my property? Does this result in two new GERS IDs, or does the existing one apply to one part? What if I buy the property next door and connect the two houses so they become one? Does that get a new GERS ID, or one of the original ones? And what if the GERS ID for an entry changes? How do I keep track of these changes to update the IDs in my dataset? If we look at GERS as a gateway to keep datasets in sync, then these are crucial questions to answer.

It’s early days, and I’m sure there are discussions within Overture to address these concerns. But for now, GERS’ only application will be the conflation of data from Overture data donors to produce their building datasets.

The recordings from FOSS4G Oceania, which took place just last week in Auckland, are already up on YouTube.

The global OpenStreetMap community will meet for their annual State of the Map conference in Nairobi, Kenya, next year from 6 to 8 September.

We are thrilled to officially announce that the global conference of the OpenStreetMap community, State of the Map (SotM), will be making its way to Nairobi, Kenya from September 6th-8th 2024! This landmark event will bring together passionate mappers, data enthusiasts, technologists, and community members from all corners of the globe to celebrate the spirit of collaboration and open mapping.

The conference will be held in hybrid format, both online and in-person.

Python library Lonboard promises super-fast visualisation of huge geospatial datasets in Jupyter notebooks. The demo renders over three million data points in under three seconds, a load that brings other libraries to their knees.

We’re sharing lonboard, a new python library, to fill this need. On a dataset with 3 million points, ipyleaflet crashed after 3.5 minutes, pydeck crashed after 2.5 minutes, but lonboard successfully rendered in 2.5 seconds.

Impressive speed, all without clustering or downscaling the data. Lonboard renders exactly the number of features it finds in the dataset. How is this possible, you ask? Lonboard employs efficient binary data encodings, as opposed to more traditional text-based formats like GeoJSON:

Lonboard is so fast because it moves data from Python to JavaScript (in your browser) and then from JavaScript to your Graphics Processing Unit (GPU) more efficiently than ever before. Other Python libraries for interactive maps encode data as GeoJSON to copy from Python to the browser. GeoJSON is extremely slow to read and write, resulting in a very large data file that must be copied to the browser.

With lonboard, the entire pipeline is binary. In Python, GeoPandas to GeoArrow to GeoParquet avoids a text encoding like GeoJSON, resulting in a compressed binary buffer that can be efficiently copied to the browser. In JavaScript, GeoParquet to GeoArrow offers efficient decoding (in WebAssembly). Then deck.gl can interpret the raw binary buffers of the GeoArrow table directly without any parsing (thanks to @geoarrow/deck.gl-layers).
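To get a feel for the API, here is a minimal sketch of rendering a GeoDataFrame with Lonboard in a notebook. The dataset path is a placeholder and the styling arguments are my assumptions; check the Lonboard documentation for the current API.

```python
import geopandas as gpd
from lonboard import Map, ScatterplotLayer

# Placeholder dataset: any GeoDataFrame with point geometries will do.
gdf = gpd.read_parquet("millions_of_points.parquet")

# The GeoDataFrame is converted to GeoArrow under the hood and shipped
# to the browser as binary buffers, which is where the speed comes from.
layer = ScatterplotLayer.from_geopandas(
    gdf,
    get_radius=5,                  # radius in meters
    get_fill_color=[0, 100, 200],  # RGB fill colour
)
Map(layer)  # display the map as the notebook cell output
```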

Very Spatial has compiled a collection of free and open books on spatial analysis.

I am teaching a straight forward, stand-alone Spatial Analysis class for the first time in a couple of decades. That means that I have been looking at resources to share with the class, especially reference materials that they can access given that they will mostly forget what I tell them by February once the next semester is in swing.

The call for presentations for next year’s FOSSGIS conference is now open until 6 November. Every year, FOSSGIS gathers German-speaking makers and users of geospatial open-source software and the OpenStreetMap community. The conference organisers are looking for proposals for presentations, lightning talks and workshops covering project news, use cases and research.

FOSSGIS 2024 will be hosted in Hamburg at the TUHH campus from 20 to 23 March 2024.

FlatGeobuf, GeoParquet, PMTiles, whatnot—I find it hard to understand when and how to use new cloud-native data formats. Brandon Liu has answers.

For analytics, use FlatGeobuf or GeoParquet:

FlatGeobuf and GeoParquet are analysis-focused formats. They’re useful for answering queries like What is the sum of attribute A over features that overlap this polygon? But their design does not enable cloud-native visualization like COG does.

For visualisation, you can convert FlatGeobuf and GeoParquet data into tiled formats using tools like Tippecanoe:

The best-in-class tool for creating vector tiles from datasets like FlatGeobuf and GeoParquet is tippecanoe, originally developed by Mapbox, but since v2.0 maintained by Felt. Tippecanoe doesn’t just slice features into tiles, it generates smart overviews for every zoom level matching a typical web mapping application. It adaptively simplifies and discards features, using many configuration options, to assemble a coherent overview of entire datasets with minimal tile size.

The output from Tippecanoe can be PMTiles, a format that can be read directly in the browser:

The last missing piece is a cloud-friendly organization of tiles enabling efficient spatial operations. This is the focus of my PMTiles project, an open specification for COG-like pyramids of tiled data, suited to planet-scale vector mapping.

The post doesn’t go into any technical details. I enjoyed it as a short and sweet overview of these new(ish) formats and how they relate to each other.
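Since the post stays high level, here is a rough illustration of the FlatGeobuf-to-PMTiles step, shelling out to Tippecanoe from Python. The file names are placeholders, the flags are common defaults rather than a recommendation, and I’m assuming a recent (Felt-maintained) Tippecanoe build that reads FlatGeobuf and writes PMTiles directly.

```python
import subprocess

# Tile a FlatGeobuf file into a PMTiles archive with Tippecanoe.
subprocess.run(
    [
        "tippecanoe",
        "-o", "buildings.pmtiles",       # PMTiles archive, ready for S3/CDN
        "-zg",                           # let Tippecanoe guess a max zoom
        "--drop-densest-as-needed",      # thin features to keep tiles small
        "-l", "buildings",               # layer name inside the tiles
        "buildings.fgb",
    ],
    check=True,
)
```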

This post by Jed Sundwall is a comprehensive look at the current state and future of Source.Coop, a repository for open data by Radiant Earth.

Mapstack has similar goals, and back in May I wrote:

Mapstack doesn’t tie in with existing tools. Currently, there is no tooling to create or manage data, collaborate or visualise the data. It’s a place where the result of data processing might be hosted. Open data providers have invested in the infrastructure to host data—it’ll be hard to convince them to migrate to Mapstack instead.

I have similar thoughts about Source. These data repositories can be useful for providers of small-scale datasets who don’t want to run their own infrastructure, but I question whether we’ll see large, global datasets on them. Then again, Source already hosts several substantial datasets from big names such as NASA, ESA, and CGIAR, presumably thanks to the well-established networks of the people involved in building Source. I have been wrong about these things before.

Currently it’s hard to understand what data is available on Source without paginating through all datasets. The platform lacks advanced search functionality that lets me look for data by geographic region, data format, or time. And to preview the data, I have to download it first and use third-party tooling. A map or table preview on the website would make it far more convenient for people to explore data. (Both of these features are on Source’s roadmap.)

Geofencing business Radar introduced a new suite of APIs, aiming to get a slice of Google Maps’ and Mapbox’s business:

Radar Maps Platform has it all, including base maps, geocoding APIs (forward geocoding, reverse geocoding, IP geocoding), search APIs (address autocomplete, address validation, address autocomplete), routing APIs (distance, matrix, route optimization, route matching, directions), and UI kits (address autocomplete).

Our beautiful vector base maps support classic, dark, and light themes. You can easily add base maps and UI kits to your website with v4 of the Radar JavaScript SDK.

There is nothing ground-breaking in the announcement; the features offered in the platform are pretty standard. But Radar’s pricing is competitive, so you might as well keep this one on your radar. (Yeah, I love a rubbish pun.)

A new map style has been added to the OpenStreetMap site:

It’s a mix of osm-carto and OpenTopoMap. It has many improvements: more tag support (busway, embankment, cuisine, solar plants, aquaculture, pitch, sea, tree etc.), CJK fonts, etc. There is also better internationalisation: country specific road shields, peaks using imperial system in the US, hierarchical place rendering in China, etc.

South-eastern Australia rendered in Tracestack Topo

Zoomed in, it looks a lot like the standard OSM style. At lower zoom levels, it reminds me of the school atlases I grew up with. Very nice.

How hard can it be to make two simple maps, one showing the location of addresses and one showing sales by US state? James Killick tried products from all the big names—ESRI, Google, Microsoft, and Felt. Turns out, getting started is not straightforward.

Killick went into the experiment pretending he had no prior experience, which I think is unfair. Complex software is a reflection of a complex problem space, great flexibility, or both. Not everything can or should be dumbed down to a level a disinterested teenager can be bothered to understand. Instagram is easier to use than a traditional camera, but the photos all look the same. Mapping software, like any design tool, requires domain knowledge: you need to know what you want to achieve, what kinds of maps exist, and which of them represent your data most effectively. If you know these things, you’re more likely to already know the right tools and where to find them.

And let’s not forget that Felt is just over one year old, yet they have already raised the bar for map-tech user experience and removed a lot of complexity from the process through clever design and impressive software engineering. Give them a little more time and they will further change the way we think about making maps. In a few years’ time we might ask ourselves why map-making was so difficult in 2023.