Harel, Bart, and I had the pleasure of joining this year's Open Visualization Collaborator Summit in London. Here are some notes about this amazing two-day event.
The slides of my own presentation are here: maplibre-text-rendering-with-webgl.pdf.
At the beginning of the event there were multiple talks about Deck GL's migration from WebGL to WebGPU.
The motivation to move away from WebGL is that it will not get any major updates anymore. All standardization efforts are going into WebGPU, which will be the future of graphics programming in web browsers. WebGPU also seems to bring some nice features that the Deck GL community can benefit from, such as asynchronous readback and compute shaders.
WebGPU also differs from WebGL in some major ways, for example the new WGSL shading language and the explicit setup of pipelines and bind groups instead of WebGL's global state machine.
Some things also stay the same at an architectural level. For example, there is still a vertex and a fragment shading stage.
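To make the "same stages, new setup" point concrete, here is a minimal sketch of my own (not code from the talks) showing how a render pipeline with a vertex and a fragment stage is created in WebGPU, with the shaders written in WGSL:

```typescript
// Minimal WebGPU sketch: the vertex and fragment stages still exist,
// but shaders are written in WGSL and the pipeline is configured
// explicitly instead of through WebGL's global state machine.
const shaderCode = /* wgsl */ `
  @vertex
  fn vs_main(@builtin(vertex_index) i: u32) -> @builtin(position) vec4f {
    // Hard-coded fullscreen triangle, just to show the stage layout.
    var pos = array<vec2f, 3>(vec2f(-1.0, -3.0), vec2f(3.0, 1.0), vec2f(-1.0, 1.0));
    return vec4f(pos[i], 0.0, 1.0);
  }

  @fragment
  fn fs_main() -> @location(0) vec4f {
    return vec4f(0.2, 0.5, 0.8, 1.0); // constant color
  }
`;

function createPipeline(device: GPUDevice, format: GPUTextureFormat): GPURenderPipeline {
  const module = device.createShaderModule({ code: shaderCode });
  return device.createRenderPipeline({
    layout: 'auto',
    vertex: { module, entryPoint: 'vs_main' },
    fragment: { module, entryPoint: 'fs_main', targets: [{ format }] },
    primitive: { topology: 'triangle-list' },
  });
}
```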
The Deck GL team created an abstraction layer over WebGPU and WebGL called Luma GL, which they now use throughout the Deck GL codebase. MapLibre GL JS might want to use Luma GL as well should we ever want to move to WebGPU. Deck dropped support for WebGL 1 and now only supports WebGL 2, because WebGL 2 supports uniform buffers while WebGL 1 does not. This made the design of Luma GL much simpler.
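For illustration (this is not Luma GL's actual code, and the block name `SceneUniforms` is made up), this is roughly what uniform buffers look like in WebGL 2. Being able to bind a whole block of uniforms at once maps much more cleanly onto WebGPU's bind groups than individual `uniform*()` calls do:

```typescript
// Sketch: uploading a uniform block in WebGL 2. Assumes the shader
// declares a uniform block named 'SceneUniforms' (hypothetical name).
function uploadUniforms(gl: WebGL2RenderingContext, program: WebGLProgram): void {
  // Bind the shader's uniform block to binding point 0.
  const blockIndex = gl.getUniformBlockIndex(program, 'SceneUniforms');
  gl.uniformBlockBinding(program, blockIndex, 0);

  // Upload all uniforms of the block in one buffer, e.g. a 4x4 matrix.
  const data = new Float32Array(16);
  const ubo = gl.createBuffer();
  gl.bindBuffer(gl.UNIFORM_BUFFER, ubo);
  gl.bufferData(gl.UNIFORM_BUFFER, data, gl.DYNAMIC_DRAW);
  gl.bindBufferBase(gl.UNIFORM_BUFFER, 0, ubo);
}
```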
The overall feeling was that the authors of WebGPU made it very difficult to transition an application from WebGL to WebGPU. Deck GL nonetheless opted to write shaders for both APIs by hand, because they did not think that translation tools would be stable and good enough to write shaders in only one shading language and transpile them to the other.
Should MapLibre GL JS ever start transitioning to WebGPU, I think the insights, learnings, tools, and tricks that the Deck GL people now have will be invaluable and will make the process much smoother for us.
Kyle Barron presented Lonboard. It is a tool that allows people who do geo data processing in Python Jupyter notebooks to visualize the data in Deck GL in the browser without knowing any JavaScript. It seems to be fast.
GeoArrow is used as an in-memory format and GeoParquet is used for data transfer over the network.
A few things were unclear to me as a newcomer.
It was quite cool to see Kyle's work. It looks like he is pushing the limits of what can be done in terms of speed and volume of data...
Paul Taylor came without any slides, but he brought a BIG Nvidia GPU. He opened his laptop and placed a box the size of a US mailbox next to it. Inside was a GPU with 48 GB of memory, an RTX A6000 or something like that, with its own power supply.
The demo he walked us through was about processing tens of gigabytes of New York taxi ride data. He used cuDF as a drop-in replacement for Pandas, which makes your code magically run on the GPU instead of the CPU. The frontend used Deck GL and connected to a Quart server. Querying the large dataset took only seconds. He said that the GPU can handle thousands of polygons and millions of points for parallel spatial hit testing.
The spatial library cuSpatial already has some functionality for geospatial vector data processing, and much more could be added based on the needs of users...
A fun little side discussion with someone from Oxford University (I don't remember the name) was about the following idea: when you render 3D terrain, you have to test which parts of the terrain are occluded from the current viewpoint. If one did such an occlusion calculation in all directions from a viewpoint on the ground, one would know which sides of which mountains are visible from that location. This information could then be painted onto a regular top-down 2D map to highlight the mountains that are visible from a given spot.
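Just to sketch the idea (this is my own toy version, not anything shown at the summit): a brute-force viewshed over a heightmap could march rays outward from the observer and keep track of the steepest elevation angle seen so far along each ray:

```typescript
// Toy viewshed: a cell is visible if its elevation angle from the observer
// exceeds the maximum angle encountered earlier along the same ray.
function viewshed(height: number[][], ox: number, oy: number, observerHeight = 2): boolean[][] {
  const rows = height.length;
  const cols = height[0].length;
  const visible = Array.from({ length: rows }, () => new Array<boolean>(cols).fill(false));
  const h0 = height[oy][ox] + observerHeight;
  const rays = 360; // one ray per degree

  for (let a = 0; a < rays; a++) {
    const dx = Math.cos((a / rays) * 2 * Math.PI);
    const dy = Math.sin((a / rays) * 2 * Math.PI);
    let maxAngle = -Infinity;
    for (let r = 1; ; r++) {
      const x = Math.round(ox + dx * r);
      const y = Math.round(oy + dy * r);
      if (x < 0 || y < 0 || x >= cols || y >= rows) break;
      const angle = (height[y][x] - h0) / r; // tangent of the elevation angle
      if (angle > maxAngle) {
        visible[y][x] = true; // nothing closer blocks the line of sight
        maxAngle = angle;
      }
    }
  }
  return visible;
}
```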
Global Fishing Watch is an NGO that documents fishing and other maritime activities at a global scale.
Their users need to visualize boat locations on a map within given time ranges and with some filtering, such as boat type, country (flag), etc. Some users, for example marine park rangers around the Galápagos Islands, have very limited bandwidth and need coarse, low-data map overviews that they can refine on demand.
To meet these needs, Global Fishing Watch developed a custom binary format for ship tracks. It is tiled spatially into equal-area rectangular tiles and, on top of that, tiled in time into years/months/days/hours. Deck GL can then load tiles in space and time as needed and produce super fast visualizations.
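Conceptually, a tile is addressed in space and time at once. The following sketch is purely hypothetical (the field names and URL scheme are made up for illustration and are not Global Fishing Watch's actual format):

```typescript
// Hypothetical illustration only; GFW's real binary format and
// tile addressing scheme are their own and not reproduced here.
interface SpaceTimeTileKey {
  x: number;    // column in the equal-area grid
  y: number;    // row in the equal-area grid
  time: string; // '2023', '2023-06', '2023-06-15', or '2023-06-15T13'
}

// A client requests only the tiles that intersect the current viewport
// and the selected time range, refining from years down to hours on demand.
function tileUrl(baseUrl: string, key: SpaceTimeTileKey): string {
  return `${baseUrl}/${key.time}/${key.y}/${key.x}.bin`;
}
```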
There was a lot of interest in the time tiling and in whether something like this could maybe become a shared format one day.
Another piece of feedback was that they would use non-Web-Mercator projections "from day one": as soon as MapLibre/Deck support other projections, they will use them.
Deck GL has an editable layer that replaces Nebula GL. It allows you to draw lines and polygons on the map with the mouse.
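For context, here is roughly what using such an editable layer looks like. This is a hedged sketch based on the EditableGeoJsonLayer API inherited from Nebula GL; the exact package name and options may differ:

```typescript
import { EditableGeoJsonLayer, DrawPolygonMode } from '@deck.gl-community/editable-layers';

// Sketch only: import path and option names may differ in your version.
let myFeatures = { type: 'FeatureCollection', features: [] as unknown[] };

const editableLayer = new EditableGeoJsonLayer({
  id: 'editable-layer',
  data: myFeatures,
  mode: DrawPolygonMode,        // let the user draw polygons with the mouse
  selectedFeatureIndexes: [],
  onEdit: ({ updatedData }) => {
    // keep the edited feature collection and pass an updated layer to deck
    myFeatures = updatedData;
  },
});
```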
Ilya Boyandin took the editable layer a step further and turned it into a collaborative editable layer. He showed us his mapcanv project, a website where you can draw on a map, share a QR code or link with others, and then they can edit it too. In a live demo, he showed the audience a QR code and immediately 15 or so people connected to the same session. We all started editing, everything worked super smoothly, and there was no crash. Everybody was really impressed!
Ilya used Yjs for conflict resolution and Elixir with the Phoenix framework for the backend. He also gave an overview of the vast "local-first" movement with all its new developments. That was really interesting.
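As a rough sketch of the Yjs side (not mapcanv's actual code), the shared state boils down to a replicated document whose concurrent edits are merged automatically by the CRDT:

```typescript
import * as Y from 'yjs';

// Each client holds a replica of the document; Yjs merges concurrent edits.
const doc = new Y.Doc();
const features = doc.getArray<Record<string, unknown>>('features');

// A local edit: store a drawn feature (GeoJSON-like object) in the shared array.
features.push([{ type: 'Feature', geometry: { type: 'Polygon', coordinates: [] } }]);

// Local changes are emitted as compact binary updates that a backend
// (Phoenix channels in mapcanv's case) can broadcast to the other clients.
doc.on('update', (update: Uint8Array) => {
  // send `update` to the server / other peers here
});

// An update received from another client is applied idempotently:
// Y.applyUpdate(doc, receivedUpdate);
```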
Big thanks to the Deck GL team (Chris, Felix, Xiaoji, and Ib) for organizing this beautiful event. We were happy to join from the MapLibre side and really learned a lot!