Cercana Executive Briefing — Week of March 28–April 3, 2026

152 feeds monitored. Published April 3, 2026.

Executive Summary

The most consequential development this week was the publication of the CNG Geo-Embeddings Sprint report, which moved Earth observation embeddings from an emerging research thread into the standards-drafting phase. Co-hosted by CNG, Planet, and Clark University, the March sprint produced concrete patterns for storing, cataloging, and accessing EO embeddings. This is the kind of infrastructure specification work that typically precedes commercial adoption, and it matters because embeddings are one of the mechanisms through which satellite imagery can be translated into forms that AI systems can use at scale. Organizations making infrastructure decisions about their EO data pipelines should be watching this thread closely. Standards that solidify here will shape which analytics platforms interoperate and which become walled gardens.

In parallel, the defense and intelligence conversation intensified. Project Geospatial published two substantial pieces, one on the ethics of GeoAI-driven military targeting and another on the geopolitics of quantum gravity gradiometry, while Octave rebranded Luciad as Alto and shipped version 2026.0 with explicit cyberthreat visualization capabilities for defense. Taken together, these developments suggest that the defense geospatial market is expanding its technical scope while also confronting the ethical consequences of that expansion. Meanwhile, Canada’s national geospatial strategy consultations drew critical coverage revealing a system with depth but without alignment, and Australia launched the Locus Alliance to replace its collapsed national geospatial body. The pattern across both is institutional. Countries are renegotiating how geospatial infrastructure is governed, and the outcomes will likely shape procurement structures for years.

Major Market Signals

EO Embeddings Move from Research to Standards

The CNG Geo-Embeddings Sprint brought together Planet, Clark University, and other organizations to draft best practices for storing, cataloging, and accessing Earth observation embeddings. This is not another AI capabilities announcement. It is infrastructure specification work. When the EO community starts defining how embeddings are stored and discovered, it suggests that the technology has matured enough for interoperability to matter. For platform vendors and data infrastructure buyers, this is the stage at which architectural decisions can begin to lock in compatibility or isolation. The sprint outputs are headed for community review, which means a public comment period that organizations with EO data pipelines should engage with.
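The sprint’s draft patterns are headed for review rather than reproduced here, but the shape of the problem is easy to illustrate. Below is a minimal sketch, assuming a GeoParquet-style layout with one row per image chip; the column names are entirely hypothetical, not the sprint’s draft specification:

```python
# Illustrative only: a GeoParquet-style layout for EO embeddings.
import numpy as np
import geopandas as gpd
from shapely.geometry import box

chips = gpd.GeoDataFrame(
    {
        "chip_id": ["S2_T18S_20260401_0001"],
        "model": ["hypothetical-eo-encoder-v1"],
        # One fixed-length embedding vector per chip.
        "embedding": [np.random.rand(768).tolist()],
    },
    geometry=[box(-76.5, 38.9, -76.4, 39.0)],  # chip footprint in lon/lat
    crs="EPSG:4326",
)

# GeoParquet keeps footprints queryable alongside the vectors, so a
# catalog can index geometry while analytics engines read embeddings
# directly. This is the interoperability question the sprint addresses.
chips.to_parquet("eo_embeddings.parquet")
```

Whatever layout the community settles on, the value comes from everyone settling on the same one.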

Defense Geospatial Expands Scope and Confronts Ethics Simultaneously

Two distinct but convergent threads emerged this week. Octave launched Alto 2026.0, the rebranded Luciad platform, adding cyberthreat visualization for defense and extending geospatial situational awareness into the cyber domain. Simultaneously, Project Geospatial published a deeply personal account of how GeoAI-driven military targeting is eroding the oversight structures that governed intelligence operations for decades. That convergence is the real signal. The defense geospatial market is rapidly expanding what these tools can do while governance frameworks struggle to keep pace. For vendors entering the defense space, the ethics conversation is no longer peripheral. It is becoming a procurement consideration.

National Geospatial Governance Under Reconstruction

Canada’s NRCan geospatial strategy consultations drew critical analysis from GoGeomatics and EarthStuff, revealing systemic gaps in coordination, infrastructure governance, and institutional alignment. In Australia, the newly formed Locus Alliance launched to fill the void left by the collapsed Geospatial Council of Australia. These are not isolated developments. Multiple countries are simultaneously renegotiating how geospatial infrastructure is governed at the national level. For vendors and service providers, the restructuring of national geospatial bodies directly shapes procurement pipelines, standards adoption, and public-sector contract structures.

Quantum Sensing Enters the Geospatial Regulatory Conversation

Project Geospatial’s deep analysis of quantum gravity gradiometry regulation, combined with SBQuantum’s announcement of a space-bound quantum magnetometer as part of the US government’s MagQuest Challenge, marks the week when quantum sensing moved from theoretical interest to a dual regulatory and commercial question. Quantum gradiometry can reveal subsurface structures, including those with defense and resource extraction significance, at resolutions that current regulatory frameworks were not designed to address. It is still early, but this looks like a genuine market-formation signal worth tracking.

Notable Company Activity

Product Releases

  • Octave (Alto 2026.0): Rebranded its Luciad platform as Alto and launched version 2026.0 with cyberthreat visualization capabilities for defense situational awareness. The rebrand sharpens Octave’s positioning as a defense-focused geospatial intelligence platform.
  • MapTiler: Released April updates that included professional grid overlays and a major satellite imagery refresh across its basemap products.
  • USGS: Released a machine learning tool that forecasts streamflow drought conditions up to 90 days ahead nationwide. This is a significant applied AI deployment for water resource management.
  • Esri: Released the Protected Area Management solution and March 2026 ArcGIS Solutions update, alongside updates to its geocoding and world traffic services.

Partnerships

  • DroneDeploy × Cairn: Enterprise-wide aerial and ground reality capture partnership for housing development portfolio management. A useful example of reality capture moving from project-level use to portfolio-level deployment in construction.
  • Astroscale × Exotrail: Advancing France-Japan cooperation on space sustainability and on-orbit servicing, backed by a visit from Emmanuel Macron and Sanae Takaichi.

Funding & M&A

  • Xona Space Systems: Closed an oversubscribed $170M Series C to accelerate deployment of its Pulsar LEO navigation constellation. Investors include Hexagon, Craft Ventures, and Samsung Next. The round suggests strong market confidence in GPS-alternative positioning infrastructure.
  • Trimble: Signed an agreement to acquire Document Crunch, an AI-powered construction document analysis and risk management company, integrating it into the Trimble Construction One ecosystem.
  • Woolpert: Awarded a $49.9M USACE contract to support I-ATLAS coastal mapping and nautical charting efforts, a significant federal LiDAR and survey award.

Government and Policy Developments

The US National Geodetic Survey’s NSRS modernization effort was the subject of a Geo Week News webinar bringing together NGS leadership and the geospatial community. The message was practical. The reference frame transition is coming, and the community should be preparing rather than worrying. For survey and mapping firms, the modernization will affect nearly every coordinate-dependent workflow in the US market, and early preparation is likely to be an advantage.

Canada’s geospatial strategy consultations drew substantive analysis revealing that while the country has depth in geospatial capabilities, it lacks consistent alignment across governance, infrastructure, and coordination. NRCan’s consultations are surfacing long-standing structural problems rather than resolving them. Sparkgeo’s piece on building a national urban forest data view illustrated both the ambition and the fragmentation of Canadian geospatial infrastructure.

In Australia, the Locus Alliance launched as a new national geospatial body to fill the gap left by the Geospatial Council of Australia’s collapse. The Alliance aims to be structurally different from its predecessor, though details of its governance are still emerging. The Ordnance Survey in the UK announced that its National Geographic Database now holds 16 data collections and has received 70 major enhancements, positioning Britain’s national mapping as a continuously updated digital product rather than a periodic release.

Technology and Research Trends

The technology story of the week centers on the maturation of EO data infrastructure. CNG’s Geo-Embeddings Sprint produced actionable specifications rather than aspirational roadmaps, while EarthDaily published on how real-time crop signals from its constellation are changing agricultural market decisions, a rare example of a post connecting EO technology directly to commercial demand-side outcomes. TerraWatch’s “Anatomy of an Earth Observation Use Case” offered a structural critique of how the EO industry uses (and misuses) the term “use case,” pushing toward more rigorous framing of what makes an EO application commercially viable.

The Spatial Edge’s weekly digest covered satellites mapping local human development levels, LLMs estimating flood damage without training data, and foundation models for ecological mapping. Taken together, these offer a concentrated snapshot of where applied spatial data science is heading. The through-line is the compression of traditional multi-step geospatial workflows into single-model inference, which has implications for both the skills market and the value chain.

Spatialists covered Stefan Ziegler’s work raster-enabling Apache Hop with GDAL-based transforms, demonstrating practical LiDAR-to-building-height ETL pipelines. Hands-on, cloud-native geospatial tutorial content of this kind remains rare in the ecosystem, which is a large part of why it stands out.
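Ziegler’s Hop pipeline is worth reading in full, but the core raster transform such a pipeline performs is simple to sketch: building height approximated as the difference between a LiDAR-derived surface model (DSM) and terrain model (DTM). A minimal version using rasterio, with placeholder file paths and an assumed 2 m noise threshold; nodata handling is omitted:

```python
# Normalized height model: nDSM = DSM - DTM.
# Assumes both rasters share grid, extent, and CRS.
import rasterio

with rasterio.open("dsm.tif") as dsm, rasterio.open("dtm.tif") as dtm:
    surface = dsm.read(1).astype("float32")
    terrain = dtm.read(1).astype("float32")
    profile = dsm.profile

heights = surface - terrain
heights[heights < 2.0] = 0.0  # drop returns below ~2 m (noise, low vegetation)

profile.update(dtype="float32", count=1)
with rasterio.open("ndsm.tif", "w", **profile) as out:
    out.write(heights, 1)
```

The Hop contribution matters because it wraps transforms like this in a general-purpose, orchestrated ETL framework rather than a one-off script.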

Open Source Ecosystem Signals

The open-source ecosystem had a quieter week following the QGIS 4.0 and FOSSGIS 2026 activity of recent weeks. geoObserver noted a QGIS tip on SFCGAL functions now available as a plugin, which is better understood as a post-4.0 ecosystem refinement than a headline release. geoObserver also reflected on FOSSGIS 2026 and celebrated 44,444 downloads of the GeoBasis_Loader plugin, a milestone for the German open-data geospatial tooling community.

The Spatialists coverage of Apache Hop raster enablement is worth flagging here as well: the hop-gdal-plugin extends an open-source ETL framework with geospatial raster capabilities, bridging the gap between data engineering and geospatial processing. It represents the kind of cross-pollination between general-purpose open-source tooling and geospatial-specific capabilities that tends to strengthen the broader ecosystem.

The CNG Geo-Embeddings Sprint, covered in Market Signals above, also carries open-source ecosystem significance: the sprint’s outputs are intended for community review and adoption, meaning they will likely influence how open-source EO tooling handles embedding storage and discovery.

Watch List

  • Spiral Blue (Australia): Delivered space LiDAR hardware to a UK company as part of its strategy to build an EO space LiDAR capability. Space-based LiDAR is a nascent market with potentially transformative implications for forestry, bathymetry, and terrain mapping if costs come down.
  • Geospatial data and the EU Deforestation Regulation (EUDR): Coverage on Medium explored the geolocation data challenges and compliance implications of the EUDR. The regulation will create a distinct demand signal for geospatial verification services across commodity supply chains.
  • SBQuantum’s space quantum magnetometer: The MagQuest-funded mission could initiate a new class of GPS-independent navigation and subsurface sensing from orbit. If the technology performs, it opens regulatory and commercial questions that the geospatial industry has not yet grappled with.
  • Mainz cloud-native geospatial infrastructure: A German city implementing a fully cloud-based geospatial data infrastructure with VertiGIS and Esri. Municipal adoption of cloud-native GDI at this scale is an early but meaningful demand signal for enterprise cloud geospatial platforms in European public administration.

Top Posts of the Week

  1. Geo-Embeddings Sprint: Advancing standards for Earth observation embeddings (CNG Blog) moves EO embeddings from research into standards specification, with direct implications for data infrastructure interoperability.
  2. The New Battlespace: How Geospatial AI, Outdated Intelligence, and the Illusion of Oversight Are Reshaping Military Targeting (Project Geospatial) is a deeply informed and personal account of how GeoAI is outpacing the governance structures designed to prevent intelligence failures.
  3. The Anatomy of an Earth Observation Use Case (TerraWatch Space) offers a structural critique of how the EO industry frames commercial viability and pushes beyond “use case” as marketing shorthand.
  4. The Subsurface Geopolitics: Regulating the Commercial Use of Quantum Gravity Gradiometry (Project Geospatial) maps the emerging regulatory landscape for a technology that can reveal what lies underground at unprecedented resolution.
  5. Three Geospatial AI Myths Federal Buyers Should Not Believe (Cercana Systems) provides practical procurement-focused guidance that cuts through GeoAI marketing claims for federal decision-makers.

Cercana Executive Briefing is derived from 152 feeds aggregated by geofeeds.me.

Cercana Executive Briefing — Week of March 21–27, 2026

142 feeds monitored. Published March 27, 2026.

Executive Summary

The clearest story of this week is the merging of two narratives that have been running in parallel: sovereign AI and geospatial intelligence. On Sunday, GoGeomatics published a piece authored by Will Cadell whose title states the thesis plainly: “SovereignAI is GeoAI.” Within days, Australia made three distinct institutional moves: Geoscience Australia launched a new 10-year national strategy; a new National Geospatial Advisory Committee was announced with cross-sector representation; and Geospatial World ran a feature on Australia’s Austral-Asian Space Innovation Institute discussing sovereign satellite capability and the National Digital Twin for Agriculture. This is not messaging from one company — it is institutional behavior from a government treating geospatial infrastructure as strategic national infrastructure.

That same framing is arriving in U.S. federal policy. The GeoAI and the Law Newsletter this week dissected the Artificial Intelligence Regulation and Safeguards Act and found an expanded geolocation definition that could reshape how geospatial companies collect and use location data. The GSA’s proposed AI contract clause for federal procurement is described as the most consequential shift for geospatial vendors in years.

Meanwhile, European standards are in flux. Javier de la Torre’s analysis of the INSPIRE Directive simplification argues this is not mere bureaucratic tidying but an opening to embrace analytics-native paradigms, which is a structural shift in how European geospatial infrastructure is conceived.

Across all three developments, the same question is being asked simultaneously in Washington, Brussels, Canberra, and Ottawa: what does geospatial data mean for national capability? Leaders who treat this as a technical standards conversation are reading it wrong. It is a strategic infrastructure conversation, and the answer is being written this week in policy documents, not product roadmaps.

Major Market Signals

SovereignAI and GeoAI Are Converging as a Policy Frame

GoGeomatics published “SovereignAI is GeoAI” on March 22, arguing that national AI sovereignty strategies are fundamentally geospatial challenges, asserting that understanding territory, movement, resources, and infrastructure at scale requires geospatial intelligence as a foundational layer. Within days, Australia produced three institutional signals in the same direction: a new 10-year strategy from Geoscience Australia framed around shaping “Australia’s future through geoscience insights”; a new National Geospatial Advisory Committee advising government; and a Geospatial World feature on the Austral-Asian Space Innovation Institute discussing sovereign satellite capability and the National Digital Twin for Agriculture. Canada is also in motion: the retirement of CCMEO Director General Eric Loubier after 25 years is characterized by GoGeomatics as arriving at a “critical time” for Canada’s geospatial sector. The policy frame is hardening across the middle powers, with geospatial seen as strategic infrastructure, not technical tooling.

U.S. Federal GeoAI Regulation Is Taking Shape

The Artificial Intelligence Regulation and Safeguards Act, which the GeoAI and the Law Newsletter calls the “Trump AI Act,” contains an expanded geolocation definition that could require geospatial companies to alter how they collect, store, and use location data. Separately, the GSA’s proposed AI contract clause would affect how federal agencies procure AI-enabled geospatial services. The White House push for a unified federal AI standard would supersede the patchwork of state-level rules that geospatial companies currently navigate. Taken together, these three instruments represent the most significant regulatory shift for the U.S. geospatial market since CIPSEA. Companies with federal contracts or location-data products should be conducting legal exposure assessments now, not after enactment.

Commercial EO Capacity Is Expanding Across Multiple Modalities Simultaneously

Three distinct capability additions arrived this week: Synspective successfully placed its 8th StriX SAR satellite in orbit, continuing its build toward a 30-satellite constellation by 2030; Satellogic announced its Merlin satellite will deliver daily 1-meter resolution optical imagery; and Open Cosmos launched what it describes as the largest space-based real-time data service, fusing broadband connectivity, Earth observation, and IoT in a single platform. The pattern is consistent with broader commercial EO maturation: higher revisit, higher resolution, and tighter integration with downstream data pipelines. Organizations that have been waiting for the market to stabilize before committing to EO-based workflows should note that the infrastructure is arriving whether they are ready or not.

European Geospatial Standards Infrastructure Is at a Decision Point

Javier de la Torre’s analysis in Spatialists — titled “geo beyond INSPIRE” — frames the simplification of the EU INSPIRE Directive not as a retreat but as a structural opportunity. The argument is that INSPIRE’s interoperability-first model, built for a previous era, is increasingly misaligned with how geospatial data is actually consumed. Analytics-native paradigms, where data is designed for computation from the start, not formatted for exchange, offer a better fit for the AI-era use cases now driving demand. The OGC simultaneously announced its Testbed on Trusted Data and Systems has expanded beyond Europe to include non-European NMCAs, reflecting growing global interest in how authoritative public geospatial data can be modernized and made computationally useful. These two developments together mark a transition moment for European and global geospatial standards. The question is not whether INSPIRE changes, but who shapes what replaces it.

Notable Company Activity

Product Releases

  • Esri: A coordinated spring release wave this week covered ArcGIS GeoAnalytics Engine 2.0 (cloud-scale spatial analytics), ArcGIS Urban (March 2026 update), ArcGIS StoryMaps (March 2026), ArcGIS Pro SDK updates, R-ArcGIS Bridge Spring 2026, and Lidar updates to World Elevation Layers in Living Atlas. The breadth and simultaneity of these releases signal a major platform release cycle, not incremental maintenance.
  • Satellogic: The Merlin satellite will offer daily 1-meter optical imagery, a meaningful step toward sub-daily revisit at commercial resolutions.
  • Open Cosmos: Announced what it calls the largest space-based real-time data service, combining broadband connectivity, Earth observation, and IoT telemetry in a single commercial offering.
  • Septentrio: Launched the AsteRx EB, a compact high-accuracy GNSS receiver targeting robotics, logistics, and industrial automation, extending precision positioning into non-traditional industrial sectors.
  • SBG Systems: Unveiled the Stellar-40, a modular and scalable inertial navigation system for demanding and mission-critical environments.
  • Apple: Announced that ads will come to Apple Maps in the United States and Canada beginning this summer via its new Apple Business platform.

Partnerships

  • ANELLO Photonics × Q-CTRL: A strategic collaboration combining silicon photonics inertial sensing with quantum magnetic navigation, targeting UAV operation in GPS-denied environments. The press release cites a $1 billion-per-day global cost from navigation failures, a finding that may attract defense and logistics attention.
  • Kongsberg Discovery × Fugro: A new Main Supplier framework agreement formalizing a decades-long relationship between the ocean technology and subsurface surveying firms.
  • Seabed 2030 × Greenroom Robotics: The international seabed mapping program partnered with the Australian autonomous vessel company to expand ocean floor data collection.

Funding & M&A

  • Arlula: Raised A$3.4 million to build out software workflows for automated satellite tasking and imagery analysis. This is a small-ticket round but a strategically directional one in the EO automation space.
  • e-GEOS (Leonardo Group): Won a contract from Italy’s Ministry of Environment and Energy Security to conduct nationwide satellite mapping of asbestos.

Government and Policy Developments

Australia produced the most concentrated national geospatial policy activity of the week. Geoscience Australia launched a 10-year strategy framed explicitly around national capability, with ministerial endorsement. A new National Geospatial Advisory Committee was established to provide cross-sector advice to government. And the Austral-Asian Space Innovation Institute’s founding CEO used a Geospatial World platform to articulate how sovereign space capability, satellite constellation data, and the National Digital Twin for Agriculture are linked strategic assets. Three announcements in four days from one government signals that geospatial is a designated policy priority in Canberra, not a technical afterthought.

Canada’s situation is the mirror image: a leadership vacuum at CCMEO is arriving precisely when Canada needs to respond to both sovereignty pressures and a rapidly changing EO commercial market. GoGeomatics’ framing of this as a “critical time” reflects a real institutional risk: mid-cycle leadership transitions at national mapping agencies have historically been associated with delayed procurement decisions and stalled modernization programs.

In the United States, the GeoAI and the Law Newsletter’s detailed reading of the Artificial Intelligence Regulation and Safeguards Act and the GSA’s proposed AI procurement clause deserves board-level attention for any company selling geospatial AI capabilities to federal agencies. The expanded geolocation definition in the proposed legislation is not incidental; it appears to bring a wider range of location data products within the act’s scope than current law covers.

The OGC’s Testbed on Trusted Data and Systems is worth tracking as a governance model. Originally launched as Testbed Europe, its expansion reflects interest from non-European NMCAs who face the same modernization challenge: how to make authoritative public spatial data computationally useful without sacrificing trustworthiness. This is engineering work with standards implications that will matter across every market where national mapping agencies are significant data providers.

Ordnance Survey data is also anchoring a new UK multi-agency emergency communications system designed to reduce the time required to transfer incident data between control rooms, demonstrating a practical example of authoritative location data embedded in safety-critical infrastructure.

Technology and Research Trends

The Spatial Edge newsletter this week highlighted research in Nature Communications integrating seismic risk modeling, geospatial infrastructure inventory, and climate accounting that shows earthquake-related building repairs generate massive CO2 emissions. The implication for the market is directional: insurers, municipal governments, and climate-disclosure frameworks will need spatial datasets that link physical risk exposure to embodied carbon accounting. This is an early signal of a new analytical product category.

QGIS Server gained time-series (WMS-T) support for grouped layers this week, contributed by Oslandia in collaboration with Ifremer, the French oceanographic research institute. The technical significance extends beyond the feature: it enables institutional EO data providers to distribute time-varying imagery through standards-compliant web services without bespoke infrastructure. As more governments look to publish national EO datasets via QGIS-based portals, this capability removes a meaningful barrier.
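From the client side, the new capability surfaces through the standard WMS TIME dimension. Here is a minimal sketch of a request, with a hypothetical QGIS Server endpoint and layer name:

```python
# Fetch one time slice from a WMS-T-enabled QGIS Server.
# The endpoint and layer name below are hypothetical.
import requests

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sst_monthly",     # a grouped time-series layer
    "CRS": "EPSG:4326",
    "BBOX": "40,-10,55,10",      # lat,lon axis order under WMS 1.3.0
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
    "TIME": "2026-03-01",        # the temporal dimension
}

resp = requests.get("https://example.org/qgisserver", params=params)
resp.raise_for_status()
with open("sst_2026-03-01.png", "wb") as f:
    f.write(resp.content)
```

Stepping the TIME value across a range yields the frames of an animation, with no bespoke server code involved.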

Swiss cadastral survey data is now available in IFC format for BIM integration via geodienste.ch, with four cantons participating and more expected. This represents one of the first examples of authoritative cadastral data crossing the traditional boundary between GIS and building information modeling workflows. For vendors selling into the AEC sector, it is a signal that the BIM-GIS convergence is becoming a data standards reality, not just a vision document.

The “Shortening Translation Distance” essay by Bill Dollins in geoMusings this week offered a practitioner’s-eye view of how AI code generation is changing the relationship between domain knowledge and programming in geospatial work.

Open Source Ecosystem Signals

FOSSGIS 2026, the annual German open-source geospatial conference, took place this week in Göttingen. The CCC (Chaos Computer Club) published Day 1 session recordings on the same day as the presentations, a turnaround geoObserver described as a record and one that reflects both organizational maturity and the community’s commitment to accessibility. For organizations evaluating open-source geospatial tooling, FOSSGIS 2026 session recordings represent a concentrated resource: they document the current state of practice across QGIS, PostGIS, MapLibre, GeoServer, and adjacent tools, often before formal release notes appear.

Oslandia had a notable week in the European open-source ecosystem: the QGIS Server WMS-T contribution for Ifremer (technical post published), a recap of the QGIS-Fr French user days, and an announcement of GeoDataDays 2026 in Tours. That activity illustrates how open-source QGIS ecosystem contributors operate as professional services firms with direct government and research institution clients. This is a model that can mitigate lifecycle concerns in procurement decisions for public sector geospatial programs.


Watch List

  • Apple Maps advertising model: If Apple’s entry into map advertising succeeds commercially, it will pressure Google to expand its own ad surface area in Maps, potentially restructuring the economics of consumer location data platforms globally. B2B geospatial vendors whose products sit downstream of consumer map data APIs should monitor closely.
  • OGC MUDDI standard: The OGC published a detailed narrative this week on the MUDDI (Model for Underground Data Definition and Integration) standard, framing it as a model for cross-system urban spatial data interoperability. Underground infrastructure mapping is a large, fragmented market and a maturing standard here could unlock significant procurement activity.
  • GPS-denied navigation commercialization: The ANELLO/Q-CTRL partnership is the most prominent announcement in a cluster of GPS-alternative navigation products reaching market. The $1B/day framing will attract defense and logistics capital. Watch for follow-on partnerships or acquisition interest from platform navigation vendors.
  • Radiant Earth governance shift: New board members this week include Cassie Ely, who played a role in bringing MethaneSAT to life, and David X. Cohen, executive producer of Futurama. The combination of climate-finance experience and science communication expertise signals that Radiant Earth is positioning itself for a higher-visibility role in the EO-climate intersection.
  • BIM-GIS cadastral convergence: Switzerland’s IFC-format cadastral data is the leading example, but the pattern of authoritative government cadastral data flowing into BIM workflows is likely to appear in other jurisdictions. AEC-sector geospatial vendors should be tracking the OGC BIM-standards working group for early signal.

Top Posts of the Week

  1. SovereignAI is GeoAI (GoGeomatics) — Establishes the thesis that national AI sovereignty strategies are fundamentally geospatial challenges; the most strategically significant framing piece of the week.
  2. “geo” beyond INSPIRE (Spatialists, Ralph Straumann / Javier de la Torre) — Frames the INSPIRE Directive simplification as an opportunity to adopt analytics-native paradigms rather than simply reducing compliance burden.
  3. GeoAI and the Law Newsletter (Spatial Law & Policy) — Detailed reading of the Trump AI Act’s expanded geolocation definition, the White House unified AI standard push, and the GSA AI contract clause — essential reading for any geospatial vendor with federal exposure.
  4. Contextual Location Data, Unified Foundational Maps Paramount for Industry (Geospatial World) — Interview with Overture Maps Foundation Executive Director Will Mortenson on interoperability, the Global Entity Reference System, and the foundation’s AI and machine learning roadmap.
  5. Testbed on Trusted Data & Systems (Open Geospatial Consortium) — Announcement of the formerly Europe-only testbed going global, focused on practical NMCA modernization with reusable open outputs.

The Cercana Executive Briefing is sourced from 142 feeds aggregated by geofeeds.me.

Geospatial Sovereignty as a Strategic Requirement

Executive Summary

Geospatial systems are no longer peripheral tools; they underpin critical infrastructure, national security functions, and capital-intensive operations. They support logistics, infrastructure management, environmental compliance, security operations, and strategic planning across government and industry. Yet many organizations rely on externally managed platforms for the storage, processing, and delivery of spatial intelligence.

This post discusses the concept of geospatial sovereignty, a governance and risk management discipline concerned with the degree to which an organization retains control over its geospatial data, infrastructure, and operational continuity.

As regulatory requirements expand, vendor ecosystems consolidate, positioning infrastructure faces disruption, and sovereign cloud models gain traction, executive teams must understand where they sit on the spectrum between dependency and control. The objective is informed, intentional architecture grounded in clear visibility into operational risk and long-term optionality.

Perception of Operational Control

For more than a decade, convenience has shaped enterprise technology decisions. Cloud-hosted platforms reduced infrastructure burdens. Subscription licensing simplified procurement. Managed services shifted operational responsibility outward. For many organizations, these shifts accelerated deployment and reduced internal complexity. But in the domain of geospatial operations, a different question is emerging: Who actually controls the systems that underpin your spatial intelligence?

Consider a composite scenario drawn from observable trends. A logistics provider experiences degraded positioning data during a regional GPS disruption and discovers that routing intelligence depends on upstream signals it does not control. A cloud vendor modifies pricing tiers or usage thresholds, quietly altering long-term cost projections embedded in multi‑year operating plans. A regulatory audit raises questions about data residency and physical storage location, forcing leadership to answer questions they assumed were settled at contract signature. A mission‑critical geospatial workflow is interrupted by an upstream service outage, revealing how tightly coupled internal processes have become to external infrastructure.

In each case, the organization technically “owns” its mission and its data. Yet operational continuity depends on infrastructure, policy decisions, and technical roadmaps defined elsewhere. This is not a critique of cloud providers; many are reliable, professionally managed, and appropriate for a wide range of workloads. The issue is structural. Control of infrastructure, data, and operational continuity is not the same as platform access. When those elements diverge, organizations may discover that their geospatial capabilities are more dependent than leadership intended.

From Data Sovereignty to Geospatial Sovereignty

The concept of sovereignty in digital systems is not new. Data sovereignty is commonly defined as the principle that data are subject to the laws and governance structures of the jurisdiction in which they are collected or stored.

It is important to distinguish related but separate concepts. Data residency refers to the physical location where data are stored. Data localization refers to legal requirements that certain categories of data remain within specific geographic boundaries. Data sovereignty concerns the legal authority governing that data and the jurisdiction whose laws apply. These distinctions are discussed in detail by enterprise security and cloud governance analysts (CIO, 2026).

Increasingly, major technology publications are also examining “sovereign cloud” and “geopatriation” trends, where governments and enterprises seek to re-anchor sensitive workloads within controlled jurisdictions. These discussions reinforce that sovereignty is not theoretical. It is shaping procurement, cloud architecture, and national digital strategies.

Geospatial sovereignty extends this conversation beyond legal jurisdiction. It asks whether an organization retains meaningful authority over how its spatial infrastructure is architected and governed, whether operations can continue during vendor, network, or geopolitical disruption, how systems are updated and configured, how spatial data integrates with broader enterprise platforms, and where critical skills and knowledge reside.

In this context, sovereignty is operational. It concerns who can make consequential decisions about the systems that support mission execution.

Why This Issue Is Emerging Now

Several converging pressures are elevating geospatial sovereignty from a technical concern to an executive one.

1. Geospatial Is Foundational

Spatial data now informs asset management, utilities maintenance, supply chain routing, environmental monitoring, agriculture, mining, emergency response, and national security operations. As geospatial becomes core to operations, its governance becomes a strategic concern. National Academies research has repeatedly emphasized that geospatial information infrastructure is critical to modern governance and infrastructure management (National Academies of Sciences, Engineering, and Medicine, n.d.).

2. Regulatory and Compliance Demands Are Expanding

Globally, more than one hundred privacy and data governance laws now affect how organizations collect, process, and store data. These include GDPR in the European Union and numerous national and state-level frameworks.

As data governance regimes expand, spatial datasets, which are often rich with location intelligence tied to individuals, infrastructure, or sensitive assets, fall under increasing scrutiny. European discussions around building digital sovereignty through authoritative geodata emphasize that trusted, nationally governed datasets are foundational to public policy, security, and economic competitiveness (GIM International, 2026). The implication for enterprises is clear: geospatial data is no longer merely operational. It is policy-relevant and potentially regulated. Governance expectations are rising accordingly.

Commercial perspectives are echoing this shift. Industry commentary aimed at enterprise operators has begun to frame sovereign geospatial data as a competitive and operational necessity rather than a compliance afterthought. Discussions emphasize that organizations dependent on third-party platforms for core spatial intelligence may struggle to preserve data lineage, portability, and strategic control as markets evolve (Nimbo, 2025). The argument is not ideological; it is pragmatic. When spatial data informs capital allocation, logistics optimization, and asset performance, control over that data becomes economically material.

3. Vendor Ecosystems Are Consolidating

The technology industry continues to experience consolidation through mergers and acquisitions. Platform acquisition can alter licensing terms, support models, product roadmaps, and pricing structures.

Organizations that rely exclusively on proprietary ecosystems may find their long-term cost models and integration assumptions shifting unexpectedly. Industry commentary has begun to frame geospatial sovereignty as requiring both legal alignment and architectural foresight, which highlights that compliance without architectural control can still leave organizations strategically exposed (CARTO, 2026).

In other words, sovereignty is not solved by contract language alone. It must be reflected in system design.

4. Strategic Uncertainty Is Increasing

Positioning, Navigation, and Timing (PNT) infrastructure such as the Global Positioning System (GPS) is relied upon globally and operated by national governments (U.S. Space Force, n.d.). Disruptions, whether technical, environmental, or geopolitical, demonstrate that foundational spatial services are not immune to systemic risk.

Recent analysis from Australia’s spatial industry has highlighted both the economic potential and systemic vulnerability of national PNT infrastructure. Commentary in Spatial Source has warned that while modern economies are deeply dependent on satellite-based positioning, they often lack redundancy and assurance frameworks to mitigate disruption (Spatial Source, 2026a).

Further, discussion of “navwar,” or navigation warfare, underscores that PNT denial and degradation are no longer theoretical military edge cases but active considerations in contested environments (Spatial Source, 2026b). Even outside conflict scenarios, signal interference, spoofing, and systemic outages pose measurable operational risk.

In response, governments are investing in resilience and sovereign capability. FrontierSI’s launch of Australia’s first dedicated PNT Labs reflects a recognition that positioning infrastructure requires independent testing, validation, and assurance capacity (Spatial Source, 2025). Similarly, Canada’s evolving Defence Industrial Strategy explicitly acknowledges the strategic importance of domestic geospatial capability within national security and industrial planning (GoGeomatics Canada, 2026).

These developments signal a broader shift. When governments treat geospatial and PNT systems as strategic assets requiring sovereign capability, commercial enterprises should take note. If national infrastructure planners view spatial systems through a sovereignty lens, enterprises whose operations depend on those systems must evaluate their own dependency assumptions.

At the same time, analysts are emphasizing that data provenance and trust are becoming central to reliable forecasting, AI modeling, and decision support. Without clear lineage and governance, spatial analytics risk becoming less defensible and less auditable (Ready Signal, 2026).

When spatial infrastructure becomes integral to mission execution, resilience and traceability are no longer purely technical considerations. They become executive concerns tied to continuity, liability, competitive positioning, and public trust.

Costs of Dependency

Dependency is rarely visible when systems function as expected. It becomes visible when change is required. Organizations may discover that data export is limited or constrained by proprietary formats, that migration costs are materially higher than anticipated, or that integration depth is bounded by vendor APIs rather than internal design choices. Custom workflows may be constrained by externally managed roadmaps, and over time internal skills may atrophy because expertise resides primarily with the platform provider rather than within the enterprise.

These costs are often architectural rather than purely financial. They shape how quickly an organization can pivot, how confidently it can integrate acquisitions, and how effectively it can respond to regulatory change. Over time, optimizing for short-term convenience can reduce long-term flexibility. That trade-off may be acceptable for commoditized functions. It is less acceptable when spatial intelligence underpins strategic decision-making.

Sovereignty as Institutional Capability

At its most practical level, geospatial sovereignty is about institutional capability. It asks whether the organization possesses the internal knowledge required to operate and evolve its spatial systems, whether it can transition platforms without losing intellectual capital, whether its most critical spatial workflows are portable, and whether leadership has explicitly defined which components must remain under direct control.

Sovereignty exists on a spectrum, ranging from fully vendor-managed SaaS environments where infrastructure and architectural direction are externally controlled to fully self-managed systems governed internally. Most organizations operate somewhere between these poles. The leadership challenge is to ensure that dependency is intentional, understood, and aligned with mission risk tolerance. Where dependency is acceptable, it should be consciously accepted; where it is not, architectural adjustments should follow.

The Architecture Question

When organizations examine geospatial sovereignty seriously, the discussion shifts from tools to architecture. Questions emerge that are fundamentally architectural in nature. Which datasets are mission-critical versus supportive? Which workflows must remain operational during network disruption? Where does regulatory exposure exist? Which integrations define competitive advantage? How portable are spatial assets across platforms and vendors?

Answering these questions requires cross-functional engagement across technology, operations, legal, compliance, and executive leadership. The conversation moves beyond tool comparison and into enterprise design. Sovereignty is ultimately a governance and architecture exercise intersecting risk management, operational resilience, and long-term strategy.

Seeking Guidance

As geospatial systems become more deeply integrated with enterprise operations, governance cannot remain purely technical. Executive leadership increasingly needs visibility into structural dependencies, long-term total cost trajectories, regulatory exposure, continuity planning assumptions, and the sustainability of internal talent.

Organizations that evaluate sovereignty proactively retain optionality. Those that defer the conversation may discover constraints only when disruption forces action. In that moment, architectural flexibility is no longer a strategic advantage; it becomes an emergency requirement.

The role of trusted advisors in this context is not to prescribe universal solutions or advocate a single technology stack. It is to help organizations map existing dependencies, clarify strategic priorities, assess architectural alternatives, and align technology decisions with mission risk tolerance. Sovereignty decisions should reflect leadership intent rather than historical inertia.

For organizations navigating this terrain, the challenge is rarely theoretical. It is practical, architectural, and often constrained by legacy decisions. Experienced advisory support can help leadership teams translate sovereignty from an abstract principle into an actionable roadmap. That work begins with disciplined assessment, grounded risk analysis, and a clear understanding of how geospatial capabilities align with mission priorities.

A Conversation Worth Having

This discussion does not require immediate platform replacement or imply that current systems are deficient. It begins with assessment. Who controls your geospatial infrastructure? Where are your true dependencies? Which elements are strategic, and which are commoditized?

As spatial intelligence becomes central to both public and private sector operations, these questions move from theoretical to structural. They shape procurement strategy, workforce planning, compliance posture, and long-term competitiveness.

In subsequent posts, we will examine architectural models, hybrid approaches, and the role of open-source ecosystems in enabling greater geospatial independence without sacrificing innovation. We will also explore practical assessment frameworks that allow leadership teams to quantify dependency instead of debating it abstractly.

For now, the imperative is straightforward: understand your position on the sovereignty spectrum before external events force the issue. In an era of increasing complexity, control means ensuring that the systems most critical to your mission remain aligned with strategic intent.

References

CARTO. (2026, Jan 15). Geospatial sovereignty requires law and architecture. https://carto.com/blog/geospatial-sovereignty-why-it-requires-both-law-and-architecture

CIO. (2026, Feb 13). Geopatriation and sovereign cloud: How data returns to its origin. https://www.cio.com/article/4131458/geopatriacion-and-sovereign-cloud-how-data-returns-to-its-origin.html

GIM International. (2026, Feb 25). Building digital sovereignty through authoritative European geodata. https://www.gim-international.com/content/news/building-digital-sovereignty-through-authoritative-european-geodata

GoGeomatics Canada. (2026, Feb 18). What Canada’s defence industrial strategy really means for geospatial. https://gogeomatics.ca/what-canadas-defence-industrial-strategy-really-means-for-geospatial/

National Academies of Sciences, Engineering, and Medicine. (n.d.). Geospatial information infrastructure and governance. https://www.nationalacademies.org/read/28857/chapter/10

Nimbo. (2025, Dec 2). Sovereign geospatial data. https://nimbo.earth/stories/sovereign-geospatial-data/

Ready Signal. (2026, Feb 19). Data sovereignty, provenance, and trustworthy forecasts. https://www.readysignal.com/blog/data-sovereignty-provenance-trustworthy-forecasts-2026

Spatial Source. (2026a, Feb 20). Australian PNT: Lots of potential, lots of danger. https://www.spatialsource.com.au/australian-pnt-lots-of-potential-lots-of-danger/

Spatial Source. (2026b, Feb 10). PNT assurance in the age of navwar. https://www.spatialsource.com.au/pnt-assurance-in-the-age-of-navwar/

Spatial Source. (2025, Feb 26). FrontierSI launches Australia’s first PNT Labs. https://www.spatialsource.com.au/frontiersi-launches-australias-first-pnt-labs/

U.S. Space Force. (n.d.). Global Positioning System (GPS). https://www.spaceforce.mil/About-Us/Fact-Sheets/Article/2197765/global-positioning-system/

Header image: Mhsheikholeslami, CC BY-SA 4.0 https://creativecommons.org/licenses/by-sa/4.0, via Wikimedia Commons

Reducing the Costs of Fragmented Spatial Data in 2026

Organizations invested heavily in geospatial tools and data throughout 2025. Yet many leaders found the return on that investment lower than expected. A core issue is fragmentation rather than a lack of data or technology capability. When spatial data is scattered across teams, tools, and formats, it becomes harder to trust, harder to maintain, and harder to use for meaningful decisions.

This is why 2026 will reward organizations that focus not on bigger geospatial systems, but on cleaner, right-sized spatial data pipelines that deliver clarity rather than complexity.

Industry forecasts reflect this shift. Analysts estimate the global geospatial analytics market at $114 billion in 2024, projecting growth to more than $226 billion by 2030 (Grand View Research, 2024). Another independent forecast places the market at $258 billion by 2032, driven by adoption across infrastructure, logistics, and smart-city applications (Fortune Business Insights, 2024). But as adoption accelerates, complexity rises: many organizations still struggle with data quality and context, which remain barriers to effective geospatial insight (Korem, 2025).

Costs of Fragmentation

Fragmentation rarely announces itself. It appears subtly in duplicate datasets, inconsistent update cycles, siloed maps, or “shadow” spatial layers created by individual teams. These inconsistencies introduce persistent operational friction:

  • Analysts spend more time reconciling data than interpreting it.
  • Cross-functional teams make decisions based on slightly different versions of the truth.
  • Trust in spatial outputs erodes as discrepancies accumulate.

Broader technology trend research highlights the same issue: modern digital environments are growing more complex, making integration discipline essential (McKinsey & Company, 2025). Nowhere is this truer than in geospatial workflows, where inconsistent data pipelines undermine the insights organizations depend on.

Bigger Systems != Bigger Insight

A persistent misconception is that impactful geospatial work requires enterprise-scale GIS stacks, large teams, or massive datasets. But today’s ecosystem offers a spectrum of options, from legacy proprietary solutions like ArcGIS to modern enterprise-grade open-source platforms built on engines such as DuckDB or Apache Sedona, along with an expanding set of specialized tools used across planning, logistics, environmental management, and operations. Independent analysis notes that GIS platforms enable organizations to integrate spatial data, visualize patterns, and support decision-making across sectors ranging from transportation to public safety (Research.com, 2025). Leaders can match tools to decisions rather than building infrastructure for its own sake.
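To illustrate how light the modern end of that spectrum can be, the sketch below uses DuckDB’s spatial extension to answer a typical operational question directly against files on disk. The file and column names are hypothetical:

```python
# Small-footprint spatial analytics with DuckDB's spatial extension.
import duckdb

con = duckdb.connect()
con.execute("INSTALL spatial;")
con.execute("LOAD spatial;")

# Which assets fall inside a designated risk zone?
rows = con.execute("""
    SELECT a.asset_id, z.zone_name
    FROM ST_Read('assets.geojson') AS a
    JOIN ST_Read('risk_zones.geojson') AS z
      ON ST_Within(a.geom, z.geom)
""").fetchall()

for asset_id, zone_name in rows:
    print(asset_id, zone_name)
```

No server, no enterprise deployment, no parallel data platform: the query runs wherever Python runs, against the files teams already have.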

Industry observers note similar trends: cloud-based GIS, AI-driven spatial analytics, and real-time data streams increasingly enable organizations of all sizes to integrate geospatial insight into their decision frameworks (LightBox, 2025). The threshold for adopting spatial intelligence is lower than ever — provided data pipelines remain clean and coherent.

ROI in Small, Targeted Spatial Insights

Some of the highest-value geospatial outcomes arise not from extensive datasets but from small, curated insights aligned to operational needs:

  • Identifying which assets fall inside specific risk zones.
  • Visualizing coverage gaps or service inconsistencies through a single boundary overlay.
  • Pinpointing route or deployment inefficiencies affecting field productivity.

Innovation trends reinforce this path. New geospatial entrants are developing AI-assisted mapping tools that allow non-technical teams to generate spatial insights without relying on specialized staff (Business Insider, 2025). This democratization of spatial intelligence reduces the need for one-off, isolated datasets, helping to prevent fragmentation before it starts.


MapIdea offers a particularly relevant perspective: geography can serve as a unifying analytical key, allowing organizations to connect datasets that share no other identifiers and reduce fragmentation across systems (MapIdea, 2025).

How To Start Simplifying in 2026

A right-sized approach doesn’t require heavy investment. It requires intentional design:

  1. Establish authoritative versions of key spatial datasets and retire duplicates.
  2. Align update cycles with operational rhythms, whether monthly or real-time.
  3. Integrate spatial data into existing analytics environments rather than building parallel systems.
  4. Start with one meaningful decision, demonstrate value, and scale deliberately.

These steps reduce friction, strengthen trust, and create a foundation for more advanced geospatial capability in the future.

The 2026 Opportunity

As the geospatial analytics market continues to grow at double-digit rates, organizations face a choice: accumulate complexity or pursue clarity.

Right-sized geospatial, built on coherent pipelines and targeted insights, offers a practical path forward. It replaces fragmentation with consolidation, trades overhead for agility, and, most importantly, positions geography as a shared context for informed decision-making across your organization.

Cercana can help you streamline your geospatial data portfolio and operations. Contact us today to learn more.

References

Business Insider. (2025). AI-powered mapping platform secures funding for next-generation geospatial tools. https://www.businessinsider.com/felt-ai-mapping-platform-funding-geographic-information-system-2025-7

Fortune Business Insights. (2024). Geospatial analytics market report. https://www.fortunebusinessinsights.com/geospatial-analytics-market-102219

Grand View Research. (2024). Geospatial analytics market size, share & trends analysis report. https://www.grandviewresearch.com/industry-analysis/geospatial-analytics-market

Korem. (2025). Geospatial trends in 2025: The latest industry evolutions. https://www.korem.com/geospatial-trends-in-2025-the-latest-industry-evolutions

LightBox. (2025). Top 10 trends in GIS technology for 2025. https://www.lightboxre.com/insight/top-10-trends-in-gis-technology-for-2025

MapIdea. (2025). Open letter to data and analytics leaders on geography. https://www.mapidea.com/blog/open-letter-to-data-and-analytics-about-geo

McKinsey & Company. (2025). Technology trends outlook 2025. https://www.mckinsey.com/~/media/mckinsey/business%20functions/mckinsey%20digital/our%20insights/the%20top%20trends%20in%20tech%202025/mckinsey-technology-trends-outlook-2025.pdf

Research.com. (2025). Best geographic information systems (GIS) in 2026. https://research.com/software/guides/best-geographic-information-systems

Header Image Credit: National Oceanic and Atmospheric Administration, Public Domain

A Novice Takes a Stab at GIS – Part 3

At this point in my entry-level upskilling project, the groundwork has been done. I have a polygon of the Chesapeake Bay laid over an OpenStreetMap layer and I know how to change the color of it. Going back to the initial post, my hope with this project is to show change over time in the crab population of the Bay. As a complete novice, I don’t even know if there’s a way for me to do that in QGIS, or if I’m going to make 15 different maps with the 15 years of data and turn images of them into a .gif. So, I went back to ChatGPT for guidance.

ChatGPT’s first suggestion was QGIS’s built-in Temporal Controller. It also told me I could use style changes by time attribute, the TimeManager plugin, or the manual process I had considered: turning a series of images into a .gif.

I’ll be using the Temporal Controller since it was the first option. I asked ChatGPT for a step-by-step guide of how to do this.

Before getting bogged down in the process of creating the visualization, it’s important to have my data prepped and ready to go. I asked ChatGPT how it needed to be set up in order to use the temporal controller function.

In this case, I’ve decided not to do the thing that ChatGPT says is easier. The “Join External Time Data to Polygon” option seems to involve more data preparation work and be a better process to know for future projects. I began by taking a screen capture of the data table from the Maryland DNR’s Winter Dredge Survey history, uploading it into ChatGPT, and having it use its OCR capabilities to make a table that I could paste into Excel and save as a .csv.

Step 1. (screenshot)

Step 2. (screenshot)

Step 3. (screenshot)

After looking back at some of the steps in the process of using the temporal controller (Step 2 above), the final product ended up looking like this. I went into the attribute table of the polygon and saw that it already had an assigned ID of “2250”, so I added that column. Additionally, since the geometry type is a polygon, I added that as well.
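For anyone following along who would rather skip Excel, the same join table can be built in a few lines of Python. This is a minimal sketch with pandas; the abundance values are made up rather than the real survey figures, and the column names are simply the ones I chose:

```python
import pandas as pd

# Hypothetical crab-abundance values; the real numbers come from the
# Maryland DNR Winter Dredge Survey table.
records = [
    {"bay_id": 2250, "date": f"{year}-01-01", "crab_abundance": value}
    for year, value in [(2010, 658), (2011, 452), (2012, 765)]
]

df = pd.DataFrame(records)

# QGIS joins on a shared key ("bay_id" must match the polygon's ID),
# and the Temporal Controller reads the "date" field once the layer's
# temporal properties are configured.
df.to_csv("dredge_survey.csv", index=False)
```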

With that, data preparation was complete and now I’m ready to move on to joining the data table to the polygon and creating the visualization. 

A Novice Takes a Stab at GIS – Part Two

Last week, I was able to settle on what the map I was creating would illustrate and find trustworthy data to use. This week, the focus is on actually creating the map itself. To do this, I need shapefiles of the Chesapeake Bay Watershed. 

I was able to source one from the Chesapeake Bay Program at data-chesbay.opendata.arcgis.com. This took me a handful of tries, as most of the publicly available shapefiles of the Bay are a polygon of all the land and water considered to be within the Chesapeake Bay watershed. For the purposes of this map, I was looking for just the water itself.

As a reminder, this is a self-guided process where I’m using ChatGPT to guide me through learning how to use QGIS. I’ve never loaded a shapefile before and ChatGPT gave me clear instructions.

In order to load the shapefile into QGIS, I dragged the downloaded folder, which included .shp, .xml, .shx, .prj, .dbf, and .cpg files, into a blank new project. I felt a brief moment of triumph before realizing that getting the land surrounding the Bay into the project would likely not be as simple, but it was actually even easier. 

QGIS has an OpenStreetMap layer built into the “XYZ Tiles” tab on the left side of the window. I turned it on, reordered the layers so that my shapefile of the water was over top of OSM, and that was all that needed to be done. The program had already lined up the shapefile of the Bay itself perfectly with where OSM had the Bay. 

Now it’s time to go back to Professor ChatGPT. I need to know how to change the color of the shapefile before I can even worry about assigning different colors to different levels of crab population, finding out how to automatically change the color based on data in a table, or anything else. 

Just to practice, I made the Bay crimson. (For those who prefer code to clicks, a Python-console version of this step follows the screenshots below.)

Step 1. 

Step 2.

Step 3.
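As promised above, here is roughly what the same steps look like in the QGIS Python console (Plugins > Python Console). This sketch is illustrative rather than part of my novice workflow: the file path and layer name are placeholders, and it assumes a standard QGIS 3.x install.

# Load the shapefile, add it to the project, and make the Bay crimson.
# Run inside QGIS; the path and layer name are placeholders.
from qgis.core import QgsVectorLayer, QgsProject, QgsFillSymbol

layer = QgsVectorLayer("/path/to/chesapeake_bay.shp", "Chesapeake Bay", "ogr")
QgsProject.instance().addMapLayer(layer)

symbol = QgsFillSymbol.createSimple({"color": "220,20,60"})  # crimson, as RGB
layer.renderer().setSymbol(symbol)
layer.triggerRepaint()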

In my next post, I’ll be going back to ChatGPT to learn how I can set up a table of data and instruct QGIS to change the color of the water based on the data in said table. I’m not sure how that will work or look yet, but that’s part of the learning. 

A Novice Takes a Stab at GIS

For the last handful of months, I’ve been working with my father’s company, Cercana Systems, to assist with content marketing and business development. In college, I finished most of a public relations degree at The University of Alabama before the first of my two daughters graced us with her presence and we decided to move back home to Maryland in order for her to actually have her extended family in her life. Since that time, I’ve found myself wanting more and more to find a way to help build things that provide a more tangible contribution to the world around me. Then, two weeks ago, I had the opportunity to attend FedGeoDay 2025 and decided it was time to start teaching myself some novice-level GIS skills. 

This year’s FedGeoDay theme of “open resilience” brought out a host of presenters who were using open geospatial ecosystems to do highly critical work in disaster prediction, preparation, and response. Everyone there was doing work that was well worth doing.

That brings us to the purpose of this blog post. It is the first in a series of undetermined length about teaching myself how to use QGIS as a first foray into learning GIS skills. I am a complete novice with no coding or technical background of any kind. My hope is that documenting the process of attempting to make my first data visualization in a commonly used tool may help others who want to delve into the field get past any apprehension about getting off the starting block.

Step 1 – Pick a topic and find data

Initially, I had wanted to create a handful of heat maps that showed the change in the number of youth lacrosse programs in the Mid-Atlantic over the last 15 years. I turned to ChatGPT to source the data, but it was unable to find historical records for me to use.

“I want to build a heat map showing change over time in the concentration of youth lacrosse programs in the states of Maryland, Virginia, Delaware, New Jersey, Pennsylvania, North Carolina, West Virginia, and the District of Columbia. Could you scour the internet for data that I can use for this and put the data in a spreadsheet? The spreadsheet should have three columns. The first column will indicate the year, the second will indicate the county, and the third should indicate the number of youth teams in that county. Let’s look at Spring teams only, not Fall, Winter, or Summer. Data should go back 15 years (2010-2025). Let’s consider “youth lacrosse” to be teams for children as old as 8th grade or 14 years of age. Some teams will use the “under X” age categorization, some will use the projected year of high school graduation, and some will just indicate a standard age range or grade level for the children it caters to, so be sure to look out for all of those. Can you do this?”

ChatGPT responded:

Compiling a comprehensive dataset detailing the number of youth lacrosse teams by county from 2010 to 2025 across Maryland, Virginia, Delaware, New Jersey, Pennsylvania, North Carolina, West Virginia, and the District of Columbia presents significant challenges. The primary hurdle is the lack of publicly available, granular historical data spanning this 15-year period. While some organizations provide current information, historical records at the county level are scarce.

Available Data and Resources

  • Maryland Youth Lacrosse Association (MYLA): Oversees youth lacrosse in Maryland, offering current season details and age groupings. However, historical data by county is not readily accessible on their website.

  • Western Maryland Youth Lacrosse Conference (WMYLC): Covers counties in Western Maryland and parts of neighboring states, with a focus on promoting youth lacrosse. Specific team counts by county and year are not provided online.

  • Anne Arundel County Recreation and Parks: Provides detailed information on current boys’ lacrosse leagues, including age divisions and game schedules. Historical data is not available on their platform.

  • Inside Lacrosse Articles: Discuss trends in lacrosse participation, noting shifts in player demographics and regional growth. These articles offer insights but lack specific county-level data over the desired time frame.

Challenges in Data Compilation

  • Data Availability: Most organizations do not maintain or publicly share historical team counts by county.

In the real world, sourcing and compiling accurate data can often be the bulk of the work on a GIS project. However, the point of this pursuit is to learn how to use QGIS, not to spend countless hours calling the Parks and Rec department of every county in the Mid-Atlantic, so I decided to pivot to something else.

So now, I’m looking for historical data over the last 15 years on the blue crab population in various sections of the Chesapeake Bay estuary. My new goal will be to create one map that shows the places where the population has increased the most, increased the least, and even decreased since 2010. 

This information was readily available from Maryland’s Department of Natural Resources, with one caveat. 

There was plenty of data on the blue crab population available, but I wasn’t finding any that was split up by region of the Bay. Nonetheless, creating the map and shading the entire Bay based on year-to-year percent change in population density from the median of the data is a good beginner project for learning anything about QGIS at all, so we’re rolling with it.

Step 2 – Installing QGIS

While it may seem like a silly step to document, this is supposed to be a properly novice guide to making a map in QGIS, and it’s a touch difficult to do that without installing the program. The machine I’m using is a 2020 M1 MacBook Air running macOS Sonoma 14.6.1. I downloaded the installer for the long-term release version of QGIS from qgis.org, went through the install process, and attempted to open it.

Naturally, my MacBook was less than thrilled that I was attempting to run a program I hadn’t downloaded from the App Store, and it completely blocked me from running the software when I opened it from the main application screen. The issue was resolved by going to the “Applications” folder in Finder and opening QGIS with control+click. A warning popped up about not being able to verify that the application contained no malware; I ran it anyway, and I have not had any issues opening the application since.

The next step will be to actually crack QGIS open and begin creating a map of the Chesapeake Bay. 

Geospatial Without Maps

When most people hear “geospatial,” they immediately think of maps. But in many advanced applications, maps never enter the picture at all. Instead, geospatial data becomes a powerful input to machine learning workflows, unlocking insights and automation in ways that don’t require a single visual.

At its core, geospatial data is structured around location—coordinates, areas, movements, or relationships in space. Machine learning models can harness this spatial logic to solve complex problems without ever generating a map. For example:

  • Predictive Maintenance: Utility companies use the GPS coordinates of assets (like transformers or pipelines) to predict failures based on environmental variables like elevation, soil type, or proximity to vegetation (AltexSoft, 2020). No map is needed—only spatially enriched feature sets for training the model.
  • Crop Classification and Yield Prediction: Satellite imagery is commonly processed into grids of numerical features (such as NDVI indices, surface temperature, soil moisture) associated with locations. Models use these purely as tabular inputs to predict crop types or estimate yields (Dash, 2023).
  • Urban Mobility Analysis: Ride-share companies model supply, demand, and surge pricing based on geographic patterns. Inputs like distance to transit hubs, density of trip starts, or average trip speeds by zone feed machine learning models that optimize logistics in real time (MIT Urban Mobility Lab, n.d.).
  • Smart Infrastructure Optimization: Photometrics AI employs geospatial AI to enhance urban lighting systems. By integrating spatial data and AI-driven analytics, it optimizes outdoor lighting to ensure appropriate illumination on streets, sidewalks, crosswalks, and bike lanes while minimizing light pollution in residential areas and natural habitats. This approach not only improves safety and energy efficiency but also supports environmental conservation efforts (EvariLABS, 2025).

These examples show how spatial logic—such as spatial joins, proximity analysis, and zonal statistics—can drive powerful workflows even when no visualization is involved. In each case, the emphasis shifts from presenting information to enabling analysis and automation. Features are engineered based on where things are, not just what they are. However, once the spatial context is baked into the dataset, the model itself treats location-derived features just like any other numerical or categorical variable.
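To make that concrete, here is a minimal sketch of spatial feature engineering feeding a tabular model. Everything in it (names, coordinates, the single reference hub) is hypothetical; the point is that the engineered distance ends up as an ordinary numeric column.

# Hypothetical sketch: turn location into a plain numeric feature.
import pandas as pd
from shapely.geometry import Point

transit_hub = Point(-76.615, 39.286)  # a fixed reference location

assets = pd.DataFrame({
    "asset_id": [101, 102, 103],
    "lon": [-76.60, -76.70, -76.55],
    "lat": [39.30, 39.25, 39.40],
})

# Engineer the spatial feature: distance to the hub. (Planar degrees here;
# real work would first project to a metric CRS.)
assets["dist_to_hub"] = [
    Point(x, y).distance(transit_hub) for x, y in zip(assets["lon"], assets["lat"])
]
# dist_to_hub is now just another column for any downstream ML model.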

Using geospatial technology without maps allows organizations to focus on operational efficiency, predictive insights, and automation without the overhead of visualization. In many workflows, the spatial relationships between objects are valuable as data features rather than elements needing human interpretation. By integrating geospatial intelligence directly into machine learning models and decision systems, businesses and governments can act on spatial context faster, at scale, and with greater precision.

To capture these relationships systematically, spatial models like the Dimensionally Extended nine-Intersection Model (DE-9IM) (Clementini & Di Felice, 1993) provide a critical foundation. In traditional relational databases, connections between records are typically simple—one-to-one, one-to-many, or many-to-many—and must be explicitly designed and maintained. DE-9IM extends this by defining nuanced geometric interactions, such as overlapping, touching, containment, or disjointness, which are implicit in the spatial nature of geographic objects. This significantly reduces the design and maintenance overhead while allowing for much richer, more dynamic spatial relationships to be leveraged in analysis and workflows.

By embedding DE-9IM spatial predicates into machine learning workflows, organizations can extract richer, context-aware features from their data. For example, rather than merely knowing two infrastructure assets are ‘related,’ DE-9IM enables classification of whether one is physically inside a risk zone, adjacent to a hazard, or entirely separate—substantially improving the precision of classification models, risk assessments, and operational planning.

Machine learning and AI systems benefit from the DE-9IM framework by gaining access to structured, machine-readable spatial relationships without requiring manual feature engineering. Instead of inferring spatial context from raw coordinates or designing custom proximity rules, models can directly leverage DE-9IM predicates as input features. This enhances model performance in tasks such as spatial clustering, anomaly detection, and context-aware classification, where the precise nature of spatial interactions often carries critical predictive signals. Integrating DE-9IM into AI pipelines streamlines spatial feature extraction, improves model explainability, and reduces the risk of omitting important spatial dependencies.
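As a quick illustration, libraries such as Shapely expose DE-9IM directly: relate() returns the nine-cell intersection matrix as a compact string. The geometries below are toy examples, but the resulting predicates and matrix strings are exactly the kind of machine-readable spatial features described above.

# Toy sketch: DE-9IM relationships as model-ready features via Shapely.
from shapely.geometry import Point, Polygon

risk_zone = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
asset_inside = Point(2, 2).buffer(0.5)
asset_adjacent = Polygon([(4, 1), (6, 1), (6, 3), (4, 3)])

print(asset_inside.relate(risk_zone))     # DE-9IM string, e.g. '2FF1FF212'
print(asset_adjacent.relate(risk_zone))   # e.g. 'FF2F11212' (touching)

# Booleans derived from the matrix work as categorical features, too.
print(asset_inside.within(risk_zone))     # True: inside the risk zone
print(asset_adjacent.touches(risk_zone))  # True: adjacent to the hazard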

Harnessing geospatial intelligence without relying on maps opens up powerful new pathways for innovation, operational excellence, and automation. Whether optimizing infrastructure, improving predictive maintenance, or enriching machine learning models with spatial logic, organizations can leverage these techniques to achieve better outcomes with less overhead. At Cercana Systems, we specialize in helping clients turn geospatial data into actionable insights that drive real-world results. Ready to put geospatial AI to work for you? Contact us today to learn how we can help you modernize and optimize your data-driven workflows.

References

AltexSoft. (2020). Predictive maintenance: Employing IIoT and machine learning to prevent equipment failures. https://www.altexsoft.com/blog/predictive-maintenance/

Clementini, E., & Di Felice, P. (1993). A model for representing topological relationships between complex geometric objects. ACM Transactions on Information Systems, 11(2), 161–193. https://doi.org/10.1016/0020-0255(95)00289-8

Dash, S. K. (2023, May 10). Crop classification via satellite image time-series and PSETAE deep learning model. Medium. https://medium.com/geoai/crop-classification-via-satellite-image-time-series-and-psetae-deep-learning-model-c685bfb52ce

EvariLABS. (2025, April 14). Photometrics AI. LinkedIn. https://www.linkedin.com/pulse/what-counts-real-roi-streetlight-owners-operators-photometricsai-vqv7c/

MIT Urban Mobility Lab. (n.d.). Machine learning for transportation. Massachusetts Institute of Technology. https://mobility.mit.edu/machine-learning

Reflections on the Process of Planning FedGeoDay 2025

What is FedGeoDay?

FedGeoDay is a single-track conference dedicated to federal use cases of open geospatial ecosystems. Open ecosystems have a wide variety of uses and forms, but largely include anything designed around open data, open-source software, and open standards. The main event is a one-day commitment and is followed by a day of optional hands-on workshops.

FedGeoDay has existed for roughly a decade, serving as a day of learning, networking, and collaboration in the Washington, D.C. area. Recently, Cercana Systems president Bill Dollins was invited to join the planning committee, and he served as one of the co-chairs for FedGeoDay 2024 and 2025. His hope is that attendees come away with practical examples of how to effectively use open geospatial ecosystems in their jobs.

Photo courtesy of OpenStreetMap US on LinkedIn.

“Sometimes the discussion around those concepts can be highly technical and even a little esoteric, and that’s not necessarily helpful for someone who’s just got a day job that revolves around solving a problem. Events like this are very helpful in showing practical ways that open software and open data can be used.”

Dollins joined the committee for a multitude of reasons. In this post, we will explore some of his reasons for joining, as well as what he thinks he brings to the table in planning the event and things he has learned from the process. 

Why did you join the committee?

When asked why he joined the planning committee for FedGeoDay, Dollins indicated that his primary purpose was to give back, in a very hands-on way, to a community that has been helpful and valuable to him throughout his career.

“In my business, I derive a lot of value from open-source software. I use it a lot in the solutions I deliver in my consulting, and when you’re using open-source software you should find a way that works for you to give back to the community that developed it. That can come in a number of ways. That can be contributing code back to the projects that you use to make them better. You can develop documentation for it, you can provide funding, or you can provide education, advocacy, and outreach. Those last three components are a big part of what FedGeoDay does.”

He also says that while being a co-chair of such an impactful event helps him maintain visibility in the community, getting the opportunity to keep his teamwork skills fresh was important to him, too.

“For me, also, I’m self-employed. Essentially, I am my team,” said Dollins. “It can be really easy to sit at your desk and deliver things and sort of lose those skills.”

What do you think you brought to the committee?

Dollins has had a long career in the geospatial field, most of it spent in leadership positions, so he was confident in his ability to contribute in this new kind of leadership role. Event planning is a beast of its own, but early in his career, the senior leadership around him went out of their way to teach him about project cost management, staffing, and agenda planning. He took those skills into a partner role at a small contracting firm, where he wore every hat he could fit on his head for the next 15 years while still doing a lot of technical and development work. Following his time there, he had the opportunity to join the C-suite of a private-sector SaaS company for six years, rounding out his leadership experience.

One thing he felt he lacked was experience in community engagement, and event planning is a great way to develop those skills.

“Luckily, there’s a core group of people who have been planning and organizing these events for several years. They’re generally always happy to get additional help and they’re really encouraging and really patient in showing you the rules of the road, so that’s been beneficial, but my core skills around leadership were what applied most directly. It also didn’t hurt that I’ve worked with geospatial technology for over 30 years and open-source geospatial technology for almost 20, so I understood the community these events serve and the technology they are centered around,” said Dollins.

Photo courtesy of Ran Goldblatt on LinkedIn.

What were some of the hard decisions that had to be made?

Photo Courtesy of Cercana Systems on LinkedIn.

Attendees of FedGeoDay in previous years will likely remember that the event has always been free for feds to attend. Upon examining the revenue sheets from last year’s event, the planning committee noted that the single largest unaccounted-for cost was the free luncheon. A post-event survey was sent out, and federal attendees largely indicated that they would not take issue with contributing $20 to cover the cost of lunch. However, the landscape of the community then changed in a manner most people did not see coming.

“We made the decision last year, and keep in mind the tickets went on sale before the change of administration, so at the time we made the decision last year it looked like a pretty low-risk thing to do,” said Dollins.

Dollins went on to say that while the landscape shifts any time the administration changes, even without a change in the party in power, this one has been particularly jarring.

“There’s probably a case to be made that we could have bumped up the cost of some of the sponsorships and possibly the industry tickets a little bit and made an attempt to close the gap that way. We’ll have to see what the numbers look like at the end. The most obvious variable cost was the cost of lunches against the free tickets, so it made sense to do last year and we’ll just have to look and see how the numbers play out this year.”**

What have you taken away from this experience?

Dollins says one of his biggest takeaways from the process of helping to plan FedGeoDay has been learning to apply leadership in a different context. Throughout most of his career, he has served as a leader in traditional team structures with a clearly defined hierarchy and specified roles. Working with a team of volunteers who have their own day jobs to be primarily concerned with requires a different approach.

“Everyone’s got a point of view, everyone’s a professional and generally a peer of yours, and so there’s a lot more dialogue. The other aspect is that it also means everyone else has a day job, so sometimes there’s an important meeting and the one person that you needed to be there couldn’t do it because of that. You have to be able to be a lot more asynchronous in the way you do these things. That’s a good thing to give you a different approach to leadership and team work,” said Dollins on the growth opportunity. 

Dollins has even picked up some new work from his efforts on the planning committee by virtue of getting to work and network with people who weren’t necessarily in his circle beforehand. Though he’s worked in the geospatial field for 30 years and focused heavily on open-source work for 20, he says he felt hidden away from the community, in a sense, during his time in the private sector.

Photo courtesy of Lane Goodman on LinkedIn.

“This has helped me get back circulating in the community and to be perceived in a different way. In my previous iterations, I was seen mainly from a technical perspective, and so this has kind of helped me let the community see me in a different capacity, which I think has been beneficial.”

FedGeoDay 2025 has concluded and was a huge success for all involved. Cercana Systems looks forward to continuing to sponsor the event going forward, and Dollins looks forward to continuing to help this impactful event bring the community together in the future. 

Photo courtesy of Cercana Systems on LinkedIn.

**This interview was conducted before FedGeoDay 2025 took place. The event exceeded the attendance levels of FedGeoDay 2024. 

Demystifying the Medallion Architecture for Geospatial Data Processing

Introduction

Geospatial data volumes and complexity are growing due to diverse sources such as GPS, satellite imagery, and sensor data. Traditional geospatial processing methods face challenges, including scalability, handling various formats, and ensuring data consistency. The medallion architecture offers a layered approach to data management, improving data processing, reliability, and scalability. While the medallion architecture is often associated with specific implementations such as Delta Lake, its concepts are applicable to other technical implementations. This post introduces the medallion architecture and discusses two workflows—traditional GIS-based and advanced cloud-native—to demonstrate how it can be applied to geospatial data processing.

Overview of the Medallion Architecture

The medallion architecture was developed to address the need for incremental, layered data processing, especially in big data and analytics environments. It is composed of three layers:

  • Bronze Layer: Stores raw data as-is from various sources.
  • Silver Layer: Cleans and transforms data for consistency and enrichment.
  • Gold Layer: Contains aggregated and optimized data ready for analysis and visualization.

The architecture is particularly useful in geospatial applications due to its ability to handle large datasets, maintain data lineage, and support both batch and real-time data processing. This structured approach ensures that data quality improves progressively, making downstream consumption more reliable and efficient.

Why Geospatial Data Architects Should Consider the Medallion Architecture

Geospatial data processing involves unique challenges, such as handling different formats (raster, vector), managing spatial operations (joins, buffers), and accommodating varying data sizes. Traditional methods struggle when scaling to large, real-time datasets or integrating data from multiple sources. The medallion architecture addresses these challenges through its layered approach. The bronze layer preserves the integrity of raw data, allowing transformations to be traced easily. The silver layer handles transformations of the data, such as projections, spatial joins, and data enrichment. The gold layer provides ready-to-consume, performance-optimized data for downstream systems.

Example Workflow 1: Traditional GIS-Based Workflow  

For organizations that rely on established GIS tools or operate with limited cloud infrastructure, the medallion architecture provides a structured approach to data management while maintaining compatibility with traditional workflows. This method ensures efficient handling of both vector and raster data, leveraging familiar GIS technologies while optimizing data accessibility and performance.  

This workflow integrates key technologies to support data ingestion, processing, and visualization. FME serves as the primary ETL tool, streamlining data movement and transformation. Object storage solutions like AWS S3 or Azure Blob Storage store raw spatial data, ensuring scalable and cost-effective management. PostGIS enables spatial analysis and processing for vector datasets. Cloud-Optimized GeoTIFFs (COGs) facilitate efficient access to large raster datasets by allowing partial file reads, reducing storage and processing overhead. 

Bronze – Raw Data Ingestion 

The process begins with the ingestion of raw spatial data into object storage. Vector datasets, such as Shapefiles and CSVs containing spatial attributes, are uploaded alongside raster datasets like GeoTIFFs. FME plays a crucial role in automating this ingestion, ensuring that all incoming data is systematically organized and accessible for further processing.  

Silver – Data Cleaning and Processing

At this stage, vector data is loaded into PostGIS, where essential transformations take place. Operations such as spatial joins, coordinate system projections, and attribute filtering help refine the dataset for analytical use. Meanwhile, raster data undergoes optimization through conversion into COGs using FME. This transformation enhances performance by enabling GIS applications to read only the necessary portions of large imagery files, improving efficiency in spatial analysis and visualization.  
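For the raster half of this step, the workflow above uses FME; as an equivalent scriptable route, here is a hedged sketch of the same COG conversion using GDAL’s Python bindings, assuming GDAL 3.1 or newer (the first release with the built-in COG driver). File names are illustrative.

# Convert a raw GeoTIFF from the bronze layer into a Cloud-Optimized
# GeoTIFF for the silver layer. Requires GDAL >= 3.1; paths are examples.
from osgeo import gdal

gdal.UseExceptions()  # raise Python exceptions instead of silent failures
gdal.Translate(
    "silver/bay_imagery_cog.tif",
    "bronze/bay_imagery.tif",
    format="COG",
    creationOptions=["COMPRESS=DEFLATE", "OVERVIEWS=AUTO"],
)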

Gold – Optimized Data for Analysis and Visualization  

Once processed, the refined vector data in PostGIS and the optimized raster datasets in COG format are made available to GIS tools. Analysts and decision-makers can interact with the data using platforms such as QGIS, Tableau, or GeoServer. These tools provide the necessary visualization and analytical capabilities, allowing users to generate maps, conduct spatial analyses, and derive actionable insights.

This traditional GIS-based implementation of medallion architecture offers several advantages. It leverages established GIS tools and workflows, minimizing the need for extensive retraining or infrastructure changes. It is optimized for traditional environments yet still provides the flexibility to integrate with hybrid or cloud-based analytics platforms. Additionally, it enhances data accessibility and performance, ensuring that spatial datasets remain efficient and manageable for analysis and visualization.  

By adopting this workflow, organizations can modernize their spatial data management practices while maintaining compatibility with familiar GIS tools, resulting in a seamless transition toward more structured and optimized data handling. 

Example Workflow 2: Advanced Cloud-Native Workflow  

For organizations managing large-scale spatial datasets and requiring high-performance processing in cloud environments, a cloud-native approach to medallion architecture provides scalability, efficiency, and advanced analytics capabilities. By leveraging distributed computing and modern storage solutions, this workflow enables seamless processing of vector and raster data while maintaining cost efficiency and performance.  

This workflow is powered by cutting-edge cloud-native technologies that optimize storage, processing, and version control. 

Object Storage solutions such as AWS S3, Google Cloud Storage, or Azure Blob Storage serve as the foundation for storing raw geospatial data, ensuring scalable and cost-effective data management. Apache Spark with Apache Sedona enables large-scale spatial data processing, leveraging distributed computing to handle complex spatial joins, transformations, and aggregations. Delta Lake provides structured data management, supporting versioning and ACID transactions to ensure data integrity throughout processing. RasterFrames or Rasterio facilitate raster data transformations, including operations like mosaicking, resampling, and reprojection, while optimizing data storage and retrieval.  

Bronze – Raw Data Ingestion

The workflow begins by ingesting raw spatial data into object storage. This includes vector data such as GPS logs in CSV format and raster data like satellite imagery stored as GeoTIFFs. By leveraging cloud-based storage solutions, organizations can manage and access massive datasets without traditional on-premises limitations.  

Silver – Data Processing and Transformation

At this stage, vector data undergoes large-scale processing using Spark with Sedona. Distributed spatial operations such as filtering, joins, and projections enable efficient refinement of large datasets. Meanwhile, raster data is transformed using RasterFrames or Rasterio, which facilitate operations like mosaicking, resampling, and metadata extraction. These tools ensure that raster datasets are optimized for both analytical workloads and visualization purposes.  
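As a rough illustration of the vector side, the sketch below uses PySpark with Apache Sedona’s SQL functions. It is a sketch under stated assumptions, not a reference implementation: bucket paths and column names are invented, the bootstrap call shown is the Sedona 1.4+ style (earlier releases register the functions differently), and writing Delta format assumes delta-spark is configured on the cluster.

# Silver-layer sketch: clean and reproject raw GPS points with Sedona SQL.
from sedona.spark import SedonaContext

config = SedonaContext.builder().getOrCreate()
sedona = SedonaContext.create(config)

# Bronze: raw GPS logs landed in object storage as CSV (paths illustrative)
logs = sedona.read.option("header", "true").csv("s3a://bronze/gps_logs/")
logs.createOrReplaceTempView("gps_logs")

# Silver: build geometries, reproject, and filter with distributed SQL
cleaned = sedona.sql("""
    SELECT device_id,
           ST_Transform(
               ST_Point(CAST(lon AS DOUBLE), CAST(lat AS DOUBLE)),
               'epsg:4326', 'epsg:3857') AS geom
    FROM gps_logs
    WHERE lon IS NOT NULL AND lat IS NOT NULL
""")

# Hand off toward the gold layer; assumes Delta Lake is available
cleaned.write.format("delta").mode("overwrite").save("s3a://silver/gps_points/")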

Gold – Optimized Data for Analysis and Visualization

Once processed, vector data is stored in Delta Lake, where it benefits from structured storage, versioning, and enhanced querying capabilities. This ensures that analysts can access well-maintained datasets with full historical tracking. Optimized raster data is converted into Cloud-Optimized GeoTIFFs, allowing efficient cloud-based visualization and integration with GIS tools. These refined datasets can then be used in cloud analytics environments or GIS platforms for advanced spatial analysis and decision-making.  

This cloud-native implementation of the medallion architecture provides several advantages for large-scale spatial data workflows: high scalability, enabling efficient processing of vast datasets without the constraints of traditional infrastructure; parallelized data transformations, significantly reducing processing time through distributed computing frameworks; and cloud-native optimizations, ensuring seamless integration with advanced analytics platforms, storage solutions, and visualization tools.

By adopting this approach, organizations can harness the power of cloud computing to manage, analyze, and visualize geospatial data at an unprecedented scale, improving both efficiency and insight generation.  

Comparing the Two Workflows

| Aspect | Traditional Workflow (FME + PostGIS) | Advanced Workflow (Spark + Delta Lake) |
| --- | --- | --- |
| Scalability | Suitable for small to medium workloads | Ideal for large-scale datasets |
| Technologies | FME, PostGIS, COGs, file system or object storage | Spark, Sedona, Delta Lake, RasterFrames, object storage |
| Processing method | Sequential or batch processing | Parallel and distributed processing |
| Performance | Limited by local infrastructure or on-premises servers | Optimized for cloud-native and distributed environments |
| Use cases | Small teams, traditional GIS setups, hybrid cloud setups | Large organizations, big data environments |

Key Takeaways

The medallion architecture offers much-needed flexibility and scalability for geospatial data processing. It meshes well with traditional workflows using FME and PostGIS, which are effective for organizations with established GIS infrastructure, and it can be used in cloud-native workflows with Apache Spark and Delta Lake to provide scalability for large-scale processing. Either workflow can be adapted to the organization’s technological maturity and requirements.

Conclusion

Medallion architecture provides a structured, scalable approach to geospatial data management, ensuring better data quality and streamlined processing. Whether using a traditional GIS-based workflow or an advanced cloud-native approach, this framework helps organizations refine raw spatial data into high-value insights. By assessing their infrastructure and data needs, teams can adopt the workflow that best aligns with their goals, optimizing efficiency and unlocking the full potential of their geospatial data.

Applying Porter’s Five Forces to Open-Source Geospatial

Introduction

The geospatial industry has seen significant transformation with the rise of open-source solutions. Tools like QGIS, PostGIS, OpenLayers, and GDAL have provided alternatives to proprietary GIS software, offering cost-effective, customizable, and community-driven mapping and spatial analysis capabilities. While open-source GIS thrives on collaboration and accessibility, it still operates within a competitive landscape influenced by external pressures.

Applying Porter’s Five Forces, a framework for competitive analysis developed by Michael E. Porter in 1979, allows us to analyze the industry dynamics and understand the challenges and opportunities open-source GIS solutions face. The five forces include the threat of new entrants, bargaining power of suppliers, industry rivalry, bargaining power of buyers, and the threat of substitutes. We will explore how these forces shape the world of open-source geospatial technology.

Porter’s Five Forces was conceived to analyze traditional market-driven dynamics. While open-source software development is not necessarily driven by a profit motive, successful open-source projects require thriving, supportive communities. Such communities still require resources – either money or, scarcer still, time. As a result, a certain amount of market thinking can be useful when considering adopting open-source software into your operations or starting a new project.

Porter articulated the five forces in terms of “threats” and “power” and “rivalry.” We have chosen to retain that language here for alignment with the model but, in the open-source world, many of these threats can represent opportunities for greater collaboration.

1. Threat of New Entrants: Low to Moderate

The barriers to entry in open-source geospatial solutions are low for basic tool development compared to proprietary software development. Developers can utilize existing open-source libraries, open geospatial data, and community-driven documentation to build new tools with minimal investment.

However, gaining significant adoption or community traction presents higher barriers than described in traditional new entrant scenarios. Well-established open-source solutions like QGIS, PostGIS, and OpenLayers have strong community backing and extensive documentation, making it challenging for new entrants to attract users.

New players may find success by focusing on novel or emerging use cases like AI-powered GIS, cloud-based mapping solutions, or real-time spatial analytics. Companies that provide specialized integrations or enhancements to existing open-source GIS tools may also gain traction. DuckDB, with its edge deployability, is a good example of this.

While new tools are relatively easy to develop, achieving broad community engagement often requires differentiation, sustained innovation, and compatibility with established standards and ecosystems.

2. Bargaining Power of Suppliers: Low to Moderate

Unlike proprietary GIS, where vendors control software access, open-source GIS minimizes supplier dependence due to its open standards and community-driven development. The availability of open geospatial datasets (e.g., OpenStreetMap, NASA Earthdata, USGS) further reduces the influence of traditional suppliers.

Moderate supplier power can arise in scenarios where users depend heavily on specific service providers for enterprise-level support, long-term maintenance, or proprietary enhancements (e.g., enterprise hosting or AI-powered extensions). Companies offering such services, like Red Hat’s model for Linux, could gain localized influence over organizations that require continuous, tailored support.

However, competition among service providers ensures that no single vendor holds significant leverage. This can work to the benefit of users, who often require lifecycle support. Localized supplier influence can grow in enterprise settings where long-term support contracts are critical, making it a consideration in high-complexity deployments.

3. Industry Rivalry: Moderate to High

While open-source GIS tools are developed with a collaborative ethos, competition still exists, particularly in terms of user adoption, funding, and enterprise contracts. Users typically don’t choose multiple solutions in a single category, so a level of de facto competition is implied even though open-source projects don’t explicitly and directly compete with each other in the same manner as proprietary software.

  • Open-source projects compete for users: QGIS, GRASS GIS, and gvSIG compete in desktop GIS; OpenLayers, Leaflet, and MapLibre compete in web mapping.
  • Enterprise support: Companies providing commercial support for open-source GIS tools compete for government and business contracts.
  • Competition from proprietary GIS: Esri, Google Maps, and Hexagon offer integrated GIS solutions with robust support, putting pressure on open-source tools to keep innovating.

However, open-source collaboration reduces direct rivalry. Many projects integrate with one another (e.g., PostGIS works alongside QGIS), creating a cooperative rather than competitive environment. While open-source GIS projects indirectly compete for users and funding, collaboration mitigates this and creates shared value. 

Emerging competition from cloud-native platforms and real-time analytics tools, such as SaaS GIS and geospatial AI services, increases rivalry. As geospatial technology evolves, integrating AI and cloud functionalities may determine long-term competitiveness.

When looking to adopt open-source, consider that loose coupling through the use of open standards can add greater value. When considering starting a new open-source project, have integration and standardization in mind to potentially increase adoption.

4. Bargaining Power of Buyers: Moderate

In the case of open-source, “bargaining” refers to the ability of the user to switch between projects, rather than a form of direct negotiation. The bargaining power of buyers in the open-source GIS space is significant, primarily due to the lack of upfront capital expenditure. This financial flexibility enables users to explore and switch between tools without major cost concerns. While both organizational and individual users have numerous alternatives across different categories, this flexibility does not necessarily translate to strong influence over the software’s development.

Key factors influencing buyer power:

  • Minimal financial lock-in: In the early stages of adoption, users can easily migrate between open-source tools. However, as organizations invest more time in customization, workflow integration, and user training, switching costs increase, gradually reducing their flexibility.
  • Community-driven and self-support options: Buyers can access free support through online forums, GitHub repositories, and community-driven resources, lowering their dependence on paid services.
  • Customizability and adaptability: Open-source GIS software allows organizations to tailor the tools to their specific needs without vendor constraints. However, creating a custom version (or “fork”) requires caution, as it could result in a bespoke solution that the organization must maintain independently.

To maximize their influence, new users should familiarize themselves with the project’s community and actively participate by submitting bug reports, fixes, or documentation. Consistent contributions aligned with community practices can gradually enhance a user’s role and influence over time.

For large enterprises and government agencies, long-term support requirements – especially for mission-critical applications – can reduce their flexibility and bargaining power over time. This dependency highlights the importance of enterprise-level agreements in managing risk.

5. Threat of Substitutes: Moderate to High

Substitutes for open-source GIS tools refer to alternatives that provide similar functionality. These substitutes include:

  • Proprietary GIS software: Tools like ArcGIS, Google Maps, and Hexagon are preferred by many organizations due to their perceived stability, advanced features, and enterprise-level support.
  • Cloud-based and SaaS GIS platforms: Services such as Felt, MapIdea, Atlas, Mapbox, and CARTO offer user-friendly, web-based mapping solutions with minimal infrastructure requirements.
  • Business Intelligence (BI) and AI-driven analytics: Platforms like Tableau, Power BI, and AI-driven geospatial tools can partially or fully replace traditional GIS in certain applications.
  • Other open-source GIS tools: Users can switch between alternatives like QGIS, GRASS, OpenLayers, or MapServer with minimal switching costs.

However, open-source GIS tools often complement rather than fully replace proprietary systems. For instance, libraries like GDAL and GeoPandas are frequently used alongside proprietary solutions like ArcGIS. Additionally, many SaaS platforms incorporate open-source components, offering organizations a hybrid approach that minimizes infrastructure investment while leveraging open-source capabilities.

The emergence of AI-driven spatial analysis and real-time location intelligence platforms is increasingly positioning them as partial substitutes to traditional GIS, intensifying this threat. As these technologies mature, hybrid models integrating both open-source and proprietary elements will become more common.

Conclusion

Porter’s Five Forces analysis reveals that open-source geospatial solutions exist in a highly competitive and evolving landscape. While they benefit from free access, strong community support, and low supplier dependence, they also face competition from proprietary GIS, SaaS-based alternatives, and substitutes like AI-driven geospatial analytics.

To remain competitive, open-source GIS projects must not only innovate in cloud integration and AI-enhanced spatial analysis but also respond to the shifting landscape of real-time analytics and SaaS-based delivery models. Strengthening enterprise support, improving user-friendliness, and maintaining strong community engagement will be key to their long-term sustainability.

As geospatial technology advances, open-source GIS will continue to play a crucial role in democratizing access to spatial data and analytics, offering an alternative to fully proprietary systems while fostering collaboration and technological growth.

To learn more about how Cercana can help you develop your open-source geospatial strategy, contact us here.

Using Hstore to Analyze OSM in PostgreSQL

OpenStreetMap (OSM) is a primary authoritative source of geographic information, offering a variety of community-validated feature types. However, efficiently querying and analyzing OSM poses unique challenges. PostgreSQL, with its hstore data type, can be a powerful tool in the data analyst’s arsenal.

Understanding hstore in PostgreSQL

Before getting into the specifics of OpenStreetMap, let’s understand the hstore data type. Hstore is a key-value store within PostgreSQL, allowing data to be stored in a schema-less fashion. This flexibility makes it ideal for handling semi-structured data like OpenStreetMap tags.

Setting Up Your Environment

To get started, you’ll need a PostgreSQL database with PostGIS extension, which adds support for geographic objects. You will also need to add support for the hstore type. Both PostGIS and hstore are installed as extensions. The SQL to install them is:

create extension postgis;
create extension hstore;

After setting up your database, import OpenStreetMap data using tools like osm2pgsql, ensuring to import the data with the hstore option enabled. This step is crucial as it allows the key-value pairs of OSM tags to be stored in an hstore column. Be sure to install osm2pgsql using the instructions for your platform.

The syntax for importing is as follows:

osm2pgsql -c -d my_database -U my_username -W -H my_host -P my_port --hstore my_downloaded.osm

Querying OpenStreetMap Data

With your data imported, you can now unleash the power of hstore. Here’s a basic example: let’s say you want to find all the pizza places in a specific area. The SQL query would look something like this:

SELECT name, tags
FROM planet_osm_point
WHERE name IS NOT NULL
  AND tags -> 'cuisine' = 'pizza';

This query demonstrates the power of using hstore to filter data based on specific key-value pairs (finding pizza shops in this case).

Advanced Analysis Techniques

While basic queries are useful, the real power of hstore comes with its ability to facilitate complex analyses. For example, you can aggregate data based on certain criteria, such as counting the number of amenities in a given area or categorizing roads based on their condition.

Here is an example that counts the points offering each type of cuisine in Leonardtown, Maryland:

SELECT tags -> 'cuisine' AS cuisine_type, COUNT(*) AS total
FROM planet_osm_point
WHERE tags ? 'cuisine'
  AND ST_Within(
        ST_Transform(way, 4326),
        ST_MakeEnvelope(-76.66779675183034, 38.285044882153485,
                        -76.62251613561185, 38.31911201477845, 4326))
GROUP BY tags -> 'cuisine'
ORDER BY total DESC;

The above query combines hstore analysis with a PostGIS function to limit the query to a specific area. The full range of PostGIS functions can be used to perform spatial analysis in combination with hstore queries. For instance, you could analyze the spatial distribution of certain amenities, like public toilets or bus stops, within a city. You can use PostGIS functions to calculate distances, create buffers, and perform spatial joins.

Performance Considerations

Working with large datasets like OpenStreetMap can be resource-intensive. Indexing your hstore column is crucial for performance. Creating GIN (Generalized Inverted Index) indexes on hstore columns can significantly speed up query times.
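As a concrete illustration, the index is a single statement. The sketch below wraps it in psycopg2 for consistency with a scripted workflow, but any SQL client works; the index name and connection details are placeholders, and the table and column match the osm2pgsql defaults used earlier.

# Create a GIN index on the hstore tags column (placeholder connection).
import psycopg2

conn = psycopg2.connect("dbname=my_database user=my_username host=my_host")
with conn, conn.cursor() as cur:
    cur.execute(
        "CREATE INDEX IF NOT EXISTS planet_osm_point_tags_gin "
        "ON planet_osm_point USING GIN (tags);"
    )
conn.close()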

Challenges and Best Practices

While hstore is powerful, it also comes with challenges. The schema-less nature of hstore can lead to inconsistencies in data, especially if the source data is not standardized. It’s important to clean and preprocess your data before analysis. OSM tends to preserve local flavor in attribution, so a good knowledge of the geographic area you are analyzing will help you be more successful when using hstore with OSM.

Conclusion

The PostgreSQL hstore data type is a potent tool for analyzing OpenStreetMap data. Its flexibility in handling semi-structured data, combined with the spatial analysis capabilities of PostGIS, makes it a compelling resource for geospatial analysts. By understanding its strengths and limitations, you can harness the power of PostgreSQL and OpenStreetMap in your work.

Remember, the key to effective data analysis is not just about choosing the right tools but also understanding the data itself. With PostgreSQL and hstore, you are well-equipped to extract meaningful insights from OpenStreetMap data.

Contact us to learn more about our services and how we can help turn your geospatial challenges into opportunities.

Do You Need a Data Pipeline?

Do you need a data pipeline? That depends on a few things. Does your organization see data as an input into its key decisions? Is data a product? Do you deal with large volumes of data or data from disparate sources? Depending on the answers to these and other questions, you may be looking at the need for a data pipeline. But what is a data pipeline and what are the considerations for implementing one, especially if your organization deals heavily with geospatial data? This post will examine those issues.

A data pipeline is a set of actions that extract, transform, and load data from one system to another. A data pipeline may be set up to run on a specific schedule (e.g., every night at midnight), or it might be event-driven, running in response to specific triggers or actions. Data pipelines are critical to data-driven organizations, as key information may need to be synthesized from various systems or sources. A data pipeline automates accepted processes, enabling data to be efficiently and reliably moved and transformed for analysis and decision-making.

A data pipeline can start small – maybe a set of shell or Python scripts that run on a schedule – and it can be modified to grow along with your organization, to the point where it may be driven by a full-fledged event-driven platform like Airflow or FME (discussed later). It can be confusing, and there are a lot of commercial and open-source solutions available, so we’ll try to demystify data pipelines in this post.

Geospatial data presents unique challenges in data pipelines. Geospatial data are often large and complex, containing multiple dimensions of information (geometry, elevation, time, etc.). Processing and transforming this data can be computationally intensive and may require significant storage capacity. Managing this complexity efficiently is a major challenge. Data quality and accuracy are also challenges. Geospatial data can come from a variety of sources (satellites, sensors, user inputs, etc.) and can be prone to errors, inconsistencies, or inaccuracies. Ensuring data quality – dealing with missing data, handling noise and outliers, verifying the accuracy of coordinates – adds complexity to standard data management processes.

Standardization and interoperability challenges, while not unique to geospatial data, present additional challenges due to the nature of the data. There are many different formats, standards, and coordinate systems used in geospatial data (for example, reconciling coordinate systems between WGS84, Mercator, state plane, and various national grids). Transforming between these can be complex, due to issues such as datum transformation. Furthermore, metadata (data about the data) is crucial in geospatial datasets to understand the context, source, and reliability of the data, which adds another layer of complexity to the processing pipeline.
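To make the reprojection point concrete, here is a minimal sketch using pyproj. The EPSG codes and sample point are illustrative: WGS84 geographic coordinates transformed into the NAD83 Maryland state plane (meters), a datum-aware operation of the kind a pipeline’s transformation stage would automate once validated.

# Datum-aware reprojection sketch: WGS84 lon/lat -> NAD83 / Maryland
# state plane (EPSG:26985, meters). Sample point is illustrative.
from pyproj import Transformer

transformer = Transformer.from_crs("EPSG:4326", "EPSG:26985", always_xy=True)
x, y = transformer.transform(-76.64, 38.30)  # lon, lat -> easting, northing
print(f"{x:.1f} m E, {y:.1f} m N")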

While these challenges make the design, implementation, and management of data pipelines for geospatial data a complex task, they can provide significant benefits to organizations that process large amounts of geospatial data:

  • Efficiency and automation: Data pipelines can automate the entire process of data extraction, transformation, and loading (ETL). Automation is particularly powerful in the transformation stage. “Transformation” is a deceptively simple term for a process that can contain many enrichment and standardization tasks. For example, as the coordinate system transformations described above are validated, they can be automated and included in the transformation stage to remove human error. Additionally, tools like Segment Anything can be called during this stage to turn imagery into actionable, analyst-ready information.
  • Data quality and consistency: The transformation phase includes steps to clean and validate data, helping to ensure data quality. This can include resolving inconsistencies, filling in missing values, normalizing data, and validating the format and accuracy of geospatial coordinates. By standardizing and automating these operations in a pipeline, an organization can ensure that the same operations are applied consistently to all data, improving overall data quality and reliability.
  • Data Integration: So far, we’ve talked a lot about the transformation phase, but the extract phase provides integration benefits. A data pipeline allows for the integration of diverse data sources, such as your CRM, ERP, or support ticketing system. It also enables extraction from a wide variety of formats (shapefile, GeoParquet, GeoJSON, GeoPackage, etc.). This is crucial for organizations dealing with geospatial data, as it often comes from a variety of sources in different formats. Integration with data from business systems can provide insights into performance as it relates to the use of geospatial data.
  • Staging analyst-ready data: With good execution, a data pipeline produces clean, consistent, integrated data that enables people to conduct advanced analysis, such as predictive modeling, machine learning, or complex geospatial statistical analysis. This can provide valuable insights and support data-driven decision making.

A data pipeline is first and foremost about automating accepted data acquisition and management processes for your organization, but it is ultimately a technical architecture that will be added to your portfolio. The technology ecosystem for such tools is vast, but we will discuss a few with which we have experience.

  • Apache Airflow: Developed by Airbnb and later donated to the Apache Software Foundation, Airflow is a platform to programmatically author, schedule, and monitor workflows. It uses directed acyclic graphs (DAGs) to manage workflow orchestration. It supports a wide range of integrations and is highly customizable, making it a popular choice for complex data pipelines. Airflow is capable of being your entire data pipeline; a minimal DAG sketch follows this list.
  • GDAL/OGR: The Geospatial Data Abstraction Library (GDAL) is an open-source translator library for raster and vector geospatial data formats. It provides a unified API for over 200 geospatial data formats, allowing developers to write applications that are format-agnostic. GDAL supports operations like format conversion, data extraction, reprojection, and mosaicking. It is used in GIS software like QGIS, ArcGIS, and PostGIS. As a library, it can also be used in large data processing tasks and in Airflow workflows. Its flexibility makes it a powerful component of a data pipeline, especially where support for geospatial data is required.
  • FME: FME is a data integration platform developed by Safe Software. It allows users to connect and transform data between over 450 different formats, including geospatial, tabular, and more. With its visual interface, users can create complex data transformation workflows without coding. FME’s capabilities include data validation, transformation, integration, and distribution. FME is well established in the geospatial information market and is the most geospatially literate commercial product in the data integration segment. In addition, it supports a wide range of non-spatial sources, including proprietary platforms such as Salesforce. FME has a wide range of components, making it possible to scale up to support enterprise-scale data pipelines.
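As promised above, here is a minimal sketch of what an Airflow DAG for a nightly geospatial pipeline can look like. The task callables are hypothetical placeholders for your own extract, transform, and load logic, and the schedule parameter assumes Airflow 2.4+ (older versions use schedule_interval).

# A minimal nightly ETL DAG. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_sources():      # e.g., pull from CRM, object storage, or OSM
    ...

def transform_spatial():    # e.g., reproject, validate, enrich
    ...

def load_warehouse():       # e.g., publish to PostGIS or Delta Lake
    ...

with DAG(
    dag_id="geospatial_etl",
    start_date=datetime(2026, 1, 1),
    schedule="@daily",      # nightly; Airflow < 2.4 uses schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_sources)
    transform = PythonOperator(task_id="transform", python_callable=transform_spatial)
    load = PythonOperator(task_id="load", python_callable=load_warehouse)

    extract >> transform >> load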

In addition to the tools listed above, there is a fairly crowded market segment for hosted solutions, known as “integration platform as a service” or iPaaS. These platforms generally have ready-made connectors for various sources and destinations, but spatial awareness tends to be limited, as do customization options for adding it. A good data pipeline is tightly coupled to the data governance procedures of your organization, so you’ll see greater benefits from technologies that allow you to customize to your needs.

Back to the original question: Do you need a data pipeline? If data-driven decisions are key to your organization, and consistent data governance is necessary to have confidence in your decisions, then you may need a data pipeline. At Cercana, we have experience implementing data pipelines and data governance procedures for organizations large and small. Contact us today to learn more about how we can help you.