Cercana Executive Briefing — Week of March 28–April 3, 2026

152 feeds monitored. Published April 3, 2026.

Executive Summary

The most consequential development this week was the publication of the CNG Geo-Embeddings Sprint report, which moved earth observation embeddings from an emerging research thread into the standards-drafting phase. Co-hosted by CNG, Planet, and Clark University, the March sprint produced concrete patterns for storing, cataloging, and accessing EO embeddings. This is the kind of infrastructure specification work that typically precedes commercial adoption. This matters because embeddings are one of the mechanisms through which satellite imagery can be translated into forms that AI systems can use at scale. Organizations making infrastructure decisions about their EO data pipelines should be watching this thread closely. Standards that solidify here will shape which analytics platforms interoperate and which become walled gardens.

In parallel, the defense and intelligence conversation intensified. Project Geospatial published two substantial pieces, one on GeoAI-driven military targeting ethics and another on the geopolitics of quantum gravity gradiometry, while Octave rebranded Luciad as Alto 2026.0 with explicit cyberthreat visualization capabilities for defense. Taken together, these developments suggest that the defense geospatial market is expanding its technical scope while also confronting the ethical consequences of that expansion. Meanwhile, Canada’s national geospatial strategy consultations drew critical coverage revealing a system with depth but without alignment, and Australia launched the Locus Alliance to replace its collapsed national geospatial body. The pattern across both is institutional. Countries are renegotiating how geospatial infrastructure is governed, and the outcomes will likely shape procurement structures for years.

Major Market Signals

EO Embeddings Move from Research to Standards

The CNG Geo-Embeddings Sprint brought together Planet, Clark University, and other organizations to draft best practices for storing, cataloging, and accessing Earth observation embeddings. This is not another AI capabilities announcement. It is infrastructure specification work. When the EO community starts defining how embeddings are stored and discovered, it suggests that the technology has matured enough for interoperability to matter. For platform vendors and data infrastructure buyers, this is the stage at which architectural decisions can begin to lock in compatibility or isolation. The sprint outputs are headed for community review, which means a public comment period that organizations with EO data pipelines should engage with.
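
The sprint’s actual recommendations live in the report itself, but the storage problem it addresses is easy to illustrate. Below is a minimal sketch of one plausible pattern, assuming geopandas, numpy, and shapely: embeddings kept as fixed-length vector columns alongside footprint geometry and model provenance in a GeoParquet file. The column names, model identifier, and schema here are hypothetical, not the sprint’s specification.

```python
# Illustrative only: one plausible way to store EO embeddings, NOT the CNG
# sprint's actual schema. Assumes geopandas, numpy, and shapely.
import geopandas as gpd
import numpy as np
from shapely.geometry import box

rng = np.random.default_rng(seed=0)

# Hypothetical: one 64-dimensional embedding per image chip, keyed by footprint.
chips = gpd.GeoDataFrame(
    {
        "chip_id": ["chip-001", "chip-002"],
        # Fixed-length float vectors stored in a single list column.
        "embedding": [rng.normal(size=64).tolist() for _ in range(2)],
        "model": ["example-eo-model-v1"] * 2,  # provenance: which model made it
        "datetime": ["2026-03-01T00:00:00Z"] * 2,
    },
    geometry=[box(-77.1, 38.8, -77.0, 38.9), box(-77.0, 38.8, -76.9, 38.9)],
    crs="EPSG:4326",
)

# GeoParquet keeps geometry, attributes, and vectors in one columnar,
# cloud-friendly file that query engines can scan selectively.
chips.to_parquet("embeddings.parquet")
```

The open question the sprint is working through is exactly this level of detail: how vectors, provenance, and discovery metadata should be encoded so that independently built tools interoperate.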

Defense Geospatial Expands Scope and Confronts Ethics Simultaneously

Two distinct but convergent threads emerged this week. Octave launched Alto 2026.0, the rebranded Luciad platform, adding cyberthreat visualization for defense and extending geospatial situational awareness into the cyber domain. Simultaneously, Project Geospatial published a deeply personal account of how GeoAI-driven military targeting is eroding the oversight structures that governed intelligence operations for decades. That convergence is the real signal. The defense geospatial market is rapidly expanding what these tools can do while governance frameworks struggle to keep pace. For vendors entering the defense space, the ethics conversation is no longer peripheral. It is becoming a procurement consideration.

National Geospatial Governance Under Reconstruction

Canada’s NRCan geospatial strategy consultations drew critical analysis from GoGeomatics and EarthStuff, revealing systemic gaps in coordination, infrastructure governance, and institutional alignment. In Australia, the newly formed Locus Alliance launched to fill the void left by the collapsed Geospatial Council of Australia. These are not isolated developments. Multiple countries are simultaneously renegotiating how geospatial infrastructure is governed at the national level. For vendors and service providers, the restructuring of national geospatial bodies directly shapes procurement pipelines, standards adoption, and public-sector contract structures.

Quantum Sensing Enters the Geospatial Regulatory Conversation

Project Geospatial’s deep analysis of quantum gravity gradiometry regulation, combined with SBQuantum’s announcement of a space-bound quantum magnetometer as part of the US government’s MagQuest Challenge, marks the week when quantum sensing moved from theoretical interest to a dual regulatory and commercial question. Quantum gradiometry can reveal subsurface structures, including those with defense and resource extraction significance, at resolutions that current regulatory frameworks were not designed to address. It is still early, but this looks like a genuine market-formation signal worth tracking.

Notable Company Activity

Product Releases

  • Octave (Alto 2026.0): Rebranded its Luciad platform as Alto and launched version 2026.0 with cyberthreat visualization capabilities for defense situational awareness. The rebrand sharpens Octave’s positioning as a defense-focused geospatial intelligence platform.
  • MapTiler: Released April updates that included professional grid overlays and a major satellite imagery refresh across its basemap products.
  • USGS: Released a machine learning tool that forecasts streamflow drought conditions up to 90 days ahead nationwide. This is a significant applied AI deployment for water resource management.
  • Esri: Released the Protected Area Management solution and March 2026 ArcGIS Solutions update, alongside updates to its geocoding and world traffic services.

Partnerships

  • DroneDeploy × Cairn: Enterprise-wide aerial and ground reality capture partnership for housing development portfolio management. A useful example of reality capture moving from project-level use to portfolio-level deployment in construction.
  • Astroscale × Exotrail: Advancing France-Japan cooperation on space sustainability and on-orbit servicing, backed by a visit from Emmanuel Macron and Sanae Takaichi.

Funding & M&A

  • Xona Space Systems: Closed an oversubscribed $170M Series C to accelerate deployment of its Pulsar LEO navigation constellation. Investors include Hexagon, Craft Ventures, and Samsung Next. The round suggests strong market confidence in GPS-alternative positioning infrastructure.
  • Trimble: Signed an agreement to acquire Document Crunch, an AI-powered construction document analysis and risk management company, integrating it into the Trimble Construction One ecosystem.
  • Woolpert: Awarded a $49.9M USACE contract to support I-ATLAS coastal mapping and nautical charting efforts. It is a significant federal LiDAR and survey contract.

Government and Policy Developments

The US National Geodetic Survey’s NSRS modernization effort was the subject of a Geo Week News webinar bringing together NGS leadership and the geospatial community. The message was practical. The reference frame transition is coming, and the community should be preparing rather than worrying. For survey and mapping firms, the modernization will affect nearly every coordinate-dependent workflow in the US market, and early preparation is likely to be an advantage.

Canada’s geospatial strategy consultations drew substantive analysis revealing that while the country has depth in geospatial capabilities, it lacks consistent alignment across governance, infrastructure, and coordination. NRCan’s consultations are surfacing long-standing structural problems rather than resolving them. Sparkgeo’s piece on building a national urban forest data view illustrated both the ambition and the fragmentation of Canadian geospatial infrastructure.

In Australia, the Locus Alliance launched as a new national geospatial body to fill the gap left by the Geospatial Council of Australia’s collapse. The Alliance aims to be structurally different from its predecessor, though details of its governance are still emerging. In the UK, Ordnance Survey announced that its National Geographic Database now spans 16 data collections and has received 70 major enhancements, positioning Britain’s national mapping as a continuously updated digital product rather than a periodic release.

Technology and Research Trends

The technology story of the week centers on the maturation of EO data infrastructure. CNG’s Geo-Embeddings Sprint produced actionable specifications rather than aspirational roadmaps, while EarthDaily published on how real-time crop signals from its constellation are changing agricultural market decisions. It is a rare example of a post connecting EO technology directly to commercial demand-side outcomes. TerraWatch’s “Anatomy of an Earth Observation Use Case” offered a structural critique of how the EO industry uses (and misuses) the term “use case,” pushing toward more rigorous framing of what makes an EO application commercially viable.

The Spatial Edge’s weekly digest covered satellites mapping local human development levels, LLMs estimating flood damage without training data, and foundation models for ecological mapping. Taken together, these offer a concentrated snapshot of where applied spatial data science seems to be heading. The through-line is the compression of traditional multi-step geospatial workflows into single-model inference, which has implications for both the skills market and the value chain.

Spatialists covered Stefan Ziegler’s work raster-enabling Apache Hop with GDAL-based transforms, demonstrating practical LiDAR-to-building-height ETL pipelines. It stands out precisely because hands-on, cloud-native geospatial tutorial content of this kind remains persistently scarce in the ecosystem.
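
For readers unfamiliar with the pattern, the core raster step in a LiDAR-to-building-height pipeline is a surface difference: a digital surface model minus a digital terrain model. The sketch below shows that step in Python with rasterio rather than Ziegler’s Apache Hop and GDAL setup; the file names are hypothetical, and it assumes both rasters share the same grid and extent.

```python
# A minimal sketch of the underlying raster math, not Ziegler's actual
# Apache Hop pipeline. Assumes rasterio and numpy; paths are hypothetical.
import numpy as np
import rasterio

with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dtm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")  # first-return surface (roofs, canopy)
    dtm = dtm_src.read(1).astype("float32")  # bare-earth terrain
    profile = dsm_src.profile  # assumes both rasters share grid and extent

# Normalized height above ground; clip small negatives from interpolation noise.
height = np.clip(dsm - dtm, 0, None)

profile.update(dtype="float32", count=1)
with rasterio.open("building_height.tif", "w", **profile) as dst:
    dst.write(height, 1)
```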

Open Source Ecosystem Signals

The open-source ecosystem had a quieter week following the QGIS 4.0 and FOSSGIS 2026 activity of recent weeks. geoObserver noted a QGIS tip on SFCGAL functions now available as a plugin, which is better understood as a post-4.0 ecosystem refinement than a headline release. geoObserver also reflected on FOSSGIS 2026 and celebrated 44,444 downloads of the GeoBasis_Loader plugin, a milestone for the German open-data geospatial tooling community.

The Spatialists coverage of Apache Hop raster enablement is worth flagging here as well: the hop-gdal-plugin extends an open-source ETL framework with geospatial raster capabilities, bridging the gap between data engineering and geospatial processing. It represents the kind of cross-pollination between general-purpose open-source tooling and geospatial-specific capabilities that tends to strengthen the broader ecosystem.

The CNG Geo-Embeddings Sprint, covered in Market Signals above, also carries open-source ecosystem significance: the sprint’s outputs are intended for community review and adoption, meaning they will likely influence how open-source EO tooling handles embedding storage and discovery.

Watch List

  • Spiral Blue (Australia): Delivered space LiDAR hardware to a UK company as part of its strategy to build an EO space LiDAR capability. Space-based LiDAR is a nascent market with potentially transformative implications for forestry, bathymetry, and terrain mapping if costs come down.
  • Geospatial data and the EU Deforestation Regulation (EUDR): Coverage on Medium explored the geolocation data challenges and compliance implications of the EUDR. The regulation will create a distinct demand signal for geospatial verification services across commodity supply chains.
  • SBQuantum’s space quantum magnetometer: The MagQuest-funded mission could initiate a new class of GPS-independent navigation and subsurface sensing from orbit. If the technology performs, it opens regulatory and commercial questions that the geospatial industry has not yet grappled with.
  • Mainz cloud-native geospatial infrastructure: A German city implementing a fully cloud-based geospatial data infrastructure with VertiGIS and Esri. Municipal adoption of cloud-native GDI at this scale is an early but meaningful demand signal for enterprise cloud geospatial platforms in European public administration.

Top Posts of the Week

  1. Geo-Embeddings Sprint: Advancing standards for Earth observation embeddings (CNG Blog) moves EO embeddings from research into standards specification, with direct implications for data infrastructure interoperability.
  2. The New Battlespace: How Geospatial AI, Outdated Intelligence, and the Illusion of Oversight Are Reshaping Military Targeting (Project Geospatial) is a deeply informed and personal account of how GeoAI is outpacing the governance structures designed to prevent intelligence failures.
  3. The Anatomy of an Earth Observation Use Case (TerraWatch Space) offers a structural critique of how the EO industry frames commercial viability and pushes beyond “use case” as marketing shorthand.
  4. The Subsurface Geopolitics: Regulating the Commercial Use of Quantum Gravity Gradiometry (Project Geospatial) maps the emerging regulatory landscape for a technology that can reveal what lies underground at unprecedented resolution.
  5. Three Geospatial AI Myths Federal Buyers Should Not Believe (Cercana Systems) provides practical procurement-focused guidance that cuts through GeoAI marketing claims for federal decision-makers.

Cercana Executive Briefing is derived from 152 feeds aggregated by geofeeds.me.

Three Geospatial AI Myths Federal Buyers Should Not Believe

April Fools’ Day is as good a time as any to talk about geospatial AI, because there is still a surprising amount of wishful thinking in the market.

Some of it is harmless marketing shorthand. Some of it is not. For federal buyers, the difference matters. Procurement decisions made on inflated claims can leave agencies with brittle systems, poor data quality, and very expensive disappointment.

So, in the spirit of the day, here are three geospatial AI myths federal buyers should stop believing.

Myth 1: “AI will replace your GIS analysts”

It will not.

What AI can do, and increasingly does well, is accelerate parts of geospatial work that are repetitive, labor-intensive, or structurally well-bounded. That includes things like feature extraction from imagery, draft attribute population, metadata assistance, document entity extraction, semantic search, and automated QA/QC flagging for human review. Those are real gains, and they matter. They can make analysts faster, reduce backlog, and shift staff time toward higher-value work. But that is augmentation, not replacement (Pierdicca et al., 2025; Mansourian et al., 2024).
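
To make the “automated QA/QC flagging for human review” item concrete, here is a minimal sketch of a rules-based pass over AI-drafted features, assuming geopandas and pandas; the dataset path and required fields are hypothetical. The important part is the routing: flagged records go to an analyst queue rather than straight to publication.

```python
# A minimal QA/QC flagging sketch, assuming geopandas and pandas.
# The file name and required fields are hypothetical, not a real schema.
import geopandas as gpd
import pandas as pd

features = gpd.read_file("draft_features.gpkg")  # e.g., AI-extracted polygons
REQUIRED_FIELDS = ["class", "source_image"]

flagged = []
for idx, row in features.iterrows():
    reasons = []
    geom = row.geometry
    if geom is None or geom.is_empty or not geom.is_valid:
        reasons.append("bad geometry")
    for field in REQUIRED_FIELDS:
        value = row.get(field)
        if pd.isna(value) or value == "":
            reasons.append(f"missing {field}")
    if reasons:
        flagged.append((idx, reasons))

# Flagged features route to human review; nothing is published automatically.
print(f"{len(flagged)} of {len(features)} features flagged for analyst review")
```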

The part vendors often glide past is that geospatial work is rarely just data processing. It is judgment. It is fitness-for-use. It is understanding whether a dataset, workflow, or model output is actually suitable for a mission context. Federal geospatial programs do not succeed because someone can draw a polygon quickly. They succeed because someone knows whether that polygon should be trusted, how it was derived, what its limitations are, and what the consequences are if it is wrong.

That is why current federal AI policy still centers governance, risk management, testing, and monitoring rather than simple automation narratives. OMB’s current guidance requires agencies to manage risk in AI use cases, and its acquisition guidance emphasizes contract terms for ongoing testing and monitoring. NIST’s AI Risk Management Framework likewise treats validity, reliability, explainability, accountability, and transparency as core characteristics of trustworthy AI systems. More broadly, that emphasis is consistent with a longer-running federal concern that agencies need stronger governance around how data and technology are managed in practice, not just optimistic adoption narratives (National Institute of Standards and Technology, 2023; Office of Management and Budget, 2025a, 2025b; U.S. Government Accountability Office, 2020).

The practical question for federal buyers is not whether AI removes analysts. It is whether it makes analysts more effective without removing the controls that make their work defensible.

Ask vendors:

  • Where do humans stay in the loop?
  • What does analyst review look like in practice?
  • What happens when the model encounters unfamiliar data or edge cases?
  • What are the false positive and false negative rates?
  • Can the system be tested on our data before procurement?

If a vendor cannot answer those questions clearly, the “replacement” story is usually just a maturity problem wearing a marketing jacket.

Myth 2: “Our AI understands geography”

Usually, it does not. At least not in the way geospatial professionals mean it. Large language models can recognize place names, infer rough spatial relationships from training data, and produce plausible-sounding geographic language. That can be useful. They can help with geocoding workflows when paired with external validation, extract geographic entities from documents, generate natural-language descriptions of geospatial content, and route requests to the right tools. That is a meaningful capability. But it is not the same thing as spatial reasoning (Mansourian et al., 2024; Pierdicca et al., 2025).

Actual geospatial understanding requires more than knowing that Annapolis is in Maryland or that rivers flow downhill. It requires handling coordinate reference systems, projections, topology, scale, measurement, uncertainty, and the consequences of transforming data from one spatial framework into another. Those are not side issues. They are the work.

Recent research in LLM-enabled GIS is promising, but the stronger examples generally do not rely on a pure language model acting alone. They connect the model to external GIS tools, geospatial databases, scripted workflows, or validation layers. In other words, the most credible systems are not “the model understands geography.” They are “the model helps drive software that actually does geospatial work.”
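
Here is a minimal sketch of that division of labor, under stated assumptions: llm_extract_place and gazetteer_lookup are hypothetical stand-ins, while pyproj and shapely do the parts that require actual spatial rigor. The model proposes; the tools dispose.

```python
# Sketch of the "model helps drive software that does geospatial work" pattern.
# llm_extract_place and gazetteer_lookup are HYPOTHETICAL stand-ins;
# pyproj and shapely perform the actual spatial operations.
from pyproj import Transformer
from shapely.geometry import Point

def llm_extract_place(text: str) -> str:
    # Hypothetical: an LLM call returning a candidate place name from text.
    return "Annapolis, Maryland"

def gazetteer_lookup(name: str) -> tuple[float, float]:
    # Hypothetical: an authoritative geocoder, not the model's memory.
    return (-76.4922, 38.9784)  # (lon, lat)

place = llm_extract_place("Flooding reported near the Maryland state capital")
lon, lat = gazetteer_lookup(place)

# Buffering in degrees would be wrong; project to a metric CRS first.
to_meters = Transformer.from_crs("EPSG:4326", "EPSG:26918", always_xy=True)
x, y = to_meters.transform(lon, lat)
search_area = Point(x, y).buffer(5_000)  # 5 km radius in UTM 18N meters
print(f"{place}: search area of {search_area.area / 1e6:.1f} sq km")
```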

Federal buyers should be very careful here, because this is where demo theater often flourishes. A chatbot that talks fluently about maps is not necessarily capable of performing sound spatial analysis. There is a large gap between linguistic confidence and geospatial competence.

Ask vendors:

  • Does the system rely on external geospatial databases and tools, or only on an LLM?
  • How does it handle coordinate transformations?
  • How does it deal with ambiguous place names?
  • What happens when topology, buffering, area, or network calculations are required?
  • Can you show the exact toolchain used for a spatial result?

If the answer is basically “trust the model,” that is not a geospatial AI strategy. That is a procurement warning sign.

Myth 3: “AI-generated geospatial data is production-ready”

Sometimes it is operationally useful. That is not the same thing as production-ready. AI-extracted features, auto-generated metadata, inferred attributes, and synthetic data can all play a useful role in geospatial workflows. But the word to keep in mind is assistive. These outputs can accelerate review, expand triage capacity, and help agencies focus expert attention where it matters most. What they should not do is bypass validation in mission-critical settings.

This is not an abstract concern. NIST frames trustworthy AI around validity, reliability, accountability, transparency, and explainability, all of which become especially important when outputs are used in operational or public-facing contexts. OMB’s acquisition guidance also points agencies toward contractual mechanisms for testing and monitoring over time, not just acceptance at delivery (National Institute of Standards and Technology, 2023; Office of Management and Budget, 2025b).

That matters because geospatial AI systems can produce outputs that look convincing while still being wrong. A computer vision model can miss features or invent them. Metadata generation can sound polished while omitting essential limitations. Synthetic attributes can appear statistically tidy while being operationally misleading. A confidence score can help, but only if there is a real workflow behind it that routes low-confidence or ambiguous outputs to human review.
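
In miniature, such a workflow can be as simple as the sketch below. The thresholds are illustrative only; in practice they should be derived from measured false positive and false negative rates on the buyer’s own data, not from defaults.

```python
# Illustrative confidence-based routing; thresholds are placeholders that
# should be calibrated from measured error rates, not accepted as defaults.
AUTO_ACCEPT = 0.95   # above this, accept but audit a random sample
NEEDS_REVIEW = 0.60  # between thresholds, send to an analyst

def route(detection: dict) -> str:
    score = detection["confidence"]
    if score >= AUTO_ACCEPT:
        return "accept-with-sampled-audit"
    if score >= NEEDS_REVIEW:
        return "human-review-queue"
    return "reject-and-log"  # retained for error analysis, never published

detections = [
    {"id": "bldg-17", "confidence": 0.98},
    {"id": "bldg-18", "confidence": 0.71},
    {"id": "bldg-19", "confidence": 0.22},
]
for d in detections:
    print(d["id"], "->", route(d))
```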

This is where federal buyers should push hardest. Not on the happy path. On failure.

Ask vendors:

  • What validation workflow is included?
  • How do you measure and report accuracy?
  • Can we review incorrect output examples?
  • What happens at low confidence?
  • What kind of explainability is available to reviewers and auditors?
  • What monitoring exists after deployment?

A vendor who only wants to show perfect outputs is telling you less than they think.

What federal buyers should look for instead

The best geospatial AI vendors are usually the ones least interested in magic. They will show you where the model helps and where it does not. They will be explicit about toolchains, validation steps, and performance limitations. They will welcome testing on your data. They will talk about governance, monitoring, and human review without treating those things as inconvenient objections.

That posture aligns much better with where federal policy already is. Government guidance is not built around blind faith in AI. It is built around risk management, trustworthiness, accountability, and context-of-use. That is a much healthier frame for buying geospatial AI than the current crop of sweeping claims (National Institute of Standards and Technology, 2023; Office of Management and Budget, 2025a, 2025b).

So the simple rule is this: the vendors most confident in their systems should be willing to demonstrate them transparently on your data, discuss limitations openly, and show where humans remain essential.

If they will not, that is probably the most useful signal you are going to get.

For federal organizations trying to separate durable capability from AI theater, that evaluation work is becoming part of the job. It requires more than technical curiosity. It requires a clear view of mission fit, data readiness, governance, procurement risk, and where AI can actually improve operational outcomes. That is exactly the kind of problem strategic advisory and applied AI/ML support should help solve.

At Cercana Systems, this is the kind of work we help clients think through: where geospatial AI fits, where it does not, and how to evaluate, pilot, and implement it with a clear understanding of mission context and operational risk.

References

Mansourian, A., Pilesjö, P., Harrie, L., et al. (2024). ChatGeoAI: Enabling geospatial analysis for public through natural language, with large language models. ISPRS International Journal of Geo-Information, 13(10), 348.

National Institute of Standards and Technology. (2023). Artificial Intelligence Risk Management Framework (AI RMF 1.0) (NIST AI 100-1). U.S. Department of Commerce.

Office of Management and Budget. (2025a). Accelerating Federal Use of AI through Innovation, Governance, and Public Trust (M-25-21). Executive Office of the President.

Office of Management and Budget. (2025b). Driving Efficient Acquisition of Artificial Intelligence in Government (M-25-22). Executive Office of the President.

Pierdicca, R., Zingaretti, P., Frontoni, E., et al. (2025). On the use of LLMs for GIS-based spatial analysis. ISPRS International Journal of Geo-Information, 14(10), 401.

U.S. Government Accountability Office. (2020). Data governance: Agencies made progress in establishing governance, but need to address key milestones (GAO-21-152).

Cercana Executive Briefing — Week of March 21–27, 2026

142 feeds monitored. Published March 27, 2026.

Executive Summary

The clearest story of this week is the merging of two narratives that have been running in parallel: sovereign AI and geospatial intelligence. On Sunday, GoGeomatics published a piece authored by Will Cadell whose title states the thesis plainly: “SovereignAI is GeoAI.” Within 72 hours, three distinct institutional signals emerged from Australia: Geoscience Australia launched a new 10-year national strategy; a new National Geospatial Advisory Committee was announced with cross-sector representation; and Geospatial World ran a feature on Australia’s Austral-Asian Space Innovation Institute discussing sovereign satellite capability and the National Digital Twin for Agriculture. This is not messaging from one company; it is institutional behavior from a government treating geospatial infrastructure as strategic national infrastructure.

That same framing is arriving in U.S. federal policy. The GeoAI and the Law Newsletter this week dissected the Artificial Intelligence Regulation and Safeguards Act and found an expanded geolocation definition that could reshape how geospatial companies collect and use location data. The GSA’s proposed AI contract clause for federal procurement is described as the most consequential shift for geospatial vendors in years.

Meanwhile, European standards are in flux. Javier de la Torre’s analysis of the INSPIRE Directive simplification argues this is not mere bureaucratic tidying but an opening to embrace analytics-native paradigms, which is a structural shift in how European geospatial infrastructure is conceived.

Across all three developments, the same question is being asked simultaneously in Washington, Brussels, Canberra, and Ottawa: what does geospatial data mean for national capability? Leaders who treat this as a technical standards conversation are reading it wrong. It is a strategic infrastructure conversation, and the answer is being written this week in policy documents, not product roadmaps.

Major Market Signals

SovereignAI and GeoAI Are Converging as a Policy Frame

GoGeomatics published “SovereignAI is GeoAI” on March 22, arguing that national AI sovereignty strategies are fundamentally geospatial challenges, asserting that understanding territory, movement, resources, and infrastructure at scale requires geospatial intelligence as a foundational layer. Within days, Australia produced three institutional signals in the same direction: a new 10-year strategy from Geoscience Australia framed around shaping “Australia’s future through geoscience insights”; a new National Geospatial Advisory Committee advising government; and a Geospatial World feature on the Austral-Asian Space Innovation Institute discussing sovereign satellite capability and the National Digital Twin for Agriculture. Canada is also in motion: the retirement of CCMEO Director General Eric Loubier after 25 years is characterized by GoGeomatics as arriving at a “critical time” for Canada’s geospatial sector. The policy frame is hardening across the middle powers, with geospatial seen as strategic infrastructure, not technical tooling.

U.S. Federal GeoAI Regulation Is Taking Shape

The Artificial Intelligence Regulation and Safeguards Act, which the GeoAI and the Law Newsletter calls the “Trump AI Act,” contains an expanded geolocation definition that could require geospatial companies to alter how they collect, store, and use location data. Separately, the GSA’s proposed AI contract clause would affect how federal agencies procure AI-enabled geospatial services. The White House push for a unified federal AI standard would supersede the patchwork of state-level rules that geospatial companies currently navigate. Taken together, these three instruments represent the most significant regulatory shift for the U.S. geospatial market since CIPSEA. Companies with federal contracts or location-data products should be conducting legal exposure assessments now, not after enactment.

Commercial EO Capacity Is Expanding Across Multiple Modalities Simultaneously

Three distinct capability additions arrived this week: Synspective successfully placed its 8th StriX SAR satellite in orbit, continuing its build toward a 30-satellite constellation by 2030; Satellogic announced its Merlin satellite will deliver daily 1-meter resolution optical imagery; and Open Cosmos launched what it describes as the largest space-based real-time data service, fusing broadband connectivity, Earth observation, and IoT in a single platform. The pattern is consistent with broader commercial EO maturation: higher revisit, higher resolution, and tighter integration with downstream data pipelines. Organizations that have been waiting for the market to stabilize before committing to EO-based workflows should note that the infrastructure is arriving whether they are ready or not.

European Geospatial Standards Infrastructure Is at a Decision Point

Javier de la Torre’s analysis in Spatialists — titled “geo beyond INSPIRE” — frames the simplification of the EU INSPIRE Directive not as a retreat but as a structural opportunity. The argument is that INSPIRE’s interoperability-first model, built for a previous era, is increasingly misaligned with how geospatial data is actually consumed. Analytics-native paradigms, where data is designed for computation from the start, not formatted for exchange, offer a better fit for the AI-era use cases now driving demand. The OGC simultaneously announced its Testbed on Trusted Data and Systems has expanded beyond Europe to include non-European NMCAs, reflecting growing global interest in how authoritative public geospatial data can be modernized and made computationally useful. These two developments together mark a transition moment for European and global geospatial standards. The question is not whether INSPIRE changes, but who shapes what replaces it.

Notable Company Activity

Product Releases

  • Esri: A coordinated spring release wave this week covered ArcGIS GeoAnalytics Engine 2.0 (cloud-scale spatial analytics), ArcGIS Urban (March 2026 update), ArcGIS StoryMaps (March 2026), ArcGIS Pro SDK updates, R-ArcGIS Bridge Spring 2026, and Lidar updates to World Elevation Layers in Living Atlas. The breadth and simultaneity of these releases signals a major platform release cycle, not incremental maintenance.
  • Satellogic: The Merlin satellite will offer daily 1-meter optical imagery, a meaningful step toward sub-daily revisit at commercial resolutions.
  • Open Cosmos: Announced what it calls the largest space-based real-time data service, combining broadband connectivity, Earth observation, and IoT telemetry in a single commercial offering.
  • Septentrio: Launched the AsteRx EB, a compact high-accuracy GNSS receiver targeting robotics, logistics, and industrial automation, extending precision positioning into non-traditional industrial sectors.
  • SBG Systems: Unveiled the Stellar-40, a modular and scalable inertial navigation system for demanding and mission-critical environments.
  • Apple: Announced that ads will come to Apple Maps in the United States and Canada beginning this summer via its new Apple Business platform.

Partnerships

  • ANELLO Photonics × Q-CTRL: A strategic collaboration combining silicon photonics inertial sensing with quantum magnetic navigation, targeting UAV operation in GPS-denied environments. The press release cites a $1 billion-per-day global cost from navigation failures, a finding that may attract defense and logistics attention.
  • Kongsberg Discovery × Fugro: A new Main Supplier framework agreement formalizing a decades-long relationship between the ocean technology and subsurface surveying firms.
  • Seabed 2030 × Greenroom Robotics: The international seabed mapping program partnered with the Australian autonomous vessel company to expand ocean floor data collection.

Funding & M&A

  • Arlula: Raised A$3.4 million to build out software workflows for automated satellite tasking and imagery analysis. It is a small round, but strategically directional in the EO automation space.
  • e-GEOS (Leonardo Group): Won a contract from Italy’s Ministry of Environment and Energy Security to conduct nationwide satellite mapping of asbestos.

Government and Policy Developments

Australia produced the most concentrated national geospatial policy activity of the week. Geoscience Australia launched a 10-year strategy framed explicitly around national capability, with ministerial endorsement. A new National Geospatial Advisory Committee was established to provide cross-sector advice to government. And the Austral-Asian Space Innovation Institute’s founding CEO used a Geospatial World platform to articulate how sovereign space capability, satellite constellation data, and the National Digital Twin for Agriculture are linked strategic assets. Three announcements in four days from one government signals that geospatial is a designated policy priority in Canberra, not a technical afterthought.

Canada’s situation is the mirror image: a leadership vacuum at CCMEO is arriving precisely when Canada needs to respond to both sovereignty pressures and a rapidly changing EO commercial market. GoGeomatics’ framing of this as a “critical time” reflects real institutional risk: mid-cycle leadership transitions at national mapping agencies have historically been associated with delayed procurement decisions and stalled modernization programs.

In the United States, the GeoAI and the Law Newsletter’s detailed reading of the Artificial Intelligence Regulation and Safeguards Act and the GSA’s proposed AI procurement clause deserves board-level attention for any company selling geospatial AI capabilities to federal agencies. The expanded geolocation definition in the proposed legislation is not incidental, and it appears to bring a wider range of location data products within the act’s scope than current law covers.

The OGC’s Testbed on Trusted Data and Systems is worth tracking as a governance model. Originally launched as Testbed Europe, its expansion reflects interest from non-European NMCAs who face the same modernization challenge: how to make authoritative public spatial data computationally useful without sacrificing trustworthiness. This is engineering work with standards implications that will matter across every market where national mapping agencies are significant data providers.

Ordnance Survey data is also anchoring a new UK multi-agency emergency communications system designed to reduce the time required to transfer incident data between control rooms, demonstrating a practical example of authoritative location data embedded in safety-critical infrastructure.

Technology and Research Trends

The Spatial Edge newsletter this week highlighted research in Nature Communications integrating seismic risk modeling, geospatial infrastructure inventory, and climate accounting that shows earthquake-related building repairs generate massive CO2 emissions. The implication for the market is directional: insurers, municipal governments, and climate-disclosure frameworks will need spatial datasets that link physical risk exposure to embodied carbon accounting. This is an early signal of a new analytical product category.

QGIS Server gained time-series (WMS-T) support for grouped layers this week, contributed by Oslandia in collaboration with Ifremer, the French oceanographic research institute. The technical significance extends beyond the feature: it enables institutional EO data providers to distribute time-varying imagery through standards-compliant web services without bespoke infrastructure. As more governments look to publish national EO datasets via QGIS-based portals, this capability removes a meaningful barrier.
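
The client-side effect is that a standard WMS GetMap request simply carries a TIME parameter. The TIME dimension is part of the WMS specification itself; the endpoint and layer name in the sketch below are hypothetical.

```python
# A standard WMS GetMap request with a TIME parameter. The TIME dimension is
# part of the WMS spec; the endpoint and layer name here are hypothetical.
from urllib.parse import urlencode

params = {
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetMap",
    "LAYERS": "sea_surface_temperature",  # hypothetical grouped layer
    "CRS": "EPSG:4326",
    "BBOX": "40,-10,55,10",  # WMS 1.3.0 with EPSG:4326 uses lat/lon axis order
    "WIDTH": "800",
    "HEIGHT": "600",
    "FORMAT": "image/png",
    "TIME": "2026-03-01T00:00:00Z",  # the time slice to render
}
print("https://example.org/qgisserver?" + urlencode(params))
```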

Swiss cadastral survey data is now available in IFC format for BIM integration via geodienste.ch, with four cantons participating and more expected. This represents one of the first examples of authoritative cadastral data crossing the traditional boundary between GIS and building information modeling workflows. For vendors selling into the AEC sector, it is a signal that the BIM-GIS convergence is becoming a data standards reality, not just a vision document.

The “Shortening Translation Distance” essay by Bill Dollins in geoMusings this week offered a practitioner’s-eye view of how AI code generation is changing the relationship between user-centric domain knowledge and programming in geospatial work.

Open Source Ecosystem Signals

FOSSGIS 2026, the annual German open-source geospatial conference, took place this week in Göttingen. The CCC (Chaos Computer Club) published Day 1 session recordings on the same day as the presentations, which is described by geoObserver as a record turnaround that reflects both organizational maturity and the community’s commitment to accessibility. For organizations evaluating open-source geospatial tooling, FOSSGIS 2026 session recordings represent a concentrated resource: they document the current state of practice across QGIS, PostGIS, MapLibre, GeoServer, and adjacent tools, often before formal release notes appear.

Oslandia had a notable week in the European open-source ecosystem: the QGIS Server WMS-T contribution for Ifremer (technical post published), a recap of the QGIS-Fr French user days, and an announcement of GeoDataDays 2026 in Tours. Oslandia’s activity this week illustrates how open-source QGIS ecosystem contributors operate as professional services firms with direct government and research institution clients. This is a model that can mitigate lifecycle concerns in procurement decisions for public-sector geospatial programs.

Watch List

  • Apple Maps advertising model: If Apple’s entry into map advertising succeeds commercially, it will pressure Google to expand its own ad surface area in Maps, potentially restructuring the economics of consumer location data platforms globally. B2B geospatial vendors whose products sit downstream of consumer map data APIs should monitor closely.
  • OGC MUDDI standard: The OGC published a detailed narrative this week on the MUDDI (Model for Underground Data Definition and Integration) standard, framing it as a model for cross-system urban spatial data interoperability. Underground infrastructure mapping is a large, fragmented market and a maturing standard here could unlock significant procurement activity.
  • GPS-denied navigation commercialization: The ANELLO/Q-CTRL partnership is the most prominent announcement in a cluster of GPS-alternative navigation products reaching market. The $1B/day framing will attract defense and logistics capital. Watch for follow-on partnerships or acquisition interest from platform navigation vendors.
  • Radiant Earth governance shift: New board members this week include Cassie Ely, who played a role in bringing MethaneSAT to life, and David X. Cohen, executive producer of Futurama. The combination of climate-finance experience and science communication expertise signals that Radiant Earth is positioning itself for a higher-visibility role in the EO-climate intersection.
  • BIM-GIS cadastral convergence: Switzerland’s IFC-format cadastral data is the leading example, but the pattern of authoritative government cadastral data flowing into BIM workflows is likely to appear in other jurisdictions. AEC-sector geospatial vendors should be tracking the OGC BIM-standards working group for early signal.

Top Posts of the Week

  1. SovereignAI is GeoAI (GoGeomatics) establishes the thesis that national AI sovereignty strategies are fundamentally geospatial challenges; the most strategically significant framing piece of the week.
  2. “geo” beyond INSPIRE (Spatialists, Ralph Straumann / Javier de la Torre) frames the INSPIRE Directive simplification as an opportunity to adopt analytics-native paradigms rather than simply reducing compliance burden.
  3. GeoAI and the Law Newsletter (Spatial Law & Policy) offers a detailed reading of the Trump AI Act’s expanded geolocation definition, the White House unified AI standard push, and the GSA AI contract clause; essential reading for any geospatial vendor with federal exposure.
  4. Contextual Location Data, Unified Foundational Maps Paramount for Industry (Geospatial World) interviews Overture Maps Foundation Executive Director Will Mortenson on interoperability, the Global Entity Reference System, and the foundation’s AI and machine learning roadmap.
  5. Testbed on Trusted Data & Systems (Open Geospatial Consortium) announces the formerly Europe-only testbed going global, focused on practical NMCA modernization with reusable open outputs.

The Cercana Executive Briefing is sourced from 142 feeds aggregated by geofeeds.me.

Cercana Executive Briefing — Week of March 14–20, 2026

Executive Summary

The clearest story this week is the way two separate announcements converge into a single market signal. NVIDIA introduced its first space-optimized AI computing module at GTC in San Jose. At nearly the same time, Planet Labs announced a GPU-native AI engine built on NVIDIA’s Blackwell and IGX Thor platforms. Taken together, these are more than routine product updates. They are a public indication that AI-accelerated geospatial intelligence is moving from experimentation into infrastructure.

Processing satellite imagery in seconds instead of hours, building planetary-scale vector embeddings for semantic search, and placing GPU hardware directly on satellites all point toward the same conclusion. This is becoming the new architecture of Earth observation.

That signal grew stronger this week as Google completed the global rollout of its AlphaEarth Foundations 2025 satellite embedding layer and demonstrated vector search integration across BigQuery and Earth Engine. This is a different path toward a very similar destination. When two major platforms move toward AI-native EO infrastructure at the same time, it is hard to dismiss it as coincidence. It looks much more like an inflection point.

At the same time, governments continued to build strategic distance from GNSS dependency. The United Kingdom released new funding to advance its National Timing Centre with atomic clock infrastructure as part of a broader multi-year program focused on sovereign PNT resilience. In the United States, the community-led HIFLD Next Commons launched to restore access to federal infrastructure datasets that were quietly shuttered in August.

The open-source side of the industry had its own important signal. QGIS launched one of its most coordinated sustainability pushes in recent memory, with a sustaining member campaign, a grants round, and a new flagship member all announced in the same week.

For decision-makers, the questions raised this week are fairly direct. Is your EO pipeline ready for GPU-native processing architectures? What is your exposure to GNSS dependency in critical operations? And if your work depends on U.S. public geospatial data, how much of it still rests on federal datasets that may no longer be reliably available?

Major Market Signals

GPU-Native EO Infrastructure Becomes the Production Standard

The most important structural development this week is the simultaneous move by NVIDIA and Planet Labs toward a GPU-native architecture for satellite imagery processing. NVIDIA launched its space computing platform at GTC, combining the IGX Thor and Jetson Orin modules into a system designed for size-, weight-, and power-constrained spacecraft environments. The company’s goal is explicit: bring data-center-class performance into orbit.

Jensen Huang put it simply: “intelligence must live wherever data is generated.” Planet, which operates the world’s largest Earth-observation constellation, said it will deploy NVIDIA hardware on its next-generation Pelican satellites and Owl constellation. The company also said this will reduce imagery processing from hours to seconds and allow it to apply NVIDIA’s CorrDiff generative AI diffusion model to produce physics-informed super-resolution imagery from its existing archive.

Planet also disclosed plans to convert its daily data stream into AI embeddings, making semantic search across global imagery possible at a new scale. Its stock rose by roughly 8% on the announcement, and the company also reported record Q4 revenue of $86.8 million, up 41% year over year. The market appears to be reading this not simply as a partnership announcement, but as a deeper shift in Planet’s model from imagery provider toward AI-native intelligence platform.

Other space companies, including Kepler, Aetherflux, Axiom Space, Capella, Sophia Space, and Starcloud, are also integrating NVIDIA platforms for orbital compute. That makes this look less like a one-off deal and more like an ecosystem shift.

Geospatial Foundation Models at Planetary Scale

The convergence between Planet and NVIDIA’s GPU-native processing strategy and Google’s AlphaEarth work points to a new baseline for EO analytics. Google completed its 2025 AlphaEarth Foundations satellite embedding update this week, delivering full global coverage at 10-meter resolution as a freely available dataset in Google Earth Engine and Cloud Storage.

The model compresses annual multi-sensor data, including Sentinel-2 optical, Sentinel-1 radar, Landsat thermal, GEDI LiDAR, climate models, and gravity fields, into 64-dimensional embeddings per pixel. That creates a practical foundation for similarity search, change detection, and downstream machine learning without requiring users to run their own deep learning inference stack.

Separately, a Google Earth blog post this week demonstrated embedding-based vector search across BigQuery, Earth Engine, and AlphaEarth Foundations. That is one of the clearest public signals yet of a unified semantic search pipeline for planetary data. For enterprise buyers, the significance is straightforward. A growing set of geospatial intelligence tasks, including similarity search, supply chain monitoring, and climate risk assessment, can increasingly be performed without deep remote sensing expertise or heavy infrastructure investment. That creates real pressure for traditional EO analytics vendors.
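
Stripped of any single platform’s tooling, embedding-based similarity search reduces to nearest-neighbor comparison over vectors. Below is a self-contained sketch with synthetic data, using the 64-dimensional shape described above; a real pipeline would read embeddings from Earth Engine, BigQuery, or exported files rather than random arrays.

```python
# Synthetic illustration of embedding similarity search; real pipelines would
# load embeddings from a platform or file store rather than random data.
import numpy as np

rng = np.random.default_rng(seed=42)
candidates = rng.normal(size=(100_000, 64)).astype("float32")  # candidate pixels
query = rng.normal(size=64).astype("float32")                  # reference pixel

# Normalize so the dot product equals cosine similarity.
candidates /= np.linalg.norm(candidates, axis=1, keepdims=True)
query /= np.linalg.norm(query)

scores = candidates @ query
top10 = np.argsort(scores)[-10:][::-1]  # the 10 most similar candidate pixels
print(top10, scores[top10])
```

At planetary scale the brute-force dot product gives way to approximate nearest-neighbor indexes, but the query semantics are the same.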

GNSS Sovereignty Investment Expands

A new funding release from the UK’s National Timing Centre program advanced construction of dedicated atomic clock infrastructure sites that will be connected through fiber and satellite to distribute a nationally assured timing signal independent of GPS or Galileo. The announcement, covered by Spatial Source this week, extends a multi-year UK investment program that now totals hundreds of millions of pounds across eLoran, atomic timing, GNSS interference monitoring, and space-based time transfer research and development.

The strategic framing is clear. Russian jamming and spoofing activity in and around conflict zones has demonstrated that GNSS-dependent critical infrastructure, including banking, telecommunications, energy, and transportation, carries an unacceptable single-point-of-failure risk. Australia is moving in a similar direction, with a reported $100 million Cooperative Research Centre bid for sovereign PNT, known as SHIELD, submitted to the federal government.

The business implications extend beyond defense. Financial institutions, autonomous vehicle operators, precision agriculture vendors, and telecommunications infrastructure managers all face growing exposure to this risk category. National investment programs are also beginning to create procurement and partnership opportunities for vendors that can help address it.

US Open Infrastructure Geospatial Data Requires Alternative Sources

The launch of HIFLD Next, a community-stewarded portal built by Public Environmental Data Partners and supported by a growing coalition, marks a structured response from the geospatial community to the August 2025 shutdown of the HIFLD Open federal portal.

For more than two decades, HIFLD Open provided free and authoritative geospatial data on national infrastructure, including hospitals, schools, power plants, flood zones, and transportation networks. Emergency managers, researchers, planners, and state and local governments used it extensively. HIFLD Next preserves archived datasets in modern formats such as GeoParquet and PMTiles, and it is developing a governance model through the HIFLD Next Commons coalition.
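
For teams doing that inventory, the archived formats are directly consumable with mainstream open-source tooling. Here is a short sketch assuming geopandas, with a hypothetical file path and column names:

```python
# Reading a HIFLD Next-style GeoParquet archive. The path and column names
# are hypothetical; the access pattern is standard geopandas.
import geopandas as gpd

# Columnar storage makes loading only the needed columns cheap.
hospitals = gpd.read_parquet(
    "hifld_next_hospitals.parquet",
    columns=["NAME", "STATE", "geometry"],
)
maryland = hospitals[hospitals["STATE"] == "MD"]
print(f"{len(maryland)} Maryland records of {len(hospitals)} total")
```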

This is a meaningful change in the U.S. geospatial data landscape. Organizations that relied on HIFLD Open data in their workflows, products, or contracts should be inventorying those dependencies now. Some datasets may need commercial replacements. Others may still exist through originating agencies, but without the unified access point that HIFLD once provided.

QGIS Coordinated Sustainability Push

In an unusually coordinated week for the QGIS ecosystem, three significant announcements landed almost together: the launch of the QGIS Sustaining Member Campaign 2026, the opening of QGIS Grants Round 11 with a call for development proposals, and the announcement that COSS has become the latest flagship sustaining member. Seen together, these moves suggest deliberate timing around a broader funding and engagement effort.

For enterprise decision-makers, the QGIS funding picture matters because it directly affects the reliability, security, and development pace of what is arguably the most widely deployed open-source GIS platform in the world. The expanding roster of sustaining members, which now includes companies, public agencies, and academic institutions, reflects the platform’s deeper role in production environments. The grants program continues to support core technical improvements that benefit the broader user base.

Notable Company Activity

Product Releases

  • Planet Labs: Announced a GPU-native AI engine in collaboration with NVIDIA, applying CorrDiff generative AI for super-resolution on PlanetScope imagery and building planetary-scale vector embeddings for semantic search. The company also reported Q4 FY2026 record revenue of $86.8 million, up 41% year over year, with full-year revenue of $307.7 million and FY2027 guidance of $415 million to $440 million.
  • Google: Completed the global rollout of AlphaEarth Foundations 2025 satellite embeddings in Earth Engine and Google Cloud Storage. The company also announced integration across BigQuery, Earth Engine, and AlphaEarth for vector search.
  • GeoSolutions: Released MapStore 2025.02, the latest version of its open-source web map composition platform.
  • Esri: Published several ArcGIS Blog posts this week covering Field Maps geospatial PDF workflows, cloud storage connectivity in ArcGIS Pro, and an AI-powered support chatbot. Together, they reflect continued incremental AI integration across the ArcGIS platform.
  • GeoCue: Launched the TrueView GO NEO LiDAR scanner, extending its airborne scanning portfolio.

Partnerships

  • NVIDIA × Planet Labs: GPU hardware integration on next-generation Pelican and Owl constellation satellites, along with CorrDiff AI and embedding architectures for ground processing. See the major signals section for more detail.
  • Trimble × Vermeer: Announced the Trimble Ready Option for Vermeer’s new SM55 Surface Miner, continuing the pattern of precision positioning integration in heavy construction equipment.
  • Q-Free × Sony Semiconductor Solutions: Announced a partnership to advance satellite-based road user charging technology, which suggests GNSS-based mobility pricing is moving closer to commercial deployment.
  • Quantum Solutions × Delmar Aerospace × Perspectum Drone: Partnered to deploy aerial hyperspectral water intelligence capabilities across North America.
  • Sanborn Geophysics: Expanded airborne electromagnetic survey services to support critical minerals exploration, positioning itself within the broader strategic minerals supply chain debate.
  • Fugro: Won a contract to map Texas river basins with LiDAR and geospatial analysis to improve flood resilience for regional water authorities.

Government and Policy Developments

The UK’s National Timing Centre advanced this week with a new funding release supporting dedicated atomic clock sites and combined fiber-and-satellite signal distribution. This forms part of a broader multi-year PNT resilience framework valued at more than £155 million and spanning eLoran terrestrial navigation, interference monitoring, and space-based time transfer research.

The framing from UK Science Minister Lord Vallance is notable. He said GNSS-based timing signals “are increasingly vulnerable to disruption” and that the government is “acting now.” That posture is becoming increasingly common across Europe and the Five Eyes community. Australia’s parallel bid for a $100 million Cooperative Research Centre for Secure, Hardened PNT, known as SHIELD, suggests this is not an isolated policy concern.

In the United States, the policy story this week is shaped as much by absence as by action. HIFLD Next represents an attempt by the community to fill the gap left by the federal retreat from open infrastructure data.

The Screening Tools post announcing HIFLD Next makes the issue plain. Authoritative and nationally consistent geospatial infrastructure data remains essential for emergency management, disaster response, and public safety planning, and the shutdown of HIFLD Open has not been matched by any federal replacement. For state and local governments, that is already an operational problem.

The Open Geospatial Consortium also published commentary this week on individual membership and influence through standards. That matters because its new individual membership tier, announced last week, lowers the barrier to formal participation in standards work. For vendors and practitioners who want to shape work around OGC API, GeoParquet, and emerging AI-related standards, that is a meaningful change.

Digital Earth Australia also published water coverage datasets this week that reveal decades of historical water body presence. That is a significant open data release for catchment management, flood risk analysis, and agricultural planning.

Technology and Research Trends

The dominant technology story this week is the convergence of GPU compute with Earth observation pipelines. NVIDIA’s space computing platform, centered on the IGX Thor module introduced at GTC, is designed for the real constraints of spacecraft, including limited size, weight, and power, while still aiming to support data-center-class AI workloads. This is not a research concept. It is a production-oriented hardware offering aimed at satellite programs.

Combined with Planet’s CorrDiff super-resolution work and planetary vector embedding plans, the week points toward an architecture in which AI inference moves steadily closer to the sensor. The sequence is familiar now. First it moved to the ground station, then to the cloud, and now increasingly to the satellite itself. That has implications for downstream analytics vendors. If imagery arrives already processed and enriched, the value of adding analysis later in the chain may be reduced.

Google’s approach is related, though somewhat different. AlphaEarth Foundations looks backward across historical imagery, producing 64-dimensional annual embeddings for every 10-meter land pixel back to 2017. That supports similarity search, change detection, and classification with relatively little labeled data.

This week’s global 2025 update makes the current dataset broadly available, and the BigQuery integration suggests Google wants it to be usable at enterprise scale without requiring Earth Engine specialization. In practical terms, both Google and Planet are moving toward the same customer outcome from different directions. They are making geospatial intelligence more available on demand from very large EO archives, without requiring every customer to operate as a remote sensing specialist.

Elsewhere, LiDAR continues its steady expansion into new operational settings. Darling Geomatics published analysis on aerial LiDAR and photogrammetry for large-scale topographic surveys. Spatial Source covered a case study showing LiDAR helping reopen a mine after a safety event. GeoCue launched a new scanner. None of this suggests novelty. What it does suggest is a continued lowering of operational barriers and cost thresholds for proven technology.

Open Source Ecosystem Signals

QGIS’s coordinated sustainability effort this week may be the clearest sign yet of its evolution from a community-maintained tool into a strategically governed open-source platform. The Sustaining Member Campaign 2026 explicitly asks commercial users to formalize financial support. Grants Round 11 invites funded development proposals. COSS joining as a flagship sustaining member adds another institutional anchor.

This pattern is familiar from the longer history of projects such as PostgreSQL and Apache, where broad commercial dependence eventually leads to more deliberate funding structures for long-term sustainability. For enterprise QGIS users that have not yet become sustaining members, the question is becoming less philosophical and more operational. It is about supporting a dependency that already sits inside production workflows.

The PROJ coordinate transformation library turned 27 this week, as noted by geoObserver. PROJ sits underneath an enormous share of the geospatial software stack, both commercial and open source, anywhere coordinate systems and projections matter. Its continued maintenance is easy to overlook because it is so foundational. Anniversaries like this are useful reminders to ask whether organizations have any formal relationship with the open-source projects they rely on most.
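For readers who have never touched PROJ directly, the dependency is easy to demonstrate. The sketch below uses pyproj, PROJ’s Python binding; a call like this ultimately runs PROJ’s transformation engine.

```python
from pyproj import Transformer

# pyproj delegates to PROJ for the actual math. Convert WGS84 lon/lat to
# Web Mercator (EPSG:3857), the projection behind most web basemaps.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)
x, y = transformer.transform(-77.0365, 38.8977)  # longitude, latitude
print(x, y)  # metres in Web Mercator, roughly (-8.58e6, 4.71e6)
```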

OpenStreetMap US released the PWG Sidewalk Mapping Schema 1.0, a standardized schema for pedestrian infrastructure mapping. That matters for mobility planning, accessibility compliance, and autonomous navigation use cases that depend on structured and consistent sidewalk data at scale. The release marks a step forward from ad hoc community practice toward something organizations can more readily integrate into operational workflows.
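To show what “structured and consistent sidewalk data” means in practice, here is an illustrative GeoJSON-style feature written as a Python dict. The property names are hypothetical placeholders chosen for readability, not the actual fields of the PWG schema; consult the published specification for those.

```python
# Illustrative GeoJSON-style sidewalk feature as a Python dict. The property
# names are hypothetical placeholders, NOT the actual fields of the PWG
# Sidewalk Mapping Schema 1.0; consult the published spec for those.
sidewalk_feature = {
    "type": "Feature",
    "geometry": {
        "type": "LineString",
        "coordinates": [[-77.0366, 38.8950], [-77.0361, 38.8957]],
    },
    "properties": {
        "surface": "concrete",         # hypothetical attribute
        "width_m": 1.8,                # hypothetical attribute
        "has_curb_ramp": True,         # hypothetical attribute
        "last_verified": "2026-03-15", # hypothetical attribute
    },
}
print(sidewalk_feature["properties"]["width_m"])
```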

GeoSolutions also released MapStore 2025.02, continuing development of an open-source web mapping platform used by national mapping agencies and public sector organizations across Europe.

Watch List

  • EO as a Public Good: Spectral Reflectance published “The Economics of Openness: Funding Earth Observation as a Public Good”, which makes an analytical case for treating EO data as public infrastructure. The argument is gaining relevance as U.S. federal open data retreats and commercial EO consolidation continues.
  • GeoAI Legal Frameworks Maturing: The GeoAI and the Law Newsletter published its latest edition, tracking regulatory and liability developments around AI applied to geospatial data. As GeoAI moves into production workflows, procurement language, liability standards, and intellectual property questions are likely to become more visible in vendor conversations.
  • Satellite-Based Road User Charging: The Q-Free and Sony partnership around satellite-based road pricing is a relatively quiet but commercially important application of GNSS-derived positioning. It also makes the UK’s GNSS sovereignty efforts more relevant for those watching mobility infrastructure.
  • Electronic Warfare and Geospatial Intelligence: Project Geospatial published an analysis of spectral techniques in modern electronic warfare. The growing use of GNSS jamming and spoofing in conflict zones is beginning to influence both defense procurement and civilian infrastructure policy.
  • Data Centre Geography: The Spatial Edge’s post “We’re running out of room for data centres” points to an emerging geographic constraint on cloud-scale geospatial processing created by the AI infrastructure boom.

Top Posts of the Week

  1. Planet to Build World’s First GPU-Native AI Engine for Planetary Intelligence with NVIDIA — Business Wire / Planet Labs — The defining story of the week: GPU-native imagery processing, CorrDiff generative super-resolution, and NVIDIA hardware on orbit mark EO’s architectural shift.
  2. Now available: Google Earth data layers go global — Google Earth and Earth Engine / Medium — AlphaEarth Foundations 2025 embeddings reach full global coverage, cementing Google’s position in the geospatial foundation model market.
  3. UK invests $340m in non-GNSS timing system — Spatial Source — The latest tranche of the UK’s National Timing Centre program signals that GNSS sovereignty investment has moved beyond planning into funded infrastructure delivery.
  4. HIFLD Next: Restoring America’s Infrastructure Datasets — Data + Screening Tools — The community-led successor to HIFLD Open takes shape, with important implications for emergency management, research, and government workflows that depended on federal open data.
  5. The Economics of Openness: Funding Earth Observation as a Public Good — Spectral Reflectance — A compelling analytical argument for public funding models for EO data, arriving precisely when the HIFLD shutdown and commercial EO consolidation have made the question urgent.

This week’s Cercana Executive Briefing is sourced from 137 feeds aggregated by geofeeds.me. Analysis by Cercana.

Geospatial Sovereignty as a Strategic Requirement

Executive Summary

Geospatial systems are no longer peripheral tools; they underpin critical infrastructure, national security functions, and capital-intensive operations, supporting logistics, infrastructure management, environmental compliance, security operations, and strategic planning across government and industry. Yet many organizations rely on externally managed platforms for the storage, processing, and delivery of spatial intelligence.

This post discusses the concept of geospatial sovereignty, a governance and risk management discipline concerned with the degree to which an organization retains control over its geospatial data, infrastructure, and operational continuity.

As regulatory requirements expand, vendor ecosystems consolidate, positioning infrastructure faces disruption, and sovereign cloud models gain traction, executive teams must understand where they sit on the spectrum between dependency and control. The objective is informed, intentional architecture grounded in clear visibility into operational risk and long-term optionality.

Perception of Operational Control

For more than a decade, convenience has shaped enterprise technology decisions. Cloud-hosted platforms reduced infrastructure burdens. Subscription licensing simplified procurement. Managed services shifted operational responsibility outward. For many organizations, these shifts accelerated deployment and reduced internal complexity. But in the domain of geospatial operations, a different question is emerging: Who actually controls the systems that underpin your spatial intelligence?

Consider a composite scenario drawn from observable trends. A logistics provider experiences degraded positioning data during a regional GPS disruption and discovers that routing intelligence depends on upstream signals it does not control. A cloud vendor modifies pricing tiers or usage thresholds, quietly altering long-term cost projections embedded in multi‑year operating plans. A regulatory audit raises questions about data residency and physical storage location, forcing leadership to answer questions they assumed were settled at contract signature. A mission‑critical geospatial workflow is interrupted by an upstream service outage, revealing how tightly coupled internal processes have become to external infrastructure.

In each case, the organization technically “owns” its mission and its data. Yet operational continuity depends on infrastructure, policy decisions, and technical roadmaps defined elsewhere. This is not a critique of cloud providers; many are reliable, professionally managed, and appropriate for a wide range of workloads. The issue is structural. Control of infrastructure, data, and operational continuity is not the same as platform access. When those elements diverge, organizations may discover that their geospatial capabilities are more dependent than leadership intended.

From Data Sovereignty to Geospatial Sovereignty

The concept of sovereignty in digital systems is not new. Data sovereignty is commonly defined as the principle that data are subject to the laws and governance structures of the jurisdiction in which they are collected or stored.

It is important to distinguish related but separate concepts. Data residency refers to the physical location where data are stored. Data localization refers to legal requirements that certain categories of data remain within specific geographic boundaries. Data sovereignty concerns the legal authority governing that data and the jurisdiction whose laws apply. These distinctions are discussed in detail by enterprise security and cloud governance analysts (CIO, 2026).

Increasingly, major technology publications are also examining “sovereign cloud” and “geopatriation” trends, where governments and enterprises seek to re-anchor sensitive workloads within controlled jurisdictions. These discussions reinforce that sovereignty is not theoretical. It is shaping procurement, cloud architecture, and national digital strategies.

Geospatial sovereignty extends this conversation beyond legal jurisdiction. It asks whether an organization retains meaningful authority over how its spatial infrastructure is architected and governed, whether operations can continue during vendor, network, or geopolitical disruption, how systems are updated and configured, how spatial data integrates with broader enterprise platforms, and where critical skills and knowledge reside.

In this context, sovereignty is operational. It concerns who can make consequential decisions about the systems that support mission execution.

Why This Issue Is Emerging Now

Several converging pressures are elevating geospatial sovereignty from a technical concern to an executive one.

1. Geospatial Is Foundational

Spatial data now informs asset management, utilities maintenance, supply chain routing, environmental monitoring, agriculture, mining, emergency response, and national security operations. As geospatial becomes core to operations, its governance becomes a strategic concern. National Academies research has repeatedly emphasized that geospatial information infrastructure is critical to modern governance and infrastructure management (National Academies of Sciences, Engineering, and Medicine, n.d.).

2. Regulatory and Compliance Demands Are Expanding

Globally, more than one hundred privacy and data governance laws now affect how organizations collect, process, and store data. These include GDPR in the European Union and numerous national and state-level frameworks.

As data governance regimes expand, spatial datasets, which are often rich with location intelligence tied to individuals, infrastructure, or sensitive assets, fall under increasing scrutiny. European discussions around building digital sovereignty through authoritative geodata emphasize that trusted, nationally governed datasets are foundational to public policy, security, and economic competitiveness (GIM International, 2026). The implication for enterprises is clear: geospatial data is no longer merely operational. It is policy-relevant and potentially regulated. Governance expectations are rising accordingly.

Commercial perspectives are echoing this shift. Industry commentary aimed at enterprise operators has begun to frame sovereign geospatial data as a competitive and operational necessity rather than a compliance afterthought. Discussions emphasize that organizations dependent on third-party platforms for core spatial intelligence may struggle to preserve data lineage, portability, and strategic control as markets evolve (Nimbo, 2025). The argument is not ideological; it is pragmatic. When spatial data informs capital allocation, logistics optimization, and asset performance, control over that data becomes economically material.

3. Vendor Ecosystems Are Consolidating

The technology industry continues to experience consolidation through mergers and acquisitions. Platform acquisition can alter licensing terms, support models, product roadmaps, and pricing structures.

Organizations that rely exclusively on proprietary ecosystems may find their long-term cost models and integration assumptions shifting unexpectedly. Industry commentary has begun to frame geospatial sovereignty as requiring both legal alignment and architectural foresight, which highlights that compliance without architectural control can still leave organizations strategically exposed (CARTO, 2026).

In other words, sovereignty is not solved by contract language alone. It must be reflected in system design.

4. Strategic Uncertainty Is Increasing

Positioning, Navigation, and Timing (PNT) infrastructure such as the Global Positioning System (GPS) is relied upon globally and operated by national governments (U.S. Space Force, n.d.). Disruptions, whether technical, environmental, or geopolitical, demonstrate that foundational spatial services are not immune to systemic risk.

Recent analysis from Australia’s spatial industry has highlighted both the economic potential and systemic vulnerability of national PNT infrastructure. Commentary in Spatial Source has warned that while modern economies are deeply dependent on satellite-based positioning, they often lack redundancy and assurance frameworks to mitigate disruption (Spatial Source, 2026a).

Further, discussion of “navwar,” or navigation warfare, underscores that PNT denial and degradation are no longer theoretical military edge cases but active considerations in contested environments (Spatial Source, 2026b). Even outside conflict scenarios, signal interference, spoofing, and systemic outages pose measurable operational risk.

In response, governments are investing in resilience and sovereign capability. FrontierSI’s launch of Australia’s first dedicated PNT Labs reflects a recognition that positioning infrastructure requires independent testing, validation, and assurance capacity (Spatial Source, 2025). Similarly, Canada’s evolving Defence Industrial Strategy explicitly acknowledges the strategic importance of domestic geospatial capability within national security and industrial planning (GoGeomatics Canada, 2026).

These developments signal a broader shift. When governments treat geospatial and PNT systems as strategic assets requiring sovereign capability, commercial enterprises should take note. If national infrastructure planners view spatial systems through a sovereignty lens, enterprises whose operations depend on those systems must evaluate their own dependency assumptions.

At the same time, analysts are emphasizing that data provenance and trust are becoming central to reliable forecasting, AI modeling, and decision support. Without clear lineage and governance, spatial analytics risk becoming less defensible and less auditable (Ready Signal, 2026).

When spatial infrastructure becomes integral to mission execution, resilience and traceability are no longer purely technical considerations. They become executive concerns tied to continuity, liability, competitive positioning, and public trust.

Costs of Dependency

Dependency is rarely visible when systems function as expected. It becomes visible when change is required. Organizations may discover that data export is limited or constrained by proprietary formats, that migration costs are materially higher than anticipated, or that integration depth is bounded by vendor APIs rather than internal design choices. Custom workflows may be constrained by externally managed roadmaps, and over time internal skills may atrophy because expertise resides primarily with the platform provider rather than within the enterprise.
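One inexpensive way to surface the export problem early is to round-trip critical layers through open formats before a migration forces the question. Below is a minimal sketch with GeoPandas; the file names are illustrative.

```python
import geopandas as gpd

# Sketch of a portability check: can a critical layer round-trip through
# open formats without loss? File names are illustrative.
gdf = gpd.read_file("assets.gpkg")  # or a layer exported from the vendor tool

gdf.to_parquet("assets.parquet")                  # GeoParquet for analytics
gdf.to_file("assets_backup.gpkg", driver="GPKG")  # GeoPackage for archival

# Verify the round trip preserved row count and coordinate system.
check = gpd.read_parquet("assets.parquet")
assert len(check) == len(gdf)
assert check.crs == gdf.crs
```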

These costs are often architectural rather than purely financial. They shape how quickly an organization can pivot, how confidently it can integrate acquisitions, and how effectively it can respond to regulatory change. Over time, optimizing for short-term convenience can reduce long-term flexibility. That trade-off may be acceptable for commoditized functions. It is less acceptable when spatial intelligence underpins strategic decision-making.

Sovereignty as Institutional Capability

At its most practical level, geospatial sovereignty is about institutional capability. It asks whether the organization possesses the internal knowledge required to operate and evolve its spatial systems, whether it can transition platforms without losing intellectual capital, whether its most critical spatial workflows are portable, and whether leadership has explicitly defined which components must remain under direct control.

Sovereignty exists on a spectrum, ranging from fully vendor-managed SaaS environments where infrastructure and architectural direction are externally controlled to fully self-managed systems governed internally. Most organizations operate somewhere between these poles. The leadership challenge is to ensure that dependency is intentional, understood, and aligned with mission risk tolerance. Where dependency is acceptable, it should be consciously accepted; where it is not, architectural adjustments should follow.

The Architecture Question

When organizations examine geospatial sovereignty seriously, the discussion shifts from tools to architecture. Questions emerge that are fundamentally architectural in nature. Which datasets are mission-critical versus supportive? Which workflows must remain operational during network disruption? Where does regulatory exposure exist? Which integrations define competitive advantage? How portable are spatial assets across platforms and vendors?

Answering these questions requires cross-functional engagement across technology, operations, legal, compliance, and executive leadership. The conversation moves beyond tool comparison and into enterprise design. Sovereignty is ultimately a governance and architecture exercise intersecting risk management, operational resilience, and long-term strategy.

Seeking Guidance

As geospatial systems become more deeply integrated with enterprise operations, governance cannot remain purely technical. Executive leadership increasingly needs visibility into structural dependencies, long-term total cost trajectories, regulatory exposure, continuity planning assumptions, and the sustainability of internal talent.

Organizations that evaluate sovereignty proactively retain optionality. Those that defer the conversation may discover constraints only when disruption forces action. In that moment, architectural flexibility is no longer a strategic advantage; it becomes an emergency requirement.

The role of trusted advisors in this context is not to prescribe universal solutions or advocate a single technology stack. It is to help organizations map existing dependencies, clarify strategic priorities, assess architectural alternatives, and align technology decisions with mission risk tolerance. Sovereignty decisions should reflect leadership intent rather than historical inertia.

For organizations navigating this terrain, the challenge is rarely theoretical. It is practical, architectural, and often constrained by legacy decisions. Experienced advisory support can help leadership teams translate sovereignty from an abstract principle into an actionable roadmap. That work begins with disciplined assessment, grounded risk analysis, and a clear understanding of how geospatial capabilities align with mission priorities.

A Conversation Worth Having

This discussion does not require immediate platform replacement or imply that current systems are deficient. It begins with assessment. Who controls your geospatial infrastructure? Where are your true dependencies? Which elements are strategic, and which are commoditized?

As spatial intelligence becomes central to both public and private sector operations, these questions move from theoretical to structural. They shape procurement strategy, workforce planning, compliance posture, and long-term competitiveness.

In subsequent posts, we will examine architectural models, hybrid approaches, and the role of open-source ecosystems in enabling greater geospatial independence without sacrificing innovation. We will also explore practical assessment frameworks that allow leadership teams to quantify dependency instead of debating it abstractly.

For now, the imperative is straightforward: understand your position on the sovereignty spectrum before external events force the issue. In an era of increasing complexity, control means ensuring that the systems most critical to your mission remain aligned with strategic intent.

References

CARTO. (2026, January 15). Geospatial sovereignty requires law and architecture. https://carto.com/blog/geospatial-sovereignty-why-it-requires-both-law-and-architecture

CIO. (2026, February 13). Geopatriation and sovereign cloud: How data returns to its origin. https://www.cio.com/article/4131458/geopatriacion-and-sovereign-cloud-how-data-returns-to-its-origin.html

GIM International. (2026, February 25). Building digital sovereignty through authoritative European geodata. https://www.gim-international.com/content/news/building-digital-sovereignty-through-authoritative-european-geodata

GoGeomatics Canada. (2026, February 18). What Canada’s defence industrial strategy really means for geospatial. https://gogeomatics.ca/what-canadas-defence-industrial-strategy-really-means-for-geospatial/

National Academies of Sciences, Engineering, and Medicine. (n.d.). Geospatial information infrastructure and governance. https://www.nationalacademies.org/read/28857/chapter/10

Nimbo. (2025, December 2). Sovereign geospatial data. https://nimbo.earth/stories/sovereign-geospatial-data/

Ready Signal. (2026, February 19). Data sovereignty, provenance, and trustworthy forecasts. https://www.readysignal.com/blog/data-sovereignty-provenance-trustworthy-forecasts-2026

Spatial Source. (2026a, February 20). Australian PNT: Lots of potential, lots of danger. https://www.spatialsource.com.au/australian-pnt-lots-of-potential-lots-of-danger/

Spatial Source. (2026b, February 10). PNT assurance in the age of navwar. https://www.spatialsource.com.au/pnt-assurance-in-the-age-of-navwar/

Spatial Source. (2025, February 26). Frontiersi launches Australia’s first PNT labs. https://www.spatialsource.com.au/frontiersi-launches-australias-first-pnt-labs/

U.S. Space Force. (n.d.). Global Positioning System (GPS). https://www.spaceforce.mil/About-Us/Fact-Sheets/Article/2197765/global-positioning-system/

Header image: Mhsheikholeslami, CC BY-SA 4.0 https://creativecommons.org/licenses/by-sa/4.0, via Wikimedia Commons

Strategic Teaming for Small Businesses

In federal and technically complex markets, small businesses often treat teaming as a procedural step in the pursuit lifecycle, something to evaluate during bid/no-bid discussions and formalize before proposal submission.

That framing understates its importance.

Teaming is not merely a mechanism for satisfying requirements. When approached deliberately, it becomes an institutional discipline that shapes competitive posture, delivery resilience, and long-term market positioning.

For leadership teams, the issue is not whether to team. The issue is whether teaming decisions reflect strategic intent or short-term convenience.

Executive Summary

Strong small business teaming relationships are built on four disciplines:

  1. Acknowledge capability gaps before pursuing partnerships.
  2. Build resilience through strategic capability overlap, not just gap-filling.
  3. Define workshare commitments clearly and early.
  4. Maintain professional discipline in competitive markets.

Organizations that internalize these principles strengthen both proposal credibility and long-term competitive architecture.

Why Self-Awareness Is Critical in Small Business Teaming

Organizations that consistently perform well in competitive environments share a defining trait: clarity about their capabilities — including their limitations.

No small business, regardless of technical depth, is equally strong across every domain. Attempting to project comprehensive sufficiency may satisfy internal confidence, but it can introduce structural risk into proposals and execution plans.

Strategic teaming begins with disciplined internal assessment:

  • Where does the organization create differentiated value?
  • Where does it rely on marginal capacity?
  • Where would complementary expertise materially strengthen delivery confidence?

Acknowledging capability boundaries is not weakness; it is risk management. When leadership approaches partnership from a position of institutional clarity, teaming becomes a deliberate enhancement of performance rather than a reactive concession.

Should Small Businesses Avoid Overlapping Capabilities When Teaming?

A common approach to teaming is to identify narrow capability gaps and select partners who provide only those discrete functions. Overlap is often avoided in the name of efficiency.

This approach assumes static requirements and predictable execution environments. In reality, contracts evolve. Staffing markets tighten, technical requirements expand, and surge demands arise with limited notice. Under these conditions, resilience becomes more valuable than theoretical efficiency.

Strategic overlap, in which partners possess adjacent or even similar capabilities, provides:

  • Flexibility in resource allocation
  • Accelerated response to emergent requirements
  • Reduced dependence on extended hiring cycles
  • Continuity when individual contributors transition

Managed properly, overlapping capability is not redundancy. It is operational insurance. For leaders accountable for performance, this distinction is material.

How Should Small Businesses Structure Teaming Agreements?

Teaming agreements are often viewed as preliminary instruments necessary for proposal submission but secondary to the eventual subcontract.

In practice, they establish the psychological and operational foundation for the entire relationship. Partners who contribute proposal effort, past performance, and strategic positioning incur real opportunity cost. When post-award workshare remains ambiguous, trust erodes before execution begins.

High-functioning teams address this directly by defining:

  • Concrete areas of responsibility
  • Structured workshare commitments where feasible
  • Explicit constraints tied to funding or regulatory requirements (such as the 51% requirement in small-business set-asides)
  • Clear mechanisms for adjustment as scope evolves

Clarity does not eliminate uncertainty. It reduces avoidable friction. Trust built during formation strengthens collaboration during execution, where it matters most.

Why Professional Discipline Matters in Competitive Markets

In tightly networked technical markets, such as the geospatial technology market, roles shift frequently. Today’s teammate may be your competition tomorrow. Yesterday’s competitor may become a strategic partner.

Every organization carries an obligation to remain viable and act in the best interest of its workforce and stakeholders. Decisions about which team to join, or whether to prime independently, are strategic business judgments. Emotional reactions to competitive outcomes can introduce unnecessary long-term cost.

Professional discipline, by contrast:

  • Preserves relationships
  • Protects reputation
  • Maintains strategic optionality

In small-business ecosystems especially, credibility compounds over time.

What Makes a Strong Small Business Teaming Relationship?

A strong teaming relationship is defined less by formal structure and more by institutional alignment.

Effective teams demonstrate:

  • Clear understanding of differentiated strengths
  • Willingness to build depth rather than minimal compliance
  • Transparent workshare expectations
  • Mature responses to competitive shifts

When these elements are present, teaming strengthens not only a single proposal but the long-term capability network of the organization.

Building Competitive Architecture, Not Just Winning Contracts

Sustained growth in complex technical markets rarely comes from isolated contract awards. It comes from constructing a reliable competitive architecture grounded in disciplined execution, credible relationships, and thoughtful capability alignment. Teaming decisions are central to that architecture.

Organizations that approach partnership deliberately, with institutional self-awareness, operational foresight, and professional maturity, create networks that strengthen both pursuit and performance.

For leadership teams navigating modernization initiatives, shifting procurement priorities, evolving mission requirements, and constrained resources, the quality of partnerships is often as consequential as internal capability.

Teaming, treated as an executive-level discipline, becomes a force multiplier and a durable source of competitive strength.

Header image: G. Edward Johnson, CC BY 4.0 https://creativecommons.org/licenses/by/4.0, via Wikimedia Commons

5 Things CFOs Should Know About Open Source

You don’t have to understand open source to have an opinion on it. But here’s the thing: your company is already running on it. The question isn’t whether to engage with open source, but whether you’re being intentional about it.

Here’s what CFOs should have on their radar.

1. “Free” is a starting price, not a total cost.

Maintenance costs are baked into every software decision. The question is whether you’re paying them in engineering time or in vendor fees. Every shortcut taken today becomes tech debt tomorrow, and McKinsey warns that companies spending more than half their IT budget servicing that debt are likely paying interest only, with little left for innovation (Lamarre et al., 2020). Open source lets you redirect that spend toward work that actually differentiates your business.

2. Your vendor leverage depends on it.

When your stack is built on open source foundations, you have a credible alternative to any vendor relationship. That optionality changes contract negotiations in your favor. Companies locked into proprietary systems don’t have the same walk-away power, and vendors know it.

3. It shows up in your talent numbers.

Engineers tend to follow interesting technology. A modern open source stack is a recruiting asset; a legacy proprietary one is a quiet repellent. Gallup estimates that replacing an employee in a technical role costs around 80% of their annual salary, and that figure excludes harder-to-measure losses like institutional knowledge and team morale (Gallup, 2025). Retain one engineer who would have left, and open source has already paid for itself.

4. Your open source licenses are a portfolio to manage, not a minefield to avoid.

Many open source licenses, such as Apache, MIT, or BSD, are straightforwardly business-friendly. The few that carry restrictions only become a problem when nobody’s tracking them. According to Synopsys’s 2024 Open Source Security and Risk Analysis report, 53% of audited codebases contained open source license conflicts (Synopsys Cybersecurity Research Center, 2024). That number drops sharply for organizations that treat license exposure the way they’d treat any other portfolio: with visibility and periodic review.
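A first-pass inventory is cheaper than many CFOs expect. The sketch below walks a Python environment’s installed packages using only the standard library; package metadata can be missing or inconsistent, so treat the output as a starting point for review rather than a compliance audit.

```python
from collections import Counter
from importlib import metadata

# First-pass license inventory of the current Python environment using only
# the standard library. Metadata can be incomplete or inconsistent, so this
# is a starting point for review, not a compliance audit.
licenses = Counter()
for dist in metadata.distributions():
    lic = (dist.metadata.get("License") or "UNKNOWN").splitlines()[0]
    licenses[lic[:60]] += 1

for lic, count in licenses.most_common():
    print(f"{count:4d}  {lic}")
```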

5. You’re more dependent on it than you think.

The infrastructure underneath your products, including databases, operating systems, security tools, and cloud platforms, is almost certainly built on open source. That’s not a problem. But it is a reason to pay attention. Synopsys’s 2024 report found that 96% of commercial codebases contain open source components, with open source accounting for 77% of the total code scanned across more than 1,000 audits (Synopsys Cybersecurity Research Center, 2024).

References

Gallup. (2025, August 22). Employee retention depends on getting recognition right. https://www.gallup.com/workplace/650174/employee-retention-depends-getting-recognition-right.aspx

Lamarre, E., Smaje, K., & Zemmel, R. (2020, October 5). Tech debt: Reclaiming tech equity. McKinsey & Company. https://www.mckinsey.com/capabilities/mckinsey-digital/our-insights/tech-debt-reclaiming-tech-equity

Synopsys Cybersecurity Research Center. (2024). Open source security and risk analysis (OSSRA) report (9th ed.). Synopsys. https://www.synopsys.com/software-integrity/resources/analyst-reports/open-source-security-risk-analysis.html

Operational Readiness in a Geospatial World: How to Be Prepared in 2026

Executive Summary

Geospatial operational readiness in 2026 requires more than reliable systems; it demands contextually aware operations that understand where work happens, under what conditions, and how those conditions affect decision quality. Deterministic workflows remain foundational for repeatability and auditability, but they were not designed to interpret the conditional complexity inherent in geospatial data.

As organizations encounter more spatially driven processes, from field operations to inspections, logistics, asset tracking, and risk monitoring, traditional automation approaches begin to strain. AI-enabled orchestration augments deterministic workflows by interpreting spatial nuance at runtime, ensuring that rules and processes execute correctly under real-world conditions.

Underlying this capability is a modern data architecture: cloud-native geospatial formats and workflows that make spatial data accessible, scalable, and usable by both analytics and orchestrated operational systems.

This post provides a leadership-focused framework for understanding and maturing geospatial operational readiness in 2026.

Operational Readiness Has Become Spatial

Across industries, operations increasingly hinge on spatial context, whether teams are coordinating operations (Cadell, 2025), managing field activities and distributed assets, responding to incidents, navigating compliance demands, or optimizing logistics. As environments shift, real-world conditions can change quickly. Boundaries move, access varies, environmental factors fluctuate, and situational constraints evolve.

The modern geospatial landscape has advanced beyond desktop-era workflows and static files. Cloud-native architectures and scalable analytics platforms have reshaped how organizations collect, store, and use spatial data. These developments are no longer niche; they form part of the core operational fabric for organizations whose work depends on location, context, and changing conditions.

Operational readiness in 2026 means more than maintaining systems; it means maintaining situational awareness. Together, these shifts signal a need for operational approaches that treat spatial variability as a core design assumption rather than an exceptional case.

Why Deterministic Workflows Need Contextual Orchestration in a Geospatial Environment

Deterministic workflows, systems that follow defined rules and structured decision paths, remain essential for reliability, predictability, and governance. But spatial data introduces conditional complexity (SafeGraph, 2025; Cloud-Native Geospatial Forum, 2023; Holmes, 2017): geometric variability, inconsistent resolution, temporal dependencies, incomplete or intermittent data, and shifting field conditions.

The rules aren’t the problem (Goodchild, 2007). The challenge is determining which rule or deterministic branch applies under a given set of real-world spatial conditions. Traditional workflow tools assume uniformity in inputs and predictable operating environments (Davenport Group, 2024). Spatial nuance violates those assumptions (Cadell, 2025).

AI-enabled orchestration supplies the missing capability (Smit, 2024; CARTO, 2025), and spatial knowledge graphs strengthen it by structuring spatial relationships, enabling orchestration engines to reason over topology, proximity, and semantic context at runtime (OGC, 2023; Hu et al., 2013). In practice, this means:

  • Evaluating spatial and environmental context at runtime.
  • Selecting the correct deterministic tools based on real-world conditions.
  • Handling exceptions when data is incomplete or ambiguous.
  • Scaling to support complex operations without requiring brittle, manually crafted rules for every scenario (Russell & Norvig, 2021; IBM, 2023).

This approach doesn’t replace determinism; it supports it, ensuring workflows remain accurate and contextually appropriate. These dynamics create a natural bridge to understanding the operational readiness gaps that most SMBs still face.
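As an illustration of the pattern, not a reference to any specific product, the sketch below separates deterministic workflow branches from a context-evaluation step that selects among them. In production, the selector might be an AI model or a spatial knowledge-graph query; here it is a simple function with hypothetical names and thresholds.

```python
from dataclasses import dataclass

# Sketch of contextual orchestration over deterministic branches.
# Names and thresholds are illustrative; a production selector could be an
# AI model or a spatial knowledge-graph query instead of simple conditionals.

@dataclass
class SpatialContext:
    site_accessible: bool   # e.g., derived from a road-closure feed
    imagery_age_days: int   # freshness of the most recent observation
    in_flood_zone: bool     # from an authoritative hazard layer

def dispatch_inspection(ctx: SpatialContext) -> str:
    """Select the deterministic workflow branch that fits current conditions."""
    if not ctx.site_accessible:
        return "schedule_remote_assessment"   # deterministic branch A
    if ctx.in_flood_zone and ctx.imagery_age_days > 7:
        return "request_new_imagery_first"    # deterministic branch B
    return "standard_field_inspection"        # deterministic branch C

print(dispatch_inspection(SpatialContext(site_accessible=True,
                                         imagery_age_days=12,
                                         in_flood_zone=True)))
# -> request_new_imagery_first
```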

Hidden Readiness Gaps

Many of the challenges organizations face stem not from technology failures but from misalignment between deterministic logic and real-world spatial complexity. The most common gaps include:

1. Insufficient Context Delivery

Workflows often execute without the spatial context necessary to determine the right path (SafeGraph, 2025; Esri, 2023). This gap is more than a simple data availability issue. Many organizations rely on static rule sets that assume uniform conditions, but operational environments rarely behave uniformly. Field conditions, environmental constraints, or network coverage can shift within minutes, and without real-time context delivery, deterministic logic defaults to outdated assumptions. As a result, decisions that appear correct in isolation become misaligned with real-world circumstances.

2. Fragmented or Isolated Spatial Data

Spatial information frequently remains siloed within vendor tools, logs, exports, or specialized systems (Estuary, 2025; SafeGraph, 2025). Fragmentation undermines operational readiness because organizations cannot form a coherent view of their spatial landscape. Spatial knowledge graphs help address this by unifying heterogeneous datasets through shared spatial semantics, reducing the friction caused by isolated sources (OGC, 2023; Hu et al., 2013). Even when datasets exist, they are often inconsistently updated, stored in incompatible formats, or accessible only through manual processes. These barriers limit the ability of analytics, workflows, and orchestration engines to use spatial information effectively, resulting in delayed or incomplete situational awareness.

3. Integration Fragility

Integrations built around assumptions of uniform, predictable data fail when exposed to spatial variability (SafeGraph, 2025; Davenport Group, 2024). Traditional integration pipelines expect consistent schemas and steady data quality, but geospatial data often varies in resolution, precision, or completeness. When an integration pipeline encounters deviations, it may break silently or generate outputs that appear correct but are contextually wrong. This fragility erodes confidence in automated systems and forces teams to rely on manual checks or fallbacks that negate the value of automation.

4. Spatial Governance Blind Spots

Location-enabled data raises privacy, continuity, and security concerns that are often not formally addressed (OECD, 2024; OneTrust, 2023). Many organizations treat spatial data as operational rather than sensitive, overlooking its potential to infer personal movement, facility vulnerabilities, or logistic patterns. Without explicit governance policies, data may be shared too broadly, retained too long, or used in ways that expose the organization to compliance or reputational risks. Effective spatial governance frameworks ensure clear accountability, transparent access controls, and alignment with regulatory expectations.

5. Limited Geospatial and Orchestration Literacy

Teams may understand business rules but lack familiarity with how spatial nuance affects rule application (Korem, 2022; United Nations Committee of Experts on Global Geospatial Information Management, 2020). This literacy gap appears across industries: organizations may not recognize how physical geography or human geography modifies operational decisions, nor how orchestration tools interpret those conditions. Without shared understanding, it can be difficult to diagnose workflow failures, design effective automations, or evaluate the implications of new spatial data sources.

6. Legacy vs. Cloud-Native Data Practices

Cloud-native formats (for example, Cloud Optimized GeoTIFF, GeoParquet) unlock scalable analytics, streaming access, and AI integration, but many organizations still rely on older, file-based workflows that limit orchestration potential (Cloud-Native Geospatial Forum, 2023; Holmes, 2017; Smit, 2024).
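The difference between the two practices is visible in a few lines of code. The sketch below, with an illustrative URL and file names, shows the access patterns that cloud-native formats make possible: windowed reads from a Cloud Optimized GeoTIFF and column-selective reads from GeoParquet.

```python
import geopandas as gpd
import rasterio
from rasterio.windows import Window

# Cloud-native access patterns; the URL and file names are illustrative.

# A COG honors HTTP range requests, so this fetches only the requested
# 512x512 window instead of downloading the whole scene.
with rasterio.open("https://example.com/imagery/scene.tif") as src:
    block = src.read(1, window=Window(0, 0, 512, 512))

# GeoParquet is columnar, so a reader can pull just the columns it needs.
parcels = gpd.read_parquet("parcels.parquet", columns=["geometry", "zoning"])
```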

7. Complexity of Physical and Human Geography

Automated workflows often struggle when physical geography, such as terrain, hydrology, vegetation, or built infrastructure, interacts with human geography, including population density, mobility patterns, land use, and socio-economic activity. These layers frequently produce nonlinear or unexpected operational constraints that deterministic logic alone cannot anticipate. For example, human movement patterns may be shaped by physical barriers, and environmental risks are often amplified or mitigated by settlement patterns. Effective orchestration must interpret how these geographic domains interact so that workflows can make contextually accurate decisions in environments where spatial relationships are complex and dynamically interdependent (Goodchild, 2007; National Academies of Sciences, Engineering, and Medicine, 2021).

These gaps prevent deterministic workflows from operating reliably when spatial conditions matter most. Addressing them requires a structured pathway for maturing geospatial and orchestrational capability.

A Geospatial Operational Readiness Model

A credible maturity model helps organizations understand how to align geospatial data, deterministic workflows, and contextual orchestration over time.

Stage 1: Spatial Data Foundation

A strong spatial data foundation ensures that all subsequent analytical and operational capabilities rest on accurate, accessible, and trustworthy geospatial information.

  • Clean, current, governed spatial datasets provide reliable inputs for analytics and workflows, reducing errors that stem from outdated or inconsistent data (Cloud-Native Geospatial Forum, 2023).
  • The use of cloud-native geospatial formats enables scalable access and processing, which supports real-time orchestration and modern integration patterns.
  • Defined ownership and access controls ensure accountability and reduce risk, allowing teams to trust and effectively leverage spatial data.
  • Robust metadata management, including lineage, temporal currency, coordinate systems, and quality indicators, ensures that both analytics and AI-assisted orchestration can correctly interpret spatial data and make reliable context-driven decisions (a minimal illustration follows this list).
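As a minimal illustration of that last bullet, the record below captures the fields named there. The structure is hypothetical rather than a reference schema; real deployments would map these fields onto standards such as ISO 19115 or STAC.

```python
from dataclasses import dataclass, field

# Hypothetical metadata record covering the fields named in the bullet above:
# lineage, temporal currency, coordinate system, and quality indicators.
# Not a standard schema; real systems would map this onto ISO 19115 or STAC.

@dataclass
class SpatialDatasetMetadata:
    dataset_id: str
    crs: str                 # coordinate reference system, e.g. "EPSG:4326"
    observed_through: str    # temporal currency as an ISO date
    lineage: list[str] = field(default_factory=list)       # processing steps
    quality_flags: dict[str, bool] = field(default_factory=dict)

roads = SpatialDatasetMetadata(
    dataset_id="roads_2026q1",
    crs="EPSG:3857",
    observed_through="2026-03-31",
    lineage=["vendor_extract", "geometry_validation", "deduplication"],
    quality_flags={"topology_checked": True, "attributes_complete": False},
)
print(roads.dataset_id, roads.crs)
```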

Stage 2: Contextual Spatial Analytics

Contextual spatial analytics transforms raw location data into actionable insights that inform operational decisions and prepare an organization for orchestrated workflows.

  • Analytics platforms reveal patterns, risks, and operational hotspots, helping leaders understand where geographic conditions influence performance (CARTO, 2025).
  • Insights expressed through APIs or events provide machine-readable context that orchestration systems can incorporate into decision pathways.

Stage 3: Spatial Integration with Deterministic Workflow Support

Spatial integration ensures that deterministic workflows can operate correctly by incorporating real-world geographic complexity into execution logic.

  • Systems that ingest spatial context and operate despite variability help maintain reliability when environmental conditions change unexpectedly (SafeGraph, 2025).
  • Deterministic workflows instrumented to accept context inputs execute more accurately because they adapt rule selection to real-time conditions.
  • Integrations designed for spatial complexity improve system resilience, reducing workflow failures caused by assumptions of uniformity.

Stage 4: AI-Assisted Contextual Orchestration

AI-assisted orchestration enables workflows to interpret spatial nuance dynamically, improving decision accuracy under varying conditions. Support for Model Context Protocol (MCP) aligned orchestration patterns enhances interoperability between systems and ensures that contextual signals can be exchanged consistently across operational environments.

  • AI that interprets spatial nuance to guide deterministic rule selection improves workflow correctness and reduces misaligned decisions (Smit, 2024).
  • Adaptive exception handling ensures that workflows continue operating even when data is missing, ambiguous, or contradictory.
  • Governance that ensures transparency, auditability, and escalation pathways builds trust and compliance into the orchestration process.

Stage 5: Operational Intelligence

Operational intelligence represents the culmination of geospatial maturity, where organizations use context-aware, orchestrated workflows to drive predictive and adaptive operations.

  • Real-time spatial context that continuously informs decisions enables proactive rather than reactive operational management (United Nations Committee of Experts on Global Geospatial Information Management, 2020).
  • Workflows that execute with full contextual fidelity operate consistently and accurately, even when conditions are complex or rapidly evolving.
  • Predictive and scenario-driven capabilities strengthen strategic planning, allowing leaders to anticipate risk and optimize resource allocation.

This model emphasizes capability-building, not tool adoption, as the central theme of operational readiness. It also provides a logical foundation for the leadership considerations that follow.


Leadership Implications – Modern Technology Direction Requires Geospatial and Orchestrational Awareness

Technology leadership must evolve from managing systems to managing context. In practice, this means:

  • Treating spatial context as a first-class input to operational decisions.
  • Ensuring deterministic workflows have structured ways to receive and interpret contextual data.
  • Championing cloud-native geospatial architectures.
  • Establishing governance around both deterministic logic and AI-assisted orchestration.
  • Developing geospatial literacy at the leadership level, not just within technical teams.

Technology leaders must guide the evolution from static, assumption-driven workflows to contextually adaptive operations (Cadell, 2025).

A Strategic Pathway Toward Geospatial Operational Readiness

Geospatial operational readiness isn’t achieved through a short-term checklist; it is a strategic capability that develops over time. A sustainable approach includes:

  1. Establish Leadership Alignment Around Context-Aware Operations
    Clarify how spatial context influences operational outcomes and where deterministic logic needs contextual support (UN-GGIM, 2020; OECD, 2024).
  2. Conduct a Context-Centric Operational Assessment
    Identify workflows where spatial nuance materially affects execution accuracy or efficiency (CARTO, 2025; Esri, 2023).
  3. Develop an Architecture That Supports Contextual Orchestration
    Modernize data and integration patterns to ensure workflows can access and act upon spatial context (Cloud-Native Geospatial Forum, 2023; Smit, 2024).
  4. Introduce Contextual Orchestration Through Iterative Pilots
    Start with workflows where errors or inefficiencies are most clearly tied to spatial complexity (Davenport Group, 2024; SafeGraph, 2025).
  5. Build a Long-Term Roadmap for Spatially Informed Operations
    Plan for phased capability expansion aligned with operational, compliance, and partner expectations (UN-GGIM, 2020; OECD, 2024).

This pathway reinforces that readiness is ongoing and multi-dimensional. It also sets the stage for translating strategy into daily operational practice.


Conclusion: Readiness Requires Spatially Informed, Contextually Aware Operations

Operational readiness in 2026 rests on the ability to interpret spatial context and execute deterministic workflows with precision. Organizations that combine modern geospatial architectures with AI-assisted contextual orchestration will be equipped to make accurate, timely, and resilient decisions.

Those that continue relying on static assumptions and isolated spatial tools may find their operations increasingly misaligned with real-world conditions, and with the expectations of customers, partners, and regulators.

A disciplined, strategic approach to geospatial operational readiness provides both resilience today and a foundation for innovation tomorrow.

If your organization is exploring how to strengthen geospatial operational readiness, modernize workflows, or introduce AI orchestration, Cercana can help. We partner with SMB technology leaders and prime contractors to evaluate current capabilities, identify maturity gaps, and design architectures that support scalable, context-aware operations. Contact us to discuss your goals and learn how we can support your next stage of geospatial transformation.

References

Cadell, W. (2025, December 29). Custom Inc. Strategic Geospatial. https://www.strategicgeospatial.com/p/custom-inc

CARTO. (2025, August 28). Branch out: New control components for more resilient workflows. CARTO. https://carto.com/blog/new-control-components-for-more-resilient-workflows

Cloud-Native Geospatial Forum. (2023). Cloud-optimized GeoTIFFs. In Cloud-native geospatial data formats guide. Cloud-Native Geospatial Forum. https://guide.cloudnativegeo.org/cloud-optimized-geotiffs/intro.html

Davenport Group. (2024, November 13). Key challenges of data integration and solutions. Davenport Group. https://davenportgroup.com/insights/key-challenges-of-data-integration-and-solutions/

Estuary. (2025, March 20). 8 reasons why data silos are problematic and how to fix them. Estuary. https://estuary.dev/blog/why-data-silos-problematic/

Esri. (2023). ArcGIS Field Maps: Real-time data capture and situational awareness. Esri. https://www.esri.com/en-us/arcgis/products/arcgis-field-maps/overview

Holmes, C. (2017, October 10). Cloud native geospatial part 2: The Cloud optimized GeoTIFF. Planet Stories. https://medium.com/planet-stories/cloud-native-geospatial-part-2-the-cloud-optimized-geotiff-6b3f15c696ed

Hu, Y. et al. (2013). A Geo-ontology Design Pattern for Semantic Trajectories. In: Tenbrink, T., Stell, J., Galton, A., Wood, Z. (eds) Spatial Information Theory. COSIT 2013. Lecture Notes in Computer Science, vol 8116. Springer, Cham. https://doi.org/10.1007/978-3-319-01790-7_24

Korem. (2022, March 28). Lack of resources and expertise within companies to exploit geospatial technology. Korem. https://www.korem.com/lack-of-resources-and-expertise-within-companies-to-exploit-geospatial-technology/

OECD. (2024). AI, data governance and privacy: Synergies and areas of international co-operation. OECD Publishing. https://www.oecd.org/en/publications/ai-data-governance-and-privacy_2476b1a4-en.html

OneTrust. (2023). Privacy on the horizon: What organizations need to watch in 2023. OneTrust. https://www.onetrust.com/resources/privacy-on-the-horizon-what-organizations-need-to-watch-in-2023-report/

Open Geospatial Consortium (OGC). (2023). OGC GeoSPARQL – A geographic query language for RDF data. https://docs.ogc.org/is/22-047r1/22-047r1.html

SafeGraph. (2025). Challenges of geospatial data integrations. SafeGraph. https://www.safegraph.com/guides/geospatial-data-integration-challenges

Smit, C. (2024, October 16). Cloud-native geospatial data formats. NASA Goddard Earth Sciences Data and Information Services Center. https://ntrs.nasa.gov/api/citations/20240012643/downloads/Cloud-Native%20Geospatial%20Data%20Formats.pdf

United Nations Committee of Experts on Global Geospatial Information Management. (2020). Integrated geospatial information framework: Strategic pathway 8 – Capacity and education. UN-GGIM. https://ggim.un.org/UN-IGIF/documents/SP8-Capacity_and_Education_19May2020_GLOBAL_CONSULTATION.pdf

Reducing the Costs of Fragmented Spatial Data in 2026

Organizations invested heavily in geospatial tools and data throughout 2025. Yet many leaders found the return on that investment lower than expected. A core issue is fragmentation rather than a lack of data or technology capability. When spatial data is scattered across teams, tools, and formats, it becomes harder to trust, harder to maintain, and harder to use for meaningful decisions.

This is why 2026 will reward organizations that focus not on bigger geospatial systems, but on cleaner, right-sized spatial data pipelines that deliver clarity rather than complexity.

Industry forecasts reflect this shift. Analysts estimate the global geospatial analytics market at $114 billion in 2024, projecting growth to more than $226 billion by 2030 (Grand View Research, 2024). Another independent forecast places the market at $258 billion by 2032, driven by adoption across infrastructure, logistics, and smart-city applications (Fortune Business Insights, 2024). But as adoption accelerates, complexity rises: many organizations still struggle with data quality and context, which remain barriers to effective geospatial insight (Korem, 2025).

Costs of Fragmentation

Fragmentation rarely announces itself. It appears subtly in duplicate datasets, inconsistent update cycles, siloed maps, or “shadow” spatial layers created by individual teams. These inconsistencies introduce persistent operational friction:

  • Analysts spend more time reconciling data than interpreting it.
  • Cross-functional teams make decisions based on slightly different versions of the truth.
  • Trust in spatial outputs erodes as discrepancies accumulate.

Broader technology trend research highlights the same issue: modern digital environments are growing more complex, making integration discipline essential (McKinsey & Company, 2025). Nowhere is this truer than in geospatial workflows, where inconsistent data pipelines undermine the insights organizations depend on.

Bigger Systems != Bigger Insight

A persistent misconception is that impactful geospatial work requires enterprise-scale GIS stacks, large teams, or massive datasets. But today’s ecosystem offers a spectrum of tools, from legacy proprietary solutions like ArcGIS to modern enterprise-grade open-source platforms using tools such as DuckDB or Sedona, and an expanding set of specialized tools used across planning, logistics, environmental management, and operations. Independent analysis notes that GIS platforms enable organizations to integrate spatial data, visualize patterns, and support decision-making across sectors ranging from transportation to public safety (Research.com, 2025). Leaders can match tools to decisions rather than building infrastructure for its own sake.
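To ground the open-source mention, the sketch below shows what a spatial query looks like in DuckDB with its spatial extension loaded. The file and column contents are illustrative, and the query assumes the files are readable by GDAL via ST_Read.

```python
import duckdb

# Illustrative spatial query in DuckDB; file names are placeholders.
con = duckdb.connect()
con.execute("INSTALL spatial; LOAD spatial;")

# How many assets fall inside any flood zone? ST_Read uses GDAL under the
# hood and exposes each file's geometry as a `geom` column.
count = con.execute("""
    SELECT COUNT(*) AS assets_in_zone
    FROM ST_Read('assets.geojson') AS a
    JOIN ST_Read('flood_zones.geojson') AS z
      ON ST_Within(a.geom, z.geom)
""").fetchone()[0]
print(count)
```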

Industry observers note similar trends: cloud-based GIS, AI-driven spatial analytics, and real-time data streams increasingly enable organizations of all sizes to integrate geospatial insight into their decision frameworks (LightBox, 2025). The threshold for adopting spatial intelligence is lower than ever — provided data pipelines remain clean and coherent.

ROI in Small, Targeted Spatial Insights

Some of the highest-value geospatial outcomes arise not from extensive datasets but from small, curated insights aligned to operational needs:

  • Identifying which assets fall inside specific risk zones (see the sketch after this list).
  • Visualizing coverage gaps or service inconsistencies through a single boundary overlay.
  • Pinpointing route or deployment inefficiencies affecting field productivity.
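The first of those insights is often a single spatial join. Here is a minimal GeoPandas sketch, with illustrative file names:

```python
import geopandas as gpd

# Which assets fall inside risk zones? File names are illustrative.
assets = gpd.read_file("assets.gpkg")
zones = gpd.read_file("flood_zones.gpkg").to_crs(assets.crs)

# The spatial join keeps only assets whose geometry falls within a zone.
at_risk = gpd.sjoin(assets, zones, how="inner", predicate="within")
print(f"{len(at_risk)} of {len(assets)} assets sit inside a risk zone")
```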

Innovation trends reinforce this path. New geospatial entrants are developing AI-assisted mapping tools that allow non-technical teams to generate spatial insights without relying on specialized staff (Business Insider, 2025). This democratization of spatial intelligence reduces the need for one-off, isolated datasets, helping to prevent fragmentation before it starts.


MapIdea offers a particularly relevant perspective: geography can serve as a unifying analytical key, allowing organizations to connect datasets that share no other identifiers and reduce fragmentation across systems (MapIdea, 2025).

How To Start Simplifying in 2026

A right-sized approach doesn’t require heavy investment. It requires intentional design:

  1. Establish authoritative versions of key spatial datasets and retire duplicates.
  2. Align update cycles with operational rhythms, whether monthly or real-time.
  3. Integrate spatial data into existing analytics environments rather than building parallel systems.
  4. Start with one meaningful decision, demonstrate value, and scale deliberately.

These steps reduce friction, strengthen trust, and create a foundation for more advanced geospatial capability in the future.

The 2026 Opportunity

As the geospatial analytics market continues to grow at double-digit rates, organizations face a choice: accumulate complexity or pursue clarity.

Right-sized geospatial, built on coherent pipelines and targeted insights, offers a practical path forward. It replaces fragmentation with consolidation, trades overhead for agility, and most importantly, it positions geography as a shared context for informed decision-making across your organization.

Cercana can help you streamline your geospatial data portfolio and operations. Contact us today to learn more.

References

Business Insider. (2025). AI-powered mapping platform secures funding for next-generation geospatial tools. https://www.businessinsider.com/felt-ai-mapping-platform-funding-geographic-information-system-2025-7

Fortune Business Insights. (2024). Geospatial analytics market report. https://www.fortunebusinessinsights.com/geospatial-analytics-market-102219

Grand View Research. (2024). Geospatial analytics market size, share & trends analysis report. https://www.grandviewresearch.com/industry-analysis/geospatial-analytics-market

Korem. (2025). Geospatial trends in 2025: The latest industry evolutions. https://www.korem.com/geospatial-trends-in-2025-the-latest-industry-evolutions

LightBox. (2025). Top 10 trends in GIS technology for 2025. https://www.lightboxre.com/insight/top-10-trends-in-gis-technology-for-2025

MapIdea. (2025). Open letter to data and analytics leaders on geography. https://www.mapidea.com/blog/open-letter-to-data-and-analytics-about-geo

McKinsey & Company. (2025). Technology trends outlook 2025. https://www.mckinsey.com/~/media/mckinsey/business%20functions/mckinsey%20digital/our%20insights/the%20top%20trends%20in%20tech%202025/mckinsey-technology-trends-outlook-2025.pdf

Research.com. (2025). Best geographic information systems (GIS) in 2026. https://research.com/software/guides/best-geographic-information-systems

Header Image Credit: National Oceanic and Atmospheric Administration, Public Domain

Variations of Open

Introduction

The word “open” gets used so often in tech that it starts to feel universal, like everyone must be talking about the same thing. But once you listen closely, it becomes obvious that different groups mean very different things when they say it. A software engineer is thinking about readable source code and licenses. Someone who works with data is thinking about public portals and Creative Commons. People in AI might be picturing model weights you can download even if you can’t see how the model was trained. And increasingly, someone might just mean information that’s publicly visible online, such as social media posts, ship trackers, or livestreams, without any license at all.

None of these interpretations are wrong. They just grew out of different needs. Openness meant one thing when it applied to code, something else entirely when governments began releasing public data, and now it’s shifting again as AI models enter the mix. Meanwhile, the rise of OSINT has blurred things further, with “open” sometimes meaning nothing more than “accessible to anyone with an internet connection.”

The result is that modern systems combine pieces from all these traditions and people often assume they’re aligned when they’re not. The friction shows up not because anyone misunderstands the technology, but because the language hasn’t kept up with how quickly the idea of openness has expanded.

Open-Source Software

In software, “open” means open-source, and in that context it has a clear meaning. You can read the code, change it, and share it as long as you follow the license. That predictability is a big part of why the movement grew. People understood the rules and trusted that the rules would hold.

But the full spectrum of open-source shows up in the habits and culture around it. Communities develop their own rhythms for how to submit a pull request, file a useful bug report, talk through disagreements, and decide which features or fixes make it into a release. None of that comes from a license. People learn it by watching others work, answering questions in long issue threads, or showing up in mailing lists and channels where projects live.

There’s also an unspoken agreement inside open-source software. If a project helps you, you try to help it back. Maybe you fix a typo, or you donate, or you answer someone else’s question. It’s not required, but it’s how projects stay healthy.

Anyone who has maintained an open-source project knows it isn’t glamorous. It can be repetitive, sometimes thankless, and often demanding. Good maintainers end up juggling technical decisions, community management, and the occasional bit of diplomacy.

All this shapes a shared understanding of what openness means in software. People think not just about reading code, but about the whole ecosystem built around it: contribution paths, governance models, release practices, and the blend of freedom and responsibility that holds everything together.

Once the idea of openness moved beyond software, that ecosystem didn’t necessarily apply. As other fields developed their own approaches to openness, patterns and practices evolved in alignment with each domain’s unique needs.

Open Data

Open data developed along its own path. Instead of code, publishers released information about the world: transit schedules, land use maps, environmental readings, census tables. The goal was simple: make public data easier to access so people could put it to use.

Because software licenses didn’t fit, data and content licenses such as Creative Commons were developed. CC BY and CC0 became common. Open Data Commons created specialized database licenses—ODbL added share-alike requirements specifically for databases, while PDDL offered a public domain dedication. You can see the differences in well-known datasets. OpenStreetMap’s ODbL means derived data often has to stay open and always requires attribution. USGS datasets, which are mostly public domain, are easy to fold into commercial tools. City transit feeds under CC BY only ask for attribution.

Privacy concerns complicate open data, which isn’t exempt from GDPR, CCPA, or similar laws. Even seemingly innocuous data can reveal personal information—location datasets showing frequent stops at specific addresses, or timestamped transit records that establish movement patterns. Many publishers address this through aggregation, anonymization, or by removing granular temporal and spatial details, but anyone working with open data still ends up checking metadata, tracking where files came from, and thinking about what patterns a dataset might reveal.
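
As one illustration of the aggregation approach, the sketch below coarsens point records into grid cells and suppresses small counts before publication; the column names, cell size, and threshold are illustrative assumptions rather than a standard recipe.

```python
# A minimal sketch of spatial aggregation before publishing location data.
# Column names, cell size, and the suppression threshold are illustrative.
import pandas as pd

trips = pd.read_csv("trips.csv")            # hypothetical lat/lon point records
trips["lat_bin"] = trips["lat"].round(2)    # ~1 km cells; drops address-level detail
trips["lon_bin"] = trips["lon"].round(2)

published = (
    trips.groupby(["lat_bin", "lon_bin"]).size().reset_index(name="trip_count")
)
published = published[published["trip_count"] >= 10]  # suppress re-identifiable small counts
published.to_csv("trips_aggregated.csv", index=False)
```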

Open-Source Intelligence (OSINT)

Open-source intelligence (OSINT) is an overlapping but distinct concept from open data. Information is considered “open” in OSINT because anyone can access it, not because anyone has the right to reuse it. A ship tracker, a social media post, or a livestream from a city camera may all fall into this category.

These sources vary widely in reliability. Some come from official databases or verified journalism. Others come from unvetted social media content, fast-moving crisis reporting, or user-generated material with no clear provenance. OSINT analysts rely heavily on validation techniques such as correlation, triangulation, consensus across multiple sources, and structured analytic methods.

While OSINT has deep roots in government intelligence work, it is now widely practiced across sectors including journalism, cybersecurity, disaster response, financial services, and competitive intelligence. Marketing technologies have expanded OSINT further into the private sector, making large-scale collection and analysis tools widely accessible.

Confusion can arise when open data and OSINT are treated as interchangeable. Someone may say they used open data, meaning a licensed dataset from a government portal. Someone else hears open and assumes it means scraping whatever is publicly visible.

This distinction matters because the two categories carry fundamentally different permissions and obligations. Open data comes with explicit rights to reuse, modify, and redistribute—legal clarity that enables innovation and collaboration. OSINT, by contrast, exists in a gray area where accessibility doesn’t imply permission, and users must navigate copyright, privacy laws, and terms of service on a case-by-case basis. 

Understanding this difference isn’t just semantic precision; it shapes how organizations design data strategies, assess legal risks, and build ethical frameworks for information use. When practitioners clearly specify whether they’re working with licensed open data or publicly accessible OSINT, they help prevent costly misunderstandings and ensure their work rests on solid legal and ethical foundations.

Open AI Models

In AI, openness takes on another meaning entirely. A model is more than code or data. It’s architecture, training data, weights, and the training process that binds everything together. So when a model is described as open, it’s natural to ask which part is actually open.

You see the variety in projects released over the past few years. Some groups publish only the weights and keep the training data private. Meta’s Llama models fall into this category. You can download and fine-tune them, but you don’t see what went into them. Others release architectural details and research papers without sharing trained weights—early transformer work from Google and OpenAI showed the approach without providing usable models. EleutherAI’s GPT-NeoX sits further along the spectrum, releasing architecture, weights, and documentation of its training corpus, The Pile.

A few projects aim for full transparency. BLOOM is the most visible example, with its openly released code, data sources, logs, and weights. It took a global effort to pull that off, and it remains the exception, though projects like Falcon and some smaller research models have attempted similar transparency.

This partial openness shapes how people use these models. If you only have the weights, you can run and fine-tune the model, but you can’t inspect the underlying data. When the training corpus stays private, the risks and biases are harder to understand. And when licenses restrict use cases, as they do with Llama’s custom license, which prohibits certain commercial applications, or with the research-only terms common to academic releases, you might be able to experiment but not deploy. Mistral’s models show another approach—Apache 2.0 licensing for some releases but custom licenses for others.

The idea of contribution looks different too. You don’t patch a model the way you patch a library. You build adapters, write wrappers, create LoRA fine-tunes, or train new models inspired by what came before. Openness becomes less about modifying the original artifact and more about having the freedom to build around it.
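
To make the adapter pattern concrete, here is a minimal sketch using the Hugging Face transformers and peft libraries; the base model, its license terms, and the hyperparameters are illustrative assumptions, not a recommendation.

```python
# A minimal sketch of building around open weights rather than modifying them:
# a LoRA adapter trains a small set of new parameters while the base stays frozen.
# Model name and hyperparameters are illustrative; check the license before use.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("EleutherAI/pythia-70m")
config = LoraConfig(
    r=8,                                 # adapter rank, tiny relative to the base model
    lora_alpha=16,
    target_modules=["query_key_value"],  # attention projections in this architecture
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()       # only the adapter weights are trainable
```

The original artifact stays untouched; everything new lives in the adapter, which is exactly the kind of building-around that weights-only openness allows.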

So in AI, open has become a spectrum. Sometimes it means transparent. Sometimes it means accessible. Sometimes it means the weights are downloadable even if everything else is hidden. The word is familiar, but the details rarely match what openness means in software or data.

Real World Considerations

These differences are fairly straightforward when they live in their own domains. Complexity can arise when they meet inside real systems. Modern projects rarely stick to one tradition. A workflow might rely on an open-source library, an open dataset, publicly scraped OSINT, and a model with open weights, and each piece brings its own rules.

Teams can run into this without realizing it. Someone pulls in an Apache-licensed geospatial tool and combines it smoothly with CC BY data. These work fine together. But then someone else loads OpenStreetMap data without noticing the share-alike license that affects everything it touches. A third person adds web-scraped location data from social media, not considering the platform’s terms of service or privacy implications. A model checkpoint from Hugging Face gets added on top, even though the license limits commercial fine-tuning. Most of these combinations are manageable with proper documentation, but some create real legal barriers.

Expectations collide too. A software engineer assumes they can tweak anything they pull in. A data analyst assumes the dataset is stable and comes with clear reuse rights. An OSINT analyst assumes publicly visible means fair game for analysis. Someone working with models assumes fine-tuning is allowed. All reasonable assumptions inside their own worlds, but they don’t line up automatically.

The same thing happens in procurement. Leadership hears open and thinks it means lower cost or fewer restrictions. But an open-source library under Apache is not the same thing as a CC BY dataset, neither is the same as scraped public data that’s accessible but not licensed, and none of those are the same as an open-weight model with a noncommercial license.

Geospatial and AI workflows feel these tensions even more. They rarely live cleanly in one domain. You might preprocess imagery with open-source tools, mix in open government data, correlate it with ship tracking OSINT, and run everything through a model that’s open enough to test but not open enough to ship. Sometimes careful documentation and attribution solve the problem. Other times, you discover a share-alike clause or terms-of-service violation that requires rethinking the entire pipeline.

This is when teams slow down and start sorting things out. Not because anyone did something wrong, but because the word open did more work than it should have and because “publicly accessible” got mistaken for “openly licensed.”

Clarifying Open

A lot of this gets easier when teams slow down just enough to say what they actually mean by open. It sounds almost too simple, but it helps. Are we talking about open code, open data, open weights, open access research, or just information that’s publicly visible? Each one carries its own rules and expectations, and naming it upfront clears out a lot of the fog.

Most teams don’t need a formal checklist, though those in regulated industries, government contracting, or high-stakes commercial work often do. What every team needs is a little more curiosity about the parts they’re pulling in—and a lightweight way to record the answers. If someone says a dataset is open, ask under what license and note it in your README or project docs. If a model is open, check whether that means you can fine-tune it, use it commercially, or only experiment with it—and document which version you’re using, since terms can change. If a library is open-source, make sure everyone knows what the license allows in your context. If you’re using publicly visible information—social media posts, ship trackers, livestreams—be clear that this is OSINT, not licensed open data, and understand what legal ground you’re standing on.

These questions matter most at project boundaries: when you’re about to publish results, share with partners, or move from research to production. A quick decision log—even just a shared document listing what you’re using and under what terms—prevents expensive surprises. It also helps when someone new joins the team or when you revisit the project months later.
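
For teams that want the log to be machine-checkable rather than a prose document, something as small as the following sketch works; the component entries and the review rule are invented for illustration.

```python
# A minimal sketch of an "openness" decision log kept next to the project.
# Entries and the review rule are illustrative assumptions.
components = [
    {"name": "geo-toolkit",  "kind": "code",  "license": "Apache-2.0", "ship_ok": True},
    {"name": "transit-feed", "kind": "data",  "license": "CC-BY-4.0",  "ship_ok": True},
    {"name": "osm-extract",  "kind": "data",  "license": "ODbL-1.0",   "ship_ok": True},  # share-alike!
    {"name": "scraped-ais",  "kind": "osint", "license": None,         "ship_ok": None},  # unresolved
]

for c in components:
    if c["license"] is None or c["ship_ok"] is None:
        print(f"REVIEW BEFORE SHIPPING: {c['name']} ({c['kind']}) has unresolved terms")
```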

The more people get used to naming the specific flavor of openness they’re dealing with and writing it down somewhere searchable, the smoother everything else goes. Projects move faster when everyone shares the same assumptions. Compliance reviews become straightforward when the licensing story is already documented. Teams stop discovering deal-breakers right when they’re trying to ship something. It’s not about being overly cautious or building heavy process. It’s just about giving everyone a clear, recorded starting point before the real work begins.

Conclusion

If there’s a theme running through all of this, it’s that the word open has grown far beyond its original boundaries. It meant one thing in software, something different in the world of public data, another in AI, and gets stretched even further when people conflate it with simply being publicly accessible. Each tradition built its own norms and expectations, and none of them are wrong. They just don’t automatically line up the way people sometimes expect.

Most of the friction we see in real projects doesn’t come from bad decisions. It comes from people talking past one another while using the same word. A workflow can look straightforward on paper but fall apart once you realize each component brought its own version of openness, or that some parts aren’t “open” at all, just visible. By the time a team has to sort it out, they’ve already committed to choices they didn’t realize they were making.

The good news is that this is manageable. When people take a moment to say which kind of open they mean, or acknowledge when they’re actually talking about OSINT or other public information, everything downstream gets smoother: design, licensing, procurement, expectations, even the conversations themselves. It turns a fuzzy idea into something teams can actually work with. It requires ongoing attention, especially as projects grow and cross domains, but the effort pays off.

Openness is a powerful idea, maybe more powerful now than ever. But using it well means meeting it where it actually is, not where we assume it came from.

At Cercana Systems, we have deep experience with the full open stack and can help you navigate the complexities as you implement open assets in your organization. Contact us to learn more.

Header image credit: Aaron Pruzaniec, CC BY 2.0 https://creativecommons.org/licenses/by/2.0, via Wikimedia Commons

Taming Tech Overwhelm – Modernization for Growing Businesses

For many small and mid-sized organizations, technology can feel like a moving target. Every week brings a new buzzword (AI, automation, digital transformation) while your core systems feel increasingly outdated. Your data may be scattered across tools, workflows may depend on manual processes, and the thought of a major modernization project can feel overwhelming.

You’re not alone. Across industries, small and mid-sized organizations often struggle to keep up with the speed of technological change while balancing day-to-day operations. Yet modernization doesn’t have to be disruptive or intimidating. It is a strategic, step-by-step process that aligns your technology investments with your business goals and, when done right, it can be transformative.

This post will help you reframe modernization as a series of practical, achievable steps that move you from feeling overwhelmed to being ahead of the curve.

Why Modernization Feels Overwhelming

Small businesses frequently cite barriers like cost, lack of expertise, and integration complexity as reasons for delaying tech adoption. A 2023 study from AI Business found that many organizations hesitate to invest in modern solutions because of a lack of in-house expertise and concerns over security and ROI (AI Business, 2023). Similarly, a 2025 TechRadar report noted that small and mid-sized businesses (SMBs) often see technology investment as a financial risk, even while recognizing its necessity for growth (TechRadar, 2025).

40% of business leaders cite lack of expertise as a barrier to technology adoption.
(AI Business, 2023)

Yet, delaying modernization comes with its own costs: slower operations, missed opportunities, and competitive disadvantage. Encouragingly, businesses that have embraced digital tools overwhelmingly report efficiency gains and optimism about future growth (U.S. Chamber of Commerce, 2023).

The good news is that modernization doesn’t need to be a massive leap. A structured, goal-driven approach can help you take small, confident steps forward.

A Practical Approach to Modernization

Modernization should be driven by your strategic business goals, not by chasing the latest technology trend. Here’s an approach to help you get started.

1. Start with Your Goals, Not the Tech

Before looking at software, hardware, or AI models, ask yourself: “What business problems do I need to solve?” Are you trying to speed up workflows, improve decision-making, enhance customer experience, support growth, or a combination of these and other goals?

When organizations align IT initiatives with clear business objectives, they see higher ROI and faster decision-making (Davenport, 2024). Modernization is most effective when it’s a means to an end, not an end in itself.

2. Audit Your Workflows and Data

A modernization journey should start with a clear picture of your current state:

  • Workflows: Identify bottlenecks, redundant tasks, or manual processes that slow your team down.
  • Data: Evaluate the quality and organization of your data. What’s reliable? What needs cleaning? Are there gaps preventing meaningful analysis?

This audit creates a roadmap for targeted improvements. Often, even simple steps—like consolidating duplicate data sources or automating a repetitive report—can deliver measurable time savings. 

The term “audit” can be overwhelming in itself, as we often picture a laborious, top-down, expensive deconstruction of every process. While that can be useful, it’s not always necessary. Often, you or your staff are already intimately familiar with the inefficiencies in your operations and these can be captured via interviews. This is often enough to get started.

3. Prioritize Small, High-Impact Wins

Modernization doesn’t have to mean overhauling everything at once. Start small:

  • Automate a single, repetitive process.
  • Consolidate reporting into a single dashboard.
  • Migrate one high-value workflow to a cloud-based solution.

These small wins build momentum, reduce team anxiety, and demonstrate clear ROI, paving the way for larger initiatives.

That second benefit, reducing team anxiety, is important. While we can all recognize inefficiencies, the inefficiency you know often causes less anxiety than the improvement you don’t. Getting your team comfortable with some small, early wins builds confidence in the overall process to come.

4. Build a Scalable Technology Strategy

Once you’ve addressed the most obvious inefficiencies, it’s time to lay a scalable foundation for growth. In this context, “scalability” simply means your technology can handle increased demand, complexity, or new functionality without needing a total replacement.

Scalability means your systems should grow as you grow—without costly rebuilds.

For small and medium-sized organizations, this often doesn’t mean buying the most expensive enterprise software. It can mean:

  • Choosing cloud platforms and analytics tools that allow you to add users, storage, or features incrementally.
  • Using modular systems that make it easy to plug in new capabilities—like automation or AI—when you’re ready.
  • Designing workflows so that new services, staff, or customers can be added without reworking everything from scratch.

By focusing on flexibility instead of “big tech,” you create systems that evolve alongside your organization’s needs. This approach can save money in the long run and reduces the stress of future growth.

Make AI and Automation Accessible

Artificial intelligence (AI) and automation are often seen as intimidating, but they don’t have to be. The key is to view AI as a tool to augment decision-making and streamline workflows rather than as a complex, abstract concept.

Practical applications include:

  • Automating repetitive reporting or data entry tasks.
  • Using predictive analytics to forecast demand or identify trends.
  • Applying natural language processing (NLP) to analyze customer feedback.

By starting with small, low-risk pilot projects, you can test AI solutions, measure results, and gain confidence in scaling them further. These pilots often produce immediate efficiency gains, helping you build internal support for deeper adoption.
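
As a hedged sketch of the first item above, automating one recurring report can be a few lines of pandas; the file, columns, and weekly cadence are illustrative assumptions about what such a report might look like.

```python
# A minimal sketch of automating a repetitive weekly report.
# File and column names are illustrative.
import pandas as pd

orders = pd.read_csv("weekly_orders.csv", parse_dates=["order_date"])
summary = (
    orders.groupby([pd.Grouper(key="order_date", freq="W"), "region"])["total"]
    .sum()
    .unstack(fill_value=0)          # one row per week, one column per region
)
summary.to_excel("weekly_sales_report.xlsx")   # replaces a manual copy-paste step
```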

Targeted Expertise

Many growing organizations delay modernization because they don’t have a CIO, CTO, or internal team member to lead the effort. This is where fractional executive leadership or embedded technical advisors can make a difference.

A skilled advisor provides:

  • Strategic vision: Ensuring your technology roadmap supports your goals.
  • Vendor neutrality: Recommending the right tools for your needs, not pushing a specific platform.
  • Execution support: Overseeing implementation to reduce risk and avoid costly missteps.
  • Knowledge transfer: Building your internal team’s capability over time.

At Cercana, we pair executive-grade advisory services with hands-on delivery, helping organizations modernize confidently without the overhead of a full-time executive hire. Our goal is to guide you through modernization while helping you build your internal capacity so that you can move into the future on your own.

Moving Forward with Confidence

Modernization isn’t a single project; it’s an ongoing journey that starts with understanding your goals, streamlining workflows, and building a flexible foundation. By breaking the process into manageable steps and leveraging expert guidance, you can move from feeling overwhelmed to gaining a competitive edge.

The organizations that thrive in today’s landscape are those that see technology as a growth enabler rather than a burden. With the right plan and guidance, modernization can unlock new opportunities, create efficiencies, and position your business for long-term success.

Ready to start your journey? Let’s talk. We’ll help you identify your highest-impact opportunities, design a strategy that works for your business, and deliver clarity at every step.

References

AI Business. (2023, June 21). Small-medium businesses face barriers to technology adoption. AI Business. https://aibusiness.com/verticals/small-medium-businesses-face-barriers-to-technology-adoption

Davenport, L. (2024, October 27). Business and IT alignment: Why is it so important? Davenport Group. https://davenportgroup.com/insights/business-and-it-alignment-why-is-it-so-important/

TechRadar. (2025, April 1). SMBs want to use tech more in order to grow—but costs are proving a big barrier. TechRadar Pro. https://www.techradar.com/pro/smbs-want-to-use-tech-more-in-order-to-grow-but-costs-are-proving-a-big-barrier

U.S. Chamber of Commerce. (2023, September 14). The impact of technology on U.S. small businesses. U.S. Chamber of Commerce. https://www.uschamber.com/small-business/smallbusinesstech

Header Image: Bernd Fiedler, CC BY-SA 4.0 https://creativecommons.org/licenses/by-sa/4.0, via Wikimedia Commons

A Novice Takes a Stab at GIS – Part 3

At this point in my entry-level upskilling project, the groundwork has been done. I have a polygon of the Chesapeake Bay laid over an OpenStreetMap layer and I know how to change its color. Going back to the initial post, my hope with this project is to show change over time in the crab population of the Bay. As a complete novice, I don’t even know if there’s a way for me to do that in QGIS, or if I’m going to make 15 different maps with the 15 years of data and turn images of them into a .gif. So, I went back to ChatGPT for guidance.

ChatGPT recommended the Temporal Controller first. It also told me I could use style changes by time attribute, the TimeManager plugin, or the manual process I had considered of turning a series of images into a .gif.

I’ll be using the Temporal Controller since it was the first option. I asked ChatGPT for a step-by-step guide of how to do this.

Before getting bogged down in the process of creating the visualization, it’s important to have my data prepped and ready to go. I asked ChatGPT how it needed to be set up in order to use the temporal controller function.

In this case, I’ve decided not to do the thing that ChatGPT says is easier. The “Join External Time Data to Polygon” option seems to involve more data-preparation work and to be a better process to know for future projects. I began by taking a screen capture of the data table from the Maryland DNR’s Winter Dredge Survey history, uploading it into ChatGPT, and having it use its OCR capabilities to make a table that I could paste into Excel and save as a .csv.

Step 1. 

Step 2. 

Step 3. 

After looking back at some of the steps in the process of using the Temporal Controller (Step 2 above), I finished preparing the table. I went into the attribute table of the polygon and saw that it already had an assigned ID of “2250”, so I added that column. Additionally, the geometry type is a polygon, so that was added as well.
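
(For readers comfortable with a little scripting, the same preparation can be expressed in a few lines of pandas; the file and column names below are illustrative assumptions, not the DNR’s actual headers.)

```python
# A minimal sketch of the data prep described above: add the polygon's ID and a
# date column so QGIS's Temporal Controller can step through the years.
import pandas as pd

df = pd.read_csv("dredge_survey.csv")       # OCR'd table, one row per year
df["fid"] = 2250                            # matches the polygon's existing ID
df["date"] = pd.to_datetime(df["Year"].astype(str) + "-01-01")
df.to_csv("dredge_survey_temporal.csv", index=False)
```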

With that, data preparation was complete and now I’m ready to move on to joining the data table to the polygon and creating the visualization. 

A Novice Takes a Stab at GIS – Part Two

Last week, I was able to settle on what the map I was creating would illustrate and find trustworthy data to use. This week, the focus is on actually creating the map itself. To do this, I need shapefiles of the Chesapeake Bay Watershed. 

I was able to source one from the Chesapeake Bay Program at data-chesbay.opendata.arcgis.com. This took me a handful of tries, as most of the publicly available shapefiles of the Bay are a polygon of all the land and water considered to be within the Chesapeake Bay watershed. For the purposes of this map, I was looking for just the water itself.

As a reminder, this is a self-guided process where I’m using ChatGPT to guide me through learning how to use QGIS. I’ve never loaded a shapefile before and ChatGPT gave me clear instructions.

In order to load the shapefile into QGIS, I dragged the downloaded folder, which included .shp, .xml, .shx, .prj, .dbf, and .cpg files, into a blank new project. I felt a brief moment of triumph before realizing that getting the land surrounding the Bay into the project would likely not be as simple, but it was actually even easier. 

QGIS has an OpenStreetMap layer built into the “XYZ Tiles” tab on the left side of the window. I turned it on, reordered the layers so that my shapefile of the water was over top of OSM, and that was all that needed to be done. The program had already lined up the shapefile of the Bay itself perfectly with where OSM had the Bay. 

Now it’s time to go back to Professor ChatGPT. I need to know how to change the color of the shapefile before I can even worry about assigning different colors to different levels of crab population, finding out how to automatically change the color based on data in a table, or anything else. 

Just to practice, I made the Bay crimson. 

Step 1. 

Step 2.

Step 3.

In my next post, I’ll be going back to ChatGPT to learn how I can set up a table of data and instruct QGIS to change the color of the water based on the data in said table. I’m not sure how that will work or look yet, but that’s part of the learning. 

Reframing Location Intelligence From Where to Why

Location intelligence is becoming increasingly central to enterprise analytics, with organizations in sectors such as retail, logistics, and financial services integrating geospatial data into decision-making systems. A 2016 McKinsey report projected that data-driven decision-making could generate trillions in economic value, with location data playing a key role in operational and strategic improvements (Manyika et al., 2016). Yet too often, location intelligence stops at the “where,” relying on maps, heatmaps, and dashboards that answer where events occur but fail to uncover why they happen. In a world where spatial data is richer and more interconnected than ever, it’s time to reframe the question.

Beyond Map-Centric Thinking

In a previous post, we explored how geospatial thinking often extends beyond visual maps and into the structure of the data itself. This post builds on that perspective by examining how organizations can move from observing where things happen to understanding why they happen.

Traditional geospatial tools have served us well by constructing maps from layers of information, showing us densities, boundaries, and movements. But these layers often describe what is happening, not what is causing it. They prioritize representation over interpretation.

As geographer Rob Kitchin observed, data infrastructures are shaped by what they are designed to reveal, and too often, spatial tools are built around display rather than reasoning (Kitchin, 2014). A map may show that customer churn is higher in certain neighborhoods, but it won’t explain the underlying factors, such as infrastructure decay, service gaps, or shifting demographics. The real opportunity lies not in seeing where something happens, but in understanding why it happens there, and what to do about it.

Why ‘Why’ Matters

In this context, understanding why means uncovering the underlying factors, influences, and sequences that drive spatial events. This goes beyond simply observing patterns to reveal the relationships and conditions that cause them. At its core, this means identifying causal factors (what directly or indirectly triggers an event), recognizing spatial influence (how neighboring locations or connected networks impact outcomes), and analyzing temporal sequences (how events unfold over time and shape one another).

To uncover the why, organizations must expand beyond latitude and longitude. They must analyze relationships, influences, and sequences that affect outcomes. This means incorporating spatial-temporal data, behavioral context, and causal modeling into their workflows.

For example:

  • Why do outages cluster in specific parts of a grid?
  • Why do certain stores underperform despite high foot traffic?
  • Why does a transportation route fail under specific weather conditions?

These questions require a shift from descriptive to diagnostic and predictive reasoning. As Harvey Miller emphasized in his work on time geography, it’s essential to understand how entities move through space over time, and how those movements interact (Miller, 2005).

Enabling the Shift from Where to Why

Several techniques support this evolution:

  • Spatial-temporal modeling captures how patterns change over time and space, useful for everything from crime forecasting to disease tracking.
  • Graph-based spatial reasoning allows entities to be analyzed in networks of relationships; for example, how upstream supply chain disruptions propagate downstream.
  • Machine learning models can incorporate spatial lag and neighborhood context as predictive features, treating geography as more than metadata.

Spatial-temporal modeling has proven essential in forecasting dynamic phenomena such as urban crime, traffic congestion, and disease spread. For instance, spatial-temporal models were central to COVID-19 response strategies, enabling public health officials to predict transmission hotspots and allocate resources accordingly (Yang et al., 2020).

Graph-based spatial reasoning enhances the ability to model systems as interconnected networks rather than isolated locations. This is especially useful in domains like disaster response and logistics. Recent research by Attah et al. (2024) explores how AI-driven graph analytics can improve supply chain resilience by revealing hidden interdependencies and points of failure across logistics networks.

Machine learning techniques are increasingly integrating spatial features to improve prediction accuracy. By incorporating spatial lag, the influence of neighboring observations, models can more accurately predict property values, infrastructure failure, or customer churn. The PySAL library, for example, supports spatial regression and clustering techniques that extend traditional ML approaches to account for spatial dependence (Rey & Anselin, 2010).
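
As a brief sketch of that idea, PySAL’s libpysal package can turn a neighborhood definition into a ready-to-use model feature; the dataset and column names are illustrative assumptions.

```python
# A minimal sketch of a spatial-lag feature: each observation gets the average
# value of its nearest neighbors as an extra column. Names are illustrative.
import geopandas as gpd
from libpysal.weights import KNN, lag_spatial

props = gpd.read_file("sales.gpkg")
w = KNN.from_dataframe(props, k=5)     # neighborhood: 5 nearest properties
w.transform = "r"                      # row-standardize so the lag is an average
props["price_lag"] = lag_spatial(w, props["sale_price"].to_numpy())
# price_lag now joins the feature set like any other numeric column
```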

A wide range of modern technologies now support advanced spatial reasoning and spatial-temporal analytics at scale. These include open-source databases like PostgreSQL with PostGIS for spatial querying, graph databases such as Neo4j for topological reasoning, and analytical libraries like PySAL for spatial econometrics and clustering. Complementing these are cloud-native tools and formats that enhance scalability, flexibility, and real-time responsiveness. Columnar storage formats like Parquet and Zarr, distributed processing frameworks such as Apache Spark and Delta Lake, and streaming platforms like Kafka all enable organizations to model space and time as interconnected dimensions, moving beyond static maps toward continuous, context-aware decision-making.

Such methods shift the focus from identifying where something happened to uncovering why it happened by revealing the spatial dependencies, temporal sequences, and system-level interactions that drive outcomes. Far from being merely theoretical, these techniques are already delivering measurable impact across a wide range of sectors including public health, logistics, urban planning, and infrastructure. Organizations that embrace them are better positioned to make timely, data-driven decisions grounded in a deeper understanding of cause and context.

How Cercana Helps

At Cercana Systems, we help clients build deep-stack geospatial solutions that go beyond visualization. Our expertise lies in:

  • Designing data architectures that integrate spatial, temporal, and behavioral signals
  • Embedding spatial relationships into data pipelines
  • Supporting location-aware decision-making across logistics, infrastructure, and public services

We help clients uncover the deeper patterns and relationships within their data that inform not just what is happening, but why it’s happening and what actions to take in response.

Conclusion

The future of location intelligence lies not in better maps, but in better questions. As spatial data grows in scope and complexity, organizations must look beyond cartography and embrace spatial reasoning. Reframing the question from “Where is this happening?” to “Why is this happening here?” opens the door to more strategic, informed, and adaptive decision-making.

References

Attah, R. U., Garba, B. M. P., Gil-Ozoudeh, I., & Iwuanyanwu, O. (2024). Enhancing supply chain resilience through artificial intelligence: Analyzing problem-solving approaches in logistics management. International Journal of Management & Entrepreneurship Research, 6(12), 3883–3901. https://doi.org/10.51594/ijmer.v6i12.1745

Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. SAGE Publications. https://doi.org/10.4135/9781473909472

Manyika, J., Chui, M., Brown, B., et al. (2016). The age of analytics: Competing in a data-driven world. McKinsey Global Institute. https://www.mckinsey.com/capabilities/quantumblack/our-insights/the-age-of-analytics-competing-in-a-data-driven-world

Miller, H. J. (2005). A measurement theory for time geography. Geographical Analysis, 37(1), 17–45. https://doi.org/10.1111/j.1538-4632.2005.00575.x

Rey, S. J., & Anselin, L. (2010). PySAL: A Python library of spatial analytical methods. In Handbook of Applied Spatial Analysis (pp. 175–193). Springer. https://doi.org/10.1007/978-3-642-03647-7_11

Shekhar, S., Evans, M. R., Gunturi, V. M., Yang, K., & Abdelzaher, T. (2014). Spatial big-data challenges intersecting mobility and cloud computing. 2012 NSF Workshop on Social Networks and Mobility in the Cloud. http://dx.doi.org/10.1145/2258056.2258058

Yang, W., Zhang, D., Peng, L., Zhuge, C., & Hong, L. (2020). Rational evaluation of various epidemic models based on the COVID-19 data of China. Mathematical Biosciences and Engineering, 17(3), 3051–3064. https://doi.org/10.1016/j.epidem.2021.100501

A Novice Takes a Stab at GIS

For the last handful of months, I’ve been working with my father’s company, Cercana Systems, to assist with content marketing and business development. In college, I finished most of a public relations degree at The University of Alabama before the first of my two daughters graced us with her presence and we decided to move back home to Maryland in order for her to actually have her extended family in her life. Since that time, I’ve found myself wanting more and more to find a way to help build things that provide a more tangible contribution to the world around me. Then, two weeks ago, I had the opportunity to attend FedGeoDay 2025 and decided it was time to start teaching myself some novice-level GIS skills. 

This year’s FedGeoDay theme of “open resilience” brought out a host of presenters who were using open geospatial ecosystems to do highly critical work in disaster prediction, preparation, and response. Everyone there was doing work that was well worth doing.

That brings us to the purpose of this blog post. It is the first in a series of undetermined length about me teaching myself how to use QGIS as a first foray into learning GIS skills. I am a complete novice with no coding or technical background of any kind. My hope is that documenting the process of attempting to make my first data-visualization in a commonly used tool may help others that want to delve into the field get past any apprehension they have about getting off the starting block. 

Step 1 – Pick a topic and find data

Initially, I had wanted to create a handful of heat maps that showed the change in the number of youth lacrosse programs in the mid-Atlantic over the last 15 years. I turned to ChatGPT to source the data, but the engine was unable to find historical records for me to use.

“I want to build a heat map showing change over time in the concentration of youth lacrosse programs in the states of Maryland, Virginia, Delaware, New Jersey, Pennsylvania, North Carolina, West Virginia, and the District of Columbia. Could you scour the internet for data that I can use for this and put the data in a spreadsheet? The spread sheet should have three columns. The first column will indicate the year, the second will indicate the county, and the third should indicate the number of youth teams in that county. Let’s look at Spring teams only, not Fall, Winter, or Summer. Data should go back 15 years (2010-2025). Let’s consider “youth lacrosse” to be teams for children as old as 8th grade or 14 years of age. Some teams will use the “under X” age categorization, some will use the projected year of high school graduation, and some will just indicate a standard age range or grade level for the children it caters to, so be sure to look out for all of those. Can you do this?”

ChatGPT replied:

Compiling a comprehensive dataset detailing the number of youth lacrosse teams by county from 2010 to 2025 across Maryland, Virginia, Delaware, New Jersey, Pennsylvania, North Carolina, West Virginia, and the District of Columbia presents significant challenges. The primary hurdle is the lack of publicly available, granular historical data spanning this 15-year period. While some organizations provide current information, historical records at the county level are scarce.

Available Data and Resources

  • Maryland Youth Lacrosse Association (MYLA): Oversees youth lacrosse in Maryland, offering current season details and age groupings. However, historical data by county is not readily accessible on their website (mylalax.com).

  • Western Maryland Youth Lacrosse Conference (WMYLC): Covers counties in Western Maryland and parts of neighboring states, with a focus on promoting youth lacrosse. Specific team counts by county and year are not provided online.

  • Anne Arundel County Recreation and Parks: Provides detailed information on current boys’ lacrosse leagues, including age divisions and game schedules. Historical data is not available on their platform (Anne Arundel County Government).

  • Inside Lacrosse Articles: Discuss trends in lacrosse participation, noting shifts in player demographics and regional growth. These articles offer insights but lack specific county-level data over the desired time frame (Inside Lacrosse).

Challenges in Data Compilation

  • Data Availability: Most organizations do not maintain or publicly share historical team counts by county.

In the real world, sourcing and compiling accurate data can often be the bulk of the work done on a GIS project. However, the point of this pursuit is to learn how to use QGIS, not to spend countless hours calling up the Parks and Rec department of every county in the mid-Atlantic, so I decided to pivot to something else.

So now, I’m looking for historical data over the last 15 years on the blue crab population in various sections of the Chesapeake Bay estuary. My new goal will be to create one map that shows the places where the population has increased the most, increased the least, and even decreased since 2010. 

This information was readily available from Maryland’s Department of Natural Resources, with one caveat. 

There was plenty of data on blue crab population available, but I wasn’t finding any that was split up into regions of the Bay. Nonetheless, creating the map and shading the entire Bay based on each year’s percent change in population density from the median is a good beginner project for learning QGIS, so we’re rolling with it.

Step 2 – Installing QGIS

While it may seem like a silly step to document, this is supposed to be a properly novice guide to making a map in QGIS, and it’s a touch difficult to do that without installing the program. The machine I’m using is a 2020 M1 MacBook Air running Sonoma 14.6.1. I downloaded the installer for the “long term” version of QGIS from qgis.org, went through the install process, and attempted to open it.

Naturally, my MacBook was less than thrilled that I was attempting to run a program that I hadn’t downloaded from the App Store. It was completely blocking me from running the software when I opened it from the main application navigation screen. This issue was resolved by going to the “Applications” folder in Finder and using the Control+click method. A warning popped up about not being able to verify that the application contained no malware; I ran it anyway, and I have not had any issues opening the application since then.

The next step will be to actually crack QGIS open and begin creating a map of the Chesapeake Bay. 

Geospatial Without Maps

When most people hear “geospatial,” they immediately think of maps. But in many advanced applications, maps never enter the picture at all. Instead, geospatial data becomes a powerful input to machine learning workflows, unlocking insights and automation in ways that don’t require a single visual.

At its core, geospatial data is structured around location—coordinates, areas, movements, or relationships in space. Machine learning models can harness this spatial logic to solve complex problems without ever generating a map. For example:

  • Predictive Maintenance: Utility companies use the GPS coordinates of assets (like transformers or pipelines) to predict failures based on environmental variables like elevation, soil type, or proximity to vegetation (AltexSoft, 2020). No map is needed—only spatially enriched feature sets for training the model.
  • Crop Classification and Yield Prediction: Satellite imagery is commonly processed into grids of numerical features (such as NDVI indices, surface temperature, soil moisture) associated with locations. Models use these purely as tabular inputs to predict crop types or estimate yields (Dash, 2023).
  • Urban Mobility Analysis: Ride-share companies model supply, demand, and surge pricing based on geographic patterns. Inputs like distance to transit hubs, density of trip starts, or average trip speeds by zone feed machine learning models that optimize logistics in real time (MIT Urban Mobility Lab, n.d.).
  • Smart Infrastructure Optimization: Photometrics AI employs geospatial AI to enhance urban lighting systems. By integrating spatial data and AI-driven analytics, it optimizes outdoor lighting to ensure appropriate illumination on streets, sidewalks, crosswalks, and bike lanes while minimizing light pollution in residential areas and natural habitats. This approach not only improves safety and energy efficiency but also supports environmental conservation efforts (EvariLABS, n.d.).

These examples show how spatial logic—such as spatial joins, proximity analysis, and zonal statistics—can drive powerful workflows even when no visualization is involved. In each case, the emphasis shifts from presenting information to enabling analysis and automation. Features are engineered based on where things are, not just what they are. However, once the spatial context is baked into the dataset, the model itself treats location-derived features just like any other numerical or categorical variable.
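
A hedged sketch of that shift: the spatial work happens once, up front, and the model sees only ordinary columns. The file names, columns, and projection below are illustrative assumptions.

```python
# A minimal sketch of engineering location-derived features with no map involved.
# File names, columns, and CRS are illustrative.
import geopandas as gpd

assets = gpd.read_file("transformers.gpkg").to_crs(epsg=5070)      # meters-based CRS
vegetation = gpd.read_file("vegetation.gpkg").to_crs(epsg=5070)

# Distance to the nearest vegetation patch becomes an ordinary numeric feature
assets = gpd.sjoin_nearest(
    assets, vegetation[["geometry"]], distance_col="dist_to_vegetation_m"
)
X = assets[["elevation_m", "soil_class", "dist_to_vegetation_m"]]  # model-ready table
```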

Using geospatial technology without maps allows organizations to focus on operational efficiency, predictive insights, and automation without the overhead of visualization. In many workflows, the spatial relationships between objects are valuable as data features rather than elements needing human interpretation. By integrating geospatial intelligence directly into machine learning models and decision systems, businesses and governments can act on spatial context faster, at scale, and with greater precision.

To capture these relationships systematically, spatial models like the Dimensionally Extended nine-Intersection Model (DE-9IM) (Clementini & Di Felice, 1993) provide a critical foundation. In traditional relational databases, connections between records are typically simple—one-to-one, one-to-many, or many-to-many—and must be explicitly designed and maintained. DE-9IM extends this by defining nuanced geometric interactions, such as overlapping, touching, containment, or disjointness, which are implicit in the spatial nature of geographic objects. This significantly reduces the design and maintenance overhead while allowing for much richer, more dynamic spatial relationships to be leveraged in analysis and workflows.

By embedding DE-9IM spatial predicates into machine learning workflows, organizations can extract richer, context-aware features from their data. For example, rather than merely knowing two infrastructure assets are ‘related,’ DE-9IM enables classification of whether one is physically inside a risk zone, adjacent to a hazard, or entirely separate—substantially improving the precision of classification models, risk assessments, and operational planning.
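
As a small illustration, the shapely library exposes both the full DE-9IM matrix and the named predicates derived from it; the geometries here are invented for demonstration.

```python
# A minimal sketch of DE-9IM predicates as machine-readable features.
# Geometries are illustrative.
from shapely.geometry import Point, Polygon

risk_zone = Polygon([(0, 0), (4, 0), (4, 4), (0, 4)])
asset = Point(2, 2).buffer(0.5)                    # a small footprint polygon

print(asset.relate(risk_zone))                     # full DE-9IM matrix, e.g. '2FF1FF212'
features = {
    "inside_risk_zone": asset.within(risk_zone),   # containment
    "touches_risk_zone": asset.touches(risk_zone), # adjacency
    "disjoint": asset.disjoint(risk_zone),         # no interaction
}
```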

Machine learning and AI systems benefit from the DE-9IM framework by gaining access to structured, machine-readable spatial relationships without requiring manual feature engineering. Instead of inferring spatial context from raw coordinates or designing custom proximity rules, models can directly leverage DE-9IM predicates as input features. This enhances model performance in tasks such as spatial clustering, anomaly detection, and context-aware classification, where the precise nature of spatial interactions often carries critical predictive signals. Integrating DE-9IM into AI pipelines streamlines spatial feature extraction, improves model explainability, and reduces the risk of omitting important spatial dependencies.

Harnessing geospatial intelligence without relying on maps opens up powerful new pathways for innovation, operational excellence, and automation. Whether optimizing infrastructure, improving predictive maintenance, or enriching machine learning models with spatial logic, organizations can leverage these techniques to achieve better outcomes with less overhead. At Cercana Systems, we specialize in helping clients turn geospatial data into actionable insights that drive real-world results. Ready to put geospatial AI to work for you? Contact us today to learn how we can help you modernize and optimize your data-driven workflows.

References

AltexSoft. (2020). Predictive maintenance: Employing IIoT and machine learning to prevent equipment failures. AltexSoft. https://www.altexsoft.com/blog/predictive-maintenance/

Clementini, E., & Di Felice, P. (1993). A model for representing topological relationships between complex geometric objects. ACM Transactions on Information Systems, 11(2), 161–193. https://doi.org/10.1016/0020-0255(95)00289-8

Dash, S. K. (2023, May 10). Crop classification via satellite image time-series and PSETAE deep learning model. Medium. https://medium.com/geoai/crop-classification-via-satellite-image-time-series-and-psetae-deep-learning-model-c685bfb52ce

EvariLABS. (2025, April 14). Photometrics AI. https://www.linkedin.com/pulse/what-counts-real-roi-streetlight-owners-operators-photometricsai-vqv7c/

MIT Urban Mobility Lab. (n.d.). Machine learning for transportation. Massachusetts Institute of Technology. https://mobility.mit.edu/machine-learning

Reflections on the Process of Planning FedGeoDay 2025

What is FedGeoDay?

FedGeoDay is a single-track conference dedicated to federal use-cases of open geospatial ecosystems. The open ecosystems have a wide variety of uses and forms, but largely include anything designed around open data, open source software, and open standards. The main event is a one day commitment and is followed by a day of optional hands-on workshops. 

FedGeoDay has existed for roughly a decade, serving as a day of learning, networking, and collaboration in the Washington, D.C. area. Recently, Cercana Systems president Bill Dollins was invited to join the planning committee, and served as one of the co-chairs for FedGeoDay 2024 and 2025. His hope is that attendees are able to come away with practical examples of how to effectively use open geospatial ecosystems in their jobs.

Photo courtesy of OpenStreetMap US on LinkedIn.

“Sometimes the discussion around those concepts can be highly technical and even a little esoteric, and that’s not necessarily helpful for someone who’s just got a day job that revolves around solving a problem. Events like this are very helpful in showing practical ways that open software and open data can be used.”

Dollins joined the committee for a multitude of reasons. In this post, we will explore some of his reasons for joining, as well as what he thinks he brings to the table in planning the event and things he has learned from the process. 

Why did you join the committee?

When asked for some of the reasons why he joined the planning committee for FedGeoDay, Dollins indicated that his primary purpose was to give back to a community that has been very helpful and valuable to him throughout his career in a very hands-on way. 

“In my business, I derive a lot of value from open-source software. I use it a lot in the solutions I deliver in my consulting, and when you’re using open-source software you should find a way that works for you to give back to the community that developed it. That can come in a number of ways. That can be contributing code back to the projects that you use to make them better. You can develop documentation for it, you can provide funding, or you can provide education, advocacy, and outreach. Those last three components are a big part of what FedGeoDay does.”

He also says that while being a co-chair of such an impactful event helps him maintain visibility in the community, getting the opportunity to keep his team working skills fresh was important to him, too. 

“For me, also, I’m self-employed. Essentially, I am my team,” said Dollins. “It can be really easy to sit at your desk and deliver things and sort of lose those skills.”

What do you think you brought to the committee?

Dollins has had a long career in the geospatial field and has spent the majority of his time in leadership positions, so he was confident in his ability to contribute in this new form of leadership role. Event planning is a beast of its own, but early on in the more junior roles of his career, the senior leadership around him went out of their way to teach him about project cost management, staffing, and planning agendas. He then was able to take those skills into a partner role at a small contracting firm where he wore every hat he could fit on his head for the next 15 years, including still doing a lot of technical and development work. Following his time there, he had the opportunity to join the C-suite of a private sector SaaS company and was there for six years, really rounding out his leadership experience. 

He felt one thing he was lacking in was experience in community engagement, and event planning is a great way to develop those skills. 

“Luckily, there’s a core group of people who have been planning and organizing these events for several years. They’re generally always happy to get additional help and they’re really encouraging and really patient in showing you the rules of the road, so that’s been beneficial, but my core skills around leadership were what applied most directly. It also didn’t hurt that I’ve worked with geospatial technology for over 30 years and open-source geospatial technology for almost 20, so I understood the community these events serve and the technology they are centered around,” said Dollins.

Photo courtesy of Ran Goldblatt on LinkedIn.

What were some of the hard decisions that had to be made?

Photo Courtesy of Cercana Systems on LinkedIn.

Attendees of FedGeoDay in previous years will likely remember that, in the past, the event has always been free for feds to attend. The planning committee, upon examining the revenue sheets from last year’s event, noted that the single largest unaccounted-for cost was the free luncheon. A post-event survey was sent out, and federal attendees largely indicated that they would not take issue with contributing $20 to cover the cost of lunch. However, the landscape of the community changed in a manner most people did not see coming.

“We made the decision last year, and keep in mind the tickets went on sale before the change of administration, so at the time we made the decision last year it looked like a pretty low-risk thing to do,” said Dollins.

Dollins added that while the landscape shifts any time the administration changes, even when the party in power stays the same, this change has been particularly jarring.

“There’s probably a case to be made that we could have bumped up the cost of some of the sponsorships and possibly the industry tickets a little bit and made an attempt to close the gap that way. We’ll have to see what the numbers look like at the end. The most obvious variable cost was the cost of lunches against the free tickets, so it made sense to do last year and we’ll just have to look and see how the numbers play out this year.”**

What have you taken away from this experience?

Dollins says one of his biggest takeaways from helping to plan FedGeoDay has been learning to apply leadership in a different context. Throughout most of his career, he has led more traditional team structures with a clearly defined hierarchy and specified roles. Working with a team of volunteers who have day jobs of their own to attend to requires a different approach.

“Everyone’s got a point of view, everyone’s a professional and generally a peer of yours, and so there’s a lot more dialogue. The other aspect is that it also means everyone else has a day job, so sometimes there’s an important meeting and the one person that you needed to be there couldn’t do it because of that. You have to be able to be a lot more asynchronous in the way you do these things. That’s a good thing to give you a different approach to leadership and teamwork,” said Dollins on the growth opportunity.

Dollins has even picked up new work from his efforts on the planning committee, by virtue of getting to work and network with people who weren’t necessarily in his circle beforehand. Though he’s worked in the geospatial field for 30 years and focused heavily on open-source work for 20, he says he felt hidden away from the community, in a sense, during his time in the private sector.


“This has helped me get back circulating in the community and to be perceived in a different way. In my previous iterations, I was seen mainly from a technical perspective, and so this has kind of helped me let the community see me in a different capacity, which I think has been beneficial.”

FedGeoDay 2025 has concluded and was a huge success for all involved. Cercana Systems looks forward to continuing to sponsor the event going forward, and Dollins looks forward to continuing to help this impactful event bring the community together in the future. 


**This interview was conducted before FedGeoDay 2025 took place. The event exceeded the attendance levels of FedGeoDay 2024. 

FedGeoDay 2025 Highlights

The Cercana Systems team had a wonderful time attending FedGeoDay 2025 in Washington, D.C.! It was fun to catch up with long-time colleagues, make new professional connections, and learn how a wide array of new projects are contributing to the ever-evolving world of open geospatial ecosystems. 

A standout highlight was the in-depth keynote by Katie Picchione of NASA’s Disasters Program on the critical role played by open geospatial data in disaster response. Additionally, Ryan Burley of GeoSolutions moderated an excellent panel on Open-Source Geospatial Applications for Resilience, and Eddie Pickle of Crunchy Data led an energetic panel on Open Data for Resilience. 

We were especially excited about the “Demystifying AI” panel, moderated by Cercana’s president Bill Dollins, with panelists Emily Kalda of RGi, Jason Gilman of Element 84, Ran Goldblatt of New Light Technologies, and Jackie Kazil of Bana Solutions.

Location is an increasingly important component of cybersecurity, and FedGeoDay featured a fireside chat on the topic led by Ashley Fairman of DICE Cyber. On either side of the lunch break, Wayne Hawkins of RGi moderated a series of informative lightning talks on a range of topics.

FedGeoDay was a content-rich event that was upbeat from beginning to end. We are grateful to all of the presenters and panelists for taking the time to share their knowledge and to the organizing committee for their work in pulling together such a high-quality event. Cercana is proud to support FedGeoDay and looks forward to continuing to do so for years to come.

Cercana At FedGeoDay

Cercana Systems is excited to share that our entire team will be in attendance at FedGeoDay 2025! This is a great opportunity to meet with us face-to-face and learn more about our capabilities and the work we do. The event takes place on April 22, 2025, at the Department of the Interior’s Yates Auditorium in Washington, D.C.

Company President Bill Dollins will be moderating a panel discussion on “Demystifying AI” at 4 p.m. The panel will feature input from multiple experts from across the geospatial and AI communities. 

We’re looking forward to meeting and engaging with a host of people from around the country who utilize, contribute to, and advocate for open geospatial ecosystems. We hope to see you there!

Why Young Professionals Should Get Out of the Office and Into Industry Events

In today’s fast-paced professional world, it’s easy for young professionals to assume that hard work alone will get them ahead. While grinding at the desk and delivering results matters, relying solely on your work to speak for itself may leave you overlooked in a competitive field. Getting out of the office and into local conferences, workshops, and networking events can provide invaluable opportunities that simply can’t be replicated from behind a desk.

Build Meaningful Professional Relationships

Networking remains one of the most powerful tools for career growth. According to a 2023 LinkedIn survey, 85% of job roles are filled through networking, not traditional applications. Attending local conferences puts you face-to-face with people in your industry—from potential mentors and collaborators to future employers and clients. These relationships can open doors to new opportunities that might never make it to job boards or public listings.

Stay Current With Industry Trends

Local events are also a great way to keep your knowledge sharp and up to date. Industry leaders often use conferences as platforms to discuss the latest trends, tools, and innovations. The Harvard Business Review emphasizes that staying current with changes in your field helps you remain relevant and competitive, especially in industries being rapidly transformed by technology and globalization (HBR, 2021).

Showcase Yourself Beyond the Resume

When you attend events, you get the chance to show people not just what you do—but how you do it. Your communication style, curiosity, and initiative become part of the impression you make. This visibility can lead to referrals, collaborations, or speaking invitations, all of which enhance your professional reputation in ways your LinkedIn profile alone cannot.

Gain Confidence and Soft Skills

Finally, stepping into a room full of peers and industry veterans can be intimidating—but it’s also empowering. Each interaction hones your communication skills, boosts your confidence, and teaches you how to navigate complex social dynamics in a professional context—critical soft skills that employers value highly.

Bottom Line

If you’re a young professional looking to grow, staying in your comfort zone won’t cut it. Attending local conferences and events is more than just networking—it’s about investing in your personal and professional development. Get out there, be visible, and let the right people see what you’re capable of.


Demystifying the Medallion Architecture for Geospatial Data Processing

Introduction

Geospatial data volumes and complexity are growing due to diverse sources such as GPS, satellite imagery, and sensor data. Traditional geospatial processing methods face challenges, including scalability, handling various formats, and ensuring data consistency. The medallion architecture offers a layered approach to data management, improving data processing, reliability, and scalability. While the medallion architecture is often associated with specific implementations such as Delta Lake, its concepts are applicable to other technical implementations. This post introduces the medallion architecture and discusses two workflows—traditional GIS-based and advanced cloud-native—to demonstrate how it can be applied to geospatial data processing.

Overview of the Medallion Architecture

The medallion architecture was developed to address the need for incremental, layered data processing, especially in big data and analytics environments. It is composed of three layers:

  • Bronze Layer: Stores raw data as-is from various sources.
  • Silver Layer: Cleans and transforms data for consistency and enrichment.
  • Gold Layer: Contains aggregated and optimized data ready for analysis and visualization.

The architecture is particularly useful in geospatial applications due to its ability to handle large datasets, maintain data lineage, and support both batch and real-time data processing. This structured approach ensures that data quality improves progressively, making downstream consumption more reliable and efficient.
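
To make the layering concrete, here is a minimal sketch of a bronze-to-gold flow for a vector dataset, using Python with GeoPandas; the file paths, the "region" and "obs_id" columns, and the cleaning rule are hypothetical.

    # Minimal medallion sketch for vector data (paths and columns are hypothetical).
    import geopandas as gpd

    # Bronze: read the raw data exactly as delivered.
    raw = gpd.read_file("bronze/field_observations.shp")

    # Silver: clean and standardize -- drop empty geometries, use a common CRS.
    silver = raw[~raw.geometry.is_empty].to_crs(epsg=4326)
    silver.to_file("silver/field_observations_clean.gpkg", driver="GPKG")

    # Gold: aggregate for consumption -- e.g., observation counts per region.
    gold = silver.dissolve(by="region", aggfunc={"obs_id": "count"})
    gold.to_file("gold/observations_by_region.gpkg", driver="GPKG")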

Why Geospatial Data Architects Should Consider the Medallion Architecture

Geospatial data processing involves unique challenges, such as handling different formats (raster, vector), managing spatial operations (joins, buffers), and accommodating varying data sizes. Traditional methods struggle when scaling to large, real-time datasets or integrating data from multiple sources. The medallion architecture addresses these challenges through its layered approach. The bronze layer preserves the integrity of raw data, allowing transformations to be traced easily. The silver layer handles transformations of the data, such as projections, spatial joins, and data enrichment. The gold layer provides ready-to-consume, performance-optimized data for downstream systems.

Example Workflow 1: Traditional GIS-Based Workflow  

For organizations that rely on established GIS tools or operate with limited cloud infrastructure, the medallion architecture provides a structured approach to data management while maintaining compatibility with traditional workflows. This method ensures efficient handling of both vector and raster data, leveraging familiar GIS technologies while optimizing data accessibility and performance.  

This workflow integrates key technologies to support data ingestion, processing, and visualization. FME serves as the primary ETL tool, streamlining data movement and transformation. Object storage solutions like AWS S3 or Azure Blob Storage store raw spatial data, ensuring scalable and cost-effective management. PostGIS enables spatial analysis and processing for vector datasets. Cloud-Optimized GeoTIFFs (COGs) facilitate efficient access to large raster datasets by allowing partial file reads, reducing storage and processing overhead. 

Bronze – Raw Data Ingestion 

The process begins with the ingestion of raw spatial data into object storage. Vector datasets, such as Shapefiles and CSVs containing spatial attributes, are uploaded alongside raster datasets like GeoTIFFs. FME plays a crucial role in automating this ingestion, ensuring that all incoming data is systematically organized and accessible for further processing.  
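
As a rough illustration of the bronze stage (the post describes FME automating this step; here is a minimal Python alternative using boto3, where the bucket and file names are placeholders and AWS credentials are assumed to be configured):

    # Land raw files under a bronze/ prefix in object storage.
    import boto3

    s3 = boto3.client("s3")
    uploads = [
        ("parcels.shp", "bronze/vector/parcels.shp"),  # a shapefile upload must
        ("parcels.dbf", "bronze/vector/parcels.dbf"),  # include its sidecar files
        ("parcels.shx", "bronze/vector/parcels.shx"),
        ("parcels.prj", "bronze/vector/parcels.prj"),
        ("imagery_scene.tif", "bronze/raster/imagery_scene.tif"),
    ]
    for local_path, key in uploads:
        s3.upload_file(local_path, "my-geo-lake", key)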

Silver – Data Cleaning and Processing

At this stage, vector data is loaded into PostGIS, where essential transformations take place. Operations such as spatial joins, coordinate system projections, and attribute filtering help refine the dataset for analytical use. Meanwhile, raster data undergoes optimization through conversion into COGs using FME. This transformation enhances performance by enabling GIS applications to read only the necessary portions of large imagery files, improving efficiency in spatial analysis and visualization.  
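
A minimal Python sketch of this silver stage follows. The post uses FME for the COG conversion, so treat the rasterio call below as an open-source stand-in; the connection string, table, and file names are hypothetical, and to_postgis() assumes GeoAlchemy2 is installed.

    # Vector: reproject and load into PostGIS for joins and filtering.
    import geopandas as gpd
    import rasterio.shutil
    from sqlalchemy import create_engine

    gdf = gpd.read_file("bronze/parcels.shp").to_crs(epsg=3857)
    engine = create_engine("postgresql://user:pass@localhost:5432/gis")
    gdf.to_postgis("parcels_silver", engine, if_exists="replace")

    # Raster: rewrite a plain GeoTIFF as a Cloud-Optimized GeoTIFF.
    rasterio.shutil.copy(
        "bronze/imagery_scene.tif",
        "silver/imagery_scene_cog.tif",
        driver="COG",
        compress="deflate",
    )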

Gold – Optimized Data for Analysis and Visualization  

Once processed, the refined vector data in PostGIS and optimized raster datasets in COG format are made available for GIS tools. Analysts and decision-makers can interact with the data using platforms such as QGIS, Tableau, or Geoserver. These tools provide the necessary visualization and analytical capabilities, allowing users to generate maps, conduct spatial analyses, and derive actionable insights.  
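
The partial-read behavior that makes COGs efficient can be seen in a few lines of rasterio; the file name and window size below are illustrative.

    # Read a small window of a COG instead of the whole scene.
    import rasterio
    from rasterio.windows import Window

    with rasterio.open("silver/imagery_scene_cog.tif") as src:
        block = src.read(1, window=Window(0, 0, 512, 512))  # 512x512 pixels only
        print(block.shape, src.crs)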

This traditional GIS-based implementation of medallion architecture offers several advantages. It leverages established GIS tools and workflows, minimizing the need for extensive retraining or infrastructure changes. It is optimized for traditional environments yet still provides the flexibility to integrate with hybrid or cloud-based analytics platforms. Additionally, it enhances data accessibility and performance, ensuring that spatial datasets remain efficient and manageable for analysis and visualization.  

By adopting this workflow, organizations can modernize their spatial data management practices while maintaining compatibility with familiar GIS tools, resulting in a seamless transition toward more structured and optimized data handling. 

Example Workflow 2: Advanced Cloud-Native Workflow  

For organizations managing large-scale spatial datasets and requiring high-performance processing in cloud environments, a cloud-native approach to medallion architecture provides scalability, efficiency, and advanced analytics capabilities. By leveraging distributed computing and modern storage solutions, this workflow enables seamless processing of vector and raster data while maintaining cost efficiency and performance.  

This workflow is powered by cutting-edge cloud-native technologies that optimize storage, processing, and version control. 

Object Storage solutions such as AWS S3, Google Cloud Storage, or Azure Blob Storage serve as the foundation for storing raw geospatial data, ensuring scalable and cost-effective data management. Apache Spark with Apache Sedona enables large-scale spatial data processing, leveraging distributed computing to handle complex spatial joins, transformations, and aggregations. Delta Lake provides structured data management, supporting versioning and ACID transactions to ensure data integrity throughout processing. RasterFrames or Rasterio facilitate raster data transformations, including operations like mosaicking, resampling, and reprojection, while optimizing data storage and retrieval.  

Bronze – Raw Data Ingestion

The workflow begins by ingesting raw spatial data into object storage. This includes vector data such as GPS logs in CSV format and raster data like satellite imagery stored as GeoTIFFs. By leveraging cloud-based storage solutions, organizations can manage and access massive datasets without traditional on-premises limitations.  

Silver – Data Processing and Transformation

At this stage, vector data undergoes large-scale processing using Spark with Sedona. Distributed spatial operations such as filtering, joins, and projections enable efficient refinement of large datasets. Meanwhile, raster data is transformed using RasterFrames or Rasterio, which facilitate operations like mosaicking, resampling, and metadata extraction. These tools ensure that raster datasets are optimized for both analytical workloads and visualization purposes.  
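
A hedged sketch of this silver stage with Apache Sedona follows; the bucket path, column names, and CRS codes are placeholders, and it assumes the apache-sedona package and its Spark jars are installed.

    # Distributed silver-stage processing with Spark + Sedona.
    from sedona.spark import SedonaContext

    config = SedonaContext.builder().appName("silver-stage").getOrCreate()
    sedona = SedonaContext.create(config)

    gps = sedona.read.option("header", True).csv("s3://my-geo-lake/bronze/gps_logs/")
    gps.createOrReplaceTempView("gps")

    # Build point geometries and project them for analysis.
    points = sedona.sql("""
        SELECT id,
               ST_Transform(ST_Point(CAST(lon AS DOUBLE), CAST(lat AS DOUBLE)),
                            'EPSG:4326', 'EPSG:3857') AS geom
        FROM gps
        WHERE lon IS NOT NULL AND lat IS NOT NULL
    """)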

Gold – Optimized Data for Analysis and Visualization

Once processed, vector data is stored in Delta Lake, where it benefits from structured storage, versioning, and enhanced querying capabilities. This ensures that analysts can access well-maintained datasets with full historical tracking. Optimized raster data is converted into Cloud-Optimized GeoTIFFs, allowing efficient cloud-based visualization and integration with GIS tools. These refined datasets can then be used in cloud analytics environments or GIS platforms for advanced spatial analysis and decision-making.  
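
Continuing the sketch above, the gold write might look like this, assuming delta-spark is configured on the cluster; geometries are serialized to WKB so the Delta table stays portable (the path is a placeholder).

    # Persist refined points as a versioned Delta table.
    gold = points.selectExpr("id", "ST_AsBinary(geom) AS geom_wkb")
    (gold.write
         .format("delta")
         .mode("overwrite")
         .save("s3://my-geo-lake/gold/gps_points"))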

This cloud-native implementation of medallion architecture provides several advantages for large-scale spatial data workflows: high scalability, enabling efficient processing of vast datasets without the constraints of traditional infrastructure; parallelized data transformations, significantly reducing processing time through distributed computing frameworks; and cloud-native optimizations, ensuring seamless integration with advanced analytics platforms, storage solutions, and visualization tools.

By adopting this approach, organizations can harness the power of cloud computing to manage, analyze, and visualize geospatial data at an unprecedented scale, improving both efficiency and insight generation.  

Comparing the Two Workflows

Aspect | Traditional Workflow (FME + PostGIS) | Advanced Workflow (Spark + Delta Lake)
Scalability | Suitable for small to medium workloads | Ideal for large-scale datasets
Technologies | FME, PostGIS, COGs, file system or object storage | Spark, Sedona, Delta Lake, RasterFrames, object storage
Processing Method | Sequential or batch processing | Parallel and distributed processing
Performance | Limited by local infrastructure or on-premise servers | Optimized for cloud-native and distributed environments
Use Cases | Small teams, traditional GIS setups, hybrid cloud setups | Large organizations, big data environments

Key Takeaways

The medallion architecture offers much-needed flexibility and scalability for geospatial data processing. It meshes well with traditional workflows using FME and PostGIS, which are effective for organizations with established GIS infrastructure. It can also be used in cloud-native workflows built on Apache Spark and Delta Lake to provide scalability for large-scale processing. Either workflow can be adapted to an organization’s technological maturity and requirements.

Conclusion

Medallion architecture provides a structured, scalable approach to geospatial data management, ensuring better data quality and streamlined processing. Whether using a traditional GIS-based workflow or an advanced cloud-native approach, this framework helps organizations refine raw spatial data into high-value insights. By assessing their infrastructure and data needs, teams can adopt the workflow that best aligns with their goals, optimizing efficiency and unlocking the full potential of their geospatial data.

Three Ways to Use GeoPandas in Your ArcGIS Workflow

Introduction

When combining open-source GIS tools with the ArcGIS ecosystem, there are a handful of challenges one can encounter: data format compatibility, interoperability issues, tool-chain fragmentation, and performance at scale come to mind quickly. However, the open-source Python library GeoPandas can be an effective way to work around these problems. The pattern is a simple series of steps – start with the data in ArcGIS, process it with the GeoPandas library, and import the results back into ArcGIS.

It is worth noting that ArcPy and GeoPandas are not mutually exclusive. Because of ArcPy’s tight coupling with ArcGIS, it may be advantageous to use ArcPy in parts of your workflow and pass your data off to GeoPandas for other parts. This post covers three specific ways GeoPandas can enhance ArcGIS workflows and why it can be better than ArcPy in some cases.

Scenario 1: Spatial Joins Between Large Datasets

Spatial joins in ArcPy can be computationally expensive and time-consuming, especially for large datasets, as they process row by row and write to disk. GeoPandas’ gpd.sjoin() provides a more efficient in-memory alternative for point-to-polygon and polygon-to-polygon joins, leveraging Shapely’s spatial operations. While GeoPandas can be significantly faster for moderately large datasets that fit in memory, ArcPy’s disk-based approach may handle extremely large datasets more efficiently. GeoPandas also simplifies attribute-based filtering and aggregation, making it easier to summarize data—such as joining customer locations to sales regions and calculating total sales per region. Results can be exported to ArcGIS-compatible formats, though conversion is required. For best performance, enabling spatial indexing (gdf.sindex) in GeoPandas is recommended.
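
A minimal sketch of the sales-per-region example reads as follows; the file and column names are hypothetical, and sjoin uses the spatial index automatically when one is available.

    # Point-in-polygon join, then sales aggregation per region.
    import geopandas as gpd

    customers = gpd.read_file("customers.gpkg")    # points with a "sales" column
    regions = gpd.read_file("sales_regions.gpkg")  # polygons with a "region_id"

    customers = customers.to_crs(regions.crs)      # align CRS before joining
    joined = gpd.sjoin(customers, regions, how="inner", predicate="within")
    totals = joined.groupby("region_id")["sales"].sum().reset_index()

    # Export for ArcGIS (the conversion step noted above).
    regions.merge(totals, on="region_id").to_file("region_sales.shp")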

Image: Bplewe, CC BY-SA 4.0 (https://creativecommons.org/licenses/by-sa/4.0), via Wikimedia Commons

Scenario 2: Geometric Operations (Buffering, Clipping, and Dissolving Features)

Buffering and dissolving in ArcPy can be memory-intensive and time-consuming, particularly for large or complex geometries. Preprocessing geometries with functions like buffer(), clip(), and dissolve() before importing them back into ArcGIS is an effective solution to that problem. These functions can make a multitude of processes more efficient: they can create buffer zones around road networks, dissolve any overlapping zones, and export the results as a new feature class for ArcGIS-based impact analysis.

With regard to geometry processing, these functions can be cleaner and more efficient than their ArcPy counterparts and require fewer steps to carry out. They also integrate well with data science workflows through their pandas-like syntax.
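
Here is a minimal sketch of the road-buffer example; the paths, the 100 m distance, and the projected CRS (EPSG:26918) are assumptions.

    # Buffer roads, dissolve overlaps, clip to a study area, hand back to ArcGIS.
    import geopandas as gpd

    roads = gpd.read_file("roads.gpkg").to_crs(epsg=26918)  # buffering needs a projected CRS

    buffers = roads.buffer(100)                             # 100 m zones
    zones = gpd.GeoDataFrame(geometry=buffers).dissolve()   # merge overlapping zones

    study_area = gpd.read_file("study_area.gpkg").to_crs(zones.crs)
    zones = gpd.clip(zones, study_area)

    zones.to_file("impact_zones.shp")  # import into ArcGIS as a feature class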

Below is a detailed side-by-side comparison of GeoPandas and ArcPy for spatial analysis operations, specifically focusing on buffering and dissolving tasks.

Aspect | GeoPandas 🐍 | ArcPy 🌎
Processing Speed | Faster for medium-sized datasets due to vectorized NumPy/Shapely operations; slows down with very large datasets. | Slower for smaller datasets but optimized for large-scale GIS processing due to disk-based operations.
Memory Usage | Fully in-memory; efficient for moderately large data but can struggle with very large datasets. | Uses ArcGIS’s optimized storage and caching mechanisms, which help handle large datasets without running out of RAM.
Ease of Use | Requires fewer lines of code; syntax is cleaner for many operations. | More verbose; requires handling geoprocessing environments and ArcPy-specific data structures.
Buffering Capabilities | Uses GeoSeries.buffer(distance); efficient but requires a projected CRS. | arcpy.Buffer_analysis(); supports geodesic buffers and handles larger datasets more reliably.
Dissolve Functionality | GeoDataFrame.dissolve(by="column"); vectorized and fast for reasonably large data. | arcpy.Dissolve_management(); slower for small datasets but scales better for massive datasets.
Coordinate System Handling | Requires explicit CRS conversion for accurate distance-based operations. | Natively supports geodesic buffering (without requiring projection changes).
Data Formats | Works with GeoDataFrames; exports to GeoJSON, Shapefile, Parquet, etc. | Works with File Geodatabases (.gdb), Shapefiles, and enterprise GIS databases.
Integration with ArcGIS | Requires conversion (e.g., gdf.to_file("data.shp")) before using results in ArcGIS. | Seamless integration with ArcGIS software and services.
Parallel Processing Support | Limited parallelism (can use Dask or multiprocessing as workarounds). | Can leverage ArcGIS Pro’s built-in multiprocessing tools.
License Requirements | Open-source, free to use. | Requires an ArcGIS license.

Scenario 3: Bulk Updates and Data Cleaning

When performing bulk updates (e.g., modifying attribute values, recalculating fields, or updating geometries), ArcPy and GeoPandas have different approaches and performance characteristics. ArcPy uses a cursor-based approach, applying updates row by row. GeoPandas uses an in-memory GeoDataFrame and vectorized operations via the underlying pandas library. This can make GeoPandas orders of magnitude faster than ArcPy for bulk updates, but it can be memory-intensive. Modern computing systems generally have a lot of memory, so this is rarely a concern, but if you are working in a memory-constrained environment, ArcPy may suit your needs better.
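
A short sketch of the vectorized style follows; the column names and the threshold are hypothetical.

    # Bulk field recalculation and conditional update, no cursors required.
    import geopandas as gpd

    gdf = gpd.read_file("parcels.gpkg").to_crs(epsg=26918)

    gdf["area_ha"] = gdf.geometry.area / 10_000         # recalculate every row at once
    gdf.loc[gdf["area_ha"] > 50, "category"] = "large"  # conditional attribute update

    gdf.to_file("parcels_updated.gpkg", driver="GPKG")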

Here is a side-by-side comparison:

Feature | GeoPandas 🐍 | ArcPy 🌎
Processing Model | Uses an in-memory GeoDataFrame for updates (vectorized with pandas). | Uses a cursor-based approach (UpdateCursor), modifying records row by row.
Speed | Faster for large batch updates (leverages NumPy, vectorized operations). | Slower for large datasets due to row-by-row processing, but scales well with large file geodatabases.
Memory Usage | Higher, since it loads the entire dataset into memory. | Lower, as it processes one row at a time and writes directly to disk.
Ease of Use | Simpler, using pandas-like syntax. | More complex, requiring explicit cursor handling.
Parallel Processing | Can use multiprocessing/Dask to improve performance. | Limited, but ArcGIS Pro supports some multiprocessing tools.
Spatial Database Support | Works well with PostGIS, SpatiaLite, and other open formats. | Optimized for Esri File Geodatabases (.gdb) and enterprise databases.
File Format Compatibility | Reads/writes GeoJSON, Shapefiles, Parquet, etc. | Reads/writes File Geodatabase, Shapefile, enterprise databases.

When to Use ArcPy Instead

There are still times when ArcPy is the better solution. Network analysis, topology validation, and tasks that require deeper integration with ArcGIS Enterprise are better done in ArcPy than in GeoPandas. For network analysis, ArcPy integrates with ArcGIS’s native Network Analyst extension, which supports finding the shortest path between locations, calculating service areas, origin-destination cost analysis, vehicle routing problems, and closest-facility analysis. It also works natively with ArcGIS’s advanced network datasets, including turn restrictions, traffic conditions, one-way streets, and elevation-based restrictions.

Conclusion

GeoPandas offers greater efficiency, speed, flexibility, and simplicity when incorporating open-source tools into ArcGIS workflows, especially with regard to custom analysis and preprocessing. If you haven’t tried GeoPandas before, it is more than worth your time to experiment with it.

Have you had your own positive or negative experiences using GeoPandas with ArcGIS? Feel free to leave them in the comments, or give us a suggestion of other workflows you would like to see a blog post about! 

Applying Porter’s Five Forces to Open-Source Geospatial

Introduction

The geospatial industry has seen significant transformation with the rise of open-source solutions. Tools like QGIS, PostGIS, OpenLayers, and GDAL have provided alternatives to proprietary GIS software, offering cost-effective, customizable, and community-driven mapping and spatial analysis capabilities. While open-source GIS thrives on collaboration and accessibility, it still operates within a competitive landscape influenced by external pressures.

Applying Porter’s Five Forces, a framework for competitive analysis developed by Michael E. Porter in 1979, allows us to analyze industry dynamics and understand the challenges and opportunities open-source GIS solutions face. The five forces are the threat of new entrants, the bargaining power of suppliers, industry rivalry, the bargaining power of buyers, and the threat of substitutes. We will explore how these forces shape the world of open-source geospatial technology.

Porter’s Five Forces was conceived to analyze traditional market-driven dynamics. While open-source software development is not necessarily driven by a profit motive, successful open-source projects require thriving, supportive communities. Such communities still require resources – either money or, scarcer still, time. As a result, a certain amount of market thinking can be useful when considering adopting open-source into your operations or starting a new project.

Porter articulated the five forces in terms of “threats” and “power” and “rivalry.” We have chosen to retain that language here for alignment with the model but, in the open-source world, many of these threats can represent opportunities for greater collaboration.

1. Threat of New Entrants: Low to Moderate

The barriers to entry in open-source geospatial solutions are low for basic tool development compared to proprietary software development. Developers can utilize existing open-source libraries, open geospatial data, and community-driven documentation to build new tools with minimal investment.

However, gaining significant adoption or community traction presents higher barriers than described in traditional new entrant scenarios. Well-established open-source solutions like QGIS, PostGIS, and OpenLayers have strong community backing and extensive documentation, making it challenging for new entrants to attract users.

New players may find success by focusing on novel or emerging use cases like AI-powered GIS, cloud-based mapping solutions, or real-time spatial analytics. Companies that provide specialized integrations or enhancements to existing open-source GIS tools may also gain traction. DuckDB, with its edge deployability, is a good example of this.

While new tools are relatively easy to develop, achieving broad community engagement often requires differentiation, sustained innovation, and compatibility with established standards and ecosystems.

2. Bargaining Power of Suppliers: Low to Moderate

Unlike proprietary GIS, where vendors control software access, open-source GIS minimizes supplier dependence due to its open standards and community-driven development. The availability of open geospatial datasets (e.g., OpenStreetMap, NASA Earthdata, USGS) further reduces the influence of traditional suppliers.

Moderate supplier power can arise in scenarios where users depend heavily on specific service providers for enterprise-level support, long-term maintenance, or proprietary enhancements (e.g., enterprise hosting or AI-powered extensions). Companies offering such services, like Red Hat’s model for Linux, could gain localized influence over organizations that require continuous, tailored support.

However, competition among service providers ensures that no single vendor holds significant leverage. This can work to the benefit of users, who often require lifecycle support. Localized supplier influence can grow in enterprise settings where long-term support contracts are critical, making it a consideration in high-complexity deployments.

3. Industry Rivalry: Moderate to High

While open-source GIS tools are developed with a collaborative ethos, competition still exists, particularly in terms of user adoption, funding, and enterprise contracts. Users typically don’t choose multiple solutions in a single category, so a level of de facto competition is implied even though open-source projects don’t explicitly and directly compete with each other in the same manner as proprietary software.

  • Open-source projects compete for users: QGIS, GRASS GIS, and gvSIG compete in desktop GIS; OpenLayers, Leaflet, and MapLibre compete in web mapping.
  • Enterprise support: Companies providing commercial support for open-source GIS tools compete for government and business contracts.
  • Competition from proprietary GIS: Esri, Google Maps, and Hexagon offer integrated GIS solutions with robust support, putting pressure on open-source tools to keep innovating.

However, open-source collaboration reduces direct rivalry. Many projects integrate with one another (e.g., PostGIS works alongside QGIS), creating a cooperative rather than competitive environment. While open-source GIS projects indirectly compete for users and funding, collaboration mitigates this and creates shared value. 

Emerging competition from cloud-native platforms and real-time analytics tools, such as SaaS GIS and geospatial AI services, increases rivalry. As geospatial technology evolves, integrating AI and cloud functionalities may determine long-term competitiveness.

When looking to adopt open-source, consider that loose coupling through the use of open standards can add greater value. When starting a new open-source project, keep integration and standardization in mind to increase the potential for adoption.

4. Bargaining Power of Buyers: Moderate

In the case of open-source, “bargaining” refers to the ability of the user to switch between projects, rather than a form of direct negotiation. The bargaining power of buyers in the open-source GIS space is significant, primarily due to the lack of upfront capital expenditure. This financial flexibility enables users to explore and switch between tools without major cost concerns. While both organizational and individual users have numerous alternatives across different categories, this flexibility does not necessarily translate to strong influence over the software’s development.

Key factors influencing buyer power:

  • Minimal financial lock-in: In the early stages of adoption, users can easily migrate between open-source tools. However, as organizations invest more time in customization, workflow integration, and user training, switching costs increase, gradually reducing their flexibility.
  • Community-driven and self-support options: Buyers can access free support through online forums, GitHub repositories, and community-driven resources, lowering their dependence on paid services.
  • Customizability and adaptability: Open-source GIS software allows organizations to tailor the tools to their specific needs without vendor constraints. However, creating a custom version (or “fork”) requires caution, as it could result in a bespoke solution that the organization must maintain independently.

To maximize their influence, new users should familiarize themselves with the project’s community and actively participate by submitting bug reports, fixes, or documentation. Consistent contributions aligned with community practices can gradually enhance a user’s role and influence over time.

For large enterprises and government agencies, long-term support requirements – especially for mission-critical applications – can reduce their flexibility and bargaining power over time. This dependency highlights the importance of enterprise-level agreements in managing risk.

5. Threat of Substitutes: Moderate to High

Substitutes for open-source GIS tools refer to alternatives that provide similar functionality. These substitutes include:

  • Proprietary GIS software: Tools like ArcGIS, Google Maps, and Hexagon are preferred by many organizations due to their perceived stability, advanced features, and enterprise-level support.
  • Cloud-based and SaaS GIS platforms: Services such as Felt, MapIdea, Atlas, Mapbox, and CARTO offer user-friendly, web-based mapping solutions with minimal infrastructure requirements.
  • Business Intelligence (BI) and AI-driven analytics: Platforms like Tableau, Power BI, and AI-driven geospatial tools can partially or fully replace traditional GIS in certain applications.
  • Other open-source GIS tools: Users can switch between alternatives like QGIS, GRASS, OpenLayers, or MapServer with minimal switching costs.

However, open-source GIS tools often complement rather than fully replace proprietary systems. For instance, libraries like GDAL and GeoPandas are frequently used alongside proprietary solutions like ArcGIS. Additionally, many SaaS platforms incorporate open-source components, offering organizations a hybrid approach that minimizes infrastructure investment while leveraging open-source capabilities.

The emergence of AI-driven spatial analysis and real-time location intelligence platforms increasingly positions them as partial substitutes for traditional GIS, intensifying this threat. As these technologies mature, hybrid models integrating both open-source and proprietary elements will become more common.

Conclusion

Porter’s Five Forces analysis reveals that open-source geospatial solutions exist in a highly competitive and evolving landscape. While they benefit from free access, strong community support, and low supplier dependence, they also face competition from proprietary GIS, SaaS-based alternatives, and substitutes like AI-driven geospatial analytics.

To remain competitive, open-source GIS projects must not only innovate in cloud integration and AI-enhanced spatial analysis but also respond to the shifting landscape of real-time analytics and SaaS-based delivery models. Strengthening enterprise support, improving user-friendliness, and maintaining strong community engagement will be key to their long-term sustainability.

As geospatial technology advances, open-source GIS will continue to play a crucial role in democratizing access to spatial data and analytics, offering an alternative to fully proprietary systems while fostering collaboration and technological growth.

To learn more about how Cercana can help you develop your open-source geospatial strategy, contact us here.

Developing a Geospatially-Aware Strategic Plan for Your Organization

What is Strategic Planning and Why Does it Matter?

Strategic planning is one of the most important things you can do for your organization. It helps you not only paint the picture of where you want your organization to be in the future, but also draws the roadmap for how you’re going to get there. 

Having goals is crucial for any business. Companies generally don’t thrive and grow by accident. However, having those goals is only the first step. You need to know what the steps to reaching them are, what challenges lie in the way, and what you’re going to do when those challenges arise. The better your plans, the quicker you can address the challenges.

Key Factors in Developing a Strategic Plan

It is equally important to know where you want your organization to be in five years and where you want it to be next year. Your short-term goals will act as checkpoints along the journey to your long-term goals. While articulating these, craft a clear mission statement to help bring your entire organization into alignment on where the ship is steering and why.

Geospatial tools can be helpful in accomplishing your goals. There are a variety of ways they can help you provide useful insights that support your organization’s operations. Geospatial tools can help you determine the best areas for expansion, understand your supply chains in greater depth, reveal location-based trends in your consumers, and so much more. The more exact your understanding of your business and customers is, the more sure you can be in your next steps and your plan. 

A key step in formulating your strategic plan is conducting a SWOT analysis. All four components – your strengths, weaknesses, opportunities, and threats – can be examined in greater depth through geospatial insights. 

If one of your strengths is a particular location of your operations, many geospatial factors may be contributing to it. Spatial analysis using demographic data, such as census blocks and local parcel data, can help you understand the population in proximity to that location. This information can then be used to identify new locations with similar populations. Geospatial tools can create an easily understood map that helps your leadership visualize results without having to sift through pages and pages of data.

Perhaps a weakness of your operation is high distribution costs and long turnaround times for acquiring inventory. Geospatial tools can provide a deeper understanding of your supply chain in a manner that’s easy to grasp. They can help optimize distribution points in relation to upstream suppliers and downstream retail locations, and they can help identify gaps that may require new supplier relationships to fill.

When analyzing opportunities, an understanding of geospatial factors can help you capitalize on them. If you want to expand your operations in the Northeast, a geospatial analysis of commercial locations could tell you which parcels are ideal to target based on municipal taxes, proximity to distribution channels, demographic make-up for sourcing labor and finding consumers, and environmental factors such as the frequency of natural disasters.

Similar analysis can help identify potential threats that may have been previously unrecognized. As you’re looking at expanding in the Northeast, perhaps you notice there’s a very limited number of properties that ideally suit your needs, and the majority of them that exist are already owned by one of your competitors. This presents you with an opportunity to reassess your market entry strategy. 

Devise a Strategy

After completing your SWOT analysis, you’ll want to put pen to paper on your strategic plan. Be sure to make the steps in your plan actionable, clear, and measurable. Your steps and strategies should align with your organization’s values and mission statement. Incorporating geospatial analysis into your plan of action is important, as well. Geospatial insights can provide excellent visualizations of data and progress towards your goals. 

Execute Your Strategy

Executing a strategic plan across all business departments ensures alignment, efficiency, and goal achievement. Effective communication fosters collaboration, breaking down silos between teams. Proper resource allocation optimizes budgets, personnel, and technology, preventing inefficiencies. A well-integrated plan leverages geospatial skills and tools, placing them where they add the most value—whether in logistics, marketing, or risk assessment. 

This enhances decision-making, improves operational efficiency, and boosts competitiveness. Geospatial insights drive location-based strategies, ensuring businesses optimize site selection, route planning, and customer targeting. A strategic, cross-departmental approach maximizes the impact of geospatial tools, leading to smarter business decisions and sustainable growth.

Evaluate and Control

Monitoring progress on a strategic plan requires tracking KPIs to measure success and identify areas for improvement. Geospatial KPIs, such as delivery efficiency, store performance by location, or market penetration, provide location-based insights to optimize decisions. Regular analysis ensures alignment with business goals, allowing for timely adjustments. By leveraging real-time geospatial data, businesses can refine strategies, improve resource allocation, and adapt to changing market conditions for sustained success.

Common Challenges and Solutions

Change is often met with uncertainty. It’s important to foster a culture of open communication and honesty about the direction of the organization with people at all levels of every department in order to get everyone on the same page. Here, geospatial analysis can help ensure that decisions are data-driven. Maps are a great communication tool for aligning your organization around your goals.

Another common obstacle to effective implementation of a strategic plan is working with inadequate data. Data collection has to be a top priority at all levels of your organization to ensure that data-driven decisions can be made accurately and effectively. In a geospatial context, it is paramount that all tools being used treat location as a first-class citizen. This way, all data collected is location-aware from the onset of your plan’s implementation.

Commonly, geospatial tools and data are stovepiped and underutilized. Modern geospatial systems can live side-by-side with other line-of-business tools used in strategic planning and monitoring, such as ERP and CRM. It is no longer necessary to have fully dedicated geospatial tools and expertise that resides outside of mainstream business systems. Ensuring that geospatial analysis is tightly integrated will help ensure that strategic data-driven decisions are also location-aware.

Conclusion

Most organizations understand how location affects their operation, but the cost and effort of integrating geospatial analysis into strategic decision-making is often seen as too high. Modern geospatial tools and practices, when deployed in targeted ways, can ensure that location is properly accounted for in goal-setting and execution. If you’d like to learn more about how Cercana can help you maximize the value of location in your strategic planning, contact us here.

Geospatial Portfolio Management and Rationalization

Many organizations rely on geospatial technology to derive insights based on location and spatial relationships. Whether they are mapping infrastructure, analyzing environmental changes, or optimizing logistics, managing geospatial investments effectively is imperative. Two strategies, IT portfolio management and IT rationalization, can help organizations maximize the value of their geospatial assets while reducing inefficiencies. Leveraging the right tools and techniques ensures these strategies are implemented effectively.

When we talk about “geospatial investments,” we are talking about infrastructure, software, and data. Assets in those categories may be acquired through proprietary licenses, subscriptions such as SaaS or DaaS, or by onboarding open-source solutions. Regardless of the provenance of the asset, its acquisition and incorporation into operations brings costs that must be managed like any other investments.

What Is IT Portfolio Management?

IT portfolio management is a structured process for evaluating and aligning IT assets, including geospatial tools and data, with organizational goals. Think of it like managing a financial portfolio—prioritizing investments to maximize returns while managing risks.

In practice, effective IT portfolio management involves a combination of strategic planning, resource allocation, and continuous assessment. Organizations leverage portfolio management to ensure IT investments, including geospatial tools and data, align with long-term objectives while remaining adaptable to changing priorities. This often entails mapping projects to business outcomes, identifying dependencies, and evaluating performance metrics to measure success. Additionally, fostering collaboration between IT and operational teams enhances decision-making, ensuring geospatial initiatives address both technical and organizational needs. By applying these principles, organizations can maximize the value of their geospatial assets while mitigating risks associated with resource misallocation or misaligned goals.

Tools and Techniques for Geospatial Portfolio Management

  1. Portfolio Management Software:
    • Tools like Planview, SailPoint, or Smartsheet can help track geospatial technology assets. Some integrate with ERP systems to identify spend, which is useful for tracking commercial software licenses, subscriptions, and even support contracts for open-source tools. They can be especially useful for identifying “shadow IT,” in which staff onboard SaaS tools and then expense the subscription fees.
    • Mobile Device Management (MDM) tools such as JAMF or JumpCloud can be effective at tracking or deploying software on managed devices. This can help with license optimization for commercial software, patch management for commercial and open-source tools, or data inventory management at the edge.
  2. Geospatial Data Inventory:
    • Platforms like GeoNode, CKAN, or Esri’s Data Catalog help centralize and manage spatial datasets across multiple teams and locations.
    • Search tools such as Voyager have features that enable discovery of geospatial data and assessment of data redundancy.
  3. Prioritization Frameworks:
    • Weighted Scoring Models enable the use of organization-specific criteria to provide consistent evaluation of alternatives (a minimal sketch follows this list).
    • Benefit-Cost Analysis provides a relatively simple way to objectively evaluate and rank geospatial investments.
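
As a minimal sketch of the weighted scoring model mentioned above (the candidates, criteria, weights, and scores here are purely illustrative):

    # Rank hypothetical geospatial investments by weighted score.
    candidates = {
        "Renew imagery subscription": {"cost_fit": 3, "mission_alignment": 5, "risk": 4},
        "Migrate tiles to open-source stack": {"cost_fit": 5, "mission_alignment": 4, "risk": 2},
        "Adopt SaaS field-collection app": {"cost_fit": 4, "mission_alignment": 3, "risk": 3},
    }
    weights = {"cost_fit": 0.4, "mission_alignment": 0.4, "risk": 0.2}

    ranked = sorted(
        ((sum(scores[c] * w for c, w in weights.items()), name)
         for name, scores in candidates.items()),
        reverse=True,
    )
    for total, name in ranked:
        print(f"{total:.2f}  {name}")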

By utilizing these tools and techniques, organizations can align their geospatial investments with business goals and make data-driven decisions.


What Is IT Rationalization?

IT rationalization focuses on simplifying and streamlining IT assets to eliminate redundancy and reduce costs. It requires a systematic approach to evaluate the relevance, efficiency, and performance of IT assets. It involves cataloging all technology assets, assessing their value and usage, and identifying areas of overlap or obsolescence. For geospatial technology, this process includes analyzing the lifecycle of geospatial tools, evaluating data quality and relevance, and determining the efficiency of current workflows. Organizations often use rationalization to create a unified technology ecosystem by consolidating systems, integrating data sources, and phasing out redundant or underperforming applications. This ensures that geospatial investments support operational needs while reducing costs and improving overall agility.

Geospatial rationalization involves a systematic approach to streamlining geospatial technology and data assets, ensuring they align with organizational goals while reducing inefficiencies and costs. The process begins with inventorying assets using tools like an MDM platform or Voyager, which can track software, hardware, and data. Identifying redundancies is a critical next step, where tools like FME or Voyager can uncover duplicate data for cleanup, while GDAL/OGR standardizes and consolidates diverse datasets to ensure consistency. Migration and consolidation further enhance efficiency by moving geospatial data to modern, scalable platforms like Apache Sedona with Spark, PostGIS, or a data warehouse, often leveraging ELT/ETL tools. 

Application rationalization frameworks help organizations evaluate and classify geospatial applications for retention or retirement. Finally, performance monitoring tools help ensure applications operate efficiently, allowing for proactive identification of bottlenecks and optimization of resources. Together, these steps enable organizations to create a unified, cost-effective geospatial technology ecosystem tailored to their operational needs.

Challenges of Geospatial Technology in Portfolio Management and Rationalization

Managing and rationalizing geospatial tools and data present unique challenges due to the specialized nature and complexity of these systems. For instance, the vast volumes and diverse formats of geospatial data—such as vector layers, satellite imagery, and real-time sensor feeds—require robust storage and processing solutions. Organizations often grapple with ensuring data integrity, accessibility, and compatibility, especially when datasets come in formats like shapefiles, GeoJSON, or KML. For example, a municipality managing urban planning projects might need to consolidate data from various sources into a unified format using tools like GDAL/OGR. Apache Sedona, integrated with Spark in a data lake employing a medallion architecture, provides an efficient framework for managing large-scale geospatial datasets. This architecture allows organizations to organize raw data into bronze, silver, and gold layers, enabling a scalable and structured approach to data cleansing, integration, and analysis while maintaining high performance and flexibility.

Another significant concern is aligning tools and resources with organizational priorities while managing costs and governance. Differing project requirements can lead to overlapping software tools. For example, multiple desktop GIS packages or mobile data collection platforms can exist across a portfolio. Additionally, ensuring data governance is important, particularly when handling sensitive geospatial information, such as infrastructure as-built data or parcel boundaries. For instance, a transportation agency may use GeoNetwork to manage metadata securely while employing encryption and role-based access controls to comply with privacy regulations. Collaboration platforms, such as ArcGIS Enterprise or GeoNode, can help bring together diverse stakeholders—urban planners, emergency responders, and environmental analysts—by centralizing geospatial data and tools, fostering better alignment, and ensuring efficient resource utilization.

Conclusion

Geospatial technology is critical for modern organizations but presents unique challenges that demand careful management. By combining geospatial tools with standard strategies for handling data volume, interoperability, and governance, organizations can streamline their geospatial systems and integrate geospatial assets into larger organizational governance frameworks. IT portfolio management and rationalization not only optimize costs but also ensure geospatial investments align with strategic goals, delivering long-term value.

To learn more about how Cercana can help you optimize your geospatial portfolio, contact us.

Hybrid Approaches to Geospatial Architectures

At Cercana, we have worked with geospatial systems that have run the gamut—from all-in proprietary stacks to pure open-source toolchains. As the technology landscape evolves, many organizations are blending both proprietary and open-source solutions. These hybrid architectures aim to capitalize on the best of each world, providing flexibility in how users store data, serve maps, run analyses, or deploy applications – but making this approach work requires thinking through a few key considerations.

Whether you’re starting from a pure proprietary environment and eyeing open-source options, or you’re heavily invested in open-source but considering some proprietary tools for specific tasks, it helps to understand where each may fit best.

The Realities of Hybrid Architecture

No two organizations have the exact same requirements. Some rely on legacy systems tied to proprietary platforms. Others have in-house developers more comfortable with taking on the full lifecycle maintenance of open-source code – which includes contributing back to projects. Others – especially in the public sector – may face strict procurement rules or governance models that dictate one approach or the other. A hybrid stack can acknowledge these constraints while providing flexibility. It says: “We’ll pick the right tool for the right job, from whatever ecosystem makes sense.”

Of course, “right tool for the right job” sounds simple. But deciding what’s “right” can be tricky.

Where Proprietary Tools Fit Well

Tightly-Coupled Stacks

One of the biggest strengths of proprietary solutions is the cohesiveness of their ecosystems. Vendors spend a lot of time enabling end-to-end data and application integration. If an organization is willing to put aside preconceived notions about the uniqueness of its workflows, it can achieve productivity quickly by simply adopting a proprietary stack and its embedded processes and methods. This approach essentially trades money for time. The organization pays the vendor on the premise that it will get up to speed quickly.

For example, Esri’s ArcGIS platform integrates desktop, server, cloud, and mobile components. If your organization leans heavily on complex, out-of-the-box analytics or well-supported data management workflows, going with this solution can shorten the learning curve. Tools like ArcGIS Pro or ArcGIS Enterprise can handle data ingestion, manage user access, provide advanced analytics, and generate polished cartography—all within a single environment.

Easily Available Support and Roadmaps

Commercial vendors often provide guaranteed support and clearly stated product roadmaps. If your organization prioritizes the idea of risk reduction, having a help desk and service-level agreement (SLA) behind you can tip the scales toward proprietary platforms.

There’s a lot to be said about the quality of help desk support, the timeliness of remedies under an SLA, and the speed and availability of things like security patches. That said, organizations that are very process-oriented place a lot of value in the existence of the agreement itself, which gives them a place to start, even if it is inefficient in execution.

User-Friendly Interfaces

This is far less of an issue than it used to be, but it is a misconception that persists among adherents of proprietary systems. There was a time when GUIs – especially on the desktop – were superior in proprietary software. Open-source was the domain of developers who were happy to work via APIs or the CLI, and such users were its target audience. That distinction has mostly evaporated – especially with the move of applications and processing to the web and cloud. The recent advent of natural-language interfaces will continue to close this gap.

Where proprietary GUIs still shine has more to do with the end-to-end workflow integration discussed earlier. Vendors do a good job of exposing tools and using consistent nomenclature throughout their stacks, which helps users follow their nose through a workflow. In ArcGIS, it is relatively easy to chart the journey of a feature class through to being a map package, and finally a map service exposed via ArcGIS Online.

In the end, it is important to recognize the distinction between an “interface that I know how to use” versus an “interface that is better.” Market dominance has a strong effect on perception (see Windows v. Mac, or iPhone v. any other mobile device).

Where Open-Source Tools Shine

Flexibility and Interoperability

Open-source geospatial tools tend to align tightly with open standards, such as those from the OGC (WMS, WFS, GeoPackage), as a first principle. This makes it easier to integrate with other systems, add new capabilities, or swap out components without rewriting everything, and it streamlines the integration of systems that are developed by independent or loosely-coupled project teams. For instance, using PostGIS as your spatial database allows you to connect easily with GeoServer for serving OGC web services or with QGIS for editing and analysis.
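
As a simple illustration, consider a hypothetical parcels table in PostGIS (the table and column names here are ours, for illustration only). QGIS can edit it directly, GeoServer can publish it as WMS/WFS, and any SQL client can query it, with no format conversion in between:

CREATE TABLE parcels (
    id serial PRIMARY KEY,
    parcel_no text,
    geom geometry(Polygon, 4326)
);

-- A spatial index lets each client fetch only the features in view
CREATE INDEX parcels_geom_idx ON parcels USING GIST (geom);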

Cost Savings and Scalability

Open-source tools don’t carry licensing fees, which can be a big plus for tight budgets. And since you can run them on your own hardware or in the cloud, scaling up often involves fewer financial hurdles. For massive datasets or complex operations, you might spin up multiple PostGIS instances or deploy a cluster of servers running GeoServer to handle load—all without worrying about additional per-core or per-seat licenses.

That said, open-source tools aren’t (or shouldn’t be) entirely free of cost. If you are using open-source and deriving business value from it, you should contribute back in some way. That can take many forms. You can let your staff spend part of their time developing enhancements to open-source code, documentation, or tests that can be contributed back. You could at least partially employ someone who maintains projects that you use. You could simply donate funds to projects. Regardless of how you choose to support open-source, there will be a cost, but it will most likely be far less than what you’d spend on a per-seat/core/whatever licensing model.

Finally, organizations can procure support for open-source tools from third parties who often employ maintainers. This begins to approach the help-desk/SLA model discussed above in relation to proprietary systems. It is often not an exact match for that model, but it is a good way for an organization that doesn’t think of software as its “day job” to simultaneously get support and contribute back to the open-source software from which it derives value.

Deep Customization

Because the code is open, if you have development resources, you can tailor these tools to your exact needs. We’ve seen teams customize QGIS plugins to automate their entire workflow or tweak GeoServer configurations for specialized queries. You’re not stuck waiting for a vendor to implement that one feature you need.

That said, think through how you approach such customizations before you jump in. The moment you fork a project and change its core code, you own it – especially if the maintainers reject your changes. You’ll want to think about a modular approach that isolates your changes in a way that leaves you able to continue to receive updates to the core code from the maintainers while preserving your customizations. QGIS is a great example – build a plug-in rather than changing the QGIS code itself. Many open-source geospatial tools have extensible models – like GDAL or GeoServer. Understand how those work and have a plan for your customizations before you get going on them.

Taking a Hybrid Approach

A hybrid architecture tries to find balance. Consider the following patterns that we’ve seen work well:

Pattern 1: Proprietary Desktop + Open-Source Backend

In this scenario, you might run ArcGIS Pro for your cartographic and analytic workflows—especially if your staff is well-versed in it—while managing all your spatial data in PostGIS. You maintain the user-friendly environment that your team knows, but you also gain the scalability and interoperability of a robust spatial database. ArcGIS Pro can connect to PostGIS tables, perform analyses, and visualize results. Meanwhile, data integration and sharing can happen through open formats and APIs.

Pattern 2: Open-Source Desktop + Proprietary Web Services

This might sound like a twist, but we’ve seen teams rely on an open-source desktop tool like QGIS while they serve their data through a proprietary server product or a cloud-based hosted solution. Perhaps your organization invested heavily in a proprietary web platform (like ArcGIS Enterprise) that integrates with enterprise user management, security, and BI tools. QGIS users can consume services from that server, taking advantage of familiar open-source editing tools while benefiting from a managed, well-supported data environment.

Pattern 3: Proprietary Spatial Analysis + Open-Source Front-Ends

If you’re dealing with complex spatial modeling—maybe you’re working with advanced network analysis or 3D analytics that a proprietary tool excels at—you can still present and distribute those results through open-source web maps or dashboards. For example, run your analysis in ArcGIS Pro or FME, then publish the output as a service via GeoServer and visualize it in a Leaflet-based web app. Now your end users interact through a lightweight, custom UI that’s easy to update.

Pattern 4: Open-Source Core with Proprietary Add-Ons

Alternatively, your core environment might be open-source—PostGIS for data storage, QGIS for editing, GeoServer for OGC services—but maybe you integrate a proprietary imagery analysis tool because it handles specific sensor data or advanced machine learning models out-of-the-box. This “best-of-breed” approach can deliver specialized capabilities without forcing your entire stack into one ecosystem.

Key Considerations

Governance and Security

A hybrid environment means more moving parts. You’ll need clear policies on data governance, security practices, version control, and how updates get rolled out. Vetting open-source tools for security and licensing compliance is essential, as is ensuring that proprietary components don’t introduce unexpected vendor constraints. 

There are two important points here. The first is that open-source is not inherently less secure than proprietary software – because the code is open to scrutiny, it is often demonstrably more secure. Even so, acquisition policies often (fairly or not) impose extra processes for the use of open-source, and you’ll need to be aware of how your organization approaches this. As part of that, you’ll need a plan showing how you’ll integrate security patches as they become available, since there’s usually not a vendor-provided system that automatically pushes them.

The second point is that open-source licenses are legally binding. The fact that you do not pay for the software does not mean the license does not apply. You’ll want to understand the nuances of open-source licenses (permissive vs. copyleft, attribution requirements, etc.) to ensure you remain compliant as you integrate open-source into your stack.

Skill Sets and Training

Your team may need to learn new tools. If everyone is fluent in ArcGIS Pro but you introduce PostGIS, you’ll need to provide SQL training. Conversely, if you bring in ArcGIS Enterprise to complement your open-source stack, your team may need guidance on navigating that environment. Don’t skimp on professional development—investing in training pays off in smoother operations down the line.

This is simply best practice in terms of lifecycle management of your technology stack. Give your staff the knowledge they need to be productive. There is ample information and training available for both proprietary and open-source geospatial tooling, so the provenance of a software solution should not affect the availability of training to get your staff up to speed.

Performance and Scalability

As you blend tools, test performance early and often. Proprietary solutions may have certain hardware recommendations or licensing constraints that impact scaling. Open-source tools can scale horizontally, but you may need devops practices to manage containers, virtual machines, security patching, or cloud deployments. Think through how you’ll handle bigger data volumes or higher user traffic before it becomes an urgent issue.

Long-Term Viability and Community Support

Open-source tools thrive on community involvement. Check activity on GitHub repos or forums—are they lively? Do updates happen regularly? Proprietary vendors usually maintain formal roadmaps and documentation. Balancing these factors ensures you’re not tied to a dead-end project or a product that doesn’t meet your evolving needs.

Wrapping Up

We’re long past the days when an all-in proprietary approach was the only game in town. At the same time, not everyone is ready (or able) to go fully open-source. A hybrid architecture acknowledges that each technology ecosystem brings something different to the table, and there is value in mixing and matching.

If you want stable support and integrated workflows right out of the box, proprietary tools might be your go-to. If you’re looking to scale rapidly and stay agile, open-source solutions are hard to beat. Most organizations find themselves somewhere in between. By thoughtfully picking where you deploy proprietary versus open-source tools, you can build a geospatial architecture that’s both pragmatic and innovative, ready for whatever challenges (and opportunities) lie ahead.

To learn more about how Cercana can help you optimize your geospatial architecture, contact us.

Integrating AI Into Geospatial Operations

At Cercana, we’re excited by the constant evolution of geospatial technology. AI and its related technologies are becoming increasingly important components of geospatial workflows. Recently, our founder, Bill Dollins, has shared some of his explorations into AI through his personal blog, geoMusings, where he has written about topics like Named Entity Recognition (NER), image similarity using pgVector, and retrieval-augmented generation (RAG). These explorations reflect Cercana’s commitment to helping our clients understand emerging technologies and consider how best to integrate them into their operations.

Coarse Geocoding with ChatGPT

In his May 2024 post, Bill explored the application of ChatGPT to Named Entity Recognition (NER), a vital tool in the AI toolkit. NER can extract key information from unstructured text, such as identifying people, locations, and organizations. At Cercana, we see potential in using AI to streamline geospatial data processing, particularly in the context of large-scale data integration tasks. By using AI tools like ChatGPT, we can automate the extraction of spatial information from textual data, making it easier for our clients to analyze and act on it.

This capability is particularly relevant in scenarios where large volumes of data need to be sifted through quickly—whether for real-time monitoring or in-depth analysis. As we continue to refine our capabilities, Cercana is positioned to offer more precise and scalable solutions.

Exploring Image Similarity with pgVector

In a July 2024 post, Bill explored analyzing image similarity using pgVector, a vector database extension for PostgreSQL. The post examines creating vector embeddings directly from image content, rather than relying solely on image metadata such as EXIF tags. Combined with metadata like location, direct embeddings enable a more nuanced kind of “looks like” analysis across a corpus of images.

By integrating pgVector with existing geospatial data pipelines, we are enhancing our ability to process and analyze visual data more efficiently. This capability not only speeds up workflows but also opens new avenues for our clients to derive actionable insights from their image datasets.
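
As a minimal sketch of the idea (the table layout and the 512-dimension embedding size are illustrative assumptions, and the embeddings themselves would be generated outside the database by a vision model), pgVector reduces the “looks like” question to a single ORDER BY:

CREATE EXTENSION vector;

CREATE TABLE images (
    id serial PRIMARY KEY,
    path text,
    geom geometry(Point, 4326),  -- location stored alongside the embedding
    embedding vector(512)
);

-- Find the five images that look most like image 42;
-- <-> is pgVector's Euclidean distance operator
SELECT path
FROM images
ORDER BY embedding <-> (SELECT embedding FROM images WHERE id = 42)
LIMIT 5;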

Experimenting with RAG Using ChatGPT and DuckDuckGo

Most recently, in August 2024, Bill explored the concept of retrieval-augmented generation (RAG) by combining ChatGPT with DuckDuckGo for information retrieval. RAG models are a quickly-developing capability in AI because they blend the generative capabilities of models like LLMs with the precision of traditional search engines and database queries. This fusion enables more accurate and contextually relevant information retrieval, which can be valuable for analytical tasks and other data-intensive operations.

At Cercana, we’re looking at RAG to enable AI-enhanced geospatial solutions. By integrating RAG, we seek to provide clients with more precise, context-aware tools that can make sense of large volumes of unstructured data.

Moving Forward

As Cercana grows and evolves, we are finding new ways of integrating AI into our geospatial services. Bill’s explorations into NER, image similarity, and RAG are examples of this. We believe that AI tools and related technologies, when paired with more traditional tools such as data pipelines and geospatial databases, provide the opportunity to improve data quality and shorten the time-to-market for actionable information.

To learn more about how Cercana can help you integrate AI into your operations, please contact us.

Choosing Between an iPaaS and Building a Custom Data Pipeline

In today’s data-driven world, integrating various systems and managing data effectively is crucial for organizations to make informed decisions and remain responsive. Two popular approaches to data integration are using an Integration Platform as a Service (iPaaS) or building a custom data pipeline. Each approach has its advantages and challenges, and the best choice depends on your organization’s specific needs, resources, and strategic goals.

Understanding iPaaS

An iPaaS is a hosted platform that provides a suite of tools to connect various applications, data sources, and systems, both within the cloud and on-premises. It enables businesses to manage and automate data flows without the need for extensive coding, offering pre-built connectors, data transformation capabilities, and support for real-time integration.

For example, the image below shows an integration done in FME, an iPaaS that is commonly used in geospatial environments but has native support for common non-spatial platforms such as Salesforce. This integration creates a Jira ticket each time a new Salesforce opportunity object is created. It also posts notifications to Slack to ensure the new tickets are visible to assignees.

iPaaS Salesforce-to-Jira pipeline in FME

This integration illustrates the typical visual nature of the iPaaS design approach, where flows and customizations are designed primarily through configurations, rather than through the development of custom code. This low-code approach is one of the primary value propositions of iPaaS solutions.

Advantages of iPaaS:

  • Speed and Agility: Quick setup and deployment of integrations with minimal coding required.
  • Scalability: Easily scales to accommodate growing data volumes and integration needs.
  • Reduced Maintenance: The iPaaS provider manages the infrastructure, ensuring high availability and security.

Challenges of iPaaS:

  • Limited Customization: While iPaaS solutions are flexible, there may be limitations to how much the integrations can be customized to meet unique business requirements.
  • Subscription Costs: Ongoing subscription fees can add up, especially as your integration needs grow.

Building a Custom Data Pipeline

Creating a custom data pipeline involves developing a bespoke solution tailored to your specific data integration and management needs. This approach allows for complete control over the data flow, including how data is collected, processed, transformed, and stored. This will typically be done using a mix of tools such as Python, serverless functions, and/or SQL.

Advantages of Custom Data Pipelines:

  • Complete Customization: Tailor every aspect of your data pipeline to fit your business’s unique needs.
  • Integration Depth: Address complex or unique integration scenarios that off-the-shelf solutions cannot.
  • Ownership and Control: Full ownership of your integration solution, allowing for adjustments and optimizations as needed.

Challenges of Custom Data Pipelines:

  • Higher Costs and Resources: Significant upfront investment in development, plus ongoing costs for maintenance, updates, and scaling. Proper cost modeling over a reasonable payback period gives a more accurate picture: many of these costs are fixed, so they can dilute as your organization scales, in contrast to iPaaS consumption pricing.
  • Longer Time to Market: Development and testing of a custom solution can be time-consuming.
  • Expertise Required: Need for skilled developers with knowledge in integration patterns and technologies.

Making the Choice

When deciding between an iPaaS and building a custom data pipeline, consider the following factors:

  • Complexity of Integration Needs: For complex, highly specialized integration requirements, a custom pipeline might be necessary. For more standardized integrations, an iPaaS could suffice. For example, an ELT pipeline may lend itself more to an iPaaS, since transformations are performed after your data reaches its destination (see the sketch after this list).
  • Resource Availability: Do you have the in-house expertise and resources to build and maintain a custom pipeline, or would leveraging an iPaaS allow your team to focus on core business activities? Opportunity cost related to custom development should be considered over the development period.
  • Cost Considerations: Evaluate the total cost of ownership (TCO) for both options, including upfront development costs for a custom solution versus ongoing subscription fees for an iPaaS. An iPaaS typically has lower upfront onboarding costs than custom development, but long-term costs can rise as your organization scales.
  • Scalability and Flexibility: Consider your future needs and how each option would scale with your business. An iPaaS might offer quicker scaling, while a custom solution provides more control over scaling components.
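
To make the ELT point concrete, here is a minimal sketch of a destination-side transformation, assuming hypothetical staging and analytics tables (the names are ours). Raw data lands in the staging table first, and the cleanup happens entirely inside the destination database, which is where an iPaaS typically hands off:

-- Clean and conform raw records after they have been loaded
INSERT INTO analytics.facilities (name, category, geom)
SELECT
    trim(name),
    lower(category),
    ST_Transform(geom, 4326)  -- normalize all geometries to WGS84
FROM staging.facilities
WHERE geom IS NOT NULL;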

Conclusion

Ultimately, the decision between an iPaaS and a custom data pipeline is not one-size-fits-all. It requires a strategic evaluation of your current and future integration needs, available resources, and business objectives. By carefully weighing these factors, you can choose the path that best supports your organization’s data integration and management goals, enabling seamless data flow and informed decision-making.

Contact us to learn more about our services and how we can help turn your data integration challenges into opportunities.

Using Hstore to Analyze OSM in PostgreSQL

OpenStreetMap (OSM) is a rich, community-validated source of geographic information, offering a wide variety of feature types. However, efficiently querying and analyzing OSM poses unique challenges. PostgreSQL, with its hstore data type, can be a powerful tool in the data analyst’s arsenal.

Understanding hstore in PostgreSQL

Before getting into the specifics of OpenStreetMap, let’s understand the hstore data type. Hstore stores sets of key-value pairs in a single PostgreSQL column, allowing data to be stored in a schema-less fashion. This flexibility makes it ideal for handling semi-structured data like OpenStreetMap tags.
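
As a quick standalone illustration of the syntax, the -> operator pulls the value for a given key out of an hstore:

SELECT 'cuisine => pizza, amenity => restaurant'::hstore -> 'amenity';
-- returns 'restaurant'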

Setting Up Your Environment

To get started, you’ll need a PostgreSQL database with the PostGIS extension, which adds support for geographic objects, and with the hstore type enabled. Both PostGIS and hstore are installed as extensions. The SQL to install them is:

CREATE EXTENSION postgis;
CREATE EXTENSION hstore;

After setting up your database, import OpenStreetMap data using tools like osm2pgsql, ensuring to import the data with the hstore option enabled. This step is crucial as it allows the key-value pairs of OSM tags to be stored in an hstore column. Be sure to install osm2pgsql using the instructions for your platform.

The syntax for importing is as follows:

osm2pgsql -c -d my_database -U my_username -W -H my_host -P my_port --hstore my_downloaded.osm

Querying OpenStreetMap Data

With your data imported, you can now unleash the power of hstore. Here’s a basic example: let’s say you want to find all the pizza places in your data. The SQL query would look something like this:

SELECT name, tags
FROM planet_osm_point
WHERE name IS NOT NULL
AND tags -> 'cuisine' = 'pizza';

This query demonstrates the power of using hstore to filter data based on specific key-value pairs (finding pizza shops in this case).

Advanced Analysis Techniques

While basic queries are useful, the real power of hstore comes with its ability to facilitate complex analyses. For example, you can aggregate data based on certain criteria, such as counting the number of amenities in a given area or categorizing roads based on their condition.

Here is an example that counts the points serving each type of cuisine available in Leonardtown, Maryland:

SELECT tags -> 'cuisine' AS cuisine_type, COUNT(*) AS total
FROM planet_osm_point
WHERE tags ? 'cuisine'
AND ST_Within(ST_Transform(way, 4326), ST_MakeEnvelope(-76.66779675183034, 38.285044882153485, -76.62251613561185, 38.31911201477845, 4326))
GROUP BY tags -> 'cuisine'
ORDER BY total DESC;

The above query combines hstore analysis with a PostGIS function to limit the query to a specific area. The full range of PostGIS functions can be used to perform spatial analysis in combination with hstore queries. For instance, you could analyze the spatial distribution of certain amenities, like public toilets or bus stops, within a city. You can use PostGIS functions to calculate distances, create buffers, and perform spatial joins.
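
For example, here is a minimal sketch of the bus stop idea (the coordinates are illustrative; osm2pgsql stores geometry in the way column, by default in Web Mercator):

-- Bus stops within 500 meters of a point of interest
SELECT name, tags
FROM planet_osm_point
WHERE tags -> 'highway' = 'bus_stop'
AND ST_DWithin(
    ST_Transform(way, 4326)::geography,
    ST_SetSRID(ST_MakePoint(-76.6355, 38.2977), 4326)::geography,
    500  -- meters, because both sides are cast to geography
);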

Performance Considerations

Working with large datasets like OpenStreetMap can be resource-intensive. Indexing your hstore column is crucial for performance. Creating GIN (Generalized Inverted Index) indexes on hstore columns can significantly speed up query times.
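
Using the default osm2pgsql table from the queries above, creating the index is a single statement. Note that GIN indexes accelerate hstore operators such as ? and @>, so the containment form of a filter is the one that benefits:

CREATE INDEX planet_osm_point_tags_idx
ON planet_osm_point USING GIN (tags);

-- This containment predicate can use the index...
SELECT name FROM planet_osm_point WHERE tags @> 'cuisine => pizza';

-- ...while the equivalent tags -> 'cuisine' = 'pizza' generally cannot.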

Challenges and Best Practices

While hstore is powerful, it also comes with challenges. The schema-less nature of hstore can lead to inconsistencies in data, especially if the source data is not standardized, so it’s important to clean and preprocess your data before analysis. OSM tagging tends to preserve local flavor, so good knowledge of the geographic area you are analyzing will help you be more successful when using hstore with OSM.

Conclusion

The PostgreSQL hstore data type is a potent tool for analyzing OpenStreetMap data. Its flexibility in handling semi-structured data, combined with the spatial analysis capabilities of PostGIS, makes it a compelling resource for geospatial analysts. By understanding its strengths and limitations, you can harness the power of PostgreSQL and OpenStreetMap in your work.

Remember, the key to effective data analysis is not just about choosing the right tools but also understanding the data itself. With PostgreSQL and hstore, you are well-equipped to extract meaningful insights from OpenStreetMap data.

Contact us to learn more about our services and how we can help turn your geospatial challenges into opportunities.

Do You Need a Data Pipeline?

Do you need a data pipeline? That depends on a few things. Does your organization see data as an input into its key decisions? Is data a product? Do you deal with large volumes of data or data from disparate sources? Depending on the answers to these and other questions, you may be looking at the need for a data pipeline. But what is a data pipeline and what are the considerations for implementing one, especially if your organization deals heavily with geospatial data? This post will examine those issues.

A data pipeline is a set of actions that extract, transform, and load data from one system to another. A data pipeline may be set up to run on a specific schedule (e.g., every night at midnight), or it might be event-driven, running in response to specific triggers or actions. Data pipelines are critical to data-driven organizations, as key information may need to be synthesized from various systems or sources. A data pipeline automates accepted processes, enabling data to be efficiently and reliably moved and transformed for analysis and decision-making.

A data pipeline can start small – maybe a set of shell or Python scripts that run on a schedule – and it can grow along with your organization, to the point where it may be driven by a full-fledged event-driven platform like Airflow or FME (both discussed later). The landscape can be confusing, and there are a lot of commercial and open-source solutions available, so we’ll try to demystify data pipelines in this post.

Geospatial data presents unique challenges in data pipelines. It is often large and complex, containing multiple dimensions of information (geometry, elevation, time, etc.), so processing and transforming it can be computationally intensive and may require significant storage capacity. Managing this complexity efficiently is a major challenge. Data quality and accuracy are also challenges. Geospatial data can come from a variety of sources (satellites, sensors, user inputs, etc.) and can be prone to errors, inconsistencies, or inaccuracies. Ensuring data quality – dealing with missing data, handling noise and outliers, verifying the accuracy of coordinates – adds complexity to standard data management processes.

Standardization and interoperability challenges, while not unique to geospatial data, present additional difficulties due to the nature of the data. There are many different formats, standards, and coordinate systems used in geospatial data. Transforming between these – for example, reconciling coordinates among WGS84, Web Mercator, state plane zones, and various national grids – can be complex, due to issues such as datum transformation. Furthermore, metadata (data about the data) is crucial in geospatial datasets to understand the context, source, and reliability of the data, which adds another layer of complexity to the processing pipeline.
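
As one concrete example of the reprojection step, PostGIS handles the projection and datum math in a single call (the point below is illustrative; EPSG:26985 is the NAD83 Maryland state plane zone, in meters):

SELECT ST_AsText(
    ST_Transform(
        ST_SetSRID(ST_MakePoint(-76.6355, 38.2977), 4326),  -- WGS84 lon/lat
        26985  -- NAD83 / Maryland state plane
    )
);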

While these challenges make the design, implementation, and management of data pipelines for geospatial data a complex task, they can provide significant benefits to organizations that process large amounts of geospatial data:

  • Efficiency and automation: Data pipelines can automate the entire process of data extraction, transformation, and loading (ETL). Automation is particularly powerful in the transformation stage. “Transformation” is a deceptively simple term for a process that can contain many enrichment and standardization tasks. For example, as the coordinate system transformations described above are validated, they can be automated and included in the transformation stage to remove human error. Additionally, tools like Segment Anything can be called during this stage to turn imagery into actionable, analyst-ready information.
  • Data quality and consistency: The transformation phase includes steps to clean and validate data, helping to ensure data quality. This can include resolving inconsistencies, filling in missing values, normalizing data, and validating the format and accuracy of geospatial coordinates. By standardizing and automating these operations in a pipeline, an organization can ensure that the same operations are applied consistently to all data, improving overall data quality and reliability.
  • Data Integration: So far, we’ve talked a lot about the transformation phase, but the extract phase provides integration benefits. A data pipeline allows for the integration of diverse data sources, such as your CRM, ERP, or support ticketing system. It also enables extraction from a wide variety of formats (shapefile, GeoParquet, GeoJSON, GeoPackage, etc.). This is crucial for organizations dealing with geospatial data, as it often comes from a variety of sources in different formats. Integration with data from business systems can provide insights into performance as it relates to the use of geospatial data.
  • Staging analyst-ready data: With good execution, a data pipeline produces clean, consistent, integrated data that enables people to conduct advanced analysis, such as predictive modeling, machine learning, or complex geospatial statistical analysis. This can provide valuable insights and support data-driven decision making.

A data pipeline is first and foremost about automating accepted data acquisition and management processes for your organization, but it is ultimately a technical architecture that will be added to your portfolio. The technology ecosystem for such tools is vast, but we will discuss a few with which we have experience.

  • Apache Airflow: Developed by Airbnb and later donated to the Apache Foundation, Airflow is a platform to programmatically author, schedule, and monitor workflows. It uses directed acyclic graphs (DAGs) to manage workflow orchestration. It supports a wide range of integrations and is highly customizable, making it a popular choice for complex data pipelines. Airflow is capable of orchestrating your entire data pipeline.
  • GDAL/OGR: The Geospatial Data Abstraction Library (GDAL) is an open-source translator library for raster and vector geospatial data formats. It provides a unified API for over 200 geospatial data formats, allowing developers to write applications that are format-agnostic. GDAL supports operations like format conversion, data extraction, reprojection, and mosaicking. It is used in GIS software like QGIS, ArcGIS, and PostGIS. As a library, it can also be used in large data processing tasks and in Airflow workflows. Its flexibility makes it a powerful component of a data pipeline, especially where support for geospatial data is required.
  • FME: FME is a data integration platform developed by Safe Software. It allows users to connect and transform data between over 450 different formats, including geospatial, tabular, and more. With its visual interface, users can create complex data transformation workflows without coding. FME’s capabilities include data validation, transformation, integration, and distribution. FME is well established in the geospatial information market and is the most geospatially literate commercial product in the data integration segment. In addition, it supports a wide range of non-spatial sources, including proprietary platforms such as Salesforce. FME has a wide range of components, making it possible to scale up to support enterprise-scale data pipelines.

In addition to the tools listed above, there is a fairly crowded market segment for hosted solutions, known as “integration platform as a service” or iPaaS. These platforms generally have ready-made connectors for various sources and destinations, but spatial awareness tends to be limited, as do the options for adding spatial customization. A good data pipeline is tightly coupled to the data governance procedures of your organization, so you’ll see greater benefits from technologies that allow you to customize to your needs.

Back to the original question: Do you need a data pipeline? If data-driven decisions are key to your organization, and consistent data governance is necessary to have confidence in your decisions, then you may need a data pipeline. At Cercana, we have experience implementing data pipelines and data governance procedures for organizations large and small. Contact us today to learn more about how we can help you.