Overview

In this webinar, Senior Product Manager Shriram Sankaran demonstrated how to build intelligent, natural-language-driven map experiences inside Blazor applications without relying on geospatial APIs, external databases, or complex GIS knowledge. The session focused on combining Azure OpenAI’s generative capabilities with Syncfusion Blazor components to generate location data dynamically and render it on an interactive map.

What the demo covered

The session walked through building a Blazor app end to end: accepting a plain-English query, sending it to Azure OpenAI, and plotting the returned locations on a Syncfusion Blazor Map with dynamic markers and image-enhanced tooltips.

Prerequisites

To follow along, you’ll need:

  • A Blazor application on a supported .NET version.
  • An Azure OpenAI resource with a deployed chat model, plus its endpoint, API key, and deployment name.
  • The Syncfusion Blazor NuGet packages and a valid Syncfusion license key.

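As a quick reference before diving in, here is a minimal Program.cs sketch of the registration steps the demo covers (installing packages, registering the license key, and wiring up services). The AzureOpenAIMarkerService class and the configuration key names are hypothetical, not taken from the webinar; the service itself is sketched later in this recap.

```csharp
// Program.cs sketch (hypothetical names; assumes the Syncfusion.Blazor NuGet package
// and a Blazor Web App template).
using Syncfusion.Blazor;
using Syncfusion.Licensing;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddRazorComponents().AddInteractiveServerComponents();

// Register the Syncfusion license key before any component renders.
SyncfusionLicenseProvider.RegisterLicense(builder.Configuration["Syncfusion:LicenseKey"]);

// Register the Syncfusion Blazor service and the (hypothetical) AI marker service.
builder.Services.AddSyncfusionBlazor();
builder.Services.AddScoped(sp => new AzureOpenAIMarkerService(
    builder.Configuration["AzureOpenAI:Endpoint"]!,
    builder.Configuration["AzureOpenAI:ApiKey"]!,
    builder.Configuration["AzureOpenAI:Deployment"]!));

var app = builder.Build();
app.UseAntiforgery();
app.MapRazorComponents<App>().AddInteractiveServerRenderMode();
app.Run();
```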
Building an AI-powered map

The core of the demo involved creating a Blazor application that does the following (a service sketch follows the list):

  • Accepts natural language queries.
  • Sends prompts to Azure OpenAI.
  • Parses the returned JSON.
  • Displays map markers dynamically.
  • Enhances tooltips with conditional images.
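Here is that sketch: a service that prompts Azure OpenAI for location data as raw JSON and deserializes the reply into marker objects. The class, prompt, and property names are illustrative, and the code assumes the Azure.AI.OpenAI 2.x SDK; the webinar’s actual implementation may differ in its details.

```csharp
// Sketch of an AI marker service (illustrative names; assumes Azure.AI.OpenAI 2.x).
using System.Text.Json;
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;

public class MarkerData
{
    public string Name { get; set; } = string.Empty;
    public double Latitude { get; set; }
    public double Longitude { get; set; }
}

public class AzureOpenAIMarkerService
{
    private readonly ChatClient _chat;

    public AzureOpenAIMarkerService(string endpoint, string apiKey, string deployment)
    {
        var client = new AzureOpenAIClient(new Uri(endpoint), new AzureKeyCredential(apiKey));
        _chat = client.GetChatClient(deployment);
    }

    public async Task<List<MarkerData>> GetMarkersAsync(string query)
    {
        try
        {
            // Constrain the model to raw JSON so the reply can be deserialized directly.
            ChatCompletion completion = await _chat.CompleteChatAsync(
                new SystemChatMessage(
                    "Respond ONLY with a JSON array of objects containing " +
                    "\"Name\", \"Latitude\", and \"Longitude\" fields. No markdown."),
                new UserChatMessage(query));

            string json = completion.Content[0].Text;
            return JsonSerializer.Deserialize<List<MarkerData>>(json) ?? new();
        }
        catch (Exception)
        {
            // Error path: fall back to an empty marker set so the map still renders.
            return new();
        }
    }
}
```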

Syncfusion components used

  • Maps
  • TextBox
  • Spinner
  • Inputs and layout enhancements
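These components come together in the page markup. The following Razor sketch is hypothetical wiring, assuming the Syncfusion.Blazor.Maps, Syncfusion.Blazor.Inputs, and Syncfusion.Blazor.Spinner packages: it binds the AI-generated markers to an SfMaps layer, with an SfTextBox for the natural-language query and an SfSpinner shown while the request is in flight.

```razor
@* Page sketch (hypothetical route and field names; component usage follows
   Syncfusion's documented patterns). *@
@page "/ai-map"
@using Syncfusion.Blazor.Maps
@using Syncfusion.Blazor.Inputs
@using Syncfusion.Blazor.Spinner
@inject AzureOpenAIMarkerService MarkerService

<SfTextBox Placeholder="Ask for places, e.g. 'top five coffee cities in Europe'"
           @bind-Value="query"></SfTextBox>
<button @onclick="SearchAsync">Search</button>

<SfSpinner @bind-Visible="isLoading"></SfSpinner>

<SfMaps>
    <MapsLayers>
        <MapsLayer ShapeData='new { dataOptions = "https://cdn.syncfusion.com/maps/map-data/world-map.json" }'
                   TValue="string">
            <MapsMarkerSettings>
                <MapsMarker Visible="true" DataSource="markers" TValue="MarkerData"
                            Shape="MarkerType.Circle" Fill="#e74c3c" Height="15" Width="15">
                    <MapsMarkerTooltipSettings Visible="true" ValuePath="Name"></MapsMarkerTooltipSettings>
                </MapsMarker>
            </MapsMarkerSettings>
        </MapsLayer>
    </MapsLayers>
</SfMaps>

@code {
    private string query = string.Empty;
    private bool isLoading;
    private List<MarkerData> markers = new();

    private async Task SearchAsync()
    {
        isLoading = true;
        markers = await MarkerService.GetMarkersAsync(query);
        isLoading = false;
    }
}
```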

Key benefits

  • No geospatial database required.
  • Minimal code, major capability.
  • Easy to extend.
  • End-to-end Blazor solution.

Key takeaways

  • Azure OpenAI can generate reliable location data directly from natural language.
  • Syncfusion Blazor Maps visualizes the data with rich UI features.
  • Blazor + AI enables the rapid development of smart, interactive experiences.
  • This demo provides a complete template you can extend for production apps.

Time stamps

  • [00:00] Welcome and session overview
  • [00:33] What you’ll build today
  • [01:27] Agenda and learning path
  • [02:12] Demo overview
  • [03:01] Why AI for geospatial experiences?
  • [03:56] Prerequisites and setup
  • [04:18] Starting the live demo
  • [05:32] Installing Syncfusion packages
  • [08:16] Registering services and configuring the app
  • [09:38] Creating the Azure OpenAI service class
  • [16:49] Handling AI response success and error paths
  • [18:12] Registering IDs, license keys, and models
  • [21:06] Building the marker model
  • [23:16] Implementing the AI-powered marker fetch
  • [25:43] Adding Syncfusion Map components
  • [27:18] Adding a natural language search box
  • [30:19] Adding dynamic tooltip images
  • [32:04] Live demo showcase
  • [33:20] Key takeaways and resources

Q&A

Q1: Azure OpenAI only or OpenAI too?

A: This demonstration focuses on integrating specifically with Azure OpenAI, not the public OpenAI API. Azure OpenAI offers enterprise-grade security, legal compliance, and regional deployment options, which is why it’s used in this example. If you prefer to use OpenAI’s public API, you would need to modify the code to call OpenAI’s endpoints and manage authentication accordingly.
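For illustration only (this was not shown in the webinar), and assuming the official OpenAI .NET SDK, that change is largely confined to how the chat client is constructed; the chat-completion calls themselves stay the same:

```csharp
// Hypothetical sketch: targeting the public OpenAI API instead of Azure OpenAI
// (assumes the official OpenAI .NET SDK).
using OpenAI.Chat;

// Azure OpenAI (as in the demo): endpoint + key + deployment name.
// var azure = new AzureOpenAIClient(new Uri(endpoint), new AzureKeyCredential(apiKey));
// ChatClient chat = azure.GetChatClient(deploymentName);

// Public OpenAI API: model name + API key only.
ChatClient chat = new(
    model: "gpt-4o",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));
```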

Q2: Hi, guys. What’s the major difference between this and Google Maps?

A: Syncfusion Blazor Maps is a UI component designed for data visualization in web and desktop applications. It is primarily used to render GeoJSON maps, integrate map providers, and display custom map layers for thematic or analytical purposes.

Google Maps, on the other hand, is a comprehensive mapping service offering global coverage, real-time navigation, traffic data, satellite imagery, and APIs for geolocation and routing. It serves as a map provider that can be displayed within Syncfusion Maps for additional customization and enhanced data visualization.

Q3: Are there any plans for an MCP server for Syncfusion docs? I was hoping for an MCP server to use with VS Code Copilot, similar to the MS Learn MCP server.

A: Yes. We already provide MCP servers for multiple platforms for use with AI coding assistants. Please refer to the link below for more information.

Related links