# dotnet-presentations/ai-workshop: Building GenAI Apps in C#: AI Templates, GitHub Models, Azure OpenAI & More

Get up to speed quickly with AI app building in .NET! Explore the new .NET AI project templates integrated with Microsoft Extensions for AI (MEAI), GitHub Models, and vector data stores. Learn how to take advantage of free GitHub Models in development, then deploy with global scale and enterprise support using Azure OpenAI. Gain hands-on experience building cutting-edge intelligent solutions with state-of-the-art frameworks and best practices.

The workshop covers two tracks: the AI Web Chat Application (Parts 1-6) and the Model Context Protocol (Parts 7-9).

The lab consists of a series of hands-on exercises in which you build a complete AI-powered web application for an outdoor gear company, using the new .NET AI project templates. The application enables users to chat with an AI assistant that has knowledge of the company's product catalog through document ingestion.

## Application Architecture 🏢

```mermaid
flowchart TD
    User([User]) <--> WebApp[Web Application<br>Blazor UI]
    WebApp <--> VectorDB[(Vector Database<br>Qdrant)]
    WebApp <--> AIChatService[AI Chat Service<br>Microsoft.Extensions.AI]
    AIChatService <--> AIProvider[AI Provider<br>GitHub Models / Azure OpenAI]

    subgraph Data Flow
        PDFs[Product PDFs] --> Ingestion[Data Ingestion]
        Ingestion --> Embeddings[Text Embeddings]
        Ingestion --> ProductData[Product Metadata]
        Embeddings --> VectorDB
        ProductData --> VectorDB
    end

    classDef webapp fill:#2774AE,stroke:#000,color:#fff
    classDef aiservice fill:#F58025,stroke:#000,color:#fff
    classDef database fill:#8A2BE2,stroke:#000,color:#fff
    classDef dataflow fill:#4CAF50,stroke:#000,color:#fff

    class WebApp webapp
    class AIChatService,AIProvider aiservice
    class VectorDB database
    class PDFs,Ingestion,Embeddings,ProductData dataflow
```

### Architecture Overview

This diagram illustrates the component relationships in the outdoor gear application. The Blazor web application connects to two key components: a vector database (Qdrant), which stores both text embeddings and product metadata, and an AI chat service powered by Microsoft.Extensions.AI. The AI functionality is provided by either GitHub Models (for development) or Azure OpenAI (for production). The data-flow subgraph shows how product PDFs are ingested, transformed into embeddings, and stored in the vector database to enable contextual AI responses.
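The AI chat service box above corresponds to the `IChatClient` abstraction from Microsoft.Extensions.AI. As a rough sketch (the helper name `AskAsync` is hypothetical, and MEAI method names have shifted across preview releases), the web app's chat layer can stay provider-agnostic like this:

```csharp
using Microsoft.Extensions.AI;

// Hypothetical helper: the application code depends only on IChatClient,
// not on any concrete AI provider SDK.
static async Task<string> AskAsync(IChatClient chat, string question)
{
    ChatResponse response = await chat.GetResponseAsync(
        new[] { new ChatMessage(ChatRole.User, question) });
    return response.Text;
}
```

Because both GitHub Models and Azure OpenAI can sit behind the same `IChatClient`, swapping providers later does not touch call sites like this one.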

```mermaid
sequenceDiagram
    actor User
    participant UI as Blazor UI
    participant Service as Product Service
    participant AI as AI Model
    participant DB as Vector Database

    User->>UI: Ask question about product
    UI->>Service: Query product information
    Service->>AI: Generate embeddings
    AI-->>Service: Return embeddings
    Service->>DB: Search similar vectors
    DB-->>Service: Return relevant documents
    Service->>AI: Generate response with context
    AI-->>Service: Return AI response
    Service-->>UI: Display response to user
```

### Sequence Overview

This diagram demonstrates the interaction flow when a user queries the system. When a customer asks about a product, the question is processed by the UI and passed to the Product Service. The AI model generates text embeddings for the query, which are then used to search the vector database for relevant documents. Once matching information is found, both the original question and the retrieved context are sent to the AI model to generate a contextually informed response. This response is then returned through the service layer to the UI for display to the user.
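The sequence above can be sketched in C#. This is illustrative only: `ProductChatService`, `Doc`, and the in-memory cosine search are hypothetical stand-ins for the template's Qdrant-backed retrieval, while `IChatClient` and `IEmbeddingGenerator` are the real Microsoft.Extensions.AI abstractions (method names may vary by package version).

```csharp
using Microsoft.Extensions.AI;

// Hypothetical in-memory document store standing in for Qdrant.
record Doc(string Text, ReadOnlyMemory<float> Vector);

class ProductChatService(
    IChatClient chat,
    IEmbeddingGenerator<string, Embedding<float>> embedder,
    IReadOnlyList<Doc> docs)
{
    public async Task<string> AskAsync(string question)
    {
        // 1. Embed the user's question.
        var embeddings = await embedder.GenerateAsync(new[] { question });
        ReadOnlyMemory<float> query = embeddings[0].Vector;

        // 2. Retrieve the most similar documents (cosine similarity).
        var context = docs
            .OrderByDescending(d => Cosine(query.Span, d.Vector.Span))
            .Take(3)
            .Select(d => d.Text);

        // 3. Ask the model, grounding it with the retrieved context.
        var messages = new List<ChatMessage>
        {
            new(ChatRole.System,
                "Answer using this context:\n" + string.Join("\n", context)),
            new(ChatRole.User, question),
        };
        var response = await chat.GetResponseAsync(messages);
        return response.Text;
    }

    static float Cosine(ReadOnlySpan<float> a, ReadOnlySpan<float> b)
    {
        float dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (MathF.Sqrt(na) * MathF.Sqrt(nb));
    }
}
```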

## Development to Production Flow 🚀

```mermaid
flowchart LR
    Dev[Development<br>GitHub Models] --> Prod[Production<br>Azure OpenAI]
    Local[Local Vector DB<br>Qdrant] --> Cloud[Cloud Vector DB<br>Qdrant]

    subgraph Development Environment
        Dev
        Local
    end

    subgraph Production Environment
        Prod
        Cloud
        ACA[Azure Container Apps]
    end

    classDef devnode fill:#2774AE,stroke:#000,color:#fff
    classDef prodnode fill:#F58025,stroke:#000,color:#fff
    classDef dbnode fill:#8A2BE2,stroke:#000,color:#fff

    class Dev devnode
    class Prod,ACA prodnode
    class Local,Cloud dbnode
```

### Development to Production Pathway

This diagram illustrates the transition path from a local development environment to production deployment. During development, you'll use GitHub Models and a local vector database, which provides a cost-effective environment for experimentation and testing. In production, the application transitions to Azure OpenAI for enterprise-grade AI capabilities, Qdrant for scalable vector storage, and Azure Container Apps for a scalable, managed cloud hosting environment. This migration path enables a seamless transition while maintaining architectural consistency.
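In practice, the dev-to-prod switch is largely a configuration change. A hypothetical `appsettings` fragment is shown below; the connection-string name and format here are assumptions for illustration (the template documents the exact keys), and real secrets belong in user secrets or environment variables, never in source control:

```json
{
  "ConnectionStrings": {
    "openai": "Endpoint=https://YOUR-RESOURCE.openai.azure.com/;Key=YOUR-KEY"
  }
}
```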

Throughout this lab, you'll implement each part of this architecture, from setting up the AI chat interface to building the product catalog and finally deploying to Azure.

Follow the setup instructions to get started with the lab.

The lab is divided into nine parts:

### AI Web Chat Application (Parts 1-6)
  1. ๐Ÿ—๏ธ Setup: Configure prerequisites and development environment for the AI workshop.

  2. ๐Ÿ—๏ธ Project Creation: Build a web application using the .NET AI Web Chat template.

  3. ๐Ÿ” Template Exploration: Understand the implementation of vector embeddings, semantic search, and chat interfaces in AI Web Chat projects.

  4. โ˜๏ธ Azure OpenAI: Transition from GitHub Models to the Azure OpenAI service for production-ready capabilities.

  5. ๐Ÿ›๏ธ Products Page: Implement a product catalog that leverages AI for enhanced product information.

  6. ๐Ÿš€ Deployment: Deploy your application to Azure using the Azure Developer CLI.
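The template used in Part 2 is installed and invoked from the CLI. A sketch follows; the option names shown (provider and vector-store choices) are assumptions that vary by template version, so check `dotnet new aichatweb --help` for the current set:

```shell
# Install the .NET AI project templates (one-time)
dotnet new install Microsoft.Extensions.AI.Templates

# Scaffold an AI Web Chat app; option names may differ by version
dotnet new aichatweb --name GenAiLab --provider githubmodels --vector-store qdrant
```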

### Model Context Protocol (MCP) Servers (Parts 7-9)

7. 🔧 MCP Server Basics: Create your first MCP server with weather tools that extend AI agents like GitHub Copilot.

8. 🏢 Enhanced MCP Server: Build sophisticated business tools for order management, inventory, and customer service scenarios.

9. 📦 MCP Publishing: Package, publish, and distribute your MCP servers through NuGet for professional deployment.
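For Parts 7-9, a minimal MCP server in C# might look like the following sketch, based on the prerelease ModelContextProtocol C# SDK (attribute and builder names may differ between SDK versions, and the weather data is a stub):

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using ModelContextProtocol.Server;
using System.ComponentModel;

var builder = Host.CreateApplicationBuilder(args);

// Register an MCP server over stdio and discover [McpServerTool] methods.
builder.Services
    .AddMcpServer()
    .WithStdioServerTransport()
    .WithToolsFromAssembly();

await builder.Build().RunAsync();

[McpServerToolType]
public static class WeatherTools
{
    [McpServerTool, Description("Gets a (fake) forecast for a city.")]
    public static string GetForecast(string city) =>
        $"Sunny in {city}, 22 degrees C."; // stub data for the sketch
}
```

Once registered in an MCP host such as GitHub Copilot, the tool becomes callable by the agent over stdio.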

The repository is structured as follows:

For workshop instructors and contributors who want to validate the workshop content, a comprehensive testing procedure is available:

### Automated Credential Setup

Before testing the workshop, run the credential setup script to configure required API keys and endpoints:

```powershell
# Navigate to the workshop root directory
cd ai-workshop

# Run the credential setup script
.\.github\scripts\setup-workshop-credentials.ps1
```

This script will prompt you for your GitHub token and your Azure OpenAI endpoint and key.

The credentials are saved as environment variables (`WORKSHOP_GITHUB_TOKEN`, `WORKSHOP_AZURE_OPENAI_ENDPOINT`, `WORKSHOP_AZURE_OPENAI_KEY`) and will be available for subsequent testing sessions.

The complete testing procedure and validation scripts are available in `.github/prompts/test-workshop.prompt.md`.

This project is licensed under the MIT License - see the LICENSE file for details.
