Building My First AI Chatbot with Azure OpenAI Services

In April 2024, I embarked on an exciting journey into the world of artificial intelligence (AI), starting with learning Python. Coming from an infrastructure-focused background, I found diving into AI development to be a monumental shift. With no prior coding experience, I realized that building AI tools or training models from scratch would require a long-term commitment. Still, I was determined to take small, practical steps toward understanding this fascinating field.

Exploring cloud AI services from providers like AWS and Azure led me to Azure OpenAI services. This platform showed me how AI could be leveraged to create applications—like chatbots—without deep coding expertise. I was especially intrigued by concepts such as indexing, semantic search, and retrieval-augmented generation (RAG). These tools enabled me to build my first business use case: a Microsoft Teams chatbot designed to resolve internal queries and reduce dependency on subject matter experts (SMEs).

This blog is a step-by-step guide to creating such a chatbot using Azure OpenAI services.

Prerequisites

Before starting, ensure you have:

  1. An Azure Portal Account with a Pay-As-You-Go (PAYG) subscription.

Steps to Create the Chatbot

  1. Create a Resource Group
  • In Azure, a resource group acts as a container to organize and manage related resources. You can create one in the portal or script it, as in the sketch below.
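If you prefer scripting to clicking through the portal, here is a minimal sketch using the Azure SDK for Python. It assumes the azure-identity and azure-mgmt-resource packages are installed and that you are already signed in (for example via the Azure CLI); the subscription ID, group name, and region are placeholders, not values from this project.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

subscription_id = "<your-subscription-id>"  # placeholder
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# create_or_update is idempotent: it creates the group if missing,
# or updates it in place if it already exists.
group = client.resource_groups.create_or_update(
    "rg-ai-chatbot", {"location": "eastus"}  # illustrative name and region
)
print(group.name, group.location)
```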
  2. Deploy an Azure AI Hub
  • Azure AI Hub serves as the central location to manage AI projects, including configuring security and connectivity.
  • Upon successful deployment, resources like a storage account and Key Vault are automatically created for secure storage.

  3. Set Up a Project
  • Within the AI Hub, create a new project.
  • Use this space to build the chatbot, access model catalogs, and utilize tools like Prompt Flow.

  4. Deploy a Base Model
  • Select a model to serve as the foundation for your chatbot. For this project, I used GPT-4o-mini. A quick way to confirm the deployment responds is shown in the sketch below.
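Once the deployment finishes, you can call it from a short script to verify it answers. This is a minimal sketch assuming the openai Python package (v1 or later); the endpoint, key, API version, and deployment name are placeholders you copy from your own Azure OpenAI resource.

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com",  # placeholder
    api_key="<your-api-key>",                                   # placeholder
    api_version="2024-06-01",  # use the API version your resource supports
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the deployment name you chose
    messages=[{"role": "user", "content": "Hello! Are you up and running?"}],
)
print(response.choices[0].message.content)
```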

  5. Prepare and Upload Data
  • Compile relevant data in a CSV file.
  • Upload the file to the Azure AI Studio storage for indexing, either through the portal or with a script like the sketch below.
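Uploading through the Azure AI Studio data tab is the simplest route, but the upload can also be scripted. A minimal sketch, assuming the azure-storage-blob package is installed; the connection string, container, and file names are illustrative.

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string for the storage account backing the AI Hub.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob = service.get_blob_client(container="chatbot-data", blob="internal_faq.csv")

# The CSV layout is up to you; columns like question,answer,owner work well.
with open("internal_faq.csv", "rb") as f:
    blob.upload_blob(f, overwrite=True)
print("Uploaded", blob.blob_name)
```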
  6. Create a Data Index
  • This index allows the chatbot to search through your uploaded data efficiently. Once built, you can query it directly to verify the documents were ingested, as in the sketch below.
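A minimal sketch for checking the index, assuming it is backed by an Azure AI Search service and the azure-search-documents package is installed; the endpoint, index name, key, and sample question are placeholders.

```python
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

search = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",  # placeholder
    index_name="chatbot-index",                                   # placeholder
    credential=AzureKeyCredential("<your-search-api-key>"),       # placeholder
)

# Print the top three matches for a sample internal question.
for doc in search.search("How do I request VPN access?", top=3):
    print(doc)
```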

  7. Develop a Prompt Flow
  • Design a Prompt Flow that uses the data index to process user queries.
  • Add logic to send user inputs to the large language model (LLM) and provide responses based on the indexed information, as in the sketch below.
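Inside the flow, one common pattern is a small Python node that assembles the prompt from the retrieved documents before the LLM node runs. The sketch below is illustrative rather than the exact node from this project; the "content" field name is an assumption, and the @tool import path can vary between promptflow versions.

```python
from promptflow.core import tool  # in older promptflow versions: from promptflow import tool

@tool
def build_prompt(question: str, documents: list) -> str:
    """Combine retrieved context with the user's question for the LLM node."""
    # Assumes each retrieved document is a dict with a "content" field.
    context = "\n\n".join(d.get("content", "") for d in documents)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
```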

     Integrate SerpAPI
  • To enhance functionality, integrate SerpAPI for real-time access to search engine results from Google and Bing. This allows the chatbot to address broader questions beyond its dataset; one way to wire it in is the tool node sketched below.
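A sketch of a custom Python tool node that wraps SerpAPI, under the assumption that the promptflow and google-search-results packages are installed and that you have a SerpAPI key; the function name and parameters are illustrative, not the exact node from this project.

```python
from promptflow.core import tool
from serpapi import GoogleSearch

@tool
def web_search(query: str, serpapi_key: str) -> str:
    """Return the top organic result snippets for a live web query."""
    results = GoogleSearch({"q": query, "api_key": serpapi_key}).get_dict()
    snippets = [
        r.get("snippet", "") for r in results.get("organic_results", [])[:3]
    ]
    return "\n".join(snippets)
```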

  8. Deploy the Prompt Flow
  • Publish the flow as an endpoint that can be accessed by external applications; a sample client call is sketched below.
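Any application that can make an HTTPS request can then consume the endpoint. A minimal sketch with the requests package; the URL, key, and payload field names are placeholders that you copy from the endpoint's Consume tab in Azure AI Studio.

```python
import requests

url = "https://<your-endpoint>.<region>.inference.ml.azure.com/score"  # placeholder
headers = {
    "Authorization": "Bearer <endpoint-key>",  # placeholder
    "Content-Type": "application/json",
}
payload = {"question": "How do I reset my VPN token?"}  # field name depends on your flow inputs

resp = requests.post(url, headers=headers, json=payload, timeout=60)
resp.raise_for_status()
print(resp.json())
```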


  9. Integrate with Microsoft Teams
  • Use Microsoft Power Automate to consume the endpoint.
  • Build a workflow that integrates the chatbot with Microsoft Teams, enabling seamless interactions within your organization.


 Now, let’s see the chatbot in action with a sample question and its response.


Key Takeaways

Building this chatbot marked my first practical application of AI. It allowed me to:

  • Learn and apply cutting-edge AI services without coding expertise.
  • Solve a real-world problem by reducing dependencies on specific team members.
  • Explore tools like Azure AI Hub, Prompt Flow, and SerpAPI.

Conclusion

This project demonstrates how cloud-based AI tools like Azure OpenAI can empower individuals without a coding background to create impactful solutions. It’s a small yet significant step in my AI journey, and I hope this guide inspires you to start your own.

Feel free to share your thoughts or questions in the comments!
