Building a WhatsApp AI bot for customer support
In this blog post, we’ll explore how to build a customer support application that integrates with WhatsApp using Azure Communication Services and Azure OpenAI. This app enables users to interact with a self-service bot to resolve common customer queries, such as troubleshooting errors or checking order status.

AI-Powered Chat with Azure Communication Services and Azure OpenAI
Many applications offer chat with automated capabilities but lack the depth to fully understand and address user needs. What if a chat app could not only connect people but also improve conversations with AI insights? Imagine detecting customer sentiment, bringing in experts as needed, and supporting global customers with real-time language translation. These aren’t hypothetical AI features, but ways you can enhance your chat apps using Azure Communication Services and Azure OpenAI today. In this blog post, we guide you through a quickstart available on GitHub for you to clone and try on your own. We highlight key features and functions, making it easy to follow along. Learn how to upgrade basic chat functionality using AI to analyze user sentiment, summarize conversations, and translate messages in real time.

Natural Language Processing for Chat Messages
First, let’s go through the key features of this project.
Chat Management: The Azure Communication Services Chat SDK enables you to manage chat threads and messages, including adding and removing participants as well as sending messages.
AI Integration: Use Azure OpenAI GPT models to perform:
Sentiment Analysis: Determine whether user chat messages are positive, negative, or neutral.
Summarization: Get a summary of a chat thread to understand the key points of a conversation.
Translation: Translate messages into different languages.
RESTful endpoints: Easily integrate these AI capabilities and chat management through RESTful endpoints.
Event Handling (optional): Use Azure Event Grid to handle chat message events and trigger the AI processing.

The starter code for the quickstart is designed to get you up and running quickly. After entering your Azure Communication Services and Azure OpenAI credentials in the config file and running a few commands in your terminal, you can observe the features listed above in action. There are two main components to this example. The first is the ChatClient, which manages capturing and sending messages via a basic chat application built on Azure Communication Services. The second component, OpenAIClient, enhances your chat application by transmitting messages to Azure OpenAI along with instructions for the desired types of AI analysis.

AI Analysis with OpenAIClient
Azure OpenAI can perform a multitude of AI analyses, but this quickstart focuses on summarization, sentiment analysis, and translation. To achieve this, we created a distinct system prompt for each of the analyses we want to perform on our chat messages. These system prompts serve as the instructions for how Azure OpenAI should process the user messages. To summarize a message, we hard-coded a system prompt that says, “Act like you are an agent specialized in generating summary of a chat conversation, you will be provided with a JSON list of messages of a conversation, generate a summary for the conversation based on the content message.” Like the best LLM prompts, it’s clear, specific, and provides context for the inputs it will get. The system prompts for translation and sentiment analysis follow a similar pattern. The quickstart provides the basic architecture for taking chat content and passing it to Azure OpenAI for sentiment analysis, translation, and summarization.

The Core Function: getChatCompletions
The getChatCompletions function is a pivotal part of the AI chat sample project. It processes user messages from a chat application, sends them to the OpenAI service for analysis, and returns the AI-generated responses.
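The snippet below is a minimal sketch of what such a helper might look like using the JavaScript @azure/openai SDK that the quickstart builds on. The environment variable names, the prompt wiring, and the helper’s exact shape are illustrative assumptions; the repository’s actual implementation may differ. The step-by-step breakdown that follows walks through the same flow.

```typescript
import { OpenAIClient, AzureKeyCredential } from "@azure/openai";

// Client for your Azure OpenAI resource (the env variable names are assumptions).
const openAiClient = new OpenAIClient(
  process.env.AZURE_OPENAI_ENDPOINT!,
  new AzureKeyCredential(process.env.AZURE_OPENAI_API_KEY!)
);

// The summarization system prompt quoted above; translation and sentiment
// analysis would each use their own prompt in the same way.
const summarizationPrompt =
  "Act like you are an agent specialized in generating summary of a chat conversation, " +
  "you will be provided with a JSON list of messages of a conversation, " +
  "generate a summary for the conversation based on the content message.";

async function getChatCompletions(systemPrompt: string, userPrompt: string): Promise<string> {
  // Deployment name of the GPT model, read from environment variables.
  const deploymentName = process.env.AZURE_OPENAI_DEPLOYMENT_NAME!;

  // Prepare the system instructions and the user's chat content as messages,
  // then send them to Azure OpenAI.
  const result = await openAiClient.getChatCompletions(deploymentName, [
    { role: "system", content: systemPrompt },
    { role: "user", content: userPrompt },
  ]);

  // Extract, log, and return the AI-generated content.
  const content = result.choices[0].message?.content ?? "";
  console.log(content);
  return content;
}

// Example: summarize a (hypothetical) thread serialized as JSON.
// const summary = await getChatCompletions(summarizationPrompt, JSON.stringify(chatMessages));
```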
Here’s a detailed breakdown of how it works:

Parameters
The getChatCompletions function takes two required parameters:
systemPrompt: A string that provides instructions or context to the AI model. This helps guide Azure OpenAI to generate appropriate and relevant responses.
userPrompt: A string that contains the actual message from the user. This is what the AI model analyzes and responds to.

Deployment Name: The getChatCompletions function starts by retrieving the deployment name for the OpenAI model from the environment variables.
Message Preparation: The function formats and prepares the messages to send to OpenAI. This includes the system prompt with instructions for the AI model and the user prompts that contain the actual chat messages.
Sending to OpenAI: The function sends these prepared messages to the OpenAI service using the openAiClient’s getChatCompletions method. This method interacts with the OpenAI model to generate a response based on the provided prompts.
Processing the Response: The function receives the response from OpenAI, extracts the AI-generated content, logs it, and returns it for further use.

Explore and Customize the Quickstart
The goal of the quickstart is to demonstrate how to connect a chat application and Azure OpenAI, then expand on the capabilities. To run this project locally, make sure you meet the prerequisites and follow the instructions in the GitHub repository. The system prompts and user messages are provided as samples for you to experiment with. The sample chat interaction is quite pleasant, so feel free to play around with the system prompts, change the sample messages between fictional Bob and Alice in client.ts to something more hostile, and see how the analysis changes. For example, you can edit the sample messages, run the project again, and compare the results.

Real-time messages
For your own chat application, you should analyze messages in real time. This demo is designed to simulate that workflow for ease of setup, with messages sent through your local demo server. However, the GitHub repository for this quickstart project provides instructions for implementing this in your actual application. To analyze real-time messages, you can use Azure Event Grid to capture any messages sent to your Azure Communication Services resource, along with the necessary chat data. From there, you trigger the function that calls Azure OpenAI with the appropriate context and system prompts for the desired analysis (a hedged sketch of such a trigger function appears after the conclusion below). More information about setting up this workflow is available with "optional" tags in the quickstart's README on GitHub.

Conclusion
Integrating Azure Communication Services with Azure OpenAI enables you to enhance your chat applications with AI analysis and insights. This guide helps you set up a demo that shows sentiment analysis, translation, and summarization, improving user interactions and engagement. To dive deeper into the code, check out the Natural Language Processing of Chat Messages repository, and build your own AI-powered chat application today!
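For reference, here is a hedged sketch of the optional Event Grid workflow described in the Real-time messages section above: an Azure Functions handler (v4 programming model) subscribed to Azure Communication Services chat-message events, forwarding each new message for AI analysis. The event data fields, the imported helper module, and the sentiment prompt are assumptions; the quickstart's README describes the actual setup.

```typescript
import { app, EventGridEvent, InvocationContext } from "@azure/functions";
// Hypothetical module exporting the getChatCompletions helper and prompts sketched earlier.
import { getChatCompletions, sentimentPrompt } from "./openAiClient";

app.eventGrid("onChatMessageReceived", {
  handler: async (event: EventGridEvent, context: InvocationContext): Promise<void> => {
    // Only react to new chat messages from the Azure Communication Services resource.
    if (event.eventType !== "Microsoft.Communication.ChatMessageReceived") {
      return;
    }

    // Field names below are illustrative; check the event schema in the ACS docs.
    const data = event.data as { messageBody?: string; threadId?: string };
    if (!data.messageBody) {
      return;
    }

    // Run the desired analysis (sentiment here) on the incoming message.
    const sentiment = await getChatCompletions(sentimentPrompt, data.messageBody);
    context.log(`Sentiment for thread ${data.threadId}: ${sentiment}`);
  },
});
```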
Make your own private ChatGPT

Introduction
Creating your own private ChatGPT allows you to leverage AI capabilities while ensuring data privacy and security. This guide walks you through building a secure, customized chatbot using tools like Azure OpenAI, Cosmos DB, and Azure App Service.

Why Build a Private ChatGPT?
With the rise of AI-driven applications, organizations and individuals often face challenges related to data privacy, customization, and integration. Building a private ChatGPT addresses these concerns by:
Maintaining Data Privacy: Keep sensitive information within your infrastructure.
Customizing Responses: Tailor the chatbot’s behavior and language to suit your requirements.
Ensuring Security: Leverage enterprise-grade security protocols.
Avoiding Data Sharing: Prevent your data from being used to train external models.
If organizations do not take these measures, their data may end up in future model training and sensitive information can leak to the public. For example, ChatGPT collects personal data, as described in its privacy policy.

Prerequisites
Before you begin, ensure you have:
Access to Azure OpenAI Service.
A development environment set up with Python.
Basic knowledge of FastAPI and MongoDB.
An Azure account with the necessary permissions.
If you do not have an Azure subscription, try Azure for Students for free.

Step 1: Set Up Azure OpenAI
Log in to the Azure Portal and create an Azure OpenAI resource. Deploy a model, such as GPT-4o (multimodal), and note down the endpoint and API key; keyless authentication is also an option. Configure permissions to control access.

Step 2: Use a ChatGPT-like app sample
You can use any of the following repositories as a base template for your app; in this guide I will be using the third option, AOAIchat, which I developed.
GitHub - mckaywrigley/chatbot-ui: AI chat for any model.
Azure-Samples/azure-search-openai-demo: A sample app for the Retrieval-Augmented Generation pattern running in Azure, using Azure AI Search for retrieval and Azure OpenAI large language models to power ChatGPT-style and Q&A experiences.
sourabhkv/AOAIchat: Azure OpenAI chat

The architecture of a typical private ChatGPT application includes the following components:
App UX (User Interface): The front-end application (mobile, web, or desktop) where users interact with the chatbot. It sends the user's input (prompt) and displays the AI's responses.
App Service: Acts as the backend application, handling user requests and coordinating with other services. It receives user inputs and prepares them for processing by the Azure OpenAI service, streams AI responses back to the App UX, and reads from and writes to Cosmos DB to manage chat history.
Azure OpenAI Service: The core AI service, processing the user input and generating responses using models like GPT-4o. The App Service sends the user input (along with context) to this service and receives the AI-generated responses.
Cosmos DB: A NoSQL database used to store and manage chat history. It stores user messages and AI-generated responses for future reference or analysis, and that history is read back to provide context for AI responses, enabling more intelligent and contextual conversations.

Data Flow:
User inputs are sent from the App UX to the App Service.
The App Service forwards the input (with additional context, if needed) to Azure OpenAI.
Azure OpenAI generates a response, which is streamed back to the App UX via the App Service.
The App Service writes user inputs and AI responses to Cosmos DB for persistence.
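To make the data flow above concrete, here is a minimal sketch of a single chat turn: load the thread's history from Cosmos DB (via its MongoDB API), call Azure OpenAI with that context, and persist both messages. The AOAIchat sample itself is a Python/FastAPI app; this TypeScript sketch only illustrates the flow, and the database, collection, and field names are assumptions.

```typescript
import { MongoClient } from "mongodb";
import { OpenAIClient, AzureKeyCredential } from "@azure/openai";

const mongo = new MongoClient(process.env.MONGO_DETAILS!);
const openai = new OpenAIClient(
  process.env.AZURE_OPENAI_ENDPOINT!,
  new AzureKeyCredential(process.env.AZURE_OPENAI_API_KEY!)
);

type StoredMessage = { sessionId: string; role: "user" | "assistant"; content: string; createdAt: Date };

// One conversation turn: history in, AI response out, both persisted.
async function handleUserTurn(sessionId: string, userText: string): Promise<string> {
  // Database and collection names are hypothetical.
  const history = mongo.db("aoaichat").collection<StoredMessage>("messages");

  // Read prior messages so the model answers with conversational context.
  const past = await history.find({ sessionId }).sort({ createdAt: 1 }).toArray();

  const result = await openai.getChatCompletions(
    process.env.DEPLOYMENT_NAME!,
    [
      { role: "system", content: "You are a helpful assistant." },
      ...past.map((m) => ({ role: m.role, content: m.content })),
      { role: "user", content: userText },
    ],
    { maxTokens: Number(process.env.MAX_TOKENS ?? 800) }
  );
  const reply = result.choices[0].message?.content ?? "";

  // Persist the user prompt and the AI response for future context and analysis.
  await history.insertMany([
    { sessionId, role: "user", content: userText, createdAt: new Date() },
    { sessionId, role: "assistant", content: reply, createdAt: new Date() },
  ]);

  return reply;
}
```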
This architecture ensures scalability, secure data handling, and the ability to provide contextual responses by integrating database and AI services.

What can you do with my template?
AOAIchat supports personal and enterprise chat enabled by RAG. People can enable RAG mode if they want to search within their own database; otherwise it behaves like a normal ChatGPT. It also supports multimodality (image and text input), depending on the model deployed in Azure AI Foundry.

Step 3: Deploy to Azure
Deploy a Cosmos DB account in the nearest region.
Deploy an Azure OpenAI model (gpt-4o or gpt-4o-mini recommended).
Deploy an Azure App Service as a container (I recommend the B1 plan in your nearest region), select the Docker registry image sourabhkv/aoaichatdb:0.1, and set the startup command to uvicorn app:app --host 0.0.0.0 --port 80.
After the App Service starts, set all environment variables.

The application requires the following environment variables for proper configuration:
AZURE_OPENAI_ENDPOINT – The endpoint for the Azure OpenAI API.
AZURE_OPENAI_API_KEY – API key for accessing Azure OpenAI.
DEPLOYMENT_NAME – Azure OpenAI deployment name.
API_VERSION – API version for Azure OpenAI.
MAX_TOKENS – Maximum tokens for API responses.
MONGO_DETAILS – MongoDB connection string.

AZURE_OPENAI_ENDPOINT=<your_azure_openai_endpoint>
AZURE_OPENAI_API_KEY=<your_azure_openai_api_key>
DEPLOYMENT_NAME=<your_deployment_name>
API_VERSION=<your_api_version>
MAX_TOKENS=<max_tokens>
MONGO_DETAILS=<your_mongo_connection_string>

Optional feature: implement authentication to secure access. Within the App Service, select Authentication and choose a service provider. I went with Entra ID-based authentication on a single tenant; multi-tenant and personal accounts are also options. Restart the App Service, and within a couple of minutes your private ChatGPT is ready.

Pricing
Pricing depends on the plans and regions in which you deploy your resources; check the Azure pricing calculator for an estimate. My estimate, with all resources deployed in Sweden Central:
Cosmos DB – Cosmos DB for MongoDB (RU) serverless with a single write master, 2 GB transactional storage, and 2 backup plans (free) ~ $0.75
Azure OpenAI Service – S0 plan, gpt-4o-mini global deployment, 20,000 input tokens and 10,000 output tokens ~ $9.00
App Service plan – Linux, tier B1, instance count 1 ~ $13.14
Total monthly cost = $22.89
These prices may change over time and vary by region; I calculated this configuration with the Azure pricing calculator.

Governance
Azure OpenAI provides content filters to block input that violates responsible AI practices. Categories include:
Hate and Fairness
Sexual
Violence
Self-harm
User Prompt Attacks (direct and indirect)
The content filtering system detects and takes action on specific categories of potentially harmful content in both input prompts and output completions. Azure OpenAI Service applies default safety settings to all models, set to medium; content filters can be adjusted to different levels depending on the use case. The template also supports RAG; I have provided a detailed solution for it in my GitHub repository.

Practical implementation
GE Aerospace, in partnership with Microsoft and Accenture, has launched a company-wide generative AI platform, leveraging Microsoft Azure and Azure OpenAI Service. This solution aims to transform asset tracking and compliance in aviation, enabling quick access to maintenance records and reducing manual processing time from days to minutes.
It supports informed decision-making by providing insights into aircraft leasing, compliance gaps, and asset health. For enterprises implementing private ChatGPT solutions, this illustrates the potential of generative AI for streamlining document-intensive processes while ensuring data security and compliance through cloud-based infrastructure like Azure.
GE Aerospace Launches Company-wide Generative AI Platform for Employees | GE Aerospace News
Build your own private ChatGPT style app with enterprise-ready architecture - by Microsoft Mechanics

How to make a private ChatGPT for free?
It can be free if the entire setup runs locally on your hardware:
Cosmos DB <-> MongoDB
Azure OpenAI <-> Ollama/LM Studio (refer to this). Note: I have used gpt-4o and gpt-4o-mini, and these values are hardcoded in the webpage; if you are using other models, you might have to change them in index.html.
App Service <-> Local machine
Register for GitHub Models to access the API for free. Note: GitHub Models have rate limits that differ per model.
A minimal sketch of the local Ollama substitution appears after the links below.

Useful links
sourabhkv/AOAIchat: Azure OpenAI chat
What is RAG?
Get started with Azure OpenAI API
Chat with Azure OpenAI models using your own data
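As a concrete illustration of the "Azure OpenAI <-> Ollama/LM Studio" substitution mentioned above, the sketch below points the openai npm client at Ollama's local OpenAI-compatible endpoint. The port is Ollama's default and the model name is just an example of a model pulled locally; both are assumptions, and since the AOAIchat sample itself is Python, treat this purely as an illustration of the idea.

```typescript
import OpenAI from "openai";

// A fully local stand-in for Azure OpenAI: Ollama exposes an OpenAI-compatible API.
const local = new OpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's default local endpoint
  apiKey: "ollama",                     // Ollama ignores the key, but the client requires a value
});

async function main(): Promise<void> {
  const completion = await local.chat.completions.create({
    model: "llama3.1", // any model you have pulled locally, e.g. with `ollama pull llama3.1`
    messages: [{ role: "user", content: "Say hello from a fully local private ChatGPT." }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```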
Another "Mondays at Microsoft" coming your way! Karuana Gatimu & Heather Cook will focus on AI in aviation, measuring AI performance, overview LLMs, discuss Microsoft 365 Copilots agents in SharePoint, plus highlight upcoming community events, and more. Join LIVE Monday, October 21, 2024, 8:00am PDT. See you soon! #CommunityLuv 💝 Oct. 21 agenda, plus links to learn more: Insights from the fifth annual Microsoft Digital Defense Report. AI in Aviation: Introduction of a new industry reference architecture for airlines and airports. Measurement in AI: Discussion on the importance of measuring AI performance. AI for Water Conservation: Highlighting an AI tool developed by FIDO Tech to detect and prioritize water leaks. Overview of large language models (LLMs) and their applications. Introduction of Copilot agents in SharePoint. New updates for Pipelines in Power Platform, including custom pipelines host. To learn more about the Microsoft Global Community Initiative - a meeting place for all who are part of the Microsoft Community ecosystem, all are welcome: https://aka.ms/MGCI We launched a new page listing our newly minted regional directors. https://aka.ms/MGCIAdvisors So many great events coming up and all available to check out on www.communitydays.org + its community calls page. FYI |Microsoft Ignite | November 19-22, 2024, in Chicago, USA – https://ignite.microsoft.com FYI | Our next episode of Mondays at Microsoft isNovember 4th; check our show page for more.733Views0likes0CommentsMondays at Microsoft | Episode 34
Mondays at Microsoft | Episode 34

Power boost your day with Karuana Gatimu and Heather Cook. They've got the latest community news and events prepped for the next "Mondays at Microsoft." This event aired live at 8am PDT on Monday, September 23rd, 2024! 😎
[Episode 34] Show notes and links to all that was shared and discussed during this episode:
TurboVote | Register to vote, check your registration, or sign up for election reminders — https://aka.ms/TurboVote.
The next phase of Microsoft 365 Copilot innovation — https://aka.ms/Copilot/NextPhase. #Wave2: Copilot Pages, Copilot in Excel with Python, Copilot in PowerPoint, Copilot agents in SharePoint, and more. Plus, the Microsoft 365 Copilot Wave 2 announcements blog by Jared Spataro — https://aka.ms/Copilot/NextPhase/blog.
Microsoft's new Qualcomm-powered Surface devices now available — https://aka.ms/QualcommDevices.
Learn to build a security-first culture with AI at Microsoft Ignite 2024 — https://aka.ms/SecurityIgnite2024.
New Copilot enhancements help small and medium-sized businesses (SMB) innovate — https://aka.ms/CopilotEnhancementsSMB.
Bidirectional translation support now available for language interpretation in Teams — https://aka.ms/BidirectionalSupport.
Unveiling Copilot agents built with Microsoft Copilot Studio - https://aka.ms/UnveilingCopilotAgents.
"How the Microsoft Power Platform community is using low-code and AI to transform work and lives" by Charles Lamanna.
Register for the European Microsoft Fabric Community Conference in Stockholm (September 24-27) - https://aka.ms/Fabric/CommunityConference.
To learn more about the Microsoft Global Community Initiative - a meeting place for all who are part of the Microsoft Community ecosystem, all are welcome: https://aka.ms/MGCI.
We launched a new page listing our newly minted regional directors: https://aka.ms/MGCIAdvisors
So many great events coming up, all available to check out on https://www.communitydays.org. We also just launched a community calls page so you can find community-related calls hosted by Microsoft and the community.
Join us for the next episode of Mondays at Microsoft on October 8th at 8am PDT, and check out the page for our show and Community Studio efforts at https://adoption.microsoft.com/mondays-at-microsoft.
OneDrive community call | August 2024

Join the OneDrive product team each month to hear what's top of mind, get insights on roadmap updates, and dig into a special topic. Each call includes live Q&A where you'll have a chance to ask the OneDrive product team any question about OneDrive - the home of your files. This month's special topic was Copilot in OneDrive with arjuntomar (Senior Product Manager on the OneDrive team at Microsoft). The broadcast above is now available on demand. Share this page far and wide with anyone you know who would be interested. Note: The original "Join" aka.ms URL link gets updated each month to take you to that month's event space: https://aka.ms/OneDriveCommunityCall. Our goal is to simplify the way you create and access the files you need, get the information you are looking for, and manage your tasks efficiently. We can't wait to share, listen, and engage - monthly! Anyone can join this one-hour webinar to ask us questions, share feedback, and learn more about the features we're releasing soon and our roadmap. Note: Our monthly public calls are not an official support tool. To open support tickets, see Get support for Microsoft 365; support for educators and education customers is available at https://aka.ms/edusupport. Stay up to date on Microsoft OneDrive adoption at adoption.microsoft.com. Join our community to catch all news and insights from the OneDrive community blog. And follow us on Twitter: @OneDrive. Thank you for your interest in making your voice heard and taking your knowledge and depth of OneDrive to the next level. Thanks for joining.
Mondays at Microsoft | Episode 32

Stay informed | Join Karuana Gatimu and Heather Cook for the latest in AI, broad Microsoft 365 product awareness, community activities and events, and more. Join live on Monday, August 19th, 8:00am PDT. #CommunityLuv
Note: We multicast all episodes. You can watch the show live right here on its Community News Desk event page, via LinkedIn Live from the "Microsoft Community" page, or directly within YouTube on the Microsoft Community Learning channel.
Show notes and links to all that was shared and discussed on August 19th, 2024:
Now available: Copilot for Microsoft 365 Risk Assessment QuickStart Guide.
Find your city on the Microsoft AI Tour, starting September 24.
Shout out to the TechCon365 Microsoft 365 Power Platform Conference team for a great DC show, and FYI about their next event in Dallas (Nov. 11-15, 2024).
Announcing mandatory multi-factor authentication (MFA) for Microsoft Azure sign-in.
Lots of great Microsoft 365 product training content: https://aka.ms/M365ConSessions.
Introducing new OneDrive, Quick Links and Playlist cards for Viva Connections.
Viva Connections web part changes.
Join the Microsoft OneDrive community call on Wednesday, August 21st at 8am PDT — https://aka.ms/OneDriveCommunityCall.
Join the Microsoft Viva community call on Wednesday, August 21st at 9am PDT — https://aka.ms/VivaCommunityCall.
Microsoft Entra ID Admin Center & Admin portal will no longer support license management.
Microsoft Loop | 5-Part Learning Series.
"Building Power Apps Canvas App with Multimedia Integration in SharePoint - Audio Player" by Shadrack Inusah.
The latest episode of the "Abstracts" podcast discusses "Personhood credentials: Artificial intelligence and the value of privacy-preserving tools to distinguish who is real online."
To learn more about the Microsoft Global Community Initiative - a meeting place for all who are part of the Microsoft Community ecosystem, all are welcome: https://aka.ms/MGCI
We launched a new page listing our newly minted regional directors: https://aka.ms/MGCIAdvisors
Find upcoming events and community calls on https://CommunityDays.org
Join us in 3 weeks on Monday, September 9th, 8am PDT for our next show!
Mondays at Microsoft | Episode 31

NOW ON DEMAND | Busy last few weeks, with more to come. Stay in the know with Mondays at Microsoft. Karuana_Gatimu_MSFT and Heather Newman keep you grounded with the latest in AI, broader product awareness, community activities and events, and more. Note: Aired Monday, August 5th, 8:00am PDT. #CommunityLuv
Show notes and links to all that was shared and discussed during this episode:
Lumen’s inaugural “Copilot Olympics”.
Chi Mei Medical Center developed AI copilots built with Microsoft Azure OpenAI Service to help overworked staff lighten their workloads.
Xbox at gamescom 2024.
Microsoft collaborates with Mass General Brigham and University of Wisconsin-Madison to advance AI foundation models for medical imaging.
Community Skills challenge: Power skills for user enablement.
Trustworthy and Responsible AI Network (TRAIN) expands to help European healthcare organizations enhance the quality, safety and trustworthiness of AI in health.
In the U.S., Pacific Gas & Electric Company (PG&E) implements Power Platform and Copilot Studio to automate low-value tasks and rededicate employees to focus on high-value work.
Microsoft Loop | 5-Part Learning Series: https://aka.ms/LoopLearningSeries
Part 1 | “Meet the makers: The story behind Loop” | Wednesday, August 14th, 2024, from 10:00am – 11:00am PDT.
Part 2 | “Almost everything you need to know to start with Microsoft Loop” | Wednesday, August 21st, 2024, from 10:00am – 10:30am PDT.
Part 3 | “Level up your project management with Loop” | Wednesday, August 28th, 2024, from 10:00am – 10:30am PDT.
Part 4 | "Meet Copilot in Loop" | Wednesday, September 4th, 2024, from 10:00am – 10:30am PDT.
Part 5 | "AMA: Live Video Q&A with the Loop team" | Wednesday, September 11th, 2024, from 10:00am – 11:30am PDT.
What’s new in Power Apps: July 2024 Feature Update.
Fabric Influencers Spotlight July 2024 - introducing the Fabric Influencers Spotlight, shining a light on Microsoft MVPs & Fabric Super Users.
Microsoft Community ecosystem (MGCI), all are welcome - learn more: https://aka.ms/MGCI.
And check out our new Linktree for Mondays at Microsoft: https://linktr.ee/MondaysAtMicrosoft - all the resources connected to the show.