Azure AI Foundry: Power Apps Connector
17. Nov 2025 21:59


From HubSite 365, featuring Andrew Hess - MySPQuestions

I currently share my knowledge of the Power Platform, including Power Apps and Power Automate. I also bring over 8 years of experience working with SharePoint and SharePoint Online.

An expert guide to building a Power Platform custom connector for an Azure AI Foundry LLM, with a focus on cost control and Power Automate integration.

Key insights

  • Azure AI Foundry Custom Connector: Lets you call your own Azure-hosted LLMs directly from Power Platform tools.
    Available since September 2025, it connects custom models to flows, apps, and Copilot agents.
  • Power Platform integration: Works natively with Power Automate, Power Apps, Logic Apps, and Copilot Studio for low-code AI workflows.
    Use it to add chat completions, RAG responses, or model-driven actions inside flows and apps.
  • Deployment and authentication: Deploy a model in Azure AI Foundry, then configure the connector with the endpoint URI, model deployment name, and API key.
    Set the host and authorization in the connector so Power Platform can call the model securely.
  • Domain specialization: Use fine-tuned or industry-specific models to get more accurate, context-aware answers for legal, finance, support, or other domains.
    Combine with retrieval-augmented generation (RAG) to surface your own documents and facts.
  • Cost control and security: Route calls through your Azure resources to avoid external credits and to control spending.
    Azure AI Foundry supports network isolation, identity controls, and governance for enterprise compliance.
  • Limits and best practices: Be aware of throttling (for example, 5,000 calls per connection per minute) and test flows for error handling and retries.
    Define clear request parameters, update the connector after model changes, and use visibility and authorization settings to protect access.
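The deployment and authentication pieces above can be sketched as a plain HTTP call. This is a minimal sketch assuming the Azure OpenAI-compatible chat-completions route that Azure AI Foundry model deployments expose; the resource endpoint, API key, and api-version below are placeholder values to replace with your own.

```python
import json
import urllib.request

def build_chat_request(endpoint: str, deployment: str, api_key: str,
                       prompt: str, api_version: str = "2024-06-01"):
    """Assemble the URL, headers, and body for a chat-completions call
    to a model deployed in Azure AI Foundry (Azure OpenAI-compatible)."""
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    headers = {"Content-Type": "application/json", "api-key": api_key}
    body = {"messages": [{"role": "user", "content": prompt}]}
    return url, headers, json.dumps(body).encode("utf-8")

# Placeholder values -- substitute your real endpoint and key.
url, headers, payload = build_chat_request(
    "https://my-foundry-resource.openai.azure.com",  # hypothetical endpoint
    "o3-mini",                                       # model deployment name
    "<api-key>",
    "Summarize this support ticket.")
# To send the request:
# req = urllib.request.Request(url, data=payload, headers=headers)
# result = json.load(urllib.request.urlopen(req))
```

These same three values (endpoint URI, deployment name, API key) are what the custom connector stores so that Power Platform can make this call on your behalf.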

Overview of the video

Andrew Hess of MySPQuestions presents a clear walkthrough of the Azure AI Foundry custom connector for Power Platform in a YouTube tutorial aimed at beginners. He demonstrates how this connector lets organizations call their deployed large language models directly from Power Platform tools like Power Automate and Power Apps. As a result, teams can use domain-tuned models without relying on external message credits or generic hosted services, which can reduce costs and surface more relevant results for specific use cases.

In the video, Hess breaks the process into discrete steps and timestamps each segment to help viewers follow along. He emphasizes the practical steps: Deploy the Model, Create Custom Connector, Define Custom Parameters, and Use in Power Automate. Consequently, the tutorial functions both as a demo and a how-to guide for makers who want to integrate their own AI endpoints into automation flows.

How the connector works

The video explains that the connector wraps a model endpoint hosted in Azure AI Foundry and exposes it to Power Platform via a standardized API. Viewers see how to set the endpoint, add the API key for authentication, and configure the host and action definitions within the custom connector interface. This abstraction means Power Automate flows and Power Apps can call the model the same way they call other connectors, which simplifies integration for citizen developers and IT teams alike.
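Power Platform custom connectors are described with an OpenAPI 2.0 (Swagger) definition that captures the host, actions, and authentication. A minimal, hypothetical fragment for a single chat action might look like the following (shown as a Python dict for readability; the host, basePath, and operation names are placeholders, not the video's exact values):

```python
# Hypothetical Swagger 2.0 fragment for one custom connector action.
# The host and basePath are placeholders for your own deployment.
connector_definition = {
    "swagger": "2.0",
    "info": {"title": "Azure AI Foundry Chat", "version": "1.0"},
    "host": "my-foundry-resource.openai.azure.com",  # your endpoint host
    "basePath": "/openai/deployments/o3-mini",       # model deployment name
    "paths": {
        "/chat/completions": {
            "post": {
                "operationId": "ChatCompletion",
                "parameters": [
                    {"name": "api-version", "in": "query",
                     "required": True, "type": "string"},
                    {"name": "body", "in": "body", "required": True,
                     "schema": {"type": "object"}},
                ],
                "responses": {"200": {"description": "Model response"}},
            }
        }
    },
    # API-key auth maps to the "api-key" header the endpoint expects.
    "securityDefinitions": {
        "api_key": {"type": "apiKey", "in": "header", "name": "api-key"}
    },
}
```

Once a definition like this is imported, the action appears in Power Automate and Power Apps just like any built-in connector action.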

Hess also covers the importance of parameters and request bodies when defining a new action. He shows how to map inputs so that prompts, context, and options pass correctly to the model, and how to design outputs that Power Platform components can parse. Therefore, the connector acts as a bridge that carries model inputs and responses in a predictable, reusable way across different apps and flows.
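As a rough sketch of that input/output mapping (not the video's exact schema), the flow passes a prompt, optional context, and options in, and the connector hands back a single text field that Power Platform components can bind to:

```python
def to_request_body(prompt: str, context: str = "",
                    temperature: float = 0.2) -> dict:
    """Map flow inputs (prompt, optional context, options) onto the
    chat-completions request body the model endpoint expects."""
    messages = []
    if context:
        messages.append({"role": "system", "content": context})
    messages.append({"role": "user", "content": prompt})
    return {"messages": messages, "temperature": temperature}

def from_response_body(response: dict) -> str:
    """Pull the assistant's reply out of a chat-completions response so
    a flow or app can consume one predictable text output."""
    return response["choices"][0]["message"]["content"]

body = to_request_body("Classify this invoice.",
                       context="You are a finance assistant.")
sample = {"choices": [{"message": {"role": "assistant",
                                   "content": "Category: utilities"}}]}
print(from_response_body(sample))  # → Category: utilities
```

Keeping the mapping this explicit is what makes the connector reusable: every flow supplies the same named inputs and parses the same single output.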

Deployment and demo walkthrough

Early in the video, Hess walks through deploying a sample model—he uses o3-mini as an example—and then links that deployment to the connector settings. He steps through creating a new action, setting visibility, and updating the connector after modifications, which highlights the iterative nature of connector development. Importantly, he demos the connector in a Power Automate flow to show end-to-end behavior and validate parameters and authorization methods.

Hess also timestamps each part of the demo so viewers can jump to the segment they need, from endpoint and key setup to parameter tuning and authorization checks. As a result, the video serves both as a live demonstration and as a reference that teams can return to while building. This makes it easier to test changes and refine the connector before rolling it into production scenarios.

Benefits and tradeoffs

The tutorial outlines clear benefits: domain specialization, cost control, and improved governance when organizations host models in their own Azure tenant. By using a private endpoint, teams can fine-tune models on sensitive data and enforce network or identity controls. Moreover, the connector enables closer integration with business processes in Power Platform, which can improve automation accuracy and user experience.

However, Hess also hints at tradeoffs. Managing your own model requires operational overhead, including deployment, scaling, and monitoring. While this approach can reduce per-message charges associated with hosted services, it requires teams to balance cost savings against the effort and expertise needed to maintain model instances and security controls. Consequently, organizations must weigh immediate cost benefits against longer-term maintenance and governance responsibilities.

Challenges and practical recommendations

The video does not shy away from challenges that makers and admins may face. Hess points out considerations like API throttling limits, the need to design reliable parameter schemas, and handling different response formats from models. He recommends validating connector behavior with simple test flows before using models in mission-critical automations to avoid unexpected failures.
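One way to make a flow resilient to the throttling limits mentioned above is exponential backoff on HTTP 429 responses. This sketch simulates that retry loop in plain Python; `call_model` and `ThrottledError` are stand-ins for the real connector call and its throttling failure, not part of any Power Platform API:

```python
import time

def call_with_retries(call_model, max_retries: int = 4,
                      base_delay: float = 1.0):
    """Retry a model call on throttling, doubling the delay each
    attempt (1s, 2s, 4s, ...). Re-raises after the final attempt."""
    for attempt in range(max_retries):
        try:
            return call_model()
        except ThrottledError:
            if attempt == max_retries - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

class ThrottledError(Exception):
    """Stand-in for an HTTP 429 (throttled) response."""

# Simulated endpoint: throttled twice, then succeeds.
attempts = {"n": 0}
def flaky_call():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ThrottledError()
    return "ok"

print(call_with_retries(flaky_call, base_delay=0.01))  # → ok
```

In Power Automate itself, the equivalent is configuring the action's retry policy and wrapping calls in error-handling scopes, but testing the pattern on a simple flow first, as Hess recommends, catches these failure modes early.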

For teams adopting this pattern, Hess advises clear governance and testing practices, such as restricting API keys, isolating networks when needed, and logging calls for observability. Finally, he suggests combining simple prompt-based integrations with more advanced orchestrations—like Azure Functions—when flows require retry logic, error handling, or complex data transformations. In this way, teams can balance simplicity for business users and robustness for production scenarios.


Keywords

Azure AI Foundry custom connector, Power Platform custom connector, Azure Foundry Power Platform integration, Azure AI connector for Power Automate, Power Apps Azure AI integration, Azure AI Foundry tutorial, Build custom connector Azure AI, Azure AI Foundry deployment guide