
With over 8 years of experience learning SharePoint and SharePoint Online, I currently share my knowledge of the Power Platform, including Power Apps and Power Automate.
Andrew Hess of MySPQuestions presents a clear walkthrough of building a custom connector that links Azure AI Foundry to the Power Platform, in a YouTube tutorial aimed at beginners. He demonstrates how this connector lets organizations call their deployed large language models directly from Power Platform tools like Power Automate and Power Apps. As a result, teams can use domain-tuned models without relying on external message credits or generic hosted services, which can reduce costs and surface more relevant results for specific use cases.
In the video, Hess breaks the process into discrete steps and timestamps each segment to help viewers follow along. He emphasizes the practical steps: Deploy the Model, Create Custom Connector, Define Custom Parameters, and Use in Power Automate. Consequently, the tutorial functions both as a demo and a how-to guide for makers who want to integrate their own AI endpoints into automation flows.
The video explains that the connector wraps a model endpoint hosted in Azure AI Foundry and exposes it to Power Platform via a standardized API. Viewers see how to set the endpoint, add the API key for authentication, and configure the host and action definitions within the custom connector interface. This abstraction means Power Automate flows and Power Apps can call the model the same way they call other connectors, which simplifies integration for citizen developers and IT teams alike.
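To make the abstraction concrete, here is a minimal sketch of the HTTP call such a connector issues on a flow's behalf, assuming an Azure OpenAI-style chat-completions endpoint with `api-key` header authentication. The resource URL, deployment name, and API version below are illustrative placeholders, not values from the video:

```python
import json
import urllib.request

# All values below are illustrative placeholders, not taken from the video.
ENDPOINT = "https://my-resource.openai.azure.com"  # hypothetical Foundry resource
DEPLOYMENT = "o3-mini"                             # deployment name used in the demo
API_VERSION = "2024-10-21"                         # assumed API version
API_KEY = "<your-api-key>"                         # supplied when configuring the connector

def build_request(prompt: str) -> urllib.request.Request:
    """Build the request the custom connector sends to the deployed model."""
    url = (f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}"
           f"/chat/completions?api-version={API_VERSION}")
    body = json.dumps({"messages": [{"role": "user", "content": prompt}]}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": API_KEY},
        method="POST",
    )

req = build_request("Summarize today's open tickets.")
print(req.full_url)
```

Once the host, endpoint path, and `api-key` header are captured in the connector definition, makers never write this request by hand; the connector fills it in from the action's parameters.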
Hess also covers the importance of parameters and request bodies when defining a new action. He shows how to map inputs so that prompts, context, and options pass correctly to the model, and how to design outputs that Power Platform components can parse. Therefore, the connector acts as a bridge that carries model inputs and responses in a predictable, reusable way across different apps and flows.
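The mapping Hess describes can be sketched as a pair of small transforms, one from action parameters into the request body and one from the model response into a flat field a flow can bind to. The function names and the assumed chat-completions response shape are illustrative, not from the video:

```python
def to_request_body(prompt, system_context=None, temperature=None):
    """Map connector action parameters onto the model's request body.
    Optional parameters are included only when supplied, keeping the
    schema predictable for Power Automate's dynamic content."""
    messages = []
    if system_context:
        messages.append({"role": "system", "content": system_context})
    messages.append({"role": "user", "content": prompt})
    body = {"messages": messages}
    if temperature is not None:
        body["temperature"] = temperature
    return body

def to_flow_output(response_json):
    """Flatten a chat-completions style response into a single field.
    The response shape is an assumption about the hosted model's API."""
    return {"answer": response_json["choices"][0]["message"]["content"]}

sample = {"choices": [{"message": {"role": "assistant", "content": "42"}}]}
print(to_flow_output(sample))  # {'answer': '42'}
```

Defining the output schema up front is what lets Power Apps and Power Automate surface the model's answer as ordinary dynamic content rather than raw JSON.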
Early in the video, Hess walks through deploying a sample model—he uses o3-mini as an example—and then links that deployment to the connector settings. He steps through creating a new action, setting visibility, and updating the connector after modifications, which highlights the iterative nature of connector development. Importantly, he demos the connector in a Power Automate flow to show end-to-end behavior and validate parameters and authorization methods.
Hess also timestamps each part of the demo so viewers can jump to the segment they need, from endpoint and key setup to parameter tuning and authorization checks. As a result, the video serves both as a live demonstration and as a reference that teams can return to while building. This makes it easier to test changes and refine the connector before rolling it into production scenarios.
The tutorial outlines clear benefits: domain specialization, cost control, and improved governance when organizations host models in their own Azure tenant. By using a private endpoint, teams can fine-tune models on sensitive data and enforce network or identity controls. Moreover, the connector enables closer integration with business processes in Power Platform, which can improve automation accuracy and user experience.
However, Hess also hints at tradeoffs. Managing your own model requires operational overhead, including deployment, scaling, and monitoring. While this approach can reduce per-message charges associated with hosted services, it requires teams to balance cost savings against the effort and expertise needed to maintain model instances and security controls. Consequently, organizations must weigh immediate cost benefits against longer-term maintenance and governance responsibilities.
The video does not shy away from challenges that makers and admins may face. Hess points out considerations like API throttling limits, the need to design reliable parameter schemas, and handling different response formats from models. He recommends validating connector behavior with simple test flows before using models in mission-critical automations to avoid unexpected failures.
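One way to guard against the response-format variations Hess warns about is to parse defensively before a flow depends on the result. This is a hypothetical sketch, not code from the tutorial:

```python
def safe_extract(response_json):
    """Defensively extract text from a model response, tolerating the
    variations a flow may encounter: error payloads, missing choices,
    or an empty message. Raising on errors lets the flow's own
    error-handling branch take over instead of silently passing junk."""
    if "error" in response_json:
        raise ValueError(f"Model returned an error: {response_json['error']}")
    choices = response_json.get("choices") or []
    if not choices:
        return ""
    message = choices[0].get("message", {})
    return message.get("content", "")

print(safe_extract({"choices": [{"message": {"content": "ok"}}]}))  # ok
```

Running this kind of check inside a simple test flow first, as Hess recommends, surfaces schema mismatches before they reach mission-critical automations.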
For teams adopting this pattern, Hess advises clear governance and testing practices, such as restricting API keys, isolating networks when needed, and logging calls for observability. Finally, he suggests combining simple prompt-based integrations with more advanced orchestrations—like Azure Functions—when flows require retry logic, error handling, or complex data transformations. In this way, teams can balance simplicity for business users and robustness for production scenarios.
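The governance practices above can be sketched as a thin wrapper that keeps the key out of source code and logs every call for observability. The environment variable name and wrapper are hypothetical illustrations of the pattern, not part of the tutorial:

```python
import logging
import os
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("connector-calls")

def call_with_observability(call_fn, prompt):
    """Wrap a model call with timing and logging. `call_fn` stands in
    for whatever actually invokes the connector; the API key is read
    from the environment (FOUNDRY_API_KEY is a hypothetical name)
    rather than hard-coded."""
    api_key = os.environ.get("FOUNDRY_API_KEY")
    if not api_key:
        raise RuntimeError("API key not configured; refusing to call the model")
    start = time.perf_counter()
    try:
        result = call_fn(prompt, api_key)
        log.info("model call ok in %.2fs, prompt length %d",
                 time.perf_counter() - start, len(prompt))
        return result
    except Exception:
        log.exception("model call failed after %.2fs",
                      time.perf_counter() - start)
        raise
```

For retries and richer transformations, the same wrapper logic would typically move into an Azure Function sitting between the flow and the model, as Hess suggests.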