ChatGPT: Use Uploaded Files in Prompts
Microsoft Copilot Studio
March 5, 2026, 12:38

A Microsoft expert shows how Copilot Studio chat, with help from the Code Interpreter, pulls uploaded files such as Excel workbooks into prompts to ground LLM output

Key insights

  • Copilot Studio and Copilot Chat let you pass uploaded files directly into prompts to ground AI responses.
    The video shows how to reference files so the model uses your actual documents instead of guessing.
  • In Microsoft 365 Copilot Chat, type / in the prompt box to browse and insert files or emails you can access.
    Copilot then includes the selected item when answering your prompt.
  • In Security Copilot, open the Files area, choose Upload file, toggle the file on, and reference "uploaded files" in your prompt.
    Uploaded items stay private to the uploader and are available only for that session.
  • Use the Code Interpreter to convert complex files like Excel into CSV or JSON that the LLM can read directly.
    This avoids manual reformatting and makes spreadsheet data usable in prompts.
  • Permissions and privacy are enforced automatically: Copilot can only access files you are allowed to view.
    Files stored for security tasks remain isolated to meet compliance needs.
  • Productivity tips: surface recent or shared docs, toggle uploads active before prompting, and prefer common formats (CSV, TXT) for fastest results.
    Use OneDrive/SharePoint file requests when you need external uploads without granting folder access.

Video Overview and Context

Dewain Robinson published a YouTube video that demonstrates how to call an uploaded file from a prompt in Copilot Studio. In the clip, he walks viewers through passing files directly into a chat prompt, with a special focus on handling Excel spreadsheets. Moreover, Robinson highlights a useful trick that leverages the new Code Interpreter function to convert file content into formats the large language model can read. Consequently, the video aims to show both the basic flow and several practical tips that help users get more accurate results from AI-assisted prompts.

At the same time, the video sits within a broader wave of file-aware capabilities arriving in Microsoft 365 Copilot Chat and Microsoft Security Copilot. Robinson frames his tutorial as a bridge between simple file uploads and more advanced prompt engineering that reduces AI errors. Therefore, readers should expect hands-on steps as well as notes about permissions and file handling. In short, the video is both a how-to and a demonstration of why file-aware prompts matter for real work tasks.

Demonstration: Step-by-Step File Calling

Robinson begins by showing how to upload a file to the session and then reference it directly inside the chat prompt, illustrating the process with a sample spreadsheet. He shows that you can instruct the assistant to "summarize this file" or to perform targeted analysis, and then the assistant uses the referenced content to produce grounded answers. Additionally, he points out interface cues that help you select the correct file, which avoids accidental use of the wrong document. Thus, following his steps helps reduce manual copying and keeps the workflow inside the chat pane for speed and clarity.

Next, the video explores the mechanics of toggling uploads so the assistant can access them during a session, and Robinson demonstrates the results with concrete examples. He emphasizes that files must be made active or referenced explicitly to be considered by the model, which prevents unintended disclosure of unrelated documents. This practical emphasis helps viewers replicate the process in their own tenant or test environment. Consequently, the approach proves useful for summarization, data extraction, and simple transformations without leaving the chat experience.

Code Interpreter and Handling Complex Formats

A key highlight in the video is the use of the Code Interpreter to convert complex files, like spreadsheets, into a model-friendly format. Robinson shows how converting an Excel file to CSV or a structured JSON can make numerical and tabular content much easier for the model to parse. As a result, the assistant can perform calculations, generate summaries, or extract structured insights with better fidelity. Therefore, the conversion step serves as a practical workaround for formats that LLMs struggle to interpret directly.
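To make the conversion step concrete, here is a minimal sketch of the kind of code the Code Interpreter generates and runs behind the scenes. The sample rows are hypothetical, and the sketch uses only Python's standard csv and json modules; the actual Code Interpreter environment and its tooling are not detailed in the video.

```python
import csv
import io
import json

# Hypothetical tabular data standing in for a small Excel worksheet.
rows = [
    {"region": "EMEA", "quarter": "Q1", "revenue": 125000},
    {"region": "APAC", "quarter": "Q1", "revenue": 98000},
]

# Flatten to CSV, a plain-text format the model can read line by line.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["region", "quarter", "revenue"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# Or serialize to JSON, which keeps field names attached to each record.
json_text = json.dumps(rows, indent=2)

print(csv_text)
print(json_text)
```

Either output can be pasted into (or referenced from) a prompt; JSON tends to parse more reliably when records have many columns, while CSV stays compact for long tables.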

However, Robinson also notes that conversion introduces tradeoffs: data types or formulas may not translate perfectly, and large workbooks can require manual trimming before conversion. Consequently, users must balance fidelity against speed, deciding when to keep original structure and when to simplify. In addition, automated conversions may change date formats or lose cell-level context, so validation remains important. Thus, the Code Interpreter boosts capability but does not eliminate the need for user review.
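The date-format caveat is easy to reproduce. In this sketch (sample value hypothetical), a true date cell survives JSON serialization only as text, so the type information is lost and must be re-validated downstream:

```python
import datetime
import json

# A cell that Excel would store as a genuine date value.
cell = {"invoice_date": datetime.date(2026, 3, 5)}

# json.dumps cannot serialize date objects directly; automated conversions
# typically fall back to a string representation via default=str.
serialized = json.dumps(cell, default=str)
print(serialized)

# Round-tripping yields a plain string, not a date, so any downstream
# date arithmetic requires re-parsing and validation.
restored = json.loads(serialized)["invoice_date"]
print(type(restored).__name__)
```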

Security, Privacy, and Permission Considerations

The video addresses how permissions and storage impact the safety of calling files in prompts, drawing on Microsoft practices where files remain private to the uploading user unless explicitly shared. Robinson underscores that tools like Microsoft Security Copilot store uploaded files in an isolated space tied to the user session, which reduces cross-tenant exposure. Meanwhile, features in Copilot Chat show accessible files based on recent activity and permissions, limiting what the assistant can reference. Therefore, the design aims to balance convenience with access controls to protect sensitive content.

Nevertheless, Robinson cautions that organizations must still manage policies, especially when forms or external file requests are involved in workflows using OneDrive or SharePoint. External uploads can be useful for surveys or intake forms, but they also require governance to avoid accidental data leakage. As a result, admins should consider retention rules, user training, and the limits of anonymous uploads before enabling broad file-based prompts. Thus, practical security steps remain a critical complement to the built-in protections shown in the video.

Tradeoffs, Challenges, and Best Practices

Robinson closes with several tips and small tricks that help users get reliable results while managing tradeoffs between convenience and accuracy. For example, while direct file referencing speeds up tasks, it can produce errors if the file format is complex or if the model misreads context, so reviewers should validate outputs. Moreover, large files can add latency, and conversion steps can strip useful metadata, so selecting relevant subsets often yields better outcomes. Therefore, the best practice is to prepare lightweight, well-structured inputs and then use the assistant iteratively to refine results.
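The "select relevant subsets" tip can be sketched as a small pre-processing pass. This example (with hypothetical data and an assumed filter) trims a converted spreadsheet down to only the columns and rows a prompt needs before the file is referenced:

```python
import csv
import io

# Hypothetical full export; in practice this would be the converted spreadsheet.
full_csv = """region,quarter,revenue,internal_id,notes
EMEA,Q1,125000,X-001,ok
APAC,Q1,98000,X-002,review
EMEA,Q2,131000,X-003,ok
"""

# Keep only what the prompt actually needs: a smaller payload lowers
# latency and reduces the chance of the model misreading context.
wanted = ["region", "quarter", "revenue"]
reader = csv.DictReader(io.StringIO(full_csv))
trimmed_rows = [
    {k: row[k] for k in wanted}
    for row in reader
    if row["region"] == "EMEA"  # example filter: one region at a time
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=wanted)
writer.writeheader()
writer.writerows(trimmed_rows)
trimmed_csv = buf.getvalue()
print(trimmed_csv)
```

Note that trimming drops columns such as internal IDs and free-text notes; if a follow-up question needs them, re-run the pass with a wider column list rather than uploading the full file by default.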

Finally, the video is useful for both curious individuals and teams exploring prompt-driven workflows with file support, though it also highlights limitations that demand caution. Robinson offers a balanced mix of demonstration and advice, showing that the technology can boost productivity but not replace user oversight. Overall, the tutorial provides clear, actionable steps while reminding viewers to weigh usability against accuracy and governance. Consequently, the clip serves as a practical starting point for organizations testing file-aware prompts in their AI toolchains.


Keywords

call uploaded file from prompt, access uploaded file in prompt, use uploaded file in AI prompt, reference uploaded file in ChatGPT prompt, attach file to prompt, retrieve uploaded file in prompt, pass file to prompt API, file handling in prompts