dataflows - Stringfest Analytics
https://stringfestanalytics.com

How to build AI-powered Excel reports with Power Automate
https://stringfestanalytics.com/how-to-build-ai-powered-excel-reports-with-power-automate/
Wed, 29 Oct 2025 23:44:14 +0000

The post How to build AI-powered Excel reports with Power Automate first appeared on Stringfest Analytics.

As Excel users, we spend a lot of time reporting on data across different teams, formats, and audiences. We’re expected not just to crunch numbers but to generate insights and communicate them clearly. What if AI could help with that?

In this post, we’ll look at how to build an AI-powered Excel report using the AI Builder in Power Automate. To keep things simple, we’ll start small with a workflow that’s triggered manually and outputs a formatted summary of a well-known vehicle dataset. From there, you can expand the flow to refresh data automatically, send approvals, or even generate visuals, but it’s best to start with the basics.

Download the exercise file below and follow along to build your first AI-driven Excel report.

 

The very first thing we need to do is get this data into the Dataverse. That’s where AI Builder looks for data to analyze. I’d suggest using Dataflows for this step since it gives you a dynamic pipeline that automatically stays up to date as your data changes.

If you haven’t worked with Dataflows before, check out my post on how to set them up. Once your data is loaded, you can name the table autompg in Dataverse. That will make it easier to reference later when we build our AI-powered report.

Next, you’ll fire up Power Automate at make.powerautomate.com and, for simplicity, start an Instant Cloud Flow. If you’re not familiar with these steps, I’d suggest checking out my LinkedIn Learning course on Power Automate:

Add a step after manually triggering a flow. From the AI Builder group at the top, choose Run a prompt (or search for it). This lets you use generative AI right inside Power Automate to summarize data, rewrite text, or generate insights using values from earlier steps. Learn more at learn.microsoft.com/ai-builder/overview or start a trial at learn.microsoft.com/ai-builder/ai-builder-trials.

Run a prompt AI builder

Now, just like any other step in a Power Automate flow, you’ll need to define a few parameters. In this case, it’s the prompt to run. Choose Custom prompt. There are some built-in templates, but it’s best to provide as much context as possible so the AI understands your audience and intent.

AI Builder custom prompt

Here’s an example prompt you can use in AI Builder to describe your dataset and the tone you want for the report. Paste it into the Prompt instructions box on the left.
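The exact prompt from the post isn’t reproduced here, so below is an illustrative stand-in, written as a Python string for convenience. The wording and the auto-mpg framing are assumptions; the point is to spell out audience, task, and output expectations rather than rely on a bare template:

```python
# Illustrative only; adjust the audience, task, and tone to fit your report.
prompt_instructions = """
You are an analytics assistant writing for a non-technical operations team.
Using the auto-mpg records provided, summarize the dataset in 3-4 short
paragraphs: overall fuel-efficiency trends by model year, notable differences
by country of origin, and any outliers worth flagging. Keep the tone plain
and practical, and avoid statistical jargon.
"""

print(prompt_instructions.strip())
```

The more context you give about who will read the output and what it should emphasize, the less the model has to guess.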

As you build your prompt, start by giving it a clear, searchable name at the top. Then, make sure to connect it to live data from Dataverse so your flow stays dynamic.

To do this, click Add content at the bottom of the prompt box (or press the / key). Switch to Dataverse, locate the autompg table, and select the autompg attribute. This will pull in all records from the dataset and keep your AI Builder prompt connected to the latest data.

Rename and point data dataverse

Before testing the prompt, make a few quick adjustments. Click the three dots (…) and select Settings. Increase the record retrieval limit to around 400 or more to ensure all rows in the dataset are included. Then turn on code interpreter, which allows the system to run calculations, generate charts, and analyze your data more effectively within the flow.

Autompg prompt settings

Go ahead and test the results. You should see something like the output below.

Autompg text output

It looks great, but freeform text output can be unpredictable. One way to control that is by lowering the temperature, but since it’s already at the minimum, we’ll need another approach. The solution is to add structure by mapping the output into JSON. This gives the model a clear format to follow, which helps produce consistent, reliable responses every time you run the flow.

To do this, change the Output format on the right side of the prompt builder from Text to JSON, then paste in the setup below. Be careful not to break the live connection to the autompg table in Dataverse.
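The exact JSON setup isn’t reproduced here, but the precise shape matters less than having one at all. Below is a hypothetical schema, sketched in Python, showing the kind of structure you might define; every field name is illustrative, not a required format:

```python
import json

# Hypothetical output schema for the AI Builder prompt. Field names are
# illustrative. Giving the model a fixed structure like this is what makes
# its responses consistent from run to run.
schema_example = {
    "headline": "One-sentence summary of the dataset",
    "key_findings": [
        "Finding 1 with a supporting number",
        "Finding 2 with a supporting number",
    ],
    "outliers": ["Any records worth flagging"],
    "recommendation": "One practical next step for the reader",
}

print(json.dumps(schema_example, indent=2))
```

Each top-level field then becomes a separate piece of dynamic content you can map in later flow steps.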

Go ahead and re-test the prompt. You’ll notice the output looks quite different this time. The JSON format might seem messy at first, but that’s actually a good thing. It breaks your results into clear, structured building blocks, making them much easier to parse and reuse in the next step of your flow.

Autompg JSON

For this next step, we’ll take the JSON output and display it in an adaptive card. An adaptive card is a dynamic, interactive message format that can be shown in Teams, Outlook, or other Microsoft 365 apps. It’s a great way to present structured data clearly.

Instead of showing a block of raw JSON, users see a formatted, readable summary with key insights and buttons for actions. This makes your flow outputs easier to share and understand. If you want to explore adaptive cards in more depth, check out this post:

That means the third step in our flow will be to post an adaptive card to Teams. Specify the channel and any other details about where it should go, then for the Adaptive card parameter, paste in the information below:
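The card definition itself isn’t reproduced here, so the sketch below builds a minimal Adaptive Card (schema version 1.4) in Python for illustration. The angle-bracket placeholders mark where you would map dynamic content from the AI Builder step; the field choices are assumptions, not the author’s exact card:

```python
import json

# A minimal Adaptive Card skeleton (schema version 1.4). Field choices are
# illustrative. In the real flow, replace each <placeholder> with dynamic
# content mapped from the AI Builder step's JSON output, keeping the
# surrounding quotation marks intact.
card = {
    "type": "AdaptiveCard",
    "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
    "version": "1.4",
    "body": [
        {
            "type": "TextBlock",
            "size": "Large",
            "weight": "Bolder",
            "text": "<headline from the AI Builder JSON>",
        },
        {
            "type": "TextBlock",
            "wrap": True,
            "text": "<summary text from the AI Builder JSON>",
        },
    ],
}

print(json.dumps(card, indent=2))
```

Because the card body is itself JSON, each structured field from the previous step slots cleanly into one TextBlock.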

Make sure to map your dynamic content to the correct values in the adaptive card code. Keep the quotation marks! They define where your live data goes and removing them can break the card’s structure.

Adaptive card parameter flow

Finally, save and test your adaptive card. You should see something like this in Teams:

Autompg summary adaptive card

Each time you run the flow, the results might vary slightly since it’s using generative AI, but the overall structure will stay consistent. That consistency is a good sign: it means our JSON formatting is working, and we can now fine-tune the look and layout through the adaptive card code.

Now that you’ve built your first AI-powered Excel report, you’ve got a solid foundation to build on. From here, you could add an approval process that routes the report to a manager in Teams before sharing, or schedule the flow to refresh and update the data automatically in Dataverse.

And this flow is just the beginning. Imagine generating monthly summaries automatically, sending alerts when key metrics drop, or creating follow-up tasks based on AI-generated insights. The possibilities go far beyond reporting. This opens the door to a smarter, more connected analytics process across your organization.

What other use cases could your team benefit from? Book a discovery call and let’s explore how to bring AI-powered reporting and automation into your workflows:

How to understand the modern Excel AI stack
https://stringfestanalytics.com/how-to-understand-the-modern-excel-ai-stack/
Wed, 22 Oct 2025 23:08:12 +0000

The post How to understand the modern Excel AI stack first appeared on Stringfest Analytics.

A client recently asked me to give a kind of “state of the union” talk on Excel and its growing AI stack. And honestly, it’s not wrong to call it messy. There are so many new pieces floating around. Even Microsoft admits that Copilots and Agents are not the same thing, yet you use Copilot Studio to build Agents! 🤔

Still, there’s a structure forming. Over the past year, the Excel ecosystem has been reshaped around a much broader vision of AI and automation. Excel is no longer a canned set of gridlines… it’s a gateway to an entire AI-powered data platform. If you can start seeing how these parts connect, you’ll be way ahead of most analysts and organizations.

Let’s walk through what I see as the emerging stack for AI and Excel, as well as some of my resources to get you started.

Power BI and Dataverse

At the foundation of the stack sits Power BI and Dataverse. Think of this as the data governance and storage layer, the place where your organization’s data actually lives and is managed. Power BI remains the visualization front end, but Dataverse is the real star for those moving beyond spreadsheets. It provides structured, secure tables of data that can be shared across Excel, Power Apps, and Power Automate without the chaos of file versions and email attachments.

In practical terms, Dataverse acts as your organization’s “truth layer.” When you connect Excel to Dataverse, you’re no longer pulling from CSVs or manually refreshing reports. Instead, you’re working directly with live, centralized data. This means your Copilot queries and Python models in Excel are referencing the same trusted data that your BI dashboards and apps do.

For a deeper dive on this, I wrote about how Excel fits into the Power Platform and Dataverse ecosystem here:

You can also check out my LinkedIn Learning course on the basics of Power BI for Excel users who are new to the Power Platform. It’s a great starting point, and the skills you’ll learn include many of the other tools mentioned in this post, like Power Automate and Copilot Studio. Definitely a keeper!

Power Query/Dataflows

Next comes Power Query and Dataflows. Power Query has long been Excel’s built-in ETL (extract, transform, load) tool, allowing analysts to clean, reshape, and combine data before analysis. But with Dataflows, this logic can be pushed into the cloud. Instead of each workbook running its own refresh process, you can define transformations once and share them across the organization.

If Dataverse is your source of truth, Power Query and Dataflows are how you make that truth usable. They standardize messy spreadsheets, merge data from multiple systems, and prepare clean tables for analysis. And since Dataflows can feed directly into both Power BI and Excel, your analysts can stay in Excel while working with enterprise-grade pipelines.

I covered how to connect Excel to Dataverse via Dataflows in detail here:

For a deeper dive into the fundamentals of Power Query in Excel, take a look at my book Modern Data Analytics in Excel:

Excel

Now we arrive at familiar territory: Excel itself. Except this isn’t the Excel of even five years ago. Today’s Excel contains multiple AI-powered features that together form the intelligence layer of the stack: Copilot, Python, and now Agent Mode.

Copilot

This is the most visible AI layer, allowing users to generate formulas, create charts, and summarize data through natural language. It’s the first step toward conversational analytics inside the spreadsheet. You can ask it to “summarize sales by region” or “highlight outliers in this column,” and it will produce working Excel formulas or visualizations for you.

But Copilot doesn’t replace your analytical thinking. It depends on your ability to ask the right questions and recognize when its answers don’t quite make sense.

To get started with Copilot in Excel, check out my course on LinkedIn Learning:

Python in Excel

Next comes Python in Excel, which bridges the gap between Excel users and the data science ecosystem. Python unlocks advanced analytics, machine learning, and visualization capabilities directly in the workbook. You can import packages like pandas, numpy, or matplotlib and perform operations that were once out of reach for Excel alone. This means you can run predictive models, clean data programmatically, or create custom visuals, all while maintaining Excel’s familiar interface.
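For a flavor of what this looks like in practice, here is a minimal pandas sketch. Inside Python in Excel you would load a workbook table with the built-in xl() function; outside Excel, a small stand-in DataFrame is used so the example runs anywhere:

```python
import pandas as pd

# A tiny stand-in for the auto-mpg table. In Python in Excel you would load
# the real range instead, e.g. df = xl("autompg[#All]", headers=True).
df = pd.DataFrame({
    "mpg": [18.0, 26.0, 31.0, 16.0, 34.0],
    "origin": ["usa", "japan", "japan", "usa", "europe"],
    "model_year": [70, 74, 80, 71, 81],
})

# The kind of one-line aggregation that used to require leaving Excel:
summary = df.groupby("origin")["mpg"].mean().round(1)
print(summary)
```

The result spills back into the grid like any other Python in Excel output, where regular formulas can reference it.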

For a quick, practical overview of 15 ready-to-use Python in Excel examples, check out my short course on Gumroad.

What’s especially exciting is that Copilot and Python now work together through the Advanced Analysis experience. Instead of writing Python code manually, you can ask Copilot to generate it for you. For instance, you might type “show me a histogram of revenue distribution by region” or “forecast next quarter’s sales with a linear model,” and Copilot will return executable Python code that runs right inside your workbook.
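As a rough illustration of the kind of code such a request might produce, here is a minimal linear-model forecast using numpy. The quarterly figures are made up, and Copilot’s actual output will differ:

```python
import numpy as np

# Made-up quarterly sales figures, oldest to newest.
quarterly_sales = np.array([120.0, 132.0, 141.0, 155.0])

# Fit a straight line (degree-1 polynomial) through the observed quarters,
# then extrapolate one step ahead.
x = np.arange(len(quarterly_sales))
slope, intercept = np.polyfit(x, quarterly_sales, deg=1)
next_quarter = slope * len(quarterly_sales) + intercept

print(f"Forecast for next quarter: {next_quarter:.1f}")
```

The value of Advanced Analysis is that you can read, tweak, and re-run code like this without having written it yourself.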

To see Advanced Analysis in action, check out this post:

It’s a major leap toward making Excel a full analytics development environment: one where formula-based logic, natural language prompts, and code-based analysis coexist seamlessly.

Agent Mode

Agent Mode represents a major shift from single prompts to full reasoning workflows. Copilot is built around a one-shot model: you ask a question, it answers. Agent Mode, by contrast, uses an iterative reasoning loop that plans, executes, validates, and retries until the output meets the user’s intent. Rather than just speeding up a task, Agent Mode can manage an entire workflow under your supervision, much like delegating to a junior analyst. This means a tool that doesn’t just write formulas for you: it can build reports, validate totals, format outputs, and much more.

Learn more in my guide on getting started with Agent Mode here:

Office Scripts and Power Automate

Once you’ve built intelligence into your Excel processes, you’ll want to execute them reliably. That’s where Office Scripts and Power Automate come in. Office Scripts lets you record and reuse repeatable actions in Excel on the web: cleaning data, formatting tables, or updating charts. When paired with Power Automate, those scripts become part of larger workflows that run automatically, even when you’re not in Excel.

This combination is how Excel begins to extend its reach across the wider Microsoft 365 ecosystem. A workbook can now refresh data, apply formatting, run calculations, and send updates entirely on its own. A Power Automate flow might open an Excel file stored in OneDrive, trigger a script to recalculate KPIs, and post the results as a formatted summary in Teams. Another flow might collect survey responses from Microsoft Forms, append them to a central Excel table, and update a dashboard every morning. The line between spreadsheets, communication tools, and business systems becomes almost invisible once these pieces are connected.

Power Automate with Office Scripts essentially turns Excel from a static reporting tool into an active participant in your organization’s workflows. It’s where business logic meets execution.

To learn more about these two tools, check out my LinkedIn Learning courses covering each:

Copilot Studio

At the top of the stack sits Copilot Studio, the tool that connects everything else. Copilot Studio lets you build and manage custom copilots and agents that can interact with Excel, Power Automate, and external systems through connectors and APIs. If Copilot is your assistant and Agent Mode is your analyst, Copilot Studio is your command center.

With Copilot Studio, you can design domain-specific copilots that draw from your organization’s own data sources and workflows. A finance Copilot can answer questions about budget performance by querying live Excel data from Dataverse. A project management Copilot can notify stakeholders when milestones are delayed by triggering a Power Automate flow. An HR Copilot might summarize headcount changes from an Excel table or pull analytics from Power BI. In each case, the Copilot is not a static chatbot: it’s an orchestrator that understands context, retrieves information, and can take action.

The real potential of Copilot Studio lies in this orchestration. You’re doing more than just monitoring your data. You’re building systems that can reason across multiple layers of the Microsoft stack and perform tasks end to end.

For an example of how this works, see my tutorial:

What’s fascinating is that Copilot Studio uses many of the same components we’ve already discussed. Your Excel files can act as data sources, your Office Scripts can become agent actions, and your Power Automate flows can serve as orchestration layers. Excel remains the front door, but now the system behind it can reason, decide, and act.

Where it’s all heading

Right now, it’s fair to call this ecosystem messy. The boundaries between products aren’t fully clear, features are evolving fast, and documentation can lag behind the technology. But when you zoom out, the direction is unmistakable.

Excel is becoming the user interface to a much larger AI and automation ecosystem. Analysts will soon spend more time designing workflows, defining logic, and validating insights than manually crunching numbers. The winners will be those who can think across tools—connecting Power Query to Python, linking Office Scripts to Power Automate, and embedding their logic into custom Copilot experiences.

The tools are powerful, but the key is systems thinking. Your team needs analysts who understand how data flows from one layer to another, how automation can scale their work, and how to evaluate AI outputs critically. Without that mindset, you risk building disconnected tools that never deliver true value.

To see how this future might play out when it comes to Excel-based training and skills development, check out this post:

Conclusion: build your strategy now

The best thing you can do right now is get your analysts’ skills in order. Learn how these tools relate to one another. Start experimenting with small automations. Map out your data pipelines and workflows before the technology overwhelms you.

If your organization is trying to make sense of how to connect Excel, Power BI, and the Power Platform into a cohesive AI strategy, I’d love to help. You can book time with me here to talk through where you are, what you want to achieve, and how to structure a roadmap that turns this messy new world into a clear competitive advantage:

How to connect your Excel workbook to the Dataverse via Dataflows
https://stringfestanalytics.com/how-to-connect-your-excel-workbook-to-the-dataverse-via-dataflows/
Mon, 20 Oct 2025 11:50:40 +0000

The post How to connect your Excel workbook to the Dataverse via Dataflows first appeared on Stringfest Analytics.

If you want to make your Excel data more useful across the Power Platform, connecting it to Dataflows within Power Apps is the right starting point. By moving data from your workbook into a Dataflow, you make it accessible to Dataverse and ready for automation through tools like Power Automate and Copilot Studio.

Once this connection is in place, you can start running prompts or reports on a schedule; for example, automatically generating a weekly variance report and posting it to Teams or emailing it. You could also build guardrails into your flows, such as approval steps or editable review stages, so your team can validate AI-generated insights before they’re published.

In this post we’ll connect a sales data table so that the data flows into Dataverse through Dataflows. I know that in the real world your sales data probably doesn’t live in Excel, or at least I hope it doesn’t; Excel often isn’t the best place for a single source of truth. But I also live in the real world, and I know that Excel has its advantages for powering those sources of truth. So here’s how to bridge that gap into a more developer- and enterprise-friendly way of doing things.

 

Save this workbook somewhere easy to find on your OneDrive or SharePoint. You’ll need it there for the next steps.

What is a Dataflow?

A Dataflow is a cloud-based way to extract, clean, and load data into the Power Platform. Power Apps will be our way in to set up these Dataflows, since that is where you can create and manage them to feed data into Dataverse. Dataflows use the same Power Query technology that Excel users already know, but instead of saving transformations inside a single workbook, they are stored in the cloud so multiple apps and services can reuse the same prepared dataset.

Once a Dataflow loads data into Dataverse, it becomes accessible across the Power Platform, including Power Apps for building interactive applications, Power Automate for scheduling and running flows, and Copilot Studio for creating generative AI experiences that draw on your Dataverse data.

A comparison table of using Power Query in Excel versus Dataflows in Power Apps follows.

Feature | Power Query in Excel | Dataflows in Power Apps
--- | --- | ---
Where it runs | On your computer | In the Power Platform cloud
Purpose | Transform data for a single workbook | Prepare and share data for multiple apps
Storage | Inside the Excel file | In Dataverse or Azure Data Lake
Refresh | Manual or workbook-triggered | Scheduled automatic refresh
Collaboration | Local to one user | Centralized and reusable across apps
Integration | Excel only | Connects to Power Apps, Power Automate, and Copilot Studio

Syncing the Excel data to Dataflows

To get started, go to make.powerapps.com and head to Tables. These data sources are actually Dataverse tables. You’ll notice that many of them are already there by default. You didn’t create them, and you can safely ignore them for now. Dataverse includes a set of standard tables out of the box, like Account, Contact, and Appointment, because they support core business functions used by many Power Apps solutions. For this walkthrough, we’ll focus on adding our own table.

At the top, select Import, then choose Import data with Dataflows. This is where we’ll set up our live connection between Excel and Dataverse. While there are several simple ways to import static data from Excel (for example, by uploading a CSV), Dataflows are what make the connection dynamic.

Import data from Dataflows

Great work! Now select Choose Data Source, then pick “Excel workbook.”

At this point, your workbook needs to be stored on OneDrive or SharePoint. Go ahead and sign in with your account, then browse to and select the workbook you want to connect.

Browse OneDrive

Now things should start to look more familiar if you’ve used Power Query before. You’ll first choose which parts of the workbook to connect to. In this case, I’m connecting only to the table I need, currently named “Table1” (I really should have named it better!). Fortunately, we’ll get a chance to give it a clearer name once it’s loaded into Dataverse.

Get data Dataflows

Now we are fully in Power Query, and you can see the label in the upper left corner confirming it. If you wanted to build data transformations, this is where you could do it. For now, I will skip that step and select Next to finalize the Dataflow.

Power Query dataflows

Now we have a few options for how to finalize this connection into Dataverse. I am going to have it create a completely new table.

Here we can name the table, set data types, define a primary key, rename columns, and make other adjustments under the Column mapping dropdown. Each column is assumed to have a single type, just like in Power Query.

For now, I will assume Dataflows and Dataverse handle this setup correctly. If something needs to be changed later, that is fine. This process is reproducible, so we can always go back and update it.

Load to Dataflows

Last, we’ll be asked how we want the Dataflow to refresh. We can set it to run on a schedule or trigger it manually. I am going to set it to refresh manually for now. Since there is a way to trigger a refresh directly within Power Automate, I’ll handle automation later as part of a broader Power Automate flow.

Refresh settings Dataflows

Great work. Now let’s test it with a manual refresh. Go back to your Excel workbook and add a test row. Then return to Power Apps and open Dataflows. You should see your active Dataflow listed, and it may take a moment to finish publishing. From there, select the three dots next to your flow. You can give it a name and choose Refresh to trigger the update.

Edit and rename dataflow

Amazing. The new record is there. We’ve successfully set up a Dataflow that brings data from an Excel workbook into Dataverse.

New record dataflow

Conclusion

In this post, we connected an Excel workbook to a Dataflow in Power Apps and saw how this simple step brings your spreadsheet data into the Power Platform ecosystem. Once the data is stored in Dataverse, it’s no longer limited to one workbook. It can power automations, apps, and even AI-driven summaries across your organization.

In a future post, we’ll build on this foundation by creating a Power Automate flow that pulls your Dataverse data and uses a prompt to generate a dynamic weekly summary. We’ll even have it post directly to Teams for an automated, AI-powered status update, set up an approval process for review and more. Stay tuned!
