Reality-Craft https://reality-craft.com | Solution Crafter | Mon, 23 Feb 2026 08:06:26 +0000

How to Create a Word Document from a Template in Power Automate (Using Premium Connector) https://reality-craft.com/how-to-create-a-word-document-from-a-template-in-power-automate-using-premium-connector/ Mon, 09 Jun 2025 13:02:27 +0000

In Microsoft Power Automate, generating dynamic, personalized Word documents from templates is a streamlined process. Let's walk through the detailed steps for generating these documents with Power Automate.

Crafting Your Word Template

Step 1: Activating the developer option

To begin, open up your Office Word application. Navigate to the “File” tab, then click on “Options.”

This leads to the “Customize Ribbon” tab, where you activate the “Developer” option by checking the designated box. This action grants access to the Developer tools on the application’s homepage tab.

Step 2: Incorporating Dynamic Content

Enabling the Developer option makes the "Insert Plain Text" control available, allowing you to include dynamic content. Next, create a template according to your needs.

(Highlighted is the “Insert Plain Text” option)

Use this tool to add fields, such as "Name," for dynamic content within your Word template. Go to the Developer tab, select the text you wish to make dynamic, and click the highlighted icon to apply the control.

Step 3: Uploading to SharePoint

Once your template is created, ensure to upload it to the SharePoint library for accessibility and seamless integration.

Developing a Flow in Power Automate

Step 1: Setting Up the Flow

After uploading the template, configure a manual trigger flow. Incorporate necessary fields, like the person’s name, ensuring you’re effectively setting the stage for dynamic input.

Step 2: Populating the Word Template

In the ‘Manually trigger a flow’ section, include the action called “Word.” Then, select ‘Word online (Business),’ where you’ll find the “Populate a Microsoft Word template” action.

(Search for “word”)

(Select action called “Populate a Microsoft Word template”)

Select the "Populate a Microsoft Word template" action and enter the location of your Word template in SharePoint: choose the site address, pick the document library, and select the template file. Map the manual trigger field to the template's dynamic content field so each run produces different output.

(Consult this step for the action called “Populate a Microsoft Word template” )

Next, add the "Create file" action and choose where the generated Word files should be stored. You can also give them a descriptive name, such as "Offboard_Employeename."

(Consult this step for create file)
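If you want the file name to vary per run, a concat() expression can be used in the "File Name" field of the Create file action. A sketch (the trigger input name 'text' is an assumption; use the actual name of your manual trigger input):

```
concat('Offboard_', triggerBody()?['text'], '.docx')
```

Including the ".docx" extension ensures the created file opens as a Word document.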

Step 3: Initiating the Flow Manually

Once the flow is set, manually initiate it. This prompts the input of the person’s name, seamlessly integrating the dynamic element into the document.

Step 4: Generating the Word File


Upon execution, the file, named as configured in the flow, is automatically generated in the SharePoint document library. Open the Word document to see the dynamic content in place, showcasing the effectiveness of Power Automate.




(Generated file in the SharePoint document library)



(Updated Content of Word File)

Conclusion

Creating Word documents from templates within Power Automate empowers users to streamline and personalize their document creation process. By following these steps, one can harness the full potential of Power Automate’s capabilities in generating dynamic and customized documents effortlessly.

How to use dynamic email template in Power Automate? https://reality-craft.com/how-to-use-dynamic-email-template-in-power-automate/ Mon, 09 Jun 2025 13:01:25 +0000

SharePoint is a powerful collaboration tool with many features that help teams work efficiently. One useful feature is the ability to create templates that automatically pull content from lists, saving users from repeating the same tasks.

In this blog post, we’ll show you how to make a template in SharePoint that pulls content from a list and sends it out by email using Power Automate. This integration between SharePoint and Power Automate can save you time and resources while making sure the right people get the important information quickly.

In the SharePoint list, we create two lists. The first list is named “Email Template.” In this list, we create a column called “Template,” which is a multiple lines of text column. We add an email template to the “Template” column by creating a new entry in this list.

Additionally, we create a template containing placeholders whose values are retrieved from another list: "Purchaser Name," "Delivery Date," and "Product Name" ("Purchaser Name" appears twice in the template).

The second list is named “Online Order Product Details,” where we create a “Delivery Date” column as Date and Time. Additionally, we have a “Product Name” column, which is a choice column, and we include the default “Created By” column provided by SharePoint.

In the template, we replace “Purchaser Name” with the name of the creator, retrieved from the “Created By” column in the list. The “Delivery Date” is substituted with the dynamic value from the “Delivery Date” column in the “Online Order Product Details” list. Similarly, the “Product Name” column is replaced with the dynamic value from the “Product Name” field in the “Online Order Product Details” list.

Step 1: Add the "When an item is created" trigger and specify the site URL and list name from which dynamic values are retrieved. The flow will trigger when an item is created in this SharePoint list.

Step 2: Add the "Get items" action, specifying the site address and list name of the list where the email template is stored.

Step 3: Add a Compose action and create the expression that first replaces the purchaser name with the Created By display name.

Add the following expression in the compose action:

Build the expression as follows: start with replace(). For its first argument, take the "Get items" output at index 0 (the first item, ID 1) and read its "Template" column, the multiple-lines-of-text column that holds the template. Add a comma, then the placeholder text you want to replace. Add another comma, then the replacement value: here, the "Created By" display name retrieved from the trigger.

replace(outputs('Get_items_-_Email_Template')?['body/value'][0]['Template'], '[Purchaser Name]', triggerOutputs()?['body/Author/DisplayName'])

Step 4: Add a new Compose action and use the replace() expression. Pass the previous Compose output (from dynamic content) as the text to search, the delivery date placeholder as the text to replace, and the delivery date value as the replacement.

replace(outputs('Compose_-_Replace_the_Purchaser_name'), '[Delivery Date]', triggerOutputs()?['body/DeliveryDate'])

Step 5: Add a new Compose action and use the replace() expression. Pass the previous Compose output (from dynamic content) as the text to search, the product name placeholder as the text to replace, and the product name value as the replacement.

replace(outputs('Compose_-_Replace_with_the_Delivery_date'), '[Product name]', triggerOutputs()?['body/ProductName/Value'])

Step 6: Add a new Compose action and use the replace() expression. Pass the previous Compose output (from dynamic content) as the text to search, the purchaser name placeholder as the text to replace, and the Created By display name as the replacement.

replace(outputs('Compose_-_Replace_with_the_Product_Name'), '[Purchaser Name]', triggerOutputs()?['body/Author/DisplayName'])
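For reference, the Compose actions above behave like nested replace() calls; written as a single expression (action and placeholder names assumed to match the steps above), Steps 3 through 5 collapse to roughly:

```
replace(
  replace(
    replace(
      outputs('Get_items_-_Email_Template')?['body/value'][0]['Template'],
      '[Purchaser Name]', triggerOutputs()?['body/Author/DisplayName']
    ),
    '[Delivery Date]', triggerOutputs()?['body/DeliveryDate']
  ),
  '[Product name]', triggerOutputs()?['body/ProductName/Value']
)
```

Note that replace() substitutes every occurrence of the search text, so a second pass for "[Purchaser Name]" is only needed if the first pass used a different placeholder.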

Step 7: Add the "Send an email" action and fill in the required fields, adding the last Compose output to the email body. Save and test the flow. Users can now change the template in the list without manually updating Power Automate each time.

Step 8: Users can add further dynamic values in SharePoint, with matching Compose actions, step by step. Save and run the flow to see the email output.

Conclusion: By establishing a SharePoint template to extract dynamic content from a list and automating email distribution through Power Automate, businesses boost efficiency and productivity. This integration saves time, ensuring data accuracy and consistency. As organizations strive for workflow optimization, SharePoint and Power Automate provide a robust framework. This solution caters to reporting, notifications, and communication, offering flexibility and scalability for managing information flow within teams and across departments.

Modern SharePoint List Forms: JSON Formatting & Validations Tutorial https://reality-craft.com/modern-sharepoint-list-forms-json-formatting-validations-tutorial/ Mon, 09 Jun 2025 12:58:12 +0000

Customizing SharePoint Online forms is typically done using SPFx or Power Apps. However, this blog demonstrates an alternative approach: using JSON to customize out-of-the-box SharePoint list forms. This method eliminates the need for SPFx or Power Apps, as simple JSON can be used to tailor the header, footer, and body of the form.

Through the use of JSON, users can create distinct sections within the form body and implement validations to dynamically show or hide form fields based on the values of other fields. Notably, the advantage of employing JSON for customization is the inherent responsiveness of the forms. Unlike other methods, there is no additional effort required to make these forms responsive, allowing seamless utilization across both desktop and mobile platforms.

In this blog, we will explore the customization and formatting of SharePoint Online list forms using JSON formatting, as well as column validation in SharePoint Online lists.



First, familiarize yourself with customizing JSON formatting. Customize the form header, form footer, and form body. Divide the body into different sections.

  1. Customize the SharePoint Online List Form Header using JSON formatting.

In the Format window, you’ll encounter options such as Header, Body, and Footer. Ensure that you choose Header.

In the “Formatting code” option, apply or paste the JSON formatting code provided below. Alternatively, you can retrieve this code by clicking on “Learn more” within the JSON code section.

{
    "elmType": "div",
    "attributes": {
        "class": "ms-borderColor-neutralTertiary"
    },
    "style": {
        "width": "100%",
        "margin-bottom": "15px",
        "background-color": "purple"
    },
    "children": [
        {
            "elmType": "div",
            "style": {
                "display": "flex",
                "align-items": "center"
            },
            "children": [
                {
                    "elmType": "div",
                    "style": {
                        "flex": "none",
                        "padding": "0px",
                        "padding-left": "10px",
                        "height": "40px",
                        "color": "white"
                    }
                }
            ]
        },
        {
            "elmType": "div",
            "attributes": {
                "class": "ms-fontColor-themePrimary ms-borderColor-themePrimary ms-fontWeight-bold ms-fontSize-xl ms-fontColor-neutralSecondary--hover ms-bgColor-themeLight--hover"
            },
            "style": {
                "box-sizing": "border-box",
                "width": "100%",
                "text-align": "center",
                "padding": "21px 12px",
                "overflow": "hidden",
                "color": "white"
            },
            "children": [
                {
                    "elmType": "div",
                    "txtContent": "Task Details"
                }
            ]
        }
    ]
}

You can customize the header by adjusting parameters such as background color, text color, and adding icons. Additionally, you can retrieve dynamic values from the body using this JSON formatting.
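For example, swapping the static "Task Details" text for an expression pulls a live value from the form body; the field-reference syntax below follows the same pattern as the footer example later in this post (the "Title" field is an assumption):

```json
{
    "elmType": "div",
    "txtContent": "='Task Details for ' + [$Title]"
}
```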

  2. Customize the SharePoint Online List Form Body using JSON formatting.

Once the “Configure layout” option is chosen, select the “Body” option from the dropdown in the “Apply formatting to” section.

In the “Formatting code” option, apply or paste the JSON formatting code provided below. Alternatively, you can retrieve this code by clicking on “Learn more” within the JSON code section.

{
    "sections": [
        {
            // give a display name for the section
            "displayname": "✈", // instead of an icon, you can use a plain section name
            "fields": [
                // reference your fields here using their display name
                "Airline",
                "Estimated airfare"
            ]
        },
        {
            // give a display name for the section
            "displayname": "🏡", // instead of an icon, you can use a plain section name
            "fields": [
                // reference your fields here using their display name
                "Hotel",
                "Estimated hotel cost"
            ]
        }
    ]
}

Change the given code to establish distinct sections within the form. Each section allows for the inclusion of an icon alongside its name, mirroring the example depicted below. Additionally, you have the flexibility to add any column to any field within a section.

When attempting to edit the columns in the form, it is evident that the fields are presented in a section-wise manner, as demonstrated below:

  3. Customize the SharePoint Online List Form Footer using JSON formatting.

After choosing the “Configure layout” option, select the “Footer” option from the dropdown in the “Apply formatting to” section.

In the “Formatting code” option, apply or paste the JSON formatting code provided below. Alternatively, you can retrieve this code by clicking on “Learn more” within the JSON code section.

{
    "elmType": "div",
    "style": {
        "height": "24px",
        "width": "100%",
        "color": "#fff",
        "font-size": "15px",
        "border-top": "5px solid #eee",
        "background-color": "purple",
        "padding": "10px"
    },
    "children": [
        {
            "elmType": "div",
            "style": {
                "width": "100%",
                "padding-top": "10px",
                "height": "24px"
            },
            "children": [
                {
                    "elmType": "a",
                    "txtContent": "='Task Review for ' + [$Title]",
                    "attributes": {
                        "target": "_blank",
                        "href": "='https://aka.ms/task?review=' + [$Tasks]"
                    },
                    "style": {
                        "color": "white"
                    }
                }
            ]
        }
    ]
}

In the screenshot below, you can observe that the JSON code for the Footer has been included. I have implemented a functionality where clicking on the specified text will navigate the user to the email administrator.

Show/Hide form field based on conditions:

To meet the requirement of displaying the Manager field in the form only when the Priority is set to Critical, follow these steps:

  • Click the “New list” button to open the list form.
  • Choose “Edit columns” from the options available in the top-right corner of the New form.
  • Select the column you want to hide or show, click the three dots, and select "Edit conditional formula."

Following that, you will see the field where you can input your formula. As an example, here is the logic for the "Estimated airfare" field: when a specific value is selected in the "Airline" column, the estimated airfare field is displayed; otherwise, it is hidden.

Please note that you can find a worked example by clicking the link highlighted in green, which reads, "Learn to use conditional formulas in a list form."
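For the Manager/Priority requirement described above, the conditional formula would look something like this (the internal field name "Priority" and the value "Critical" are assumptions; substitute your own):

```
=if([$Priority] == 'Critical', 'true', 'false')
```

The formula must return the string 'true' to show the field or 'false' to hide it.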

How to do column validation in SharePoint:

Column validation serves to ensure that the data entered in a specific column is accurate and meets the defined criteria.

Use Cases/Formulas-

  • In my SharePoint list, there is a column named "Travel Start Date," and I aim to enforce a rule that this date must be greater than or equal to today's date: =TravelStartDate>=TODAY()

First, enter Edit mode for that column, click "More options," and then click "Column validation."

Subsequently, insert the formula in this section, and ensure to include a User message. If the conditions specified in the formula are not met, this message will be displayed to the user.

Inline column validation in SharePoint will not be effective for this situation since it doesn’t allow retrieving or comparing values from other columns. To address this limitation, it is necessary to utilize list-level validation settings, as illustrated in the image below. This approach enables the use of multiple formulas to compare values from different columns.

In the image above, the user can specify a formula ensuring that the travel end date is greater than the travel start date. Additionally, the sum of Estimated Airfare and Estimated Hotel Cost must be greater than zero. It’s crucial to understand that Estimated Airfare and Estimated Hotel Cost are individual columns, and their respective formulas won’t work in column validation settings. To overcome this, these formulas must be written in the list setting validation. Moreover, when employing multiple formulas, it’s important to write user messages separately, as demonstrated in the image, as there isn’t a distinct method to display messages within the formula.
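Combined into a single list-level validation formula, the two rules described above might read as follows (the column display names are assumptions; adjust them to match your list):

```
=AND([Travel End Date] > [Travel Start Date], [Estimated airfare] + [Estimated hotel cost] > 0)
```

If the formula evaluates to FALSE when the item is saved, SharePoint blocks the save and shows the user message.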


 

 

Implementing DLP for Secure Data Sharing in SharePoint and OneDrive https://reality-craft.com/hello-world-5/ Thu, 03 Apr 2025 17:02:47 +0000

With cyber threats evolving constantly, IT teams face immense pressure to safeguard data from malware, phishing, and insider threats. Microsoft’s Purview platform, equipped with a powerful DLP module, provides a proactive defence. Data Loss Prevention (DLP) is a security solution that helps prevent unsafe sharing and inappropriate transfer of sensitive data. DLP is used to protect and monitor sensitive data across both on-premises systems and cloud-based locations within our organizations.

This blog post tackles the critical issue of data security in Microsoft 365. With the ever-growing threat of cyberattacks, protecting sensitive information like social security numbers, credit card data, and “Confidential” documents is paramount.

Microsoft Purview’s DLP module offers a powerful solution to address this challenge. We’ll delve into its importance and provide a step-by-step guide to configure a DLP policy that meets your specific needs.

Scenario:

Block all external sharing of SharePoint and OneDrive items containing sensitive information (such as Social Security numbers and credit card data) as well as items with sensitivity labels.

Send an email notification to the security team for every blocked file-sharing attempt, and show user notifications within the interface (if available).
By following our guide, you can gain control over sensitive data sharing and confidently navigate today’s complex cybersecurity landscape.

Step-by-Step Guide to Creating a DLP Policy in Microsoft Purview

This guide walks you through creating a Data Loss Prevention (DLP) policy in Microsoft Purview to prevent accidental or unauthorized sharing of sensitive data in SharePoint and OneDrive.

  1. Access the Compliance Portal:

Go to the Microsoft 365 Admin Center and navigate to “Compliance.” In the left-hand pane, find “Data loss prevention” and then “Policies.”

2. Create a New Policy:
Click the “Create policy” button.

3. Define Policy Details:

Under “Categories,” choose “Custom.” Under “Regulations,” select “Custom policy.”

Click “Next” to proceed. Give a proper policy name and description.

  • Assign admin units for users or groups:
    Similar to the default template approach, assign admin units from Microsoft Entra ID to limit the policy to specific users or groups.
  • Enable DLP for SharePoint and OneDrive:

Turn on DLP for “SharePoint sites” and “OneDrive accounts.” Leave other locations disabled. Click “Next.”

  • Create and Define DLP Rule:

In “Define policy settings,” ensure “Create or customize advanced DLP rules” is selected. Click “Next” and then “Create rule.”

Give your rule a clear name (e.g., “Block External Sharing of Sensitive Data”). Briefly describe the rule’s purpose.

  • Set Rule Trigger Conditions:

Click “+ Add condition”. Choose “Content is shared from Microsoft 365” and select “with people outside my organization.” Then choose “Content contains” and pick the appropriate data types for your needs. In this case, we selected sensitive Australian information types such as credit card numbers and passport numbers.

  • Identify Sensitive Data:

Click “+ Add condition” and set the logic so that content matching both condition groups is identified (AND means all conditions must match; OR means at least one must match). Create a new condition group, choose “Sensitivity labels,” and select the labels based on your needs.

  • Block Unauthorized Sharing:
    Under “Actions,” choose “Restrict access or encrypt the content.” Select “Block only people outside your organization” to prevent external sharing.
  • Notify Users:
    Enable “User notifications” to inform users attempting to share sensitive data through a policy tip in Office 365 services. This will notify the user who shared the data.
  • Prevent Sharing Overrides:
    Ensure “Allow override from M365 services” is not selected to prevent users from bypassing the policy.
  • Set Up Reporting:

Set “Use this severity level in admin alerts and reports” for informational purposes.

Enable “Send an alert to admins when a rule match occurs” to notify admins of potential incidents. Optionally, add email addresses for receiving alerts (e.g., security team). Choose to send alerts for every rule match. You can also enable and add the user to receive incident reports via email when a policy match occurs.

Choose the necessary details for incident reports, including the matched information, the location of the match, and the activated rules and policies.


Review the DLP Rule After Creation:

  1. Set Policy mode:
    The “State” setting determines when your DLP policy becomes active. It’s recommended to keep the policy off while you’re configuring it and getting approvals. Click Next and then Submit. Click Done.

Here are the available state options:

  • Run the policy in test mode: You’ll be able to review alerts to assess the policy’s impact. Any restrictions you configured won’t be enforced. Users also receive informative messages to raise awareness about the policy.
  • Turn the policy on immediately: This enables full enforcement of the policy. After the policy is created, it’ll take up to an hour before any changes are enforced.
  • Leave the policy turned off: Decide to test or activate the policy later.

Remember: You can change the policy state at any time to adjust your rollout strategy.

  • Publish the Policy:

Click “Submit” to finalize and publish the DLP policy.
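As an alternative to the portal, a similar policy can be scripted with Security & Compliance PowerShell. The sketch below is illustrative only: the names, the sensitive information type, and the parameter choices are assumptions, not the exact configuration from this walkthrough.

```powershell
# Connect to Security & Compliance PowerShell first: Connect-IPPSSession

# Create a policy scoped to SharePoint and OneDrive, starting in test mode
New-DlpCompliancePolicy -Name "Block External Sensitive Sharing" `
    -SharePointLocation All -OneDriveLocation All `
    -Mode TestWithNotifications

# Add a rule that blocks sharing sensitive content with external people
New-DlpComplianceRule -Name "Block External Sharing of Sensitive Data" `
    -Policy "Block External Sensitive Sharing" `
    -ContentContainsSensitiveInformation @{ Name = "Credit Card Number" } `
    -AccessScope NotInOrganization `
    -BlockAccess $true `
    -NotifyUser Owner `
    -GenerateAlert SiteAdmin
```

Switch -Mode to Enable once testing confirms the rule behaves as intended.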

Conclusion

By following the outlined steps, organizations can create and implement Data Loss Prevention (DLP) policies in Microsoft SharePoint and OneDrive. These proactive measures significantly enhance data security by preventing accidental or unauthorized sharing of sensitive information. Remember, minimizing the risk of data breaches and ensuring compliance with data privacy regulations require continuous efforts. Regularly monitor policy effectiveness, gather user feedback, and optimize your DLP strategy for a robust security posture in today’s dynamic digital landscape.

Understanding Power BI Import Query and Direct Query https://reality-craft.com/understanding-power-bi-import-query-and-direct-query/ Thu, 03 Apr 2025 17:02:41 +0000

Introduction: Power BI is a powerful tool that helps businesses analyze data and make informed decisions. To better understand how it works, let’s take SQL Server as an example. When working with SQL Server as a data source in Power BI, you have two main options for connecting to your data: Import Mode and Direct Query. Choosing the right option depends on your specific needs. In this blog, we’ll break down these two modes, explain their features, and help you decide when to use each one.

What Is Import Query?

Import Mode in Power BI allows you to bring data from a source, like SQL Server, directly into Power BI’s memory. This imported data is saved in your Power BI file (.pbix) and doesn’t require a constant connection to the source.

This mode is great for:

  • Performance: It provides fast query response times because everything is stored in memory.
  • Flexibility: You can create advanced calculations, relationships, and data transformations without relying on the source.
  • Small to Medium Data: Works best with smaller datasets that don’t need frequent updates.

When using Import Mode, Power BI takes a snapshot of your data at a specific time. You can then interact with and filter this compressed data without affecting the source. It’s perfect for reports needing complex models or quick responses to user actions.

Key Features of Import Query

  • High Performance: Data is preloaded into memory, enabling faster report rendering and interactions.
  • Offline Capabilities: Once data is imported, reports can be accessed even without a connection to the data source.
  • Enhanced DAX Support: Import Query supports advanced calculations and transformations using Data Analysis Expressions (DAX).
  • Multiple Data Sources: Using Import, you can combine data from various data sources (data flows, databases, CSV)
  • Complexity of Measures: Suitable for complex DAX Measures as all M and Dax Functions are accessible.
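As an illustration of the modelling depth available in Import mode, here is a typical DAX measure (the "Sales" table and its columns are hypothetical):

```
Total Sales = SUMX ( Sales, Sales[Quantity] * Sales[UnitPrice] )
```

Because the data sits in memory, a measure like this recalculates instantly as users slice and filter the report.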

Disadvantages of Import Query

  • Limited to Data Size: Import mode struggles with very large datasets, as in-memory storage has limitations depending on your hardware and Power BI service tier.
  • Data Updates Are Not Real-Time: Data in Import mode is only updated when the dataset is refreshed, which means it may not reflect the latest changes in the source.
  • Refresh Limitations: Dataset refreshes are limited to 8 times per day on the Power BI Pro license (48 times with Premium), which can be a constraint for frequently changing data.
  • File Size Constraints: When you use Import Mode in Power BI, the data you bring in must fit within Power BI’s file size limits. The maximum file size for users with a Power BI Pro license is 1 GB. If you have a Power BI Premium license, you can work with larger files, but there are still size limits depending on your Premium capacity.

What Is a Direct Query?

Direct Query in Power BI allows you to connect directly to a data source (such as a SQL database) and retrieve data in real-time as users interact with the report. Unlike Import Query, where data is stored in Power BI, Direct Query leaves the data in the source, meaning the data is never stored or cached in Power BI. Power BI supports a wide range of data sources for Direct Query, including Azure SQL Database, Dataverse, Snowflake, SQL Server, Teradata Database, and many more. Here is how it works:

  • Real-Time Data Access: Always fetches the most up-to-date data.
  • No Data Size Constraints: Suitable for very large datasets that may exceed Power BI’s in-memory storage limits.
  • Centralized Security: With Direct Query, the data stays in the source system, meaning it follows the security rules and access controls set by that system. This ensures that sensitive information is protected according to the security measures already in place in the data source, rather than relying on Power BI for security.

In simple terms, with Direct Query, Power BI queries the data source every time an interaction or filter is applied in the report, ensuring you’re always working with live data without manually refreshing anything.

Key Features of Direct Query

  • Real-Time Data: With Direct Query, the data in your report is always up to date because every time a user interacts with the report, Power BI sends a query directly to the data source to get the latest data.
  • No Data Storage in Power BI: Since no data is imported into Power BI, there’s no need for manual refreshes, and the data is always current.
  • Large or Frequently Updated Datasets: Direct Query is useful for very large datasets or when the data changes frequently, as it queries the source every time a report is run.
  • Cross-Source Data Models: You can create data models that pull data from multiple tables or even multiple data sources, enabling more flexible and real-time analysis.

Disadvantages of Direct Query

  • Slow Performance: Complex queries can slow down reports, especially with large datasets.
  • Limited Modelling: Fewer options for transforming or calculating data compared to Import mode.
  • Depends on Source: Relies on the performance and availability of the data source.
  • Limited Query Types: Some data sources or types of queries may not be supported by Direct Query. For example, complex SQL queries or specific advanced data operations may not work well or be supported in Direct Query mode, depending on the source system and its capabilities.
  • Connectivity Issues: Direct Query requires a constant and stable connection to the data source. If there are any network issues, or if the connection is slow or unavailable, it will directly affect the performance and usability of your reports. This can cause delays, errors, or incomplete data being displayed in your Power BI reports.
| Feature | Import Query | Direct Query |
| --- | --- | --- |
| Data Storage | Stored in Power BI’s memory | Stored in the source system |
| Performance | Fast report loading and interactions | Slower due to real-time queries |
| Data Freshness | Requires manual or scheduled refreshes | Always up to date |
| Data Size Limitations | Limited by Power BI capacity (1 GB compressed per dataset for Pro license) | No size constraints, but depends on the performance of the source |
| Advanced Calculations | Fully supported with DAX and transformations in Power Query | Limited support: some transformations may not work |
| Connection Dependency | Not required after import | Continuous connection required |
| Query Complexity | Supports complex queries and aggregations in Power BI | Relies on the source system’s query capability |
| Source System Load | Minimal load after data is imported | High load due to real-time queries |
| Offline Access | Fully accessible offline | Not accessible without an active connection |
| Scalability | Limited to Power BI’s in-memory capacity | Scales with source system capacity |
| Real-Time Data | Requires frequent imports to simulate real time | Ideal for real-time scenarios |
| Setup Complexity | Easier to set up and manage | Requires careful design for performance optimization |
| Data Security | Requires data to be imported into Power BI’s service or file | Data remains in the source system, providing additional security in some scenarios |
| Supported Data Sources | Most data sources supported for import | Limited to those with Direct Query capabilities |
| Cost Implications | May require more storage and higher-tier licensing for large datasets | Depends on the performance and licensing of the data source |
| Data Transformation | Power Query allows extensive data shaping and transformation | Limited transformation; mostly in the source system |
| Caching | Fully cached in-memory data | No caching: queries executed each time |

Conclusion

Choosing between Import Query and Direct Query in Power BI depends on your specific needs. Use Import Query for smaller datasets, faster performance, and advanced modelling, particularly when offline access is required. Choose Direct Query for large datasets or when real-time data is essential. Both modes offer unique advantages, so evaluate your performance, data size, and refresh requirements to determine the best option for your scenario.


]]>
How to Pass Data from Power Apps to Power Automate Using JSON https://reality-craft.com/how-to-pass-data-from-power-apps-to-power-automate-using-json/ Thu, 03 Apr 2025 17:02:34 +0000 https://reality-craft.com/?p=756 Introduction: Power Automate is a versatile tool for automating workflows and integrating various systems. However, one common challenge users face is managing large datasets that exceed the platform’s input size limit. These limitations can lead to flow failures, causing frustration and inefficiency.

This blog will guide you through an innovative and practical solution to bypass Power Automate’s input limit by leveraging JSON input and Power Apps. With this approach, you’ll not only streamline your data processing but also build more robust workflows capable of handling large-scale submissions.

Scenario: Imagine a company onboarding employees through a Power Apps form. The form captures vital data such as Name, Department, Email, and Address, along with occasional attachments. This data is sent to Power Automate for processing and storage. However, large submissions often exceed the allowable input size, resulting in failed flows. To resolve this issue, we need to package the data efficiently in JSON format, enabling smooth transmission and seamless automation.

Step 1: Creating the Power App

Set Up the Form

  • Start by creating a Power App.
  • Add the fields you want to populate, such as Name, Department, and Email.

Collect the Data

  • Use a collection to temporarily store the data entered in the form. For instance, add the following code to the Submit button’s OnSelect property:

    ClearCollect(
        colRequestDataInsert,
        {
            FirstName_DataCard2: If(IsBlank(DataCardValue19.Text), "", DataCardValue19.Text)
        }
    );

Convert to JSON

  • After creating the collection, convert it into JSON format using the JSON function:

    Set(
        varInsertFlowData,
        JSON(colRequestDataInsert)
    );

Trigger the Flow

  • Integrate Power Automate into your app and use the Run function to pass the JSON string to the flow:

    Set(
        varFlowAdd,
        Morethen20FieldsAdd.Run(varInsertFlowData)
    );
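Conceptually, the Power Fx steps above collapse any number of form fields into a single JSON string parameter, which is what lets the flow accept one text input instead of dozens of individual arguments. A minimal Python sketch of the same idea (Python is only a stand-in for the Power Fx behavior, and the field name is carried over from the example above):

```python
import json

# Simulate the collection built by ClearCollect: each record applies the
# same blank-to-empty-string handling as the If(IsBlank(...)) check.
def build_record(first_name):
    return {"FirstName_DataCard2": first_name if first_name else ""}

col_request_data_insert = [build_record("Jane"), build_record(None)]

# Simulate the JSON() function: the whole collection becomes one string,
# so the flow receives a single text input regardless of field count.
var_insert_flow_data = json.dumps(col_request_data_insert)
```

However many fields or rows the collection holds, the flow's trigger still sees exactly one string parameter.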

Step 2: Building the Power Automate Flow

Power Apps Trigger

  • Start your flow with the Power Apps trigger. This allows the flow to receive the JSON variable from the Power App.

Compose Action

  • Add a Compose action to capture the data from Power Apps. Insert the variable passed from Power Apps into this action.

Parse JSON

  • Use the Parse JSON action to decode the JSON into a usable format.
  • Add output from compose action inside content.
  • To generate the schema, start by adding only the Compose action in your Power Automate flow. Run the flow once, and after it completes, click the “Generate from sample” button. This opens a pop-up window where you can paste the output copied from the Compose action into the text box provided.
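Under the hood, Parse JSON simply turns the incoming string back into structured records so later actions can reference individual fields. A rough Python equivalent of that step (the field name is illustrative, carried over from the earlier Power Fx example):

```python
import json

# The single string input received from Power Apps (output of JSON()).
raw_input = '[{"FirstName_DataCard2": "Jane"}, {"FirstName_DataCard2": ""}]'

# Equivalent of the Parse JSON action: decode into usable records.
records = json.loads(raw_input)

# Once parsed, each field is addressable by name, just as Parse JSON
# exposes dynamic content tokens to the actions that follow it.
for record in records:
    first_name = record["FirstName_DataCard2"]
```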

Process Data

  • You can now access all the fields (e.g., Name, Department, Email) extracted from the JSON data.
  • Add actions like Create Item (for SharePoint) or Insert Row (for Excel) to store or use the data.
  • For example, we extracted all the columns from the JSON using a Select action.
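The Select action maps each parsed record onto the column names your destination expects. In Python terms, that mapping is a simple projection over the parsed records (the source and destination field names here are hypothetical):

```python
# Parsed records from the Parse JSON step.
records = [
    {"FirstName_DataCard2": "Jane", "Dept_DataCard": "Finance"},
    {"FirstName_DataCard2": "Raj", "Dept_DataCard": "IT"},
]

# Equivalent of a Select action: rename and shape the fields into the
# rows that Create Item (SharePoint) or Insert Row (Excel) will consume.
rows = [
    {"Name": r["FirstName_DataCard2"], "Department": r["Dept_DataCard"]}
    for r in records
]
```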

Test and Validate

  • Save the flow, submit data from the Power App, and verify that the flow processes all the data accurately.

Key Benefits

  • Overcomes Input Limits: Efficiently handles large datasets without hitting Power Automate’s size restrictions.
  • Flexible Integration: Works seamlessly with Power Apps to pass complex data structures.
  • Scalable Solution: Suitable for workflows of varying complexity, from simple forms to multi-field submissions.

Conclusion

By combining Power Apps and Power Automate with JSON, you can overcome the input size limitations, streamline data transmission, and build scalable workflows. This solution is particularly useful for handling complex forms or large datasets in scenarios like employee onboarding, customer feedback collection, or data surveys. Start implementing this approach today to enhance the efficiency and reliability of your automated workflows. With a little creativity and the power of JSON, the possibilities are endless!

]]>
Streamlining Data Management and Reporting with Power BI and PowerApps https://reality-craft.com/streamlining-data-management-and-reporting-with-power-bi-and-powerapps/ Thu, 03 Apr 2025 17:02:22 +0000 https://reality-craft.com/?p=755 Summary

A company integrated Power BI and PowerApps to streamline data management and reporting, enabling seamless data operations directly within reports. By leveraging SQL with Direct Query for real-time updates and incorporating validation and error-handling features, users accessed the latest data more efficiently, improving satisfaction and productivity.


Project Details

  • Customer: Conversant Capital
  • Location: Summit, NJ
  • Industry: Investment Management
  • Department: Real Estate Capital Solutions

Tools and Technology

  • Tools:
    • Power BI Desktop
    • SQL
  • Technology:
    • PowerApps
    • Power BI
    • Power Automate Flow
    • SharePoint Online

Overview

The project focused on integrating Power BI and PowerApps to create a unified platform for managing and reporting data efficiently. Key functionalities included:

  • Data Insertion:
    Users could add data directly through the PowerApps visual embedded within the Power BI report, simplifying the data entry process.
  • Editing Records:
    Users could edit data directly using the PowerApps visual inside the Power BI report, eliminating the need to switch screens and improving efficiency.
  • Real-Time Updates:
    Implemented Direct Query to ensure data synchronization, providing real-time insights.
  • Data Load Operations:
    Developed a user-friendly interface for importing Excel data into SQL tables, featuring validation and error handling.

Challenges and Solutions

Challenge:

Users faced inefficiencies due to separate interfaces for Power BI and PowerApps, leading to fragmented workflows and slower decision-making.

Solution:

The solution integrated Power BI into a single PowerApp, streamlining workflows by consolidating data operations and reports within one interface. Role-based permissions ensured secure access, and a specialized data load screen facilitated Excel-to-SQL uploads with real-time validation.


Features Developed

1. Enhanced User Interface:

  • Embedded PowerApps visuals within Power BI for direct data management.
  • Consolidated multiple screens into a single application, simplifying navigation.

2. Real-Time Updates:

  • Direct Query mode maintained live data synchronization, ensuring reports reflected the most current information.

3. Version Notifications:

  • Implemented alerts to notify users when the PowerApps interface was outdated, prompting them to refresh and access the latest version.

4. Error Handling and Validations:

  • Utilized SQL stored procedures for data operations.
  • Added field-level validation checks in PowerApps and Power BI to maintain data integrity.
  • Designed a data load screen for importing Excel data into SQL tables, featuring validation and error handling.

Client Impact

  • Improved Efficiency:
    Unified data management reduced tab-switching, saving time and boosting productivity.
  • Real-Time Insights:
    Direct Query ensured up-to-date data, enhancing decision-making capabilities.
  • Streamlined Workflow:
    Consolidated interfaces reduced confusion, simplifying processes and improving efficiency.
  • Enhanced Satisfaction:
    The robust integration addressed client needs, delivering a user-friendly solution that optimized data management and reporting.

Conclusion

This project exemplifies the successful integration of Power BI, PowerApps, and SQL, streamlining data management and enhancing reporting. By simplifying workflows, incorporating thorough validation, and improving user experience, the solution significantly boosted operational efficiency and data accuracy, aligning with the client’s business objectives and driving higher satisfaction.

]]>
Migrating Lotus Notes to Dataverse and SharePoint Online https://reality-craft.com/migration-lotus-notes-to-dataverse-and-sharepoint-online/ Thu, 13 Mar 2025 08:02:58 +0000 https://reality-craft.com/?p=1 Migrating Lotus Notes to Dataverse and SharePoint Online

Project Overview

This case study explores the successful migration of approximately 160,000 records from a legacy Lotus Notes database to a modern, cloud-based solution leveraging Dataverse and Power Apps. The primary goal of this migration was to enhance data management, accessibility, and overall efficiency while transitioning to a more scalable and flexible platform, delivering the kind of gains organizations see when moving from manual processes to automated systems.

Project Details

  • Customer: Lambda Legal
  • Location: New York, NY
  • Domain: Legal and Advocacy for LGBTQ+ and HIV Rights
  • Division/Department: Legal, Policy, and Advocacy
  • Tools and Technology: SharePoint Online, Dataverse, Power Automate, Power Apps
  • Migration Tool: Tzunami Migration Tool

Tools and Technology:

  • SharePoint Online
  • Dataverse
  • Power Automate
  • Power Apps
  • Tzunami Migration Tool

Migration Strategies

  • Smaller Batch Migration: Due to repeated failures during large batch exports, the team opted for a smaller batch migration strategy. This approach allowed for the migration of data in manageable segments, reducing the risk of failures and facilitating easier troubleshooting.
  • Exporting as Data Items: Instead of exporting records as separate documents, the new strategy involved migrating them as “Data Items” into SharePoint Lists. This method simplified the process of handling attachments and comments, ensuring that they were directly associated with the corresponding records.
  • Power Automate Implementation: To address issues with metadata mapping and the export of comments, the team used Power Automate to read the exported XML metadata files from Tzunami and migrate them to SharePoint lists, ensuring accurate mapping of approximately 167 fields.
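The smaller-batch strategy can be illustrated with a short Python sketch: split the full record set into fixed-size chunks and process each chunk independently, so a failure affects only one batch rather than the whole run (the batch size and record stand-ins below are illustrative, not taken from the project):

```python
def chunk(records, batch_size):
    """Yield successive fixed-size batches from a list of records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# Stand-ins for the ~160,000 exported Lotus Notes records.
records = list(range(160_000))

batches = list(chunk(records, 5_000))
# Each batch can now be exported and retried independently, so a single
# failure does not force a restart of the entire migration.
```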

Challenges and Solutions

Data Migration Challenges

  • XML Data Parsing: The migration process involved extracting data from the Legacy Database as XML files. Parsing these files presented challenges, particularly when dealing with complex data structures like people fields, choice fields, and handling null values.
  • Large Dataset Migration: The sheer volume of data (160,000 records) posed significant challenges in terms of processing time and potential failures during the migration process.

Solutions Employed

  • Power Automate for Data Extraction: Power Automate flows were utilized to read the XML files, extract relevant data, and insert it into Dataverse tables.
  • Custom Logic for Data Handling: Specific logic was developed to handle people fields by referencing user data from Dataverse, choice fields by using global choice IDs, and null values appropriately.
  • Batch Processing: To mitigate the impact of large datasets, the migration was divided into smaller batches, reducing the risk of failures and improving overall efficiency.
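The custom handling described above can be sketched in Python: read a record from XML, treat missing or empty elements as nulls, and resolve choice fields through a lookup of choice IDs (the element names and the choice-ID mapping below are hypothetical, not taken from the actual export):

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment resembling one exported Lotus Notes record.
xml_data = """
<record>
  <Title>Case 1042</Title>
  <Status>2</Status>
  <AssignedTo></AssignedTo>
</record>
"""

# Hypothetical global choice IDs used when mapping choice fields.
STATUS_CHOICES = {"1": "Open", "2": "In Review", "3": "Closed"}

def parse_record(xml_text):
    root = ET.fromstring(xml_text)

    def text_of(tag):
        el = root.find(tag)
        # Null handling: treat missing or empty elements as None.
        return el.text if el is not None and el.text else None

    return {
        "Title": text_of("Title"),
        # Choice field: resolve the stored ID to its label.
        "Status": STATUS_CHOICES.get(text_of("Status")),
        # People field: left unresolved here; the real flow matched it
        # against user data already present in Dataverse.
        "AssignedTo": text_of("AssignedTo"),
    }

record = parse_record(xml_data)
```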

Form Development and Dataverse Integration

  • Model-Driven App Development: A Model-Driven App was created in Power Apps to replicate the structure and functionality of the Legacy Database.
  • Dataverse Schema Design: Tables and columns were designed in Dataverse to accurately represent the migrated data, including fields for cascading, static headers, and timeline tracking.
  • Business Rules and Logic: Business rules were implemented to control field visibility and behaviour, ensuring the application adheres to specific business requirements.

Key Achievements and Benefits

  • Successful Data Migration: The migration of 160,000 records from the Legacy Database to Dataverse was successfully completed.
  • Modernized Application: The Power Apps application provides a more modern and user-friendly interface, enhancing productivity and accessibility.
  • Improved Data Management: Dataverse offers robust data management capabilities, including features for data governance, security, and integration with other Microsoft 365 services.
  • Enhanced Scalability: The cloud-based nature of Dataverse and Power Apps ensures scalability and flexibility to accommodate future growth.

Lessons Learned and Future Considerations

  • Batch Processing: The importance of dividing large datasets into smaller batches for migration was emphasized to reduce the risk of failures and improve efficiency.
  • Custom Logic Development: The need for custom logic to handle specific data types and scenarios was highlighted, particularly when dealing with legacy systems.
  • Data Validation and Quality: Implementing robust data validation mechanisms is crucial to ensure data integrity and accuracy during the migration process.
  • Continuous Improvement: The migration project should be viewed as an ongoing process, with opportunities for future enhancements and optimizations as technology evolves.

Conclusion

The migration of Legacy Database data to Dataverse and Power Apps showcases how modern cloud platforms can revolutionize legacy applications. By addressing challenges in data migration, form development, and integration, this case study illustrates the advantages of transitioning to flexible and scalable solutions.

]]>