Are you familiar with dataflows in Power Apps Dataverse and how to create one? If not, don't worry. This Dataverse tutorial will show you how to work with Dataverse dataflows using different connections, covering the following topics:
- What are Dataverse Dataflows?
- Operation of the Dataverse Dataflow function
- What are the benefits of dataverse dataflows
- Dataverse dataflow limitations
- Dataflow data types in dataverse
- How to create dataflow in Power Apps
- Create Power Apps Dataflow from SharePoint List
- Create Dataverse dataflow from Excel
- Dataverse dataflow incremental refresh
- Dataverse dataflow export
What are Dataverse Dataflows?
- Dataflows are a cloud-based, self-service data preparation tool. Customers can use dataflows to ingest, transform, and load data into Power BI workspaces, Microsoft Dataverse environments, or their company’s Azure Data Lake Storage account.
- Because dataflows are included in many Microsoft products, they can be created and run without a dataflow-specific license.
- Dataflows are available in Power Apps, Power BI, and Dynamics 365 Customer Insights. When a dataflow is created and saved, its definition is stored in the cloud, and the dataflow also runs in the cloud.
- Dataflows are built with Power Query, a unified data connectivity and preparation experience already included in several Microsoft products, such as Excel and Power BI.
- Data stays up to date: customers can configure dataflows to run on demand or automatically on a schedule.
Also, have a look: Dataverse Solution [Complete Tutorial]
Operation of the Dataverse Dataflow function
An overview of a dataflow’s architecture is shown in the picture below.
- Data is obtained through a dataflow from several data sources (already supported are over 80 different data sources).
- The dataflow then uses the dataflow engine to convert the data in accordance with the transformations set up with the Power Query experience.
- The data is then imported into the output location, which may be an Azure Data Lake Storage account for the company, a Microsoft Power Platform environment, or a Power BI workspace.
This is how a dataflow functions in Dataverse.
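The three stages above can be sketched as plain functions. The Python below is an illustrative analogy, not actual dataflow code; the sample rows, column names, and destination are invented for the example.

```python
# Illustrative sketch of the three dataflow stages: extract, transform, load.
# The source data, column names, and destination are hypothetical.

def extract():
    """Stage 1: pull rows from a data source (here, a hardcoded sample)."""
    return [
        {"Title": "Power Apps", "Price": "25"},
        {"Title": "Power BI", "Price": "30"},
    ]

def transform(rows):
    """Stage 2: apply Power Query-style transformations (rename, cast types)."""
    return [{"Name": r["Title"], "Price": float(r["Price"])} for r in rows]

def load(rows, destination):
    """Stage 3: write the shaped rows to the output location."""
    destination.extend(rows)
    return destination

dataverse_table = []  # stand-in for a Dataverse table or data lake folder
load(transform(extract()), dataverse_table)
print(dataverse_table[0]["Name"])  # Power Apps
```

In a real dataflow, the extract stage is a connector, the transform stage is the Power Query engine, and the load stage writes to Dataverse, Power BI, or Azure Data Lake Storage.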
What are the benefits of Dataverse dataflows?
The benefits of using Dataverse dataflows are highlighted in the following points:
- Dataflows are totally cloud-based. There is no need for extra infrastructure.
- Using licenses for Power Apps, Power BI, and Customer Insights, you have various choices for getting started with dataflows.
- Instead of being scattered across several objects, the data transformation code can live in a single, central location: a dataflow.
- Power Query, a powerful, graphical, self-service data transformation experience, is utilized by dataflows.
- A dataflow is independent of products. You may access its data via other tools and services; Power BI is not the only place where it is available.
- In a Power BI system, a dataflow separates the modelling and visualization layer from the data transformation layer.
- Dataflows are intended for self-service scenarios: they can perform complicated transformations but don’t require an IT or programming background.
- A dataflow builder needs only Power Query knowledge. In an environment with multiple creators, the dataflow creator can be one member of a team that jointly builds the complete application or BI solution.
Check out: How to Create Dataverse File Field
Dataverse dataflow limitations
There are certain limitations in Dataverse Dataflows. Such as:
- Deleted data sources aren’t removed from the dataflow data source page. This behavior is benign and doesn’t affect refreshing or editing dataflows.
- Deleted data sources still appear in the gateway drop-down menu on the Settings page.
- Linked dataflows (dataflows that consume other dataflows) have a maximum depth of 32.
- Refreshing more than 10 dataflows across workspaces requires a Power BI Premium subscription.
- Power BI dataflows don’t support the use of global variables in a URL argument.
- You can use parameters in Power BI dataflows, but you can’t edit them except by editing the entire dataflow. In this sense, parameters in dataflows behave like declared constants.
- The Power Query Online usage limits page lists the Power Query restrictions.
- Multi-Geo is currently not supported unless storage is configured to use your own Azure Data Lake Storage Gen2 account. VNets are supported through the use of a gateway.
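The linked-dataflow depth limit can be illustrated with a small sketch: given a hypothetical mapping of which dataflow consumes which, walk the chain and compare its length against the 32-level maximum. This is purely illustrative; the names and mapping are invented.

```python
# Hypothetical sketch of the linked-dataflow depth limit described above.
MAX_DEPTH = 32

def chain_depth(dataflow, links):
    """Return how many dataflows deep the chain starting at `dataflow` goes.

    `links` maps a dataflow name to the dataflow it consumes (or None)."""
    depth = 1
    while links.get(dataflow) is not None:
        dataflow = links[dataflow]
        depth += 1
    return depth

links = {"C": "B", "B": "A", "A": None}  # C consumes B, which consumes A
print(chain_depth("C", links))           # 3
print(chain_depth("C", links) <= MAX_DEPTH)  # True
```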
Dataflow Data Types in Dataverse
Dataverse dataflows support the following data types:
| All Data Types | Dataflow Data Types |
| --- | --- |
| Time | Time |
| Date | Date |
| DateTime | DateTime |
| DateTimeZone | DateTimeOffset |
| Logical | Boolean |
| Text | String |
| Any | String |
| Currency | Decimal |
| Int8 | Int64 |
| Int16 | Int64 |
| Int32 | Int64 |
| Int64 | Int64 |
| Double | Double |
| Percentage | Double |
| Single | Double |
| Decimal | Double |
| Number | Double |
| Duration | Not Supported |
| Binary | Not Supported |
| Function | Not Supported |
| Table | Not Supported |
| List | Not Supported |
| Record | Not Supported |
| Type | Not Supported |
| Action | Not Supported |
| None | Not Supported |
| Null | Not Supported |
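For reference, the mapping above can be expressed as a lookup table, so a script can predict how a Power Query type will land in Dataverse. This is a plain Python dictionary built from the table above, not an official API; "Not Supported" types map to `None`.

```python
# The data-type table above as a lookup. Unsupported types return None.
DATAFLOW_TYPE_MAP = {
    "Time": "Time", "Date": "Date", "DateTime": "DateTime",
    "DateTimeZone": "DateTimeOffset", "Logical": "Boolean",
    "Text": "String", "Any": "String", "Currency": "Decimal",
    "Int8": "Int64", "Int16": "Int64", "Int32": "Int64", "Int64": "Int64",
    "Double": "Double", "Percentage": "Double", "Single": "Double",
    "Decimal": "Double", "Number": "Double",
}

def dataflow_type(power_query_type):
    """Return the Dataverse dataflow type, or None if unsupported."""
    return DATAFLOW_TYPE_MAP.get(power_query_type)

print(dataflow_type("Currency"))  # Decimal
print(dataflow_type("List"))      # None (not supported)
```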
Read: How to Create Dataverse View
How to create dataflow in Power Apps
To create a dataflow in Power Apps or Dataverse, you need to follow these five steps:
- Set the destination where the output data will be loaded.
- Identify the source the data should come from.
- Apply Power Query transformations.
- Schedule dataflow runs.
- Use the data you loaded into the final storage location.
Create Power Apps Dataflow from SharePoint List
This part will describe how to establish a Dataflow in Power Apps or Dataverse from scratch. Here, we’ll use a SharePoint List to develop a Dataverse Dataflow.
Step – 1:
- First, on the Power Apps screen, expand Dataverse -> select Dataflows -> click + New dataflow at the top of the page.
- When we expand the + New dataflow button, we have two choices:
- Start from blank = Select this option if you want to build the Power Apps dataflow from scratch.
- Import template = Choose this if you want to create a dataflow from a template imported from your system.
We have selected the Start from blank option for this scenario.
- On the New dataflow page, provide a unique name (Dataflow For Book Details) for the new dataflow and then click the Create button.
Step – 2:
- We will then be prompted to select the data source from which to get the data. You can access the data through several connectors, as seen in the image below, or drag a file from your PC.
- Tabs such as All categories, File, and Database make it easier to find connectors of a specific kind. You can also search for any connector by name using the search bar.
- We will choose the SharePoint Online list as shown below since we will retrieve the data from the SharePoint list.
Step – 3:
Once we select the SharePoint list connector, it will direct us to the next page where we need to connect the SharePoint data source.
The two connection configurations are as follows:
1. Connection settings: This configuration has just one field:
- Site URL = Specify the URL of the SharePoint site that contains the SharePoint list. You can type a URL such as “https://<tenant>.sharepoint.com/sites/PowerApps/“.
2. Connection credentials: This setting has four fields (the options below are available if this is your first time signing in):
- Connection = Choose between creating a new connection and using an existing one.
- Connection name = This is filled in automatically from the Site URL box.
- Data gateway = This is set to none by default.
- Authentication kind = Choose the Organizational account option as the authentication type.
If you have signed in before, you won’t see the options above; instead, you’ll see only an Edit connection button.
NOTES:
-> If you aren’t already logged in, click the Sign-in option and enter your Microsoft login information to access the SharePoint Online list.
-> Make sure Organizational account (not Anonymous) is selected as the authentication kind.
Click the Next button to continue once the connection is complete.
IMPORTANT:
If you enter the wrong URL, username, or password, you will get an “Invalid credentials” warning. The error can also occur if the wrong source is selected.
Refer to the screenshot below.
Step – 4:
On the next page, select the data or SharePoint list that appears in the left menu. A Power Query editor displays after we choose the particular SharePoint list (Book Purchase Info), loading all the columns and data. Click the Transform data button.
Step – 5:
- On this page, you have access to all available transformations. As you’ll see, the designer is similar to Power BI. Here you can, for example, remove a row or a column.
- The bottom portion also shows the loading time, the number of columns (39), and the number of data rows (7, excluding headers and column names). Then select Next.
Step – 6:
- We will now map the tables, the columns, and the load configuration. You can either load the data into an existing table or create a brand-new one. I’ve decided to load it into a new table.
- The Table name and Table display name fields are populated automatically. The Table description field is optional.
- Next, determine the table’s primary key (the unique column, as in SQL). If you already have a suitable column, you can use it, or leave it as Auto-generated. In my case, I’ll keep the Auto-generated default option.
- On the current page, each Destination column appears as a Text type. The complete mapping screen looks as follows. Click the Next button to continue.
Step – 7:
- Once you click the Next button, nothing happens; you aren’t taken to the next page. Under the Column Mapping section, a warning appears with the text “Some column names are only different by letter case. Please update the column names to be unique case-insensitive names.“
- This warning indicates that some column names in the source are not unique when case is ignored. To move on with the procedure, we must adjust the column names to be unique.
- To resolve this problem, use the Back button at the bottom left to return to the Query editor page.
Step – 8:
On the Power Query editor page, you can see two columns with the same name, i.e. ID. As seen below, one ID is lowercase and the other is uppercase. To make the names unique, we must remove one of the ID fields.
NOTE:
In the query editor, you can remove any unnecessary fields or other columns with the same name, not just the ID column.
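The case-collision check behind this warning can be sketched in a few lines. The helper and the sample column list below are hypothetical Python, not Power Platform code; they just show why "ID" and "Id" count as duplicates.

```python
# Hypothetical helper mirroring the warning above: find column names that
# collide when compared case-insensitively (e.g. "ID" vs "Id").
def case_insensitive_duplicates(columns):
    seen = {}
    duplicates = []
    for name in columns:
        key = name.lower()
        if key in seen:
            duplicates.append((seen[key], name))
        else:
            seen[key] = name
    return duplicates

columns = ["Title", "ID", "Id", "Created"]
print(case_insensitive_duplicates(columns))  # [('ID', 'Id')]
```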
Step – 9:
- Select the Id (lowercase) column and click Remove columns under the Remove columns section of the command bar.
- The Id column has now been removed from the query editor. Click Next when it is finished.
Step – 10:
On the following screen, we will once again see the Map table page. Simply modify the Table name before clicking the Next button.
Step – 11:
- Finally, Refresh manually and Refresh automatically are the two choices on the Refresh settings page. You must choose either manual or automatic refreshing. If the data migration is a one-time operation, you don’t need to do anything in this section.
- If you choose the Refresh automatically option, you must choose a time interval.
- Here, choose Refresh automatically, and you’ll see two parts:
- Frequency-based refresh = Select this if you want the dataflow to refresh once a day, once a week, once a month, once a year, or more frequently. We have chosen to update the dataflow once weekly for a set period.
- Refresh on specific days and times = Select this if you want the dataflow to refresh at a certain time and day.
- Additionally, select the “Send refresh failure notification to the dataflow owner” option if you want to notify the dataflow owner of a refresh failure.
- Select Publish -> Publish now after that and wait a bit.
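A frequency-based schedule boils down to simple interval arithmetic. The sketch below (a hypothetical Python helper, not Power Platform code) computes the next run for a weekly refresh.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a frequency-based schedule: given the last refresh
# time and a weekly interval, compute when the next refresh is due.
def next_refresh(last_refresh, interval=timedelta(weeks=1)):
    return last_refresh + interval

last = datetime(2024, 1, 1, 9, 0)
print(next_refresh(last))  # 2024-01-08 09:00:00
```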
Step – 12:
- The dataflow will refresh, and the progress status will appear after some time. The final confirmation notice, “Your dataflow “Dataflow For Book Details” was published successfully and is presently refreshing,” will then appear at the top of the page.
- Now, to view the created Dataverse dataflow in Power Apps, go to Tables -> Custom tab -> you can view the SharePoint list (Book Purchase Info) name as shown below.
These are the steps to create a dataflow in Power Apps.
Have a look: How to Upload images to Dataverse from Power Apps
Dataverse Dataflow Excel
Next, we will see how to create a Dataverse dataflow from an Excel spreadsheet. Every step here is comparable to the ones above; we only need to use the Excel Workbook connector to get the data. Follow the steps below:
Step – 1:
On the Power Apps screen, expand Dataverse, choose Dataflows, and then click the + New dataflow link at the top of the page. Then choose the Start from blank option.
Step – 2:
Give the new dataflow a distinctive name (such as Create Dataflow From Excel) and then click the Create button on the New dataflow screen.
Step – 3:
- A Power Query box will appear where you can select your data source, i.e., the source from which you will get your data. It may be either on-premises or cloud-based; examples include Excel, Azure SQL Database, SharePoint, Azure Data Explorer, Salesforce, Oracle Database, and more.
- An Excel workbook will be the data source in this example. Select Excel workbook under the All categories section below.
Step – 4:
- The following step is to create a connection to your data source; in this example, we will do so for a OneDrive Excel file.
- Choose the Link to file option -> click the Browse OneDrive button -> select the specific Excel file (Sales) from the OneDrive account -> click the Select button as shown below.
Step – 5:
When the upload of the Excel file is complete, the connection settings screen will appear as seen in the following image. If you want to alter the connection, use the Edit connection link and make the necessary changes. Click Next.
Step – 6:
The tables from your Excel file will be displayed. Click Transform Data after selecting the table from which you need to transform the data.
Step – 7:
You will then be taken to the dataflow editor, where you can customize your data. To see and alter the code generated by the query editor, use the Advanced editor, as in the example below. Click the Next button.
Step – 8:
- You will now be prompted to map your data; you can load it into an existing table, map it to a new table, or choose Do not load.
- In our scenario, we’ll load the data into a new table. After finishing your choices, you are ready to move on to the next step.
- Here, we just changed the Table name (SALES Table) and Table display name (SALES Table) and then clicked the Next button.
Step – 9:
- The next step is to choose the dataflow’s refresh frequency. To keep data current, dataflows use a data refresh mechanism.
- You can choose a schedule for the data refresh and decide whether to refresh manually or automatically.
- Here we have chosen Refresh automatically and provided the frequency-based refresh timings as shown below. Once everything is done, click the Publish button.
- The Publish button offers two options, i.e. Publish now and Publish later. We can publish the Dataverse dataflow in Power Apps accordingly.
Step – 10:
- Our dataflow is now refreshing and can be seen in progress in the Dataflows section. After a while, you will see that the dataflow has been published and that the next refresh date and time are displayed.
- You can view the table (SALES Table) to which we just mapped the data by navigating to the Dataverse tables (Custom tab).
- As soon as we click on it, the Dataverse table shows all the data that was taken from the Excel spreadsheet.
That’s how we can create the Power Apps Dataflow from Excel.
Read: Filter Dataverse Choice Column [With Various Examples]
Dataverse dataflow incremental refresh
Incremental refresh lets a dataflow load only the data from a certain time frame. This can decrease the quantity of data processed during each dataflow refresh, resulting in quicker, more dependable, and resource-efficient subsequent refreshes.
Dataverse dataflow incremental refresh includes two types. Such as:
- Full refresh = The Dataverse table is completely refreshed with this option.
- Incremental refresh = Select a date or time column to filter by; only rows within the specified period are refreshed.
To work with dataflow incremental refresh, expand Dataverse -> Dataflows -> My Dataflows -> click the ellipsis of a specific dataflow -> select Edit incremental refresh as shown in the screenshot below.
Next, select Incremental refresh -> Choose a Date or Time column to filter by -> Provide the number of days, months, quarters, or years that you want to refresh the rows from the past -> Click on Save.
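Conceptually, incremental refresh is a date filter over a trailing window. The Python sketch below illustrates the idea with invented rows and a hypothetical "Modified" column; it is not how Dataverse implements it internally.

```python
from datetime import date, timedelta

# Sketch of the incremental-refresh idea: only rows whose date column falls
# inside the trailing window are reprocessed. Rows and column name are
# hypothetical.
def rows_to_refresh(rows, date_column, days, today):
    cutoff = today - timedelta(days=days)
    return [r for r in rows if r[date_column] >= cutoff]

rows = [
    {"Title": "Old order", "Modified": date(2024, 1, 1)},
    {"Title": "New order", "Modified": date(2024, 3, 28)},
]
recent = rows_to_refresh(rows, "Modified", days=30, today=date(2024, 4, 1))
print([r["Title"] for r in recent])  # ['New order']
```

A full refresh, by contrast, would reprocess every row regardless of its date.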
That’s all about Dataverse dataflow incremental refresh.
Dataverse dataflow export
To work with the Dataverse export things, refer to this Dataverse tutorial: Dataverse dataflow export
Also, you may like some more Dataverse and Power Apps tutorials:
- How To Get Dataverse List Rows Count Using Power Automate
- How to Export Dataverse Table to Excel
- Power Apps Add Data to Dataverse Table
- Dataverse Primary Name Column Autonumber
- Delete All Records From Dataverse Table [With Examples]
- Dataverse create table from SharePoint list
- Power Apps Sort Function [With 19 useful examples]
- PowerApps Search Function + How to use with example
This Dataverse tutorial covered a variety of topics, including how to leverage various connections when using Dataverse Dataflows. We also discussed the following topics:
- What are Dataverse Dataflows?
- Operation of the Dataverse Dataflow function
- What are the benefits of dataverse dataflows
- Dataverse dataflow limitations
- Dataflow data types in dataverse
- How to create dataflow in Power Apps
- Create Power Apps Dataflow from SharePoint List
- Create Dataverse dataflow from Excel
- Dataverse dataflow incremental refresh
- Dataverse dataflow export
I am Bijay, a Microsoft MVP (10 times – My MVP Profile) in SharePoint with more than 17 years of expertise in SharePoint Online Office 365, SharePoint Subscription Edition, and SharePoint 2019/2016/2013. I currently work in my own venture, TSInfo Technologies, a SharePoint development, consulting, and training company. I also run the popular SharePoint website EnjoySharePoint.com.