microsoft / coe-starter-kit


[CoE BYODL Starter Kit - BUG] CoE BYODL Flows Dataflow issue #6736

Closed: jayakumarkgds closed this issue 9 months ago

jayakumarkgds commented 1 year ago

Does this bug already exist in our backlog?

Describe the issue

Getting dataflow failures for the BYODL Dataverse sync (screenshot attached).

Expected Behavior

Should refresh the data to dataverse

What solution are you experiencing the issue with?

Core

What solution version are you using?

4.8

What app or flow are you having the issue with?

CoE BYODL Flows

What method are you using to get inventory and telemetry?

Data Export

Steps To Reproduce

Refresh the dataflow

Anything else?

No response

AB#220

manuelap-msft commented 1 year ago

Hello,

We are having some known issues with the CoE BYODL Apps and Makers dataflows, see https://github.com/microsoft/coe-starter-kit/issues/6688 and https://github.com/microsoft/coe-starter-kit/issues/6577. The issue is caused by new properties being made available in Data Export, some of which arrive as nested JSON in the files we consume from the storage account, and the Dataflow transformations weren't robust enough to handle nested JSON.

We are still testing the changes required to the dataflow, but you can try making the following changes on your side and let us know how you get on:

  1. Changes to the Power BI dataflow: https://github.com/microsoft/coe-starter-kit/issues/6688#issuecomment-1731598690
  2. Changes to the Power Platform dataflows: https://github.com/microsoft/coe-starter-kit/issues/6688#issuecomment-1736852295

However, it looks like your issue is with the CoE BYODL Flows Dataflow, is that correct? If so, can you please edit the Dataflow and see if you have any errors in the table transformations there? Thank you!
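As an illustration of the failure mode described above, here is a minimal Python sketch of flattening a nested JSON record into flat, underscore-joined columns such as `events_created_timestamp`. The nesting shape and sample values are hypothetical, not the actual Data Export schema:

```python
import json

def flatten(record: dict, prefix: str = "", sep: str = "_") -> dict:
    """Recursively flatten nested dicts into underscore-joined column names,
    e.g. {"events": {"created": {"timestamp": t}}} -> {"events_created_timestamp": t}."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{sep}{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name, sep))
        else:
            flat[name] = value
    return flat

# One line from a hypothetical newline-delimited export file:
line = '{"resourceId": "abc", "events": {"created": {"timestamp": "2023-09-01T00:00:00Z"}}}'
row = flatten(json.loads(line))
print(row)  # {'resourceId': 'abc', 'events_created_timestamp': '2023-09-01T00:00:00Z'}
```

A transformation that expands a fixed column list will error on rows where a property unexpectedly arrives as a nested object; flattening first (or expanding record columns defensively) avoids that.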

jayakumarkgds commented 1 year ago

For the CoE BYODL Flows Dataflow: editing the dataflow, I don't see any errors; it is loading the tables properly.

jayakumarkgds commented 1 year ago

I have updated the BYODL Maker dataflow but am still getting the same error (screenshot attached).

jayakumarkgds commented 1 year ago


It is still failing after updating the query in the Maker dataflow (screenshot attached).

We have issues with both the Flows and Maker dataflows, with the same error.

manuelap-msft commented 1 year ago


We're releasing a new version tomorrow that has the updated dataflows, that will hopefully address your issue.

jayakumarkgds commented 1 year ago


After updating to the new version we have two different errors: one for the Maker dataflow and another for the Flows dataflow (screenshots attached).

```
Error Code: Mashup Exception Data Source Error
Error Details: Couldn't refresh the entity because of an issue with the mashup document.
MashupException.Error: DataSource.Error: OData: Request failed: The remote server returned an error: (400) Bad Request.
(System.NullReferenceException: Object reference not set to an instance of an object.
   at Microsoft.Dynamics.CDS.AADPlugins.GraphApiClient.ExecuteGraphApiRequest(IPluginExecutionContext pluginContext, String columnsToRetrieve, String filterCondition, String orderBy, String searchQuery, Int32 topCount, String& nextLink, String pagingCookie)
   at Microsoft.Dynamics.CDS.AADPlugins.PersonTypeRetrieveMultiplePlugin.RetrieveMultiple(IPluginExecutionContext pluginContext, EntityMetadata entityMetadata, IGraphApiClient graphApiClient)
   at Microsoft.Dynamics.CDS.AADPlugins.PersonTypeRetrieveMultiplePlugin.<>cDisplayClass2_0.b0()
   at Microsoft.PowerApps.CoreFramework.ActivityLoggerExtensions.Execute(ILogger logger, EventId eventId, ActivityType activityType, Action action, IEnumerable`1 additionalCustomProperties)
   at Microsoft.CDSRuntime.SandboxWorker.SandboxFirstPartyPluginLogger.Execute(String activityName, Action action, IEnumerable`1 additionalCustomProperties)
   at Microsoft.Dynamics.CDS.AADPlugins.PersonTypeRetrieveMultiplePlugin.ExecuteInternal(IPluginExecutionContext pluginContext, IOrganizationService organizationService, EntityMetadata entityMetadata, IGraphApiClient graphApiClient)
   at Microsoft.Dynamics.CDS.AADPlugins.PluginBase.<>cDisplayClass9_0.b0()
   at Microsoft.PowerApps.CoreFramework.ActivityLoggerExtensions.Execute(ILogger logger, EventId eventId, ActivityType activityType, Action action, IEnumerable`1 additionalCustomProperties)
   at Microsoft.CDSRuntime.SandboxWorker.SandboxFirstPartyPluginLogger.Execute(String activityName, Action action, IEnumerable`1 additionalCustomProperties)
   at Microsoft.Dynamics.CDS.AADPlugins.PluginBase.Execute(IServiceProvider serviceProvider))
Details: Reason = DataSource.Error; DataSourceKind = OData; DataSourcePath = https://<>.api.crm4.dynamics.com/api/data/v9.2/aadusers; Url = https://<>.api.crm4.dynamics.com/api/data/v9.2/aadusers?$skiptoken= (Request ID: <>)
```

jayakumarkgds commented 1 year ago

For Flows, still the same issue (screenshot attached).

Jenefer-Monroe commented 1 year ago

to investigate with DF-rewrite

manuelap-msft commented 1 year ago

Hello,

There's been a recent change to the data provided by the Data Export feature - more properties are now provided and some attributes are nested etc. Data format changes are entirely expected during a public preview. Unfortunately, our Power Platform dataflows were not robust enough to handle those changes so we have some catching up to do to modify them, re-test them and ensure they work with the new files.

We can't repro the above error, which makes it a little trickier for us to investigate what's wrong! If you could test the suggested changes below and let us know if you still see failures, that would be great:

  1. Open the CoE BYODL Flows Dataflow > Edit
  2. Open the Transform Files folder > Transform file (9) function > Advanced Editor
  3. Replace the code you see there with

```m
let
    Source = (Parameter as binary) =>
        let
            Source = Table.FromColumns({Lines.FromBinary(Parameter)}),
            #"Transformed Column" = Table.TransformColumns(Source, {"Column1", Json.Document}),
            #"Expanded Column1" = Table.ExpandRecordColumn(#"Transformed Column", "Column1", {"resourceId", "type", "subType", "name", "tenantId", "environmentId", "resourceVersion", "lifecycleState", "events_created_timestamp", "events_created_principalId", "events_modified_timestamp", "sharedUsers", "sharedGroups", "workflowEntityId", "flowSuspensionReason", "templateName", "flowFailureAlertSubscribed", "sharingType"}),
            #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"resourceId", type text}, {"environmentId", type text}, {"type", type text}, {"subType", type text}, {"name", type text}, {"tenantId", type text}, {"lifecycleState", type text}, {"resourceVersion", type text}, {"events_created_timestamp", type datetime}, {"events_created_principalId", type text}, {"events_modified_timestamp", type datetime}, {"sharedUsers", Int64.Type}, {"sharedGroups", Int64.Type}, {"workflowEntityId", type text}, {"flowSuspensionReason", type text}, {"templateName", type text}, {"flowFailureAlertSubscribed", type text}, {"sharingType", type text}})
        in
            #"Changed Type"
in
    Source
```

  4. Select the Flows table > Advanced Editor
  5. Replace the code you see there with

```m
let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerautomate/Flows"),
    excludehiddenfiles = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    excludezerolengthfiles = Table.SelectRows(excludehiddenfiles, each [Attributes]?[Size]? > 0),
    // Extract the environment GUID from the file name
    #"Extract Environment ID from File Name" = Table.TransformColumnTypes(Table.AddColumn(excludezerolengthfiles, "EnvironmentName", each if Text.Length([Name]) = 41 then Text.Range([Name], 0, 36) else Text.Range(Text.AfterDelimiter([Name], "-", 0), 0, 36)), {{"EnvironmentName", type text}}),
    // Remove unnecessary columns that come from the folder system, like extension and folder path
    #"Removed unneeded columns" = Table.RemoveColumns(#"Extract Environment ID from File Name", {"Folder Path", "Attributes", "Date created", "Date modified", "Date accessed", "Extension", "Name"}),
    // Inner join of the flows table with the environments table; only keep flows whose environment still exists (filters out flows from deleted environments)
    #"Merge with environments" = Table.NestedJoin(#"Removed unneeded columns", {"EnvironmentName"}, admin_environments, {"admin_environmentid"}, "admin_environments", JoinKind.Inner),
    #"Expanded Environments" = Table.ExpandTableColumn(#"Merge with environments", "admin_environments", {"admin_environmentid", "admin_displayname"}, {"admin_environmentid", "admin_environmentdisplayname"}),
    // Transform the content of the JSON files in the storage account into tables
    #"Get content from json files" = Table.AddColumn(#"Expanded Environments", "Transform file", each #"Transform file (9)"([Content])),
    // Expand the columns the JSON file provides into individual columns, e.g. flow name, flow id, created by, created on
    #"Expanded columns" = Table.ExpandTableColumn(#"Get content from json files", "Transform file", {"resourceId", "name", "environmentId", "lifecycleState", "events_created_timestamp", "events_created_principalId", "events_modified_timestamp"}, {"flowId", "name", "environmentId", "lifecycleState", "createdOn", "createdBy", "lastModifiedOn"}),
    #"Removed errors" = Table.RemoveRowsWithErrors(#"Expanded columns", {"flowId"}),
    #"Removed Content binary column" = Table.RemoveColumns(#"Removed errors", {"Content"}),
    // Merge flows with makers to get maker details (display name, department, is orphaned, ...) for further logic (is the flow orphaned)
    #"Merged with makers" = Table.NestedJoin(#"Removed Content binary column", {"createdBy"}, admin_makers, {"admin_makerid"}, "admin_makers", JoinKind.LeftOuter),
    #"Expanded maker table" = Table.ExpandTableColumn(#"Merged with makers", "admin_makers", {"admin_makerid", "admin_userprincipalname", "admin_makerisorphaned", "admin_department", "admin_displayname", "admin_country", "admin_company", "admin_city"}, {"admin_makerid", "admin_userprincipalname", "admin_makerisorphaned", "admin_department", "admin_displayname", "admin_country", "admin_company", "admin_city"}),
    // Add a UtcNow timestamp used to identify when the dataflow last updated/created a row
    #"Added QueriedOn" = Table.AddColumn(#"Expanded maker table", "QueriedOn", each DateTimeZone.UtcNow()),
    // Add a yes/no text field to identify whether the flow is owned by a maker that no longer exists in the org
    #"Add IsOrphan" = Table.AddColumn(#"Added QueriedOn", "IsOrphanText", each if [admin_makerisorphaned] = true then "Yes" else "No"),
    #"Merged queries" = Table.NestedJoin(#"Add IsOrphan", {"flowId"}, admin_flows, {"admin_flowid"}, "admin_flows", JoinKind.LeftOuter),
    #"Expanded admin_flows" = Table.ExpandTableColumn(#"Merged queries", "admin_flows", {"admin_derivedownerdisplayname", "admin_derivedownermakerid"}, {"admin_derivedownerdisplayname", "admin_derivedownermakerid"}),
    #"Inserted conditional column" = Table.AddColumn(#"Expanded admin_flows", "DerivedOwnerFinal", each if [admin_derivedownermakerid] = null then [admin_makerid] else [admin_derivedownermakerid]),
    #"Add flowIsPublished" = Table.TransformColumnTypes(Table.AddColumn(#"Inserted conditional column", "flowIsPublished", each true), {{"flowIsPublished", type logical}}), // to replace with logic once available
    #"Add isNotDeleted" = Table.TransformColumnTypes(Table.AddColumn(#"Add flowIsPublished", "isNotDeleted", each false), {{"isNotDeleted", type logical}}),
    #"Changed column type" = Table.TransformColumnTypes(#"Add isNotDeleted", {{"EnvironmentName", type text}, {"admin_environmentid", type text}, {"admin_environmentdisplayname", type text}, {"flowId", type text}, {"name", type text}, {"environmentId", type text}, {"lifecycleState", type text}, {"createdOn", type datetime}, {"createdBy", type text}, {"lastModifiedOn", type datetime}, {"admin_makerid", type text}, {"admin_userprincipalname", type text}, {"admin_makerisorphaned", type logical}, {"admin_department", type text}, {"admin_displayname", type text}, {"admin_country", type text}, {"admin_company", type text}, {"admin_city", type text}, {"QueriedOn", type datetimezone}, {"IsOrphanText", type text}, {"admin_derivedownerdisplayname", type text}, {"admin_derivedownermakerid", type text}, {"DerivedOwnerFinal", type text}, {"flowIsPublished", type logical}, {"isNotDeleted", type logical}})
in
    #"Changed column type"
```

  6. Next > Publish (don't make any changes on the Publish step)

Please let us know if that works!
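For reference, the file-name parsing step in the Flows query above can be mirrored in Python (assuming, as the M code implies, that a 41-character file name is `<36-char GUID>.json` and that otherwise the GUID follows the first `-`; this is a sketch, not part of the fix):

```python
def environment_id_from_filename(name: str) -> str:
    """Mirror the M step "Extract Environment ID from File Name":
    a 41-character name starts with the 36-character environment GUID;
    otherwise the GUID is taken as the 36 characters after the first "-"."""
    if len(name) == 41:
        return name[:36]
    # Equivalent to Text.Range(Text.AfterDelimiter([Name], "-", 0), 0, 36)
    return name.split("-", 1)[1][:36]

guid = "12345678-1234-1234-1234-123456789012"
print(environment_id_from_filename(guid + ".json"))        # prints the GUID itself
print(environment_id_from_filename("1-" + guid + ".json")) # also prints the GUID
```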

jayakumarkgds commented 1 year ago


Thanks for this, I am trying it; updated and waiting for the sync to complete. I am also facing an issue with the Power BI dataflow: no errors, but it isn't completing (screenshot attached).

jayakumarkgds commented 1 year ago


A different error after updating the dataflows as per your input (screenshot attached).

SPMush commented 11 months ago

Hi Manuela, I tried your suggested fix on the dataflows above and all refreshes completed. But I am still seeing JSON format errors, and most environments are being excluded from the PowerPlatformAdminAnalytics-DF query as a result (screenshot attached).

Rajatchopra3 commented 10 months ago

Hi Team, I am also getting the same error (screenshot attached).

CoEStarterKitBot commented 9 months ago

@jayakumarkgds This has been fixed in the latest release. Please install the latest version of the toolkit following the instructions for installing updates. Note that if you do not remove the unmanaged layers as described there you will not receive updates from us.

jayakumarkgds commented 9 months ago

Thanks for the update. How can we upgrade the dataflow? The steps don't include a dataflow upgrade.


jayakumarkgds commented 9 months ago


How can I update the Power BI dataflow to the latest version? Is it only a replacement? If it's only a replacement, it clears the old sync data.

jayakumarkgds commented 9 months ago

Still the same issue (screenshot attached).

@manuelap-msft @Jenefer-Monroe