microsoft / coe-starter-kit


Unable to turn on or fix connection references #7595

Closed RickKush closed 8 months ago

RickKush commented 9 months ago

Does this question already exist in our backlog?

What is your question?

How can I turn on or fix connection references? They are all off. image

What solution are you experiencing the issue with?

Core

What solution version are you using?

4.22

What app or flow are you having the issue with?

Connection References

What method are you using to get inventory and telemetry?

Cloud flows

AB#2212

Jenefer-Monroe commented 9 months ago

Hello. Not all objects have an on/off state and unfortunately that column is still listed for some of these. So for connection references there is no on/off notion, and they just show as off.

For fixing them, I presume you mean update the identity they use? For that you have to go to the default solution instead of going into the Core solution. image

Hope that helps!

RickKush commented 9 months ago

[like] Kushner, Rick [USA] reacted to your message.

RickKush commented 9 months ago

I think this isn't the issue I need to pursue.

RickKush commented 9 months ago

Why is the status Off for all of my connection references, even in the default environment?

Admin | Sync Template v3 (Environment Properties) isn't getting triggered, apparently because Admin | Sync Template v3 (Driver) isn't running, due to the connection reference problem:

Flow client error returned with status code "Forbidden" and details "{"error":{"code":"ConnectionAuthorizationFailed","message":"The caller object id is '20ff9928-4866-467e-b32e-df53a128ae9c'. Connection 'admin_CoECoreTeams' to 'shared_teams' cannot be used to activate this flow, either because this is not a valid connection or because it is not a connection you have access permission for. Either replace the connection with a valid connection you can access or have the connection owner activate the flow, so the connection is shared with you in the context of this flow."}}".

I am unable to find or create any connection for shared teams.

Jenefer-Monroe commented 9 months ago

Sorry, it sounds like you are having issues with the product itself and not the kit. I can try a little more, but we unfortunately are not staffed to be product support.

Note that connection references will always say "off"; it's just a junk column and means nothing.

Go to the default envt and look at each of the connection references there. Are they connected to the expected user? image

RickKush commented 9 months ago

All of the connections say Connected.

RickKush commented 9 months ago

Admin | Sync Template v3 (Driver) is in the Core solution, but it isn't included in the Flows in the Environment.

RickKush commented 9 months ago

I was able to recreate the connections for the correct user and run the flow successfully, but it terminates immediately after the trigger: image

RickKush commented 9 months ago

Apparently it doesn't update Environments because I am doing BYODL. Admin | Sync Template v3 (Driver) terminates on the BYODL condition, so Admin | Sync Template v3 (Environment Properties) still isn't getting triggered.

Jenefer-Monroe commented 9 months ago

That's correct. If you use the Setup Wizard it will turn the correct flows on for you. If you are using Data Export (vs. Cloud Flows) then this flow does not need to be turned on. Have you used the Setup Wizard to set up Data Export? If so, you should see the following when you look at the Dataflows in the envt: they are all successfully published and refreshed, and the Maker one (and only the Maker one) has a Next Refresh scheduled. It's the completion of these that should trigger the Envt Properties flow. image

RickKush commented 9 months ago

I made a mistake when I created this ticket. I am set up for BYODL. Should I go back to the Setup Wizard? Are the environments still supposed to update when I use BYODL?

Jenefer-Monroe commented 9 months ago

If you are unsure whether the correct flows are on, the easiest way to address this, in my experience, is to use the Setup Wizard. Yes: when complete and correctly configured as shown above, the CoE BYODL Environments Dataflow will write to the Environments table and trigger all the other inventory flows not covered by the Data Export files, including this Environment Properties flow.

RickKush commented 9 months ago

[like] Kushner, Rick [USA] reacted to your message.

RickKush commented 9 months ago

At the Refresh Dataflow step: PowerQueryDataflows.RefreshDataflow failed: { "Message": "The refresh request is invalid 'https://unitedstates-002.azure-apim.net/apim/dataflows/1b557538431d4884b2a9c9e769926d32/api/groups/a626fa30-f1a6-ed81-8202-ad1dbe404d16/dataflows/ba2611a1-26d3-4456-af9a-d41589317851/refreshdataflow?workspaceType=Environment'", "Reason": "'InvalidRefreshRequest' - GroupType 'Environment'", "WorkspaceType": "Environment", "OperationId": "RefreshDataflow", "ClientRequestId": "d8fab929-eac6-4574... image

RickKush commented 9 months ago

The CoE BYODL - When Environment dataflow refresh is complete Flow has this error:

The trigger is invalid 'https://flow-apim-unitedstates-002-eastus-01.azure-apim.net/apim/dataflows/2d9ab9e8b4f04e57b90ac8c200949095/api/groups/a626fa30-f1a6-ed81-8202-ad1dbe404d16/dataflows/87769e96-9a65-41ee-b69f-47e1f10cc6a4/onrefreshcomplete?workspaceType=Environment'

The dataflow ID, 87769e96-9a65-41ee-b69f-47e1f10cc6a4, comes from an environment var which is populated automatically, but I am unable to verify it.

The error is similar for all five of the CoE BYODL dataflows.

Jenefer-Monroe commented 9 months ago

First, note that once you get this set up you need to set the Maker dataflow (and only the Maker one) to auto refresh. This is one of the steps in the Setup Wizard. image

Jenefer-Monroe commented 9 months ago

It looks like you may be hitting that same issue again where you did not create all the connections when you edited the dataflows. I believe we walked through this together before. Remember that you need to touch every query and wait until it completes, and that you don't need to authenticate again. image

RickKush commented 9 months ago

I created a new connection, but it doesn't change the data source. Do I need to change it? How?

image

image

image

RickKush commented 9 months ago

DataFormat.Error: The Container name must be 3 to 63 lower case letters and numbers. It must start with a letter or number and can contain only letters, numbers, and the dash (-) character. No consecutive dashes are allowed. Every dash (-) character must be immediately preceded and followed by a letter or number.

---------- Session ID ---------- 9dd4c444-e177-493a-a894-fedd7eadcb04

---------- Request ID ---------- 8cd020cd-e88e-48ce-b5f3-6076e5bcabb4

---------- Mashup script ----------

```
section Section1;

shared #"Transform environment file" = let
    Source = Table.FromColumns({Lines.FromBinary(#"Parameter (4)")}),
    #"Transformed Column" = Table.TransformColumns(Source, {"Column1", Json.Document}),
    #"Expanded Column1" = Table.ExpandRecordColumn(#"Transformed Column", "Column1", {"environmentId", "environmentName", "isDefault", "environmentState", "environmentUrl", "environmentType", "securityGroup", "purpose", "cdsInstanceId", "cdsInstanceUrl", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "deletedTime", "environmentRegion", "tenantGuid"}, {"environmentId", "environmentName", "isDefault", "environmentState", "environmentUrl", "environmentType", "securityGroup", "purpose", "cdsInstanceId", "cdsInstanceUrl", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "deletedTime", "environmentRegion", "tenantGuid"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"environmentId", type text}, {"environmentName", type text}, {"isDefault", type logical}, {"environmentState", type text}, {"environmentUrl", type text}, {"environmentType", type text}, {"securityGroup", type text}, {"purpose", type text}, {"cdsInstanceId", type text}, {"cdsInstanceUrl", type text}, {"createdTime", type datetime}, {"createdPrincipalId", type text}, {"lastModifiedTime", type text}, {"lastModifiedPrincipalId", type text}, {"deletedTime", type text}, {"environmentRegion", type text}, {"tenantGuid", type text}})
in
    #"Changed Type";

shared #"Sample app file" = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerapps/Applications"),
    #"Filtered hidden files" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    Navigation = #"Filtered hidden files"{0}[Content]
in
    Navigation;

shared #"Transform app file" = let
    Source = (Parameter as binary) => let
        Source = Table.FromColumns({Lines.FromBinary(Parameter)}),
        #"Transformed Column" = Table.TransformColumns(Source, {"Column1", Json.Document}),
        #"Expanded Column1" = Table.ExpandRecordColumn(#"Transformed Column", "Column1", {"resourceId", "environmentId", "type", "subType", "DocumentVersion", "name", "description", "uri", "tenantId", "lifecycleState", "owner", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "lastEnabledTime", "lastEnabledPrincipalId", "sharedUsers", "sharedGroups", "appPlanClassification", "bypassConsent", "canConsumeAppPass", "sharepointFormURL", "status", "almMode", "usesDataverseTables", "dlpEvaluationStatus"}, {"resourceId", "environmentId", "type", "subType", "DocumentVersion", "name", "description", "uri", "tenantId", "lifecycleState", "owner", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "lastEnabledTime", "lastEnabledPrincipalId", "sharedUsers", "sharedGroups", "appPlanClassification", "bypassConsent", "canConsumeAppPass", "sharepointFormURL", "status", "almMode", "usesDataverseTables", "dlpEvaluationStatus"}),
        #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"resourceId", type text}, {"environmentId", type text}, {"type", type text}, {"subType", type text}, {"DocumentVersion", type any}, {"name", type text}, {"description", type text}, {"uri", type text}, {"tenantId", type text}, {"lifecycleState", type text}, {"owner", type text}, {"createdTime", type datetime}, {"createdPrincipalId", type text}, {"lastModifiedTime", type datetime}, {"lastModifiedPrincipalId", type text}, {"lastEnabledTime", type datetime}, {"lastEnabledPrincipalId", type text}, {"sharedUsers", Int64.Type}, {"sharedGroups", Int64.Type}, {"appPlanClassification", type text}, {"bypassConsent", type text}, {"canConsumeAppPass", type any}, {"sharepointFormURL", type any}, {"status", type any}, {"almMode", type any}, {"usesDataverseTables", type any}, {"dlpEvaluationStatus", type any}})
    in
        #"Changed Type"
    /*let Source = (Parameter as binary) => let Source = Text.FromBinary(Parameter),
        #"Imported JSON" = Replacer.ReplaceText(Source, "}", "},"),
        #"Trim last character" = Text.Trim(#"Imported JSON", ","),
        #"Insert bracket" = Text.Insert(#"Trim last character", 0, "["),
        #"Insert second bracket" = Text.Insert(#"Insert bracket", Text.Length(#"Insert bracket"), "]"),
        #"Parsed JSON" = Json.Document(#"Insert second bracket")
    /* #"Navigation 1" = #"Parsed JSON"{0}
        #"Converted to table" = Record.ToTable(#"Navigation 1"),
        #"Transposed table" = Table.Transpose(#"Converted to table"),
        #"Promoted headers" = Table.PromoteHeaders(#"Transposed table", [PromoteAllScalars = true]),
        #"Changed column type" = Table.TransformColumnTypes(#"Promoted headers", {{"resourceId", type text}, {"environmentId", type text}, {"type", type text}, {"subType", type text}, {"DocumentVersion", type text}, {"name", type text}, {"description", type text}, {"uri", type text}, {"tenantId", type text}, {"lifecycleState", type text}, {"owner", type text}, {"createdTime", type datetime}, {"createdPrincipalId", type text}, {"lastModifiedTime", type datetime}, {"lastModifiedPrincipalId", type text}, {"lastEnabledTime", type datetime}, {"lastEnabledPrincipalId", type text}, {"sharedUsers", Int64.Type}, {"sharedGroups", Int64.Type}, {"solution", type any}})
    in
        #"Parsed JSON" */
in
    Source;

shared #"Sample flow file" = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerautomate/Flows"),
    #"Filtered hidden files" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    Navigation = #"Filtered hidden files"{0}[Content]
in
    Navigation;

shared #"Transform flow file" = let
    Source = (Parameter as binary) => let
        Source = Table.FromColumns({Lines.FromBinary(Parameter)}),
        #"Transformed Column" = Table.TransformColumns(Source, {"Column1", Json.Document}),
        #"Expanded Column1" = Table.ExpandRecordColumn(#"Transformed Column", "Column1", {"resourceId", "type", "subType", "name", "tenantId", "environmentId", "resourceVersion", "lifecycleState", "events_created_timestamp", "events_created_principalId", "events_modified_timestamp", "sharedUsers", "sharedGroups", "workflowEntityId", "flowSuspensionReason", "templateName", "flowFailureAlertSubscribed", "sharingType"}),
        #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"resourceId", type text}, {"environmentId", type text}, {"type", type text}, {"subType", type text}, {"name", type text}, {"tenantId", type text}, {"lifecycleState", type text}, {"resourceVersion", type text}, {"events_created_timestamp", type datetime}, {"events_created_principalId", type text}, {"events_modified_timestamp", type datetime}, {"sharedUsers", Int64.Type}, {"sharedGroups", Int64.Type}, {"workflowEntityId", type text}, {"flowSuspensionReason", type text}, {"templateName", type text}, {"flowFailureAlertSubscribed", type text}, {"sharingType", type text}})
    in
        #"Changed Type"
in
    Source;

shared #"Sample environment file" = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerapps/environments"),
    #"Filtered hidden files" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    Navigation = #"Filtered hidden files"{0}[Content]
in
    Navigation;

shared #"Transform enviornment file" = let
    Source = (Parameter as binary) => let
        Source = Table.FromColumns({Lines.FromBinary(Parameter)}),
        #"Transformed Column" = Table.TransformColumns(Source, {"Column1", Json.Document}),
        #"Expanded Column1" = Table.ExpandRecordColumn(#"Transformed Column", "Column1", {"environmentId", "environmentName", "isDefault", "environmentState", "environmentUrl", "environmentType", "securityGroup", "purpose", "cdsInstanceId", "cdsInstanceUrl", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "deletedTime", "environmentRegion", "tenantGuid"}, {"environmentId", "environmentName", "isDefault", "environmentState", "environmentUrl", "environmentType", "securityGroup", "purpose", "cdsInstanceId", "cdsInstanceUrl", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "deletedTime", "environmentRegion", "tenantGuid"}),
        #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"environmentId", type text}, {"environmentName", type text}, {"isDefault", type logical}, {"environmentState", type text}, {"environmentUrl", type text}, {"environmentType", type text}, {"securityGroup", type text}, {"purpose", type text}, {"cdsInstanceId", type text}, {"cdsInstanceUrl", type text}, {"createdTime", type datetime}, {"createdPrincipalId", type text}, {"lastModifiedTime", type text}, {"lastModifiedPrincipalId", type text}, {"deletedTime", type text}, {"environmentRegion", type text}, {"tenantGuid", type text}})
    in
        #"Changed Type"
in
    Source;

shared #"Sample file (4)" = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerapps/environments"),
    #"Filtered hidden files" = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    Navigation = #"Filtered hidden files"{0}[Content]
in
    Navigation;

shared #"Parameter (4)" = let
    Parameter = #"Sample file (4)" meta [IsParameterQuery = true, IsParameterQueryRequired = true, Type = type binary]
in
    Parameter;

shared #"Transform file (4)" = let
    Source = (Parameter as binary) => let
        Source = Table.FromColumns({Lines.FromBinary(Parameter)}),
        #"Transformed Column" = Table.TransformColumns(Source, {"Column1", Json.Document}),
        #"Expanded Column1" = Table.ExpandRecordColumn(#"Transformed Column", "Column1", {"environmentId", "environmentName", "isDefault", "environmentState", "environmentUrl", "environmentType", "securityGroup", "purpose", "cdsInstanceId", "cdsInstanceUrl", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "deletedTime", "environmentRegion", "tenantGuid"}, {"environmentId", "environmentName", "isDefault", "environmentState", "environmentUrl", "environmentType", "securityGroup", "purpose", "cdsInstanceId", "cdsInstanceUrl", "createdTime", "createdPrincipalId", "lastModifiedTime", "lastModifiedPrincipalId", "deletedTime", "environmentRegion", "tenantGuid"}),
        #"Changed Type" = Table.TransformColumnTypes(#"Expanded Column1", {{"environmentId", type text}, {"environmentName", type text}, {"isDefault", type logical}, {"environmentState", type text}, {"environmentUrl", type text}, {"environmentType", type text}, {"securityGroup", type text}, {"purpose", type text}, {"cdsInstanceId", type text}, {"cdsInstanceUrl", type text}, {"createdTime", type datetime}, {"createdPrincipalId", type text}, {"lastModifiedTime", type text}, {"lastModifiedPrincipalId", type text}, {"deletedTime", type text}, {"environmentRegion", type text}, {"tenantGuid", type text}})
    in
        #"Changed Type"
in
    Source;

shared Makers = let
    Source = OData.Feed(EnvironmentAPI & "/aadusers", null, [Implementation = "2.0"]),
    #"Merged queries" = Table.NestedJoin(Source, {"aaduserid"}, #"Combined Makers", {"createdBy"}, "Combined Makers", JoinKind.RightOuter),
    #"Expanded Combined App and Flow Makers" = Table.ExpandTableColumn(#"Merged queries", "Combined Makers", {"createdBy"}, {"createdBy"}),
    #"Inserted IsOrphan" = Table.AddColumn(#"Expanded Combined App and Flow Makers", "Is Orphan", each if [aaduserid] = null then true else if [aaduserid] <> null then false else ""),
    #"Removed unneeded columns" = Table.RemoveColumns(#"Inserted IsOrphan", {"postalcode", "businessphones", "streetaddress", "mobilephone", "surname", "givenname", "createddatetime", "id", "imaddresses"}),
    #"Get distinct createdBy" = Table.Distinct(#"Removed unneeded columns", {"createdBy"}),
    #"Replaced mail value" = Table.ReplaceValue(#"Get distinct createdBy", null, "", Replacer.ReplaceValue, {"mail"}),
    // SYSTEM maker will be added with its own unique GUID in the setup wizard
    #"Filtered out SYSTEM maker" = Table.SelectRows(#"Replaced mail value", each ([createdBy] <> "SYSTEM")),
    #"Replaced UPN" = Table.ReplaceValue(#"Filtered out SYSTEM maker", null, "", Replacer.ReplaceValue, {"userprincipalname"}),
    #"Replaced preferred language" = Table.ReplaceValue(#"Replaced UPN", null, "", Replacer.ReplaceValue, {"preferredlanguage"}),
    #"Filtered blank makers" = Table.SelectRows(#"Replaced preferred language", each ([createdBy] <> null)),
    #"Replaced null display name with Unknown" = Table.ReplaceValue(#"Filtered blank makers", null, "Unknown", Replacer.ReplaceValue, {"displayname"}),
    #"Changed column type" = Table.TransformColumnTypes(#"Replaced null display name with Unknown", {{"createdBy", type text}})
in
    #"Changed column type";

shared #"App Makers" = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerapps/Applications"),
    #"Changed column type" = Table.TransformColumnTypes(Source, {{"Date modified", type date}}),
    // Filter to files that have been updated since the last Dataflow refresh OR all files if initial inventory
    #"Filter to recent" = Table.SelectRows(#"Changed column type", each [Date modified] >= (if List.IsEmpty(DataflowRefresh) then Date.From("1900-01-01") else Date.From(DataflowRefresh{0}))),
    excludehiddenfiles = Table.SelectRows(#"Filter to recent", each [Attributes]?[Hidden]? <> true),
    excludezerolengthfiles = Table.SelectRows(excludehiddenfiles, each [Attributes]?[Size]? > 0),
    #"Replaced canvas.json" = Table.ReplaceValue(excludezerolengthfiles, "canvas.json", "", Replacer.ReplaceText, {"Name"}),
    #"Replaced model.json" = Table.ReplaceValue(#"Replaced canvas.json", "model.json", "", Replacer.ReplaceText, {"Name"}),
    #"Lowercased environment name" = Table.TransformColumns(#"Replaced model.json", {{"Name", each Text.Lower(_), type nullable text}}),
    #"Merged queries" = Table.NestedJoin(#"Lowercased environment name", {"Name"}, Environments, {"environmentId"}, "Environments", JoinKind.Inner),
    Getcontent = Table.AddColumn(#"Merged queries", "Transform file", each #"Transform app file"([Content])),
    #"Expanded file" = Table.ExpandListColumn(Getcontent, "Transform file"),
    #"Expanded createdBy" = Table.ExpandRecordColumn(#"Expanded file", "Transform file", {"createdPrincipalId"}, {"createdBy"}),
    #"Removed errors" = Table.RemoveRowsWithErrors(#"Expanded createdBy", {"createdBy"}),
    #"Removed columns" = Table.RemoveColumns(#"Removed errors", {"Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Folder Path", "Name", "Content", "Environments"}),
    #"Removed duplicates" = Table.Distinct(#"Removed columns", {"createdBy"}),
    #"Filtered rows" = Table.SelectRows(#"Removed duplicates", each [createdBy] <> null)
in
    #"Filtered rows";

shared #"Environment Makers" = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerapps/environments"),
    excludehiddenfiles = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    excludezerolengthfiles = Table.SelectRows(excludehiddenfiles, each [Attributes]?[Size]? > 0),
    #"Get file content" = Table.AddColumn(excludezerolengthfiles, "Transform file", each #"Transform enviornment file"([Content])),
    #"Expanded Transform file" = Table.ExpandTableColumn(#"Get file content", "Transform file", {"createdPrincipalId"}, {"createdBy"}),
    #"Removed columns" = Table.RemoveColumns(#"Expanded Transform file", {"Content", "Name", "Extension", "Date accessed", "Date modified", "Date created", "Attributes", "Folder Path"}),
    #"Removed duplicates" = Table.Distinct(#"Removed columns", {"createdBy"}),
    #"Filter out duplicates" = Table.SelectRows(#"Removed duplicates", each [createdBy] <> null),
    #"Filtered out Microsoft Dynamics Deployment Service maker" = Table.SelectRows(#"Filter out duplicates", each [createdBy] <> "Microsoft Dynamics Deployment Service")
in
    #"Filtered out Microsoft Dynamics Deployment Service maker";

shared #"Flow Makers" = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerautomate/Flows"),
    #"Changed column type" = Table.TransformColumnTypes(Source, {{"Date modified", type date}}),
    // Filter to files that have been updated since the last Dataflow refresh OR all files if initial inventory
    #"Filter to recent" = Table.SelectRows(#"Changed column type", each [Date modified] >= (if List.IsEmpty(DataflowRefresh) then Date.From("1900-01-01") else Date.From(DataflowRefresh{0}))),
    excludehiddenfiles = Table.SelectRows(#"Filter to recent", each [Attributes]?[Hidden]? <> true),
    excludezerolengthfiles = Table.SelectRows(excludehiddenfiles, each [Attributes]?[Size]? > 0),
    #"Replaced .json" = Table.ReplaceValue(excludezerolengthfiles, ".json", "", Replacer.ReplaceText, {"Name"}),
    #"Lowercased environment name" = Table.TransformColumns(#"Replaced .json", {{"Name", each Text.Lower(_), type nullable text}}),
    #"Merged with environments" = Table.NestedJoin(#"Lowercased environment name", {"Name"}, Environments, {"environmentId"}, "Environments", JoinKind.Inner),
    Getcontent = Table.AddColumn(#"Merged with environments", "Transform file", each #"Transform flow file"([Content])),
    #"Removed errors" = Table.RemoveRowsWithErrors(Getcontent, {"Transform file"}),
    #"Removed columns" = Table.RemoveColumns(#"Removed errors", {"Folder Path", "Attributes", "Date created", "Date modified", "Date accessed", "Extension", "Environments"}),
    #"Expanded Transform file" = Table.ExpandTableColumn(#"Removed columns", "Transform file", {"events_created_principalId"}, {"createdBy"}),
    #"Removed errors 1" = Table.RemoveRowsWithErrors(#"Expanded Transform file", {"createdBy"}),
    #"Removed duplicates" = Table.Distinct(#"Removed errors 1", {"createdBy"}),
    #"Removed columns 1" = Table.RemoveColumns(#"Removed duplicates", {"Content", "Name"}),
    #"Filtered rows" = Table.SelectRows(#"Removed columns 1", each [createdBy] <> null)
in
    #"Filtered rows";

[Description = "Environment Web API endpoint. Get via make.powerapps.com > Settings Cog > Developer resources > Web API Endpoint value (example https://mycoe.api.crm.dynamics.com/api/data/v9.2)"]
shared EnvironmentAPI = let
    EnvironmentAPI = "https://ppgovcoe.api.crm.dynamics.com/api/data/v9.2" meta [IsParameterQuery = true, IsParameterQueryRequired = true, Type = type any]
in
    EnvironmentAPI;

shared DatalakeURL = let
    DatalakeURL_Apps = "https://powerplatformreporting.dfs.core.windows.net/" meta [IsParameterQuery = true, IsParameterQueryRequired = true, Type = type any]
in
    DatalakeURL_Apps;

shared #"Combined Makers" = let
    Source = Table.Combine({#"Environment Makers", #"App Makers", #"Flow Makers"}),
    #"Removed duplicates" = Table.Distinct(Source, {"createdBy"})
in
    #"Removed duplicates";

shared Environments = let
    Source = AzureStorage.DataLake(DatalakeURL & "/powerapps/environments"),
    excludehiddenfiles = Table.SelectRows(Source, each [Attributes]?[Hidden]? <> true),
    excludezerolengthfiles = Table.SelectRows(excludehiddenfiles, each [Attributes]?[Size]? > 0),
    #"Get file content" = Table.AddColumn(excludezerolengthfiles, "Transform file", each #"Transform enviornment file"([Content])),
    #"Expanded Transform file" = Table.ExpandTableColumn(#"Get file content", "Transform file", {"environmentId"}, {"environmentId"}),
    #"Choose columns" = Table.SelectColumns(#"Expanded Transform file", {"environmentId"})
in
    #"Choose columns";

shared DataflowRefresh = let
    Source = OData.Feed(EnvironmentAPI & "/msdyn_dataflowrefreshhistories", null, [Implementation = "2.0"]),
    #"Filtered rows" = Table.SelectRows(Source, each ([msdyn_dataflowname] = "CoE BYODL Makers")),
    #"Sorted rows" = Table.Sort(#"Filtered rows", {{"msdyn_endtime", Order.Descending}}),
    #"Removed other columns" = Table.SelectColumns(#"Sorted rows", {"msdyn_endtime"}),
    #"Converted to list" = #"Removed other columns"[msdyn_endtime]
in
    #"Converted to list";
```

RickKush commented 9 months ago

The Makers Source has good data: image

But the next query step yields this problem: image

RickKush commented 9 months ago

I tried changing the DatalakeURL parameter to https://powerplatformreporting.blob.core.windows.net/, https://powerplatformreporting.dfs.core.windows.net/, and https://powerplatformreporting.z13.web.core.windows.net/, but the error didn't change.
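Aside: the AzureStorage.DataLake connector works against the ADLS Gen2 `dfs` endpoint, so of the three URLs tried above only the `dfs.core.windows.net` form has the right shape. A small hedged Python helper that normalizes the other endpoint styles to the `dfs` form (the suffix list below is an assumption covering only the variants tried here, not an exhaustive Azure list):

```python
from urllib.parse import urlparse, urlunparse

# Endpoint suffixes seen in this thread; map them all onto the dfs (ADLS Gen2)
# endpoint. This list is an assumption, not an exhaustive set of Azure endpoints.
_SUFFIXES = (
    ".blob.core.windows.net",
    ".z13.web.core.windows.net",
    ".dfs.core.windows.net",
)

def to_dfs_endpoint(url: str) -> str:
    """Rewrite a storage-account URL so its host uses the dfs endpoint."""
    parts = urlparse(url)
    host = parts.netloc
    for suffix in _SUFFIXES:
        if host.endswith(suffix):
            account = host[: -len(suffix)]
            return urlunparse(parts._replace(netloc=account + ".dfs.core.windows.net"))
    raise ValueError(f"unrecognized storage endpoint: {host}")
```

So all three URLs above normalize to the same `dfs` URL; if that one still fails, the problem is likely credentials or the container path rather than the endpoint itself.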

RickKush commented 9 months ago

At the Refresh dataflow for initial inventory step:

PowerQueryDataflows.RefreshDataflow failed: { "Message": "The refresh request is invalid 'https://unitedstates-002.azure-apim.net/apim/dataflows/1b557538431d4884b2a9c9e769926d32/api/groups/a626fa30-f1a6-ed81-8202-ad1dbe404d16/dataflows/ba2611a1-26d3-4456-af9a-d41589317851/refreshdataflow?workspaceType=Environment'", "Reason": "'InvalidRefreshRequest' - GroupType 'Environment'", "WorkspaceType": "Environment", "OperationId": "RefreshDataflow", "ClientRequestId": "726a7ad4-f060-4e23...

The previous steps look good: image

RickKush commented 9 months ago

image

All eight of the failures have a very similar error:
The dataflow could not be refreshed because there was a problem with the data sources credentials or configuration. Please update the connection credentials and configuration and try again. Data sources: For query "Apps", Schema evaluation did not return TableResult. Actual result type: Challenge, Bad credentials, Challenge type: Resource, DataSources: {"Kind":"OData","NormalizedPath":"https://pptoolsnewcore.api.crm.dynamics.com/api/data/v9.2","Path":"https://pptoolsnewcore.api.crm.dynamics.com/api/data/v9.2"}, Message: Credentials are required to connect to the OData source. (Source at https://pptoolsnewcore.api.crm.dynamics.com/api/data/v9.2.). (Request ID: 9f7a61af-479a-4720-9975-14a8981feb39).

Jenefer-Monroe commented 9 months ago

I'm afraid this is not an issue with the kit but with your configuration. Whatever you did to get the Makers dataflow working, you'll need to do for the others as well. You may be better served sticking with the sync flow architecture.

RickKush commented 9 months ago

These all look good now that I understand I have to go into all of them. I thought I was only supposed to do Makers... Apparently they don't share the two common environment variables and the connection...

image

But I still get this error on Refresh dataflow for initial inventory: PowerQueryDataflows.RefreshDataflow failed: { "Message": "The refresh request is invalid 'https://unitedstates-002.azure-apim.net/apim/dataflows/1b557538431d4884b2a9c9e769926d32/api/groups/a626fa30-f1a6-ed81-8202-ad1dbe404d16/dataflows/ba2611a1-26d3-4456-af9a-d41589317851/refreshdataflow?workspaceType=Environment'", "Reason": "'InvalidRefreshRequest' - GroupType 'Environment'", "WorkspaceType": "Environment", "OperationId": "RefreshDataflow", "ClientRequestId": "555dc27a-d93d-4df0... image

RickKush commented 9 months ago

image

image

All five of the Power Automate cloud flows have identical errors, which go away as soon as you go to edit the flow.

RickKush commented 9 months ago

They are all running: image The "failure" updated 8,798 rows and failed on only 3! Also note that the eight dataflows did not run automatically after Makers ran on schedule at 1 AM.

But Refresh dataflow for initial inventory still fails:
PowerQueryDataflows.RefreshDataflow failed: The function 'RefreshDataflow' has an invalid value for parameter 'groupIdForRefreshDataflow' - a blank value was passed to it where it was not expected. Please make sure that a valid argument is passed to the function.
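The "blank value" error suggests the environment variable feeding `groupIdForRefreshDataflow` is empty. As a hypothetical guard (the function below is an illustrative stand-in, not the connector's real API), the idea is to fail fast with a clear message rather than the connector's generic one:

```python
def refresh_dataflow(group_id: str, dataflow_id: str) -> str:
    """Illustrative stand-in for the RefreshDataflow connector action."""
    # Fail fast with a clear message instead of the connector's generic error.
    if not group_id or not group_id.strip():
        raise ValueError(
            "groupIdForRefreshDataflow is blank - check the CoE environment "
            "variable that should hold the environment/group ID"
        )
    return f"refresh requested for {dataflow_id} in group {group_id}"

try:
    refresh_dataflow("", "ba2611a1-26d3-4456-af9a-d41589317851")
except ValueError as exc:
    print(exc)
```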

RickKush commented 9 months ago

image

RickKush commented 9 months ago

image

RickKush commented 9 months ago

image

Jenefer-Monroe commented 9 months ago

Did you use the setup wizard to set up? And if so, did you choose Data Export (rather than sync flows)? That choice sets up what should go here: image

RickKush commented 9 months ago

Yes and Yes

Should I do it again?


Jenefer-Monroe commented 9 months ago

That is not what the group ID looks like; it's a strange string. We do have a flow in Core intended to help with this. Go find this flow in Core, edit it, and choose your CoE environment here, then choose the Makers dataflow (it will lose that selection when you pick your environment). Save, and then turn the flow on.

Image

Manually kick off the Makers dataflow, and when it's done this flow should run: Image

Once it finishes, open the run and you'll see what that string is supposed to look like. You'll see it's something unusual like this: Image

This should update the environment variable for you. Double-check in the Command Center; if it doesn't, please fix it up there yourself. Restart all the BYODL flows after this to be sure they pick up the correct new environment variable, and then they should work.

RickKush commented 8 months ago

image

a626fa30-f1a6-ed81-8202-ad1dbe404d16 is the correct environment ID for the CoE image This is the same error the cloud dataflows get: The trigger is invalid 'https://flow-apim-unitedstates-002-eastus-01.azure-apim.net/apim/dataflows/2d9ab9e8b4f04e57b90ac8c200949095/api/groups/_**a626fa30-f1a6-ed81-8202-ad1dbe404d16**_/dataflows/ba2611a1-26d3-4456-af9a-d41589317851/onrefreshcomplete?workspaceType=Environment'
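When comparing the GUID in the trigger URL against the CoE environment ID, the group segment can be pulled out like this (a quick sanity-check sketch, not part of the kit):

```python
import re

# Trigger URL from the error above; the /groups/ segment carries the
# environment (group) ID that should match the CoE environment.
url = (
    "https://flow-apim-unitedstates-002-eastus-01.azure-apim.net/apim/dataflows/"
    "2d9ab9e8b4f04e57b90ac8c200949095/api/groups/"
    "a626fa30-f1a6-ed81-8202-ad1dbe404d16/dataflows/"
    "ba2611a1-26d3-4456-af9a-d41589317851/onrefreshcomplete?workspaceType=Environment"
)

# A GUID is 36 characters of hex digits and hyphens.
match = re.search(r"/groups/([0-9a-f-]{36})/", url)
group_id = match.group(1) if match else None
print(group_id)
```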

image

image

RickKush commented 8 months ago

This should update the environment variable for you. Double-check in the Command Center; if it doesn't, please fix it up there yourself. Restart all the BYODL flows after this to be sure they pick up the correct new environment variable, and then they should work.

Which env var? And what should I set it to?

I set the Tenant ID because I noticed that it was blank, but that didn't help or change anything.

Jenefer-Monroe commented 8 months ago

Sorry, I'm afraid you are hitting far more configuration issues than normal. I'm not sure we are going to be able to resolve this over GitHub. You may be better served by reverting to the sync flow architecture.

RickKush commented 8 months ago

Could you just please try to answer this question about what you told me to do?

This should update the environment variable for you. Double-check in the Command Center; if it doesn't, please fix it up there yourself. Restart all the BYODL flows after this to be sure they pick up the correct new environment variable, and then they should work.

Which env var? And what should I set it to?

I set the Tenant ID because I noticed that it was blank, but that didn't help or change anything.

RickKush commented 8 months ago

Would you have any warnings for me if I choose a new install in a new environment? Would I need to migrate the existing data? Or would it get everything on the first run?