MicrosoftLearning / mslearn-fabric

This repository hosts Microsoft Fabric training content for Microsoft Learn.
https://microsoftlearning.github.io/mslearn-fabric/
MIT License

Create database for fabric_lakehouse is not permitted using Apache Spark in Microsoft Fabric. #21

Closed: linusyoung closed this issue 1 year ago

linusyoung commented 1 year ago

Module: 01

Lab/Demo: 03

Task: Create a managed table

Step: 1

Description of issue

I got the following error message:

AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Create database for fabric_lakehouse is not permitted using Apache Spark in Microsoft Fabric.)

Repro steps:

1. Follow all steps up to Create a managed table.
2. Run the code (roughly the sketch below).
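For context, here is a minimal sketch of the kind of cell this step runs. The lab's exact code may differ; the file path and table name used here (Files/products.csv, managed_products) are placeholders, not the lab's actual names:

```python
# Minimal sketch of the "Create a managed table" step; names and paths
# are placeholders, not the lab's exact code.
from pyspark.sql import SparkSession

# In a Fabric notebook a SparkSession named `spark` is already provided;
# getOrCreate() simply returns it.
spark = SparkSession.builder.getOrCreate()

# Read sample data from the attached lakehouse's Files area.
df = spark.read.format("csv").option("header", "true").load("Files/products.csv")

# Save as a managed Delta table. This is the call that raised the
# "Create database for fabric_lakehouse is not permitted" MetaException.
df.write.format("delta").saveAsTable("managed_products")
```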

Mihai-Ac commented 1 year ago

Hello @linusyoung

I have run the lab several times, but couldn't reproduce the issue. The managed table is created every time without issues. [Screenshot: managed table created successfully]

Can you confirm you are using the Trial license mode for the workspace you created? [Screenshot: workspace license mode setting]

Also, can you reproduce this error in another workspace? If so, can you please include some screenshots?

Regards, Mihai

linusyoung commented 1 year ago

@Mihai-Ac

I experienced the same issue in another tutorial (Lakehouse tutorial introduction). I'm not sure if I missed any permission settings.

Here is the workspace setting: [Screenshot: workspace settings]

Items in the workspace (I created a new workspace for this tutorial): [Screenshot: workspace items]

[Screenshot: Notebook 1]

[Screenshot: Notebook 2]

Below is the last message in the logs:

```
2023-05-30 03:11:21,742 INFO CreateAdviseEventHandler [Thread-43]: Sending DriverError to Advise Hub:
Map(_source -> system, _jobGroupId -> 1, _detail -> None, _executionId -> -1, _level -> error, _helpLink -> ,
_description -> Error reading or writing to the Metastore. Among other things the Metastore keeps track
of information about Spark tables. While the Metastore doesn't store the information contained in the
tables, it contains information for Spark to read and write to those tables.
1. Ensure that the table doesn't already exist if you're attempting to create a table.
2. Ensure that your user account has the "Storage Blob Data Contributor" role on the ADLS Gen2 storage
   account that is storing the Metadata information.
3. Check the logs for this Spark application by clicking the Monitor tab in the left side of the Synapse
   Studio UI, select "Apache Spark Applications" from the "Activities" section, and find your Spark job
   from this list. Inspect the logs available in the "Logs" tab in the bottom part of this page for a
   clearer indication of which table relation is causing this issue.,
_name -> Spark_System_MetaStore_HiveException)

2023-05-30 03:11:21,742 INFO KustoHandler [Thread-43]: Logging DriverError with appId: null to Kusto:
(same Map payload as above)
```
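The log's first suggestion, checking whether the table already exists, can be verified directly from the notebook. A minimal sketch, assuming the default lakehouse is attached; the table name managed_products is a placeholder:

```python
# Minimal sketch: see what the metastore already contains, per the
# log's first suggestion. The table name is a placeholder.
spark.sql("SHOW DATABASES").show()
spark.sql("SHOW TABLES").show()

# tableExists is available in the Spark runtimes Fabric uses (3.3+).
if spark.catalog.tableExists("managed_products"):
    print("Table already exists; drop it or choose another name.")
```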

GraemeMalcolm commented 1 year ago

This looks like a permissions / configuration issue in the tenant (specifically the underlying ADLS store). We can't repro it with a clean setup, and unfortunately with an early preview service like this it can be difficult to track down the specific problem. The issue doesn't appear to be an error in the content, so closing this. If you figure out the cause of the problem in your tenant, please let us know!

linusyoung commented 1 year ago

@GraemeMalcolm, can you point me in some direction regarding the ADLS storage? In the Trial capacity, I can't find any related settings. I assume the underlying ADLS storage is configured automatically when I create a new workspace with this trial capacity. Also, I checked the tenant settings in the Admin Portal, and Fabric items are enabled.

[Screenshots: Admin Portal tenant settings]

GraemeMalcolm commented 1 year ago

The storage is indeed set up automatically along with your workspace. This seems to be some sort of issue in your tenant, which could be related to things like the datacenter region where your tenant is defined or some other setting in your subscription. The lab steps themselves work when tried against a newly created tenant, so it doesn't seem like the issue is in the steps. Unfortunately, we can only support the lab content here - not the product itself. You can check regional status for the service and find FAQs for common known issues at https://support.fabric.microsoft.com/en-US/support/, or engage with the community there to try to resolve the issue you're having.

linusyoung commented 1 year ago

@GraemeMalcolm Thanks for your response. I will check with our IT department to see if they can provide more information.

jcbendernh commented 1 year ago

@linusyoung, I was running into the same problem and was able to resolve it by going into the Lakehouse explorer of my notebook, selecting "Remove all Lakehouses", and then adding back the desired lakehouse. After that I was able to successfully create the Delta table in my lakehouse. See the screenshot for reference. [Screenshot 2023-06-04 142101: Lakehouse explorer, "Remove all Lakehouses" option]
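To confirm the re-attached lakehouse is actually the notebook's default before re-running the create step, a quick check like the following can help; this is a sketch, not part of the lab:

```python
# Minimal sketch: confirm a default lakehouse is attached before
# re-running the managed-table cell.
print(spark.catalog.currentDatabase())  # name of the attached default lakehouse
spark.sql("SHOW TABLES").show()         # tables visible in that lakehouse
```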

michael-mertens commented 5 months ago

Thank you, @jcbendernh! Your answer is about a year old, but the problem still occurs, and your workaround still solves it.