cugni opened this issue 1 month ago
In the second example, the QbeastCatalog is missing as a secondary catalog, and I think that's why the format is not recognized.
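For context, here is a minimal sketch of registering the QbeastCatalog as a secondary catalog next to the DeltaCatalog. The extension and catalog class names are taken from the qbeast-spark README; the secondary catalog name qbeast_catalog is an illustrative choice, not from the original report:

import org.apache.spark.sql.SparkSession

// Keep DeltaCatalog as the session catalog and register QbeastCatalog
// under a secondary name so that qbeast tables can still be resolved.
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.extensions",
    "io.qbeast.spark.internal.QbeastSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog",
    "org.apache.spark.sql.delta.catalog.DeltaCatalog")
  .config("spark.sql.catalog.qbeast_catalog",
    "io.qbeast.spark.internal.sources.catalog.QbeastCatalog")
  .getOrCreate()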
In the first one, I will look into the QbeastCatalog.loadTable method; it seems it is not parsing the TableIdentifier properly when a path is specified.
override def loadTable(ident: Identifier): Table = {
  // Requires, among others: scala.collection.JavaConverters._ (for asScala),
  // org.apache.hadoop.fs.Path, and org.apache.spark.sql.catalyst.TableIdentifier
  try {
    getSessionCatalog().loadTable(ident) match {
      // Tables whose "provider" property is qbeast are wrapped as Qbeast tables
      case table
          if QbeastCatalogUtils.isQbeastProvider(table.properties().asScala.get("provider")) =>
        QbeastCatalogUtils.loadQbeastTable(table, tableFactory)
      case o => o
    }
  } catch {
    // When the identifier is not found in the session catalog but looks like
    // a path, fall back to building a path-based Qbeast table directly
    case _: NoSuchDatabaseException | _: NoSuchNamespaceException | _: NoSuchTableException
        if QbeastCatalogUtils.isPathTable(ident) =>
      QbeastTableImpl(
        TableIdentifier(ident.name(), ident.namespace().headOption),
        new Path(ident.name()),
        Map.empty,
        tableFactory = tableFactory)
  }
}
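As a sketch of where the mis-parsing may come from: for a path-based identifier, ident.name() is the location itself, so attaching ident.namespace().headOption as the database of the TableIdentifier looks suspect. A hypothetical, untested adjustment of the fallback, not a confirmed fix:

// Hypothetical sketch only: for a path table, the identifier's name is the
// location, so build the TableIdentifier without a database component.
case _: NoSuchDatabaseException | _: NoSuchNamespaceException | _: NoSuchTableException
    if QbeastCatalogUtils.isPathTable(ident) =>
  QbeastTableImpl(
    TableIdentifier(ident.name()), // the name is the path itself
    new Path(ident.name()),
    Map.empty,
    tableFactory = tableFactory)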
What went wrong?
Loading a Qbeast table through the QbeastCatalog fails when the table is referenced by path, and when the QbeastCatalog is not configured as a secondary catalog the qbeast format is not recognized at all. The expected behavior is that the table loads whether it is referenced by name or by path.
How to reproduce?
1. Code that triggered the bug, or steps to reproduce: load a Qbeast table by path through the QbeastCatalog. On the other hand, if we use the Delta catalog without the QbeastCatalog as a secondary catalog, the qbeast format is not recognized either (see the reproduction sketch after this list).
2. Branch and commit id:
0.7.0
3. Spark version:
3.5.0
4. Hadoop version:
3.3.4
5. How are you running Spark?
local computer
6. Stack trace:
Trace of the log/error messages when using the QbeastCatalog:
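As referenced in item 1, a minimal reproduction sketch. The location /tmp/qbeast_table, the indexed column, and the data are illustrative assumptions; the extension and catalog class names are taken from the qbeast-spark README, and the qbeast.`/path` SQL syntax is assumed to follow Spark's format.`/path` convention:

import org.apache.spark.sql.SparkSession

// Sketch: QbeastCatalog configured as the session catalog (the first example)
val spark = SparkSession.builder()
  .master("local[*]")
  .config("spark.sql.extensions",
    "io.qbeast.spark.internal.QbeastSparkSessionExtension")
  .config("spark.sql.catalog.spark_catalog",
    "io.qbeast.spark.internal.sources.catalog.QbeastCatalog")
  .getOrCreate()

import spark.implicits._

// Write a small qbeast table; columnsToIndex is required by the qbeast writer
(1 to 1000).map(i => (i, i % 10)).toDF("id", "bucket")
  .write
  .format("qbeast")
  .option("columnsToIndex", "id")
  .save("/tmp/qbeast_table")

// Reading back by path exercises the loadTable fallback discussed above
spark.sql("SELECT * FROM qbeast.`/tmp/qbeast_table`").show()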