apache / gravitino

World's most powerful open data catalog for building a high-performance, geo-distributed and federated metadata lake.
https://gravitino.apache.org
Apache License 2.0
919 stars 297 forks

[Bug report] Create schema error #4728

Open heziyi399 opened 2 weeks ago

heziyi399 commented 2 weeks ago

Version

0.5.1

Describe what's wrong

I want to create a schema for a fileset catalog. The code is as follows:

```java
@PostMapping("/addCatalogAndSchema")
public void addCatalogAndSchema(@RequestBody FilesetRequest request) {
    NameIdentifier nameIdentifier = NameIdentifier.of("tbds", request.getName());
    Map<String, String> properties =
        ImmutableMap.<String, String>builder()
            .put("location", hdfsSiteMap.getOrDefault("fs.defaultFS", ""))
            .build();
    Catalog catalog =
        GravitinoClient.builder(getGravitinoUrl()).withMetalake(metaLakeName).build()
            .createCatalog(
                nameIdentifier,
                Catalog.Type.FILESET,
                "hadoop",
                request.getName(),
                properties);
    SupportsSchemas supportsSchemas = catalog.asSchemas();
    Map<String, String> schemaProperties =
        ImmutableMap.<String, String>builder()
            .put("location", hdfsSiteMap.getOrDefault("fs.defaultFS", ""))
            .build();
    Schema schema1 =
        supportsSchemas.createSchema(
            NameIdentifier.of("tbds", request.getName(), request.getSchemaName()),
            "This is a schema",
            schemaProperties);
}
```

Error message and/or stacktrace

```
[Exception] ErrorCode -> FailedOperation, RequestURI -> /datamanager/apiv3/addCatalogAndSchema, errorMsg -> Failed to operate schema(s) [schema] operation [CREATE] under catalog [filesetCatalog], reason [Relative path in absolute URI: hdfs://HDFS78000026schema]
```
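Note the string in the error: `hdfs://HDFS78000026schema` is the catalog location with the schema name appended directly. Assuming the schema path is built by plain string concatenation of the location and the schema name (an assumption inferred from the error message, not confirmed against the Gravitino source), the malformed URI can be reproduced with standard `java.net.URI` parsing — the schema name is swallowed into the URI authority and the path is left empty:

```java
import java.net.URI;

public class LocationJoinRepro {
    // Naive join, mimicking what the error message suggests happens server-side
    // (assumption for illustration; not Gravitino's actual code).
    static String naiveJoin(String location, String schemaName) {
        return location + schemaName;
    }

    public static void main(String[] args) {
        String location = "hdfs://HDFS78000026"; // fs.defaultFS with no trailing '/'
        String joined = naiveJoin(location, "schema");
        System.out.println(joined); // hdfs://HDFS78000026schema

        URI uri = URI.create(joined);
        // "schema" has been absorbed into the authority; there is no path component.
        System.out.println(uri.getAuthority());     // HDFS78000026schema
        System.out.println(uri.getPath().isEmpty()); // true
    }
}
```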

How to reproduce

0.5.1

Additional context

No response

justinmclean commented 2 weeks ago

@mchades it's best to block and report the user first, then delete those comments.

mchades commented 2 weeks ago

> @mchades it's best to block and report the user first, then delete those comments.

@justinmclean OK. Can you explain in detail how to block and report?

justinmclean commented 2 weeks ago

I've already blocked and reported them to GitHub, and they reacted fairly quickly to the previous user I reported. You can do this by clicking on their name and following the "Block or report" link under their picture. There's a lot of this going around, and it seems a lot of people's accounts have been hacked.

mchades commented 2 weeks ago

@justinmclean Take a look at the comments above; it seems the block didn't work?

justinmclean commented 2 weeks ago

I assume GitHub needs to look at each one, and the block is not automatic. I've also informed ASF Infra about the issue.

yuqi1129 commented 2 weeks ago

@heziyi399 Please provide more detail about the problem, for example, the result of `hdfsSiteMap.getOrDefault("fs.defaultFS", "")`?

heziyi399 commented 2 weeks ago

@yuqi1129 The result of `hdfsSiteMap.getOrDefault("fs.defaultFS", "")` is `hdfs://HDFS78000026`

yuqi1129 commented 2 weeks ago

It seems that improper handling of the trailing '/' is what causes the problem.

@jerryshao Do we need to fix it in release 0.6.0?
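The trailing-'/' handling discussed above could be fixed with a defensive join that only inserts the separator when it is missing. This is a minimal sketch of the idea, not Gravitino's actual implementation:

```java
public class PathJoin {
    // Join a base location and a child name, inserting '/' only when the base
    // does not already end with one (sketch of the fix being discussed).
    static String joinPath(String base, String child) {
        if (base.endsWith("/")) {
            return base + child;
        }
        return base + "/" + child;
    }

    public static void main(String[] args) {
        // Both forms of fs.defaultFS now yield the same well-formed URI.
        System.out.println(joinPath("hdfs://HDFS78000026", "schema"));  // hdfs://HDFS78000026/schema
        System.out.println(joinPath("hdfs://HDFS78000026/", "schema")); // hdfs://HDFS78000026/schema
    }
}
```

As a user-side workaround until a release contains the fix, supplying a `location` property that already ends with '/' should avoid the error.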

jerryshao commented 2 weeks ago

We can fix it in 0.6.1, it's not urgent.