Closed. karli2 closed this issue 8 years ago.
Thanks, Günter. I will take a look. Do you have the exception details?
I will try to reproduce tomorrow.
Btw: do you have tips to speed up retrieving schema information (tables, columns, foreign keys) from an Oracle DB? My account has full access, and it takes very long (20-30 secs) to produce the catalog information; and this is fast because I have already restricted it to one schema....
I’m currently building a web app where people can create special triggers ... and therefore I didn’t want to re-invent the wheel and just tried to use APIs already in the wild, so your API was chosen.... but somehow I still have this performance problem....
Günter, do you have the exception details?
Sorry, no - I hope next week. Btw: I used SchemaCrawler on Oracle 12, where the retrieval of foreign keys took 2 minutes (in a database with 6 tables) ... but when I deployed my code against an Oracle 11 Express database, this was done in a few seconds ...
The code is targeted to Oracle 11. If you send me the logs from Oracle 12, we can see what is taking long, and optimize that for Oracle 12.
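In case it helps with capturing the logs, something like this should turn on the full output for the SchemaCrawler logger (a minimal sketch using plain java.util.logging, assuming the default JUL setup and no custom logging configuration in your web app):

import java.util.logging.ConsoleHandler;
import java.util.logging.Level;
import java.util.logging.Logger;

// Turn on detailed logging for the "schemacrawler" logger hierarchy,
// so the per-step timings (and the finer-grained entries) end up on the console.
Logger schemaCrawlerLogger = Logger.getLogger("schemacrawler");
schemaCrawlerLogger.setLevel(Level.ALL);
ConsoleHandler consoleHandler = new ConsoleHandler();
consoleHandler.setLevel(Level.ALL);
schemaCrawlerLogger.addHandler(consoleHandler);
// ... then run the crawl as usual and send me the console output.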
So I do not even need to start the computer :) The foreign keys are the biggest problem: they take approx. 2 minutes ... which is awful, because, as I already told you, there are just 5-6 tables in the database (my tables; there are thousands of Oracle's own tables). I searched the internet for an SQL statement, found one, and if I run that statement in SQL Developer, the result is available in approx. 2 secs.
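The statement I found is roughly of this shape (a sketch from memory, run over the data dictionary and restricted to my schema; the exact text I ran in SQL Developer may have differed slightly):

// Rough sketch of the foreign-key lookup, restricted to one owner.
// 'conn' is the same java.sql.Connection that I hand to SchemaCrawler.
String sql =
      "SELECT C.TABLE_NAME, CC.COLUMN_NAME, C.CONSTRAINT_NAME, R.TABLE_NAME AS REF_TABLE "
    + "FROM ALL_CONSTRAINTS C "
    + "JOIN ALL_CONS_COLUMNS CC "
    + "  ON C.OWNER = CC.OWNER AND C.CONSTRAINT_NAME = CC.CONSTRAINT_NAME "
    + "JOIN ALL_CONSTRAINTS R "
    + "  ON C.R_OWNER = R.OWNER AND C.R_CONSTRAINT_NAME = R.CONSTRAINT_NAME "
    + "WHERE C.CONSTRAINT_TYPE = 'R' AND C.OWNER = 'C##FULL' "
    + "ORDER BY C.TABLE_NAME, CC.POSITION";
try (java.sql.Statement stmt = conn.createStatement();
     java.sql.ResultSet rs = stmt.executeQuery(sql)) {
  while (rs.next()) {
    System.out.println(rs.getString("TABLE_NAME") + "." + rs.getString("COLUMN_NAME")
        + " -> " + rs.getString("REF_TABLE"));
  }
}

Restricting on C.OWNER seems to be what keeps it fast for me.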
But I think today I will find time to have another look at my code; I will first try to get the exception again, and secondly I will send you the log output.
Hi,
here is some information regarding speed:
Attached you will find my schema; that is really all of it.
Oracle: winx64_12102_SE2
My code:
SchemaInfoLevel schemaInfoLevel = new SchemaInfoLevel();
schemaInfoLevel.setTag("Custom Info Level");
schemaInfoLevel.setRetrieveTables(true);
schemaInfoLevel.setRetrieveTableColumns(true);
schemaInfoLevel.setRetrieveTriggerInformation(true);
schemaInfoLevel.setRetrieveForeignKeys(true); // setting this increases the processing time on Oracle; however, we need it to identify relationships
schemaInfoLevel.setRetrieveColumnDataTypes(false);
SchemaCrawlerOptions options = new SchemaCrawlerOptions();
options.setSchemaInfoLevel(schemaInfoLevel);
options.setSchemaInclusionRule(new RegularExpressionInclusionRule("\"C##FULL\""));
this.catalog = SchemaCrawlerUtility.getCatalog(conn, options);
As you see, I have already restricted it to just one schema. :)
Here is the log output from Eclipse:
Jun 13, 2016 4:59:21 PM schemacrawler.utility.SchemaCrawlerUtility matchDatabaseSpecificOverrideOptions
INFO: Using database plugin for oracle - Oracle
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlSchemas
INFO: Crawling schemas
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaRetriever retrieveAllSchemas
INFO: Retrieving all schemas
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaRetriever retrieveAllCatalogs
INFO: Retrieving all catalogs
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.MetadataResultSet close
INFO: "retrieveAllSchemas" results had 37 rows
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlSchemas
INFO: Total time taken for "crawlSchemas" - 00:00:00.067 hours
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlDatabaseInfo
INFO: Crawling SchemaCrawler information
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlDatabaseInfo
INFO: Retrieving database information
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler lambda$crawlDatabaseInfo$12
INFO: Not retrieving additional database information, since this was not requested
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlDatabaseInfo
INFO: Retrieving JDBC driver information
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler lambda$crawlDatabaseInfo$14
INFO: Not retrieving additional JDBC driver information, since this was not requested
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlDatabaseInfo
INFO: Retrieving SchemaCrawler crawl information
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlDatabaseInfo
INFO: Total time taken for "crawlDatabaseInfo" - 00:00:00.010 hours
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlColumnDataTypes
INFO: Crawling column data types
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler lambda$crawlColumnDataTypes$9
INFO: Not retrieving system column data types, since this was not requested
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler lambda$crawlColumnDataTypes$10
INFO: Not retrieving user column data types, since this was not requested
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlColumnDataTypes
INFO: Total time taken for "crawlColumnDataTypes" - 00:00:00.000 hours
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.SchemaCrawler crawlTables
INFO: Crawling tables
Jun 13, 2016 4:59:21 PM schemacrawler.crawl.TableRetriever retrieveTables
INFO: Retrieving tables
Jun 13, 2016 4:59:22 PM schemacrawler.crawl.MetadataResultSet close
INFO: "retrieveTables" results had 12 rows
Jun 13, 2016 4:59:22 PM schemacrawler.crawl.TableColumnRetriever retrieveColumns
INFO: Retrieving table columns, using fast data dictionary retrieval
Jun 13, 2016 4:59:39 PM schemacrawler.crawl.MetadataResultSet close
INFO: "retrieveColumnsFromDataDictionary" results had 1749 rows
Jun 13, 2016 4:59:39 PM schemacrawler.crawl.ForeignKeyRetriever retrieveForeignKeys
INFO: Retrieving foreign keys, using SQL
Jun 13, 2016 5:02:12 PM schemacrawler.crawl.MetadataResultSet close
INFO: "retrieveForeignKeysUsingSql" results had 102 rows
Jun 13, 2016 5:02:12 PM schemacrawler.crawl.MetadataResultSet close
INFO: "retrieveForeignKeysUsingSql" results had 102 rows
Jun 13, 2016 5:02:13 PM schemacrawler.crawl.SchemaCrawler lambda$crawlTables$30
INFO: Retrieving primary keys and indexes
Jun 13, 2016 5:02:15 PM schemacrawler.crawl.TableExtRetriever retrieveTriggerInformation
INFO: Retrieving trigger definitions
17:02:18 INFO repository.SchemaRepositorySC finished retrieving schema
17:02:18 INFO dbinfo.Column there was the datatype TIMESTAMP(6) on table EVENTSWOSOCKET for column TIMERAISED which could not be mapped to one of our dataTypes
17:02:18 INFO repository.SchemaRepositorySC GetTable BRAND
Jun 13, 2016 5:02:18 PM schemacrawler.crawl.SchemaCrawler crawlTables
INFO: Total time taken for "crawlTables" - 00:02:56.266 hours
Jun 13, 2016 5:02:18 PM schemacrawler.crawl.SchemaCrawler crawlRoutines
INFO: Not retrieving routines, since this was not requested
Jun 13, 2016 5:02:18 PM schemacrawler.crawl.SchemaCrawler crawlSynonyms
INFO: Not retrieving synonyms, since this was not requested
Jun 13, 2016 5:02:18 PM schemacrawler.crawl.SchemaCrawler crawlSequences
INFO: Not retrieving sequences, since this was not requested
As you see, this takes really very, very long. I implemented a singleton, so currently my app reads the schema on the first request and then works with the singleton; however, my app changes the schema, which will lead to problems if I cannot reload the schema quickly.
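The reload itself would just be a re-crawl with the same restricted options, roughly like this (a simplified sketch; buildOptions() is only a placeholder for the SchemaInfoLevel/SchemaCrawlerOptions setup shown above, and in the real code this sits behind the singleton):

// Simplified sketch of reloading the catalog on demand.
// buildOptions() is a hypothetical helper standing in for the option setup above.
public synchronized void reloadSchema(final Connection conn) throws SchemaCrawlerException {
    final SchemaCrawlerOptions options = buildOptions();
    this.catalog = SchemaCrawlerUtility.getCatalog(conn, options);
}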
Here is the exception:
Jun 13, 2016 5:13:08 PM schemacrawler.crawl.TableExtRetriever retrieveTableDefinitions
INFO: Retrieving table definitions
Jun 13, 2016 5:13:43 PM sf.util.DatabaseUtility executeSql
WARNING: Error executing SQL, SELECT /*+ PARALLEL(AUTO) */
NULL AS TABLE_CATALOG,
TABLES.OWNER AS TABLE_SCHEMA,
TABLES.TABLE_NAME,
DBMS_METADATA.GET_DDL('TABLE', TABLES.TABLE_NAME, TABLES.OWNER)
AS TABLE_DEFINITION
FROM
ALL_TABLES TABLES
INNER JOIN ALL_USERS USERS
ON TABLES.OWNER = USERS.USERNAME
WHERE
USERS.USERNAME NOT IN
('ANONYMOUS', 'APEX_PUBLIC_USER', 'BI', 'CTXSYS', 'DBSNMP', 'DIP',
'EXFSYS', 'FLOWS_30000', 'FLOWS_FILES', 'HR', 'IX', 'LBACSYS',
'MDDATA', 'MDSYS', 'MGMT_VIEW', 'OE', 'OLAPSYS', 'ORACLE_OCM',
'ORDPLUGINS', 'ORDSYS', 'OUTLN', 'OWBSYS', 'PM', 'SCOTT', 'SH',
'SI_INFORMTN_SCHEMA', 'SPATIAL_CSW_ADMIN_USR', 'SPATIAL_WFS_ADMIN_USR',
'SYS', 'SYSMAN', 'SYSTEM', 'TSMSYS', 'WKPROXY', 'WKSYS', 'WK_TEST',
'WMSYS', 'XDB', 'XS$NULL')
AND NOT REGEXP_LIKE(USERS.USERNAME, '^APEX_[0-9]{6}$')
AND NOT REGEXP_LIKE(USERS.USERNAME, '^FLOWS_[0-9]{5}$')
AND TABLES.TABLE_NAME NOT LIKE 'BIN$%'
ORDER BY
TABLE_SCHEMA,
TABLE_NAME
java.sql.SQLException: ORA-31603: Objekt "ORDDCM_PREFS_DEF_VALUES_TAB" vom Typ TABLE in Schema "ORDDATA" nicht gefunden ORA-06512: in "SYS.DBMS_METADATA", Zeile 6069 ORA-06512: in "SYS.DBMS_METADATA", Zeile 8666 ORA-06512: in Zeile 1
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:450)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:399)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:1059)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:522)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:257)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:587)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:210)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:30)
at oracle.jdbc.driver.T4CStatement.executeForRows(T4CStatement.java:931)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:957)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1111)
at oracle.jdbc.driver.OracleStatement.executeInternal(OracleStatement.java:1792)
at oracle.jdbc.driver.OracleStatement.execute(OracleStatement.java:1745)
at oracle.jdbc.driver.OracleStatementWrapper.execute(OracleStatementWrapper.java:334)
at sf.util.DatabaseUtility.executeSql(DatabaseUtility.java:145)
at sf.util.DatabaseUtility.executeSql(DatabaseUtility.java:120)
at schemacrawler.crawl.TableExtRetriever.retrieveTableDefinitions(TableExtRetriever.java:389)
at schemacrawler.crawl.SchemaCrawler.lambda$crawlTables$34(SchemaCrawler.java:652)
at schemacrawler.crawl.SchemaCrawler$$Lambda$40/1757156391.call(Unknown Source)
at sf.util.StopWatch.time(StopWatch.java:156)
at schemacrawler.crawl.SchemaCrawler.crawlTables(SchemaCrawler.java:649)
at schemacrawler.crawl.SchemaCrawler.crawl(SchemaCrawler.java:766)
at schemacrawler.utility.SchemaCrawlerUtility.getCatalog(SchemaCrawlerUtility.java:68)
at at.jku.cs.triggermanager.repository.SchemaRepositorySC.InitializeSchema(SchemaRepositorySC.java:98)
at at.jku.cs.triggermanager.repository.SchemaRepositorySC.<clinit>(SchemaRepositorySC.java:45)
at at.jku.cs.triggermanager.repository.SchemaRepositoryProvider.getSchemaRepository(SchemaRepositoryProvider.java:29)
at at.jku.cs.triggermanager.rest.TableResource.<init>(TableResource.java:42)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.glassfish.hk2.utilities.reflection.ReflectionHelper.makeMe(ReflectionHelper.java:1350)
at org.jvnet.hk2.internal.ClazzCreator.createMe(ClazzCreator.java:271)
at org.jvnet.hk2.internal.ClazzCreator.create(ClazzCreator.java:365)
at org.jvnet.hk2.internal.SystemDescriptor.create(SystemDescriptor.java:471)
at org.glassfish.jersey.process.internal.RequestScope.findOrCreate(RequestScope.java:162)
at org.jvnet.hk2.internal.Utilities.createService(Utilities.java:2022)
at org.jvnet.hk2.internal.ServiceLocatorImpl.internalGetService(ServiceLocatorImpl.java:765)
at org.jvnet.hk2.internal.ServiceLocatorImpl.getService(ServiceLocatorImpl.java:704)
at org.glassfish.jersey.internal.inject.Injections.getOrCreate(Injections.java:172)
at org.glassfish.jersey.server.model.MethodHandler$ClassBasedMethodHandler.getInstance(MethodHandler.java:284)
at org.glassfish.jersey.server.internal.routing.PushMethodHandlerRouter.apply(PushMethodHandlerRouter.java:74)
at org.glassfish.jersey.server.internal.routing.RoutingStage._apply(RoutingStage.java:109)
at org.glassfish.jersey.server.internal.routing.RoutingStage._apply(RoutingStage.java:112)
at org.glassfish.jersey.server.internal.routing.RoutingStage._apply(RoutingStage.java:112)
at org.glassfish.jersey.server.internal.routing.RoutingStage._apply(RoutingStage.java:112)
at org.glassfish.jersey.server.internal.routing.RoutingStage.apply(RoutingStage.java:92)
at org.glassfish.jersey.server.internal.routing.RoutingStage.apply(RoutingStage.java:61)
at org.glassfish.jersey.process.internal.Stages.process(Stages.java:197)
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:318)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:473)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:427)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:341)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:228)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.filters.CorsFilter.handleNonCORS(CorsFilter.java:436)
at org.apache.catalina.filters.CorsFilter.doFilter(CorsFilter.java:177)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:522)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1095)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:672)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1502)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1458)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Unknown Source)
Jun 13, 2016 5:13:43 PM schemacrawler.crawl.TableExtRetriever retrieveTableDefinitions
WARNING: Could not retrieve table definitions
java.lang.NullPointerException: Cannot use null results
at java.util.Objects.requireNonNull(Unknown Source)
at schemacrawler.crawl.MetadataResultSet.
Jun 13, 2016 5:13:43 PM schemacrawler.crawl.TableExtRetriever retrieveIndexInformation
Thanks for the details. I will open another issue for the slowness. As for the exception, I believe this has been fixed in the latest version of SchemaCrawler. Could you please check? Thanks.
As I wrote in the initial message: there was the problem with retrieveColumnsFromDataDictionary, which resulted in just 1 column...
Ok, then I am missing something. Why use release 14.07.08 at all?
I tried to use SchemaCrawler on a brand new Oracle database to get the tables and columns of one schema:
Mai 18, 2016 2:32:10 PM schemacrawler.crawl.MetadataResultSet close
INFO: "retrieveColumnsFromDataDictionary" results had 1 rows
The same code, but with v14.07.08:
INFO: Retrieving table columns, using fast data dictionary retrieval
Mai 18, 2016 2:34:18 PM schemacrawler.crawl.MetadataResultSet close
INFO: "retrieveColumnsFromDataDictionary" results had 1749 rows
However, this release throws an exception ....
Here is the code: