vmware-archive / pxf-field

Prototype PXF extensions for HAWQ
Apache License 2.0

JDBC connector #17

Open Nasaa169 opened 9 years ago

Nasaa169 commented 9 years ago

The JDBC connector does not build successfully. Is there any way I can create HAWQ external tables from JDBC?

tzolov commented 9 years ago

@Nasaa169 could you please explain what you mean by "is not built successfully"? Also, which versions are you using? Are you using the Maven artefacts or are you trying to build it yourself?

Nasaa169 commented 9 years ago

When I try to build the JDBC PXF connector, it fails with this error:


[INFO] Building jdbc-pxf-ext 3.0.0.0-18-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ jdbc-pxf-ext ---
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ jdbc-pxf-ext ---
[debug] execute contextualize
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ jdbc-pxf-ext ---
[INFO] Changes detected - recompiling the module!
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[INFO] Compiling 2 source files to /Pivotal-Field-Engineering-pxf-field-v2.0.1.0-0-35-gf0bc563/Pivotal-Field-Engineering-pxf-field-f0bc563/jdbc-pxf-ext/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ jdbc-pxf-ext ---
[debug] execute contextualize
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.0:testCompile (default-testCompile) @ jdbc-pxf-ext ---
[INFO] Changes detected - recompiling the module!
[WARNING] File encoding has not been set, using platform encoding UTF-8, i.e. build is platform dependent!
[INFO] Compiling 1 source file to /Pivotal-Field-Engineering-pxf-field-v2.0.1.0-0-35-gf0bc563/Pivotal-Field-Engineering-pxf-field-f0bc563/jdbc-pxf-ext/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.10:test (default-test) @ jdbc-pxf-ext ---
[INFO] Surefire report directory: /Pivotal-Field-Engineering-pxf-field-v2.0.1.0-0-35-gf0bc563/Pivotal-Field-Engineering-pxf-field-f0bc563/jdbc-pxf-ext/target/surefire-reports


T E S T S

Running com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest
Tests run: 14, Failures: 0, Errors: 14, Skipped: 0, Time elapsed: 1.209 sec <<< FAILURE!

Results :

Tests in error:
  testNoPasswordNotRequired(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoPasswordNotRequired(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoUserButRequired(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoUserButRequired(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoPasswordButRequired(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoPasswordButRequired(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testBigInt(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testBigInt(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoJdbcDriver(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoJdbcDriver(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoDBUrl(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoDBUrl(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoTable(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure
  testNoTable(com.pivotal.pxf.plugins.jdbc.JdbcMySqlExtensionTest): Communications link failure

Tests run: 14, Failures: 0, Errors: 14, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 8.346s
[INFO] Finished at: Tue Aug 25 07:37:44 EDT 2015
[INFO] Final Memory: 19M/454M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.10:test (default-test) on project jdbc-pxf-ext: There are test failures.
[ERROR]
[ERROR] Please refer to /Pivotal-Field-Engineering-pxf-field-v2.0.1.0-0-35-gf0bc563/Pivotal-Field-Engineering-pxf-field-f0bc563/jdbc-pxf-ext/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
cp: cannot stat `target/*.jar': No such file or directory


Nasaa169 commented 9 years ago

I built it by running the build-all.sh file and then the dependencies one.

tzolov commented 9 years ago

At the moment the JDBC test expects a running MySQL DB to connect to and run against. This is wrong and will be addressed in the future by making the test use a dedicated in-memory DB instead.

For the time being you can build the project by skipping the tests (mvn clean install -DskipTests) or, even better, download the pre-built PHD-3.0.1 pxf-field jars such as jdbc-pxf-ext-3.0.1.0-1.jar.

Does this help?

Nasaa169 commented 9 years ago

Thanks a lot. I'll try it and let you know what happens.

I have a question regarding the JSON PXF connector: how many array levels does it support? And how can I define a column to get the "text" field in this example, please:

"entities":{"hashtags":[{"indices":[139,140],"text":"EMC"}]}

Nasaa169 commented 9 years ago

I downloaded the JDBC connector jar; how shall I use it in HAWQ when creating external tables, please?

kojec commented 9 years ago

Assuming you have copied the jars (the JDBC driver and the JDBC PXF extension) to the right place, edited /etc/gphd/pxf/conf/pxf-public.classpath to point to that place, and added a JDBC profile in /etc/gphd/pxf/conf/pxf-profiles.xml, you type the following in your HAWQ client:

DROP EXTERNAL TABLE mysql_test_ext;

CREATE WRITABLE EXTERNAL TABLE mysql_test_ext (
    kol1 varchar(20),
    kol2 varchar(20)
)
LOCATION ('pxf://localhost:50070/my_path?PROFILE=JDBC&JDBC_DRIVER=com.mysql.jdbc.Driver&DB_URL=jdbc:mysql://localhost:3306/monkey&USER=root&PASS=root&TABLE_NAME=monkey.monkey')
FORMAT 'CUSTOM' (FORMATTER='pxfwritable_export');

insert into mysql_test_ext values ('a','b');
insert into mysql_test_ext values ('a','b');
insert into mysql_test_ext values ('d','o');

Then in the MySQL database you can check that the rows have been written:

mysql> select * from monkey.monkey;
+------+------+
| kol1 | kol2 |
+------+------+
| a    | b    |
| a    | b    |
| d    | o    |
+------+------+
3 rows in set (0.00 sec)
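
For the writable example above, the target table presumably has to exist on the MySQL side before the inserts run; the thread does not show it, but a matching definition inferred from the external table's columns could look like this:

-- Sketch only: assumes the JDBC connector inserts into an existing table and does not create it.
CREATE DATABASE IF NOT EXISTS monkey;

CREATE TABLE monkey.monkey (
    kol1 VARCHAR(20),
    kol2 VARCHAR(20)
);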