Open tzonghao opened 5 years ago
I don't think we've tested the Kafka adapter in a while - my guess is that we have changed the measurement metadata functionality since this adapter was written, and there's a small bug here now.
FYI - I looked at the code where this error occurred and it's not obvious to me why it's failing. Perhaps it's related to the type of database you are using. What is your openPDC configuration database type?
I'm using SQLite.
On Tue, May 14, 2019 at 9:40 PM, J. Ritchie Carroll wrote:

> What is your openPDC configuration database type?
I can think of two things: (1) CHAR in SQLite is not properly converting to a string, or (2) DateTime is having culture-related issues.
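Both possibilities trace back to the fact that SQLite column declarations such as CHAR or DATETIME are type *affinities*, not strict types, so the value read back may not match the .NET type a strict cast expects. A minimal sketch of this (Python stands in for the C# adapter code; the table and column names are hypothetical, not from openPDC's schema):

```python
import sqlite3

# SQLite's declared types are affinities, not strict types: a value stored
# in a DATETIME-declared column is kept as TEXT, and comes back as a string.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Measurement (UpdatedOn DATETIME)")  # hypothetical table
cur.execute("INSERT INTO Measurement VALUES ('2019-05-14 21:40:00')")
value = cur.execute(
    "SELECT UpdatedOn, typeof(UpdatedOn) FROM Measurement"
).fetchone()
print(value)  # ('2019-05-14 21:40:00', 'text') -- text, not a native datetime
```

In .NET terms, a direct cast like `(DateTime)reader["UpdatedOn"]` on such a text value would throw an `InvalidCastException` ("Specified cast is not valid"), whereas a conversion such as `Convert.ToDateTime(...)` would succeed, which is one plausible reading of the error in this issue.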
Is it possible to try another database to see if the error persists, such as SQL Server Express?
Thanks, Ritchie
I can try PostgreSQL and maybe some other databases, sometime tomorrow. Thanks for looking into this.
I tried both PostgreSQL and MySQL, but openPDC failed to connect/set up with either, so I can't tell whether they would have the same issue. However, SQL Server Express 2017 works out of the box and does not exhibit the issue seen with SQLite.
(Version 2.6 or 2.7) When the Kafka adapter output is enabled, the following error is reported and no metadata is produced in the Kafka topic:
[KAFKA] Failed to serialize current time-series metadata records: Specified cast is not valid.
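The "DateTime culture" hypothesis raised in the thread can also be illustrated with a small sketch: the same timestamp text parses or fails depending on the expected format, much as .NET DateTime parsing can depend on the current CultureInfo. Python stands in for the C# code here, and the sample string and formats are hypothetical:

```python
from datetime import datetime

# Hypothetical illustration of a culture/format mismatch: the same
# timestamp string succeeds or fails depending on the expected ordering.
s = "14/05/2019"  # day/month/year ordering, common outside the US
try:
    datetime.strptime(s, "%m/%d/%Y")  # US-style month-first expectation
except ValueError:
    print("parse failed under the US-style format")
print(datetime.strptime(s, "%d/%m/%Y"))  # succeeds with the matching format
```

If the adapter (or the database layer beneath it) assumes one culture's date format while SQLite has stored text written under another, a conversion of the metadata's timestamp fields could fail in just this way.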