dkarzon opened this issue 4 years ago
Have you fixed your problem?
The same is happening to me. Anyone had any luck with this?
Are you using the catalog to choose the fields that you want via `selected` metadata? That should allow you to limit the fields that get emitted even though the schema has all the properties in it.
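To illustrate, here is a minimal sketch of setting `selected` metadata in a Singer catalog. The catalog fragment and field names below are made-up examples (a real catalog comes from running the tap with `--discover`); the helper function is hypothetical, not part of tap-hubspot.

```python
def select_fields(catalog, stream_name, keep):
    """Set 'selected' metadata so only the fields in `keep` are emitted.
    Assumes the standard Singer catalog layout (streams -> metadata entries
    keyed by breadcrumb); this helper is illustrative, not part of the tap."""
    for stream in catalog["streams"]:
        if stream["tap_stream_id"] != stream_name:
            continue
        for entry in stream["metadata"]:
            crumb = entry["breadcrumb"]
            if crumb == []:
                # The empty breadcrumb is the stream itself.
                entry["metadata"]["selected"] = True
            elif crumb[0] == "properties":
                entry["metadata"]["selected"] = crumb[1] in keep
    return catalog

# Example catalog fragment (shape only; property names are placeholders).
catalog = {
    "streams": [{
        "tap_stream_id": "deals",
        "metadata": [
            {"breadcrumb": [], "metadata": {}},
            {"breadcrumb": ["properties", "dealname"], "metadata": {}},
            {"breadcrumb": ["properties", "hs_internal_foo"], "metadata": {}},
        ],
    }]
}

select_fields(catalog, "deals", {"dealname"})
```

As the rest of the thread shows, though, `selected` metadata may not help here, because the deals schema is built dynamically from the HubSpot properties API rather than from the catalog alone.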
Yes, I am! I've only selected 4-5 properties using `"selected": true`, and the others are explicitly set to `false`.
Well, I've done some testing and:
In case anyone else is hitting this, it's a real bummer. I'm not intimately familiar with the code in this repo, but for now I went into `tap_hubspot/__init__.py` (locally), found line 149, and changed `if extras:` to `if False and extras:`.
Yes, it's a hack, and yes, I don't quite understand the ramifications of not syncing the extra data associated with properties. At the same time, it prevents the explosion of columns needed to pipe the data into Postgres, which might be all some people need.
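A less invasive alternative to patching the tap, as a sketch: prune `SCHEMA` messages in a small filter sitting between the tap and the target. Everything here is an assumption (the `KEEP` set, the script name, and whether your target tolerates record fields absent from the schema vary by setup); it is not an official Singer tool.

```python
#!/usr/bin/env python3
# Hypothetical pipe filter, usage (names are assumptions):
#   tap-hubspot -c config.json --catalog catalog.json | python prune_schema.py | target-postgres -c pg.json
import json
import sys

# Columns to keep: purely illustrative, replace with what you actually need.
KEEP = {"dealId", "dealname", "amount", "closedate"}

def prune(message):
    """Drop properties not in KEEP from SCHEMA messages; pass everything
    else through untouched. Note: RECORD messages still carry the extra
    fields, so this only helps if your target ignores undeclared fields."""
    if message.get("type") == "SCHEMA":
        props = message["schema"].get("properties", {})
        message["schema"]["properties"] = {
            name: sub for name, sub in props.items() if name in KEEP
        }
    return message

if __name__ == "__main__" and not sys.stdin.isatty():
    for line in sys.stdin:
        print(json.dumps(prune(json.loads(line))))
```

This keeps the tap unmodified, so upgrades don't clobber the workaround, but it shares the same caveat: the extra per-property data simply never reaches the target.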
I am trying to set up a HubSpot tap with a Postgres target, and I keep getting an error about Postgres trying to create a table with more than 1600 columns in it, even though at the time my schema only had 4 properties in it.
However, going through the code, it looks like if deals is selected as a stream, the schema is automatically built from the JSON output of this API call: https://api.hubapi.com/properties/v1/deals/properties. See code here: https://github.com/singer-io/tap-hubspot/blob/master/tap_hubspot/__init__.py#L191
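Roughly, the dynamic-schema step turns every property the API returns into one column, so the column count tracks the size of that response. A minimal sketch of that behavior, using a fake, trimmed stand-in for the real API response (the type mapping here is a rough guess, not the tap's actual mapping):

```python
# Fake stand-in for https://api.hubapi.com/properties/v1/deals/properties;
# a real portal returns hundreds of entries.
fake_api_response = [
    {"name": "dealname", "type": "string"},
    {"name": "amount", "type": "number"},
    {"name": "hs_analytics_source", "type": "enumeration"},
]

# Rough illustrative mapping from HubSpot property types to JSON-schema types.
TYPE_MAP = {"string": "string", "number": "number", "enumeration": "string"}

def build_schema(properties):
    """One JSON-schema property per HubSpot property: the emitted schema
    grows with the API response, regardless of catalog selection."""
    return {
        "type": "object",
        "properties": {
            p["name"]: {"type": ["null", TYPE_MAP.get(p["type"], "string")]}
            for p in properties
        },
    }

schema = build_schema(fake_api_response)
```

With hundreds of properties in a real portal (and more columns once any per-property extras are flattened), it's easy to see how this blows past Postgres's 1600-column-per-table limit.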
Is there a way to modify the computed schema for the deals stream to remove the properties that I don't need? I haven't been able to find a way to do that so far.
My deals schema: