When you use them in either the `update_topics` function or the `fit`/`fit_transform` functions, they should be saved within the model. As a result, they are used as a default when running hierarchical topic modeling.
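As a minimal sketch of that flow, assuming `KeyBERTInspired` as the representation model and the 20 newsgroups data purely as an example corpus:

```python
from sklearn.datasets import fetch_20newsgroups
from bertopic import BERTopic
from bertopic.representation import KeyBERTInspired

docs = fetch_20newsgroups(subset="all", remove=("headers", "footers", "quotes"))["data"]

# Pass the representation model when creating the model; it is saved
# within the fitted model and re-used as the default later on.
representation_model = KeyBERTInspired()
topic_model = BERTopic(representation_model=representation_model)
topics, probs = topic_model.fit_transform(docs)

# The hierarchy is then built on top of the representations above,
# without passing the representation model again.
hierarchical_topics = topic_model.hierarchical_topics(docs)
```

Because the representation model is stored on the fitted model, `hierarchical_topics` picks it up without any extra arguments.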
Hi Maarten,
Yes, I completely understand and agree with your point. I am sorry for not framing my query clearly earlier. My query is: can I apply topic representation models at the hierarchical level only, without applying `fit`/`fit_transform` or using `update_topics`? I want to label topics to make them more readable to users, and for the rest of the processing I want to work with the original topic keywords only. I hope this clears up my query.
> Can I apply topic representation models at the hierarchical level only, without applying `fit`/`fit_transform` or using `update_topics`?
I am not sure if I understand correctly, but you would always need to use at least `fit` or `fit_transform` to create your model, whether that is with pre-labeled data for manual BERTopic or a semi-supervised approach.
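Continuing the sketch above, a representation model can also be attached after fitting; `update_topics` stores it in the model in the same way (the `docs` and `topic_model` variables are assumed from the earlier snippet):

```python
from bertopic.representation import KeyBERTInspired

# Re-compute the topic representations of an already fitted model.
# The representation model passed here is saved within the model,
# so a later call to hierarchical_topics() uses it as the default.
topic_model.update_topics(docs, representation_model=KeyBERTInspired())
```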
> I want to label topics to make them more readable to users and for the rest of the processing
Does this mean that you have manually labeled the topics? If so, then you would need to set the labels with `.set_topic_labels`, as that allows for custom labels to be set.
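A short sketch of how that could look; the topic ids and label strings here are purely illustrative:

```python
# Map topic ids to hypothetical human-readable labels.
topic_model.set_topic_labels({0: "Space Travel", 1: "Religion"})

# The custom labels appear in the "CustomName" column of the overview
# and can be switched on in the visualizations.
topic_model.get_topic_info()
topic_model.visualize_hierarchy(custom_labels=True)
```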
It is not clear to me, however, what representation models have to do with you manually labeling topics.
Closing this due to inactivity. Let me know if I need to re-open the issue!
Hi Maarten,
I really enjoy working with the topic representation methods, and I have applied them at two levels: topic generation (`BERTopic`) and via the `topic_model.update_topics` function. I am highly interested in applying the same representation model at the hierarchical level of the topics. Can you please guide me on this? I have tried but am completely stuck.
Thanks in advance!