ai-se / Pits_lda
IST journal 2017: Tuning LDA
https://github.com/amritbhanu/LDADE-package
Review - 04/27/2016 (#17, Closed)
Closed by amritbhanu 7 years ago.

amritbhanu commented 8 years ago:
Literature Survey
All the literature surveyed is from the SE domain.
Searched Google Scholar for `lda topics stable OR unstable OR coherence` and took the top-cited papers.
7 out of 9 papers stated that topics are unstable, so we went for manual validation of topics before running further experiments.
1 paper made a very strong claim that LDA is stable, but their data is not available to verify the results.
Other LDA toolkits mentioned include GibbsLDA++.
Some papers have talked about playing with different configurations. The common parameters are k (number of topics), a (alpha), b (beta), and i (iterations); this is basically tuning.
LDA is non-deterministic, so not many people have bothered to check how their results might vary across runs.
Refer to this for more details.
DE Experiment
Implemented DE with CR = 0.3 and mutation factor F = 0.7.
Termination criterion: a fixed number of iterations.
Tuning ranges, based on the literature review: k = (10, 30), a = (0, 1), b = (0, 1).
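A minimal sketch of such a DE loop (DE/rand/1/bin with CR = 0.3, F = 0.7, and the bounds above). The `fitness` function is a stand-in of my own, not the paper's objective; the real objective would score LDA topic stability:

```python
import random

# DE/rand/1/bin sketch with the settings from this issue: CR = 0.3, F = 0.7.
# Bounds follow the tuning ranges above: k in (10, 30), a in (0, 1), b in (0, 1).
BOUNDS = [(10, 30), (0, 1), (0, 1)]  # k, a (alpha), b (beta)
CR, F = 0.3, 0.7

def fitness(x):
    # Placeholder objective (assumption): a smooth function with a known peak.
    # In the real experiment this would be an LDA stability score.
    k, a, b = x
    return -((k - 20) ** 2 / 100 + (a - 0.5) ** 2 + (b - 0.1) ** 2)

def clip(x):
    # Keep each dimension inside its tuning range.
    return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, BOUNDS)]

def de(pop_size=10, iterations=30, seed=1):
    rng = random.Random(seed)
    pop = [clip([rng.uniform(lo, hi) for lo, hi in BOUNDS])
           for _ in range(pop_size)]
    for _ in range(iterations):  # termination criterion: fixed iterations
        for i, target in enumerate(pop):
            # DE/rand/1: three distinct members other than the target.
            r1, r2, r3 = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            # Binomial crossover with one guaranteed mutated dimension.
            jrand = rng.randrange(len(BOUNDS))
            trial = clip([r1[d] + F * (r2[d] - r3[d])
                          if (rng.random() < CR or d == jrand) else target[d]
                          for d in range(len(BOUNDS))])
            if fitness(trial) >= fitness(target):  # greedy selection
                pop[i] = trial
    return max(pop, key=fitness)

best = de()
print("best k=%.1f a=%.2f b=%.2f" % tuple(best))
```

With a pop size of 10 and 30 iterations this uses on the order of 300 fitness evaluations, which is why DE is attractive when each evaluation means training several LDA models.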
Results
Tuning helped on various datasets, yielding a higher number of overlapping terms, or at least matching the default parameters.
Tuning a and b alone produced no change.
A higher k value improved stability.
A lower b improved stability.
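The "terms overlap" measure above could be scored roughly like this (my assumption of the metric, not necessarily the paper's exact definition: match each topic from one run to its best-overlapping topic in a second run, then average the top-term overlap):

```python
# Sketch (assumed metric): stability of LDA across two runs, measured as
# the average best-match overlap of each topic's top terms.

def topic_overlap(t1, t2):
    """Fraction of top terms shared by two topics."""
    s1, s2 = set(t1), set(t2)
    return len(s1 & s2) / max(len(s1), len(s2))

def stability(run_a, run_b):
    """For each topic in run_a, take its best overlap with any topic in
    run_b, then average. run_a / run_b are lists of top-term lists."""
    scores = [max(topic_overlap(ta, tb) for tb in run_b) for ta in run_a]
    return sum(scores) / len(scores)

# Toy example: two runs of a 2-topic model with top-3 terms each.
run1 = [["bug", "error", "crash"], ["test", "unit", "assert"]]
run2 = [["test", "unit", "mock"], ["bug", "crash", "fail"]]
print(stability(run1, run2))  # each matched pair shares 2 of 3 terms
```

Under this metric a stable configuration pushes the score toward 1.0 across repeated runs on the same data.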
amritbhanu commented 8 years ago:
To Dos:
- [x] Read papers outside the SE domain.
- [x] Verify that tuning really helps on other datasets such as SE, SO, Python programming, and many more.
- [x] Extend the x-axis down to 1.
- [x] Run 4 different DEs: pop size = 10 and 30, and with CR and mutation reversed.
- [x] Read the Weis paper and verify the hypothesis.