Open — Abahski opened this issue 2 years ago
Hello 😄, num_feat is the number of features chosen, max_feat is the total number of features in the dataset, and (num_feat / max_feat) is the ratio of chosen features to total features. The term beta * (num_feat / max_feat), I suppose, is there to control the number of features we choose: if num_feat is too large, the cost grows.
Hi, good morning. I wonder what the cost in this code is and what it does:
cost = alpha * error + beta * (num_feat / max_feat)
Also, I know that you use the error rate as fitness, so why use the cost value as fitness instead of the error rate?
Hi,
You can also use the error rate; there is nothing wrong with that.
The cost function ensures a smaller feature subset while also reducing the error rate. (Normally we set alpha = 0.99 and beta = 0.01.)
Best Regards, Jingwei Too
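A minimal Python sketch of the cost function discussed above (the function name and signature are illustrative, not the toolbox's actual API; the alpha and beta defaults follow the values given in this thread):

```python
def cost(error, num_feat, max_feat, alpha=0.99, beta=0.01):
    """Combine error rate and feature-subset size into one fitness value.

    alpha weighs the classification error; beta penalizes the fraction
    of selected features, so smaller subsets are preferred at equal error.
    """
    return alpha * error + beta * (num_feat / max_feat)

# Two candidate subsets with the same error rate: the one that uses
# fewer features gets the lower (better) cost.
small_subset = cost(error=0.10, num_feat=5, max_feat=50)
large_subset = cost(error=0.10, num_feat=25, max_feat=50)
print(small_subset, large_subset)
```

With equal error rates, the size penalty breaks the tie, which is exactly why cost rather than raw error is used as the fitness.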
Thank you 😄
Can we say that the cost function is an upgraded version of the error rate, since you say the cost function reduces the error rate?