JingweiToo / Wrapper-Feature-Selection-Toolbox-Python

This toolbox offers 13 wrapper feature selection methods (PSO, GA, GWO, HHO, BA, WOA, etc.) with examples. It is simple and easy to implement.
BSD 3-Clause "New" or "Revised" License

About cost in FunctionHO.py #17

Open Abahski opened 2 years ago

Abahski commented 2 years ago

Hi, good morning. I wonder what the cost code here is and what it does:

cost = alpha * error + beta * (num_feat / max_feat)

I know that you use the error rate as the fitness, so why use the cost value as the fitness instead of the error rate?

liansyyy commented 2 years ago

Hello 😄, num_feat is the number of features chosen and max_feat is the total number of features in the dataset, so (num_feat / max_feat) is the ratio of chosen features to total features. Using beta * (num_feat / max_feat), I suppose, is to control the number of features we choose: if num_feat gets too large, the cost becomes bigger.
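To make that penalty concrete, here is a minimal Python sketch of the cost expression from the question (the helper name cost_function and the example numbers are illustrative, not from the toolbox; in FunctionHO.py the error would come from a classifier's validation error):

```python
def cost_function(error, num_feat, max_feat, alpha=0.99, beta=0.01):
    """Weighted cost: classification error plus a penalty on the
    fraction of features selected (num_feat / max_feat)."""
    return alpha * error + beta * (num_feat / max_feat)

# Two candidate subsets with the same error rate: the one that keeps
# more of the 20 available features gets the higher (worse) cost.
small_subset = cost_function(error=0.10, num_feat=5, max_feat=20)
large_subset = cost_function(error=0.10, num_feat=15, max_feat=20)
```

So at equal error, the optimizer is steered toward the smaller feature subset.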


JingweiToo commented 2 years ago

Hi,

You can also use error rate, there is nothing wrong with it.

The cost function is to ensure a lower feature size while reducing the error rate. (normally we set alpha = 0.99 and beta = 0.01)

Best Regards, Jingwei Too
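A small numeric illustration of the trade-off described above, assuming alpha = 0.99, beta = 0.01 as stated and a hypothetical dataset with 20 features (the subsets and error rates are made up for the example):

```python
alpha, beta, max_feat = 0.99, 0.01, 20

def cost(error, num_feat):
    # Weighted sum of error rate and feature-ratio penalty.
    return alpha * error + beta * (num_feat / max_feat)

# Subset A has the lower error rate, but subset B wins on cost
# because it uses far fewer features.
cost_a = cost(0.050, 18)  # 0.99*0.050 + 0.01*(18/20) = 0.0585
cost_b = cost(0.055, 6)   # 0.99*0.055 + 0.01*(6/20)  = 0.05745
```

With these weights the error term dominates, so the feature penalty only tips the balance between subsets whose error rates are close.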


liansyyy commented 2 years ago

Thank you 😄

Abahski commented 2 years ago

Can we say that the cost function is an upgraded version of the error rate, since you said the cost function reduces the error rate?
