Open CarlitosDev opened 5 years ago
Thank you for your work. May I know what algorithms were used in this project?
Hi Nathani,
The algorithm is an ensemble of self-weighted k-nearest neighbours. You can find the mathematical description in Section III of our paper (https://ieeexplore.ieee.org/document/8727882), and the entry point in the code is here: https://github.com/CarlitosDev/nextDoor/blob/a1d116b1d5ea39c4cfd223adfd2600c3a992e704/nextDoorForecasterV2.py#L104
How are you planning to use it? Let me know if I can be of any help.
Carlos
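For readers new to the idea, here is a minimal sketch of plain distance-weighted k-NN regression. This is only an illustration of the general family of methods; the actual self-weighting scheme is defined in Section III of the paper and in `nextDoorForecasterV2.py`, and differs from this toy inverse-distance version.

```python
import numpy as np

def knn_forecast(X_train, y_train, x_query, k=5):
    """Predict y for x_query as an inverse-distance-weighted average
    of the k nearest training points.

    Illustrative sketch only; the self-weighted ensemble in the paper
    (Section III) uses its own weighting scheme."""
    # Euclidean distance from the query to every training sample
    dists = np.linalg.norm(X_train - x_query, axis=1)
    # Indices of the k closest samples
    nn = np.argsort(dists)[:k]
    # Inverse-distance weights (small epsilon avoids division by zero)
    w = 1.0 / (dists[nn] + 1e-9)
    w /= w.sum()
    return float(np.dot(w, y_train[nn]))

# Tiny usage example with made-up data
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 2.0, 3.0])
print(knn_forecast(X, y, np.array([1.5]), k=2))  # ≈ 1.5
```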
Sir, but you used some KPIs.
The OHE algorithm. Sir, actually we are doing a term paper using your IEEE paper as a reference. I am new to machine learning, sir.
Welcome to ML then ;)
You can do OHE directly in Pandas as pd.get_dummies(df[varName], prefix=varName), where varName is the categorical variable that you want to encode.
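A quick runnable example of the above, using a made-up DataFrame (the column names here are hypothetical, not from the nextDoor data):

```python
import pandas as pd

# Toy frame with one categorical column, 'store'
df = pd.DataFrame({'store': ['A', 'B', 'A', 'C'],
                   'sales': [10, 20, 15, 5]})

# One-hot encode 'store'; note that `columns` expects a list of
# column names when you pass the whole DataFrame
ohe = pd.get_dummies(df, columns=['store'], prefix='store')
print(ohe.columns.tolist())  # ['sales', 'store_A', 'store_B', 'store_C']
```

Passing a single Series (`pd.get_dummies(df['store'], prefix='store')`) works too and returns just the dummy columns.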
Thanks for the tip!
You're actually reading the issue now...