Tokenization is the process of splitting raw text into smaller units (tokens) that are easier to process. The proposed model splits any Hindi sentence into tokens and also flags the use of any language other than Hindi in the input.
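As a rough illustration of the idea, here is a minimal sketch (not the PR's actual implementation) that tokenizes on whitespace and uses a simple Unicode heuristic, treating tokens outside the Devanagari block as non-Hindi; the helper names and the heuristic are assumptions for demonstration only:

```python
import re

# Assumed heuristic: a token counts as Hindi if all its characters
# fall in the Devanagari Unicode block (U+0900 - U+097F).
DEVANAGARI = re.compile(r"^[\u0900-\u097F]+$")

def tokenize(sentence):
    """Split on whitespace and strip common punctuation (including the danda)."""
    tokens = [tok.strip(".,!?।") for tok in sentence.split()]
    return [tok for tok in tokens if tok]

def non_hindi_tokens(sentence):
    """Return tokens whose characters lie outside the Devanagari block."""
    return [tok for tok in tokenize(sentence) if not DEVANAGARI.match(tok)]

if __name__ == "__main__":
    text = "मुझे coffee बहुत पसंद है।"
    print(tokenize(text))          # ['मुझे', 'coffee', 'बहुत', 'पसंद', 'है']
    print(non_hindi_tokens(text))  # ['coffee']
```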
@HemanthSai7 Please review this PR (Pull Request) and label it "hacktoberfest-accepted" and "hacktoberfest-2022".