diyorsattarov / 2

LexerTests.TokenizeBasicTypes Test Failure #1

Closed: diyorsattarov closed this issue 1 year ago

diyorsattarov commented 1 year ago

The test LexerTests.TokenizeBasicTypes has failed. Here are the details of the failure:

- Expected the first token to be of type INTEGER (TokenType::INTEGER), but it was of type <00-00 00-00>.
- Expected the second token to be of type WHITESPACE (TokenType::WHITESPACE), but it was of type 6.
- Expected the third token to be of type IDENTIFIER (TokenType::IDENTIFIER), but it was of type <00-00 00-00>.
- Expected the fourth token to be of type WHITESPACE (TokenType::WHITESPACE), but it was of type 6.
- Expected the fifth token to be of type WHITESPACE (TokenType::WHITESPACE), but it was of type 6.
- Expected the sixth token to be of type INTEGER (TokenType::INTEGER), but it was of type <01-00 00-00>.

The test checks the tokenization of a simple code snippet "int x = 42;". The failure indicates a discrepancy between the expected and actual tokens.
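The <00-00 00-00> in those messages is GoogleTest's fallback printer: when it has no way to print a user-defined type such as TokenType, it dumps the object's raw bytes. Giving TokenType a stream inserter makes these failures readable. A minimal sketch, assuming TokenType is an enum class; only IDENTIFIER, INTEGER, and WHITESPACE are named in this issue, the remaining enumerators are placeholders:

#include <ostream>

// Assumed shape of the project's token-type enum. Only IDENTIFIER, INTEGER,
// and WHITESPACE are confirmed by this issue; the rest are placeholders.
enum class TokenType { IDENTIFIER, INTEGER, OPERATOR, PUNCTUATION, WHITESPACE, END_OF_FILE };

// With this in scope, GoogleTest prints the enumerator name instead of a raw
// byte dump such as <00-00 00-00> when an expectation on token types fails.
inline std::ostream& operator<<(std::ostream& os, TokenType type) {
    switch (type) {
        case TokenType::IDENTIFIER:  return os << "IDENTIFIER";
        case TokenType::INTEGER:     return os << "INTEGER";
        case TokenType::OPERATOR:    return os << "OPERATOR";
        case TokenType::PUNCTUATION: return os << "PUNCTUATION";
        case TokenType::WHITESPACE:  return os << "WHITESPACE";
        case TokenType::END_OF_FILE: return os << "END_OF_FILE";
    }
    return os << "TokenType(" << static_cast<int>(type) << ")";
}

The operator<< needs to live in the same namespace as TokenType so GoogleTest can find it through argument-dependent lookup.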

diyorsattarov commented 1 year ago
admin: ~/workspace/cpp/compilator/build (fix-whitespace-issue)$ ./tests
Running main() from ./googletest/src/gtest_main.cc
[==========] Running 2 tests from 1 test suite.
[----------] Global test environment set-up.
[----------] 2 tests from LexerTests
[ RUN      ] LexerTests.TokenizeSimpleCode
Actual tokens:
Type: 0, Lexeme: int
Type: 0, Lexeme:  main
Type: 6, Lexeme: (
Type: 6, Lexeme: )
Type: 6, Lexeme: {
Type: 0, Lexeme:  return
Type: 1, Lexeme:  0
Type: 2, Lexeme: +
Type: 6, Lexeme: }
Type: 5, Lexeme: 
[       OK ] LexerTests.TokenizeSimpleCode (0 ms)
[ RUN      ] LexerTests.TokenizeWithWhitespace
Actual tokens:
Type: 0, Lexeme:   int
Type: 0, Lexeme:   x
Type: 6, Lexeme: =
Type: 1, Lexeme:   42
Type: 2, Lexeme: +
Type: 6, Lexeme: 
Type: 5, Lexeme: 
/home/admin/workspace/cpp/compilator/tests/lexer/lexer_tests.cpp:31: Failure
Expected equality of these values:
  tokens.size()
    Which is: 7
  4
[  FAILED  ] LexerTests.TokenizeWithWhitespace (0 ms)
[----------] 2 tests from LexerTests (0 ms total)

[----------] Global test environment tear-down
[==========] 2 tests from 1 test suite ran. (0 ms total)
[  PASSED  ] 1 test.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] LexerTests.TokenizeWithWhitespace

 1 FAILED TEST

The failing test, from tests/lexer/lexer_tests.cpp:

#include <gtest/gtest.h>
#include <iostream>
#include <vector>
// plus the project's lexer header, which declares Lexer, Token, and TokenType

TEST(LexerTests, TokenizeWithWhitespace) {
    Lexer lexer("  int  x   =  42  ;  ");
    std::vector<Token> tokens = lexer.tokenize();

    // Dump the actual token stream so it can be compared against the failure output above.
    std::cout << "Actual tokens:\n";
    for (const Token& token : tokens) {
        std::cout << "Type: " << static_cast<int>(token.type) << ", Lexeme: " << token.lexeme << "\n";
    }

    ASSERT_EQ(tokens.size(), 4);

    // Ensure no whitespace tokens are generated
    for (const auto& token : tokens) {
        ASSERT_NE(token.type, TokenType::WHITESPACE);
    }
}
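
For reference, the branch name fix-whitespace-issue and the assertion above both point at whitespace handling: the lexer is currently emitting standalone whitespace tokens and folding leading spaces into lexemes such as "  int". Below is a minimal sketch of a tokenize loop that discards whitespace before scanning each token. The Token and TokenType definitions, the free-function shape, and the classification of keywords and punctuation are simplified assumptions for illustration, not the project's actual code.

#include <cctype>
#include <cstddef>
#include <string>
#include <vector>

// Simplified stand-ins for the project's Token/TokenType (assumed, not actual).
enum class TokenType { IDENTIFIER, INTEGER, OPERATOR, PUNCTUATION, WHITESPACE, END_OF_FILE };
struct Token { TokenType type; std::string lexeme; };

// Sketch of a tokenize loop that skips whitespace entirely instead of
// emitting WHITESPACE tokens or folding the spaces into the next lexeme.
std::vector<Token> tokenize(const std::string& source) {
    std::vector<Token> tokens;
    std::size_t pos = 0;
    while (pos < source.size()) {
        unsigned char c = static_cast<unsigned char>(source[pos]);
        if (std::isspace(c)) { ++pos; continue; }  // discard whitespace up front
        if (std::isdigit(c)) {                     // integer literal
            std::size_t start = pos;
            while (pos < source.size() && std::isdigit(static_cast<unsigned char>(source[pos]))) ++pos;
            tokens.push_back({TokenType::INTEGER, source.substr(start, pos - start)});
        } else if (std::isalpha(c) || c == '_') {  // identifier or keyword
            std::size_t start = pos;
            while (pos < source.size() &&
                   (std::isalnum(static_cast<unsigned char>(source[pos])) || source[pos] == '_')) ++pos;
            tokens.push_back({TokenType::IDENTIFIER, source.substr(start, pos - start)});
        } else {                                   // single-character operator/punctuation
            tokens.push_back({TokenType::PUNCTUATION, std::string(1, source[pos])});
            ++pos;
        }
    }
    return tokens;
}

For the test input "  int  x   =  42  ;  ", a loop of this shape yields the tokens int, x, =, 42, and ; with no WHITESPACE entries and no padding inside the lexemes.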