tairov / lamatune

Llama implementations benchmarking framework
https://engiware.com/benchmark/llama2-ports-extensive-benchmarks-mac-m1-max.html

Unable to compile llama2mojo #2

Open · Seyamalam opened this issue 2 months ago

Seyamalam commented 2 months ago

When running the bash build_all.sh command, most of the ports build, but the Mojo port fails to compile:

$ bash build_all.sh
/workspaces/lamatune/bench /workspaces/lamatune/bench
/opt/homebrew/opt/llvm/bin/clang -Ofast -fopenmp -march=native run.c -lm -o run
make: /opt/homebrew/opt/llvm/bin/clang: Command not found
make: *** [Makefile:36: runomp] Error 127
gcc -Ofast -o run run.c -lm
gcc -Ofast -o runq runq.c -lm
/workspaces/lamatune/llama2.mojo/llama2.mojo:2:47: error: package 'algorithm' does not contain 'unroll'
from algorithm import vectorize, parallelize, unroll
                                              ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:173:18: error: no matching function in call to 'memcpy'
    memcpy[UInt8](str, s1, l1)


/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: callee expects 0 parameters, but 1 was specified
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: failed to infer implicit parameter 'type' of argument 'dest' type 'DTypePointer'
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:174:18: error: no matching function in call to 'memcpy'
    memcpy[UInt8](str.offset(l1), s2, l2)
    ~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: callee expects 0 parameters, but 1 was specified
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: failed to infer implicit parameter 'type' of argument 'dest' type 'DTypePointer'
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:208:49: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
    inout array: PointerStrings, inout indices: DynamicVector[Int], low: Int, high: Int
                                                ^~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:212:31: error: unexpected token in expression
    for jj in range(low, high):
                              ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:212:31: error: statements must start at the beginning of a line
    for jj in range(low, high):
                              ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:236:49: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
    inout array: PointerStrings, inout indices: DynamicVector[Int], low: Int, high: Int
                                                ^~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:296:25: error: use of unknown declaration 'DynamicVector'
    var sorted_indices: DynamicVector[Int]
                        ^~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:517:10: error: 'Tensor[f32]' value has no attribute 'simd_store'
        a.simd_store[_nelts](j, a.simd_load[_nelts](j) + b.simd_load[_nelts](j))
        ~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:531:35: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        tmp.accumulate(x.offset(j).simd_load[_nelts](0) ** 2)
                       ~~~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:542:25: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        var val = weight.simd_load[_nelts](j) * ss * x.simd_load[_nelts](j)
                  ~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:542:55: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        var val = weight.simd_load[_nelts](j) * ss * x.simd_load[_nelts](j)
                                                     ~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:543:20: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_store'
        o.offset(j).simd_store[_nelts](0, val)
        ~~~~~~~~~~~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:569:20: error: 'Tensor[f32]' value has no attribute 'simd_load'
        var val = x.simd_load[_nelts](start + ii).reduce_max()
                  ~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:579:29: error: 'Tensor[f32]' value has no attribute 'simd_load'
        var val = math.exp(x.simd_load[_nelts](start + ii) - max_val)
                           ~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:580:10: error: 'Tensor[f32]' value has no attribute 'simd_store'
        x.simd_store[_nelts](start + ii, val)
        ~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:589:10: error: 'Tensor[f32]' value has no attribute 'simd_store'
        x.simd_store[_nelts](start + ii, x.simd_load[_nelts](start + ii) / ssum)
        ~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:598:20: error: 'StaticTuple' parameter #0 has 'AnyRegType' type, but value has type 'Int'
    C: StaticTuple[n, BufferPtrFloat32],
                   ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: 'StaticTuple' declared here
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:600:20: error: 'StaticTuple' parameter #0 has 'AnyRegType' type, but value has type 'Int'
    B: StaticTuple[n, BufferPtrFloat32],
                   ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: 'StaticTuple' declared here
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:606:31: error: 'StaticTuple' parameter #0 has 'AnyRegType' type, but value has type 'Int'
        var tmp = StaticTuple[n, Accumulator[DType.float32, nelts]]()
                              ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: 'StaticTuple' declared here
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:616:22: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
            var a = A.simd_load[_nelts](j)
                    ~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:730:26: error: no matching function in call to 'memcpy'
    memcpy[DType.float32](state.x.data(), content_row, dim)
    ~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: failed to infer implicit parameter 'type' of argument 'dest' type 'Pointer'
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: callee expects 0 parameters, but 1 was specified
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:794:32: error: 'Tensor[f32]' value has no attribute 'simd_load'
                        state.q.simd_load[_nelts](q_offset + i)
                        ~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:795:42: error: 'Tensor[f32]' value has no attribute 'simd_load'
                        * state.key_cache.simd_load[_nelts](k_offset + i)
                          ~~~~~~~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:818:39: error: 'Tensor[f32]' value has no attribute 'simd_load'
                    var xbi = state.xb.simd_load[_nelts](
                              ~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:820:46: error: 'Tensor[f32]' value has no attribute 'simd_load'
                    ) + a * state.value_cache.simd_load[_nelts](v_offset + i)
                            ~~~~~~~~~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:821:29: error: 'Tensor[f32]' value has no attribute 'simd_store'
                    state.xb.simd_store[_nelts](xb_offset + i, xbi)
                    ~~~~~~~~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:846:38: error: 'Tensor[f32]' value has no attribute 'simd_load'
            var initial_hb = state.hb.simd_load[_nelts](i)
                             ~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:850:21: error: 'Tensor[f32]' value has no attribute 'simd_store'
            state.hb.simd_store[_nelts](i, hbi * state.hb2.simd_load[_nelts](i))
            ~~~~~~~~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:881:32: error: invalid call to 'rand': missing 1 required positional argument: 'size'
    var r = rand[DType.float32](1)
            ~~~~~~~~~~~~~~~~~~~^~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:1:1: note: function declared here
from algorithm import sum
^
/workspaces/lamatune/llama2.mojo/llama2.mojo:890:29: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
fn bpe_encode(inout tokens: DynamicVector[Int], text: String, inout tok: Tokenizer):
                            ^~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:891:32: error: unexpected token in expression
    for pos in range(len(text)):
                               ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:891:32: error: statements must start at the beginning of a line
    for pos in range(len(text)):
                               ^
/workspaces/lamatune/llama2.mojo/llama2.mojo:940:9: error: use of unknown declaration 'print_no_newline'
        print_no_newline(chr(str2num(d1) * 16 + str2num(d2)))
        ^~~~~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:945:9: error: use of unknown declaration 'print_no_newline'
        print_no_newline(chr(s[p].to_int()))
        ^~~~~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:1044:25: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
    var prompt_tokens = DynamicVector[Int]()
                        ^~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:49:31: error: 'DTypePointer[T, 0]' value has no attribute 'simd_load'
        var newVal = self.data.simd_load[_width]() + val
                     ~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:50:18: error: 'DTypePointer[T, 0]' value has no attribute 'simd_store'
        self.data.simd_store[_width](newVal)
        ~~~~~~~~~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:54:25: error: 'DTypePointer[T, 0]' value has no attribute 'simd_load'
        return self.data.simd_load[width]().reduce_add()
               ~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:111:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        return self._data.simd_load[nelts](idx)
               ~~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:124:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        return self._data.simd_load[nelts](indices[0] * self._shape[1] + indices[1])
               ~~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:127:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        return self._data.simd_load[1](idx)
               ~~~~~~~~~~^~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:130:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_store'
        return self._data.simd_store[nelts](idx, val)
               ~~~~~~~~~~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:305:31: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
        self.sorted_indices = DynamicVector[Int]()
                              ^~~~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:382:40: error: 'List[SIMD[si8, 1]]' value has no attribute '_steal_ptr'
        var int32_ptr = config_data_raw._steal_ptr().bitcast[DType.int32]()
                        ~~~~~~~~~~~~~~~^~~~~~~~~~~
/workspaces/lamatune/llama2.mojo/llama2.mojo:469:27: error: 'List[SIMD[si8, 1]]' value has no attribute '_steal_ptr'
            var data = tmp._steal_ptr().bitcast[DType.float32]()
                       ~~~^~~~~~~~~~~
mojo: error: failed to parse the provided Mojo
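
For context: the first failure (make: /opt/homebrew/opt/llvm/bin/clang: Command not found) is just the OpenMP target looking for Homebrew's clang at a macOS-only path, which does not exist in this Ubuntu Codespace. The remaining errors all come from llama2.mojo itself and look like the pinned sources predate the standard-library changes in the Mojo release in use (DynamicVector is reported as an unknown declaration, and Tensor/DTypePointer no longer have simd_load/simd_store). A rough workaround for the OpenMP build on Ubuntu might look like the sketch below; it assumes, unverified, that the bench Makefile reads the compiler from a CC variable, otherwise the hardcoded clang path would need to be edited in the Makefile directly:

# Hypothetical workaround for the runomp failure on Ubuntu (assumption: the
# Makefile honors a CC override instead of only the Homebrew clang path).
$ sudo apt-get install -y clang libomp-dev   # system clang with OpenMP support
$ make runomp CC=clang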
tairov commented 2 months ago

Hi @Seyamalam, thanks for creating the issue. Could you please share your OS details and the output of modular --version? Also, which version of Mojo are you on?

Seyamalam commented 2 months ago

$ modular -v
modular 0.7.1 (28ddab26)

$ mojo -v
mojo 24.2.1 (2f0dcf11)

Distributor ID: Ubuntu
Description:    Ubuntu 20.04.6 LTS
Release:        20.04
Codename:       focal

tairov commented 2 months ago

Hi @Seyamalam, we have fixed the compatibility issues in llama2.mojo. Could you please try now? Just pull the latest changes from the llama2.mojo repo. Thanks
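
For anyone hitting the same errors, picking up the fix might look like the following; this assumes llama2.mojo is vendored as a git submodule of lamatune, which I have not verified. If build_all.sh clones it instead, a plain git pull inside /workspaces/lamatune/llama2.mojo should be enough.

# Hypothetical refresh of the vendored llama2.mojo sources (submodule layout assumed).
$ cd /workspaces/lamatune
$ git pull
$ git submodule update --init --remote llama2.mojo
$ bash build_all.sh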