tairov / llama2.mojo

Inference Llama 2 in one file of pure 🔥
https://www.modular.com/blog/community-spotlight-how-i-built-llama2-by-aydyn-tairov
MIT License

mojo 24.2.1 errors #88

Closed · oss-maintainer-12 closed this issue 4 months ago

oss-maintainer-12 commented 4 months ago

With mojo 24.2.1 (58157dc0) in a Google Colab environment:

Running `mojo llama2.mojo stories15M.bin -s 100 -n 256 -t 0.5 -i "Mojo is a language"` yields:

/root/llama2.mojo/llama2.mojo:2:47: error: package 'algorithm' does not contain 'unroll'
from algorithm import vectorize, parallelize, unroll
                                              ^
/root/llama2.mojo/llama2.mojo:173:18: error: no matching function in call to 'memcpy'
    memcpy[UInt8](str, s1, l1)
    ~~~~~~~~~~~~~^~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: callee expects 0 parameters, but 1 was specified
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: failed to infer implicit parameter 'type' of argument 'dest' type 'DTypePointer'
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:174:18: error: no matching function in call to 'memcpy'
    memcpy[UInt8](str.offset(l1), s2, l2)
    ~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: callee expects 0 parameters, but 1 was specified
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: failed to infer implicit parameter 'type' of argument 'dest' type 'DTypePointer'
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:208:49: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
    inout array: PointerStrings, inout indices: DynamicVector[Int], low: Int, high: Int
                                                ^~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:212:31: error: unexpected token in expression
    for jj in range(low, high):
                              ^
/root/llama2.mojo/llama2.mojo:212:31: error: statements must start at the beginning of a line
    for jj in range(low, high):
                              ^
/root/llama2.mojo/llama2.mojo:236:49: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
    inout array: PointerStrings, inout indices: DynamicVector[Int], low: Int, high: Int
                                                ^~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:296:25: error: use of unknown declaration 'DynamicVector'
    var sorted_indices: DynamicVector[Int]
                        ^~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:517:10: error: 'Tensor[f32]' value has no attribute 'simd_store'
        a.simd_store[_nelts](j, a.simd_load[_nelts](j) + b.simd_load[_nelts](j))
        ~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:531:35: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        tmp.accumulate(x.offset(j).simd_load[_nelts](0) ** 2)
                       ~~~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:542:25: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        var val = weight.simd_load[_nelts](j) * ss * x.simd_load[_nelts](j)
                  ~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:542:55: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        var val = weight.simd_load[_nelts](j) * ss * x.simd_load[_nelts](j)
                                                     ~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:543:20: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_store'
        o.offset(j).simd_store[_nelts](0, val)
        ~~~~~~~~~~~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:569:20: error: 'Tensor[f32]' value has no attribute 'simd_load'
        var val = x.simd_load[_nelts](start + ii).reduce_max()
                  ~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:579:29: error: 'Tensor[f32]' value has no attribute 'simd_load'
        var val = math.exp(x.simd_load[_nelts](start + ii) - max_val)
                           ~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:580:10: error: 'Tensor[f32]' value has no attribute 'simd_store'
        x.simd_store[_nelts](start + ii, val)
        ~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:589:10: error: 'Tensor[f32]' value has no attribute 'simd_store'
        x.simd_store[_nelts](start + ii, x.simd_load[_nelts](start + ii) / ssum)
        ~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:598:20: error: 'StaticTuple' parameter #0 has 'AnyRegType' type, but value has type 'Int'
    C: StaticTuple[n, BufferPtrFloat32],
                   ^
/root/llama2.mojo/llama2.mojo:1:1: note: 'StaticTuple' declared here
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:600:20: error: 'StaticTuple' parameter #0 has 'AnyRegType' type, but value has type 'Int'
    B: StaticTuple[n, BufferPtrFloat32],
                   ^
/root/llama2.mojo/llama2.mojo:1:1: note: 'StaticTuple' declared here
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:606:31: error: 'StaticTuple' parameter #0 has 'AnyRegType' type, but value has type 'Int'
        var tmp = StaticTuple[n, Accumulator[DType.float32, nelts]]()
                              ^
/root/llama2.mojo/llama2.mojo:1:1: note: 'StaticTuple' declared here
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:616:22: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
            var a = A.simd_load[_nelts](j)
                    ~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:730:26: error: no matching function in call to 'memcpy'
    memcpy[DType.float32](state.x.data(), content_row, dim)
    ~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: expected at most 2 positional arguments, got 3
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: failed to infer implicit parameter 'type' of argument 'dest' type 'Pointer'
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:1:1: note: candidate not viable: callee expects 0 parameters, but 1 was specified
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:794:32: error: 'Tensor[f32]' value has no attribute 'simd_load'
                        state.q.simd_load[_nelts](q_offset + i)
                        ~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:795:42: error: 'Tensor[f32]' value has no attribute 'simd_load'
                        * state.key_cache.simd_load[_nelts](k_offset + i)
                          ~~~~~~~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:818:39: error: 'Tensor[f32]' value has no attribute 'simd_load'
                    var xbi = state.xb.simd_load[_nelts](
                              ~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:820:46: error: 'Tensor[f32]' value has no attribute 'simd_load'
                    ) + a * state.value_cache.simd_load[_nelts](v_offset + i)
                            ~~~~~~~~~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:821:29: error: 'Tensor[f32]' value has no attribute 'simd_store'
                    state.xb.simd_store[_nelts](xb_offset + i, xbi)
                    ~~~~~~~~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:846:38: error: 'Tensor[f32]' value has no attribute 'simd_load'
            var initial_hb = state.hb.simd_load[_nelts](i)
                             ~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:850:21: error: 'Tensor[f32]' value has no attribute 'simd_store'
            state.hb.simd_store[_nelts](i, hbi * state.hb2.simd_load[_nelts](i))
            ~~~~~~~~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:881:32: error: invalid call to 'rand': missing 1 required positional argument: 'size'
    var r = rand[DType.float32](1)
            ~~~~~~~~~~~~~~~~~~~^~~
/root/llama2.mojo/llama2.mojo:1:1: note: function declared here
from algorithm import sum
^
/root/llama2.mojo/llama2.mojo:890:29: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
fn bpe_encode(inout tokens: DynamicVector[Int], text: String, inout tok: Tokenizer):
                            ^~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:891:32: error: unexpected token in expression
    for pos in range(len(text)):
                               ^
/root/llama2.mojo/llama2.mojo:891:32: error: statements must start at the beginning of a line
    for pos in range(len(text)):
                               ^
/root/llama2.mojo/llama2.mojo:940:9: error: use of unknown declaration 'print_no_newline'
        print_no_newline(chr(str2num(d1) * 16 + str2num(d2)))
        ^~~~~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:945:9: error: use of unknown declaration 'print_no_newline'
        print_no_newline(chr(s[p].to_int()))
        ^~~~~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:1044:25: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
    var prompt_tokens = DynamicVector[Int]()
                        ^~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:49:31: error: 'DTypePointer[T, 0]' value has no attribute 'simd_load'
        var newVal = self.data.simd_load[_width]() + val
                     ~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:50:18: error: 'DTypePointer[T, 0]' value has no attribute 'simd_store'
        self.data.simd_store[_width](newVal)
        ~~~~~~~~~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:54:25: error: 'DTypePointer[T, 0]' value has no attribute 'simd_load'
        return self.data.simd_load[width]().reduce_add()
               ~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:111:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        return self._data.simd_load[nelts](idx)
               ~~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:124:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        return self._data.simd_load[nelts](indices[0] * self._shape[1] + indices[1])
               ~~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:127:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_load'
        return self._data.simd_load[1](idx)
               ~~~~~~~~~~^~~~~~~~~~
/root/llama2.mojo/llama2.mojo:130:26: error: 'DTypePointer[f32, 0]' value has no attribute 'simd_store'
        return self._data.simd_store[nelts](idx, val)
               ~~~~~~~~~~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:305:31: error: use of unknown declaration 'DynamicVector', 'fn' declarations require explicit variable declarations
        self.sorted_indices = DynamicVector[Int]()
                              ^~~~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:382:40: error: 'List[SIMD[si8, 1]]' value has no attribute '_steal_ptr'
        var int32_ptr = config_data_raw._steal_ptr().bitcast[DType.int32]()
                        ~~~~~~~~~~~~~~~^~~~~~~~~~~
/root/llama2.mojo/llama2.mojo:469:27: error: 'List[SIMD[si8, 1]]' value has no attribute '_steal_ptr'
            var data = tmp._steal_ptr().bitcast[DType.float32]()
                       ~~~^~~~~~~~~~~
/root/.modular/pkg/packages.modular.com_max/bin/mojo: error: failed to parse the provided Mojo
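For anyone hitting similar errors on an old checkout, the messages above correspond to a handful of standard-library renames in the Mojo 24.x releases. A rough old → new mapping, sketched from the error messages and the Mojo changelog (unverified; exact signatures may differ between releases):

```mojo
# Sketch of the API migrations implied by the errors above.
# Consult the Mojo changelog for the authoritative signatures.

from collections import List       # DynamicVector was renamed to List
from utils.loop import unroll      # unroll moved out of `algorithm`

fn migration_examples():
    # DynamicVector[Int]() -> List[Int]()
    var indices = List[Int]()
    indices.append(42)

    # ptr.simd_load[nelts](idx)       -> ptr.load[width=nelts](idx)
    # ptr.simd_store[nelts](idx, val) -> ptr.store[width=nelts](idx, val)
    # (simd_load/simd_store were removed from DTypePointer and Tensor)

    # memcpy[DType.float32](dest, src, n) -> memcpy(dest, src, n)
    # (the element type is now inferred from the pointer arguments)

    # print_no_newline(s) -> print(s, end="")
    print("token", end="")

    # StaticTuple[n, T] -> StaticTuple[T, n] (element type first, size second)
    # rand[DType.float32](1) -> rand now requires an explicit `size` argument
```

These are the changes the upstream fix applied; pulling the latest llama2.mojo sources picks them all up at once.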
tairov commented 4 months ago

Hi @oss-maintainer-12, please pull the latest changes; all of these errors should now be fixed.

oss-maintainer-12 commented 4 months ago

Thanks!


Closed #88 https://github.com/tairov/llama2.mojo/issues/88 as completed.