Overtrained / contextual-qa-chat-app

Confirm that model runs on an M1 chip #18

Open 907Resident opened 1 year ago

907Resident commented 1 year ago

Context:

The ability to use the Mistral-7B model on an M1 chip would reduce the need for remote compute resources. Based on work completed in #15, Mistral could be used effectively with Google Colab (Pro).
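Before attempting to load Mistral-7B locally, it can help to confirm the machine actually has an Apple Silicon (arm64) CPU rather than an Intel Mac or a Rosetta-translated Python. A minimal stdlib-only check (the function name `is_apple_silicon` is ours, not part of any library) might look like:

```python
import platform


def is_apple_silicon() -> bool:
    """Return True when running natively on an M-series (arm64) Mac.

    Note: under Rosetta 2, platform.machine() reports "x86_64",
    so this also catches an x86 Python running on M1/M2 hardware.
    """
    return platform.system() == "Darwin" and platform.machine() == "arm64"


if __name__ == "__main__":
    print(f"Native Apple Silicon: {is_apple_silicon()}")
```

If this returns False on an M1/M2 machine, the Python interpreter is likely an x86_64 build running under Rosetta, which would prevent use of Metal-accelerated backends.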

Objective:

Complete the same tasks listed in #15 on a machine with an M1 (or M2) chip.

907Resident commented 1 year ago

Initial attempts to run the model locally were unsuccessful. This may require a dedicated session to fully explore.