Closed WiktorD75 closed 8 months ago
Hello again, my friend. I was waiting for a package before answering you. It arrived today, and I've been testing all evening on Windows 11 Pro with a white DICASVER 3070M 16GB (anime-styled shroud). Everything works as it should: LLMs run very quickly, and 7B models are much faster than on my original blower-style 1070 8GB, perhaps faster than ChatGPT (the quality of the answers aside). I tested both Mistral Instruct v0.1 Q4_K_M and v0.2 Q8_0. I also tested AUTOMATIC1111 with SDXL, and it works; I tried 1.5, 2.1, and a bunch of similar models, all very fast. In Python, training with Keras/TensorFlow is also very fast. That covers neural networks. Overall, the drivers are great.
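For context, a rough back-of-envelope shows why both of those quantizations fit comfortably in 16 GB. The bits-per-weight averages below are my own approximations for llama.cpp quant types, not exact GGUF figures:

```python
# Rough estimate: GGUF file size ≈ params * bits_per_weight / 8.
PARAMS_7B = 7.24e9  # Mistral 7B actually has ~7.24B parameters


def approx_size_gb(params: float, bits_per_weight: float) -> float:
    """Approximate on-disk (and roughly in-VRAM) size of a quantized model."""
    return params * bits_per_weight / 8 / 1e9


q4_k_m = approx_size_gb(PARAMS_7B, 4.85)  # Q4_K_M averages ~4.85 bits/weight
q8_0 = approx_size_gb(PARAMS_7B, 8.5)     # Q8_0: 8-bit weights plus scales

print(f"Q4_K_M ~ {q4_k_m:.1f} GB, Q8_0 ~ {q8_0:.1f} GB")
```

Both land well under 16 GB even after adding the KV cache and runtime overhead, which matches the OP's experience.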
upd: I forgot to write, 16 gigs work everywhere I need them (although 3070 by 16 does not exist). Well, in your case, there will definitely be no problems with 12 gigabytes.
I have a 3070M 16GB and can confirm SD A1111 / ComfyUI work well (on the latest drivers); it's the reason I got this card specifically. I haven't had any OOM issues. Still learning, but I haven't run into any problems with the GPU at all: 70 °C at 100% load, 120 W.
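For anyone wanting to watch the same numbers (temperature, power draw, VRAM) while a job runs, nvidia-smi's query mode works on both Windows and Linux; this is just one possible field selection:

```shell
# Poll temperature, power draw and VRAM usage every 2 seconds
nvidia-smi --query-gpu=temperature.gpu,power.draw,memory.used,memory.total \
           --format=csv -l 2
```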
Thanks guys, this is great news, but where did you buy your 3070M 16GB?
No worries.
So, a bit of a strange one: I happened to come across an eBay listing. I could not find that specific VRAM arrangement anywhere else; the others all had 8 GB. I took a gamble and it works a treat.
I will link you some additional ones. FYI, these won't be from the exact same seller I purchased from, though.
The RAM works with no issues; I have run tasks that utilise it fully.
I'm from Russia, and we have our own marketplaces like OZON (Amazon-like, lol). Such goods don't ship locally in 2-3 days; they come from China through Ozon Global in 2-4 weeks. I'm not sure you can order from Western countries on Ozon. I was able to grab the "DICASVER 3070M 16GB White" at a discount for $246. I wanted to find the same brand or an analogue on AliExpress, but they are no longer there, or are too expensive. Apparently OZON had remnants of these video cards that are no longer available on AliExpress.
If you really want to find something at a similar price, I recommend Chinese marketplaces like Alibaba and Taobao. But there are subtleties; for example, it is advisable to ask the seller directly for the price, just like at a city market in real life.
As I understand it, these video cards are getting scarcer because the chips are most likely ex-mining, and it is unclear where the Samsung or Hynix memory comes from; perhaps also from broken video cards. Since the mining boom was relatively long ago, all the Frankensteins were naturally sold off, and now you can only find leftovers from warehouses. So similar FrankenGPUs can only be found on Chinese marketplaces.
This is NOT the product listing I bought from, and it's $112 more expensive here, but mine has exactly the same article number [FXKCP00049]. Just for example: https://www.ozon.ru/product/dicasver-videokarta-geforce-rtx-3070-16-gb-fxkcp00049-1393041082/
I was lucky with my GPU: I got Samsung chips and was able to overclock the memory by +1000 MHz.
Could someone check whether AI models can be run locally on cards like the 3060M 12GB with FrankenDriver? The easiest way is to try the standalone LMstudio.ai application (closed source), but if someone has experience running LLMs locally with open-source applications, they could try it with e.g. oobabooga. Do you get any errors, or do they start normally?
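As a sanity check on whether 12 GB should be enough before anyone tests: a crude VRAM budget. The 7B GGUF file sizes are approximate, and the KV-cache and runtime-overhead figures are my assumptions, not measured values:

```python
def fits_in_vram(model_gb: float, vram_gb: float,
                 kv_cache_gb: float = 1.0, overhead_gb: float = 1.0) -> bool:
    """Crude check: model weights + KV cache + runtime overhead vs. VRAM."""
    return model_gb + kv_cache_gb + overhead_gb <= vram_gb


# Approximate 7B GGUF sizes: Q4_K_M ~4.4 GB, Q8_0 ~7.7 GB
for name, size_gb in [("Q4_K_M", 4.4), ("Q8_0", 7.7)]:
    print(f"7B {name} fits in 12 GB:", fits_in_vram(size_gb, 12.0))
```

By this estimate both common 7B quantizations should fit on a 12 GB card with room to spare, so any failure would point at the driver rather than memory capacity.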