Yxxxb / VoCo-LLaMA

This repo is the official implementation of "VoCo-LLaMA: Towards Vision Compression with Large Language Models".
https://yxxxb.github.io/VoCo-LLaMA-page/
Apache License 2.0

Weird loss curve #11

Closed: Andrew-Zhang closed this issue 4 months ago

Andrew-Zhang commented 4 months ago

Hello! I followed the instructions to train VoCo-LLaMA, and I get the following loss curve.

[Image: training loss curve]

Is it normal to have spikes where the loss drops to around 0.3 roughly every 20 steps? Thanks!

Yxxxb commented 4 months ago

Hi!

I encountered a similar situation when training LLaVA, and it seems to be a common phenomenon in large vision-language models rather than something caused by VoCo-LLaMA. That is the most plausible explanation I have found so far. Reducing the learning rate or using LoRA may alleviate it.
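
For reference, a minimal sketch of the two mitigations mentioned above (a lower learning rate and LoRA fine-tuning), using the Hugging Face `peft` library. This is not the VoCo-LLaMA training script; the base checkpoint name, rank, alpha, and target module names are assumptions for a LLaMA-style model and would need to be matched to your actual setup.

```python
# Sketch only: lowering the learning rate and enabling LoRA to tame loss spikes.
# Checkpoint name and LoRA hyperparameters below are assumed, not taken from the repo.
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "lmsys/vicuna-7b-v1.5",  # hypothetical base checkpoint
    torch_dtype=torch.bfloat16,
)

lora_config = LoraConfig(
    r=64,                    # low-rank dimension (assumed value)
    lora_alpha=128,          # scaling factor (assumed value)
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # attention projections in a LLaMA-style model
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)  # only the LoRA adapters remain trainable

# A learning rate lower than the full fine-tuning default can help smooth out spikes.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
```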

Andrew-Zhang commented 4 months ago

Ok thanks!