AI-Commandos / LLaMa2lang

Convenience scripts to finetune (chat-)LLaMa3 and other models for any language
Apache License 2.0
283 stars 32 forks

Question or bug #31

Closed IzzyHibbert closed 10 months ago

IzzyHibbert commented 10 months ago

Branch Main branch

Environment I am using Colab

RAM/vRAM 16 GB RAM and a V100

Script with parameters I am using translate_oasst.py with two extra arguments (in addition to target_lang and checkpoint_location): --use_madlad --madlad_quant, in order to test the new MADLAD support. I made no changes to the file translate_oasst.py.
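
(Concretely, the invocation was along the lines of `python translate_oasst.py <target_lang> <checkpoint_location> --use_madlad --madlad_quant`, with the actual target language and checkpoint path in place of the placeholders; the argument order shown here is only illustrative.)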

Data layout or HF dataset Dataset is OpenAssistant/oasst1

Problem description/Question I am trying to run the translation using the new MADLAD model. After I start the script, I get the following error message and it stops.

0% 0/88838 [00:00<?, ?it/s] Got 39283 records for source language en, skipping 0
2024-01-08 11:57:10.576859: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-01-08 11:57:10.576969: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-01-08 11:57:10.708224: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2024-01-08 11:57:12.917922: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
0% 20/88838 [02:32<158:08:22, 6.41s/it]
[... the script then prints the full batch of English source texts it was trying to translate ...]
0% 20/88838 [04:33<337:34:50, 13.68s/it]
Traceback (most recent call last):
  File "/content/drive/MyDrive/LLM_NewLanguage/translate_oasst.py", line 232, in <module>
    main()
  File "/content/drive/MyDrive/LLM_NewLanguage/translate_oasst.py", line 203, in main
    translated_batch = batch_translate_madlad(texts_to_translate, target_lang)
  File "/content/drive/MyDrive/LLM_NewLanguage/translate_oasst.py", line 101, in batch_translate_madlad
    raise Exception("Failed to translate properly")
Exception: Failed to translate properly

ErikTromp commented 10 months ago

Yeah madlad was broken on special characters. I fixed this now, try again.

IzzyHibbert commented 10 months ago

Yeah madlad was broken on special characters. I fixed this now, try again.

Yes, I confirm that it works. For these translation steps, based on your experience, do you think 12 GB of VRAM could be enough? (In Colab I see it using under 6 GB of VRAM at the moment, just after starting, so I was wondering whether I could translate on-prem instead.) Thanks

ErikTromp commented 10 months ago

You can always get it to work by lowering batch_size and/or using quantization, but both unfortunately slow down translation. We will be looking into adding a lot more translation models to pick from and making it easy to switch between them, so hopefully in the (near) future this will become easier.
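
Roughly, what the quantized MADLAD path boils down to is something like the sketch below (simplified, not the actual translate_oasst.py code; the checkpoint name, the Italian target token and the tiny batch are just examples):

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, BitsAndBytesConfig

# Illustrative assumptions: checkpoint name, target-language token and batch size.
model_name = "google/madlad400-3b-mt"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(
    model_name,
    device_map="auto",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # roughly what --madlad_quant enables
)

# MADLAD expects a "<2xx>" target-language prefix on every source text.
texts = ["Can you clarify the analogy?", "I want to start doing astrophotography as a hobby."]
inputs = tokenizer(["<2it> " + t for t in texts], return_tensors="pt", padding=True).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```

Smaller batches plus 8-bit weights keep peak VRAM low (consistent with the under-6 GB you are seeing), at the cost of more generate calls and slower decoding.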

For now I would say that if your (target) language is well supported in OPUS (meaning there is a direct model from at least English and Spanish; check HF), you should go with those. In other cases you have to opt for MADLAD, but tweak the sizing and quantization (--madlad_quant for 8-bit and --madlad_quant4 for 4-bit).
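
For the OPUS route, the equivalent is basically a one-liner (a sketch, assuming Italian as the target; check the Hub for a Helsinki-NLP/opus-mt-en-<your language> model before committing to this path):

```python
from transformers import pipeline

# Assumption: a direct en->it OPUS model exists; substitute your own language pair.
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-it")
result = translator("Can you write a formal letter to introduce Jeff Bezos to a customer?", max_length=512)
print(result[0]["translation_text"])
```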

IzzyHibbert commented 10 months ago

For now I would say that if your (target) language is well supported in OPUS (meaning there is a direct model from at least English and Spanish; check HF), you should go with those.

Thanks for the details. As for "translating the dataset": yes, last week I went through all the steps and used what you guys already made (big thanks!) all the way through, without doing this initial training step myself, reusing your QLoRA instead. The thing is that this morning I had a chance to run some tests with the resulting model and it was not really good. I guess this might depend on the dataset rather than the translation (I actually saw there is a bigger version of the same OpenAssistant data, oasst2, but that probably requires much more translation time).

Last question: if I use another base model, say one based on Mistral 7B, I guess it's as simple as changing the base model argument, with nothing else to pay attention to?

(Apologies for going off topic; the ticket itself is fine and I am about to close it.)

ErikTromp commented 10 months ago

The dataset is actually quite okay, but we have had some issues with the translation of a few languages so far, so we will have to redo a few in the coming weeks.

As for oasst2: yes, it takes longer, but you can already do that using the current scripts if you want to. Same for Mistral.

Mixtral we are not sure about yet, but we will support it in the future too.
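
To give an idea of what "just changing the base model" means in a generic QLoRA setup (a simplified sketch, not our finetune script; the model id and LoRA settings below are only examples):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Swapping to Mistral 7B is essentially a different model id here; prompt/chat
# templates can still differ between base models, so keep an eye on those.
base_model = "mistralai/Mistral-7B-v0.1"

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    device_map="auto",
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
)
lora = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)
model.print_trainable_parameters()
```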