**Closed**: Katehuuh closed this issue 6 months ago.
No dataset issue: `Running tokenizer on dataset` completes successfully. The problem is resuming the LoRA adapter with `--adapter_name_or_path saves\LLaMA2-13B-Chat\lora\SDprompt_ext`, because its `adapter_config.json` is empty. A quick validation sketch follows.
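For anyone hitting the same thing: a cheap way to confirm the adapter folder is the culprit before launching a long run is to validate `adapter_config.json` up front. A minimal sketch, assuming a standard PEFT LoRA layout; the helper name `adapter_is_resumable` is mine, not part of LLaMA-Factory:

```python
import json
from pathlib import Path

def adapter_is_resumable(adapter_dir: str) -> bool:
    """Heuristic check: adapter_config.json must exist, be non-empty, and parse as JSON."""
    cfg = Path(adapter_dir) / "adapter_config.json"
    if not cfg.is_file() or cfg.stat().st_size == 0:
        return False  # missing or 0-byte config: resume would fail
    try:
        return bool(json.loads(cfg.read_text(encoding="utf-8")))
    except json.JSONDecodeError:
        return False  # file exists but is corrupt

# Path from the failing run above:
print(adapter_is_resumable(r"saves\LLaMA2-13B-Chat\lora\SDprompt_ext"))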
### Reminder

### Reproduction
Error seems to show on any DPO dataset, not just `dpo_mix_en`. Full log below, from a clean install at commit d9cdddd:
```cmd
(venv) C:\LLaMA-Factory>set CUDA_VISIBLE_DEVICES=0 && llamafactory-cli train --stage orpo --do_train True --model_name_or_path C:\LLaMA-Factory\checkpoints\Llama-2-13b-chat-hf --adapter_name_or_path saves\LLaMA2-13B-Chat\lora\SDprompt_ext --finetuning_type lora --quantization_bit 4 --template alpaca --rope_scaling linear --flash_attn fa2 --dataset_dir data --dataset dpo_mix_en --cutoff_len 4096 --learning_rate 5e-05 --num_train_epochs 1.0 --max_samples 100000 --per_device_train_batch_size 1 --gradient_accumulation_steps 1 --lr_scheduler_type cosine --max_grad_norm 1.0 --logging_steps 5 --save_steps 1000 --warmup_steps 1000 --optim adamw_torch --output_dir saves\LLaMA2-13B-Chat\lora\SDprompt_ext_orpo --bf16 True --lora_rank 32 --lora_dropout 0.15 --lora_target all --plot_loss True
bin C:\LLaMA-Factory\venv\lib\site-packages\bitsandbytes\libbitsandbytes_cuda121.dll
[2024-05-08 13:38:57,911] [INFO] [real_accelerator.py:158:get_accelerator] Setting ds_accelerator to cuda (auto detect)
W0508 13:38:58.160000 11952 torch\distributed\elastic\multiprocessing\redirects.py:27] NOTE: Redirects are currently not supported in Windows or MacOs.
05/08/2024 13:38:58 - WARNING - llmtuner.hparams.parser - We recommend enable `upcast_layernorm` in quantized training.
05/08/2024 13:38:58 - INFO - llmtuner.hparams.parser - Process rank: 0, device: cuda:0, n_gpu: 1, distributed training: False, compute dtype: torch.bfloat16
[INFO|tokenization_utils_base.py:2085] 2024-05-08 13:38:58,917 >> loading file tokenizer.model
[INFO|tokenization_utils_base.py:2085] 2024-05-08 13:38:58,917 >> loading file tokenizer.json
[INFO|tokenization_utils_base.py:2085] 2024-05-08 13:38:58,917 >> loading file added_tokens.json
[INFO|tokenization_utils_base.py:2085] 2024-05-08 13:38:58,918 >> loading file special_tokens_map.json
[INFO|tokenization_utils_base.py:2085] 2024-05-08 13:38:58,918 >> loading file tokenizer_config.json
05/08/2024 13:38:58 - INFO - llmtuner.data.template - Add pad token: </s>
05/08/2024 13:38:58 - INFO - llmtuner.data.loader - Loading dataset hiyouga/DPO-En-Zh-20k...
Downloading readme: 100%|█████████████████████████████████████████████████████████████████| 1.63k/1.63k [00:00, ?B/s] Downloading data: 100%|███████████████████████████████████████████████████████████| 49.2M/49.2M [00:00<00:00, 53.0MB/s] Generating train split: 10000 examples [00:00, 14406.95 examples/s] Converting format of dataset: 100%|████████████████████████████████████| 10000/10000 [00:00<00:00, 21727.60 examples/s] Running tokenizer on dataset: 100%|██████████████████████████████████████| 10000/10000 [00:27<00:00, 362.83 examples/s] prompt_ids: [13866, 338, 385, 15278, 393, 16612, 263, 3414, 29889, 14350, 263, 2933, 393, 7128, 2486, 1614, 2167, 278, 2009, 29889, 13, 13, 835, 2799, 4080, 29901, 13, 10376, 664, 373, 263, 5381, 3814, 363, 263, 5381, 393, 8128, 5786, 304, 15724, 470, 25700, 18987, 5866, 29915, 29879, 22162, 1446, 13, 13, 2277, 29937, 13291, 29901, 13, 306, 29889, 28841, 6991, 5219, 13, 13, 1123, 23367, 4649, 276, 1446, 338, 263, 2989, 29899, 5509, 22162, 271, 18987, 5001, 29892, 4266, 5281, 297, 4969, 443, 1454, 657, 2371, 322, 3710, 1680, 292, 5866, 29915, 29879, 22162, 1446, 363, 15724, 322, 25700, 29889, 8680, 10655, 338, 304, 3867, 409, 314, 2222, 18987, 29892, 13013, 29892, 322, 8225, 310, 4327, 1230, 22162, 1446, 393, 9926, 261, 7333, 14321, 29892, 1532, 29899, 915, 292, 29892, 322, 3957, 4249, 5866, 29889, 13, 13, 2687, 29889, 6938, 6811, 1493, 13, 13, 29909, 29889, 6938, 4408, 29901, 830, 23367, 4649, 276, 1446, 13, 13, 29933, 29889, 6938, 3767, 12425, 29901, 28873, 2718, 3097, 6938, 313, 2208, 29907, 29897, 13, 13, 29907, 29889, 7460, 414, 29901, 518, 10858, 4408, 29898, 29879, 29897, 669, 830, 6591, 28224, 5597, 29962, 13, 13, 29928, 29889, 17015, 29901, 518, 16885, 29892, 4306, 29962, 13, 13, 5287, 29889, 15538, 4587, 571, 287, 13, 13, 29909, 29889, 8701, 4649, 276, 271, 1858, 9450, 29901, 1334, 674, 664, 16467, 411, 13154, 304, 2874, 322, 3814, 22162, 1446, 12464, 4395, 304, 1009, 2702, 4225, 322, 1203, 3145, 29889, 13, 13, 29933, 29889, 9548, 434, 669, 7255, 510, 1545, 362, 6726, 292, 29901, 1334, 674, 11592, 278, 4922, 6003, 434, 322, 24803, 800, 363, 1269, 22162, 271, 29892, 5662, 3864, 263, 25561, 322, 8681, 8491, 5177, 29889, 13, 13, 29907, 29889, 13414, 3189, 536, 3381, 29901, 1334, 674, 564, 3881, 263, 12875, 310, 664, 845, 3554, 29892, 3031, 262, 1503, 29892, 322, 28709, 1288, 14188, 29892, 1316, 408, 343, 14895, 29892, 1612, 7018, 29892, 3815, 29899, 25237, 24472, 3476, 267, 29892, 322, 907, 1230, 664, 845, 3554, 29889, 13, 13, 29928, 29889, 2088, 342, 5013, 21079, 669, 14184, 309, 277, 4097, 29901, 1334, 674, 2752, 322, 14821, 411, 17924, 17838, 7726, 414, 322, 16089, 277, 4097, 29892, 26118, 411, 278, 22162, 271, 29915, 29879, 10929, 322, 1203, 3145, 29889, 13, 13, 29923, 29889, 315, 1008, 292, 669, 2191, 284, 1858, 9450, 29901, 1334, 674, 564, 3881, 363, 9045, 29891, 29892, 628, 14803, 29892, 322, 2888, 13902, 592, 284, 3987, 29892, 24803, 1218, 652, 300, 653, 25091, 322, 5821, 2063, 29889, 13, 13, 29943, 29889, 4485, 15133, 669, 9705, 8194, 29901, 1334, 674, 6985, 13154, 297, 2504, 11427, 1009, 22162, 1446, 1549, 5164, 18196, 29892, 3704, 5264, 5745, 29892, 4876, 9999, 292, 29892, 322, 16650, 293, 22056, 14587, 29889, 13, 13, 29954, 29889, 7038, 657, 15057, 29901, 1334, 674, 10933, 278, 22162, 271, 29915, 29879, 23562, 322, 9801, 393, 599, 1518, 11259, 526, 16112, 20458, 322, 11819, 287, 29889, 13, 13, 5667, 29889, 28794, 24352, 13, 13, 29909, 29889, 17157, 28794, 29901, 8680, 3646, 9999, 7805, 29901, 13, 29896, 29889, 10152, 29899, 29888, 542, 
3880, 25700, 322, 5381, 267, 13, 29906, 29889, 5674, 2264, 322, 7333, 5849, 6471, 13, 29941, 29889, 10152, 29915, 29879, 28127, 322, 10257, 27733, 13, 29946, 29889, 1894, 23352, 5866, 25738, 304, 2894, 675, 22162, 1446, 363, 7875, 470, 4066, 29899, 6707, 6471, 13, 13, 29933, 29889, 28794, 1605, 1975, 29901, 13, 29896, 29889, 402, 798, 292, 4066, 297, 1532, 2264, 322, 1583, 29899, 18020, 13, 29906, 29889, 512, 1037, 1463, 9667, 363, 5866, 29915, 29879, 3710, 1680, 358, 322, 28127, 28602, 1907, 13, 29941, 29889, 319, 13521, 363, 5412, 322, 4327, 1230, 27482, 13, 13, 29907, 29889, 24620, 2105, 24352, 29901, 518, 1293, 322, 27599, 596, 1667, 5100, 17259, 29962, 13, 13, 29963, 29889, 4485, 15133, 669, 28389, 3767, 8963, 13, 13, 29909, 29889, 18007, 292, 29901, 2661, 370, 1674, 263, 4549, 29892, 5412, 29892, 322, 5936, 13902, 14982, 10110, 393, 9432, 29879, 278, 1819, 322, 10655, 310, 830, 23367, 4649, 276, 1446, 29889, 13, 13, 29933, 29889, 13253, 29901, 6204, 263, 1998, 1474, 5929, 12818, 322, 1404, 29899, 18326, 368, 4700, 393, 1510, 11436, 1749, 5786, 29892, 3132, 1243, 20170, 616, 29879, 29892, 322, 4940, 22162, 1446, 29889, 13, 13, 29907, 29889, 10307, 8213, 29901, 2201, 482, 411, 1749, 3646, 20026, 1549, 5264, 5745, 21796, 29892, 1316, 408, 2799, 14442, 29892, 13327, 29892, 322, 28547, 797, 29889, 13, 13, 29928, 29889, 8527, 292, 669, 3455, 8397, 14587, 29901, 1152, 479, 16650, 293, 22056, 14587, 411, 5866, 29915, 29879, 25700, 29892, 1532, 2264, 1326, 11376, 29892, 322, 13787, 22543, 297, 1749, 3646, 9999, 29889, 13, 13, 29923, 29889, 22608, 4485, 15133, 29901, 8878, 385, 4876, 1051, 322, 1653, 4943, 9763, 1026, 2153, 304, 3013, 21696, 2596, 23388, 1048, 701, 11506, 22162, 1446, 322, 4959, 29889, 13, 13, 29943, 29889, 5236, 6376, 800, 29901, 951, 19698, 5745, 23746, 322, 17838, 12618, 3460, 28602, 1907, 304, 7910, 1749, 26401, 322, 6625, 4127, 29889, 13, 13, 18118, 29889, 6607, 800, 669, 15057, 13, 13, 29909, 29889, 8583, 3767, 12425, 29901, 22402, 16178, 322, 5544, 747, 9770, 363, 278, 10643, 3815, 29892, 3704, 16538, 322, 9999, 292, 29892, 22162, 271, 18987, 29892, 322, 6931, 29889, 13, 13, 29933, 29889, 5244, 1731, 669, 10554, 267, 29901, 2661, 370, 1674, 8543, 6757, 322, 10174, 304, 9801, 10597, 322, 5335, 873, 8225, 310, 22162, 271, 18987, 9595, 29889, 13, 13, 29907, 29889, 12477, 6376, 800, 4034, 15057, 29901, 1954, 2037, 263, 15600, 29924, 1788, 304, 10933, 3132, 2472, 322, 12084, 17583, 29889, 13, 13, 29963, 2687, 29889, 4231, 273, 1455, 1019, 24247, 13, 13, 29909, 29889, 830, 9947, 13763, 29879, 29901, 4451, 1220, 3806, 337, 9947, 20873, 29892, 1316, 408, 22162, 271, 18987, 1238, 267, 29892, 844, 6847, 29892, 322, 22056, 14587, 29889, 13, 13, 29933, 29889, 12027, 11259, 29901, 2661, 6490, 373, 17696, 1518, 11259, 29892, 3704, 4497, 4314, 29892, 8034, 2913, 29892, 9999, 292, 29892, 322, 9850, 21544, 29889, 13, 13, 29907, 29889, 28301, 29899, 29923, 854, 24352, 29901, 20535, 403, 278, 2867, 29899, 11884, 1298, 304, 8161, 746, 830, 23367, 4649, 276, 1446, 674, 4953, 2600, 8270, 29889, 13, 13, 29928, 29889, 315, 1161, 22787, 1019, 24247, 29901, 9133, 680, 263, 13173, 274, 1161, 4972, 29821, 579, 29892, 3704, 23483, 630, 337, 9947, 322, 1518, 11259, 29892, 304, 9801, 18161, 25806, 29889, 13, 13, 29963, 5287, 29889, 16367, 402, 798, 386, 669, 12027, 9454, 13, 13, 29909, 29889, 317, 1052, 292, 2, 29871, 13, 13, 835, 2799, 4080, 29901, 13, 19878, 13, 13, 2277, 29937, 13291, 29901, 13] prompt: Below is an instruction that describes a task. 
Write a response that appropriately completes the request. ### Instruction: lets work on a business plan for a business that provides services to individuals or organizations planning women's retreats ### Response: I. Executive Summary Recharge Retreats is a full-service retreat planning company, specializing in creating unforgettable and empowering women's retreats for individuals and organizations. Our mission is to provide seamless planning, organization, and execution of transformative retreats that foster personal growth, well-being, and connection among women. II. Company Overview A. Company Name: Recharge Retreats B. Company Structure: Limited Liability Company (LLC) C. Founders: [Your Name(s) & Relevant Experience] D. Location: [City, State] III. Services Offered A. Custom Retreat Planning: We will work closely with clients to design and plan retreats tailored to their specific needs and objectives. B. Venue & Accommodation Booking: We will secure the perfect venue and accommodations for each retreat, ensuring a comfortable and inspiring environment. C. Activity Coordination: We will arrange a variety of workshops, seminars, and recreational activities, such as yoga, meditation, team-building exercises, and creative workshops. D. Guest Speakers & Facilitators: We will source and coordinate with expert guest speakers and facilitators, aligned with the retreat's theme and objectives. E. Catering & Meal Planning: We will arrange for healthy, delicious, and customizable meal options, accommodating dietary restrictions and preferences. F. Marketing & Promotion: We will assist clients in promoting their retreats through various channels, including social media, email marketing, and strategic partnerships. G. Budget Management: We will manage the retreat's budget and ensure that all expenses are carefully planned and monitored. IV. Market Analysis A. Target Market: Our target market includes: 1. Women-focused organizations and businesses 2. Wellness and personal development groups 3. Women's networking and professional associations 4. Individual women seeking to organize retreats for friends or interest-based groups B. Market Trends: 1. Growing interest in wellness and self-care 2. Increased demand for women's empowerment and networking opportunities 3. A desire for unique and transformative experiences C. Competitor Analysis: [List and analyze your main competitors] V. Marketing & Sales Strategy A. Branding: Establish a strong, unique, and recognizable brand identity that reflects the values and mission of Recharge Retreats. B. Website: Create a visually appealing and user-friendly website that showcases our services, client testimonials, and past retreats. C. Social Media: Engage with our target audience through social media platforms, such as Instagram, Facebook, and LinkedIn. D. Networking & Partnerships: Forge strategic partnerships with women's organizations, wellness providers, and influencers in our target market. E. Email Marketing: Build an email list and create regular newsletters to keep subscribers informed about upcoming retreats and events. F. Public Relations: Leverage media coverage and guest blogging opportunities to increase our visibility and credibility. VI. Operations & Management A. Team Structure: Define roles and responsibilities for the management team, including sales and marketing, retreat planning, and operations. B. Workflow & Processes: Establish efficient systems and processes to ensure smooth and timely execution of retreat planning tasks. C. 
Client Relationship Management: Implement a CRM system to manage client information and communication effectively. VII. Financial Projections A. Revenue Streams: Outline expected revenue streams, such as retreat planning fees, commissions, and partnerships. B. Expenses: Estimate ongoing expenses, including salaries, office space, marketing, and travel costs. C. Break-Even Analysis: Calculate the break-even point to determine when Recharge Retreats will become profitable. D. Cash Flow Projections: Provide a detailed cash flow forecast, including anticipated revenue and expenses, to ensure financial stability. VIII. Future Growth & Expansion A. Scaling ### Instruction: continue ### Response: chosen_ids: [1094, 830, 23367, 4649, 276, 1446, 25088, 29892, 591, 3814, 304, 7985, 1749, 3815, 411, 5684, 1741, 6615, 4097, 322, 9999, 292, 4266, 2879, 29889, 1763, 24803, 403, 445, 14321, 29892, 591, 674, 10127, 385, 8034, 2913, 393, 9926, 414, 24771, 322, 907, 28157, 4249, 3815, 5144, 29889, 1334, 674, 884, 7536, 277, 675, 7592, 664, 3987, 322, 4840, 21354, 12084, 8492, 304, 2304, 7592, 3815, 5144, 29889, 13, 13, 797, 6124, 29892, 591, 12242, 304, 6894, 1598, 1749, 22162, 271, 5957, 886, 304, 274, 1008, 304, 1422, 963, 267, 29892, 1316, 408, 6651, 9850, 22162, 1446, 29892, 1532, 2264, 22162, 1446, 363, 11825, 29892, 322, 20954, 8694, 484, 952, 29889, 1334, 674, 16508, 304, 18096, 411, 2999, 6003, 1041, 4822, 5164, 14354, 3186, 8157, 304, 3867, 263, 16984, 3464, 310, 22162, 271, 27482, 29889, 13, 13, 4806, 674, 884, 7985, 1749, 9999, 292, 14231, 304, 6159, 263, 25734, 20026, 29892, 5256, 5281, 14982, 14060, 545, 1549, 13787, 2265, 22056, 14587, 29892, 2902, 993, 616, 9999, 292, 29892, 322, 6901, 4959, 29889, 1334, 674, 3133, 5794, 380, 4401, 304, 2143, 457, 1749, 5786, 2729, 373, 3132, 16705, 322, 12439, 13661, 534, 1975, 304, 7744, 1749, 5957, 886, 304, 278, 3926, 29899, 305, 9776, 4225, 322, 5821, 2063, 310, 1749, 3646, 9999, 29889, 13, 13, 2887, 591, 6287, 29892, 591, 674, 3933, 19355, 304, 318, 561, 1025, 292, 1749, 10655, 304, 3710, 1680, 322, 4511, 5866, 1549, 4327, 1230, 22162, 271, 27482, 393, 7536, 277, 675, 7333, 14321, 29892, 1532, 2264, 29892, 322, 3957, 29889, 2] chosen: As Recharge Retreats grows, we plan to expand our team with additional event coordinators and marketing specialists. To accommodate this growth, we will establish an office space that fosters collaboration and creativity among team members. We will also prioritize remote work options and streamlined communication tools to support remote team members. In addition, we aim to diversify our retreat offerings to cater to different themes, such as solo travel retreats, wellness retreats for parents, and spiritual journeys. We will seek to partner with multiple venues across various locations worldwide to provide a diverse range of retreat experiences. We will also expand our marketing efforts to reach a wider audience, maximizing brand exposure through influencer partnerships, experiential marketing, and virtual events. We will continuously strive to refine our services based on client feedback and identify industry trends to adapt our offerings to the ever-changing needs and preferences of our target market. As we scale, we will remain committed to upholding our mission to empower and connect women through transformative retreat experiences that prioritize personal growth, wellness, and connection. 
rejected_ids: [319, 29889, 317, 1052, 292, 29901, 13, 13, 29896, 29889, 4587, 571, 263, 23272, 895, 1904, 29901, 29408, 4045, 304, 1722, 322, 21994, 830, 23367, 4649, 276, 1446, 23272, 4637, 297, 1422, 12786, 29892, 7985, 292, 1749, 6159, 322, 337, 9947, 7037, 29889, 13, 29906, 29889, 10682, 7395, 21888, 322, 1302, 9733, 11104, 29901, 6204, 1462, 944, 653, 17869, 20873, 491, 27032, 6694, 322, 2304, 304, 15724, 322, 25700, 8852, 297, 2894, 5281, 1009, 1914, 22162, 1446, 29889, 13, 13, 29933, 29889, 360, 1536, 2450, 29901, 13, 13, 29896, 29889, 12027, 392, 2669, 5957, 886, 29901, 10056, 4417, 5684, 22162, 271, 29899, 12817, 5786, 29892, 1316, 408, 1302, 9733, 29892, 373, 17696, 2304, 29892, 22162, 271, 17983, 29892, 470, 22162, 271, 11782, 3277, 3987, 29889, 13, 29906, 29889, 17157, 716, 2791, 1691, 29901, 4358, 4858, 292, 1749, 3646, 9999, 304, 3160, 916, 1261, 1946, 1199, 29892, 1316, 408, 1757, 29892, 13175, 29892, 470, 2702, 4066, 6471, 29889, 13, 13, 29907, 29889, 4623, 12027, 9454, 29901, 13, 13, 29896, 29889, 3455, 1089, 411, 1887, 25700, 29901, 13435, 3717, 403, 411, 5866, 29915, 29879, 25700, 28177, 304, 3814, 6121, 22162, 1446, 322, 7985, 1749, 5534, 10122, 29889, 13, 29906, 29889, 922, 1416, 5220, 292, 28602, 1907, 29901, 349, 1295, 434, 867, 1934, 29892, 13258, 1860, 29892, 322, 658, 550, 304, 1436, 749, 6121, 22162, 1446, 322, 2304, 13184, 297, 716, 2791, 1691, 29889, 13, 13, 2059, 1494, 445, 5381, 3814, 29892, 830, 23367, 4649, 276, 1446, 756, 278, 7037, 304, 4953, 263, 8236, 13113, 310, 5866, 29915, 29879, 22162, 271, 5786, 29892, 3710, 1680, 292, 15724, 322, 25700, 304, 9926, 261, 14321, 29892, 3957, 29892, 322, 7333, 13852, 1549, 26959, 519, 322, 2834, 29899, 264, 5403, 3277, 27482, 29889, 2] rejected: A. Scaling: 1. Offer a franchise model: Allow others to open and operate Recharge Retreats franchises in different regions, expanding our reach and revenue potential. 2. Develop online courses and coaching programs: Create supplementary income streams by offering training and support to individuals and organizations interested in organizing their own retreats. B. Diversification: 1. Expand service offerings: Consider adding additional retreat-related services, such as coaching, ongoing support, retreat evaluation, or retreat financing options. 2. Target new markets: Broadening our target market to include other demographics, such as men, families, or specific interest groups. C. International Expansion: 1. Partner with local organizations: Collaborate with women's organizations abroad to plan international retreats and expand our global presence. 2. Seek funding opportunities: Pursue grants, investments, and loans to finance international retreats and support expansion in new markets. By following this business plan, Recharge Retreats has the potential to become a leading provider of women's retreat services, empowering individuals and organizations to foster growth, connection, and personal transformation through memorable and life-enhancing experiences. 
[INFO|configuration_utils.py:724] 2024-05-08 13:39:31,702 >> loading configuration file C:\LLaMA-Factory\checkpoints\Llama-2-13b-chat-hf\config.json
[INFO|configuration_utils.py:789] 2024-05-08 13:39:31,703 >> Model config LlamaConfig {
  "_name_or_path": "C:\\LLaMA-Factory\\checkpoints\\Llama-2-13b-chat-hf",
  "architectures": [
    "LlamaForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "hidden_act": "silu",
  "hidden_size": 5120,
  "initializer_range": 0.02,
  "intermediate_size": 13824,
  "max_position_embeddings": 4096,
  "model_type": "llama",
  "num_attention_heads": 40,
  "num_hidden_layers": 40,
  "num_key_value_heads": 40,
  "pretraining_tp": 1,
  "rms_norm_eps": 1e-05,
  "rope_scaling": null,
  "rope_theta": 10000.0,
  "tie_word_embeddings": false,
  "torch_dtype": "float16",
  "transformers_version": "4.40.2",
  "use_cache": true,
  "vocab_size": 32000
}

05/08/2024 13:39:31 - WARNING - llmtuner.model.utils.rope - Input length is smaller than max length. Consider increase input length.
05/08/2024 13:39:31 - INFO - llmtuner.model.utils.rope - Using linear scaling strategy and setting scaling factor to 1.0
05/08/2024 13:39:31 - INFO - llmtuner.model.utils.quantization - Quantizing model to 4 bit.
[INFO|modeling_utils.py:3426] 2024-05-08 13:39:31,735 >> loading weights file C:\LLaMA-Factory\checkpoints\Llama-2-13b-chat-hf\model.safetensors.index.json
[INFO|modeling_utils.py:1494] 2024-05-08 13:39:31,747 >> Instantiating LlamaForCausalLM model under default dtype torch.bfloat16.
[INFO|configuration_utils.py:928] 2024-05-08 13:39:31,750 >> Generate config GenerationConfig {
  "bos_token_id": 1,
  "eos_token_id": 2
}

Loading checkpoint shards: 100%|████████████████████████████████████████████████████████| 3/3 [13:03<00:00, 261.10s/it]
[INFO|modeling_utils.py:4170] 2024-05-08 13:52:35,427 >> All model checkpoint weights were used when initializing LlamaForCausalLM.
[INFO|modeling_utils.py:4178] 2024-05-08 13:52:35,427 >> All the weights of LlamaForCausalLM were initialized from the model checkpoint at C:\LLaMA-Factory\checkpoints\Llama-2-13b-chat-hf.
If your task is similar to the task the model of the checkpoint was trained on, you can already use LlamaForCausalLM for predictions without further training.
[INFO|modeling_utils.py:3719] 2024-05-08 13:52:35,430 >> Generation config file not found, using a generation config created from the model config.
[WARNING|quantizer_bnb_4bit.py:307] 2024-05-08 13:52:35,823 >> You are calling `save_pretrained` to a 4-bit converted model, but your `bitsandbytes` version doesn't support it. If you want to save 4-bit models, make sure to have `bitsandbytes>=0.41.3` installed.
05/08/2024 13:52:35 - INFO - llmtuner.model.utils.checkpointing - Gradient checkpointing enabled.
05/08/2024 13:52:35 - INFO - llmtuner.model.utils.attention - Using FlashAttention-2 for faster training and inference.
05/08/2024 13:52:35 - INFO - llmtuner.model.adapter - Fine-tuning method: LoRA
Traceback (most recent call last):
  File "C:\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Python\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\LLaMA-Factory\venv\Scripts\llamafactory-cli.exe\__main__.py", line 7, in <module>
```
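The traceback is cut off above, but the timing (immediately after `Fine-tuning method: LoRA`) points at adapter loading: when `--adapter_name_or_path` is set, PEFT parses `adapter_config.json`, and a 0-byte file fails JSON parsing before any training step runs. A minimal sketch reproducing that failure mode with a throwaway directory; it assumes `peft` is installed, and the empty file stands in for the broken `SDprompt_ext` adapter:

```python
import os
import tempfile

from peft import PeftConfig

# Stand-in for saves\LLaMA2-13B-Chat\lora\SDprompt_ext with an empty config file.
adapter_dir = tempfile.mkdtemp()
open(os.path.join(adapter_dir, "adapter_config.json"), "w").close()  # 0-byte file

try:
    PeftConfig.from_pretrained(adapter_dir)  # reads adapter_config.json
except Exception as e:  # expected: a JSON decode error, long before training starts
    print(f"{type(e).__name__}: {e}")
```

If that matches what you see, re-saving or re-exporting the adapter so that `adapter_config.json` is written out correctly should let the resume proceed.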
### Expected behavior

No response

### System Info

No response

### Others

No response