Deploying a LoRA adapter whose weight shapes do not match the dimensions of the declared base model fails with an error like:

Invalid LoRA weight model.layers.0.self_attn.q_proj.lora_A.weight shape: torch.Size([16, 4096]), expected (16, 8192)
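If it is not obvious which dimensions the adapter actually contains, a few lines of Python can dump its lora_A shapes and compare them with the base model's hidden size before uploading. This is only a local sanity-check sketch: the file names adapter_model.safetensors and config.json are the PEFT / Hugging Face defaults assumed here, not something the Fireworks tooling requires.

```python
# Sketch: compare a PEFT-style adapter's lora_A shapes with the base model's
# hidden size. File names below are assumptions (PEFT defaults), not required.
import json

from safetensors import safe_open

ADAPTER_PATH = "adapter_model.safetensors"  # hypothetical local adapter file
BASE_CONFIG_PATH = "config.json"            # hypothetical base-model config

with open(BASE_CONFIG_PATH) as f:
    hidden_size = json.load(f)["hidden_size"]  # 4096 for Llama 3.1 8B

with safe_open(ADAPTER_PATH, framework="pt") as tensors:
    for name in tensors.keys():
        # lora_A of the attention projections takes the hidden state as input,
        # so its second dimension must equal the base model's hidden size.
        if ".self_attn." in name and name.endswith("lora_A.weight"):
            rank, in_features = tensors.get_tensor(name).shape
            if in_features != hidden_size:
                print(f"{name}: shape ({rank}, {in_features}) does not match "
                      f"base hidden_size {hidden_size}")
```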
A mismatch like this usually means the adapter was trained against a different base model than the one named in fireworks.json: in the example above, the adapter's (16, 4096) matches the 4096-dimensional hidden size of an 8B Llama model, while the expected (16, 8192) corresponds to a larger base model. The fix is to point the base_model field in fireworks.json at the model the adapter was actually fine-tuned from, for example accounts/fireworks/models/llama-v3p1-8b-instruct.
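A minimal fireworks.json for this case might look roughly like the sketch below; only base_model is shown here, and any other fields your upload needs should follow the schema in the Fireworks model-upload documentation.

```json
{
  "base_model": "accounts/fireworks/models/llama-v3p1-8b-instruct"
}
```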