`adapter_config.json` - The Hugging Face adapter configuration file.

`adapter_model.bin` or `adapter_model.safetensors` - The saved addon file.

`adapter_config.json` must contain the following fields:

- `r` - The number of LoRA ranks. Must be an integer between 4 and 64, inclusive.
- `target_modules` - A list of target modules. Currently the following target modules are supported:
  - `q_proj`
  - `k_proj`
  - `v_proj`
  - `o_proj`
  - `up_proj` or `w1`
  - `down_proj` or `w2`
  - `gate_proj` or `w3`
  - `block_sparse_moe.gate`
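
For illustration, a minimal `adapter_config.json` that satisfies these constraints might look like the sketch below. The rank and module selection are example values, not requirements; other fields your adapter framework emits (such as `lora_alpha`) may also be present.

```json
{
  "r": 8,
  "target_modules": [
    "q_proj",
    "k_proj",
    "v_proj",
    "o_proj"
  ]
}
```

Here `r` is 8 (within the allowed 4-64 range) and `target_modules` lists only supported module names.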
`fireworks.json` file directory containing: