m0saan opened this issue 4 months ago
@awni your thoughts please!
Thanks @m0saan. I'm not sure we need parameter groups yet. Let's keep this issue open, but I would mark it as low priority until we have reason to think otherwise.
In MLX it's a lot easier to have multiple optimizers each working on a subset of the model since things are a bit more decoupled than in PyTorch. So this is an instance where the added functionality doesn't make as much sense for us.
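The decoupled pattern described above — one optimizer per parameter subset, each with its own hyperparameters — can be sketched with a toy example. The `ToySGD` class and the flat dict-of-parameters layout below are illustrative stand-ins, not the real MLX API; in MLX the analogous pattern would use `mlx.optimizers` instances applied to slices of the model's nested parameter dict.

```python
# Toy sketch: two independent optimizers, each updating a disjoint
# subset of a model's parameters. ToySGD is a stand-in, not MLX code.

class ToySGD:
    def __init__(self, lr):
        self.lr = lr

    def update(self, params, grads):
        # Plain SGD step, applied only to the keys this optimizer is given.
        for name, g in grads.items():
            params[name] = params[name] - self.lr * g

# A "model" as a flat dict of scalar parameters (hypothetical names).
params = {"backbone.w": 1.0, "backbone.b": 0.5, "head.w": 2.0}
grads = {"backbone.w": 0.1, "backbone.b": 0.1, "head.w": 0.1}

# One optimizer per subset, each with its own learning rate.
backbone_opt = ToySGD(lr=0.01)
head_opt = ToySGD(lr=0.1)

backbone_opt.update(params, {k: v for k, v in grads.items() if k.startswith("backbone.")})
head_opt.update(params, {k: v for k, v in grads.items() if k.startswith("head.")})

print(params)  # the head parameter moves 10x farther than the backbone ones
```

Because the optimizers never see each other's keys, no shared "parameter group" machinery is needed: splitting the gradient dict is the grouping.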
Issue Description:
Feature Request
Summary: I propose adding support for parameter groups in MLX to enhance the flexibility and customization of model optimization.
Details: The addition of parameter groups would enable users to group subsets of model parameters and apply a different optimization configuration to each subset. This is a common feature in other deep learning frameworks (e.g., PyTorch's `param_groups`) and can significantly improve the efficiency of training and fine-tuning models.
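As a sketch of what the requested feature could look like, here is a toy optimizer accepting PyTorch-style parameter groups with per-group overrides. The `GroupedSGD` class, the group dict format, and all parameter names are hypothetical, chosen only to illustrate the idea; this is not a proposed MLX implementation.

```python
# Hypothetical sketch of a parameter-group API (modeled loosely on
# PyTorch's param_groups). Nothing here is real MLX or PyTorch code.

class GroupedSGD:
    def __init__(self, groups, lr=0.01):
        # Each group is a dict: {"params": {...}} plus optional per-group
        # overrides such as "lr"; unset options fall back to the defaults.
        self.defaults = {"lr": lr}
        self.groups = groups

    def step(self, grads):
        for group in self.groups:
            lr = group.get("lr", self.defaults["lr"])
            for name, p in group["params"].items():
                group["params"][name] = p - lr * grads[name]

encoder = {"enc.w": 1.0}
classifier = {"cls.w": 1.0}

opt = GroupedSGD(
    [
        {"params": encoder},                # uses the default lr
        {"params": classifier, "lr": 0.1},  # fine-tune the head faster
    ],
    lr=0.001,
)
opt.step({"enc.w": 1.0, "cls.w": 1.0})
print(encoder, classifier)
```

The typical fine-tuning use case is exactly this: a small learning rate (or none) for a pretrained backbone and a larger one for a freshly initialized head, configured in a single optimizer.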
Expected Behavior:
Motivation:
Example Usage: