Modular Training Framework
A clean, extensible training framework for fine-tuning language models on any task (NER, QA, Classification) using LoRA and PEFT.
Python · PyTorch · Transformers · PEFT · LoRA · WandB

A modular framework that adapts the same training pipeline to any task: adding a new task requires implementing only two components, a DataProcessor and an Evaluator. Features LoRA fine-tuning, automated GGUF conversion, and built-in support for NER, Question Answering, and Text Classification.
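The two-component design above can be sketched as a pair of abstract base classes that each task implements. The class and method names here (`process`, `evaluate`, `NERProcessor`, `NEREvaluator`) are illustrative assumptions, not the framework's actual API:

```python
from abc import ABC, abstractmethod


class DataProcessor(ABC):
    """Converts raw task data into model-ready examples (hypothetical interface)."""

    @abstractmethod
    def process(self, raw_examples):
        ...


class Evaluator(ABC):
    """Computes task-specific metrics on model predictions (hypothetical interface)."""

    @abstractmethod
    def evaluate(self, predictions, references):
        ...


class NERProcessor(DataProcessor):
    """Example task plug-in: passes through token/BIO-label pairs."""

    def process(self, raw_examples):
        return [{"tokens": ex["tokens"], "labels": ex["labels"]}
                for ex in raw_examples]


class NEREvaluator(Evaluator):
    """Example task plug-in: token-level accuracy as a placeholder metric."""

    def evaluate(self, predictions, references):
        correct = sum(p == r for p, r in zip(predictions, references))
        return {"accuracy": correct / max(len(references), 1)}
```

With this shape, the training loop itself never changes: it calls `process` to build the dataset and `evaluate` after each epoch, so swapping NER for QA or Classification means swapping only the two plug-in classes.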