Technical question: Lineage of huihui-ai/Huihui-Qwen3-VL-8B-Thinking-abliterated
Dear [Developer/Team],
I really appreciate your work on huihui-ai/Huihui-Qwen3-VL-8B-Thinking-abliterated. I've been using it recently and it's been very helpful.
I'm planning to use it as a base model, so I need to confirm its relationship to Qwen/Qwen3-VL-8B-Thinking:
Direct derivation: Was it derived directly from Qwen/Qwen3-VL-8B-Thinking, or through intermediate checkpoints?
Inheritance: Does it strictly inherit the architecture and weights of Qwen/Qwen3-VL-8B-Thinking, without merging with or distilling from other models?
Confirming this will help me ensure compatibility with the Qwen/Qwen3-VL-8B-Thinking ecosystem.
Thank you for your time and support!
Developer reply: It was obtained directly through abliteration of the original model Qwen/Qwen3-VL-8B-Thinking and has not undergone any fine-tuning, merging, or distillation.