Fikr-Code-v1
Fikr-Code is a specialized 4B-parameter reasoning model, finetuned by Aayan Bin Asim, a 15-year-old developer from Pakistan.
Goal
The goal of this project is to develop high-reasoning AI tools locally in Pakistan. Fikr (Urdu for "Thought") is trained to solve complex coding tasks by first thinking through the logic.
How to use
This is a GGUF model. You can run it in Jan.ai, LM Studio, or Ollama.

System Prompt: "You are a reasoning-focused coding assistant finetuned by Aayan bin Asim. Use a `<think>` block for all logic before outputting code."
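With a system prompt like this, the model is expected to put its reasoning inside the block before emitting the final answer. A hypothetical exchange (illustrative only, not an actual transcript from this model) might look like:

```
User: Write a function that checks if a string is a palindrome.

Fikr-Code:
<think>
A palindrome reads the same forwards and backwards. Compare the
string to its reverse; lowercase it first so "Noon" counts.
</think>
def is_palindrome(s: str) -> bool:
    s = s.lower()
    return s == s[::-1]
```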
Training Details
- Base Model: Jan-code-4b
- Developer: Aayan Bin Asim
- Location: Pakistan 🇵🇰
Model tree for TheDevil123456789/Fikr-Code-v1-GGUF
- Base model: Qwen/Qwen3-4B-Instruct-2507
- Finetuned: janhq/Jan-v3-4B-base-instruct
- Finetuned: janhq/Jan-code-4b
```python
# !pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="TheDevil123456789/Fikr-Code-v1-GGUF",
    filename="Fikr-Code-v1-Q4_K_M.gguf",
)
```
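A full round trip can be wrapped in a small helper. This is a sketch built on llama-cpp-python's OpenAI-style `create_chat_completion` API; the system prompt follows the one recommended above (writing the reasoning-block tag out as `<think>` is an assumption), and the example question and 512-token cap are illustrative choices, not part of the model card.

```python
SYSTEM_PROMPT = (
    "You are a reasoning-focused coding assistant finetuned by Aayan bin Asim. "
    "Use a <think> block for all logic before outputting code."
)


def ask_fikr(question: str, max_tokens: int = 512) -> str:
    """Download the GGUF from the Hub (cached after the first call)
    and run a single chat completion against it."""
    # Deferred import so merely defining this helper does not require
    # llama-cpp-python to be installed.
    from llama_cpp import Llama

    llm = Llama.from_pretrained(
        repo_id="TheDevil123456789/Fikr-Code-v1-GGUF",
        filename="Fikr-Code-v1-Q4_K_M.gguf",
    )
    response = llm.create_chat_completion(
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
        max_tokens=max_tokens,
    )
    # create_chat_completion returns an OpenAI-style dict; the reply text
    # lives in choices[0]["message"]["content"].
    return response["choices"][0]["message"]["content"]


# Example (downloads the quantized weights on first use):
# print(ask_fikr("Write a Python function that reverses a string."))
```

Note the call is left commented out because `Llama.from_pretrained` fetches the quantized weights from the Hub on first use.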