FunctionGemma Coding Assistant

Production-ready coding assistant with:

  • Chain of Recursive Thoughts reasoning
  • Web search integration (optional)
  • Local knowledge base (SQLite + FAISS); see the retrieval sketch after this list
  • Fine-tuned on 20k diverse code samples
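
The knowledge-base feature is not part of the Transformers snippet in the Usage section below, so the following is only a minimal sketch of how a SQLite + FAISS store could be queried for context. The embedding model, table layout, and all names here are illustrative assumptions, not details taken from this repository.

import sqlite3

import faiss                                    # pip install faiss-cpu
import numpy as np
from sentence_transformers import SentenceTransformer  # pip install sentence-transformers

# Assumed embedding model; any sentence encoder would do.
embedder = SentenceTransformer("all-MiniLM-L6-v2")

# SQLite keeps the raw snippets; FAISS holds only the vectors.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE snippets (id INTEGER PRIMARY KEY, text TEXT)")
snippets = [
    "def reverse_string(s): return s[::-1]",
    "Use str.join to concatenate many strings efficiently.",
]
conn.executemany("INSERT INTO snippets (text) VALUES (?)", [(s,) for s in snippets])

vectors = np.asarray(embedder.encode(snippets, normalize_embeddings=True), dtype="float32")
index = faiss.IndexFlatIP(vectors.shape[1])     # inner product == cosine on normalized vectors
index.add(vectors)

# Retrieve the closest snippet for a query; FAISS positions are 0-based,
# SQLite rowids are 1-based, hence the +1.
query = np.asarray(embedder.encode(["how do I reverse a string in Python?"],
                                   normalize_embeddings=True), dtype="float32")
_, ids = index.search(query, 1)
row = conn.execute("SELECT text FROM snippets WHERE id = ?", (int(ids[0][0]) + 1,)).fetchone()
print(row[0])

Retrieved text like this would typically be prepended to the prompt before calling model.generate.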

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("shaurya79/functiongemma-coding-assistant")
tokenizer = AutoTokenizer.from_pretrained("shaurya79/functiongemma-coding-assistant")

prompt = "Write a Python function to reverse a string"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
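
If the tokenizer ships a chat template (common for Gemma-based instruction-tuned models, though this card does not state it), formatting the prompt through the template is usually more reliable than passing raw text. A hedged variant of the snippet above:

# Only valid if the tokenizer actually includes a chat template.
messages = [{"role": "user", "content": "Write a Python function to reverse a string"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")
outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))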

Features

  • Code generation (Python, JavaScript, etc.)
  • Detailed explanations
  • Bug fixing suggestions (example after this list)
  • Multi-language support
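
The same load-and-generate pattern from the Usage section covers these features; as an illustration, a bug-fixing request (the buggy function and prompt wording are made up for this example):

# Reuses the model and tokenizer loaded in the Usage section.
buggy = "def mean(xs):\n    return sum(xs) / len(xs) + 1"
prompt = "Find and fix the bug in this Python function:\n\n" + buggy
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))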

Size

~250-300 MB when quantized; the underlying checkpoint is 0.3B parameters stored as BF16 safetensors.
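
The card does not say how the quantized artifact is produced. If you want weight memory in a similar range at load time, one option is 4-bit loading through bitsandbytes (assumes a CUDA GPU plus the bitsandbytes and accelerate packages; this is a sketch, not the card author's recipe):

import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Load weights in 4-bit with BF16 compute to keep the 0.3B model small in memory.
bnb = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.bfloat16)
model = AutoModelForCausalLM.from_pretrained(
    "shaurya79/functiongemma-coding-assistant",
    quantization_config=bnb,
    device_map="auto",
)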
