FunctionGemma Coding Assistant
Production-ready coding assistant with:
- Chain of Recursive Thoughts reasoning
- Web search integration (optional)
- Local knowledge base (SQLite + FAISS; see the sketch after this list)
- Fine-tuned on 20k diverse code samples
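The card does not document how the local knowledge base is wired up; the sketch below is one plausible layout, with snippet text stored in SQLite and embedding vectors in a FAISS index keyed by the same row id. The `embed()` helper and the 384-dimension width are placeholders, not part of this model.

```python
import hashlib
import sqlite3

import faiss
import numpy as np

DIM = 384  # assumed embedding width; the real embedding model is not documented


def embed(text: str) -> np.ndarray:
    """Stand-in embedding: a deterministic pseudo-random vector derived from a hash."""
    seed = int.from_bytes(hashlib.md5(text.encode()).digest()[:4], "little")
    rng = np.random.default_rng(seed)
    return rng.standard_normal(DIM).astype("float32")


# Snippet text lives in SQLite; vectors live in a FAISS index with matching row ids.
con = sqlite3.connect("kb.db")
con.execute("CREATE TABLE IF NOT EXISTS snippets (id INTEGER PRIMARY KEY, text TEXT)")
index = faiss.IndexFlatL2(DIM)

snippets = ["def reverse(s): return s[::-1]", "Use sorted() to get a new sorted list"]
for i, text in enumerate(snippets):
    con.execute("INSERT OR REPLACE INTO snippets (id, text) VALUES (?, ?)", (i, text))
    index.add(embed(text).reshape(1, -1))
con.commit()

# Look up the nearest snippet and use it as extra context for the prompt.
_, ids = index.search(embed("reverse a string").reshape(1, -1), 1)
context = con.execute("SELECT text FROM snippets WHERE id = ?", (int(ids[0][0]),)).fetchone()[0]
print(context)
```

Retrieved snippets would then be prepended to the prompt before generation.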
Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("shaurya79/functiongemma-coding-assistant")
tokenizer = AutoTokenizer.from_pretrained("shaurya79/functiongemma-coding-assistant")

prompt = "Write a Python function to reverse a string"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 512 new tokens and decode the first (and only) sequence in the batch.
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
Features
- Code generation (Python, JavaScript, etc.)
- Detailed explanations
- Bug fixing suggestions (example below)
- Multi-language support
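As a concrete example of the bug-fixing feature, the same generation loop from the Usage section can be pointed at a broken function; the prompt wording below is illustrative, not a format the model requires.

```python
# Reuses `model` and `tokenizer` loaded in the Usage section above.
prompt = "Fix the bug in this function and explain the fix:\n\ndef add(a, b):\n    return a - b"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```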
Size
~250-300 MB (quantized)
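The card does not state which quantization scheme produces this footprint; one option in the same size range is 8-bit loading via bitsandbytes, sketched below (requires the `bitsandbytes` and `accelerate` packages).

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# 8-bit loading is an assumption here; use whatever quantization the checkpoint actually ships with.
quant_config = BitsAndBytesConfig(load_in_8bit=True)
model = AutoModelForCausalLM.from_pretrained(
    "shaurya79/functiongemma-coding-assistant",
    quantization_config=quant_config,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("shaurya79/functiongemma-coding-assistant")
```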
Base model
google/functiongemma-270m-it