Grok 3 with Requesty Router: Quick Integration Guide
Feb 24, 2025

Try the Requesty Router and get $6 free credits 🔀
Join the Discord
We’re excited to announce that Grok 3—xAI’s next-generation AI model with powerful reasoning capabilities—is now available for seamless integration via the Requesty Router. Whether you’re using Cline, Roo Code, OpenWebUI, or other open-source tools, connecting Grok 3 to your development workflow has never been easier. In this post, we’ll show you how to get started, share some best practices, and invite you to join our community on Discord to stay ahead of the curve.
What is Grok 3?
Grok 3 is xAI’s latest and most capable large language model. Building on massive pretraining and advanced reinforcement learning for reasoning, Grok 3 excels at:
Mathematics & Problem Solving: Competition-level math performance and robust logical reasoning.
Coding & Debugging: Generates well-structured code across popular languages and can systematically debug multi-file projects.
World Knowledge & QA: Handles graduate-level knowledge tasks, general domain questions, and complicated research queries.
Instruction-Following & Creativity: Delivers concise, user-aligned responses and can generate creative content ranging from stories to design ideas.
Grok 3 ships with two variants:
Grok 3 Beta (fast, broad domain coverage, top-tier performance)
Grok 3 mini (cost-efficient, optimized for shorter queries and smaller contexts)
Additionally, both models have a (Think) variant for deeper, multi-step reasoning—perfect for complex tasks.
Why Integrate via Requesty Router?
The Requesty Router provides a single, OpenAI-compatible API endpoint that connects you to over 50 different large language models, including Grok 3. By placing Grok 3 behind a unified interface, you can:
Reduce Complexity: No separate signups or key management for each model; one key unlocks them all.
Effortlessly Switch Models: Dynamically route requests to different LLMs based on cost, speed, or capability, without rewriting your code.
Monitor Usage & Costs: A centralized dashboard shows consumption across all models, helping you optimize budgets.
Automatic Fallbacks & Retries: If one model or provider is down, your requests can automatically fail over to an alternative, keeping your workflows humming.
Community & Support: Receive hands-on help, examples, and best practices from the Requesty community and support team.
Setup Overview
At a high level, here’s how you’ll get Grok 3 working in various open-source tools:
Sign up for a Requesty Router account (or use your existing one).
Obtain your unified Router API Key (from app.requesty.ai/router).
Configure your chosen tool (Cline, Roo Code, OpenWebUI, etc.) to point at:
Base URL: https://router.requesty.ai/v1
Model: xai/grok-3:beta or xai/grok-3-mini:beta
(and for deep reasoning, xai/grok-3:beta-think or xai/grok-3-mini:beta-think)
Auth Header: Authorization: Bearer <YOUR_ROUTER_API_KEY> (OpenAI-compatible header)
Send requests as if you’re calling an OpenAI-style completion or chat endpoint.
Enjoy the benefits of advanced reasoning and world-class language generation.
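To make the OpenAI-style call concrete, here is a minimal Python sketch that assembles such a request. The base URL, model ID, and Authorization header come from the steps above; `build_chat_request` is an illustrative helper (not part of any Requesty SDK), and the API key is a placeholder you would replace with your own.

```python
import json

ROUTER_BASE_URL = "https://router.requesty.ai/v1"
ROUTER_API_KEY = "<YOUR_ROUTER_API_KEY>"  # placeholder: paste your real key

def build_chat_request(model, messages):
    """Return (url, headers, body) for an OpenAI-compatible chat call."""
    url = f"{ROUTER_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {ROUTER_API_KEY}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "xai/grok-3:beta",
    [{"role": "user", "content": "Explain binary search in two sentences."}],
)
# To actually send it, POST with any HTTP client, e.g.:
#   import requests
#   resp = requests.post(url, headers=headers, data=body)
print(url)
```

Because the endpoint is OpenAI-compatible, any existing OpenAI client library can also be pointed at the router by overriding its base URL and API key.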
Using Grok 3 with Cline
Requesty Routing for Cline
Many of our users pair the Requesty Router with the Cline coding agent to quickly switch between model providers. It’s straightforward:
Select “Requesty” from the API Provider dropdown inside Cline’s settings.
Add your API Key – You can create or retrieve it on the Router Page in our platform.
Paste your Model ID – You’ll find Grok 3 or other models in the Model List.
We’ve created dedicated models for Cline. If you want to use those, the format is slightly different than the standard model names. You can find more details in the Dedicated Models documentation on the Requesty platform.
Example: If you set an alias “coding” for Grok 3 in the Requesty dashboard, you can reference it in Cline as alias/coding.
Quick-Start Example
Open Cline (either in VS Code after installing the extension, or via CLI).
Select Requesty as the provider and input your Router API Key.
Choose a model (e.g., xai/grok-3:beta or use a dedicated alias).
Ask a coding question or request code generation. Cline routes the call to Grok 3 automatically.
Using Grok 3 with Roo Code
Roo Code also supports direct integration with the Requesty Router. The process is similar:
Open Roo Code and look for the provider settings.
Pick “Requesty” as your API provider.
Enter your Requesty Router API Key.
Paste the Grok 3 model ID (xai/grok-3:beta), or use an alias like alias/coding.
Roo Code will now direct your coding prompts and completions through Grok 3. Switching to another model in Roo Code is just as easy—change the model ID or alias in the settings.
Using Grok 3 with OpenWebUI
OpenWebUI (OWUI) is a popular browser-based UI for local or remote LLM endpoints. It supports OpenAI-compatible calls out of the box:
Launch OpenWebUI and go to the Providers/Settings section.
Select “OpenAI-compatible” as your provider type.
Input the base endpoint: https://router.requesty.ai/v1.
Paste your API key in the “Bearer Token” or “API Key” field.
Set the model name: xai/grok-3:beta (or xai/grok-3-mini:beta, or a Think variant).
Save & refresh. You can now interact with Grok 3 in your browser and watch the real-time responses.
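If you want to sanity-check the endpoint independently of OpenWebUI, the same OpenAI-compatible call can be issued from the command line. A sketch with curl (the API key is a placeholder; the actual request is commented out so nothing is sent until you fill it in):

```shell
# Endpoint and model from the settings above; key is a placeholder.
BASE_URL="https://router.requesty.ai/v1"
MODEL="xai/grok-3:beta"
API_KEY="<YOUR_ROUTER_API_KEY>"

# Uncomment to send a real request once your key is in place:
# curl "$BASE_URL/chat/completions" \
#   -H "Authorization: Bearer $API_KEY" \
#   -H "Content-Type: application/json" \
#   -d "{\"model\": \"$MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"Hello\"}]}"
echo "$BASE_URL/chat/completions"
```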
VS Code Extension & Instant Model Switching
Another handy option is the Requesty VS Code extension, which lets you switch LLMs on the fly right inside your editor:
Get Your API Key: Go to app.requesty.ai/router to generate or copy your key.
Install the Extension: In VS Code, search "Requesty" in the Extensions panel, then click Install.
Add Your Key: Click the Requesty icon on the sidebar and paste your key when prompted.
Create an Alias (e.g., "coding"): For example, set Grok 3 as your "coding" model.
Use the Alias: In Cline, Roo Code, or other tools, set the model_id to alias/coding.
Now you can star (⭐️) your favorite models in the Requesty dashboard for quick reference, switch them anytime, and keep all your usage tracking centralized.
Power Tips: Making the Most of Grok 3
Use “Think” When You Need Depth: For routine tasks, standard Grok 3 is quick and efficient. For complex problem-solving or multi-step reasoning, use xai/grok-3:beta-think for a rich chain of thought.
Leverage Context Windows: Grok 3 Beta can handle up to 1M tokens. Provide all relevant details or conversation history for the best results.
Combine with Other Models: The Requesty Router allows fallback or “split” strategies: send simple queries to a cheaper model and heavy tasks to Grok 3.
Monitor Usage: The Requesty dashboard shows token usage, costs, and performance data so you can avoid surprises.
Experiment with System Prompts: If your tool supports “system” or “role” messages, use them to define style or domain constraints. Grok 3 respects these for improved alignment.
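The “split” tip above can be sketched as a simple client-side heuristic. This is purely illustrative: the marker words and length threshold are arbitrary assumptions, not anything Requesty prescribes, and in practice you would tune them to your own workload.

```python
# Illustrative routing heuristic: cheap model for simple prompts,
# the full or Think variant for long or reasoning-heavy ones.
CHEAP_MODEL = "xai/grok-3-mini:beta"
STRONG_MODEL = "xai/grok-3:beta"
THINK_MODEL = "xai/grok-3:beta-think"

def pick_model(prompt: str) -> str:
    """Choose a model ID based on rough prompt complexity."""
    reasoning_markers = ("prove", "step by step", "debug", "why")
    if any(marker in prompt.lower() for marker in reasoning_markers):
        return THINK_MODEL       # multi-step reasoning -> Think variant
    if len(prompt) > 500:
        return STRONG_MODEL      # long context -> full Grok 3
    return CHEAP_MODEL           # everything else -> mini

print(pick_model("What is the capital of France?"))   # xai/grok-3-mini:beta
print(pick_model("Debug this failing test suite."))   # xai/grok-3:beta-think
```

The chosen ID is then passed as the `model` field of the request, so switching strategies never requires touching the rest of your client code.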
Join Our Discord
Ready to explore Grok 3’s capabilities, share your results, or get help with advanced integrations? Join our Requesty Discord community! Our team—and a vibrant group of developers—are there to:
Answer questions about setting up or tuning your Grok 3 environment.
Offer support for advanced multi-model strategies and usage.
Showcase projects using Grok 3 with open-source tooling.
Discuss the future of advanced reasoning LLMs.
We’d love to see what you build!
Conclusion
Grok 3 is ushering in a new era of advanced AI reasoning—coupling robust knowledge with deeply effective chain-of-thought. By tapping into it via the Requesty Router, you gain a single, streamlined interface for all your open-source coding tools, whether it’s Cline, Roo Code, OpenWebUI, or something else entirely.
Take a few minutes to set up your integration, and you’ll be on your way to powering your development workflows or research projects with next-level intelligence. If you have any questions—or just want to share your successes—be sure to join our Discord and connect with like-minded developers.
Get started today and let Grok 3 supercharge your open-source AI toolkit!