https://www.youtube.com/watch?v=plZAJUN6YTw&t=1s
If you've been juggling different coding assistants like Cline, Goose, OpenWebUI, OpenRouter, or others, you know how important it is to have a clean, efficient workflow. One of the most powerful (and often underused) features in these AI-driven tools is the system prompt. That prompt can shape how your AI responds, whether you're coding, writing blog posts, or prototyping new ideas.
In a recent update on the Requesty platform, we introduced a new feature that allows you to modify and fully customize the system prompt directly in the Requesty UI: no more diving into code or config files just to tweak a single line. If you're building or experimenting with AI tools, this new feature can save you time, offer new creative possibilities, and give you more control over how your AI responds.
Table of Contents
Why System Prompts Matter
New Feature: Customize Your Prompt in the UI
How It Works: Step-by-Step
When to Use a Custom System Prompt
Integrations with Cline, Goose, OpenWebUI, and OpenRouter
Tips for Effective Prompting
Wrap-Up
1. Why System Prompts Matter
In AI-assisted workflows, the system prompt sets the stage for how the model will respond. Whether you're refactoring code in Cline, brainstorming with Goose, or using advanced features in OpenWebUI, the system prompt ensures the AI knows its role and context. Small tweaks here can significantly change the AI's output, making it more aligned with your project goals.
2. New Feature: Customize Your Prompt in the UI
Traditionally, adjusting the system prompt meant editing lines of code or environment variables, which is time-consuming if you want to experiment or if you're not a developer. Now, Requesty allows you to:
View and Edit the system prompt directly in your Manage API → Features settings.
Prepend additional instructions (e.g., "You are a coding tutor focusing on Python. Only give short code examples.") or fully replace the existing system prompt.
This feature is especially handy if you notice your AI returning repetitive or confusing responses. With a custom prompt, you can narrow its scope or guide it more precisely.
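Conceptually, the two options differ like this. Below is a minimal sketch; Requesty combines the prompts server-side, so the function name and default prompt text here are purely illustrative:

```python
# Illustrative only: the default prompt text and function name below are
# stand-ins, not Requesty internals.
DEFAULT_PROMPT = "You are a helpful coding assistant."

def apply_custom_prompt(custom: str, mode: str = "prepend") -> str:
    """Show how a custom prompt combines with the default system prompt."""
    if mode == "replace":
        return custom  # the default prompt is discarded entirely
    # In prepend mode, your custom text is placed before the default prompt.
    return f"{custom}\n\n{DEFAULT_PROMPT}"

print(apply_custom_prompt("Only give short Python code examples.", "prepend"))
```

The practical difference: replace gives you full control (and full responsibility) over the AI's instructions, while prepend layers your guidance on top of the defaults the tool already relies on.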
3. How It Works: Step-by-Step
Below is a quick walkthrough inspired by the video transcript:
Go to Manage API: In the Requesty dashboard, navigate to Manage API. Here, you'll see a tab called Features; click it.
Locate the System Prompt Field: Look for an option that mentions "System Prompt" or "Customize Prompt." This is where you can insert your own text.
Replace or Prepend
Replace: Wipe out the default system prompt and write an entirely new one. For example, "Ignore all instructions except for counting to 10."
Prepend: Keep the default system prompt and add your extra text. For instance, "You are a Requesty Coder. Please focus on AI routing best practices."
Save Your Changes: Once saved, your custom system prompt is immediately active. Any tool calling Requesty, such as Roo Code, Cline, or Goose, will receive these updated instructions behind the scenes.
Test in Your AI Tool: Fire up your coding assistant or text interface. Enter a quick prompt (e.g., "Test") and see how the AI responds. You can check the Logs in Requesty to confirm the new system prompt is being used.
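If you'd rather test from code than a chat UI, the steps above boil down to sending an ordinary chat-completions request. This is a hedged sketch: Requesty speaks the OpenAI-style chat format, but the model id shown here is an assumption; check your dashboard for the actual endpoint URL and model names.

```python
import json

# Assumed: Requesty accepts OpenAI-style chat-completions payloads.
# POST this body to your Requesty endpoint (URL from your dashboard).
payload = {
    "model": "openai/gpt-4o-mini",  # illustrative model id
    "messages": [
        # No "system" message here: the custom system prompt you saved in
        # Manage API -> Features is injected by Requesty behind the scenes,
        # which is exactly what the Logs let you confirm.
        {"role": "user", "content": "Test"},
    ],
}

print(json.dumps(payload, indent=2))
```

Because the prompt lives in Requesty rather than in the request, every client hitting the same API key gets the same instructions without any client-side changes.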
4. When to Use a Custom System Prompt
Project-Specific Guidance: If your team is building an AI feature for, say, medical records or financial data, you can instruct the AI to maintain compliance standards or a certain tone.
Debugging & Testing: When the AI acts erratically or you're seeing "glitches," adding extra clarification or constraints in the system prompt can fix the behavior.
Prototyping New Features: If you're iterating quickly, you might want to cycle through different system prompts (e.g., "Focus on speed," "Focus on minimal text," "Use advanced reasoning," etc.) to see which yields the best results.
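For the prototyping case, a quick way to compare candidates is to loop over them and collect the responses side by side. A minimal sketch; `run_experiment` is a hypothetical placeholder for a real request made after saving each prompt as your custom system prompt:

```python
# Candidate prompts to trial, one at a time, as the custom system prompt.
candidate_prompts = [
    "Focus on speed: answer in one sentence.",
    "Focus on minimal text: use bullet points only.",
    "Use advanced reasoning: explain your steps.",
]

def run_experiment(prompt: str) -> str:
    # Hypothetical placeholder: in practice, save `prompt` in Requesty,
    # send the same test query through your tool, and return the reply.
    return f"[response generated under: {prompt}]"

# Collect one output per candidate so they can be compared side by side.
results = {p: run_experiment(p) for p in candidate_prompts}
for prompt, output in results.items():
    print(f"{prompt!r} -> {output}")
```

Keeping the test query fixed while only the system prompt varies makes it easy to attribute differences in output to the prompt itself.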
5. Integrations with Cline, Goose, OpenWebUI, and OpenRouter
Cline:
Typically uses a default system prompt that sets the code assistant's role. With Requesty, you can override or append instructions, like telling Cline to only output minimal comments.
Goose:
Great for writing & brainstorming. A custom system prompt can help ensure Goose stays on-topic (e.g., marketing content, technical documentation).
OpenWebUI & OpenRouter:
Both are popular for orchestrating different LLM models. If you're testing multiple models via OpenRouter (or hosting your own with OpenWebUI), having the system prompt set in Requesty ensures consistent behavior across all models.
6. Tips for Effective Prompting
Keep It Clear & Concise: Long, rambling prompts can confuse the AI. Summarize your objectives in plain language.
Test Different Variations: Don't be afraid to iterate; try multiple short system prompts, or see if one big specialized prompt works best.
Use Role-Based Language: E.g., "You are a senior Java developer..." or "You are a friendly writing assistant for social media."
Log & Review: Always check the Logs in Requesty to confirm your custom prompt is being applied, and see how the AI is interpreting your instructions.
7. Wrap-Up
Requesty's new Custom System Prompt feature is a game-changer for anyone who wants granular control over AI-driven tools like Cline, Goose, OpenWebUI, or OpenRouter. No more fiddling with code to make slight adjustments. With a few clicks, you can refine your assistant's behavior, set fallback instructions, or even drastically change how it interprets your requests.
Ready to give it a try?
Head to Manage API → Features in your Requesty dashboard.
Modify the system prompt to align with your workflow, then save.
Start using your favorite tool (Cline, Goose, etc.) and watch how the AI responses adapt.
Whether you're debugging code, writing documentation, or building a new AI feature from scratch, a well-tuned system prompt can make all the difference. Happy customizing!
Have questions or feedback? Join our Discord or check out our Docs. We'd love to hear about your experiences and help you optimize your AI workflow.