How to Customize Your System Prompt in the Requesty UI

Mar 10, 2025

If you’ve been juggling AI tools like Cline, Goose, OpenWebUI, or OpenRouter, you know how important a clean, efficient workflow is. One of the most powerful (and often underused) features in these tools is the system prompt. It shapes how your AI responds, whether you’re coding, writing blog posts, or prototyping new ideas.

In a recent update to the Requesty platform, we introduced a feature that lets you fully customize the system prompt directly in the Requesty UI, with no need to dive into code or config files just to tweak a single line. If you’re building or experimenting with AI tools, this feature can save you time, open up new creative possibilities, and give you more control over how your AI responds.

Table of Contents

  1. Why System Prompts Matter

  2. New Feature: Customize Your Prompt in the UI

  3. How It Works: Step-by-Step

  4. When to Use a Custom System Prompt

  5. Integrations with Cline, Goose, OpenWebUI, and OpenRouter

  6. Tips for Effective Prompting

  7. Wrap-Up

1. Why System Prompts Matter

In AI-assisted workflows, the system prompt sets the stage for how the model will respond. Whether you’re refactoring code in Cline, brainstorming with Goose, or using advanced features in OpenWebUI, the system prompt ensures the AI knows its role and context. Small tweaks here can significantly change the AI’s output—making it more aligned with your project goals.

2. New Feature: Customize Your Prompt in the UI

Traditionally, adjusting the system prompt meant editing lines of code or environment variables—time-consuming if you want to experiment or if you’re not a developer. Now, Requesty allows you to:

  • View and Edit the system prompt directly in your Manage API → Features settings.

  • Prepend additional instructions (e.g., “You are a coding tutor focusing on Python. Only give short code examples.”) or fully replace the existing system prompt.

This feature is especially handy if you notice your AI returning repetitive or confusing responses. With a custom prompt, you can narrow its scope or guide it more precisely, as the quick sketch below illustrates.
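
To make the difference between the two modes concrete, here is a minimal Python sketch of how the final system prompt could be composed in each case. The default prompt and custom text are placeholders, and the exact separator and ordering Requesty uses internally may differ.

    # Illustrative only: "default_prompt" stands in for whatever system prompt
    # your tool (Cline, Goose, etc.) normally sends, and "custom_text" is what
    # you enter in the Requesty UI.
    default_prompt = "You are a coding assistant. Follow the user's instructions."
    custom_text = "You are a coding tutor focusing on Python. Only give short code examples."

    def compose_system_prompt(mode: str) -> str:
        if mode == "replace":
            return custom_text  # the default prompt is discarded entirely
        if mode == "prepend":
            return custom_text + "\n\n" + default_prompt  # your text comes first
        return default_prompt  # no customization

    print(compose_system_prompt("prepend"))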

3. How It Works: Step-by-Step

Below is a quick walkthrough inspired by the video transcript:

  1. Go to Manage API
    In the Requesty dashboard, navigate to Manage API. Here, you’ll see a tab called Features—click it.

  2. Locate the System Prompt Field
    Look for an option that mentions “System Prompt” or “Customize Prompt.” This is where you can insert your own text.

  3. Replace or Prepend

    • Replace: Wipe out the default system prompt and write an entirely new one. For example, “Ignore all instructions except for counting to 10.”

    • Prepend: Keep the default system prompt and add your extra text. For instance, “You are a Requesty Coder. Please focus on AI routing best practices.”

  4. Save Your Changes
    Once saved, your custom system prompt is immediately active. Any tool calling Requesty—like Roo Code, Cline, or Goose—will receive these updated instructions behind the scenes.

  5. Test in Your AI Tool
    Fire up your coding assistant or text interface. Enter a quick prompt (e.g., “Test”) and see how the AI responds. You can check the Logs in Requesty to confirm the new system prompt is being used.
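
If you prefer to run the test from a script instead of a chat window, a quick sanity check might look like the sketch below. It assumes Requesty’s OpenAI-compatible endpoint at https://router.requesty.ai/v1 and uses an illustrative model ID; check the Requesty docs for the exact base URL and the models available on your account.

    # Minimal test request through the Requesty router (endpoint and model ID
    # are assumptions -- verify them against the Requesty docs).
    from openai import OpenAI

    client = OpenAI(
        api_key="<YOUR_REQUESTY_API_KEY>",          # created under Manage API
        base_url="https://router.requesty.ai/v1",   # assumed router endpoint
    )

    response = client.chat.completions.create(
        model="openai/gpt-4o-mini",                 # illustrative model ID
        messages=[{"role": "user", "content": "Test"}],
    )
    print(response.choices[0].message.content)

    # The custom system prompt is injected by Requesty server-side, so it does
    # not appear in this request body; confirm it in the Logs tab instead.

If the reply reflects your new instructions and the Logs show the updated system prompt, the change is live.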

4. When to Use a Custom System Prompt

  • Project-Specific Guidance: If your team is building an AI feature for, say, medical records or financial data, you can instruct the AI to maintain compliance standards or a certain tone.

  • Debugging & Testing: When the AI acts erratically or you’re seeing “glitches,” adding extra clarification or constraints in the system prompt can fix the behavior.

  • Prototyping New Features: If you’re iterating quickly, you might want to cycle through different system prompts (e.g., “Focus on speed,” “Focus on minimal text,” “Use advanced reasoning,” etc.) to see which yields the best results.

5. Integrations with Cline, Goose, OpenWebUI, and OpenRouter

Cline:

  • Typically uses a default system prompt that sets the code assistant’s role. With Requesty, you can override or append instructions—like telling Cline to only output minimal comments.

Goose:

  • Great for writing & brainstorming. A custom system prompt can help ensure Goose stays on-topic (e.g., marketing content, technical documentation).

OpenWebUI & OpenRouter:

  • Both are popular for orchestrating different LLMs. If you’re testing multiple models via OpenRouter (or hosting your own models with OpenWebUI), having the system prompt set in Requesty ensures consistent behavior across all of them (see the sketch below).
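
To illustrate that consistency point, the sketch below sends the same test message to two different models through the router; because the custom system prompt lives in Requesty rather than in each client, it applies to both calls. The endpoint and model IDs are again assumptions to check against the Requesty docs.

    # Same request, two models -- the system prompt configured in the Requesty
    # UI is applied to both calls (endpoint and model IDs are illustrative).
    from openai import OpenAI

    client = OpenAI(
        api_key="<YOUR_REQUESTY_API_KEY>",
        base_url="https://router.requesty.ai/v1",
    )

    for model in ["openai/gpt-4o-mini", "anthropic/claude-3-5-sonnet"]:
        reply = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": "Test"}],
        )
        print(model, "->", reply.choices[0].message.content[:80])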

6. Tips for Effective Prompting

  • Keep It Clear & Concise: Long, rambling prompts can confuse the AI. Summarize your objectives in plain language.

  • Test Different Variations: Don’t be afraid to iterate—try multiple short system prompts, or see if one big specialized prompt works best.

  • Use Role-Based Language: E.g., “You are a senior Java developer…” or “You are a friendly writing assistant for social media.” (A couple of fuller examples follow this list.)

  • Log & Review: Always check the Logs in Requesty to confirm your custom prompt is being applied—and see how the AI is interpreting your instructions.
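
As a rough illustration of the role-based and keep-it-concise tips, here are two hypothetical system prompts you might paste into the Features tab one at a time; the roles and constraints are made up for this example.

    # Hypothetical role-based prompts to try one at a time: paste into the
    # Features tab, save, send the same test message, then compare the
    # responses and the Logs.
    candidates = [
        "You are a senior Java developer. Review code for concurrency and "
        "error-handling issues, and keep each answer under 200 words.",
        "You are a friendly writing assistant for social media. Suggest three "
        "post variants per request, each under 280 characters.",
    ]
    for i, prompt in enumerate(candidates, start=1):
        print(f"--- candidate {i} ({len(prompt.split())} words) ---")
        print(prompt)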

7. Wrap-Up

Requesty’s new Custom System Prompt feature is a game-changer for anyone who wants granular control over AI-driven tools like Cline, Goose, OpenWebUI, or OpenRouter. No more fiddling with code to make slight adjustments. With a few clicks, you can refine your assistant’s behavior, set fallback instructions, or even drastically change how it interprets your requests.

Ready to give it a try?

  • Head to Manage API → Features in your Requesty dashboard.

  • Modify the system prompt to align with your workflow, then save.

  • Start using your favorite tool (Cline, Goose, etc.) and watch how the AI responses adapt.

Whether you’re debugging code, writing documentation, or building a new AI feature from scratch, a well-tuned system prompt can make all the difference. Happy customizing!

Have questions or feedback?
Join our Discord or check out our Docs. We’d love to hear about your experiences and help you optimize your AI workflow.
