
Enhancing Your DocsGPT Experience

A Guide to Prompt Settings and K Number

The ability to fine-tune your interactions with language models is not just beneficial—it's essential. Whether you're seeking precise answers, generating content, or exploring complex datasets, how you engage with DocsGPT can dramatically affect the quality and utility of its outputs. This guide is designed to unveil the significance of prompt settings and K values in enhancing your DocsGPT experience.

Section 1: Prompt Settings

Prompts in language models act as the initial input that guides the model's response. These settings determine the context, tone, specificity, and direction of the AI-generated content. By adjusting prompt settings, users have a powerful tool at their disposal to influence the outcome of their queries, making this an essential aspect of user-model interaction.

Tailored prompts lead to more relevant and context-specific outputs, greatly enhancing user satisfaction and efficiency. The right prompt can turn a generic response into a tailored piece of information, perfectly suited to the user's needs.

Configuring Prompt Settings in DocsGPT

Changing prompts in DocsGPT is simple. Just go to the settings and find the section for prompt configuration. There, you can pick from preset options in the "Active Prompt" area. For more personalization, use the advanced settings to name your prompt and tweak it as you wish, then click "Add New Prompt" to save it.

Changing prompts in DocsGPT

Let's look at how changing prompts can make a big difference. Here are some examples:

When you use a basic prompt to ask, "Which endpoint in DocsGPT is used to request answers to user-provided questions?" you get:

According to the API documentation, the /api/answer endpoint is used to request answers to user-provided questions in DocsGPT.

The documentation specifies:

1. /api/answer
Description:
This endpoint is used to request answers to user-provided questions.

Request:
Method: POST 
Headers: Content-Type should be set to application/json; charset=utf-8
Request Body: JSON object with the following fields:

- question – The user's question.
- history – (Optional) Previous conversation history.
- api_key – Your API key.
- embeddings_key – Your embeddings key.
- active_docs – The location of active documentation.

So to get answers from DocsGPT, you would send a POST request to the /api/answer endpoint with the required parameters in the JSON request body.
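Based on the fields listed in that response, a minimal request might look like the sketch below. The base URL and placeholder values are assumptions for illustration only; substitute your own deployment URL and keys.

```python
import requests

# Sketch of a POST request to the /api/answer endpoint.
# BASE_URL assumes a locally hosted DocsGPT instance; adjust for your setup.
BASE_URL = "http://localhost:7091"

payload = {
    "question": "Which endpoint is used to request answers?",
    "api_key": "your-api-key",                # placeholder
    "embeddings_key": "your-embeddings-key",  # placeholder
    "active_docs": "default",                 # placeholder location of active documentation
    # "history": ...,                         # optional: previous conversation history
}

response = requests.post(
    f"{BASE_URL}/api/answer",
    json=payload,
    headers={"Content-Type": "application/json; charset=utf-8"},
)
print(response.json())
```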

If you tweak that prompt by changing "detailed and informative responses" to "very short and concise responses," you'll see a totally different kind of answer.

The /api/answer endpoint is used to request answers to user-provided questions in DocsGPT.

Adding "{summaries}" to your prompts tells DocsGPT to pull out relevant points from your documents and use them to answer your question. This makes the answers you get relevant because the model knows the main ideas of your documents. Make sure to include "{summaries}" in your prompts when you're working with documents to get the most out of DocsGPT. If you're just having a chat or don't have any documents, you can leave it out to keep things simple.

Section 2: K Numbers

In RAG systems, K numbers hold the key to balancing efficiency and depth of information. They determine how many pieces of information (or "chunks") the model considers when crafting an answer. A smaller K number means the system looks at fewer chunks, potentially speeding up response times and focusing the answers but also risking missing out on valuable information. On the other hand, a larger K number allows the model to consider a broader array of information, which can enrich the answer but may also slow down the process.
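To picture what the K number controls, here is a small sketch of top-K retrieval over embedded chunks. This is not DocsGPT's internal code; it only shows how K limits the number of chunks that reach the prompt.

```python
import numpy as np

def top_k_chunks(query_vec, chunk_vecs, chunks, k=2):
    """Return the k chunks whose embeddings are most similar to the query."""
    # Cosine similarity between the query vector and every chunk vector.
    sims = chunk_vecs @ query_vec / (
        np.linalg.norm(chunk_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    best = np.argsort(sims)[::-1][:k]   # indices of the k highest-scoring chunks
    return [chunks[i] for i in best]

# Toy usage with made-up embeddings: a larger k passes more chunks (and more
# tokens) to the model; a smaller k keeps the context tight but may drop
# useful material.
chunks = ["chunk A", "chunk B", "chunk C"]
chunk_vecs = np.random.rand(3, 8)
query_vec = np.random.rand(8)
print(top_k_chunks(query_vec, chunk_vecs, chunks, k=2))
```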

Finding the Sweet Spot

The ideal K number varies depending on your specific needs and the context of the query. Experimentation is key to discovering the perfect balance where the model has just enough information to respond accurately without unnecessary delay. Remember, although a smaller K number might seem attractive for its directness and speed, it's vital to ensure it doesn't compromise the comprehensiveness and accuracy of the answers by omitting crucial chunks.


Advantages of the Pro Plan

The Pro plan offers advanced customization options, including the ability to raise the K number to larger values and access to premium models. These features let users process more data at a time with DocsGPT, achieving greater precision and relevance in outputs.

Concluding Thoughts

Fine-tuning your interaction with DocsGPT through prompt settings and K numbers is not just a pathway to enhancing the AI's output—it's a journey towards achieving a highly customized and efficient AI experience. By carefully selecting prompts and optimizing K numbers, you unlock the potential for more relevant, accurate, and tailored responses, ensuring that each interaction with DocsGPT serves your specific needs and goals.

At Arc53, we're dedicated to helping you navigate the complexities of AI customization, from choosing the right embeddings and training custom LLMs to optimizing DocsGPT for your unique requirements. Our team is ready to provide specialized support and insights, ensuring you harness the full potential of your AI tools and projects.

Get in touch