AI System Prompting

System Prompting: System prompting can be done in many ways:

Zero-Shot Prompting: You ask the model to perform a task without giving it any examples.

from openai import OpenAI

# Set your OpenAI API key
client = OpenAI(api_key="your-api-key")

# Zero-shot prompt
prompt = "Translate the following sentence into Spanish: 'I am going to the market.'"

# API call (text-davinci-003 has been retired; gpt-3.5-turbo-instruct is its replacement)
response = client.completions.create(
    model="gpt-3.5-turbo-instruct",
    prompt=prompt,
    max_tokens=60,
    temperature=0.5,
)

# Print the result
print(response.choices[0].text.strip())



from openai import OpenAI

client = OpenAI(api_key="your-api-key")

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Translate the following sentence into Spanish: 'I am going to the market.'"}
    ],
    temperature=0.5,
)

print(response.choices[0].message.content.strip())






Node.js

const { OpenAI } = require("openai");

// Initialize OpenAI
const openai = new OpenAI({
  apiKey: "your-api-key", // Replace with your actual API key
});

async function zeroShotExample() {
  const prompt = "Translate the following sentence into Spanish: 'I am going to the market.'";

  const response = await openai.completions.create({
    model: "text-davinci-003",  // You can also use "gpt-3.5-turbo-instruct"
    prompt: prompt,
    max_tokens: 60,
    temperature: 0.5,
  });

  console.log("Translation:", response.choices[0].text.trim());
}

zeroShotExample();




const { OpenAI } = require("openai");

const openai = new OpenAI({
  apiKey: "your-api-key",
});

async function zeroShotChatExample() {
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "user", content: "Translate the following sentence into Spanish: 'I am going to the market.'" }
    ],
    temperature: 0.5,
  });

  console.log("Translation:", response.choices[0].message.content.trim());
}

zeroShotChatExample();



One-Shot Prompting: You give one example of the task to guide the model.

const prompt = `Translate English to French:
'Good morning' → 'Bonjour'
'How are you?' →`;
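A runnable Node.js sketch of one-shot prompting. The helper name `buildOneShotPrompt` and the key-guarded invocation are my own additions; the example pair comes from the prompt above.

```javascript
// Build a one-shot prompt: one worked example, then the new input to complete.
function buildOneShotPrompt(example, input) {
  return [
    "Translate English to French:",
    `'${example.en}' → '${example.fr}'`,
    `'${input}' →`,
  ].join("\n");
}

async function oneShotExample() {
  const { OpenAI } = require("openai");
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

  const prompt = buildOneShotPrompt({ en: "Good morning", fr: "Bonjour" }, "How are you?");
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    temperature: 0.3,
  });
  console.log("Translation:", response.choices[0].message.content.trim());
}

// Only call the API when a key is configured.
if (process.env.OPENAI_API_KEY) oneShotExample();
```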

Few-Shot Prompting: You provide a few examples of the task to improve accuracy.
Translate English to French:
'Good morning' → 'Bonjour'
'How are you?' → 'Comment ça va ?'
'I am hungry' →
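The same pattern extends to several examples. This Node.js sketch assembles the few-shot prompt above from a list of pairs; the helper name `buildFewShotPrompt` is my own.

```javascript
// Assemble a few-shot prompt from several worked example pairs.
function buildFewShotPrompt(pairs, input) {
  const lines = ["Translate English to French:"];
  for (const p of pairs) lines.push(`'${p.en}' → '${p.fr}'`);
  lines.push(`'${input}' →`);
  return lines.join("\n");
}

async function fewShotExample() {
  const { OpenAI } = require("openai");
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

  const prompt = buildFewShotPrompt(
    [
      { en: "Good morning", fr: "Bonjour" },
      { en: "How are you?", fr: "Comment ça va ?" },
    ],
    "I am hungry"
  );

  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    temperature: 0.3,
  });
  console.log("Translation:", response.choices[0].message.content.trim());
}

// Only call the API when a key is configured.
if (process.env.OPENAI_API_KEY) fewShotExample();
```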


Chain-of-Thought (CoT) Prompting: You guide the model to reason step-by-step before producing an answer.

Question: If Alice has 5 apples and gives 2 to Bob, then buys 3 more, how many apples does she have?
Let's think step by step.

Alice starts with 5 apples.  
She gives 2 apples to Bob, so she has 5 - 2 = 3 apples.  
Then she buys 3 more apples, so she now has 3 + 3 = 6 apples.  
Answer: 6
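In code, chain-of-thought prompting is just the question plus the step-by-step cue. A minimal Node.js sketch (the helper name `buildCoTPrompt` is my own):

```javascript
// Append the step-by-step cue to elicit chain-of-thought reasoning.
function buildCoTPrompt(question) {
  return `Question: ${question}\nLet's think step by step.`;
}

async function chainOfThoughtExample() {
  const { OpenAI } = require("openai");
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

  const prompt = buildCoTPrompt(
    "If Alice has 5 apples and gives 2 to Bob, then buys 3 more, how many apples does she have?"
  );
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    temperature: 0.3,
  });
  console.log(response.choices[0].message.content.trim());
}

// Only call the API when a key is configured.
if (process.env.OPENAI_API_KEY) chainOfThoughtExample();
```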



Self-Consistency Prompting: A variation of CoT where multiple reasoning paths are generated and the most consistent answer is selected.
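A Node.js sketch of the idea: sample the same CoT prompt several times at a high temperature, extract each final answer, and take a majority vote. The helper names (`majorityVote`, `selfConsistency`) and the `Answer: <number>` output convention are my own assumptions, not a standard API.

```javascript
// Pick the most frequent final answer among the sampled reasoning paths.
function majorityVote(answers) {
  const counts = {};
  for (const a of answers) counts[a] = (counts[a] || 0) + 1;
  return Object.keys(counts).reduce((best, a) => (counts[a] > counts[best] ? a : best));
}

async function selfConsistency(question, samples = 5) {
  const { OpenAI } = require("openai");
  const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

  const answers = [];
  for (let i = 0; i < samples; i++) {
    const response = await openai.chat.completions.create({
      model: "gpt-3.5-turbo",
      messages: [
        { role: "user", content: `${question}\nLet's think step by step. End with "Answer: <number>".` },
      ],
      temperature: 1.0, // higher temperature diversifies the reasoning paths
    });
    const match = response.choices[0].message.content.match(/Answer:\s*(\S+)/);
    if (match) answers.push(match[1]);
  }
  return majorityVote(answers);
}

// Only call the API when a key is configured.
if (process.env.OPENAI_API_KEY) {
  selfConsistency("If Alice has 5 apples and gives 2 to Bob, then buys 3 more, how many apples does she have?")
    .then((answer) => console.log("Most consistent answer:", answer));
}
```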


Instruction Prompting: You clearly describe what task you want the model to do using plain instructions.


const { OpenAI } = require("openai");

const openai = new OpenAI({ apiKey: "your-api-key" });

async function instructionPrompting() {
  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      {
        role: "user",
        content: `Summarize this paragraph in one sentence:

Artificial intelligence is a field of computer science focused on building smart machines capable of performing tasks that typically require human intelligence. These systems can learn from data, adapt to new inputs, and perform tasks ranging from recognizing speech to playing chess.`
      }
    ],
    temperature: 0.5,
  });

  console.log("Summary:", response.choices[0].message.content.trim());
}

instructionPrompting();


Retrieval-Augmented Prompting: The prompt includes retrieved external documents or context to inform the model.


🔹 Retrieval-Augmented Generation (RAG) Prompting
You augment your prompt with relevant external data, such as:

Knowledge base articles
PDFs or documents
API responses
Database search results
Then, you pass that retrieved data into the model along with your instruction.



const { OpenAI } = require("openai");

const openai = new OpenAI({ apiKey: "your-api-key" });

// Example "retrieved" content
const retrievedContext = `
OpenAI's GPT-4 is a multimodal large language model that supports text and image inputs, with improved accuracy and reasoning.
It was released in March 2023 and powers ChatGPT Plus.
`;

const userQuery = "Summarize the information about GPT-4.";

async function ragPrompting() {
  const messages = [
    {
      role: "system",
      content: "You are an expert assistant that summarizes technical content accurately.",
    },
    {
      role: "user",
      content: `Context:\n${retrievedContext}\n\nInstruction: ${userQuery}`,
    }
  ];

  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages,
    temperature: 0.3,
  });

  console.log("Summary:", response.choices[0].message.content.trim());
}

ragPrompting();


Prompt Chaining: Multiple prompts are used in sequence, where the output of one becomes the input to another.


const { OpenAI } = require("openai");

const openai = new OpenAI({ apiKey: "your-api-key" });

async function promptChaining() {
  const originalText = `Artificial intelligence enables machines to learn from data, make decisions, and solve problems. It is used in speech recognition, medical diagnosis, and robotics.`;

  // Step 1: Summarize
  const summaryResponse = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "user", content: `Summarize this text in 2 sentences:\n${originalText}` }
    ]
  });

  const summary = summaryResponse.choices[0].message.content.trim();
  console.log("Step 1 - Summary:", summary);

  // Step 2: Simplify
  const simplifyResponse = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "user", content: `Rewrite this summary for a 10-year-old:\n${summary}` }
    ]
  });

  const simpleVersion = simplifyResponse.choices[0].message.content.trim();
  console.log("Step 2 - Simplified:", simpleVersion);
}

promptChaining();



Few-Shot with CoT Prompting: Combines both few-shot examples and step-by-step reasoning.

Q: If there are 3 red balls and 2 blue balls in a bag, and I pick one at random, what is the probability of picking a red ball?
A: Let's think step by step.
There are 3 + 2 = 5 balls in total.
The number of red balls is 3.
So the probability of picking a red ball is 3 out of 5, or 3/5.

Q: A box contains 4 green apples and 6 yellow apples. What is the probability of picking a green apple?
A: Let's think step by step.
There are 4 + 6 = 10 apples in total.
The number of green apples is 4.
So the probability of picking a green apple is 4 out of 10, or 4/10.

Q: A jar contains 5 marbles: 2 red, 1 blue, and 2 green. What is the probability of picking a blue marble?
A: Let's think step by step.
There are 2 + 1 + 2 = 5 marbles in total.
The number of blue marbles is 1.
So the probability of picking a blue marble is 1 out of 5, or 1/5.


const { OpenAI } = require("openai");
const openai = new OpenAI({ apiKey: "your-api-key" });

async function fewShotCoT() {
  const prompt = `
Q: If there are 3 red balls and 2 blue balls in a bag, and I pick one at random, what is the probability of picking a red ball?
A: Let's think step by step.
There are 3 + 2 = 5 balls in total.
The number of red balls is 3.
So the probability of picking a red ball is 3 out of 5, or 3/5.

Q: A box contains 4 green apples and 6 yellow apples. What is the probability of picking a green apple?
A: Let's think step by step.
There are 4 + 6 = 10 apples in total.
The number of green apples is 4.
So the probability of picking a green apple is 4 out of 10, or 4/10.

Q: A jar contains 5 marbles: 2 red, 1 blue, and 2 green. What is the probability of picking a blue marble?
A: Let's think step by step.`;

  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    temperature: 0.3,
  });

  console.log("Answer:", response.choices[0].message.content.trim());
}

fewShotCoT();




Contrastive Prompting: Uses both positive and negative examples to steer model responses.



const { OpenAI } = require("openai");
const openai = new OpenAI({ apiKey: "your-api-key" });

async function contrastivePrompting() {
  const prompt = `
Instruction: Rewrite the following message into a formal email.

❌ Bad Example:
Input: hey boss, can't come today. sick af.
Output: yo boss i’m out sick lol

✅ Good Example:
Input: hey boss, can't come today. sick af.
Output: Dear Sir/Madam, I am feeling unwell and will not be able to come to work today. Kind regards.

Input: gotta skip the meeting, my internet's dead.
Output:
`;

  const response = await openai.chat.completions.create({
    model: "gpt-3.5-turbo",
    messages: [{ role: "user", content: prompt }],
    temperature: 0.3,
  });

  console.log("Formal Email:", response.choices[0].message.content.trim());
}

contrastivePrompting();
