Prompt Engineering
INTERNSHIP REPORT
ON
Prompt Engineering and ChatGPT
SHORT-TERM INTERNSHIP
(Onsite / Virtual)
An Internship Report on
BACHELOR OF TECHNOLOGY
Under the Faculty Guideship of
Miss. VijayaLakshmi
- Assistant Professor
Department of
CSE-ARTIFICIAL INTELLIGENCE
Submitted by:
DUDIPALLI VASANTHI
21JR1A4322
STUDENT'S DECLARATION
OFFICIAL CERTIFICATE
Endorsements
Faculty Guide
ABSTRACT
INDEX
CONTENTS
Introduction
ChatGPT Introduction
OBJECTIVES
An objective for this position should emphasize the skills you already possess in the area and your interest in learning more.
Internships are a great way to build your resume and develop skills that can be highlighted when applying for future jobs. When applying for a training internship, make sure to emphasize any special skills or talents that set you apart from the other applicants, so that you have an improved chance of landing the position.
Learn how to create, refine, and optimize prompts to generate accurate and
relevant outputs from ChatGPT. Gain insight into the functioning of
ChatGPT, exploring how prompt structure affects response quality and
coherence.
Introduction
Prompt engineering is the process of creating instructions for AI models to understand and interpret. It involves
crafting prompts to guide AI models to produce accurate, relevant, and valuable responses. Prompt engineering
is the practice of designing inputs for AI tools that will produce optimal outputs.
A prompt is natural-language text that instructs a generative AI model on the specific task to be performed. Generative AI, which is built on large machine learning models, can create a wide variety of content, including stories, conversations, videos, images, and music.
These models are highly versatile: based on what they were exposed to during training, they can summarize documents, complete sentences, answer questions, and translate between languages. However, they need context and sufficiently detailed information to produce sound, meaningful output and to minimize inaccuracies.
Accordingly, prompt engineering describes creating and refining prompts so that the AI generates usable and meaningful content. This iterative cycle ensures that AI models respond appropriately to a wide range of user input and, ultimately, can help strengthen the customer experience.
Prompt engineering is enabled by in-context learning, defined as a model's ability to learn temporarily from the prompts it is given. In-context learning is an emergent ability of large language models and a property of model scale, meaning breaks in downstream scaling laws occur such that its efficacy increases at a different rate in larger models than in smaller ones.
Prompt engineering may involve phrasing a query, specifying a style, providing relevant context, or assigning a role to the AI, such as "Act as a native French speaker". A prompt may also include a few examples for the model to learn from, such as asking it to complete "maison → house, chat → cat, chien → ___", an approach called few-shot learning.
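The following is a minimal sketch of few-shot prompting, assuming the legacy (pre-1.0) openai Python interface used in the ChatGPT examples later in this report; the translation pairs and model choice are only illustrative.

import openai  # legacy (pre-1.0) SDK interface, matching the examples later in this report

# Few-shot prompt: worked examples precede the item the model should complete.
few_shot_prompt = (
    "Translate French words to English.\n"
    "maison -> house\n"
    "chat -> cat\n"
    "chien ->"
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # illustrative model choice
    messages=[{"role": "user", "content": few_shot_prompt}],
    temperature=0,  # deterministic output makes the pattern completion easy to check
)
print(response["choices"][0]["message"]["content"])  # expected completion: "dog"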
Prompt engineering helps make AI applications more efficient and effective. Application developers typically
include open-ended user input within a prompt before sending it to the AI model.
• Prompt types
Prompts can be queries, commands, or longer statements that provide context, instructions, and conversation
history.
• Prompt techniques
Some techniques include chain-of-thought prompting, which provides a series of prompts to help the AI develop a deeper understanding of a task. Prompt chaining is another technique that breaks a complex task down into smaller sub-tasks; a minimal sketch of it appears after this list.
• Prompt iteration
Prompt engineering is an iterative process that involves experimenting with different ideas and testing the AI
prompts.
• Prompt injection
A related adversarial practice in which malicious input prompts are crafted to alter the behaviour of an AI system.
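As a rough sketch of prompt chaining, the snippet below feeds the output of one prompt into the next; the ask_model helper is hypothetical and stands in for whichever chat completion API is available, and the summarize-then-translate task is only an example.

def ask_model(prompt: str) -> str:
    """Hypothetical helper: send one prompt to a chat model and return its reply text."""
    raise NotImplementedError("wire this to the chat completion API of your choice")

def summarize_then_translate(article: str) -> str:
    # Sub-task 1: condense the source text.
    summary = ask_model("Summarize the following article in three sentences:\n\n" + article)
    # Sub-task 2: the output of the first prompt becomes the input of the next.
    return ask_model("Translate this summary into French:\n\n" + summary)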
Prompt engineering is an essential aspect of advanced language models like ChatGPT. It helps users engage
with the system and receive accurate responses. It helps researchers understand the capabilities and limitations
of large language models (LLMs).
1. Role
A role denotes the position where the prompt assumes an individual, which helps the AI create a response
relevant to that persona.
An example could be, “Technical support specialist: A customer has inquired about how to troubleshoot
software issues.”
Using the term “technical support specialist” allows the AI to create a response in a technical tone appropriate
for customer support.
2. Instruction/Task
This refers to a clear outline of what specific action or response the AI is expected to generate.
For example, “Compose a product description for a new smartphone model that captures both key features
and benefits”, is a prompt asking the AI to generate a product description that mainly emphasizes the features
and benefits, leading the response in a marketing-oriented direction.
3. Questions
A question asks the AI to provide information or answers in a particular area, focusing its attention and constraining its feedback.
4. Context
Adding further contextual information helps adapt the AI-generated response to the relevant scenario,
enhancing the material’s relevance and accuracy.
By way of illustration, when given a prompt such as “With the patient’s medical history provided below,
outline potential treatment approaches for this condition,” the AI can respond with suggestions specific to the
actual patient’s health based on the medical history as context.
5. Example
An effective learning strategy can be adding examples to the prompts, which further attracts the AI’s attention
and sets clear expectations for the type of information required.
For instance, the prompt the author offers is, “Given the beginning and ending of a story, fill in the narrative
with plot details and character development.”
The AI is given a ready-made story structure to fill in and will tend to produce a narrative that aligns with the writing pattern it has been shown.
Integrating these elements into prompts enables prompt engineers to accurately convey the intended task or
query to AI models. Eventually, this results in more accurate, relevant, and contextually fitting responses, thus
improving the usability and effectiveness of AI text generation systems in different applications and domains.
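The sketch below illustrates how these elements can be combined in a single request, assuming the legacy (pre-1.0) openai interface used elsewhere in this report; the support-specialist persona and the crash scenario are illustrative, not part of the original text.

import openai  # legacy (pre-1.0) SDK interface, matching the examples later in this report

messages = [
    # Role: the persona the model should adopt.
    {"role": "system", "content": "You are a technical support specialist for a software vendor."},
    # Context plus instruction/task: the situation and the response being asked for.
    {"role": "user", "content": (
        "A customer reports that the application crashes on startup after the latest update. "
        "Write a short, step-by-step troubleshooting reply in a professional tone."
    )},
]

response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
print(response["choices"][0]["message"]["content"])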
2. Master GPT architecture: Thoroughly study the architecture and principles of GPT-based models to effectively utilize them for prompt engineering tasks.
1. Chain-of-thought prompting
Chain-of-thought prompting is an AI technique that allows complex questions or problems to be broken down
into smaller parts. This technique is based on how humans approach a problem—they analyse it, with each
part investigated one at a time. When the question is broken into smaller segments, the artificial intelligence
model can analyse the problem more thoroughly and give a more accurate answer.
For example, given a question: “How does climate change affect biodiversity?” Instead of directly providing
an answer, an AI model that uses chain-of-thought prompting would break the question into three components
or subproblems. The subproblems might include:
• The effect of climate change on temperature
• The effect of temperature change on habitats
• The effect of habitat destruction on biodiversity
Then, the model starts analysing and investigating how the changed climate affects temperature, how
temperature change affects habitat, and how the destruction of a habitat affects biodiversity.
This approach allows the model to address each part of the issue and give a more detailed answer to the initial
question of the influence of climate change on biodiversity.
Example:
Q: {question}
A: Let's think step by step.
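A minimal sketch of applying this template follows; the ask_model helper is hypothetical, and the reasoning cue simply invites the model to spell out intermediate steps before its final answer.

def ask_model(prompt: str) -> str:
    """Hypothetical helper wrapping a chat completion call."""
    raise NotImplementedError

question = "How does climate change affect biodiversity?"

# The trailing cue invites the model to lay out intermediate steps
# (temperature shifts -> habitat loss -> biodiversity decline) before answering.
cot_prompt = "Q: " + question + "\nA: Let's think step by step."
print(ask_model(cot_prompt))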
2. Tree-of-thought prompting
Tree-of-thought prompting builds upon chain-of-thought prompting. It expands on it by asking the model to
generate possible next steps and elaborate on each using a tree search method. For instance, if asked, “What
are the effects of climate change?” the model would generate possible next steps like listing environmental
and social effects and further elaborating on each.
Tree-of-thought prompting generalizes chain-of-thought by prompting the model to generate one or more
"possible next steps", and then running the model on each of the possible next steps by breadth-first, beam, or
some other method of tree search.
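The following is a much-simplified, breadth-first sketch of this idea; the prompts and the hypothetical ask_model helper are assumptions, and a full implementation would also score and prune the candidate branches rather than keeping all of them.

def ask_model(prompt: str) -> str:
    """Hypothetical helper wrapping a chat completion call."""
    raise NotImplementedError

def tree_of_thought(question: str, depth: int = 2, branching: int = 3) -> list:
    """Breadth-first expansion of partial reasoning paths; scoring and pruning omitted."""
    paths = [""]  # each entry is one partial chain of reasoning
    for _ in range(depth):
        next_paths = []
        for path in paths:
            proposal = ask_model(
                "Question: " + question + "\n"
                "Reasoning so far: " + (path or "(none)") + "\n"
                "List " + str(branching) + " possible next reasoning steps, one per line."
            )
            for step in proposal.splitlines()[:branching]:
                next_paths.append((path + "\n" + step).strip())
        paths = next_paths
    return paths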
3. Maieutic prompting
Maieutic prompting is a technique used to make models explain how they came to give a particular response,
reason, or answer. In this case, one first prompts the model, asking why they gave a particular answer before
subsequently asking them to talk more about the first answer. The essence of repetitive questioning is to ensure
that the model provides better responses to complex reasoning questions through enhanced understanding.
For instance, consider the question, “Why is renewable energy important?” With maieutic prompting, the AI
model would simply say renewable energy is important because it reduces greenhouse gases. The subsequent prompt would then ask the model to elaborate on particular aspects of that response; for instance, it might direct the model to explain how wind and solar power could replace fossil fuels and help mitigate climate change. As a result, the AI model develops a better understanding and provides better subsequent responses on the importance of renewable energy.
Example:
Q: {question}
A: True, because ...
Q: {question}
A: False, because ...
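A rough sketch of the repeated-questioning idea follows; the ask_model helper and the exact wording of the follow-up prompt are assumptions introduced for illustration only.

def ask_model(prompt: str) -> str:
    """Hypothetical helper wrapping a chat completion call."""
    raise NotImplementedError

question = "Why is renewable energy important?"

# First pass: ask for an answer plus the reasoning behind it.
first = ask_model("Q: " + question + "\nAnswer briefly, then explain why.")

# Second pass: question the explanation itself to force deeper reasoning.
follow_up = ask_model(
    "You previously answered:\n" + first + "\n\n"
    "Expand on that explanation: how specifically would wind and solar power "
    "displace fossil fuels, and what effect would that have on emissions?"
)
print(follow_up)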
4. Complexity-based prompting
This method involves performing chain-of-thought rollouts and selecting the rollouts with the most extended
chains of thought. For instance, in solving a complex math problem, the model would consider rollouts with
the most calculation steps to reach a common conclusion.
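The sketch below illustrates one way this selection could work, assuming a hypothetical sample_rollouts helper that draws several chain-of-thought completions; counting lines as a proxy for reasoning steps is a deliberate simplification.

from collections import Counter

def sample_rollouts(question: str, n: int = 5) -> list:
    """Hypothetical helper: sample n chain-of-thought completions at non-zero temperature."""
    raise NotImplementedError

def complexity_based_answer(question: str, keep: int = 3) -> str:
    rollouts = sample_rollouts(question)
    # Prefer the rollouts with the longest reasoning chains (crudely measured in lines).
    longest = sorted(rollouts, key=lambda r: len(r.splitlines()), reverse=True)[:keep]
    # Majority vote over the final line (the conclusion) of each kept rollout.
    conclusions = [r.splitlines()[-1] for r in longest]
    return Counter(conclusions).most_common(1)[0][0]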
6. Least-to-most prompting
Using the least-to-most prompting technique, the model will list the subproblems involved in solving a given
task. Then, the model solves the subproblems in a sequence to ensure that every subsequent step uses the
solutions to the previous ones. For example, a user may prompt the model with the following cooking-themed least-to-most example: the user says, "Bake a cake for me." The model's first output would include subproblems such as "preheat the oven" and "mix the ingredients," and it would then work through those steps in order until the cake is baked.
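A minimal sketch of least-to-most prompting follows: the model is first asked to list subproblems, which are then solved in order with earlier solutions carried forward. The ask_model helper and the prompt wording are assumptions for illustration.

def ask_model(prompt: str) -> str:
    """Hypothetical helper wrapping a chat completion call."""
    raise NotImplementedError

def least_to_most(task: str) -> str:
    # Step 1: have the model list the subproblems for the task.
    plan = ask_model("List, one per line, the subproblems needed to: " + task)
    solved = []
    # Step 2: solve the subproblems in order, carrying earlier solutions forward.
    for sub in [s for s in plan.splitlines() if s.strip()]:
        answer = ask_model(
            "Task: " + task + "\nAlready solved: " + repr(solved) +
            "\nNow solve this subproblem: " + sub
        )
        solved.append((sub, answer))
    # Step 3: combine the partial solutions into a final result.
    return ask_model("Task: " + task + "\nUsing these solved subproblems " + repr(solved) +
                     ", give the final result.")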
7. Self-refine prompting
Self-refine prompting involves having the model solve a problem, critique its own solution, and then solve the problem again while taking both the original problem and the critique into account. For example, when asked to write an essay, the model first writes a draft, then criticizes it, for instance for lacking explicit examples, and finally rewrites it to address that criticism.
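A compact sketch of this solve-criticize-resolve loop is shown below; the ask_model helper, the number of rounds, and the prompt wording are assumptions introduced only to make the pattern concrete.

def ask_model(prompt: str) -> str:
    """Hypothetical helper wrapping a chat completion call."""
    raise NotImplementedError

def self_refine(task: str, rounds: int = 2) -> str:
    draft = ask_model(task)  # initial solution
    for _ in range(rounds):
        critique = ask_model("Task: " + task + "\nDraft:\n" + draft + "\n\nCriticize this draft.")
        draft = ask_model(  # re-solve with the critique in view
            "Task: " + task + "\nDraft:\n" + draft + "\nCritique:\n" + critique +
            "\n\nRewrite the draft, addressing the critique."
        )
    return draft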
8. Directional-stimulus prompting
Directional-stimulus prompting involves giving the model hints about what it should write. For example, if I ask the model to write a poem about love, I might suggest including "heart," "passion," and "eternal." Such hints help the model produce favourable outputs across various tasks and domains.
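In code, a directional stimulus can be as simple as appending hint keywords to the request; the hints below are the ones from the example above and the prompt can then be sent like any other.

# Directional stimulus: hint keywords steer the output without dictating it outright.
hints = ["heart", "passion", "eternal"]
prompt = (
    "Write a short poem about love.\n"
    "Hint: try to work in these words: " + ", ".join(hints) + "."
)
# `prompt` can then be sent to the model like any other single-turn request.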
9. Zero-shot prompting
Zero-shot prompting represents a game-changer for natural language processing (NLP), as it allows AI models to produce answers without being given task-specific training data or a set of examples. Zero-shot prompting stands apart from traditional approaches because the system can draw on knowledge and relationships already encoded in its parameters from training.
A classic example is a large language model trained on a broad range of text uploaded to the internet but with no specific preparation on medical topics. When the model is prompted with the phrase "What are the symptoms of COVID-19?" through zero-shot prompting, it recognizes the structure and context of the question and composes an answer from its understanding of related subjects seen during training.
Although the system was never explicitly told about the disease, it accurately lists its symptoms, such as fever, cough, and loss of taste and smell, showcasing the model's ability to generalize and adapt. Zero-shot prompting is a cutting-edge breakthrough in NLP that can drive machine capabilities to solve language tasks efficiently and effectively without task-specific examples or data.
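A minimal zero-shot sketch follows, assuming the legacy (pre-1.0) openai interface used elsewhere in this report; note that, unlike the few-shot example earlier, the prompt carries only the task itself.

import openai  # legacy (pre-1.0) SDK interface, matching the examples later in this report

# Zero-shot: the prompt contains only the task, with no worked examples.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What are the symptoms of COVID-19?"}],
    temperature=0,
)
print(response["choices"][0]["message"]["content"])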
Benefits
1. Enhanced control
Prompt engineering gives users more control over AI than ever by letting them steer model behaviour directly through prompts. This, in turn, ensures that the generated content closely matches the user's needs and expectations. As stated earlier, the same mechanism can be employed across different writing tasks, including, but not limited to, content generation, summarization, and translation.
2. Improved relevance
It ensures that generated outputs are contextually grounded and aligned with their intended purpose. This increases the practicality and quality of AI-based text products across different spheres.
3. Increased efficiency
Effective prompts help develop an AI targeted in its approach to text generation through proper direction on
specific tasks or topics. This automation is beneficial as it increases efficiency and reduces the need for manual
involvement. Hence, time and resources are saved by optimizing the process downstream.
4. Versatility
Prompt engineering approaches can be used across various text generation tasks and domains, making them valuable for content generation, language translation, summarization, and a broad range of other applications.
5. Customization
Prompt engineering is all about creating a suitable basis for the design of AI-driven products, taking into account a customer's needs, tastes, and target audience. This flexibility makes it possible to tailor content to a person's particular goals and targets.
Limitations
1. Prompt quality reliance
The output quality heavily depends on the prompts’ quality and precision. Poorly designed prompts may lead
to inaccurate or irrelevant AI-generated outputs, thus diminishing the overall quality of results.
2. Domain specificity
Optimal results in prompt engineering may require domain-specific understanding and expertise. Without sufficient domain knowledge, a person may struggle to produce effective prompts to guide the AI model, limiting applicability in some domains.
3. Potential bias
Biased prompts or training data can introduce bias into AI-generated outputs, leading to inaccurate or unfair results. To mitigate such outcomes, care should be taken in prompt engineering when designing prompts and choosing data sets.
1. Content generation
Prompt engineering is extensively employed in content generation tasks, including writing articles, generating
product descriptions, and composing social media posts. By crafting tailored prompts, content creators can
guide AI models to produce engaging and informative content that resonates with the target audience.
2. Language translation
Prompt engineering is a valuable tool for accurate and contextually relevant language translation between
different languages. By giving specific instructions, translators can direct AI models to produce translations that capture the finer points and intricacies of the original text, leading to high-quality translations.
3. Text summarization
Prompt engineering is instrumental in text summarization tasks, where lengthy documents or articles must be
condensed into concise and informative summaries. By crafting prompts that specify the desired summary
length and key points, prompt engineers can guide AI models to generate summaries that capture the essence
of the original text.
4. Dialogue systems
Dialogue systems like chatbots and virtual assistants rely on prompt engineering to facilitate natural and
engaging user interactions. By designing prompts that anticipate user queries and preferences, prompt
engineers can guide AI models to generate relevant, coherent, and contextually appropriate responses,
enhancing the overall user experience.
5. Information retrieval
In the information retrieval domain, prompt engineering enhances search engines’ capabilities to retrieve
relevant and accurate information from vast data repositories. By crafting prompts that specify the desired
information and criteria, prompt engineers can guide AI models to generate search results that effectively meet
the user’s information needs.
6. Code generation
Prompt engineering is increasingly applied in code generation tasks, where AI models are prompted to
generate code snippets, functions, or even entire programs. Prompt engineers can guide AI models to generate
code that fulfils the desired functionality by providing clear and specific prompts, thus streamlining software
development and automation processes.
7. Educational tools
Prompt engineering is employed in educational tools and platforms to provide personalized learning
experiences for students. By designing prompts that cater to individual learning objectives and proficiency
levels, prompt engineers can guide AI models to generate educational content, exercises, and assessments
tailored to the needs of each student.
The next tool in the list of top prompt engineering tools is ChatGPT. Developed by OpenAI, it has revolutionized how we interact with AI, offering conversational capabilities spanning applications from tutoring and content creation to technical support. In 2024, its advanced language understanding and generation make it one of the most versatile prompt engineering tools. It is both a tool and a platform that embodies the principles of effective prompt engineering through its responsive and adaptive dialogue model.
Key Features
1. State-of-the-art natural language processing capabilities.
4. Extensive API support for integration with other services and platforms.
1. Define the objective: Establish a clear purpose for the prompt. For instance, when summarizing a
news article, the aim is to obtain a brief, informative overview.
2. Construct an initial prompt: Create a simple, concise prompt to begin the process. For example:
"Summarize the following news article:". This can be refined and expanded as needed.
3. Evaluate and iterate: Assess the output generated by ChatGPT in response to the initial prompt.
Modify the prompt as necessary to improve the outcome. For instance: "Provide a concise, 3-sentence
summary of the following news article:". Iterate until the desired result is achieved.
4. Employ control mechanisms: Experiment with various control techniques, such as tokens, prefixes,
or postfixes, to guide the AI's response. For example, prepend a sentence like "Using a professional
tone," to influence the tone of the generated text.
5. Leverage automation in prompt design: Explore tools and methodologies for automating prompt
design, such as employing machine learning algorithms to generate or optimize prompts based on
specific datasets.
By adhering to this structured approach and utilizing real-world examples, you will develop a strong
understanding of prompt engineering, enabling you to create effective prompts for ChatGPT.
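The sketch below ties the steps above together into a single helper, assuming the legacy (pre-1.0) openai interface used later in this report; the initial and refined prompts and the tone-control prefix mirror the examples in the steps and are otherwise illustrative.

import openai  # legacy (pre-1.0) SDK interface, matching the examples elsewhere in this report

def summarize(article: str, refined: bool = True) -> str:
    # Steps 2 and 3: an initial prompt and a refined variant produced after evaluating output.
    instruction = (
        "Provide a concise, 3-sentence summary of the following news article:"
        if refined
        else "Summarize the following news article:"
    )
    # Step 4: a control prefix that nudges the tone of the generated text.
    messages = [
        {"role": "system", "content": "Using a professional tone, respond to the user."},
        {"role": "user", "content": instruction + "\n\n" + article},
    ]
    response = openai.ChatCompletion.create(model="gpt-3.5-turbo", messages=messages)
    return response["choices"][0]["message"]["content"]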
Continued practice is essential for refining your prompt engineering skills. As you gain experience, you will
develop a more intuitive understanding of how to craft the ideal prompt for any situation. Additionally, do not
hesitate to consult online resources, such as GitHub repositories, for further inspiration and examples.
ChatGPT Introduction
ChatGPT, a new model trained by OpenAI, is designed to interact conversationally. It can follow instructions
in a prompt to provide appropriate responses within the context of a dialogue. ChatGPT can assist with
answering questions, suggesting recipes, writing lyrics in a certain style, generating code, and much more.
This model is trained using Reinforcement Learning from Human Feedback (RLHF). While it is more capable
than previous GPT iterations and trained to reduce harmful and untruthful outputs, it still has limitations. We
will cover some of the capabilities and limitations with concrete examples.
For example, a chat using the ChatGPT API would look like this:
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are an AI research assistant. You use a tone that is technical and scientific."},
        {"role": "user", "content": "Hello, who are you?"},
        {"role": "assistant", "content": "Greeting! I am an AI research assistant. How can I help you today?"},
        {"role": "user", "content": "Can you tell me about the creation of black holes?"}])
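With the legacy (pre-1.0) openai SDK shown above, the assistant's reply can then be read from the returned response object:

print(response["choices"][0]["message"]["content"])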
Single-turn Tasks
ChatGPT also supports single-turn tasks, similar to those used with text-davinci-003. This means developers
can use ChatGPT to perform tasks previously demonstrated for the original GPT models, such as question-
answering tasks. Here's an example of a single-turn task using ChatGPT:
import openai

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are an AI research assistant. You use a tone that is technical and scientific."},
        {"role": "user", "content": "Hello, who are you?"},
        {"role": "assistant", "content": "Greeting! I am an AI research assistant. How can I help you today?"},
        {"role": "user", "content": "Can you tell me about the creation of black holes?"}],
    temperature=0)
1. PromptBase
The first tool in the list of top prompt engineering tools is PromptBase, which stands out as a marketplace
uniquely designed for buying and selling high-quality AI prompts. In 2024, it has become a go-to platform for
prompt engineering professionals and enthusiasts looking to explore the potential of language models without
delving into the complexities of model training or prompt crafting from scratch. Its user-friendly interface and
vast collection of prompts across various domains make it an invaluable resource for those seeking to leverage
AI for content creation, coding, design, and more.
Key Features
2. User ratings and reviews for each prompt to ensure quality and relevance.
2. OpenPrompt
OpenPrompt is a toolkit designed to simplify prompt engineering for language models. It offers an open-source
framework that supports developing, testing, and deploying prompts across various models and tasks. Its
versatility and comprehensive feature set make it particularly appealing to researchers and developers looking
to experiment with and optimize prompt-based interactions with AI systems.
Key Features
3. OpenAI
OpenAI, the organization behind groundbreaking models like ChatGPT, GPT-4, and DALL-E, offers more than just AI models: it provides a comprehensive ecosystem for developing AI applications.
Its platform and APIs facilitate access to state-of-the-art AI capabilities, empowering developers, researchers,
and businesses to innovate and create with AI. OpenAI's tools are designed to be versatile, scalable, and
accessible, making advanced AI technologies available to a wide audience.
Key Features
4. Emergent Mind
Emergent Mind is a cutting-edge prompt engineering tool designed to facilitate creating and managing AI-
generated content, focusing on enhancing creativity and productivity. It integrates seamlessly with various AI
models to offer content creators, marketers, and educators a streamlined workflow. Emergent Mind
emphasizes ease of use and flexibility, enabling users to harness the power of AI for creative endeavours
without needing deep technical knowledge.
Key Features
Conclusion
By mastering prompt engineering, developers can unlock the full potential of AI language models like
ChatGPT, creating engaging, informative, and dynamic conversational systems. As the field of AI continues
to evolve rapidly, staying up-to-date with the latest techniques, applications, and limitations is essential. This
comprehensive guide provides the foundation you need to excel in prompt engineering and create powerful
AI-driven applications that offer exceptional user experiences.