Last Updated on June 3, 2024 by Editorial Team
Author(s): Vishesh Kochher
Originally published on Towards AI.

The Verbal Revolution: Unlocking Prompt Engineering with Langchain

Peter Thiel, the visionary entrepreneur and investor, mentioned in a recent interview that a post-AI society may favour strong verbal skills over math skills. This provocative statement should make one thing clear: the ability to communicate effectively with machines is becoming a new currency of success. But what does this mean for AI development?

Source: python.langchain.com

In this article, we'll explore the world of prompt engineering using Langchain, which is to LLM application development roughly what scikit-learn is to classical machine learning. We'll delve into the various types of prompts, the roles they can play, and how to build smarter, dynamic prompts that unlock the full potential of AI. Buckle up, and let's dive into the fascinating world of prompt engineering with Langchain!

What actually is Prompt Engineering?

Prompt engineering can be approached from two separate but interdependent perspectives: the Linguist and the Coder.

The Linguist

The incredible power of LLMs is best leveraged by giving instructions in a very specific format and linguistic style. Remember, this is a neural network of a few billion parameters (or neurons), and we are trying to activate certain pathways through literary input. While no single format has been proven best, several methods are widely adopted. A linguist may draft a prompt from a series of functional parts:

- Role: Sets the tone or 'persona' the LLM should adopt for the task.
- Task: Spells out what the LLM should accomplish.
- Question: The user input (for example: "What hotel discounts are available?").
- Relevant Offers (Context): A list of relevant items, most often populated from the results of a RAG workflow, or added manually when the prompt is written. If someone is analyzing the annual report of Meta or Nvidia, this would hold the excerpts of the report most relevant to the user's question. Chat history may be included here as well.
- Task description and specifics: Spells out the series of steps the LLM should take and the importance the task holds. This sense of 'utmost importance' seems to make LLMs perform better.
- Context: Further explains the task to the LLM and provides additional background.
- Examples: A few examples help the LLM infer the task and input properly, guiding its reasoning and shaping the output into the desired format and structure. Here the concepts of 'Zero-Shot' vs 'One-Shot' vs 'Few-Shot' learning are pivotal.
- Notes: Reiterate the most important points. "Lost in the middle" is a real battle for anyone working with large prompts, so ending your prompt with notes on the essentials makes the LLM more reliable.

In the example below, we walk through a simple yet elaborate prompt layout using the above components:

```
# Role
You are a virtual concierge who is able to assist in finding suitable offers and benefits. You have a key attention to detail and a high level of geographic and temporal awareness.

# Task
For the provided list of relevant offers, you should answer the user's question accurately. Do not add any additional information beyond what is mentioned in the provided context.

## Question:
{input}

## Relevant Offers:
{context}

You may follow these steps in order to accurately answer the question:
1. Collate all the relevant offers and provide a crisp answer with bullet points and details from the relevant offer listings.
2. Review your answer to ensure that there is no error in your final reply.

# Specifics
- This task is extremely important for our organization and all the stakeholders.
- Our members' satisfaction depends on you being able to correctly answer the provided Question.
- Do not hallucinate any answers.

# Context
- Users ask questions to find out details about discounts and offers at various establishments like hotels, restaurants, hospitals and airlines.
- Your accurate results enable members to be well informed about relevant offers for them.

# Examples
Question: What benefits can I avail at hotels in India?
Answer: Based on our current offers you can avail the following benefits:
1. 10% off at Taj Hotels.
2. 15% off on F&B at all Clarks properties in Delhi, Jaipur, Agra. This includes ...
3. ...
Near Delhi, there are also benefits at hotels in Dehradun and Chandigarh that you may like to explore.

# Notes
- Provide accurate results about relevant offers.
- Remember to follow the steps provided in order to execute the task effectively.
- Provide a crisp answer with bullet points and details from the relevant offer listings.
```
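A layout like this maps naturally onto a LangChain prompt template. The sketch below shows one possible wiring of a condensed version of the concierge prompt (not the full text above); the import path assumes a recent langchain-core release (older versions expose the same class under langchain.prompts), and the sample context string is an illustrative placeholder.

```python
# A minimal sketch: wrapping a condensed concierge prompt in a reusable
# LangChain template. Assumes a recent langchain-core; older releases expose
# ChatPromptTemplate under `langchain.prompts` instead.
from langchain_core.prompts import ChatPromptTemplate

system_template = """# Role
You are a virtual concierge who helps members find suitable offers and benefits.

# Task
Answer the user's question accurately using only the relevant offers provided.
Do not add any information beyond the provided context.

## Relevant Offers:
{context}

# Notes
- Provide a crisp answer with bullet points and details from the relevant offer listings."""

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", system_template),
        ("human", "{input}"),  # the user's question fills this slot at runtime
    ]
)

# format_messages substitutes the placeholders; in a real chain you would pipe
# `prompt` into a chat model (e.g. prompt | llm) instead of formatting by hand.
messages = prompt.format_messages(
    context="1. 10% off at Taj Hotels.\n2. 15% off on F&B at Clarks properties.",
    input="What hotel discounts are available?",
)
for message in messages:
    print(f"[{message.type}]\n{message.content}\n")
```

Keeping the static instructions in the system message and the volatile pieces ({context}, {input}) as template variables is what lets the same prompt be reused across an entire RAG pipeline.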
Note: Over the past months, various best-practice techniques have been released, each well suited to certain tasks and activities. Some of the most effective of these are Chain-of-Thought (CoT), ReAct, ART and Self-Ask, although this topic is in high flux and we may see even better techniques soon.

The Coder

While linguistics plays a crucial role in crafting effective prompts, prompt engineering extends far beyond the realm of language and semantics. As AI developers, we know that prompt engineering is not just about designing clever phrases or sentences; it's about engineering and integrating scalable prompt pipelines into AI process chains. This means creating a seamless flow of prompts that can be easily adapted, modified, and fine-tuned to optimize AI performance. With Langchain as our framework of choice, we'll delve into the aspects of prompt engineering that go beyond linguistics:

- One-Shot and Few-Shot Prompts: Design prompts that carry a single example or a handful of examples, enabling models to adapt quickly to new tasks and domains.
- Dynamic Example Selectors: Build prompts that dynamically select the most relevant examples at runtime, so the model learns from the most informative and diverse data (a minimal sketch follows this list).
- Partial Prompts: Create prompts that can be composed of multiple parts, allowing AI […]
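To make the first two of these concrete, here is a minimal sketch of a few-shot prompt paired with a dynamic example selector. Imports assume a recent langchain-core; the example questions and answers are illustrative placeholders rather than real offer data.

```python
# A minimal sketch of a few-shot prompt with a dynamic example selector.
# Assumes a recent langchain-core; the examples below are illustrative
# placeholders, not real offer data.
from langchain_core.example_selectors import LengthBasedExampleSelector
from langchain_core.prompts import FewShotPromptTemplate, PromptTemplate

examples = [
    {
        "question": "What benefits can I avail at hotels in India?",
        "answer": "10% off at Taj Hotels; 15% off on F&B at Clarks properties.",
    },
    {
        "question": "Are there any dining offers in Delhi?",
        "answer": "15% off on F&B at Clarks properties in Delhi.",
    },
]

# How each selected example is rendered inside the final prompt.
example_prompt = PromptTemplate.from_template("Question: {question}\nAnswer: {answer}")

# A simple selector that keeps only as many examples as fit a length budget;
# semantic-similarity selectors (backed by a vector store) follow the same pattern.
selector = LengthBasedExampleSelector(
    examples=examples,
    example_prompt=example_prompt,
    max_length=80,  # rough word budget for the examples section
)

few_shot_prompt = FewShotPromptTemplate(
    example_selector=selector,
    example_prompt=example_prompt,
    prefix="You are a virtual concierge. Answer using only the relevant offers.",
    suffix="Question: {input}\nAnswer:",
    input_variables=["input"],
)

print(few_shot_prompt.format(input="What hotel discounts are available?"))
```

Swapping the length-based selector for a semantic-similarity selector is a small change once a vector store is in place, which is what makes the 'dynamic' part scale with a growing example pool.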