add new langchain chatgpt notebook

pull/43/head
Elvis Saravia 2023-03-08 01:33:22 -06:00
parent 7121c2faa6
commit 0f2f44d923
5 changed files with 396 additions and 13 deletions

View File

@ -144,6 +144,7 @@ The following are the latest papers (sorted by release date) on prompt engineeri
- [Rethinking the Role of Demonstrations: What Makes In-Context Learning Work?](https://arxiv.org/abs/2202.12837) (Feb 2022)
- [Chain of Thought Prompting Elicits Reasoning in Large Language Models](https://arxiv.org/abs/2201.11903) (Jan 2022)
- [Show Your Work: Scratchpads for Intermediate Computation with Language Models](https://arxiv.org/abs/2112.00114) (Nov 2021)
- [AI Chains: Transparent and Controllable Human-AI Interaction by Chaining Large Language Model Prompts](https://arxiv.org/abs/2110.01691) (Oct 2021)
- [Generated Knowledge Prompting for Commonsense Reasoning](https://arxiv.org/abs/2110.08387) (Oct 2021)
- [Multitask Prompted Training Enables Zero-Shot Task Generalization](https://arxiv.org/abs/2110.08207) (Oct 2021)
- [Reframing Instructional Prompts to GPTk's Language](https://arxiv.org/abs/2109.07830) (Sep 2021)

View File

@ -7,7 +7,7 @@ In this section, we cover the latest prompt engineering techniques for ChatGPT,
Topics:
- [ChatGPT Introduction](#chatgpt-introduction)
- [Reviewing The Conversation Task](#reviewing-the-conversation-task)
- [Converstions with ChatGPT](#conversations-with-chatgpt)
- [Conversations with ChatGPT](#conversations-with-chatgpt)
---
## ChatGPT Introduction
@ -16,7 +16,7 @@ ChatGPT is a new model [trained by OpenAI](https://openai.com/blog/chatgpt) that
ChatGPT is trained using Reinforcement Learning from Human Feedback (RLHF). While this model is a lot more capable than previous GPT iterations (and also trained to reduce harmful and untruthful outputs), it still comes with limitations. Let's cover some of the capabilities and limitations with concrete examples.
You can use the research preview of ChatGPT [here](chat.openai.com) but for the examples we will use the Chat mode on the OpenAI Playground.
You can use the research preview of ChatGPT [here](chat.openai.com) but for the examples below we will use the `Chat` mode on the OpenAI Playground.
---
## Reviewing The Conversation Task
@ -31,7 +31,7 @@ The following is a conversation with an AI research assistant. The assistant ton
Human: Hello, who are you?
AI: Greeting! I am an AI research assistant. How can I help you today?
Human: Can you tell me about the creation of blackholes?
Human: Can you tell me about the creation of black holes?
AI:
```
@ -124,7 +124,9 @@ response = openai.ChatCompletion.create(
)
```
You can try out [this notebook](../notebooks/pe-chatgpt-intro.ipynb) to learn more about how to make calls to the ChatGPT APIs.
You can try out [this notebook](../notebooks/pe-chatgpt-intro.ipynb) to learn more about how to make calls to the ChatGPT APIs using the `openai` library.
You can also try out [this notebook](../notebooks/pe-chatgpt-langchain.ipynb), which provides an introduction to the ChatGPT features available in the `LangChain` library.
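As a quick preview, here is a minimal sketch of the same conversation using LangChain's chat model wrapper (assuming `langchain` is installed and `OPENAI_API_KEY` is set in your environment):

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import AIMessage, HumanMessage, SystemMessage

# chat model instance (temperature=0 for more deterministic outputs)
chat = ChatOpenAI(temperature=0)

# the same multi-turn exchange shown above, expressed as LangChain message objects
chat([
    SystemMessage(content="You are an AI research assistant. You use a tone that is technical and scientific."),
    HumanMessage(content="Hello, who are you?"),
    AIMessage(content="Greeting! I am an AI research assistant. How can I help you today?"),
    HumanMessage(content="Can you tell me about the creation of black holes?")
])
```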
---

View File

@ -23,8 +23,9 @@
"outputs": [],
"source": [
"%%capture\n",
"# update the OpenAI Python library \n",
"!pip install --upgrade openai"
"# update or install the necessary libraries\n",
"!pip install --upgrade openai\n",
"!pip install --upgrade python-dotenv"
]
},
{

View File

@ -0,0 +1,368 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"## ChatGPT with LangChain\n",
"\n",
"This notebook provides a quick introduction to ChatGPT and related features supported in LangChain."
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Install these libraries before getting started. Ideally, you want to create a dedicated environment for this."
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [],
"source": [
"%%capture\n",
"# update or install the necessary libraries\n",
"!pip install --upgrade openai\n",
"!pip install --upgrade langchain\n",
"!pip install --upgrade python-dotenv"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"True"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import openai\n",
"import os\n",
"import IPython\n",
"from langchain.llms import OpenAI\n",
"from dotenv import load_dotenv\n",
"load_dotenv()"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Load environment variables. You can use anything you like but I used `python-dotenv`. Just create a `.env` file with your `OPENAI_API_KEY` then load it as follows:"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [],
"source": [
"os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY\")"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"We are adapting code from [here](https://langchain.readthedocs.io/en/latest/modules/chat/getting_started.html)."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [],
"source": [
"from langchain.chat_models import ChatOpenAI\n",
"from langchain import PromptTemplate, LLMChain\n",
"from langchain.prompts.chat import (\n",
" ChatPromptTemplate,\n",
" SystemMessagePromptTemplate,\n",
" AIMessagePromptTemplate,\n",
" HumanMessagePromptTemplate,\n",
")\n",
"from langchain.schema import (\n",
" AIMessage,\n",
" HumanMessage,\n",
" SystemMessage\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [],
"source": [
"# chat mode instance\n",
"chat = ChatOpenAI(temperature=0)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"ChatGPT support different types of messages identifiable by the role. LangChain. Recall how we make a basic call to ChatGPT using `openai`? Here is an example:\n",
"\n",
"```python\n",
"MODEL = \"gpt-3.5-turbo\"\n",
"\n",
"response = openai.ChatCompletion.create(\n",
" model=MODEL,\n",
" messages=[\n",
" {\"role\": \"system\", \"content\": \"You are an AI research assistant. You use a tone that is technical and scientific.\"},\n",
" {\"role\": \"user\", \"content\": \"Hello, who are you?\"},\n",
" {\"role\": \"assistant\", \"content\": \"Greeting! I am an AI research assistant. How can I help you today?\"},\n",
" {\"role\": \"user\", \"content\": \"Can you tell me about the creation of black holes?\"}\n",
" ],\n",
" temperature=0,\n",
")\n",
"```\n",
"\n",
"LangChain supports these different types of messages, including a arbitrary role parameter (`ChatMessage`). Let's try: "
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Positive', additional_kwargs={})"
]
},
"execution_count": 7,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"USER_INPUT = \"I love programming.\"\n",
"FINAL_PROMPT = \"\"\"Classify the text into neutral, negative or positive. \n",
"\n",
"Text: {user_input}. \n",
"Sentiment:\"\"\"\n",
"\n",
"chat([HumanMessage(content=FINAL_PROMPT.format(user_input=USER_INPUT))])"
]
},
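{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"The arbitrary-role `ChatMessage` mentioned above works in the same way. Here is a minimal sketch (not executed here), assuming `ChatMessage` is importable from `langchain.schema`:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from langchain.schema import ChatMessage\n",
"\n",
"# ChatMessage lets you specify the role string directly instead of using HumanMessage/AIMessage/SystemMessage\n",
"chat([ChatMessage(role=\"user\", content=FINAL_PROMPT.format(user_input=USER_INPUT))])"
]
},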
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Let's try an example that involves a system instruction and a task provided by user."
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Sentiment: Positive', additional_kwargs={})"
]
},
"execution_count": 11,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
" SystemMessage(content=\"You are a helpful assistant that can classify the sentiment of input texts. The labels you can use are positive, negative and neutral.\"),\n",
" HumanMessage(content=\"Classify the following sentence: I am doing brilliant today!\"),\n",
"]\n",
"\n",
"chat(messages)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Now let's try another example that involves an exchange between a human and AI research assistant:"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='Certainly! Black holes are formed when a massive star runs out of fuel and can no longer produce the energy needed to counteract the force of gravity. This causes the star to collapse in on itself, creating a singularity - a point of infinite density and zero volume. The gravitational pull of the singularity is so strong that nothing, not even light, can escape its grasp, hence the name \"black hole\". \\n\\nThere are also supermassive black holes, which are found at the centers of galaxies and are thought to have formed through the merging of smaller black holes and the accretion of matter. \\n\\nThe study of black holes is a fascinating and active area of research in astrophysics, and there is still much to be learned about these mysterious objects.', additional_kwargs={})"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"messages = [\n",
" SystemMessage(content=\"You are an AI research assistant. You use a tone that is technical and scientific.\"),\n",
" HumanMessage(content=\"Hello, who are you?\"),\n",
" AIMessage(content=\"Greeting! I am an AI research assistant. How can I help you today?\"),\n",
" HumanMessage(content=\"Can you tell me about the creation of black holes?\")\n",
"]\n",
"\n",
"chat(messages)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"There is even a feature to batch these requests and generate response (using `chat.response()`) like so:"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"LLMResult(generations=[[ChatGeneration(text='Certainly! Black holes are formed when a massive star runs out of fuel and can no longer produce the energy needed to counteract the force of gravity. This causes the star to collapse in on itself, creating a singularity - a point of infinite density and zero volume. The gravitational pull of the singularity is so strong that nothing, not even light, can escape its grasp, hence the name \"black hole\". \\n\\nThere are also supermassive black holes, which are found at the centers of galaxies and are thought to have formed through the merging of smaller black holes and the accretion of matter. \\n\\nThe study of black holes is a fascinating and active area of research in astrophysics, and there is still much to be learned about these mysterious objects.', generation_info=None, message=AIMessage(content='Certainly! Black holes are formed when a massive star runs out of fuel and can no longer produce the energy needed to counteract the force of gravity. This causes the star to collapse in on itself, creating a singularity - a point of infinite density and zero volume. The gravitational pull of the singularity is so strong that nothing, not even light, can escape its grasp, hence the name \"black hole\". \\n\\nThere are also supermassive black holes, which are found at the centers of galaxies and are thought to have formed through the merging of smaller black holes and the accretion of matter. \\n\\nThe study of black holes is a fascinating and active area of research in astrophysics, and there is still much to be learned about these mysterious objects.', additional_kwargs={}))], [ChatGeneration(text='Certainly! Dark matter is a hypothetical form of matter that is thought to make up approximately 85% of the matter in the universe. It is called \"dark\" because it does not interact with light or other forms of electromagnetic radiation, making it invisible to telescopes and other instruments that detect light. \\n\\nThe existence of dark matter was first proposed in the 1930s by Swiss astronomer Fritz Zwicky, who noticed that the observed mass of galaxy clusters was not sufficient to account for the gravitational forces that held them together. Since then, many other observations have supported the idea of dark matter, including the rotation curves of galaxies and the cosmic microwave background radiation.\\n\\nDespite its name, dark matter is not completely mysterious. Scientists have some ideas about what it might be made of, such as weakly interacting massive particles (WIMPs) or axions. However, these particles have not yet been directly detected, and the nature of dark matter remains one of the biggest mysteries in modern physics.', generation_info=None, message=AIMessage(content='Certainly! Dark matter is a hypothetical form of matter that is thought to make up approximately 85% of the matter in the universe. It is called \"dark\" because it does not interact with light or other forms of electromagnetic radiation, making it invisible to telescopes and other instruments that detect light. \\n\\nThe existence of dark matter was first proposed in the 1930s by Swiss astronomer Fritz Zwicky, who noticed that the observed mass of galaxy clusters was not sufficient to account for the gravitational forces that held them together. 
Since then, many other observations have supported the idea of dark matter, including the rotation curves of galaxies and the cosmic microwave background radiation.\\n\\nDespite its name, dark matter is not completely mysterious. Scientists have some ideas about what it might be made of, such as weakly interacting massive particles (WIMPs) or axions. However, these particles have not yet been directly detected, and the nature of dark matter remains one of the biggest mysteries in modern physics.', additional_kwargs={}))]], llm_output=None)"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"batch_messages = [\n",
" [\n",
" SystemMessage(content=\"You are an AI research assistant. You use a tone that is technical and scientific.\"),\n",
" HumanMessage(content=\"Hello, who are you?\"),\n",
" AIMessage(content=\"Greeting! I am an AI research assistant. How can I help you today?\"),\n",
" HumanMessage(content=\"Can you tell me about the creation of black holes?\")\n",
" ],\n",
" [\n",
" SystemMessage(content=\"You are an AI research assistant. You use a tone that is technical and scientific.\"),\n",
" HumanMessage(content=\"Hello, who are you?\"),\n",
" AIMessage(content=\"Greeting! I am an AI research assistant. How can I help you today?\"),\n",
" HumanMessage(content=\"Can you explain the dark matter?\")\n",
" ]\n",
"]\n",
"\n",
"chat.generate(batch_messages)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"If you look at the examples above it might be easier to just use a prompt template. LangChain also supports. Let's try that below:"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [],
"source": [
"template = \"You are a helpful assistant that can classify the sentiment of input texts. The labels you can use are {sentiment_labels}. Classify the following sentence:\"\n",
"system_message_prompt = SystemMessagePromptTemplate.from_template(template)\n",
"human_template = \"{user_input}\"\n",
"human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='The sentiment of the sentence \"I am doing brilliant today!\" is positive.', additional_kwargs={})"
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, human_message_prompt])\n",
"\n",
"\n",
"chat(chat_prompt.format_prompt(sentiment_labels=\"positive, negative, and neutral\", user_input=\"I am doing brilliant today!\").to_messages())"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content='The sentiment of the sentence \"Not sure what the weather is like today\" is neutral.', additional_kwargs={})"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"chat(chat_prompt.format_prompt(sentiment_labels=\"positive, negative, and neutral\", user_input=\"Not sure what the weather is like today.\").to_messages())"
]
}
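,
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Finally, since `LLMChain` was imported earlier, here is a minimal sketch (not executed here) of wrapping the chat model and the chat prompt template in a chain, following the same getting-started guide:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"chain = LLMChain(llm=chat, prompt=chat_prompt)\n",
"\n",
"# run() fills in the template variables and returns the model's reply as a string\n",
"chain.run(sentiment_labels=\"positive, negative, and neutral\", user_input=\"I am doing brilliant today!\")"
]
}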
],
"metadata": {
"kernelspec": {
"display_name": "promptlecture",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "f38e0373277d6f71ee44ee8fea5f1d408ad6999fda15d538a69a99a1665a839d"
}
}
},
"nbformat": 4,
"nbformat_minor": 2
}

View File

@ -41,9 +41,20 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Below we are loading the necessary libraries, utilities, and configurations. \n",
"\n",
"Please install `dotenv` as follows: `pip install python-dotenv`"
"Below we are loading the necessary libraries, utilities, and configurations."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"%%capture\n",
"# update or install the necessary libraries\n",
"!pip install --upgrade openai\n",
"!pip install --upgrade langchain\n",
"!pip install --upgrade python-dotenv"
]
},
{
@ -64,7 +75,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Make sure to replace `OPENAI_API_KEY` with your own key."
"Load environment variables. You can use anything you like but I used `python-dotenv`. Just create a `.env` file with your `OPENAI_API_KEY` then load it."
]
},
{
@ -1129,7 +1140,7 @@
],
"metadata": {
"kernelspec": {
"display_name": "pwc",
"display_name": "promptlecture",
"language": "python",
"name": "python3"
},
@ -1143,12 +1154,12 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.15"
"version": "3.9.16"
},
"orig_nbformat": 4,
"vscode": {
"interpreter": {
"hash": "b052b85184348e6013a65b4e7f9532c753d0b79191b90d8994bc68bec0cc322a"
"hash": "f38e0373277d6f71ee44ee8fea5f1d408ad6999fda15d538a69a99a1665a839d"
}
}
},