Commit 1e43de7
feat(LAB-3135): add kili.llm.create_conversation (#1775)
Co-authored-by: paulruelle <[email protected]>
1 parent: 5fa375d
Showing 17 changed files with 1,120 additions and 125 deletions.
@@ -0,0 +1,3 @@ | ||
# LLM module | ||
|
||
::: kili.llm.presentation.client.llm.LlmClientMethods |
@@ -0,0 +1,280 @@ | ||
{ | ||
"cells": [ | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"<a href=\"https://colab.research.google.com/github/kili-technology/kili-python-sdk/blob/main/recipes/llm_project_setup.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"# How to Set Up a Kili Project with an LLM Model and Create a Conversation" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"In this tutorial, you'll learn how to set up a project in Kili Technology that integrates a Large Language Model (LLM), associate the LLM with your project, and create a conversation using the Kili Python SDK. By the end of this guide, you'll have a functional project ready to collect and label LLM outputs for comparison and evaluation.\n", | ||
"\n", | ||
"\n", | ||
"Here are the steps we will follow:\n", | ||
"\n", | ||
"1. Creating a Kili project with a custom interface\n", | ||
"2. Creating an LLM model\n", | ||
"3. Associating the model with the project\n", | ||
"4. Creating a conversation" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"## Creating a Kili Project with a Custom Interface" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"We will create a Kili project with a custom interface that includes a comparison job and a classification job. This interface will be used for labeling and comparing LLM outputs.\n", | ||
"\n", | ||
"Here's the JSON interface we will use:" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": null, | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"interface = {\n", | ||
" \"jobs\": {\n", | ||
" \"COMPARISON_JOB\": {\n", | ||
" \"content\": {\n", | ||
" \"options\": {\n", | ||
" \"IS_MUCH_BETTER\": {\"children\": [], \"name\": \"Is much better\", \"id\": \"option1\"},\n", | ||
" \"IS_BETTER\": {\"children\": [], \"name\": \"Is better\", \"id\": \"option2\"},\n", | ||
" \"IS_SLIGHTLY_BETTER\": {\n", | ||
" \"children\": [],\n", | ||
" \"name\": \"Is slightly better\",\n", | ||
" \"id\": \"option3\",\n", | ||
" },\n", | ||
" \"TIE\": {\"children\": [], \"name\": \"Tie\", \"id\": \"option4\", \"mutual\": True},\n", | ||
" },\n", | ||
" \"input\": \"radio\",\n", | ||
" },\n", | ||
" \"instruction\": \"Pick the best answer\",\n", | ||
" \"mlTask\": \"COMPARISON\",\n", | ||
" \"required\": 1,\n", | ||
" \"isChild\": False,\n", | ||
" \"isNew\": False,\n", | ||
" },\n", | ||
" \"CLASSIFICATION_JOB\": {\n", | ||
" \"content\": {\n", | ||
" \"categories\": {\n", | ||
" \"BOTH_ARE_GOOD\": {\"children\": [], \"name\": \"Both are good\", \"id\": \"category1\"},\n", | ||
" \"BOTH_ARE_BAD\": {\"children\": [], \"name\": \"Both are bad\", \"id\": \"category2\"},\n", | ||
" },\n", | ||
" \"input\": \"radio\",\n", | ||
" },\n", | ||
" \"instruction\": \"Overall quality\",\n", | ||
" \"mlTask\": \"CLASSIFICATION\",\n", | ||
" \"required\": 0,\n", | ||
" \"isChild\": False,\n", | ||
" \"isNew\": False,\n", | ||
" },\n", | ||
" }\n", | ||
"}" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"Now, we create the project using the `create_project` method, with type `LLM_INSTR_FOLLOWING`:" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": null, | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"from kili.client import Kili\n", | ||
"\n", | ||
"kili = Kili(\n", | ||
" # api_endpoint=\"https://cloud.kili-technology.com/api/label/v2/graphql\",\n", | ||
")\n", | ||
"project = kili.create_project(\n", | ||
" title=\"[Kili SDK Notebook]: LLM Project\",\n", | ||
" description=\"Project Description\",\n", | ||
" input_type=\"LLM_INSTR_FOLLOWING\",\n", | ||
" json_interface=interface,\n", | ||
")\n", | ||
"project_id = project[\"id\"]" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"## Creating an LLM Model" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"We will now create an LLM model in Kili by specifying the model's credentials and connector type. In this example, we will use the OpenAI SDK as the connector type.\n", | ||
"\n", | ||
"**Note**: Replace `api_key` and `endpoint` with your model's actual credentials." | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": null, | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"model_response = kili.llm.create_model(\n", | ||
" organization_id=\"<YOUR_ORGANIZATION_ID>\",\n", | ||
" model={\n", | ||
" \"credentials\": {\n", | ||
" \"api_key\": \"<YOUR_OPEN_AI_API_KEY>\",\n", | ||
" \"endpoint\": \"https://api.openai.com/v1/\",\n", | ||
" },\n", | ||
" \"name\": \"My Model\",\n", | ||
" \"type\": \"OPEN_AI_SDK\",\n", | ||
" },\n", | ||
")\n", | ||
"\n", | ||
"model_id = model_response[\"id\"]" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"You can now see the model integration by clicking **Manage organization**:\n", | ||
"\n", | ||
"![Model Integration](./img/llm_models.png)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"## Associating the Model with the Project" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"Next, we will associate the created model with our project by creating project models with different configurations. Each time you create a prompt, two models will be chosen from the project models configured in the project.\n", | ||
"\n", | ||
"In this example, we compare **GPT-4o** and **GPT-4o mini** with different temperature settings:" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": null, | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"# First project model with a fixed temperature\n", | ||
"first_project_model = kili.llm.create_project_model(\n", | ||
" project_id=project_id,\n", | ||
" model_id=model_id,\n", | ||
" configuration={\n", | ||
" \"model\": \"gpt-4o\",\n", | ||
" \"temperature\": 0.5,\n", | ||
" },\n", | ||
")\n", | ||
"\n", | ||
"# Second project model with a temperature range\n", | ||
"second_project_model = kili.llm.create_project_model(\n", | ||
" project_id=project_id,\n", | ||
" model_id=model_id,\n", | ||
" configuration={\n", | ||
" \"model\": \"gpt-4o-mini\",\n", | ||
" \"temperature\": {\"min\": 0.2, \"max\": 0.8},\n", | ||
" },\n", | ||
")" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"You can now see the project models in the project settings:\n", | ||
"\n", | ||
"![Project Models](./img/llm_project_models.png)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"## Creating a Conversation" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"Now, we'll generate a conversation by providing a prompt.\n" | ||
] | ||
}, | ||
{ | ||
"cell_type": "code", | ||
"execution_count": null, | ||
"metadata": {}, | ||
"outputs": [], | ||
"source": [ | ||
"conversation = kili.llm.create_conversation(\n", | ||
" project_id=project_id, prompt=\"Give me the Schrödinger equation.\"\n", | ||
")" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"It will add an asset to your project, and you'll be ready to start labeling the conversation:\n", | ||
"\n", | ||
"![Conversation](./img/llm_conversation.png)" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"## Summary" | ||
] | ||
}, | ||
{ | ||
"cell_type": "markdown", | ||
"metadata": {}, | ||
"source": [ | ||
"In this tutorial, we've:\n", | ||
"\n", | ||
"- **Created a Kili project** with a custom interface for LLM output comparison.\n", | ||
"- **Registered an LLM model** in Kili with the necessary credentials.\n", | ||
"- **Associated the model** with the project by creating project models with different configurations.\n", | ||
"- **Generated a conversation** using a prompt, adding it to the project for labeling.\n" | ||
] | ||
} | ||
], | ||
"metadata": { | ||
"language_info": { | ||
"name": "python" | ||
} | ||
}, | ||
"nbformat": 4, | ||
"nbformat_minor": 4 | ||
} |
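The second project model in the notebook above uses a temperature range (`{"min": 0.2, "max": 0.8}`) instead of a fixed value. The actual resolution happens server-side in Kili; purely as an illustration of what such a configuration means, the hypothetical helper below (not part of the Kili SDK) resolves a configuration's `temperature` entry to a concrete value:

```python
import random


def resolve_temperature(configuration):
    """Resolve a project-model temperature setting to a concrete float.

    A fixed float is returned as-is; a {"min": ..., "max": ...} dict
    is sampled uniformly from the given range.
    """
    temperature = configuration["temperature"]
    if isinstance(temperature, dict):
        return random.uniform(temperature["min"], temperature["max"])
    return temperature


# Mirrors the two configurations used in the notebook above.
fixed = resolve_temperature({"model": "gpt-4o", "temperature": 0.5})
ranged = resolve_temperature(
    {"model": "gpt-4o-mini", "temperature": {"min": 0.2, "max": 0.8}}
)
```

Here `fixed` is always `0.5`, while `ranged` varies between 0.2 and 0.8 on each call, which is why answers generated by the second project model can differ in style from one prompt to the next.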