LLM-FineTuning-Large-Language-Models

TogetherAI_API_with_LangChain.ipynb
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Checkout my [Twitter(@rohanpaul_ai)](https://twitter.com/rohanpaul_ai) for daily LLM bits"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import os\n",
    "import logging\n",
    "import together\n",
    "from langchain.llms.base import LLM\n",
    "from langchain import PromptTemplate, LLMChain\n",
    "from dotenv import load_dotenv\n",
    "\n",
    "load_dotenv()\n",
    "logging.basicConfig(level=logging.INFO)\n",
    "\n",
    "\n",
    "class TogetherLLM(LLM):\n",
    "    \"\"\"Custom LangChain LLM that wraps the Together completion endpoint.\"\"\"\n",
    "\n",
    "    # Pydantic fields of the LangChain base class: overrides are passed as\n",
    "    # keyword arguments to the inherited initializer, so no custom __init__\n",
    "    # is needed.\n",
    "    model: str = \"togethercomputer/llama-2-7b-chat\"\n",
    "    together_api_key: str = os.environ[\"TOGETHER_API_KEY\"]\n",
    "    temperature: float = 0.7\n",
    "    max_tokens: int = 512\n",
    "\n",
    "    @property\n",
    "    def _llm_type(self) -> str:\n",
    "        \"\"\"Return type of LLM.\"\"\"\n",
    "        return \"together\"\n",
    "\n",
    "    def __call__(self, prompt: str, **kwargs) -> str:\n",
    "        # Overrides the base class's __call__ so llm(prompt) hits the API directly.\n",
    "        try:\n",
    "            logging.info(\"Calling Together endpoint.\")\n",
    "            return self.make_api_call(prompt)\n",
    "        except Exception as e:\n",
    "            logging.error(f\"Error in TogetherLLM call: {e}\", exc_info=True)\n",
    "            raise\n",
    "\n",
    "    def _call(self, prompt: str, stop=None, run_manager=None, **kwargs) -> str:\n",
    "        # Required by the abstract base class; LangChain chains reach the model\n",
    "        # through this hook.\n",
    "        return self.make_api_call(prompt)\n",
    "\n",
    "    def make_api_call(self, prompt: str) -> str:\n",
    "        together.api_key = self.together_api_key\n",
    "        output = together.Complete.create(\n",
    "            prompt=prompt,\n",
    "            model=self.model,\n",
    "            max_tokens=self.max_tokens,\n",
    "            temperature=self.temperature,\n",
    "        )\n",
    "        logging.info(\"API call successful.\")\n",
    "        return output['output']['choices'][0]['text']\n",
    "\n",
    "\n",
    "# Now let's use the class\n",
    "llm = TogetherLLM(\n",
    "    model=\"togethercomputer/llama-2-7b-chat\",\n",
    "    max_tokens=256,\n",
    "    temperature=0.8\n",
    ")\n",
    "\n",
    "prompt_template = \"You are a friendly AI. Answer the following question: {question}\"\n",
    "prompt = PromptTemplate(\n",
    "    input_variables=[\"question\"], template=prompt_template\n",
    ")\n",
    "chat = LLMChain(llm=llm, prompt=prompt)\n"
   ]
  },
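  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of how the chain above could be invoked, assuming a valid `TOGETHER_API_KEY` is available in the environment; the question text is illustrative only:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Hypothetical usage example: run the LLMChain built above on a sample question.\n",
    "# Requires TOGETHER_API_KEY to be set (e.g. via the .env file loaded earlier).\n",
    "answer = chat.run(question=\"What is the capital of France?\")\n",
    "print(answer)\n",
    "\n",
    "# Because TogetherLLM overrides __call__, the instance can also be invoked directly:\n",
    "print(llm(\"Briefly explain what a language model is.\"))\n"
   ]
  },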
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Integrating TogetherAI with LangChain 🦙\n",
    "\n",
    "📌 `langchain.llms.base.LLM` is an abstract base class. Its purpose is to expose a simpler interface for working with LLMs: a subclass only needs to provide `_llm_type` and `_call`, rather than implement the full `_generate` method."
   ]
  },
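  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "To make that simpler interface concrete, here is a toy subclass (an echo model, not tied to Together) showing that implementing only `_llm_type` and `_call` is enough for the base class to supply `__call__`, `generate`, and the caching/callback machinery:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from langchain.llms.base import LLM\n",
    "\n",
    "\n",
    "class EchoLLM(LLM):\n",
    "    # Toy example: 'generates' text by echoing the prompt back.\n",
    "    prefix: str = \"Echo: \"\n",
    "\n",
    "    @property\n",
    "    def _llm_type(self) -> str:\n",
    "        return \"echo\"\n",
    "\n",
    "    def _call(self, prompt: str, stop=None, run_manager=None, **kwargs) -> str:\n",
    "        # The base class's __call__/generate machinery dispatches here.\n",
    "        return self.prefix + prompt\n",
    "\n",
    "\n",
    "echo = EchoLLM()\n",
    "print(echo(\"hello\"))  # -> 'Echo: hello'\n"
   ]
  },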
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "------------\n",
    "\n",
    "### The significance of this block of code\n",
    "\n",
    "```py\n",
    "    @property\n",
    "    def _llm_type(self) -> str:\n",
    "        \"\"\"Return type of LLM.\"\"\"\n",
    "        return \"together\"\n",
    "```\n",
    "\n",
    "📌 This method, `_llm_type`, serves as a getter for a property of the `TogetherLLM` object.\n",
    "\n",
    "📌 The `@property` decorator makes the `_llm_type` method behave like a read-only attribute of the `TogetherLLM` class, rather than a method that needs to be explicitly called. This means you can access the type of the LLM as `instance._llm_type` rather than `instance._llm_type()`.\n",
    "\n",
    "📌 The significance of defining this property is primarily for internal use within `TogetherLLM` or its parent classes. LangChain can use it for type checking, logging, serialization, or other conditional processing where identifying the type of the language model is necessary."
   ]
  },
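  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick illustration (using the `llm` instance created above) of why the `@property` decorator matters: the value is read as an attribute, and calling it like a method would raise a `TypeError` because the property already returned a plain string:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Attribute-style access provided by @property.\n",
    "print(llm._llm_type)      # -> 'together'\n",
    "\n",
    "# Calling it as a method would fail, since a str is not callable:\n",
    "# llm._llm_type()         # TypeError: 'str' object is not callable\n"
   ]
  },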
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "---------\n",
    "\n",
    "### 📌 The reason for using the `__call__` method above\n",
    "\n",
    "In the LangChain base class, the job of `__call__` is to check the cache and run the LLM on the given prompt and input; here it is overridden to call the Together endpoint directly.\n",
    "\n",
    "📌 When you define the `__call__` method in a class, it enables you to use instances of that class like this: `instance(parameters)`, where `instance` is an object of the class.\n",
    "\n",
    "📌 So with `__call__`, instances of the `TogetherLLM` class become callable objects.\n",
    "\n",
    "📌 That means, instead of having to explicitly call a method like `instance.call(prompt)`, you can simply use `instance(prompt)`. This makes the code more concise and can improve readability, especially for users who are familiar with functional programming paradigms.\n",
    "\n",
    "📌 Another advantage is that it allows the `TogetherLLM` class to integrate more seamlessly with Python features and libraries that expect callable objects. For instance, if you're using a higher-order function that takes a function as an argument, you can pass an instance of `TogetherLLM` directly instead of having to wrap it in another function or lambda."
   ]
  },
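  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A small sketch of the callable-object point, assuming the `llm` instance above is available and the API key is configured; the prompts and the use of `map` are illustrative only:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Because TogetherLLM instances are callable, they can be passed anywhere a\n",
    "# plain function is expected, e.g. to the built-in higher-order function map().\n",
    "prompts = [\n",
    "    \"Name one use of Python.\",\n",
    "    \"Name one use of SQL.\",\n",
    "]\n",
    "for answer in map(llm, prompts):\n",
    "    print(answer)\n"
   ]
  }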
 ],
 "metadata": {
  "language_info": {
   "name": "python"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}