{
  "cells": [
    {
      "cell_type": "code",
      "execution_count": 1,
      "metadata": {
        "id": "2lDxGsT5XQ2w"
      },
      "outputs": [],
      "source": [
        "!pip install -qU \\\n",
        "    datasets==2.14.4 \\\n",
        "    langchain==0.0.274 \\\n",
        "    pinecone-client==2.2.2 \\\n",
        "    openai==0.27.9"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 2,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "5wjqYNbLXQ2x",
        "outputId": "90d81eda-5280-4138-8381-db2ec0eda4bb"
      },
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "Dataset({\n",
              "    features: ['messages'],\n",
              "    num_rows: 270\n",
              "})"
            ]
          },
          "metadata": {},
          "execution_count": 2
        }
      ],
      "source": [
        "from datasets import load_dataset\n",
        "\n",
        "data = load_dataset(\n",
        "    \"jamescalam/agent-conversations-retrieval-tool\",\n",
        "    split=\"train\"\n",
        ")\n",
        "data"
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "data[\"messages\"][0]"
      ],
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "aWUKWk_GdkjG",
        "outputId": "24bfb862-ade8-437a-c217-db86cc80c81d"
      },
      "execution_count": 3,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "[{'role': 'system',\n",
              "  'content': 'Assistant is a large language model trained by OpenAI.\\n\\nAssistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.\\n\\nAssistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.\\n\\nOverall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist.'},\n",
              " {'role': 'user',\n",
              "  'content': 'TOOLS\\n------\\nAssistant can ask the user to use tools to look up information that may be helpful in answering the users original question. The tools the human can use are:\\n\\n> Vector Search Tool: This tool allows you to get research information about LLMs.\\n\\nRESPONSE FORMAT INSTRUCTIONS\\n----------------------------\\n\\nWhen responding to me, please output a response in one of two formats:\\n\\n**Option 1:**\\nUse this if you want the human to use a tool.\\nMarkdown code snippet formatted in the following schema:\\n\\n```json\\n{\\n    \"action\": string, \\\\ The action to take. Must be one of Vector Search Tool\\n    \"action_input\": string \\\\ The input to the action\\n}\\n```\\n\\n**Option #2:**\\nUse this if you want to respond directly to the human. Markdown code snippet formatted in the following schema:\\n\\n```json\\n{\\n    \"action\": \"Final Answer\",\\n    \"action_input\": string \\\\ You should put what you want to return to use here\\n}\\n```\\n\\nUSER\\'S INPUT\\n--------------------\\nHere is the user\\'s input (remember to respond with a markdown code snippet of a json blob with a single action, and NOTHING else):\\n\\nWhat is the main focus of the technical report IDSIA-01-11?'},\n",
              " {'role': 'assistant',\n",
              "  'content': '```json\\n{\\n    \"action\": \"Vector Search Tool\",\\n    \"action_input\": \"Main focus of technical report IDSIA-01-11\"\\n}\\n```'},\n",
              " {'role': 'user',\n",
              "  'content': '[High-Performance Neural Networks\\nfor Visual Object Classi\\x0ccation\\nDan C. Cire\\x18 san, Ueli Meier, Jonathan Masci,\\nLuca M. Gambardella and J\\x7f urgen Schmidhuber\\nTechnical Report No. IDSIA-01-11\\nJanuary 2011\\nIDSIA / USI-SUPSI\\nDalle Molle Institute for Arti\\x0ccial Intelligence\\nGalleria 2, 6928 Manno, Switzerland\\nIDSIA is a joint institute of both University of Lugano (USI) and University of Applied Sciences of Southern Switzerland (SUPSI),\\nand was founded in 1988 by the Dalle Molle Foundation which promoted quality of life.\\nThis work was partially supported by the Swiss Commission for Technology and Innovation (CTI), Project n. 9688.1 IFF:\\nIntelligent Fill in Form.arXiv:1102.0183v1  [cs.AI]  1 Feb 2011\\nTechnical Report No. IDSIA-01-11 1\\nHigh-Performance Neural Networks\\nfor Visual Object Classi\\x0ccation\\nDan C. Cire\\x18 san, Ueli Meier, Jonathan Masci,\\nLuca M. Gambardella and J\\x7f urgen Schmidhuber\\nJanuary 2011\\nAbstract\\nWe present a fast, fully parameterizable GPU implementation of Convolutional Neural\\nNetwork variants. Our feature extractors are neither carefully designed nor pre-wired, but]'},\n",
              " {'role': 'assistant',\n",
              "  'content': '```json\\n{\\n    \"action\": \"Final Answer\",\\n    \"action_input\": \"The main focus of the technical report IDSIA-01-11 is the presentation of a fast, fully parameterizable GPU implementation of Convolutional Neural Network variants for visual object classification.\"\\n}\\n```'}]"
            ]
          },
          "metadata": {},
          "execution_count": 3
        }
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "data.to_json(\"conversations.jsonl\")"
      ],
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 67,
          "referenced_widgets": [
            "39364e874e5c4e7baa01c08ac31165fb",
            "cf43f35611b444b498153f8d659ce153",
            "f2a10ce29d894e74a22842953fb8bc59",
            "58fac49a766a4233b513bc05a30da756",
            "7b5bcd804aa14aaca9d835c1a6262111",
            "fe26f0a8030b40528b5036bb8d994db5",
            "221b7605257a4235b77fdd828e7fd6e6",
            "de5ce44aeb78464a9be9c6b7392b6969",
            "280c7b6c0e4d42249a9adb5a0ca1d553",
            "8996a369a00a447093e6866183ef8648",
            "eece5f66123d4ded8181c0373781da5b"
          ]
        },
        "id": "0sIMkzT4eJXO",
        "outputId": "48786185-2818-4d3a-bb58-e2e696f3a662"
      },
      "execution_count": 4,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "Creating json from Arrow format:   0%|          | 0/1 [00:00<?, ?ba/s]"
            ],
            "application/vnd.jupyter.widget-view+json": {
              "version_major": 2,
              "version_minor": 0,
              "model_id": "39364e874e5c4e7baa01c08ac31165fb"
            }
          },
          "metadata": {}
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "1103809"
            ]
          },
          "metadata": {},
          "execution_count": 4
        }
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "NQPO963iXQ2z"
      },
      "source": [
        "## Running Training"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "u4mZk_vlXQ2z"
      },
      "source": [
        "First we upload the file:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 5,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "kLMSn9EJXQ2z",
        "outputId": "57afc952-fb55-421d-e338-91a8a633a234"
      },
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<File file id=file-lNA3Mipra1v7Q9ckvqTn84UA at 0x7ddc84ffae80> JSON: {\n",
              "  \"object\": \"file\",\n",
              "  \"id\": \"file-lNA3Mipra1v7Q9ckvqTn84UA\",\n",
              "  \"purpose\": \"fine-tune\",\n",
              "  \"filename\": \"file\",\n",
              "  \"bytes\": 1103809,\n",
              "  \"created_at\": 1693167014,\n",
              "  \"status\": \"uploaded\",\n",
              "  \"status_details\": null\n",
              "}"
            ]
          },
          "metadata": {},
          "execution_count": 5
        }
      ],
      "source": [
        "import os\n",
        "import openai\n",
        "\n",
        "os.environ[\"OPENAI_API_KEY\"] = os.getenv(\"OPENAI_API_KEY\") or \"YOUR_API_KEY\"\n",
        "openai.api_key = os.environ[\"OPENAI_API_KEY\"]\n",
        "\n",
        "res = openai.File.create(\n",
        "    file=open(\"conversations.jsonl\", \"r\"),\n",
        "    purpose='fine-tune'\n",
        ")\n",
        "res"
      ]
    },
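    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "As an optional sanity check before uploading, we can confirm that every line of `conversations.jsonl` is a JSON object holding a `\"messages\"` list of `role`/`content` dicts, which is the chat format the fine-tuning endpoint expects (a sketch, not part of the recorded run):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "import json\n",
        "\n",
        "# sanity-check the JSONL chat format before uploading (illustrative only)\n",
        "with open(\"conversations.jsonl\") as fp:\n",
        "    for n, line in enumerate(fp, start=1):\n",
        "        record = json.loads(line)  # raises if a line is not valid JSON\n",
        "        assert isinstance(record.get(\"messages\"), list)\n",
        "        for msg in record[\"messages\"]:\n",
        "            assert msg[\"role\"] in (\"system\", \"user\", \"assistant\")\n",
        "print(f\"checked {n} records\")"
      ]
    },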
    {
      "cell_type": "code",
      "execution_count": 6,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 35
        },
        "id": "Y_HCXuCeXQ2z",
        "outputId": "2b5c0b65-fb41-4676-c37a-804b4403c69e"
      },
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "'file-lNA3Mipra1v7Q9ckvqTn84UA'"
            ],
            "application/vnd.google.colaboratory.intrinsic+json": {
              "type": "string"
            }
          },
          "metadata": {},
          "execution_count": 6
        }
      ],
      "source": [
        "file_id = res[\"id\"]\n",
        "file_id"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "BuGmK_pLXQ2z"
      },
      "source": [
        "We then create the fine-tuning job _(note: it can take some time before the file above is ready)_."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 7,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "Lxv-abQYXQ2z",
        "outputId": "0f9081d5-ed62-498f-d94c-96ade0344fb8"
      },
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<FineTuningJob fine_tuning.job id=ftjob-NLYU2lRW6AjOkAswAu52lkDi at 0x7ddc8503e750> JSON: {\n",
              "  \"object\": \"fine_tuning.job\",\n",
              "  \"id\": \"ftjob-NLYU2lRW6AjOkAswAu52lkDi\",\n",
              "  \"model\": \"gpt-3.5-turbo-0613\",\n",
              "  \"created_at\": 1693167129,\n",
              "  \"finished_at\": null,\n",
              "  \"fine_tuned_model\": null,\n",
              "  \"organization_id\": \"org-f8Ugk8IIbz8etfgd5WdFJngy\",\n",
              "  \"result_files\": [],\n",
              "  \"status\": \"created\",\n",
              "  \"validation_file\": null,\n",
              "  \"training_file\": \"file-lNA3Mipra1v7Q9ckvqTn84UA\",\n",
              "  \"hyperparameters\": {\n",
              "    \"n_epochs\": 3\n",
              "  },\n",
              "  \"trained_tokens\": null\n",
              "}"
            ]
          },
          "metadata": {},
          "execution_count": 7
        }
      ],
      "source": [
        "res = openai.FineTuningJob.create(training_file=file_id, model=\"gpt-3.5-turbo\")\n",
        "res"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 8,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 35
        },
        "id": "gFjR5USVXQ20",
        "outputId": "ced7e1fa-c38e-4491-c946-60682dd3e754"
      },
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "'ftjob-NLYU2lRW6AjOkAswAu52lkDi'"
            ],
            "application/vnd.google.colaboratory.intrinsic+json": {
              "type": "string"
            }
          },
          "metadata": {},
          "execution_count": 8
        }
      ],
      "source": [
        "job_id = res[\"id\"]\n",
        "job_id"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "ZByfStbHXQ20"
      },
      "source": [
        "We can retrieve info for our fine-tuning job like so:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 9,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "s64fq_nMXQ20",
        "outputId": "3c063a59-fe56-4dfa-ca7a-0f1253923954"
      },
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<FineTuningJob fine_tuning.job id=ftjob-NLYU2lRW6AjOkAswAu52lkDi at 0x7ddc850660c0> JSON: {\n",
              "  \"object\": \"fine_tuning.job\",\n",
              "  \"id\": \"ftjob-NLYU2lRW6AjOkAswAu52lkDi\",\n",
              "  \"model\": \"gpt-3.5-turbo-0613\",\n",
              "  \"created_at\": 1693167129,\n",
              "  \"finished_at\": null,\n",
              "  \"fine_tuned_model\": null,\n",
              "  \"organization_id\": \"org-f8Ugk8IIbz8etfgd5WdFJngy\",\n",
              "  \"result_files\": [],\n",
              "  \"status\": \"running\",\n",
              "  \"validation_file\": null,\n",
              "  \"training_file\": \"file-lNA3Mipra1v7Q9ckvqTn84UA\",\n",
              "  \"hyperparameters\": {\n",
              "    \"n_epochs\": 3\n",
              "  },\n",
              "  \"trained_tokens\": null\n",
              "}"
            ]
          },
          "metadata": {},
          "execution_count": 9
        }
      ],
      "source": [
        "openai.FineTuningJob.retrieve(job_id)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "6TPYgQ4_XQ20"
      },
      "source": [
        "The `\"finished_at\"` value is still `null`, so fine-tuning isn't yet complete. We can check for events from our fine-tuning job while we wait:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 10,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "QejzgDcQXQ20",
        "outputId": "2b9645a0-ba2c-461b-e2cc-45918f210102"
      },
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<OpenAIObject list at 0x7ddc8503dfd0> JSON: {\n",
              "  \"object\": \"list\",\n",
              "  \"data\": [\n",
              "    {\n",
              "      \"object\": \"fine_tuning.job.event\",\n",
              "      \"id\": \"ftevent-aunKCogMZVKxsfuULBI5wzet\",\n",
              "      \"created_at\": 1693167130,\n",
              "      \"level\": \"info\",\n",
              "      \"message\": \"Fine tuning job started\",\n",
              "      \"data\": null,\n",
              "      \"type\": \"message\"\n",
              "    },\n",
              "    {\n",
              "      \"object\": \"fine_tuning.job.event\",\n",
              "      \"id\": \"ftevent-IU02P4hK6Lr2OSJJw4ipwmeO\",\n",
              "      \"created_at\": 1693167129,\n",
              "      \"level\": \"info\",\n",
              "      \"message\": \"Created fine-tune: ftjob-NLYU2lRW6AjOkAswAu52lkDi\",\n",
              "      \"data\": null,\n",
              "      \"type\": \"message\"\n",
              "    }\n",
              "  ],\n",
              "  \"has_more\": false\n",
              "}"
            ]
          },
          "metadata": {},
          "execution_count": 10
        }
      ],
      "source": [
        "openai.FineTuningJob.list_events(id=job_id)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "oqxpSFftXQ20"
      },
      "source": [
        "We can set up a check for fine-tuning completion (or wait for OpenAI to send an email once the job has completed):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 11,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 232
        },
        "id": "SAt5Eq6-XQ20",
        "outputId": "1e01a3ea-94f0-4ff4-9d64-d6661efd6336"
      },
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "."
          ]
        },
        {
          "output_type": "error",
          "ename": "KeyboardInterrupt",
          "evalue": "ignored",
          "traceback": [
            "\u001b[0;31m---------------------------------------------------------------------------\u001b[0m",
            "\u001b[0;31mKeyboardInterrupt\u001b[0m                         Traceback (most recent call last)",
            "\u001b[0;32m<ipython-input-11-b75b9552a60b>\u001b[0m in \u001b[0;36m<cell line: 3>\u001b[0;34m()\u001b[0m\n\u001b[1;32m      7\u001b[0m     \u001b[0;32melse\u001b[0m\u001b[0;34m:\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[1;32m      8\u001b[0m         \u001b[0mprint\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;34m\".\"\u001b[0m\u001b[0;34m,\u001b[0m \u001b[0mend\u001b[0m\u001b[0;34m=\u001b[0m\u001b[0;34m\"\"\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0;32m----> 9\u001b[0;31m         \u001b[0msleep\u001b[0m\u001b[0;34m(\u001b[0m\u001b[0;36m100\u001b[0m\u001b[0;34m)\u001b[0m\u001b[0;34m\u001b[0m\u001b[0;34m\u001b[0m\u001b[0m\n\u001b[0m",
            "\u001b[0;31mKeyboardInterrupt\u001b[0m: "
          ]
        }
      ],
      "source": [
        "from time import sleep\n",
        "\n",
        "while True:\n",
        "    res = openai.FineTuningJob.retrieve(job_id)\n",
        "    if res[\"finished_at\"] is not None:\n",
        "        break\n",
        "    else:\n",
        "        print(\".\", end=\"\")\n",
        "        sleep(100)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "nNCuwKPMXQ20"
      },
      "source": [
        "Once complete, we can see our model details in `res`:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "xef-ZRAoXQ20"
      },
      "outputs": [],
      "source": [
        "res"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "9K3P1eGlXQ20"
      },
      "source": [
        "We access our fine-tuned model name:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "Zu6bjioRXQ20"
      },
      "outputs": [],
      "source": [
        "ft_model = res[\"fine_tuned_model\"]\n",
        "ft_model"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "QfsVLrePXQ20"
      },
      "source": [
        "Finally, we use our new model!"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 12,
      "metadata": {
        "id": "CwkWKgvcXQ20"
      },
      "outputs": [],
      "source": [
        "ft_model = 'ft:gpt-3.5-turbo-0613:pinecone::7s8gnk9R'"
      ]
    },
    {
      "cell_type": "code",
      "source": [
        "import requests\n",
        "\n",
        "res = requests.get('https://raw.githubusercontent.com/pinecone-io/examples/master/learn/generation/openai/fine-tuning/gpt-3.5-agent-training/chains.py')\n",
        "with open(\"chains.py\", 'w') as fp:\n",
        "    fp.write(res.text)"
      ],
      "metadata": {
        "id": "5UmpXZbrXwh6"
      },
      "execution_count": 13,
      "outputs": []
    },
    {
      "cell_type": "code",
      "execution_count": 15,
      "metadata": {
        "id": "V43IjsFNXQ2x"
      },
      "outputs": [],
      "source": [
        "from langchain.agents import Tool\n",
        "from langchain.chat_models import ChatOpenAI\n",
        "from langchain.memory import ConversationBufferWindowMemory\n",
        "from chains import VectorDBChain\n",
        "\n",
        "llm = ChatOpenAI(\n",
        "    temperature=0.5,\n",
        "    model_name=ft_model\n",
        ")\n",
        "\n",
        "memory = ConversationBufferWindowMemory(\n",
        "    memory_key=\"chat_history\",\n",
        "    k=5,\n",
        "    return_messages=True,\n",
        "    output_key=\"output\"\n",
        ")\n",
        "# app.pinecone.io\n",
        "vdb = VectorDBChain(\n",
        "    index_name=\"llama-2-arxiv-papers\",\n",
        "    environment=os.getenv(\"PINECONE_ENV\") or \"YOUR_ENV\",\n",
        "    pinecone_api_key=os.getenv(\"PINECONE_API_KEY\") or \"YOUR_KEY\"\n",
        ")\n",
        "\n",
        "vdb_tool = Tool(\n",
        "    name=vdb.name,\n",
        "    func=vdb.query,\n",
        "    description=\"This tool allows you to get research information about LLMs.\"\n",
        ")"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 16,
      "metadata": {
        "id": "xndHtjmAXQ20"
      },
      "outputs": [],
      "source": [
        "from langchain.agents import AgentType, initialize_agent\n",
        "\n",
        "agent = initialize_agent(\n",
        "    agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,\n",
        "    tools=[vdb_tool],\n",
        "    llm=llm,\n",
        "    verbose=True,\n",
        "    max_iterations=3,\n",
        "    early_stopping_method=\"generate\",\n",
        "    memory=memory,\n",
        "    return_intermediate_steps=True\n",
        ")"
      ]
    },
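    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Since the agent was built with `return_intermediate_steps=True`, it returns multiple output keys, so we call it directly with a dict and read `\"output\"` rather than using `agent.run` (a usage sketch, assuming the Pinecone index and API keys above are configured; the question is illustrative):"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# example invocation of the agent built above\n",
        "result = agent({\"input\": \"Tell me about Llama 2\"})\n",
        "print(result[\"output\"])"
      ]
    },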
    {
      "cell_type": "code",
      "execution_count": 17,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "cdFVEhYQXQ21",
        "outputId": "aa2dd898-a0eb-4579-ec5a-b02e6b035d0e"
      },
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "\n",
            "\n",
            "\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
            "\u001b[32;1m\u001b[1;3m```json\n",
            "{\n",
            "    \"action\": \"Vector Search Tool\",\n",
            "    \"action_input\": \"Llama 2 information\"\n",
            "}\n",
            "```\u001b[0mLlama 2 information\n",
            "[-0.013843749649822712, 0.01913735456764698, -0.017765453085303307, ...]\n",
-0.02091112919151783, 0.0009605053928680718, 0.006069631781429052, 0.013060794211924076, 0.010019049979746342, 0.012728212401270866, 0.020093530416488647, 0.023571789264678955, -2.296519414812792e-05, -0.0031560698989778757, 0.01423176284879446, 0.006831800099462271, 0.00576822878792882, -0.012908360920846462, 0.016878565773367882, -0.007600897457450628, 0.010012120939791203, 0.008654075674712658, -0.02353021688759327, 0.04060278832912445, 0.00018956772692035884, -0.017252720892429352, 0.004860555753111839, 0.013552739284932613, 0.004673478193581104, 0.00850164145231247, 0.01237484347075224, 0.007406890857964754, 0.02089727111160755, -0.01915121264755726, -0.005033775698393583, 0.025899868458509445, 0.01793174259364605, -0.010323917493224144, 0.020564688369631767, -0.011591888032853603, 0.004285464994609356, 0.0011380559299141169, 0.007462321314960718, -0.01502164639532566, -0.017890170216560364, 0.006551183760166168, -0.00618742173537612, -0.009416243992745876, -0.01516022253781557, -0.012354056350886822, 0.035891201347112656, -0.010892079211771488, -0.0022068240214139223, -0.033757131546735764, -0.021132851019501686, 0.014148617163300514, -0.017141859978437424, -0.01978866197168827, 0.0013944216771051288, -0.020772552117705345, -0.05351807922124863, 0.012111548334360123, 0.025983013212680817, 0.011598817072808743, -0.003990991041064262, -0.0022102883085608482, -0.014716778881847858, 0.021299142390489578, 0.006083489395678043, -0.02136843092739582, 0.003661872586235404, -0.01886020228266716, -0.007455392740666866, -0.00038130072061903775, 0.007022342178970575, -0.025913724675774574, -0.012180836871266365, 0.01012991089373827, 0.004867484327405691, -0.0011415203334763646, 0.00618395721539855, -0.02678675390779972, 0.017945600673556328, 0.02156243659555912, -0.004770481027662754, 0.0015970892272889614, -0.005910269450396299, -0.009166806936264038, 0.030126437544822693, -0.025262417271733284, -0.009104447439312935, 0.0004122638201806694, 0.00825913343578577, 
0.00998440571129322, 0.025137698277831078, 0.003395113628357649, 0.003445347538217902, 0.03334140405058861, 0.004964487627148628, -0.022061310708522797, 0.008487784303724766, 0.001347652287222445, 0.011487956158816814, 0.01022691372781992, 0.0024198845494538546, -0.010386276058852673, 0.009229166433215141, -0.027382630854845047, 0.01960851438343525, -0.026953045278787613, 0.017987173050642014, 0.024250812828540802, -0.01298457756638527, -0.014016969129443169, 0.010649571195244789, -0.011238519102334976, 0.013005363754928112, 0.010199198499321938, 0.007580111268907785, 0.010753503069281578, -0.004389396868646145, 0.018319755792617798, -0.016642985865473747, 0.011737393215298653, -0.01876319944858551, 0.028491239994764328, -0.010857434943318367, -0.014938500709831715, 0.014023898169398308, -0.012277839705348015, -0.01288064569234848, -0.013691315427422523, 0.010538710281252861, -0.014086257666349411, -0.017030999064445496, 0.020038099959492683, -0.00010772124369395897, 0.026398740708827972, 0.014072399586439133, 0.007240599486976862, -0.049249935895204544, -0.010767360217869282, 0.01008833758533001, -0.024292385205626488, 0.006662044674158096, -0.008848082274198532, 0.026662036776542664, 0.005896411836147308, 0.0044344342313706875, 0.0044760070741176605, -0.0011181356385350227, -0.02091112919151783, -0.010400134138762951, 0.011481027118861675, -0.006159706506878138, 0.0004092324525117874, -0.03539232909679413, 0.01597782038152218, 0.010275415144860744, 0.020883413031697273, 0.01521565206348896, 0.01521565206348896, 0.006017665844410658, 0.007448463700711727, -0.0011216000420972705, 0.0023332745768129826, 0.01194525696337223, -0.000605837267357856, -0.0017945601139217615, -0.012215480208396912, -0.0018395973602309823, -0.00040381934377364814, -0.01274899858981371, 0.004250820726156235, 0.003317164722830057, -0.018056461587548256, -0.02136843092739582, -0.016587555408477783, 0.00785033404827118, -0.009132162667810917, 0.02380736917257309, 0.0068560512736439705, 
0.018527621403336525, -0.011176159605383873, -0.013525024056434631, -0.017751595005393028, 0.02260175719857216, -0.008342279121279716, -0.008764936588704586, 0.000765632779803127, 0.010892079211771488, -0.016767704859375954, 0.013310231268405914, 0.0001778753794496879, -0.0017477907240390778, -0.0031041039619594812, -0.005494541022926569, -0.0038974520284682512, -0.003289449494332075, -0.02605230174958706, -0.028879253193736076, -0.005047633312642574, 0.0013649743050336838, 0.006575434468686581, 0.0019088853150606155, 0.01849990524351597, -0.01339337695389986, -0.03763725981116295, 0.016656843945384026, -0.008591716177761555, 0.04536980390548706, 0.005979557521641254, 0.010483279824256897, 0.021507006138563156, 0.003519832156598568, -0.010199198499321938, 0.013684387318789959, -0.003648014971986413, 0.002723019802942872, 0.007954266853630543, 0.04124023765325546, -0.008390781469643116, 0.0016984229441732168, 0.01820889487862587, 0.02400137484073639, 0.01026155799627304, -0.04872334748506546, -0.0005859169759787619, -0.00010918278712779284, -0.000773427716922015, -0.031151900067925453, -0.014508914202451706, -0.015520519576966763, -0.0036791947204619646, -0.010698072612285614, 0.019012637436389923, 0.002269183052703738, 0.0021236783359199762, -0.011862111277878284, 0.007240599486976862, -0.015894675627350807, 0.0031041039619594812, 0.0027005011215806007, -0.012929147109389305, 0.012492632493376732, -0.026662036776542664, -0.0024753150064498186, -0.010109124705195427, -0.00656157685443759, 0.021604008972644806, -0.008085913956165314, 0.007420748472213745, 0.002936080563813448, -0.007344531826674938, -0.0013078117044642568, 0.010899008251726627, 0.024417104199528694, 0.027645926922559738, -0.02753506600856781, -0.02474968694150448, -0.015146364457905293, 0.002529013203456998, 0.01635197550058365, -0.00988740287721157, -0.001716611091978848, 0.017017140984535217, -0.02483283169567585, -0.008418496698141098, 0.008245276287198067, 0.012125406414270401, 
-0.02885153703391552, -0.021714869886636734, -0.014342622831463814, -0.029461272060871124, -0.033479977399110794, 0.020495401695370674, -0.004486400168389082, 0.017959458753466606, 0.0036133709363639355, -0.002861595945432782, 0.004167675506323576, 0.0062359231524169445, 0.022005880251526833, -8.85587724042125e-05, 0.015187936834990978, 0.025913724675774574, -0.04916679114103317, 0.016407405957579613, -0.0021306071430444717, 0.004146888852119446, -0.02621859312057495, 0.007677114102989435, 0.017502157017588615, -0.02344707027077675, 0.01773773692548275, -0.007199026644229889, -0.024666540324687958, -0.015645237639546394, 0.03530918434262276, 0.014204047620296478, -0.004424041137099266, -0.013559668324887753, 0.03572491183876991, 0.027036191895604134, 0.0107396449893713, -0.01707257144153118, -0.009152949787676334, 0.02800622396171093, 0.0013130082515999675, -0.020800268277525902, 0.010573354549705982, -0.0060765608213841915, 0.02595529705286026, 0.000964835868217051, -0.007178240455687046, 0.012056117877364159, -0.009915118105709553, -0.021423861384391785, -0.01022691372781992, 0.021451575681567192, -0.004091458395123482, 0.005144636612385511, -0.015617523342370987, -0.017696164548397064, -0.01643512211740017, 0.010185341350734234, -0.0015286672860383987, 0.002594836987555027, 0.02595529705286026, 0.0107396449893713, -0.0026433386374264956, 0.026856042444705963, -0.02089727111160755, -0.028338806703686714, -0.007663256488740444, -0.02763206884264946, -0.03517060726881027, -0.019636228680610657, -0.018153464421629906, 0.020024241879582405, 0.003533689770847559, 0.005518792197108269, -0.028990114107728004, -0.0036791947204619646, -0.0195115115493536, -0.004694264382123947, -0.01139788143336773, -0.006055774167180061, 0.0032877172343432903, 0.011169231496751308, -0.001363242045044899, 0.003907845355570316, 0.006121597718447447, 0.01942836493253708, -0.019303645938634872, -0.0157699566334486, 0.0022345390170812607, -0.007905764505267143, 0.01250649057328701, 
-0.004919450264424086, -0.0197055172175169, -0.013933823443949223, 0.004815518390387297, -0.0052000670693814754, 0.005979557521641254, 0.009520175866782665, -0.03957732394337654, -0.017848597839474678, -0.0023055593483150005, -0.008667932823300362, -0.006156241986900568, -0.0060349879786372185, -0.00774640217423439, -0.004656156059354544, -0.009014373645186424, 0.0011432525934651494, 0.0061181336641311646, -0.008113629184663296, 0.0032322867773473263, 0.006028058938682079, 0.021507006138563156, 0.010968295857310295, 0.012069975957274437, -0.01848604716360569, -0.004264678340405226, -0.01586695946753025, 0.01363588497042656, 0.013996182940900326, -0.009104447439312935, 0.020578546449542046, -0.010510995052754879, -0.0008639352163299918, -0.000546509400010109, 0.012277839705348015, -0.0031560698989778757, -0.004843233618885279, 0.02359950542449951, 0.005321321077644825, -0.01700328290462494, 0.009769612923264503, -0.02314220368862152, -0.03397885337471962, -0.00943010114133358, -0.009859687648713589, 0.0017356652533635497, -0.014016969129443169, 0.009395457804203033, 0.0027940399013459682, -0.011349380016326904, -0.011044512502849102, -0.012215480208396912, -0.0015226046089082956, 0.022851193323731422, -0.004957559052854776, -0.005951842293143272, 0.015312655828893185, 0.0011138052213937044, -0.008952014148235321, 0.002056122524663806, -0.0042473566718399525, -0.0007755929254926741, -0.008619431406259537, -0.03386799246072769, 0.013836820609867573, 0.20564688742160797, -0.01232634112238884, 0.004441363271325827, 0.03467173129320145, -0.007864192128181458, 0.0023800439666956663, 0.025262417271733284, -0.007108952384442091, -0.005820194724947214, 0.01553437765687704, -0.014647490344941616, -0.008494713343679905, -0.01567295379936695, 0.011245448142290115, 0.010718858800828457, -0.014495057053864002, -0.036556366831064224, -0.015908533707261086, -0.02959984913468361, 0.0027749857399612665, 0.009014373645186424, 0.009021301753818989, 0.014127830043435097, 
-0.01895720697939396, 0.0358634851872921, -0.010219985619187355, -0.020135102793574333, 0.00882729608565569, 0.02474968694150448, 0.01549280434846878, -0.00957560632377863, 0.007171311415731907, -0.013628956861793995, -0.009443959221243858, -0.02351635880768299, 0.006405679043382406, 0.006336390972137451, -0.001817078678868711, 0.028130942955613136, 0.003928631544113159, -0.012714354321360588, 0.0030833175405859947, -0.016878565773367882, -0.013739817775785923, -0.0033137002028524876, 0.004728908184915781, -0.020038099959492683, 0.0028858466539531946, 0.0056296526454389095, 0.02436167374253273, -0.03564176708459854, 0.01671227440237999, 0.003436686471104622, 0.018818629905581474, 0.007365318015217781, -5.743328438256867e-05, 0.020370682701468468, 0.008175987750291824, -0.024777401238679886, -0.014301050454378128, -0.017405154183506966, 0.020647834986448288, 0.003616835456341505, 0.0003929930680897087, -0.00551186315715313, -0.008564000949263573, -0.00616317056119442, -0.014030827209353447, 0.00316126667894423, -0.032731667160987854, 0.016185684129595757, -0.011765108443796635, -0.010815862566232681, 0.007600897457450628, -0.019539225846529007, -0.013850677758455276, -0.015797672793269157, 0.0417945422232151, 0.005570758134126663, 0.005671225488185883, -0.011882898397743702, -0.024694256484508514, -0.00762861268594861, -0.01868005469441414, 0.0029984398279339075, -0.018887918442487717, -0.007697900757193565, -0.00513424351811409, -0.0019608514849096537, -0.010705001652240753, 0.006474967114627361, -0.0019314039964228868, -0.028629815205931664, 0.004157281946390867, 0.018347471952438354, 0.012818286195397377, -0.010344703681766987, 0.013996182940900326, -0.01404468435794115, 0.007018878124654293, -0.027465777471661568, 0.05748135223984718, 0.009804257191717625, -0.006173564121127129, -0.01643512211740017, 0.008550143800675869, -0.010275415144860744, 0.013642814010381699, 0.0042196414433419704, -0.008889654651284218, 0.011564172804355621, -0.039078451693058014, 
0.01782088354229927, -0.006862979847937822, 0.007330674212425947, 0.0045937965624034405, 0.006644722539931536, -0.012263982556760311, 0.009922046214342117, -0.008439282886683941, -0.0009968816302716732, -0.026107732206583023, 0.006232458632439375, 0.024846689775586128, -0.0033691306598484516, -0.020384540781378746, -0.013337946496903896, -0.0008695648284628987, -0.00043001887388527393, -0.02024596370756626, 0.0024753150064498186, -0.009631036780774593, 0.005262426100671291, 0.013622027821838856, 0.014411911368370056, -0.006055774167180061, 0.0019210107857361436, 0.0036272285506129265, -0.01791788637638092, -0.02483283169567585, -0.016074825078248978, -0.007122809998691082, -0.0015381943667307496, -0.01405161339789629, 0.0029291517566889524, -0.001675038249231875, -0.0027074299287050962, -0.00025246827863156796, -0.02595529705286026, -0.00669668847694993, -0.0228650514036417, -0.006769441068172455, 0.0054564327001571655, -0.002397366100922227, 0.024500248953700066, 0.01326172985136509, -0.007365318015217781, -0.013421092182397842, 0.011522600427269936, -0.004933307878673077, -0.007670185528695583, -0.005335178691893816, -0.009076732210814953, 0.01570066809654236, -0.01940065063536167, -0.008564000949263573, -0.18203352391719818, -0.0031248903833329678, 0.0025931047275662422, -0.020481543615460396, -0.007420748472213745, 8.612286183051765e-05, 0.0030174939893186092, 0.02634331025183201, -0.025650430470705032, 0.007483107969164848, 0.014342622831463814, -1.8489214426153922e-06, -0.003095442894846201, -0.009534033946692944, -0.02025982178747654, -0.0017157449619844556, 0.014938500709831715, -0.006689759902656078, 0.024486390873789787, 0.008210632018744946, 0.036002062261104584, -0.03314739465713501, 0.007254457101225853, 0.001029793405905366, 0.01728043518960476, -0.0018759735394269228, -0.007940408773720264, 0.014536629430949688, -0.0007128006545826793, -0.01716957427561283, -0.02063397690653801, 0.026828326284885406, 0.016864707693457603, 0.006274031475186348, 
0.024888262152671814, -0.0006430795765481889, -0.007635541260242462, -0.005058026406913996, -0.011924470774829388, 0.001142386463470757, 0.004493329208344221, 0.003959811292588711, -0.001665511168539524, 0.03062531165778637, -0.008383852429687977, 0.05121771618723869, 0.005598473362624645, -0.014578202739357948, 0.011280092410743237, -0.03893987461924553, 0.032343655824661255, -0.01635197550058365, 0.005463361740112305, -0.017044857144355774, 0.012700497172772884, 0.018444474786520004, -0.009118305519223213, -0.013227085582911968, -0.002697036834433675, -0.006790227256715298, -0.020204391330480576, -0.014952357858419418, -0.007379175629466772, -0.0018638481851667166, -0.005203531589359045, -0.028962397947907448, -0.01036548987030983, 0.014162474311888218, -0.019372934475541115, 0.016088681295514107, -0.001000346033833921, -0.019733231514692307, -0.009346956387162209, -0.023405497893691063, 0.03749868646264076, 0.01167503371834755, -0.00906287506222725, -0.004042956978082657, 0.025414850562810898, -0.025608858093619347, -0.02389051392674446, 0.005910269450396299, -0.0023453999310731888, -0.009416243992745876, -0.00230036280117929, 0.004534902051091194, -0.005688547622412443, 0.02624630741775036, -0.010989082045853138, -0.004437898751348257, 0.027479635551571846, -0.018998779356479645, 0.025262417271733284, -0.0425705686211586, 0.011210803873836994, 0.02398751862347126, 0.02698076143860817, -0.007981982082128525, 0.0029828499536961317, -0.03367398679256439, -0.007940408773720264, 0.008037412539124489, -0.003755411598831415, -0.0031127650290727615, 0.031318191438913345, -0.012284768745303154, -0.0050407047383487225, 0.032537661492824554, 0.02904554456472397, -0.011862111277878284, 0.024486390873789787, 0.002710894448682666, 0.007836476899683475, 0.0012731676688417792, -0.016102539375424385, -0.0005560364807024598, -0.010330845601856709, -0.016767704859375954, -0.002257057698443532, -0.01180668082088232, 0.05881168320775032, 0.0033379511442035437, 0.011460240930318832, 
0.01923435926437378, -0.014924642629921436, -0.01801488921046257, -0.08641603589057922, -0.021410003304481506, 0.027853790670633316, 0.011986830271780491, 0.009305383078753948, 0.02007967233657837, -0.008633289486169815, 0.025470281019806862, -0.0006045381305739284, 0.03852414712309837, -0.026662036776542664, -0.0010514459572732449, -0.00909751933068037, 0.016462836414575577, 0.021811872720718384, 0.015368086285889149, -0.028255660086870193, -0.007136667612940073, -0.00027238859911449254, 0.014010041020810604, -0.022075168788433075, -0.02222760207951069, -0.012028402648866177, -0.022657187655568123, -0.01301922183483839, -0.009000515565276146, -0.033369116485118866, 0.01521565206348896, 0.004271607380360365, -0.0005932787898927927, -0.007413819897919893, -0.012492632493376732, 0.013033078983426094, 0.011266234330832958, -0.0052277822978794575, 0.012457989156246185, -0.03777583688497543, -0.01624111458659172, 0.03408971428871155, -0.005969164427369833, -0.01288757473230362, 0.005660832393914461, -0.009346956387162209, -0.021118992939591408, -0.0020197462290525436, -0.0050961351953446865, -0.017335865646600723, 0.010711930692195892, 0.0018673125887289643, -0.045037221163511276, -0.02549799717962742, -0.01446734182536602, -0.014287193305790424, -0.008938156068325043, 0.01597782038152218, -0.024042949080467224, -0.015936248004436493, 0.01913735456764698, -0.02836652100086212, 0.005882554221898317, 0.011287020519375801, -0.01068421546369791, -0.0009197986801154912, 0.014716778881847858, 0.014342622831463814, -0.00671747513115406, -0.004812053870409727, -0.016365833580493927, 0.04667241871356964, 0.009658752009272575, -0.008349208161234856, 0.022560184821486473, -0.019262073561549187, 0.017793167382478714, 0.0012662388617172837, 0.0192205011844635, -0.016185684129595757, -0.0024493320379406214, 0.009409314952790737, 0.004548759665340185, 0.007296029943972826, -0.017141859978437424, -0.01969165913760662, -0.007912693545222282, 0.012527276761829853, 0.027770644053816795, 
-0.0018413295038044453, -0.006949590053409338, 0.014113972894847393, -0.006388356909155846, -0.01697556860744953, 0.020412255078554153, 0.0352814681828022, -0.019386792555451393, -0.005030311178416014, 0.016407405957579613, -0.028685245662927628, 0.015409658662974834, 0.020938843488693237, 0.0036688013933598995, -0.027645926922559738, -0.019456081092357635, -0.0866931900382042, 0.01092672348022461, 0.004250820726156235, -0.005446039605885744, 0.013227085582911968, -0.013240943662822247, 0.01980252005159855, -0.01690628007054329, 0.016573697328567505, 0.0050268471240997314, -0.041129376739263535, 0.02278190664947033, -0.016823135316371918, -0.01329637411981821, -0.01409318670630455, -0.010899008251726627, 0.02688375674188137, -0.0052139246836304665, 0.021922733634710312, 0.005681619048118591, 0.010711930692195892, -0.019857950508594513, 0.013365661725401878, 0.004860555753111839, 0.015298797748982906, 0.007302958983927965, -0.021770300343632698, -0.0061319912783801556, -0.016726132482290268, 0.007046593353152275, -3.6606277262762887e-06, -0.012097691185772419, 0.006059238687157631, 0.04060278832912445, -0.01325480081140995, -0.04024248942732811, -0.0029447413980960846, 0.025054553523659706, 0.021507006138563156, 0.004067207686603069, 0.010012120939791203, -0.02642645686864853, -0.004469078034162521, -0.005747442599385977, 0.0006430795765481889, -0.016407405957579613, -0.012444131076335907, 0.008903512731194496, 0.03733239322900772, 0.014370338059961796, 0.04262600094079971, -0.001653385697863996, -0.03295338898897171, -0.03142905235290527, 0.005317856557667255, -0.02278190664947033, 0.003769269213080406, -0.01353195309638977, 0.009679538197815418, -0.007393033243715763, 0.02846352569758892, 0.019954953342676163, 0.023779653012752533, -0.009076732210814953, -0.0052277822978794575, -0.012402558699250221, -0.020800268277525902, -0.010136839933693409, -0.009596392512321472, -0.0181257501244545, -0.0009691664017736912, 0.010289273224771023, -0.00025485004880465567, 
-0.012492632493376732, -0.020786410197615623, 0.011792823672294617, 0.011931399814784527, 0.0004178934614174068, -0.014314907602965832, 0.02865753136575222, 0.003306771395727992, -0.031124185770750046, -0.04201626405119896, 0.0008587386109866202, 0.020107388496398926, 0.0194145068526268, -0.010947509668767452, 0.0126104224473238, 0.021534720435738564, 0.007046593353152275, -0.034782592207193375, 0.022435465827584267, -0.016850849613547325, -0.015049361623823643, 0.000514896702952683, -0.009651822969317436, -0.006149312946945429, -0.013324089348316193, 0.019830236211419106, 0.008570929989218712, 0.005726655945181847, 0.010573354549705982, -0.025096125900745392, -0.02213059924542904, -0.022754190489649773, -0.012257053516805172, -0.030209584161639214, -0.023377783596515656, 0.020093530416488647, 0.031124185770750046, 0.0052693551406264305, -0.030403589829802513, -0.006059238687157631, 0.019622372463345528, -0.018458332866430283, -0.015007788315415382, -0.007503894157707691, -0.0008011428872123361, -0.01820889487862587, 0.014592059887945652, 0.01801488921046257, 0.009159877896308899, 0.015478947199881077, 0.013490380719304085, -0.011106871999800205, -0.0012177372118458152, 0.02707776427268982, -0.021617867052555084, 0.003928631544113159, -0.00282521964982152, 0.004922914784401655, -0.010413991287350655, -0.02369650825858116, -0.012347128242254257, -0.0061804926954209805, 0.006790227256715298, 0.006610078737139702, 0.018790915608406067, -0.008002768270671368, 0.09622722119092941, -0.00882729608565569, -0.02063397690653801, 0.012139263562858105, -0.017238862812519073, 0.010455564595758915, -0.002553264144808054, 0.008647146634757519, -0.009499389678239822, -0.008688719943165779, 0.013150868937373161, -0.014799924567341805, 0.016767704859375954, 0.002852934878319502, -0.026939187198877335, 0.0013597776414826512, -0.010802004486322403, 0.012631208635866642, -0.012014545500278473, 0.000964835868217051, 0.02540099434554577, 0.004153817892074585, 0.008841153234243393, 
-0.0030711921863257885, 0.0007119345827959478, 0.026357168331742287, 0.027382630854845047, -0.012451060116291046, -0.01470292080193758, -0.01525722537189722, 0.018222752958536148, 0.01553437765687704, -0.022615615278482437, -0.011543386615812778, 0.00023341407359112054, -0.01942836493253708, -0.009679538197815418, -0.011972972191870213, 0.0014749690890312195, -0.0004884806694462895, -0.009187593124806881, 0.01184825412929058, -0.02686990052461624, -0.03359083831310272, -0.015090934000909328, 0.00816905964165926, 0.006790227256715298, -0.007406890857964754, -0.025692002847790718]\n",
            "\n",
            "Observation: \u001b[36;1m\u001b[1;3m['Alan Schelten Ruan Silva Eric Michael Smith Ranjan Subramanian Xiaoqing Ellen Tan Binh Tang\\nRoss Taylor Adina Williams Jian Xiang Kuan Puxin Xu Zheng Yan Iliyan Zarov Yuchen Zhang\\nAngela Fan Melanie Kambadur Sharan Narang Aurelien Rodriguez Robert Stojnic\\nSergey Edunov Thomas Scialom\\x03\\nGenAI, Meta\\nAbstract\\nIn this work, we develop and release Llama 2, a collection of pretrained and fine-tuned\\nlarge language models (LLMs) ranging in scale from 7 billion to 70 billion parameters.\\nOur fine-tuned LLMs, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , are optimized for dialogue use cases. Our\\nmodels outperform open-source chat models on most benchmarks we tested, and based on\\nourhumanevaluationsforhelpfulnessandsafety,maybeasuitablesubstituteforclosedsource models. We provide a detailed description of our approach to fine-tuning and safety', 'asChatGPT,BARD,andClaude. TheseclosedproductLLMsareheavilyfine-tunedtoalignwithhuman\\npreferences, which greatly enhances their usability and safety. This step can require significant costs in\\ncomputeandhumanannotation,andisoftennottransparentoreasilyreproducible,limitingprogresswithin\\nthe community to advance AI alignment research.\\nIn this work, we develop and release Llama 2, a family of pretrained and fine-tuned LLMs, L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle and\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , at scales up to 70B parameters. On the series of helpfulness and safety benchmarks we tested,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc models generally perform better than existing open-source models. 
They also appear to\\nbe on par with some of the closed-source models, at least on the human evaluations we performed (see', 'models will be released as we improve model safety with community feedback.\\nLicense A custom commercial license is available at: ai.meta.com/resources/\\nmodels-and-libraries/llama-downloads/\\nWhere to send commentsInstructions on how to provide feedback or comments on the model can be\\nfound in the model README, or by opening an issue in the GitHub repository\\n(https://github.com/facebookresearch/llama/ ).\\nIntended Use\\nIntended Use Cases L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is intended for commercial and research use in English. Tuned models\\nare intended for assistant-like chat, whereas pretrained models can be adapted\\nfor a variety of natural language generation tasks.\\nOut-of-Scope Uses Use in any manner that violates applicable laws or regulations (including trade\\ncompliancelaws). UseinlanguagesotherthanEnglish. Useinanyotherway\\nthat is prohibited by the Acceptable Use Policy and Licensing Agreement for\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle.\\nHardware and Software (Section 2.2)\\nTraining Factors We usedcustomtraininglibraries, Meta’sResearchSuperCluster, andproductionclustersforpretraining. Fine-tuning,annotation,andevaluationwerealso', 'Evaluation Results\\nSee evaluations for pretraining (Section 2); fine-tuning (Section 3); and safety (Section 4).\\nEthical Considerations and Limitations (Section 5.2)\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is a new technology that carries risks with use. Testing conducted to date has been in\\nEnglish, and has notcovered, nor could it coverall scenarios. For these reasons, aswith all LLMs,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle’s potential outputs cannot be predicted in advance, and the model may in some instances\\nproduceinaccurateorobjectionableresponsestouserprompts. 
Therefore,beforedeployingany\\napplications of L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle, developers should perform safety testing and tuning tailored to their\\nspecific applications of the model. Please see the Responsible Use Guide available available at\\nhttps://ai.meta.com/llama/responsible-user-guide\\nTable 52: Model card for L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle .\\n77', 'Baptiste Rozière, Naman Goyal, Eric Hambro, Faisal Azhar, Aur’elien Rodriguez, Armand Joulin, Edouard\\nGrave, and Guillaume Lample. Llama: Open and efficient foundation language models. arXiv preprint\\narXiv:2302.13971 , 2023.\\nAshish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser,\\nand Illia Polosukhin. Attention is all you need, 2017.\\nOriol Vinyals, Igor Babuschkin, Wojciech M Czarnecki, Michaël Mathieu, Andrew Dudzik, Junyoung Chung,\\nDavid H Choi, Richard Powell, Timo Ewalds, Petko Georgiev, et al. Grandmaster level in starcraft ii using\\nmulti-agent reinforcement learning. Nature, 575(7782):350–354, 2019.\\nYizhong Wang, Yeganeh Kordi, Swaroop Mishra, Alisa Liu, Noah A Smith, Daniel Khashabi, and HannanehHajishirzi. Self-instruct: Aligninglanguagemodel withselfgeneratedinstructions. arXivpreprint']\u001b[0m\n",
            "Thought:\u001b[32;1m\u001b[1;3m```json\n",
            "{\n",
            "    \"action\": \"Final Answer\",\n",
            "    \"action_input\": \"Llama 2 is a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. These models, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc, are optimized for dialogue use cases. They outperform open-source chat models on most benchmarks tested and may be a suitable substitute for closed-source models. The approach to fine-tuning and safety is detailed in the work. Llama 2 is intended for commercial and research use in English, with tuned models intended for assistant-like chat and pretrained models adaptable for various natural language generation tasks.\"\n",
            "}\n",
            "```\u001b[0m\n",
            "\n",
            "\u001b[1m> Finished chain.\u001b[0m\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "{'input': 'tell me about Llama 2?',\n",
              " 'chat_history': [],\n",
              " 'output': 'Llama 2 is a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. These models, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc, are optimized for dialogue use cases. They outperform open-source chat models on most benchmarks tested and may be a suitable substitute for closed-source models. The approach to fine-tuning and safety is detailed in the work. Llama 2 is intended for commercial and research use in English, with tuned models intended for assistant-like chat and pretrained models adaptable for various natural language generation tasks.',\n",
              " 'intermediate_steps': [(AgentAction(tool='Vector Search Tool', tool_input='Llama 2 information', log='```json\\n{\\n    \"action\": \"Vector Search Tool\",\\n    \"action_input\": \"Llama 2 information\"\\n}\\n```'),\n",
              "   ['Alan Schelten Ruan Silva Eric Michael Smith Ranjan Subramanian Xiaoqing Ellen Tan Binh Tang\\nRoss Taylor Adina Williams Jian Xiang Kuan Puxin Xu Zheng Yan Iliyan Zarov Yuchen Zhang\\nAngela Fan Melanie Kambadur Sharan Narang Aurelien Rodriguez Robert Stojnic\\nSergey Edunov Thomas Scialom\\x03\\nGenAI, Meta\\nAbstract\\nIn this work, we develop and release Llama 2, a collection of pretrained and fine-tuned\\nlarge language models (LLMs) ranging in scale from 7 billion to 70 billion parameters.\\nOur fine-tuned LLMs, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , are optimized for dialogue use cases. Our\\nmodels outperform open-source chat models on most benchmarks we tested, and based on\\nourhumanevaluationsforhelpfulnessandsafety,maybeasuitablesubstituteforclosedsource models. We provide a detailed description of our approach to fine-tuning and safety',\n",
              "    'asChatGPT,BARD,andClaude. TheseclosedproductLLMsareheavilyfine-tunedtoalignwithhuman\\npreferences, which greatly enhances their usability and safety. This step can require significant costs in\\ncomputeandhumanannotation,andisoftennottransparentoreasilyreproducible,limitingprogresswithin\\nthe community to advance AI alignment research.\\nIn this work, we develop and release Llama 2, a family of pretrained and fine-tuned LLMs, L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle and\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , at scales up to 70B parameters. On the series of helpfulness and safety benchmarks we tested,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc models generally perform better than existing open-source models. They also appear to\\nbe on par with some of the closed-source models, at least on the human evaluations we performed (see',\n",
              "    'models will be released as we improve model safety with community feedback.\\nLicense A custom commercial license is available at: ai.meta.com/resources/\\nmodels-and-libraries/llama-downloads/\\nWhere to send commentsInstructions on how to provide feedback or comments on the model can be\\nfound in the model README, or by opening an issue in the GitHub repository\\n(https://github.com/facebookresearch/llama/ ).\\nIntended Use\\nIntended Use Cases L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is intended for commercial and research use in English. Tuned models\\nare intended for assistant-like chat, whereas pretrained models can be adapted\\nfor a variety of natural language generation tasks.\\nOut-of-Scope Uses Use in any manner that violates applicable laws or regulations (including trade\\ncompliancelaws). UseinlanguagesotherthanEnglish. Useinanyotherway\\nthat is prohibited by the Acceptable Use Policy and Licensing Agreement for\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle.\\nHardware and Software (Section 2.2)\\nTraining Factors We usedcustomtraininglibraries, Meta’sResearchSuperCluster, andproductionclustersforpretraining. Fine-tuning,annotation,andevaluationwerealso',\n",
              "    'Evaluation Results\\nSee evaluations for pretraining (Section 2); fine-tuning (Section 3); and safety (Section 4).\\nEthical Considerations and Limitations (Section 5.2)\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is a new technology that carries risks with use. Testing conducted to date has been in\\nEnglish, and has notcovered, nor could it coverall scenarios. For these reasons, aswith all LLMs,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle’s potential outputs cannot be predicted in advance, and the model may in some instances\\nproduceinaccurateorobjectionableresponsestouserprompts. Therefore,beforedeployingany\\napplications of L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle, developers should perform safety testing and tuning tailored to their\\nspecific applications of the model. Please see the Responsible Use Guide available available at\\nhttps://ai.meta.com/llama/responsible-user-guide\\nTable 52: Model card for L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle .\\n77',\n",
              "    'Baptiste Rozière, Naman Goyal, Eric Hambro, Faisal Azhar, Aur’elien Rodriguez, Armand Joulin, Edouard\\nGrave, and Guillaume Lample. Llama: Open and efficient foundation language models. arXiv preprint\\narXiv:2302.13971 , 2023.\\nAshish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser,\\nand Illia Polosukhin. Attention is all you need, 2017.\\nOriol Vinyals, Igor Babuschkin, Wojciech M Czarnecki, Michaël Mathieu, Andrew Dudzik, Junyoung Chung,\\nDavid H Choi, Richard Powell, Timo Ewalds, Petko Georgiev, et al. Grandmaster level in starcraft ii using\\nmulti-agent reinforcement learning. Nature, 575(7782):350–354, 2019.\\nYizhong Wang, Yeganeh Kordi, Swaroop Mishra, Alisa Liu, Noah A Smith, Daniel Khashabi, and HannanehHajishirzi. Self-instruct: Aligninglanguagemodel withselfgeneratedinstructions. arXivpreprint'])]}"
            ]
          },
          "metadata": {},
          "execution_count": 17
        }
      ],
      "source": [
        "agent(\"tell me about Llama 2?\")"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 18,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "pVytkznkXQ21",
        "outputId": "41aa81d9-a0f3-4f2a-d24e-1b6e8997d727"
      },
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "\n",
            "\n",
            "\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
            "\u001b[32;1m\u001b[1;3m```json\n",
            "{\n",
            "    \"action\": \"Vector Search Tool\",\n",
            "    \"action_input\": \"Llama 2 features and advantages\"\n",
            "}\n",
            "```\u001b[0mLlama 2 features and advantages\n",
            "[-0.007560313679277897, 0.014035549946129322, -0.019672317430377007, -0.002987486543133855, -0.004100747872143984, 0.02533726766705513, -0.024097178131341934, -0.00222476152703166, -0.027620157226920128, -0.029593026265501976, 0.0096459174528718, 0.018051745370030403, 0.0035652550868690014, -0.009364078752696514, -0.0043403105810284615, 0.007828060537576675, 0.013986228033900261, 0.0006363381398841739, 0.007813967764377594, 0.021631093695759773, -0.006552741397172213, -0.0042768968269228935, 0.00280957599170506, -0.029283003881573677, -0.011618785560131073, 0.009364078752696514, 0.030748562887310982, -0.011477867141366005, 0.004231098107993603, 0.006443529389798641, 0.027930179610848427, 0.0001197812962345779, -0.03821728006005287, -0.007377118803560734, -0.03801999241113663, -0.009018827229738235, 0.00849742628633976, -0.001609120867215097, 0.034835219383239746, -0.010174364782869816, 0.026351885870099068, 0.016896208748221397, 0.00714460201561451, -0.019179100170731544, -0.01334504596889019, 0.009230205789208412, 0.02318120375275612, 0.0006253288593143225, -0.02948029153048992, 0.00041923453682102263, 0.0042768968269228935, 0.012464300729334354, -0.024280373007059097, -0.006165213882923126, 0.0006548337987624109, 0.008779264986515045, -0.02278663031756878, -0.02072921022772789, 0.021645184606313705, -0.010160272009670734, -0.004002104513347149, 0.0013052638387307525, -0.022702079266309738, -0.004477706737816334, -0.02280072309076786, -0.0002481498522683978, 0.009230205789208412, 0.01509244367480278, 0.006975498981773853, -0.004428384825587273, 0.0096459174528718, 0.016064785420894623, -0.003457804210484028, -0.017995378002524376, 0.007961933501064777, -0.016656646504998207, -0.043487656861543655, -0.00092125911032781, 0.006158167961984873, -0.015078351832926273, 0.006214535795152187, -0.03376423567533493, 0.01299274805933237, 0.01651572808623314, 0.03398970514535904, 0.03041035681962967, 0.011273534037172794, 0.0026246195193380117, 
-0.03170681372284889, -0.014000319875776768, -0.0029698715079575777, 0.014613318257033825, 0.01699485257267952, 0.03181954845786095, -0.009934801608324051, -0.0032481870148330927, 0.0019746299367398024, -0.02065875008702278, -0.014768329448997974, -0.009561366401612759, -0.008363553322851658, -0.0012964564375579357, -0.02771880105137825, -0.003093175822868943, -0.048419829457998276, -0.02867705188691616, 0.020151441916823387, 0.004541120491921902, 0.025830484926700592, 0.009265435859560966, -0.01802356168627739, 0.029677577316761017, 0.01574067212641239, -0.022913457825779915, -0.0007983951945789158, -2.7426944143371657e-05, 0.015162902884185314, -0.0006447052001021802, 0.007835105992853642, 0.0008164504542946815, 0.006816965062171221, 0.0035458786878734827, 0.028536133468151093, -0.02366033010184765, 0.0015421841526404023, -0.023956259712576866, 0.006140552926808596, -0.013020931743085384, -0.01970050111413002, 0.004188822582364082, 0.007179832085967064, 0.028014732524752617, 0.017008943483233452, -0.007510991767048836, -0.007637819275259972, -0.005027291364967823, 0.0036603754851967096, 0.00021490173821803182, -0.0020169056951999664, 0.014951524324715137, 0.022053850814700127, 0.017206231132149696, -0.021814288571476936, -0.0008032392943277955, -0.011527188122272491, 0.032777801156044006, 0.01104101724922657, 0.00029460914083756506, 0.011491958983242512, -0.009681147523224354, -0.007377118803560734, -0.03105858527123928, -0.00029659082065336406, 0.003521217964589596, -0.007112895138561726, 0.014726053923368454, 0.008835632354021072, 0.026365976780653, -0.014909248799085617, -0.01730487495660782, -0.011351039633154869, 0.006070093251764774, -0.007070619612932205, 0.0075180381536483765, 0.032045021653175354, 0.02843748964369297, 0.024519937112927437, -0.007035389542579651, 0.002369203604757786, 0.002570013515651226, 0.0009846726898103952, 0.027296043932437897, -0.00367446755990386, 0.03782270476222038, -0.013619838282465935, -0.007440532557666302, 
0.008081714622676373, -0.003269324777647853, -0.0077012330293655396, -0.023237571120262146, 0.0074898540042340755, -0.0020574198570102453, 0.0002470489125698805, 0.00601724861189723, -0.04861711338162422, 0.0003236737393308431, -0.008159220218658447, 0.00675002858042717, 0.026943746954202652, -0.0018495641415938735, -0.008877907879650593, 0.0228852741420269, -0.0058093927800655365, -0.017671264708042145, -0.6664913296699524, 0.0011026925640180707, 0.0108437305316329, -0.00829309318214655, -0.005580399185419083, 0.009209067560732365, -0.002265275688841939, -0.015472925268113613, -0.0033732526935636997, 0.0038647083565592766, -0.0035952003672719, -0.013309815898537636, 0.0003001138102263212, -0.019911879673600197, -0.022124310955405235, -0.04526323825120926, 0.0028588976711034775, -0.02312483638525009, 0.0018301877425983548, 0.0005460617830976844, -0.021405622363090515, 0.004283942747861147, -0.002448470564559102, -0.007842152379453182, -0.0018742249812930822, 0.022208862006664276, 0.022828906774520874, -0.008426966145634651, -0.00925134401768446, 0.015670211985707283, -0.02962120994925499, 0.01748806983232498, 0.010378696955740452, -0.003070276463404298, 0.04813798889517784, -0.008631299249827862, -0.006725367624312639, 0.026986021548509598, 0.005210486240684986, 0.03001578338444233, -0.01588159054517746, -0.01474014576524496, 0.021180151030421257, 0.015261546708643436, 0.0021719166543334723, -0.0023515888024121523, 0.04154297336935997, 0.029113901779055595, -0.008074668236076832, -0.017051219940185547, 0.013683252036571503, -0.013098437339067459, 0.01682574860751629, 0.013880538754165173, 0.01922137476503849, -0.0007094399770721793, 0.04827890917658806, -0.008976551704108715, -0.016642553731799126, 0.002323404885828495, -0.009434538893401623, -0.014500582590699196, -0.02026417665183544, -0.026802826672792435, -0.027916088700294495, 0.01314071286469698, -0.003952782601118088, 0.009906617924571037, 0.02175792120397091, -0.0013246402377262712, -0.006831056904047728, 
0.03590620681643486, 0.012499530799686909, 0.0018513256218284369, 0.014387847855687141, 0.030635828152298927, 0.008913137950003147, -0.016543911769986153, -0.013295724056661129, 0.01636071689426899, 0.014458307065069675, -0.006158167961984873, -0.005347882863134146, -0.00522457854822278, 0.03429972752928734, 0.002281129127368331, -0.017502160742878914, -0.003949259873479605, 0.0008820659713819623, 0.014754237607121468, 0.014197606593370438, 1.2956582395418081e-05, -0.02001052349805832, -0.009124516509473324, 0.013950997963547707, 0.029677577316761017, -0.014444215223193169, 0.009032919071614742, 0.003140736138448119, -0.025562738999724388, -0.01493743248283863, -0.018869077786803246, 0.011766751296818256, 0.002485461998730898, 0.024167638272047043, 0.02423809841275215, -0.031199505552649498, -0.0067465053871273994, 0.022293413057923317, -0.03181954845786095, -0.019756868481636047, 0.008800402283668518, -0.005601536948233843, -0.02764834277331829, 0.013570516370236874, -0.03373605012893677, 0.013232310302555561, -0.0016390661476179957, 0.03018488734960556, -0.014641501940786839, 0.002525976160541177, 0.007828060537576675, 0.025280900299549103, -0.005379589274525642, -0.00446713762357831, 0.02374488115310669, -0.017107587307691574, -0.012844782322645187, -0.009272481314837933, -0.014211698435246944, 0.0024361403193324804, 0.0027708231937140226, 0.014113055542111397, -0.006919131614267826, 0.015994327142834663, -0.0064012533985078335, 0.02749333158135414, -0.010160272009670734, 0.014782421290874481, -0.003311600536108017, -0.012485438957810402, 0.008835632354021072, 0.0008450746536254883, -0.019911879673600197, -0.004601011052727699, -0.024083087220788002, -0.00789851974695921, 0.00482648191973567, -0.006422391161322594, -0.008532656356692314, -0.011055109091103077, -0.04199391230940819, -0.02302619256079197, 0.018319493159651756, 0.022377964109182358, -0.0038224325980991125, -0.014909248799085617, -0.013605746440589428, -0.0015668451087549329, -0.03573710098862648, 
0.010435065254569054, 0.008955413475632668, -0.004079610109329224, 0.008722896687686443, -0.006732413545250893, -0.013105482794344425, -0.005879852455109358, 0.028944797813892365, -0.009624779224395752, -0.03229867294430733, -0.019320018589496613, 0.010364605113863945, 0.0032059112563729286, 0.03359512984752655, -0.003244664054363966, -0.008715851232409477, 0.004879326559603214, -0.019108640030026436, -0.009357033297419548, 0.008426966145634651, 0.013838263228535652, 0.03940100222826004, 0.015585660934448242, -0.0228852741420269, -0.013366183266043663, 0.003101983340457082, 0.01140036154538393, 0.008011255413293839, -0.023942166939377785, 0.028296569362282753, 0.00019112162408418953, 0.003625145647674799, -0.01891135238111019, -0.004431908018887043, -0.01270386390388012, 0.007666002959012985, 0.016078878194093704, 0.00041769322706386447, 0.03520160913467407, 0.011971083469688892, 0.0013774848775938153, 0.0029487337451428175, -0.012288152240216732, -0.02661963179707527, 0.008553793653845787, -0.027507422491908073, 0.004738407209515572, -0.010188456624746323, 0.016558002680540085, 0.02120833657681942, 0.007204492576420307, -0.04827890917658806, 0.001213666400872171, -0.012097910977900028, 0.018460411578416824, 0.026112323626875877, 0.01004753727465868, 0.0030896528623998165, -0.0032305719796568155, 0.003998581785708666, 0.0009538466692902148, 0.0019429230596870184, -0.00014224028564058244, -0.016741197556257248, -0.008574931882321835, -0.007581451442092657, -0.0030649921391159296, 0.018488595262169838, -0.02057419903576374, -0.011266487650573254, -0.00034943551872856915, -0.004882849287241697, 0.0046538556925952435, 0.011921762488782406, 0.010723949410021305, 0.002751446794718504, 0.01707940362393856, -0.005404250230640173, 0.03413062542676926, 0.010660535655915737, 0.015374281443655491, 0.012471347115933895, 0.010878960601985455, -0.010012307204306126, 0.014430123381316662, 0.031932283192873, 0.02088422141969204, -0.0054641407914459705, -0.02413945458829403, 
0.0037378810811787844, -0.003614576766267419, 0.005379589274525642, -0.0077223707921803, -0.002654565032571554, 0.020912405103445053, -0.01358460821211338, 0.01187948603183031, 0.017178047448396683, 0.020306453108787537, 0.027141032740473747, 0.01620570570230484, 0.008884954266250134, -0.0016575617482885718, 0.0023991488851606846, 0.017530344426631927, -0.014500582590699196, -0.011900624260306358, -0.03596257418394089, -0.009462722577154636, -0.02280072309076786, 0.013091390952467918, 0.00637306971475482, 0.02828247845172882, -0.0216874610632658, 0.011569464579224586, 0.019362295046448708, 0.004974446725100279, 0.00834241509437561, 0.00242204824462533, 0.02900116518139839, -0.004727838095277548, -0.018784526735544205, 0.02225113846361637, -0.00444952305406332, -0.008152173832058907, -0.01731896586716175, -0.02454812079668045, 0.011499004438519478, 0.001935877138748765, 0.017051219940185547, 0.010139134712517262, -0.009758653119206429, -0.01259112823754549, 0.009554320015013218, 0.013507102616131306, 0.02072921022772789, 0.03294690325856209, -0.025154072791337967, -0.008088761009275913, -0.01565612107515335, 0.013965089805424213, 0.0058410996571183205, -0.024984968826174736, -0.006422391161322594, 0.02701420523226261, -0.007236199453473091, -0.019038179889321327, -0.02962120994925499, 0.0058093927800655365, -0.02415354549884796, 0.00829309318214655, -0.034271541982889175, 0.016882117837667465, -0.034666117280721664, 0.019756868481636047, -0.006246242206543684, 0.017121680080890656, -0.020235992968082428, 0.0322423055768013, 0.010540754534304142, -0.009864342398941517, -0.01175265945494175, -0.011005787178874016, 0.010787363164126873, 0.05664950609207153, 0.00716573977842927, 0.021025139838457108, 0.007461670320481062, -0.019038179889321327, -0.014838788658380508, -0.014042595401406288, -0.020715119317173958, 0.015078351832926273, -0.0008085237350314856, -0.014965616166591644, -0.006179305724799633, 0.016698922961950302, -0.002779630711302161, 0.012555898167192936, 
-0.0016637269873172045, -0.004830004647374153, -0.01827721670269966, 0.015106535516679287, 0.022575251758098602, 0.012379749678075314, 0.020870130509138107, 0.03438427671790123, 0.009145654737949371, 0.005763594061136246, 0.010512569919228554, 0.01860133185982704, 0.007468716241419315, 0.005073090083897114, -0.01882680132985115, 0.013521194458007812, -0.02437901683151722, 0.0255909226834774, 0.017981287091970444, -0.011964038014411926, 0.03813272714614868, -0.010146180167794228, -0.004858188331127167, 0.009258389472961426, 0.0018072883831337094, 0.02732422761619091, 0.0177839994430542, 0.013070253655314445, 0.004076086916029453, 0.020292360335588455, -0.0037378810811787844, -0.002129641128703952, 0.02653508074581623, 0.011111476458609104, -0.0019253082573413849, 0.013788941316306591, -0.009328849613666534, 0.002737354952841997, 0.0031160751823335886, 0.005516985431313515, -0.006158167961984873, -0.02962120994925499, 0.006901516579091549, 0.0072714295238256454, -0.016022510826587677, -0.003949259873479605, -0.014225790277123451, 0.018150389194488525, -0.01906636357307434, 0.002913503907620907, -0.02754969894886017, -0.019799143075942993, 0.0036920823622494936, -0.030607644468545914, -0.015430649742484093, -0.009237252175807953, -0.022758446633815765, -0.05225282907485962, 0.010787363164126873, 0.022208862006664276, 0.020052798092365265, -0.0009194976300932467, -0.0020116211380809546, -0.013993274420499802, 0.013683252036571503, 0.016233889386057854, -0.0005081897834315896, 0.0035388327669352293, -0.017614897340536118, -0.013161851093173027, -0.016952576115727425, -0.00029460914083756506, -0.015036076307296753, -0.019601857289671898, 0.005058998242020607, 0.01827721670269966, -0.007250291295349598, 0.014711962081491947, -0.024252189323306084, 0.015191086567938328, 0.015303822234272957, 0.005210486240684986, 0.0009582503698766232, -0.009364078752696514, -0.020841944962739944, 0.025520462542772293, -0.026070047169923782, -0.025055428966879845, 0.0010225448058918118, 
0.008166265673935413, 0.008920183405280113, 0.025450002402067184, -0.005136503838002682, -0.0028060530312359333, 0.008828585967421532, 0.0069614071398973465, -0.014324434101581573, 0.010350513271987438, 0.012562944553792477, -0.010935327969491482, 0.02802882343530655, 0.005481755826622248, -0.003112552221864462, 0.013662113808095455, -0.03531434386968613, 0.009927756153047085, -0.019108640030026436, -0.0011114999651908875, 0.014669685624539852, -0.011872440576553345, -0.00984320417046547, 0.025196347385644913, -0.010082767345011234, -0.007391210645437241, 0.01088600605726242, -0.005376066546887159, 0.012661587446928024, 0.0011476104846224189, 0.006492850836366415, -0.01613524556159973, 0.0043438333086669445, -0.008236725814640522, 0.020038707181811333, -0.012527714483439922, -0.0017826275434345007, -0.0011308763641864061, 0.009469768032431602, -0.002666895277798176, -0.01588159054517746, -0.0037061742041260004, -0.01754443719983101, -0.015501108951866627, 0.021786104887723923, -0.01731896586716175, 0.018375860527157784, 0.004808866884559393, 0.007320750970393419, -0.06256811320781708, -0.016966668888926506, 0.014979708008468151, -0.01930592767894268, 0.0032640404533594847, -0.019263651221990585, 0.0073418887332081795, 0.010146180167794228, 0.011428545229136944, 0.016698922961950302, 0.00928657315671444, -0.025351358577609062, 0.004882849287241697, 0.00838469062000513, -0.0024731315206736326, 0.0014822935918346047, -0.02167336829006672, 0.029903048649430275, 0.01263340376317501, 0.015557477250695229, -0.0013871730770915747, 0.0011264726053923368, 0.0005147953634150326, 0.005890421569347382, -0.0001959657238330692, 0.0009230205905623734, -0.009434538893401623, -0.025844575837254524, -0.0020327591337263584, -0.01338732149451971, 0.01528973039239645, -0.01524745486676693, -0.014711962081491947, -0.0021789628081023693, -0.0023903416004031897, -0.012513622641563416, -0.009441584348678589, -0.015543384477496147, 0.013838263228535652, 0.007926703430712223, 
0.03415880724787712, 0.027380594983696938, 0.006112369243055582, -0.013274585828185081, -0.014232836663722992, -0.034102439880371094, 0.0025735364761203527, 0.007049481850117445, 0.01238679513335228, -0.0019376386189833283, 0.001978152897208929, -0.022744353860616684, 0.007461670320481062, 0.0025823437608778477, -0.009800928644835949, -0.0077998763881623745, -0.011520142666995525, -0.005668473895639181, 0.0139369061216712, -0.029593026265501976, -0.02716921642422676, -0.017333058640360832, -0.004685562569648027, 0.008483334444463253, 0.00444599986076355, 0.02295573428273201, -0.002751446794718504, -0.026070047169923782, 0.021180151030421257, -0.0036040078848600388, 0.03390515223145485, 0.006288518197834492, -0.0004742810851894319, 0.020292360335588455, 0.005703703500330448, -0.01183721050620079, -0.00985025055706501, 0.008800402283668518, -0.007503945846110582, -0.0005839338409714401, 0.033707864582538605, 0.001951730577275157, 0.011076247319579124, 0.01881271041929722, 0.013105482794344425, -0.018150389194488525, -0.03762542083859444, 0.006725367624312639, 0.00909633282572031, 0.003973920829594135, -0.03503250703215599, -0.009998215362429619, 0.00444599986076355, -0.003489511087536812, -0.017685355618596077, 0.022758446633815765, 0.0034666117280721664, 0.00016833234985824674, -0.007084711454808712, 0.016332531347870827, -0.015613844618201256, 0.009702284820377827, -0.020433280616998672, -0.004798297770321369, 0.019348202273249626, -0.0308894831687212, -0.0011159037239849567, -0.013161851093173027, 0.0030685150995850563, 0.030128519982099533, -0.003783679800108075, 0.003991535399109125, 0.021941116079688072, -0.005693134851753712, 0.010977603495121002, 0.01636071689426899, 0.009603641927242279, 0.03128405660390854, -0.022110218182206154, -0.026267334818840027, 0.021067416295409203, 0.014768329448997974, 0.018220849335193634, -0.012126094661653042, -0.001229519839398563, 0.014035549946129322, -0.0016901493072509766, -0.001519284793175757, 0.019362295046448708, 
0.018375860527157784, -0.0331723727285862, -0.016487542539834976, 0.0013158328365534544, -0.01922137476503849, -0.027972456067800522, 0.017840366810560226, -0.002222999930381775, -0.0004967401036992669, 0.017347149550914764, 0.01723441481590271, -0.0020732732955366373, 0.010082767345011234, 0.018559055402874947, 0.005287991836667061, 0.0019640610553324223, 0.02264571189880371, -0.04613693803548813, 0.026027770712971687, 0.004463614895939827, 0.007961933501064777, -0.022913457825779915, 0.023167112842202187, 0.0003833441878668964, -0.01579703949391842, 0.016783474013209343, -0.0019253082573413849, -0.015064259991049767, -0.0275637898594141, 0.008666529320180416, -0.0029223114252090454, 0.0026193351950496435, -0.012478392571210861, 0.023730788379907608, 0.018051745370030403, 0.004347356501966715, -0.03345421329140663, -0.009434538893401623, 0.022110218182206154, 0.009927756153047085, -0.008765172213315964, 0.018150389194488525, -0.0003069395897909999, 0.00698959082365036, -0.008398782461881638, -0.006168736610561609, 0.0009142131311818957, -0.00973046850413084, -0.0313122421503067, -0.01493743248283863, 0.019334111362695694, 0.004382586106657982, -0.0077998763881623745, -0.0029593026265501976, -0.009272481314837933, -0.00597497308626771, 0.0041747307404875755, -0.002695079194381833, 0.005851668771356344, 0.017685355618596077, -0.00928657315671444, -0.0017068835441023111, 0.03181954845786095, -0.019503213465213776, -0.030043967068195343, -0.0036057692486792803, -0.020546015352010727, -0.03455338254570961, -0.027211492881178856, -0.007475762162357569, 0.030804932117462158, 0.016966668888926506, 0.004907510243356228, -0.012471347115933895, 0.0013669159961864352, -0.019009996205568314, -0.01668483018875122, -0.020841944962739944, -0.0032481870148330927, 0.010096859186887741, 0.01533200591802597, 0.019658224657177925, 0.006288518197834492, 0.007447578478604555, 0.0032675634138286114, -0.026408253237605095, -0.01706531271338463, -0.014063733629882336, -0.01603660173714161, 
0.01493743248283863, -0.01258408185094595, -0.028944797813892365, -0.0005174375837668777, 0.01032937504351139, 0.005714272614568472, 0.014838788658380508, 0.007391210645437241, -0.018629515543580055, -0.009032919071614742, 0.006521034985780716, -0.013859400525689125, 0.001999290892854333, 0.008631299249827862, -0.005929174367338419, 0.010075720958411694, -0.007433486636728048, 0.013711435720324516, -0.0038647083565592766, -0.0025347836781293154, -0.00541834207251668, -0.006940269377082586, 0.017206231132149696, 0.008567885495722294, 0.004717269446700811, -0.017854459583759308, 0.016501635313034058, -0.013246402144432068, 0.0027232631109654903, 0.00039369292790070176, -0.0034824651665985584, 0.0272537674754858, -0.0003489951486699283, -0.0040302881971001625, 0.0006627605180256069, 0.01334504596889019, 0.01353528629988432, -0.01723441481590271, 0.027465147897601128, 0.014866973273456097, -0.0035176947712898254, 0.006869809702038765, -0.016501635313034058, -0.019080456346273422, -0.009836158715188503, -0.014528767205774784, -0.013760757632553577, 0.0010154987685382366, 0.0011661062017083168, 0.008673574775457382, -0.02628142572939396, -0.01788264326751232, -0.015698395669460297, 0.00973046850413084, 0.01603660173714161, -0.0100898128002882, 0.003998581785708666, 0.02318120375275612, 0.011548326350748539, -0.027606066316366196, 0.020123258233070374, -0.00949090626090765, 0.0018178573809564114, -0.0008094045333564281, -0.016882117837667465, 0.02105332538485527, 0.22186315059661865, -0.014458307065069675, -0.008328323252499104, 0.038330014795064926, 0.009913664311170578, 0.022589342668652534, 0.01574067212641239, -0.011301717720925808, -0.0053936815820634365, 0.022969825193285942, 0.0011775558814406395, 0.009357033297419548, -0.022462517023086548, 0.0012656303588300943, 0.011132614687085152, -0.019009996205568314, -0.015754763036966324, -0.018854985013604164, -0.03238322585821152, -0.010357559658586979, 0.011188982054591179, 0.006200443487614393, 0.014176469296216965, 
-0.003422574372962117, -0.011569464579224586, 0.015515200793743134, 0.025788208469748497, 0.010928281582891941, -0.008286047726869583, -0.024900417774915695, 0.012689772062003613, 0.014007366262376308, -0.030635828152298927, -0.014754237607121468, -0.012069727294147015, -0.016219796612858772, 0.0008979193517006934, -0.027676526457071304, -0.026394160464406013, 0.011301717720925808, -0.022335689514875412, 0.009617733769118786, -0.026196874678134918, -0.044586826115846634, -0.006295564118772745, 0.015923867002129555, -0.002621096558868885, -0.00794079527258873, -0.024195821955800056]\n",
            "\n",
            "Observation: \u001b[36;1m\u001b[1;3m['Alan Schelten Ruan Silva Eric Michael Smith Ranjan Subramanian Xiaoqing Ellen Tan Binh Tang\\nRoss Taylor Adina Williams Jian Xiang Kuan Puxin Xu Zheng Yan Iliyan Zarov Yuchen Zhang\\nAngela Fan Melanie Kambadur Sharan Narang Aurelien Rodriguez Robert Stojnic\\nSergey Edunov Thomas Scialom\\x03\\nGenAI, Meta\\nAbstract\\nIn this work, we develop and release Llama 2, a collection of pretrained and fine-tuned\\nlarge language models (LLMs) ranging in scale from 7 billion to 70 billion parameters.\\nOur fine-tuned LLMs, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , are optimized for dialogue use cases. Our\\nmodels outperform open-source chat models on most benchmarks we tested, and based on\\nourhumanevaluationsforhelpfulnessandsafety,maybeasuitablesubstituteforclosedsource models. We provide a detailed description of our approach to fine-tuning and safety', 'asChatGPT,BARD,andClaude. TheseclosedproductLLMsareheavilyfine-tunedtoalignwithhuman\\npreferences, which greatly enhances their usability and safety. This step can require significant costs in\\ncomputeandhumanannotation,andisoftennottransparentoreasilyreproducible,limitingprogresswithin\\nthe community to advance AI alignment research.\\nIn this work, we develop and release Llama 2, a family of pretrained and fine-tuned LLMs, L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle and\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , at scales up to 70B parameters. On the series of helpfulness and safety benchmarks we tested,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc models generally perform better than existing open-source models. 
They also appear to\\nbe on par with some of the closed-source models, at least on the human evaluations we performed (see', 'models will be released as we improve model safety with community feedback.\\nLicense A custom commercial license is available at: ai.meta.com/resources/\\nmodels-and-libraries/llama-downloads/\\nWhere to send commentsInstructions on how to provide feedback or comments on the model can be\\nfound in the model README, or by opening an issue in the GitHub repository\\n(https://github.com/facebookresearch/llama/ ).\\nIntended Use\\nIntended Use Cases L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is intended for commercial and research use in English. Tuned models\\nare intended for assistant-like chat, whereas pretrained models can be adapted\\nfor a variety of natural language generation tasks.\\nOut-of-Scope Uses Use in any manner that violates applicable laws or regulations (including trade\\ncompliancelaws). UseinlanguagesotherthanEnglish. Useinanyotherway\\nthat is prohibited by the Acceptable Use Policy and Licensing Agreement for\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle.\\nHardware and Software (Section 2.2)\\nTraining Factors We usedcustomtraininglibraries, Meta’sResearchSuperCluster, andproductionclustersforpretraining. Fine-tuning,annotation,andevaluationwerealso', 'Evaluation Results\\nSee evaluations for pretraining (Section 2); fine-tuning (Section 3); and safety (Section 4).\\nEthical Considerations and Limitations (Section 5.2)\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is a new technology that carries risks with use. Testing conducted to date has been in\\nEnglish, and has notcovered, nor could it coverall scenarios. For these reasons, aswith all LLMs,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle’s potential outputs cannot be predicted in advance, and the model may in some instances\\nproduceinaccurateorobjectionableresponsestouserprompts. 
Therefore,beforedeployingany\\napplications of L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle, developers should perform safety testing and tuning tailored to their\\nspecific applications of the model. Please see the Responsible Use Guide available available at\\nhttps://ai.meta.com/llama/responsible-user-guide\\nTable 52: Model card for L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle .\\n77', 'Baptiste Rozière, Naman Goyal, Eric Hambro, Faisal Azhar, Aur’elien Rodriguez, Armand Joulin, Edouard\\nGrave, and Guillaume Lample. Llama: Open and efficient foundation language models. arXiv preprint\\narXiv:2302.13971 , 2023.\\nAshish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser,\\nand Illia Polosukhin. Attention is all you need, 2017.\\nOriol Vinyals, Igor Babuschkin, Wojciech M Czarnecki, Michaël Mathieu, Andrew Dudzik, Junyoung Chung,\\nDavid H Choi, Richard Powell, Timo Ewalds, Petko Georgiev, et al. Grandmaster level in starcraft ii using\\nmulti-agent reinforcement learning. Nature, 575(7782):350–354, 2019.\\nYizhong Wang, Yeganeh Kordi, Swaroop Mishra, Alisa Liu, Noah A Smith, Daniel Khashabi, and HannanehHajishirzi. Self-instruct: Aligninglanguagemodel withselfgeneratedinstructions. arXivpreprint']\u001b[0m\n",
            "Thought:\u001b[32;1m\u001b[1;3m```json\n",
            "{\n",
            "    \"action\": \"Final Answer\",\n",
            "    \"action_input\": \"Llama 2 is special because it features a collection of pretrained and fine-tuned large language models optimized for dialogue use cases. These models, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc, outperform open-source chat models on most benchmarks tested and may be a suitable substitute for closed-source models. They are intended for commercial and research use in English, with tuned models suitable for assistant-like chat and pretrained models adaptable for various natural language generation tasks.\"\n",
            "}\n",
            "```\u001b[0m\n",
            "\n",
            "\u001b[1m> Finished chain.\u001b[0m\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "{'input': 'what makes llama 2 so special?',\n",
              " 'chat_history': [HumanMessage(content='tell me about Llama 2?', additional_kwargs={}, example=False),\n",
              "  AIMessage(content='Llama 2 is a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. These models, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc, are optimized for dialogue use cases. They outperform open-source chat models on most benchmarks tested and may be a suitable substitute for closed-source models. The approach to fine-tuning and safety is detailed in the work. Llama 2 is intended for commercial and research use in English, with tuned models intended for assistant-like chat and pretrained models adaptable for various natural language generation tasks.', additional_kwargs={}, example=False)],\n",
              " 'output': 'Llama 2 is special because it features a collection of pretrained and fine-tuned large language models optimized for dialogue use cases. These models, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc, outperform open-source chat models on most benchmarks tested and may be a suitable substitute for closed-source models. They are intended for commercial and research use in English, with tuned models suitable for assistant-like chat and pretrained models adaptable for various natural language generation tasks.',\n",
              " 'intermediate_steps': [(AgentAction(tool='Vector Search Tool', tool_input='Llama 2 features and advantages', log='```json\\n{\\n    \"action\": \"Vector Search Tool\",\\n    \"action_input\": \"Llama 2 features and advantages\"\\n}\\n```'),\n",
              "   ['Alan Schelten Ruan Silva Eric Michael Smith Ranjan Subramanian Xiaoqing Ellen Tan Binh Tang\\nRoss Taylor Adina Williams Jian Xiang Kuan Puxin Xu Zheng Yan Iliyan Zarov Yuchen Zhang\\nAngela Fan Melanie Kambadur Sharan Narang Aurelien Rodriguez Robert Stojnic\\nSergey Edunov Thomas Scialom\\x03\\nGenAI, Meta\\nAbstract\\nIn this work, we develop and release Llama 2, a collection of pretrained and fine-tuned\\nlarge language models (LLMs) ranging in scale from 7 billion to 70 billion parameters.\\nOur fine-tuned LLMs, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , are optimized for dialogue use cases. Our\\nmodels outperform open-source chat models on most benchmarks we tested, and based on\\nourhumanevaluationsforhelpfulnessandsafety,maybeasuitablesubstituteforclosedsource models. We provide a detailed description of our approach to fine-tuning and safety',\n",
              "    'asChatGPT,BARD,andClaude. TheseclosedproductLLMsareheavilyfine-tunedtoalignwithhuman\\npreferences, which greatly enhances their usability and safety. This step can require significant costs in\\ncomputeandhumanannotation,andisoftennottransparentoreasilyreproducible,limitingprogresswithin\\nthe community to advance AI alignment research.\\nIn this work, we develop and release Llama 2, a family of pretrained and fine-tuned LLMs, L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle and\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , at scales up to 70B parameters. On the series of helpfulness and safety benchmarks we tested,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc models generally perform better than existing open-source models. They also appear to\\nbe on par with some of the closed-source models, at least on the human evaluations we performed (see',\n",
              "    'models will be released as we improve model safety with community feedback.\\nLicense A custom commercial license is available at: ai.meta.com/resources/\\nmodels-and-libraries/llama-downloads/\\nWhere to send commentsInstructions on how to provide feedback or comments on the model can be\\nfound in the model README, or by opening an issue in the GitHub repository\\n(https://github.com/facebookresearch/llama/ ).\\nIntended Use\\nIntended Use Cases L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is intended for commercial and research use in English. Tuned models\\nare intended for assistant-like chat, whereas pretrained models can be adapted\\nfor a variety of natural language generation tasks.\\nOut-of-Scope Uses Use in any manner that violates applicable laws or regulations (including trade\\ncompliancelaws). UseinlanguagesotherthanEnglish. Useinanyotherway\\nthat is prohibited by the Acceptable Use Policy and Licensing Agreement for\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle.\\nHardware and Software (Section 2.2)\\nTraining Factors We usedcustomtraininglibraries, Meta’sResearchSuperCluster, andproductionclustersforpretraining. Fine-tuning,annotation,andevaluationwerealso',\n",
              "    'Evaluation Results\\nSee evaluations for pretraining (Section 2); fine-tuning (Section 3); and safety (Section 4).\\nEthical Considerations and Limitations (Section 5.2)\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle is a new technology that carries risks with use. Testing conducted to date has been in\\nEnglish, and has notcovered, nor could it coverall scenarios. For these reasons, aswith all LLMs,\\nL/l.sc/a.sc/m.sc/a.sc /two.taboldstyle’s potential outputs cannot be predicted in advance, and the model may in some instances\\nproduceinaccurateorobjectionableresponsestouserprompts. Therefore,beforedeployingany\\napplications of L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle, developers should perform safety testing and tuning tailored to their\\nspecific applications of the model. Please see the Responsible Use Guide available available at\\nhttps://ai.meta.com/llama/responsible-user-guide\\nTable 52: Model card for L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle .\\n77',\n",
              "    'Baptiste Rozière, Naman Goyal, Eric Hambro, Faisal Azhar, Aur’elien Rodriguez, Armand Joulin, Edouard\\nGrave, and Guillaume Lample. Llama: Open and efficient foundation language models. arXiv preprint\\narXiv:2302.13971 , 2023.\\nAshish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser,\\nand Illia Polosukhin. Attention is all you need, 2017.\\nOriol Vinyals, Igor Babuschkin, Wojciech M Czarnecki, Michaël Mathieu, Andrew Dudzik, Junyoung Chung,\\nDavid H Choi, Richard Powell, Timo Ewalds, Petko Georgiev, et al. Grandmaster level in starcraft ii using\\nmulti-agent reinforcement learning. Nature, 575(7782):350–354, 2019.\\nYizhong Wang, Yeganeh Kordi, Swaroop Mishra, Alisa Liu, Noah A Smith, Daniel Khashabi, and HannanehHajishirzi. Self-instruct: Aligninglanguagemodel withselfgeneratedinstructions. arXivpreprint'])]}"
            ]
          },
          "metadata": {},
          "execution_count": 18
        }
      ],
      "source": [
        "agent(\"what makes llama 2 so special?\")"
      ]
    },
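    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "The call above returns a dictionary rather than a bare string: `input`, `chat_history`, `output`, and `intermediate_steps` (the agent records each tool call alongside its observation, as shown in the cell output). A minimal sketch of unpacking that result, reusing the `agent` object from the earlier cells:\n",
        "\n",
        "```python\n",
        "result = agent(\"what makes llama 2 so special?\")\n",
        "print(result[\"output\"])  # the final answer text\n",
        "for action, observation in result[\"intermediate_steps\"]:\n",
        "    # each step pairs an AgentAction (tool name + input) with the tool's raw output\n",
        "    print(action.tool, \"->\", action.tool_input)\n",
        "```"
      ]
    },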
    {
      "cell_type": "code",
      "execution_count": 19,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "EhH69sYdXQ21",
        "outputId": "bb91e4db-e673-41be-8182-17043988618d"
      },
      "outputs": [
        {
          "output_type": "stream",
          "name": "stdout",
          "text": [
            "\n",
            "\n",
            "\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n",
            "\u001b[32;1m\u001b[1;3m```json\n",
            "{\n",
            "    \"action\": \"Vector Search Tool\",\n",
            "    \"action_input\": \"Llama 2 red teaming\"\n",
            "}\n",
            "```\u001b[0mLlama 2 red teaming\n",
            "[-0.024061694741249084, -0.027173474431037903, -0.011430328711867332, -0.0008598336717113853, -0.0025880311150103807, 0.021222878247499466, -0.02164597250521183, -0.002226355019956827, -0.0065954700112342834, -0.04864202067255974, 0.01959874853491783, 0.01131431944668293, 0.025044361129403114, -0.01090487465262413, 0.001903917407616973, -0.0031458993908017874, 0.015900099650025368, 0.0015149450628086925, 0.022041767835617065, 0.002535144565626979, -0.014235024340450764, 0.003722533816471696, 0.004114918410778046, -0.012044495902955532, -0.015804562717676163, -0.02099085971713066, 0.014207728207111359, -0.01070015225559473, 0.036085717380046844, -0.010529550723731518, 0.04015286639332771, -0.009314864873886108, -0.013477551750838757, 0.012419819831848145, -0.02238297276198864, -0.019830767065286636, 0.006087076384574175, 0.010652383789420128, 0.01880715601146221, -0.012180977500975132, 0.02511260285973549, 0.012590421363711357, 0.000930633454117924, -0.022369323298335075, -0.015995636582374573, 0.0025539107155054808, -0.0026852742303162813, -0.0041115060448646545, -0.035103052854537964, 0.016814524307847023, 0.001357138273306191, -0.006578410044312477, -0.03780538588762283, -0.015012969262897968, 0.007001502905040979, 0.015531598590314388, -0.018970932811498642, -0.007622493896633387, 0.014617172069847584, -0.0059369467198848724, -0.012201448902487755, -0.019366730004549026, -0.007540605030953884, 0.015340524725615978, 0.015531598590314388, 0.004026205278933048, 0.019298488274216652, 0.009526411071419716, -0.004271871875971556, 0.005558210425078869, 0.006414632312953472, 0.0028149315621703863, 0.016541562974452972, -0.02377508394420147, 0.028169788420200348, -0.00015812665515113622, -0.025371916592121124, -0.008837179280817509, -0.006568173877894878, -0.014112190343439579, 0.013266005553305149, -0.02404804714024067, -0.005626451224088669, 0.018848100677132607, 0.028797604143619537, 0.013382014818489552, 0.01011328212916851, 0.004497066605836153, 
-0.04700424149632454, -0.02140030451118946, -0.006755835842341185, -0.005565034691244364, 0.006356627680361271, 0.001816910458728671, 0.018547840416431427, -0.008707522414624691, 0.018793508410453796, 0.02453937940299511, -0.024143584072589874, 0.004780265968292952, -0.024962473660707474, 0.014344209805130959, 9.696319693830446e-07, 0.004340112674981356, 0.01063191145658493, 0.010747920721769333, -0.037395939230918884, 0.029097862541675568, -0.017251266166567802, 0.009519587270915508, 0.021277470514178276, -0.02131841517984867, -0.0068991417065262794, -0.003599700517952442, -0.038105644285678864, 0.0013426371151581407, 0.008455031551420689, -0.01981711946427822, 0.012085439637303352, -0.018916340544819832, 0.006578410044312477, -0.038214828819036484, 0.010474957525730133, 0.016200358048081398, 0.020322101190686226, -0.003859015414491296, -0.01959874853491783, -0.02148219384253025, 0.014726357534527779, -0.0022672994527965784, -0.008646105416119099, -0.008980484679341316, -0.014371505938470364, -0.015517950989305973, 0.033410679548978806, -0.020349396392703056, 0.016855468973517418, 0.009116966277360916, 0.009396754205226898, -0.034720901399850845, 0.008379966020584106, 0.002327010268345475, -0.018206637352705002, -0.017660710960626602, -0.0011652110842987895, 0.010515902191400528, 0.025153547525405884, 0.01277467142790556, -0.0012129796668887138, 0.008448206819593906, -0.016186710447072983, 0.01179200503975153, -0.004807562101632357, -0.007997818291187286, 0.00019118077761959285, -0.024143584072589874, 0.022860657423734665, 0.008079706691205502, 0.017128432169556618, 0.021195583045482635, 0.01044083759188652, 0.01102770771831274, 0.014999320730566978, -0.011962606571614742, -0.0044151777401566505, -0.000913573254365474, -0.023706842213869095, -0.0022144129034131765, -0.014221375808119774, -0.002925822976976633, -0.019830767065286636, -0.011089124716818333, -0.006039307918399572, 0.020267508924007416, -0.00014277247828431427, 0.008277605287730694, 
-0.0009357515373267233, 0.0025539107155054808, -0.014385153539478779, 0.012010375037789345, -0.0053057195618748665, 0.02423912100493908, -0.026136213913559914, -0.01183294877409935, -0.011409856379032135, 0.025235436856746674, 0.008229836821556091, 0.008837179280817509, 0.01970793306827545, 0.01130067091435194, -0.004367409273982048, 0.011171014048159122, 0.036522459238767624, 0.013293301686644554, -0.0017571997595950961, 0.0008990720962174237, -0.020840730518102646, 0.015736320987343788, -0.032291531562805176, -0.008086531423032284, -0.009492291137576103, 0.000546352646779269, 0.0034836912527680397, -0.006056368350982666, 0.013764162547886372, -0.006919614039361477, -0.022778768092393875, 0.004312816541641951, 0.007144808303564787, 0.04236386716365814, 0.006987854838371277, 0.0059369467198848724, 0.02432101033627987, 0.012563125230371952, -0.004732497036457062, 0.019175656139850616, 0.0019397438736632466, -0.0008141976431943476, 0.012399347499012947, 0.037750791758298874, -0.009922207333147526, 0.0025846189819276333, 0.007253993768244982, 0.009676541201770306, 0.018302174285054207, -0.0394158661365509, 0.01913471147418022, -0.019011877477169037, 0.0025726768653839827, -0.04375598207116127, -0.021768804639577866, -0.0250307135283947, -0.01190118957310915, -0.00648969691246748, 0.0032226701732724905, 0.020021840929985046, -0.00554797425866127, -0.016200358048081398, 0.013068106956779957, -0.01056367065757513, 0.018452303484082222, 0.004732497036457062, -0.01085028238594532, 0.018725266680121422, -0.027623862028121948, 0.007083391770720482, -0.009273920208215714, -0.0013631093315780163, 0.031609125435352325, -0.004527775105088949, 0.013736866414546967, 0.006223557982593775, -0.0014603524468839169, -0.0021035217214375734, 0.00571857625618577, 0.01325235702097416, 0.028797604143619537, -0.008202540688216686, -0.013859700411558151, -0.018397711217403412, 0.010829810053110123, 0.037395939230918884, -0.0020847555715590715, 0.00648969691246748, 0.008980484679341316, 
-0.03169101104140282, -0.017155729234218597, 0.0019431558903306723, 0.0007796507561579347, -0.030380789190530777, 0.0027586331125348806, -0.019107414409518242, -0.011798828840255737, -0.0257131215184927, 0.009519587270915508, -0.0077589754946529865, 0.0060700164176523685, 0.01024976372718811, 0.005762932822108269, 0.008277605287730694, 0.01278831996023655, 0.007103864103555679, 0.017633413895964622, -0.001989218406379223, 0.011327967047691345, -0.05128975957632065, 0.01959874853491783, -0.028852196410298347, 0.02669578790664673, -0.011737411841750145, 0.025863250717520714, 0.015258635394275188, 0.011184661649167538, 0.04329194501042366, -0.014576228335499763, -0.0017426986014470458, -0.012085439637303352, 0.02527637965977192, -0.00025334383826702833, -0.0017051661852747202, -0.014357857406139374, 0.057922765612602234, 0.02281971275806427, 0.006759248208254576, -0.031554531306028366, -0.006087076384574175, 0.02426641620695591, 0.004374233074486256, 0.006145081017166376, 0.012645014561712742, -0.018684322014451027, 0.006046132184565067, -0.013170468620955944, -0.009035077877342701, 0.0014740006299689412, -0.005285247694700956, -0.014931079931557178, -0.007015150971710682, 0.0036474689841270447, -0.009096493944525719, -0.024689510464668274, -0.011675995774567127, -0.015026616863906384, -0.00030836297082714736, -0.017674358561635017, 0.010829810053110123, 0.011000411584973335, 0.02186434157192707, 0.01886174827814102, -9.052564564626664e-05, 0.0238023791462183, -0.024007102474570274, -0.02197352796792984, -0.0009843730367720127, -0.016145765781402588, -0.02271052822470665, -0.021605027839541435, -0.02044493332505226, 0.021086396649479866, 0.025344621390104294, -0.020431285724043846, 0.002253651386126876, -0.013886996544897556, -0.015067561529576778, -0.008400438353419304, -0.010945819318294525, 0.009110142476856709, 0.032919347286224365, 0.004101270344108343, -0.009929032064974308, 0.014685412868857384, 0.0053739603608846664, 0.0004828034434467554, 
-0.0017435515765100718, -0.005309131927788258, 0.001885151257738471, -0.034693606197834015, 0.014494339004158974, 0.00032222436857409775, -0.01998089626431465, 0.0045789554715156555, 0.018725266680121422, 0.003910196013748646, 0.013354718685150146, 0.004462946206331253, -0.026054324582219124, -0.013723218813538551, -0.0024259593337774277, -0.008277605287730694, 0.014153135009109974, -0.000554882746655494, -0.00970383733510971, -0.004394705407321453, -0.009096493944525719, 0.015122153796255589, 0.008352669887244701, 0.001844206708483398, 0.004220691509544849, 0.02011737786233425, 0.007035623304545879, 0.0018987993244081736, 0.005025932565331459, 0.008891772478818893, 0.005353488028049469, -0.02268323116004467, 0.017537876963615417, 0.015367820858955383, -0.004855330567806959, 0.027118880301713943, 0.0233929343521595, 0.01239252369850874, 0.0038692515809088945, 0.006684183143079281, 0.002833697944879532, -0.014740006066858768, 0.028360862284898758, 0.012563125230371952, -0.01192848663777113, 0.005848233588039875, -0.012017198838293552, -0.034202273935079575, -0.016309544444084167, -0.009813022799789906, 0.015163098461925983, -0.021714212372899055, -0.00535690039396286, -0.00556162279099226, -0.018875395879149437, -0.0009920501615852118, -0.02167326770722866, 0.010952643118798733, 0.019448619335889816, -0.023679547011852264, -0.012194625101983547, 0.018370414152741432, -0.007847688160836697, -0.013586737215518951, -0.005298895761370659, -0.01995360106229782, -0.02006278559565544, 0.015163098461925983, -0.010126929730176926, 0.026081621646881104, 0.20930808782577515, -0.016486968845129013, 0.00264432979747653, 0.04053501784801483, -0.017278563231229782, 0.018274877220392227, 0.033792827278375626, -0.008516447618603706, 0.013122699223458767, 0.016391431912779808, 0.0187525637447834, 0.0010056983446702361, -0.008400438353419304, -0.00017859888612292707, 0.018192987889051437, 0.007745326962321997, -0.028934085741639137, -0.007008326705545187, -0.01192848663777113, 
-0.0003975024737883359, 0.013068106956779957, 0.011751060374081135, 0.01231063436716795, -0.006100724451243877, 0.02118193358182907, -0.007840864360332489, -0.01981711946427822, 0.005674219690263271, 0.02456667646765709, -0.004033029545098543, -0.021632323041558266, -0.0020096905063837767, 0.01932578533887863, 0.004288932308554649, -0.014357857406139374, 0.007226697169244289, 0.0240753423422575, -0.00542514119297266, 0.017674358561635017, -0.007718030828982592, -0.0019431558903306723, 0.009662892669439316, 0.00552750239148736, -0.0093080410733819, -0.009151087142527103, 0.007233521435409784, -0.020322101190686226, 0.00038790612597949803, 0.012235569767653942, 0.011341615580022335, -0.036713533103466034, 0.02249215729534626, 0.008611984550952911, 0.028060603886842728, 0.011955782771110535, 0.003464925102889538, 0.02003549039363861, 0.017469637095928192, -0.032728273421525955, 0.0020762253552675247, -0.028115196153521538, 0.0018032622756436467, 0.02262863889336586, 0.005646923556923866, -0.007213049102574587, 0.005145353730767965, -0.00970383733510971, -0.0016761638689786196, -0.0031168970745056868, -0.025644879788160324, 0.0024600797332823277, -0.016077524051070213, 0.007062919437885284, 0.007486012298613787, -0.019885359331965446, -0.02522178739309311, -0.0019278016407042742, 0.02650471404194832, 0.010550023056566715, 0.017032895237207413, -0.005135117564350367, -0.008946364745497704, 0.0017222263850271702, -0.012877033092081547, -0.0003132677811663598, -0.016159413382411003, -0.0033437975216656923, 0.011321143247187138, -0.005725400522351265, 0.0011643581092357635, 0.007294937968254089, -0.007281289901584387, -0.016650747507810593, -0.009929032064974308, 0.007929577492177486, 0.020622359588742256, 0.002400368917733431, 0.027951419353485107, 0.007970522157847881, -0.017892729490995407, -0.03753242269158363, 0.021905286237597466, 0.027541974559426308, -0.0001729476934997365, -0.011143716983497143, 0.0042821080423891544, -0.0018203224753960967, 0.003386448137462139, 
0.0030742466915398836, -0.005841409787535667, -0.010468133725225925, -0.037832681089639664, 0.014289616607129574, -0.0007092774612829089, 0.010201995261013508, -0.0009016311378218234, -0.005162414163351059, 0.0039886729791760445, -0.010270235128700733, -0.021877991035580635, 0.0020182207226753235, -0.024853287264704704, -0.0006354921497404575, -0.009956328198313713, -0.00652381731197238, -0.021004509180784225, -0.012474412098526955, -0.00158830382861197, -0.003575816284865141, -0.018889045342803, 0.0067012435756623745, -0.018370414152741432, 0.021577730774879456, -0.002485669916495681, 0.010256587527692318, 0.014740006066858768, 0.011205133982002735, -0.0008555686217732728, -0.004288932308554649, -0.004920159466564655, -0.008188892155885696, -0.005496793892234564, 0.002431077416986227, 0.004265048075467348, 0.015818210318684578, -0.019475914537906647, -0.002390132984146476, -0.0034103323705494404, -0.029671085998415947, -0.008168419823050499, -0.025235436856746674, 0.0028302858117967844, 0.003824895014986396, 0.00046830225619487464, 0.014589875936508179, -0.023570360615849495, -0.015108506195247173, -0.008093355223536491, 0.016555210575461388, -0.003064010525122285, -0.028879493474960327, 0.021741509437561035, 0.007820392027497292, 0.003471749136224389, -0.02273782342672348, -0.01306128315627575, -0.17709843814373016, -0.006779720075428486, 0.014412450604140759, -0.014685412868857384, -0.00032840869971551, -0.010625087656080723, -0.0001382941845804453, 0.04039853438735008, -0.03799645975232124, 0.011512217111885548, 0.030053233727812767, 0.015572543255984783, -0.030353493988513947, -0.003157841507345438, -0.012058143503963947, 0.020185619592666626, 0.017660710960626602, 0.012679134495556355, 0.01258359756320715, 0.0027279246132820845, 0.029616493731737137, -0.04383786767721176, -0.0045175389386713505, 0.016036581248044968, 0.016596155241131783, -0.015367820858955383, -0.015545247122645378, 0.016978302970528603, -0.019039174541831017, -0.022587694227695465, 
-0.029316233471035957, 0.013334246352314949, 0.02563123218715191, 0.017633413895964622, -0.0034120383206754923, 0.010092809796333313, -0.012044495902955532, 0.0035894643515348434, -0.016282247379422188, 0.005326191894710064, 0.004097857978194952, 0.00011664906196529046, -0.009833494201302528, 0.007335882633924484, -0.027432788163423538, 0.05377372354269028, 0.02374778687953949, -0.00043226260459050536, -0.002344070468097925, -0.03796916455030441, 0.0022826537024229765, -0.020595064386725426, -0.004196807276457548, -0.007526956498622894, 0.031308863312006, 0.026340937241911888, -0.005663983523845673, -0.006663710810244083, 0.002175174420699477, 0.014780950732529163, -0.00315954745747149, -0.026340937241911888, 0.0011225605849176645, -0.003301147138699889, -0.023570360615849495, -0.015763618052005768, 0.01225604210048914, 0.022287433966994286, -0.030571864917874336, 0.014685412868857384, -0.0071243359707295895, -0.013975709676742554, -0.01171011570841074, -0.031008604913949966, 0.027651159092783928, 0.020458582788705826, -0.014671765267848969, 0.004442473873496056, 0.033028531819581985, -0.027419140562415123, 0.0013724924065172672, 0.010693328455090523, 0.023925213143229485, -0.015627136453986168, -0.01085028238594532, 0.0008457590010948479, -0.017415044829249382, 0.004265048075467348, -0.00616214144974947, -0.029234344139695168, 0.034311458468437195, -0.028251677751541138, 0.014562579803168774, -0.04700424149632454, -0.006356627680361271, 0.011785180307924747, 0.00047214081860147417, -0.016841821372509003, -0.0010364066110923886, -0.02729630656540394, 0.002977003576233983, 0.0004147759173065424, -0.022860657423734665, -0.01263818982988596, 0.01246076449751854, 0.007574725430458784, 0.007642966229468584, 0.018111100420355797, 0.02590419538319111, -0.003739594016224146, -0.001365668373182416, 0.017087487503886223, 0.010877578519284725, 0.0008146241889335215, -0.0026392117142677307, 0.021523138508200645, 0.02088167518377304, -0.00628497451543808, -0.02358401007950306, 
-0.007329058367758989, 0.08369047939777374, -0.010092809796333313, 0.015094857662916183, 0.0033642698545008898, -0.001476559671573341, -0.015395116992294788, -0.09597381949424744, -0.02071789652109146, 0.016350487247109413, 0.006472636945545673, 0.004186571110039949, 0.016432376578450203, -0.007021975237876177, 0.024498436599969864, 0.004094446077942848, 0.037013791501522064, -0.01102770771831274, -0.008912243880331516, -0.007574725430458784, 0.014248671941459179, 0.0254401583224535, 0.028797604143619537, -0.011962606571614742, -0.007383651100099087, 0.006691007409244776, 0.029807567596435547, -0.01232428289949894, -0.016759932041168213, 0.009069197811186314, -0.015599839389324188, -0.013081755489110947, -0.0016369253862649202, -0.04018016532063484, 0.028852196410298347, 0.014808246865868568, 0.003055480308830738, 0.016541562974452972, -0.008427734486758709, 0.02161867544054985, -0.010434013791382313, 0.001305104698985815, -0.0022144129034131765, -0.027391843497753143, -0.027501029893755913, 0.028060603886842728, -0.019421322271227837, -0.0021444661542773247, 0.0017844961257651448, -0.01270643062889576, -0.01306128315627575, -0.016527913510799408, 0.004793914034962654, 0.0019090354908257723, 0.015504302456974983, -0.0022928898688405752, -0.04285520315170288, -0.034584421664476395, -0.019557803869247437, -0.011662347242236137, -0.004073973745107651, 0.02306537888944149, -0.020813433453440666, 0.014439746737480164, 0.017142081633210182, -0.030954012647271156, -0.001672751852311194, -0.004486830439418554, 0.001160946092568338, -5.469924144563265e-05, 0.025835955515503883, 0.02243756502866745, -0.0020011605229228735, -0.017469637095928192, -0.005831173621118069, 0.025590287521481514, 0.021523138508200645, -0.005756108555942774, 0.0243483055382967, -0.02276512049138546, 0.006946910172700882, -0.009232975542545319, 0.01190118957310915, -0.012624542228877544, -0.005094173364341259, 0.035075753927230835, 0.0034973393194377422, -0.009888087399303913, -0.02107274904847145, 
-0.010208819061517715, -0.012331106700003147, 0.008236660622060299, 0.02290160208940506, 0.0032363184727728367, -0.005626451224088669, 0.008202540688216686, -0.012747375294566154, -0.018302174285054207, 0.022069064900279045, 0.012726902961730957, -0.04274601861834526, -0.007636141963303089, 0.015285932458937168, -0.029452715069055557, 0.013975709676742554, 0.004401529673486948, 0.011157365515828133, -0.007042447105050087, -0.00472226133570075, -0.08500070124864578, 0.021168285980820656, -0.0058004651218652725, -0.004609663970768452, 0.016787229105830193, -0.01009963359683752, 0.011935310438275337, -0.02377508394420147, 0.014180431142449379, 0.0053739603608846664, -0.019639693200588226, 0.035048458725214005, -0.0187525637447834, -0.01224239356815815, -0.008618809282779694, -0.016241302713751793, 0.024662213400006294, -0.007308586034923792, 0.030489975586533546, -0.0035724041517823935, 0.033001236617565155, -0.005384196527302265, 0.005227242596447468, -7.282569276867434e-05, 0.033820126205682755, -0.0021393480710685253, -0.013074930757284164, 0.0017896140925586224, -0.0056366873905062675, -0.008223012089729309, -0.014043950475752354, -0.029725678265094757, -0.012542652897536755, 0.021550433710217476, -0.013136347755789757, -0.05257268622517586, -0.0010739390272647142, 0.012228745967149734, -0.00013317613047547638, 0.02363860234618187, -0.0019687460735440254, -0.028551938012242317, -0.0010287296026945114, -0.02467586100101471, -0.007035623304545879, -0.013027162291109562, -0.018042858690023422, -0.010590966790914536, 0.013033987022936344, 0.014166783541440964, 0.027951419353485107, 0.009348984807729721, -0.035239532589912415, -0.031390752643346786, -0.0022400033194571733, -0.033956605941057205, 0.004596015904098749, -0.009560531936585903, 0.018425006419420242, -0.014385153539478779, 0.013716394081711769, 0.007008326705545187, 0.015695376321673393, 0.009731133468449116, -0.017728950828313828, -0.011471273377537727, -0.006609118543565273, -9.708315337775275e-05, 
0.0028285798616707325, -0.022396620362997055, -0.005858469754457474, 0.026136213913559914, 0.0017640237929299474, 0.005329603794962168, -0.02257404662668705, 0.010338475927710533, -0.003072540508583188, 0.01959874853491783, -0.029370825737714767, 0.031172383576631546, -0.0034290985204279423, -0.05148083716630936, -0.031035901978611946, 0.005367136560380459, 0.01921660080552101, 0.04067149758338928, -0.0038692515809088945, 0.009833494201302528, 0.004998636431992054, 0.0035724041517823935, -0.037450533360242844, 0.006216734182089567, -0.012999866157770157, -0.0030094177927821875, -0.010788865387439728, 0.0014287910889834166, 0.007649790029972792, -0.0032380244228988886, 0.030053233727812767, -0.0016173061449080706, 0.009649244137108326, -0.0014228200307115912, -0.014890135265886784, -0.026136213913559914, -0.008994133211672306, -0.016869118437170982, -0.006380511913448572, -0.00278081139549613, 0.007370003033429384, 0.020076433196663857, 0.02238297276198864, -0.02524908445775509, -0.00669441930949688, 0.026491066440939903, -0.016896413639187813, -0.006090488750487566, -0.016377784311771393, -0.017237618565559387, -0.028906788676977158, 0.037750791758298874, 0.028715714812278748, 0.021959878504276276, 0.022423915565013885, 0.008209364488720894, 0.0029155868105590343, 0.000700320873875171, 0.02123652771115303, -0.020076433196663857, -0.00522041879594326, 0.0056366873905062675, 0.008762114681303501, 0.0006440222496166825, -0.014248671941459179, -0.006615942344069481, -0.012965746223926544, 0.015122153796255589, -0.004377645440399647, 0.0034615129698067904, -0.006005187518894672, 0.08281699568033218, 0.012808792293071747, -0.032564494758844376, 0.017333155497908592, -0.022041767835617065, -0.010215642862021923, 0.004920159466564655, 0.02142760157585144, -0.018329469487071037, -0.021987175568938255, -0.00528865959495306, -0.003430804703384638, 0.015681728720664978, 0.008455031551420689, -0.020786138251423836, -0.011014060117304325, -0.002698922296985984, 
0.016473321244120598, -0.03215504810214043, -0.0002100536075886339, 0.029971344396471977, -0.0031117789912968874, 0.015094857662916183, 0.009833494201302528, -0.003746418049558997, 0.008557392284274101, 0.014630820602178574, 0.0053398399613797665, -0.003644057083874941, 0.002560734748840332, 0.007492836099117994, 0.003859015414491296, -0.03799645975232124, -0.02251945249736309, 0.00585505785420537, 0.0008201687014661729, -0.01097311545163393, -0.018916340544819832, 0.005336428061127663, 0.0027876354288309813, 0.008577864617109299, 0.013450255617499352, -0.033738236874341965, -0.0426914244890213, 0.011443977244198322, 0.005213594529777765, -0.00134775519836694, -0.010208819061517715, 0.004077386111021042]\n",
            "\n",
            "Observation: \u001b[36;1m\u001b[1;3m['cyber); findingsonthesetopicsweremarginal andweremitigated. Nonetheless, wewill continueourred\\nteaming efforts in this front.\\nTodate,allofourredteamingeffortshavetargetedmodeloutputsinEnglish,buthavecruciallyincluded\\nnon-Englishpromptsanddialoguecontexts,asthatisawell-knownattackvector. Inallexercises,participants\\nwere given risk category definitions and were shown just a handful of examples of risky interactions with an\\nLLM.Afterthat,eachparticipantwaspartofasubteamfocusedonaparticularcategoryofriskorattack\\nvector. Aftercreatingeachdialogue,theredteamparticipantwouldannotatevariousattributes,including\\nrisk areas and degree of risk, as captured by a 5-point Likert scale.\\nSome examples of useful insights provided by members of red teams that we were able to improve upon\\nthroughout development:\\n•[Early models] were more likely to have generated unsafe responses without noting that they contain problematiccontent. However, [slightly later models] have tended todisplay knowledge\\nthat the content is problematic, even if they do go on to provide it. “They respond with ‘[UNSAFE', 'vague answers due to context distillation). We thus leverage the safety reward model to decide whether to\\nuse safety context distillation – we keep the context-distilled output only on the examples where it gets a\\nbetterrewardmodelscorethantheoriginalanswer. Wenoticethatthisisparticularlyhelpfulonprompts\\nthat the model is very bad at, but limits the negative impact of context distillation (see Figure 16b).\\n4.3 Red Teaming\\nGivenhowbroadthecapabilitiesofLLMsareandhowvariedtheirtrainingdatais,itisinsufficienttoidentify\\nrisks solely via ex post facto usage and analysis. Rather, as has been done for other LLMs, we performed\\nvarious kinds of proactive risk identification, colloquially called “red teaming,“ based on the term commonly\\nused within computer security. 
This kind of granular analysis is very important because safety is a long-tail\\nissue,inwhichevenveryinfrequentedgecasescancausenoticeableproblems. Evenifquantitativescores\\nreport good results, these types of qualitative insights allow us to recognize and target specific patterns in a\\nmore comprehensive way.\\nWe conducted a series of red teaming with various groups of internal employees, contract workers, and', 'more comprehensive way.\\nWe conducted a series of red teaming with various groups of internal employees, contract workers, and\\nexternalvendors. Theseteamsincludedover350people,includingdomainexpertsincybersecurity,election fraud, social media misinformation, legal, policy, civil rights, ethics, software engineering, machine\\nlearning, responsible AI, and creative writing. They also included individuals representative of a variety of\\nsocioeconomic, gender, ethnicity, and racial demographics.\\n28\\nTheredteamersprobedourmodelsacrossawiderangeofriskcategories(suchascriminalplanning,human\\ntrafficking, regulated or controlled substances, sexually explicit content, unqualified health or financial\\nadvice, privacy violations, and more), as well as different attack vectors (such as hypothetical questions,\\nmalformed/misspelledinputs,orextendeddialogues). Additionally,weconductedspecificteststodetermine\\nthe capabilities of our models to facilitate the production of weapons (e.g. nuclear, biological, chemical, and\\ncyber); findingsonthesetopicsweremarginal andweremitigated. Nonetheless, wewill continueourred\\nteaming efforts in this front.', 'From Red Teaming Insights to Safer Models. Crucially, after each exercise, we performed a thorough\\nanalysis of the collected data, including dialogue length, risk area distribution, histogram of topic of misinformation (where appropriate), and rated degree of risk. 
In each case, we took the overall lessons as a guide\\nto helpfurther modelsafetytraining, and specificallytook data fromthese exercisesformodel fine-tuning,\\nmodel feedback training, and as a signal for other safety model training.\\nMultiple additionalrounds ofred teaming wereperformed over severalmonths tomeasure the robustness\\nof each new model as it was released internally. We defined the robustness of a model, \\r, with respect to\\na red teaming exercise executed by a set of experts as the average number of created prompts that would\\ntriggeraviolatingresponsefromthemodelperpersonperhour. Asanexample,onour7Bmodel,wehadan\\nevolution of \\r: 1:8!0:45over several red teaming iterations and model refinements. Robustness will likely\\ncontinue to improve with additional red teaming efforts. Another magnitude that we tracked as new models\\nwere produced was the percentage of prompts triggering violating responses discovered in the previous red', 'Alan Schelten Ruan Silva Eric Michael Smith Ranjan Subramanian Xiaoqing Ellen Tan Binh Tang\\nRoss Taylor Adina Williams Jian Xiang Kuan Puxin Xu Zheng Yan Iliyan Zarov Yuchen Zhang\\nAngela Fan Melanie Kambadur Sharan Narang Aurelien Rodriguez Robert Stojnic\\nSergey Edunov Thomas Scialom\\x03\\nGenAI, Meta\\nAbstract\\nIn this work, we develop and release Llama 2, a collection of pretrained and fine-tuned\\nlarge language models (LLMs) ranging in scale from 7 billion to 70 billion parameters.\\nOur fine-tuned LLMs, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , are optimized for dialogue use cases. Our\\nmodels outperform open-source chat models on most benchmarks we tested, and based on\\nourhumanevaluationsforhelpfulnessandsafety,maybeasuitablesubstituteforclosedsource models. We provide a detailed description of our approach to fine-tuning and safety']\u001b[0m\n",
            "Thought:\u001b[32;1m\u001b[1;3m```json\n",
            "{\n",
            "    \"action\": \"Final Answer\",\n",
            "    \"action_input\": \"Llama 2 red teaming refers to the proactive risk identification process, similar to what is commonly known as 'red teaming' in computer security. This process involves a granular analysis to identify and target specific patterns that could pose risks or issues. It includes probing the models across a wide range of risk categories and attack vectors, and conducting specific tests to determine their capabilities. The red teaming efforts also involve working with various groups of internal employees, contract workers, and external vendors, including domain experts in various fields. These efforts are aimed at improving the safety and robustness of the models.\"\n",
            "}\n",
            "```\u001b[0m\n",
            "\n",
            "\u001b[1m> Finished chain.\u001b[0m\n"
          ]
        },
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "{'input': 'tell me about llama 2 red teaming?',\n",
              " 'chat_history': [HumanMessage(content='tell me about Llama 2?', additional_kwargs={}, example=False),\n",
              "  AIMessage(content='Llama 2 is a collection of pretrained and fine-tuned large language models (LLMs) ranging in scale from 7 billion to 70 billion parameters. These models, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc, are optimized for dialogue use cases. They outperform open-source chat models on most benchmarks tested and may be a suitable substitute for closed-source models. The approach to fine-tuning and safety is detailed in the work. Llama 2 is intended for commercial and research use in English, with tuned models intended for assistant-like chat and pretrained models adaptable for various natural language generation tasks.', additional_kwargs={}, example=False),\n",
              "  HumanMessage(content='what makes llama 2 so special?', additional_kwargs={}, example=False),\n",
              "  AIMessage(content='Llama 2 is special because it features a collection of pretrained and fine-tuned large language models optimized for dialogue use cases. These models, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc, outperform open-source chat models on most benchmarks tested and may be a suitable substitute for closed-source models. They are intended for commercial and research use in English, with tuned models suitable for assistant-like chat and pretrained models adaptable for various natural language generation tasks.', additional_kwargs={}, example=False)],\n",
              " 'output': \"Llama 2 red teaming refers to the proactive risk identification process, similar to what is commonly known as 'red teaming' in computer security. This process involves a granular analysis to identify and target specific patterns that could pose risks or issues. It includes probing the models across a wide range of risk categories and attack vectors, and conducting specific tests to determine their capabilities. The red teaming efforts also involve working with various groups of internal employees, contract workers, and external vendors, including domain experts in various fields. These efforts are aimed at improving the safety and robustness of the models.\",\n",
              " 'intermediate_steps': [(AgentAction(tool='Vector Search Tool', tool_input='Llama 2 red teaming', log='```json\\n{\\n    \"action\": \"Vector Search Tool\",\\n    \"action_input\": \"Llama 2 red teaming\"\\n}\\n```'),\n",
              "   ['cyber); findingsonthesetopicsweremarginal andweremitigated. Nonetheless, wewill continueourred\\nteaming efforts in this front.\\nTodate,allofourredteamingeffortshavetargetedmodeloutputsinEnglish,buthavecruciallyincluded\\nnon-Englishpromptsanddialoguecontexts,asthatisawell-knownattackvector. Inallexercises,participants\\nwere given risk category definitions and were shown just a handful of examples of risky interactions with an\\nLLM.Afterthat,eachparticipantwaspartofasubteamfocusedonaparticularcategoryofriskorattack\\nvector. Aftercreatingeachdialogue,theredteamparticipantwouldannotatevariousattributes,including\\nrisk areas and degree of risk, as captured by a 5-point Likert scale.\\nSome examples of useful insights provided by members of red teams that we were able to improve upon\\nthroughout development:\\n•[Early models] were more likely to have generated unsafe responses without noting that they contain problematiccontent. However, [slightly later models] have tended todisplay knowledge\\nthat the content is problematic, even if they do go on to provide it. “They respond with ‘[UNSAFE',\n",
              "    'vague answers due to context distillation). We thus leverage the safety reward model to decide whether to\\nuse safety context distillation – we keep the context-distilled output only on the examples where it gets a\\nbetterrewardmodelscorethantheoriginalanswer. Wenoticethatthisisparticularlyhelpfulonprompts\\nthat the model is very bad at, but limits the negative impact of context distillation (see Figure 16b).\\n4.3 Red Teaming\\nGivenhowbroadthecapabilitiesofLLMsareandhowvariedtheirtrainingdatais,itisinsufficienttoidentify\\nrisks solely via ex post facto usage and analysis. Rather, as has been done for other LLMs, we performed\\nvarious kinds of proactive risk identification, colloquially called “red teaming,“ based on the term commonly\\nused within computer security. This kind of granular analysis is very important because safety is a long-tail\\nissue,inwhichevenveryinfrequentedgecasescancausenoticeableproblems. Evenifquantitativescores\\nreport good results, these types of qualitative insights allow us to recognize and target specific patterns in a\\nmore comprehensive way.\\nWe conducted a series of red teaming with various groups of internal employees, contract workers, and',\n",
              "    'more comprehensive way.\\nWe conducted a series of red teaming with various groups of internal employees, contract workers, and\\nexternalvendors. Theseteamsincludedover350people,includingdomainexpertsincybersecurity,election fraud, social media misinformation, legal, policy, civil rights, ethics, software engineering, machine\\nlearning, responsible AI, and creative writing. They also included individuals representative of a variety of\\nsocioeconomic, gender, ethnicity, and racial demographics.\\n28\\nTheredteamersprobedourmodelsacrossawiderangeofriskcategories(suchascriminalplanning,human\\ntrafficking, regulated or controlled substances, sexually explicit content, unqualified health or financial\\nadvice, privacy violations, and more), as well as different attack vectors (such as hypothetical questions,\\nmalformed/misspelledinputs,orextendeddialogues). Additionally,weconductedspecificteststodetermine\\nthe capabilities of our models to facilitate the production of weapons (e.g. nuclear, biological, chemical, and\\ncyber); findingsonthesetopicsweremarginal andweremitigated. Nonetheless, wewill continueourred\\nteaming efforts in this front.',\n",
              "    'From Red Teaming Insights to Safer Models. Crucially, after each exercise, we performed a thorough\\nanalysis of the collected data, including dialogue length, risk area distribution, histogram of topic of misinformation (where appropriate), and rated degree of risk. In each case, we took the overall lessons as a guide\\nto helpfurther modelsafetytraining, and specificallytook data fromthese exercisesformodel fine-tuning,\\nmodel feedback training, and as a signal for other safety model training.\\nMultiple additionalrounds ofred teaming wereperformed over severalmonths tomeasure the robustness\\nof each new model as it was released internally. We defined the robustness of a model, \\r, with respect to\\na red teaming exercise executed by a set of experts as the average number of created prompts that would\\ntriggeraviolatingresponsefromthemodelperpersonperhour. Asanexample,onour7Bmodel,wehadan\\nevolution of \\r: 1:8!0:45over several red teaming iterations and model refinements. Robustness will likely\\ncontinue to improve with additional red teaming efforts. Another magnitude that we tracked as new models\\nwere produced was the percentage of prompts triggering violating responses discovered in the previous red',\n",
              "    'Alan Schelten Ruan Silva Eric Michael Smith Ranjan Subramanian Xiaoqing Ellen Tan Binh Tang\\nRoss Taylor Adina Williams Jian Xiang Kuan Puxin Xu Zheng Yan Iliyan Zarov Yuchen Zhang\\nAngela Fan Melanie Kambadur Sharan Narang Aurelien Rodriguez Robert Stojnic\\nSergey Edunov Thomas Scialom\\x03\\nGenAI, Meta\\nAbstract\\nIn this work, we develop and release Llama 2, a collection of pretrained and fine-tuned\\nlarge language models (LLMs) ranging in scale from 7 billion to 70 billion parameters.\\nOur fine-tuned LLMs, called L/l.sc/a.sc/m.sc/a.sc /two.taboldstyle-C/h.sc/a.sc/t.sc , are optimized for dialogue use cases. Our\\nmodels outperform open-source chat models on most benchmarks we tested, and based on\\nourhumanevaluationsforhelpfulnessandsafety,maybeasuitablesubstituteforclosedsource models. We provide a detailed description of our approach to fine-tuning and safety'])]}"
            ]
          },
          "metadata": {},
          "execution_count": 19
        }
      ],
      "source": [
        "agent(\"tell me about llama 2 red teaming?\")"
      ]
    },
    {
      "cell_type": "markdown",
      "source": [
        "---"
      ],
      "metadata": {
        "id": "qYzR178ofUFJ"
      }
    }
  ],
  "metadata": {
    "kernelspec": {
      "display_name": "redacre",
      "language": "python",
      "name": "python3"
    },
    "language_info": {
      "codemirror_mode": {
        "name": "ipython",
        "version": 3
      },
      "file_extension": ".py",
      "mimetype": "text/x-python",
      "name": "python",
      "nbconvert_exporter": "python",
      "pygments_lexer": "ipython3",
      "version": "3.9.12"
    },
    "orig_nbformat": 4,
    "colab": {
      "provenance": []
    },
    "widgets": {
      "application/vnd.jupyter.widget-state+json": {
        "39364e874e5c4e7baa01c08ac31165fb": {
          "model_module": "@jupyter-widgets/controls",
          "model_name": "HBoxModel",
          "model_module_version": "1.5.0",
          "state": {
            "_dom_classes": [],
            "_model_module": "@jupyter-widgets/controls",
            "_model_module_version": "1.5.0",
            "_model_name": "HBoxModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/controls",
            "_view_module_version": "1.5.0",
            "_view_name": "HBoxView",
            "box_style": "",
            "children": [
              "IPY_MODEL_cf43f35611b444b498153f8d659ce153",
              "IPY_MODEL_f2a10ce29d894e74a22842953fb8bc59",
              "IPY_MODEL_58fac49a766a4233b513bc05a30da756"
            ],
            "layout": "IPY_MODEL_7b5bcd804aa14aaca9d835c1a6262111"
          }
        },
        "cf43f35611b444b498153f8d659ce153": {
          "model_module": "@jupyter-widgets/controls",
          "model_name": "HTMLModel",
          "model_module_version": "1.5.0",
          "state": {
            "_dom_classes": [],
            "_model_module": "@jupyter-widgets/controls",
            "_model_module_version": "1.5.0",
            "_model_name": "HTMLModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/controls",
            "_view_module_version": "1.5.0",
            "_view_name": "HTMLView",
            "description": "",
            "description_tooltip": null,
            "layout": "IPY_MODEL_fe26f0a8030b40528b5036bb8d994db5",
            "placeholder": "​",
            "style": "IPY_MODEL_221b7605257a4235b77fdd828e7fd6e6",
            "value": "Creating json from Arrow format: 100%"
          }
        },
        "f2a10ce29d894e74a22842953fb8bc59": {
          "model_module": "@jupyter-widgets/controls",
          "model_name": "FloatProgressModel",
          "model_module_version": "1.5.0",
          "state": {
            "_dom_classes": [],
            "_model_module": "@jupyter-widgets/controls",
            "_model_module_version": "1.5.0",
            "_model_name": "FloatProgressModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/controls",
            "_view_module_version": "1.5.0",
            "_view_name": "ProgressView",
            "bar_style": "success",
            "description": "",
            "description_tooltip": null,
            "layout": "IPY_MODEL_de5ce44aeb78464a9be9c6b7392b6969",
            "max": 1,
            "min": 0,
            "orientation": "horizontal",
            "style": "IPY_MODEL_280c7b6c0e4d42249a9adb5a0ca1d553",
            "value": 1
          }
        },
        "58fac49a766a4233b513bc05a30da756": {
          "model_module": "@jupyter-widgets/controls",
          "model_name": "HTMLModel",
          "model_module_version": "1.5.0",
          "state": {
            "_dom_classes": [],
            "_model_module": "@jupyter-widgets/controls",
            "_model_module_version": "1.5.0",
            "_model_name": "HTMLModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/controls",
            "_view_module_version": "1.5.0",
            "_view_name": "HTMLView",
            "description": "",
            "description_tooltip": null,
            "layout": "IPY_MODEL_8996a369a00a447093e6866183ef8648",
            "placeholder": "​",
            "style": "IPY_MODEL_eece5f66123d4ded8181c0373781da5b",
            "value": " 1/1 [00:00<00:00, 16.27ba/s]"
          }
        },
        "7b5bcd804aa14aaca9d835c1a6262111": {
          "model_module": "@jupyter-widgets/base",
          "model_name": "LayoutModel",
          "model_module_version": "1.2.0",
          "state": {
            "_model_module": "@jupyter-widgets/base",
            "_model_module_version": "1.2.0",
            "_model_name": "LayoutModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/base",
            "_view_module_version": "1.2.0",
            "_view_name": "LayoutView",
            "align_content": null,
            "align_items": null,
            "align_self": null,
            "border": null,
            "bottom": null,
            "display": null,
            "flex": null,
            "flex_flow": null,
            "grid_area": null,
            "grid_auto_columns": null,
            "grid_auto_flow": null,
            "grid_auto_rows": null,
            "grid_column": null,
            "grid_gap": null,
            "grid_row": null,
            "grid_template_areas": null,
            "grid_template_columns": null,
            "grid_template_rows": null,
            "height": null,
            "justify_content": null,
            "justify_items": null,
            "left": null,
            "margin": null,
            "max_height": null,
            "max_width": null,
            "min_height": null,
            "min_width": null,
            "object_fit": null,
            "object_position": null,
            "order": null,
            "overflow": null,
            "overflow_x": null,
            "overflow_y": null,
            "padding": null,
            "right": null,
            "top": null,
            "visibility": null,
            "width": null
          }
        },
        "fe26f0a8030b40528b5036bb8d994db5": {
          "model_module": "@jupyter-widgets/base",
          "model_name": "LayoutModel",
          "model_module_version": "1.2.0",
          "state": {
            "_model_module": "@jupyter-widgets/base",
            "_model_module_version": "1.2.0",
            "_model_name": "LayoutModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/base",
            "_view_module_version": "1.2.0",
            "_view_name": "LayoutView",
            "align_content": null,
            "align_items": null,
            "align_self": null,
            "border": null,
            "bottom": null,
            "display": null,
            "flex": null,
            "flex_flow": null,
            "grid_area": null,
            "grid_auto_columns": null,
            "grid_auto_flow": null,
            "grid_auto_rows": null,
            "grid_column": null,
            "grid_gap": null,
            "grid_row": null,
            "grid_template_areas": null,
            "grid_template_columns": null,
            "grid_template_rows": null,
            "height": null,
            "justify_content": null,
            "justify_items": null,
            "left": null,
            "margin": null,
            "max_height": null,
            "max_width": null,
            "min_height": null,
            "min_width": null,
            "object_fit": null,
            "object_position": null,
            "order": null,
            "overflow": null,
            "overflow_x": null,
            "overflow_y": null,
            "padding": null,
            "right": null,
            "top": null,
            "visibility": null,
            "width": null
          }
        },
        "221b7605257a4235b77fdd828e7fd6e6": {
          "model_module": "@jupyter-widgets/controls",
          "model_name": "DescriptionStyleModel",
          "model_module_version": "1.5.0",
          "state": {
            "_model_module": "@jupyter-widgets/controls",
            "_model_module_version": "1.5.0",
            "_model_name": "DescriptionStyleModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/base",
            "_view_module_version": "1.2.0",
            "_view_name": "StyleView",
            "description_width": ""
          }
        },
        "de5ce44aeb78464a9be9c6b7392b6969": {
          "model_module": "@jupyter-widgets/base",
          "model_name": "LayoutModel",
          "model_module_version": "1.2.0",
          "state": {
            "_model_module": "@jupyter-widgets/base",
            "_model_module_version": "1.2.0",
            "_model_name": "LayoutModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/base",
            "_view_module_version": "1.2.0",
            "_view_name": "LayoutView",
            "align_content": null,
            "align_items": null,
            "align_self": null,
            "border": null,
            "bottom": null,
            "display": null,
            "flex": null,
            "flex_flow": null,
            "grid_area": null,
            "grid_auto_columns": null,
            "grid_auto_flow": null,
            "grid_auto_rows": null,
            "grid_column": null,
            "grid_gap": null,
            "grid_row": null,
            "grid_template_areas": null,
            "grid_template_columns": null,
            "grid_template_rows": null,
            "height": null,
            "justify_content": null,
            "justify_items": null,
            "left": null,
            "margin": null,
            "max_height": null,
            "max_width": null,
            "min_height": null,
            "min_width": null,
            "object_fit": null,
            "object_position": null,
            "order": null,
            "overflow": null,
            "overflow_x": null,
            "overflow_y": null,
            "padding": null,
            "right": null,
            "top": null,
            "visibility": null,
            "width": null
          }
        },
        "280c7b6c0e4d42249a9adb5a0ca1d553": {
          "model_module": "@jupyter-widgets/controls",
          "model_name": "ProgressStyleModel",
          "model_module_version": "1.5.0",
          "state": {
            "_model_module": "@jupyter-widgets/controls",
            "_model_module_version": "1.5.0",
            "_model_name": "ProgressStyleModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/base",
            "_view_module_version": "1.2.0",
            "_view_name": "StyleView",
            "bar_color": null,
            "description_width": ""
          }
        },
        "8996a369a00a447093e6866183ef8648": {
          "model_module": "@jupyter-widgets/base",
          "model_name": "LayoutModel",
          "model_module_version": "1.2.0",
          "state": {
            "_model_module": "@jupyter-widgets/base",
            "_model_module_version": "1.2.0",
            "_model_name": "LayoutModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/base",
            "_view_module_version": "1.2.0",
            "_view_name": "LayoutView",
            "align_content": null,
            "align_items": null,
            "align_self": null,
            "border": null,
            "bottom": null,
            "display": null,
            "flex": null,
            "flex_flow": null,
            "grid_area": null,
            "grid_auto_columns": null,
            "grid_auto_flow": null,
            "grid_auto_rows": null,
            "grid_column": null,
            "grid_gap": null,
            "grid_row": null,
            "grid_template_areas": null,
            "grid_template_columns": null,
            "grid_template_rows": null,
            "height": null,
            "justify_content": null,
            "justify_items": null,
            "left": null,
            "margin": null,
            "max_height": null,
            "max_width": null,
            "min_height": null,
            "min_width": null,
            "object_fit": null,
            "object_position": null,
            "order": null,
            "overflow": null,
            "overflow_x": null,
            "overflow_y": null,
            "padding": null,
            "right": null,
            "top": null,
            "visibility": null,
            "width": null
          }
        },
        "eece5f66123d4ded8181c0373781da5b": {
          "model_module": "@jupyter-widgets/controls",
          "model_name": "DescriptionStyleModel",
          "model_module_version": "1.5.0",
          "state": {
            "_model_module": "@jupyter-widgets/controls",
            "_model_module_version": "1.5.0",
            "_model_name": "DescriptionStyleModel",
            "_view_count": null,
            "_view_module": "@jupyter-widgets/base",
            "_view_module_version": "1.2.0",
            "_view_name": "StyleView",
            "description_width": ""
          }
        }
      }
    }
  },
  "nbformat": 4,
  "nbformat_minor": 0
}
