{
  "cells": [
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "IR5wivW8THt7"
      },
      "source": [
        "# Tutorial: Building Fallbacks to Websearch with Conditional Routing\n",
        "\n",
        "- **Level**: Intermediate\n",
        "- **Time to complete**: 10 minutes\n",
        "- **Components Used**: [`ConditionalRouter`](https://docs.haystack.deepset.ai/v2.0/docs/conditionalrouter), [`SerperDevWebSearch`](https://docs.haystack.deepset.ai/v2.0/docs/serperdevwebsearch), [`PromptBuilder`](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder), [`OpenAIGenerator`](https://docs.haystack.deepset.ai/v2.0/docs/openaigenerator)\n",
        "- **Prerequisites**: You must have an [OpenAI API Key](https://platform.openai.com/api-keys) and a [Serper API Key](https://serper.dev/api-key) for this tutorial\n",
        "- **Goal**: After completing this tutorial, you'll have learned how to create a pipeline with conditional routing that can fallback to websearch if the answer is not present in your dataset.\n",
16
        "\n",
17
        "> This tutorial uses Haystack 2.0 Beta. To learn more, read the [ Haystack 2.0 Beta announcement](https://haystack.deepset.ai/blog/introducing-haystack-2-beta-and-advent) or visit the [Haystack 2.0 Documentation](https://docs.haystack.deepset.ai/v2.0/docs).\n"
18
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "F-a-MAMVat-o"
      },
      "source": [
        "## Overview\n",
        "\n",
        "When developing applications using **retrieval augmented generation ([RAG](https://www.deepset.ai/blog/llms-retrieval-augmentation))**, the retrieval step plays a critical role. It serves as the primary information source for **large language models (LLMs)** to generate responses. However, if your database lacks the necessary information, the retrieval step's effectiveness is limited. In such scenarios, it may be practical to use the web as a fallback data source for your RAG application. By implementing a conditional routing mechanism in your system, you gain complete control over the data flow, enabling you to design a system that can leverage the web as its data source under some conditions.\n",
        "\n",
        "In this tutorial, you will learn how to create a pipeline with conditional routing that directs the query to a **web-based RAG** route if the answer is not found in the initially given documents."
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "LSwNKkeKeq0f"
      },
      "source": [
        "## Development Environment"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "eGJ7GmCBas4R"
      },
      "source": [
        "### Prepare the Colab Environment\n",
        "\n",
        "- [Enable GPU Runtime in Colab](https://docs.haystack.deepset.ai/v2.0/docs/enabling-gpu-acceleration)\n",
        "- [Set logging level to INFO](https://docs.haystack.deepset.ai/v2.0/docs/setting-the-log-level)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "FwIgIpE2XqpO"
      },
      "source": [
        "### Install Haystack\n",
        "\n",
        "Install Haystack 2.0 Beta with `pip`:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "uba0mntlqs_O"
      },
      "outputs": [],
      "source": [
        "%%bash\n",
        "\n",
        "pip install haystack-ai"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "WBkJ7d3hZkOJ"
      },
      "source": [
        "### Enable Telemetry\n",
        "\n",
        "Knowing you're using this tutorial helps us decide where to invest our efforts to build a better product but you can always opt out by commenting the following line. See [Telemetry](https://docs.haystack.deepset.ai/v2.0/docs/telemetry) for more details."
87
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "HvrOixzzZmMi"
      },
      "outputs": [],
      "source": [
        "from haystack.telemetry import tutorial_running\n",
        "\n",
        "tutorial_running(36)"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "QfECEAy2Jdqs"
      },
      "source": [
        "### Enter API Keys\n",
        "\n",
        "Enter API keys required for this tutorial."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 2,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "13U7Z_k3yE-F",
        "outputId": "6ec48553-12d2-4c89-ca13-fc5d34fbc625"
      },
      "outputs": [],
      "source": [
        "from getpass import getpass\n",
        "import os\n",
        "\n",
        "os.environ[\"OPENAI_API_KEY\"] = getpass(\"Enter OpenAI Api key: \")\n",
        "os.environ[\"SERPERDEV_API_KEY\"] = getpass(\"Enter Serper Api key: \")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "i_AlhPv1T-4t"
      },
      "source": [
        "## Creating a Document\n",
        "\n",
        "Create a Document about Munich, where the answer to your question will be initially searched:"
141
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {
        "id": "5CHbQlLMyVbg"
      },
      "outputs": [],
      "source": [
        "from haystack.dataclasses import Document\n",
        "\n",
        "documents = [\n",
        "    Document(\n",
        "        content=\"\"\"Munich, the vibrant capital of Bavaria in southern Germany, exudes a perfect blend of rich cultural\n",
        "                                heritage and modern urban sophistication. Nestled along the banks of the Isar River, Munich is renowned\n",
        "                                for its splendid architecture, including the iconic Neues Rathaus (New Town Hall) at Marienplatz and\n",
        "                                the grandeur of Nymphenburg Palace. The city is a haven for art enthusiasts, with world-class museums like the\n",
        "                                Alte Pinakothek housing masterpieces by renowned artists. Munich is also famous for its lively beer gardens, where\n",
        "                                locals and tourists gather to enjoy the city's famed beers and traditional Bavarian cuisine. The city's annual\n",
        "                                Oktoberfest celebration, the world's largest beer festival, attracts millions of visitors from around the globe.\n",
        "                                Beyond its cultural and culinary delights, Munich offers picturesque parks like the English Garden, providing a\n",
        "                                serene escape within the heart of the bustling metropolis. Visitors are charmed by Munich's warm hospitality,\n",
        "                                making it a must-visit destination for travelers seeking a taste of both old-world charm and contemporary allure.\"\"\"\n",
        "    )\n",
        "]"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "zMNy0tjtUh_L"
      },
      "source": [
        "## Creating the Initial Pipeline Components\n",
        "\n",
        "First, define a prompt instructing the LLM to respond with the text `\"no_answer\"` if the provided documents do not offer enough context to answer the query. Next, initialize a [PromptBuilder](https://docs.haystack.deepset.ai/v2.0/docs/promptbuilder) with that prompt. It's crucial that the LLM replies with `\"no_answer\"` as you will use this keyword to indicate that the query should be directed to the fallback web search route.\n",
        "\n",
        "As the LLM, you will use an [OpenAIGenerator](https://docs.haystack.deepset.ai/v2.0/docs/openaigenerator) with the `gpt-3.5-turbo` model.\n",
        "\n",
        "> The provided prompt works effectively with the `gpt-3.5-turbo` model. If you prefer to use a different [Generator](https://docs.haystack.deepset.ai/v2.0/docs/generators), you may need to update the prompt to provide clear instructions to your model."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 4,
      "metadata": {
        "id": "nzhn2kDfqvbs"
      },
      "outputs": [],
      "source": [
        "from haystack.components.builders.prompt_builder import PromptBuilder\n",
        "from haystack.components.generators import OpenAIGenerator\n",
        "\n",
        "prompt_template = \"\"\"\n",
        "Answer the following query given the documents.\n",
        "If the answer is not contained within the documents reply with 'no_answer'\n",
        "Query: {{query}}\n",
        "Documents:\n",
        "{% for document in documents %}\n",
        "  {{document.content}}\n",
        "{% endfor %}\n",
        "\"\"\"\n",
        "\n",
        "prompt_builder = PromptBuilder(template=prompt_template)\n",
        "llm = OpenAIGenerator(model=\"gpt-3.5-turbo\")"
      ]
    },
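    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Optionally, you can check what the rendered prompt looks like before wiring everything into a pipeline. The following cell is a quick, standalone sanity check (it is not part of the final pipeline) and assumes that `PromptBuilder.run()` returns the rendered prompt under the `prompt` key:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Optional sanity check: render the prompt template with the example query and documents.\n",
        "# This only fills in the Jinja template; it does not call the LLM.\n",
        "rendered = prompt_builder.run(query=\"Where is Munich?\", documents=documents)\n",
        "print(rendered[\"prompt\"])"
      ]
    },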
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "LepACkkWPsBx"
      },
      "source": [
        "## Initializing the Web Search Components\n",
        "\n",
        "Initialize the necessary components for a web-based RAG application. Along with a `PromptBuilder` and an `OpenAIGenerator`, you will need a [SerperDevWebSearch](https://docs.haystack.deepset.ai/v2.0/docs/serperdevwebsearch) to retrieve relevant documents for the query from the web.\n",
        "\n",
        "> If desired, you can use a different [Generator](https://docs.haystack.deepset.ai/v2.0/docs/generators) for the web-based RAG branch of the pipeline."
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 5,
      "metadata": {
        "id": "VEYchFgQPxZ_"
      },
      "outputs": [],
      "source": [
        "from haystack.components.builders.prompt_builder import PromptBuilder\n",
        "from haystack.components.generators import OpenAIGenerator\n",
        "from haystack.components.websearch.serper_dev import SerperDevWebSearch\n",
        "\n",
        "prompt_for_websearch = \"\"\"\n",
        "Answer the following query given the documents retrieved from the web.\n",
        "Your answer shoud indicate that your answer was generated from websearch.\n",
237
        "\n",
238
        "Query: {{query}}\n",
239
        "Documents:\n",
240
        "{% for document in documents %}\n",
241
        "  {{document.content}}\n",
242
        "{% endfor %}\n",
243
        "\"\"\"\n",
244
        "\n",
245
        "websearch = SerperDevWebSearch()\n",
246
        "prompt_builder_for_websearch = PromptBuilder(template=prompt_for_websearch)\n",
247
        "llm_for_websearch = OpenAIGenerator(model=\"gpt-3.5-turbo\")"
248
      ]
249
    },
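    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "If you want to see what the web search returns on its own, you can run the `SerperDevWebSearch` component in isolation before building the pipeline. This optional check consumes one Serper API request and assumes the component returns the retrieved `documents` together with their source `links`:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Optional: run the web search component on its own to inspect what it retrieves.\n",
        "# Note that this call uses one SerperDev API request.\n",
        "preview = websearch.run(query=\"How many people live in Munich?\")\n",
        "print(f\"Retrieved {len(preview['documents'])} documents from the web\")\n",
        "print(preview[\"links\"][:3])"
      ]
    },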
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "vnacak_tVWqv"
      },
      "source": [
        "## Creating the ConditionalRouter\n",
        "\n",
        "[ConditionalRouter](https://docs.haystack.deepset.ai/v2.0/docs/conditionalrouter) is the component that handles data routing on specific conditions. You need to define a `condition`, an `output`, an `output_name` and an `output_type` for each route. Each route that the `ConditionalRouter` creates acts as the output of this component and can be connected to other components in the same pipeline.  \n",
259
        "\n",
260
        "In this case, you need to define two routes:\n",
261
        "- If the LLM replies with the `\"no_answer\"` keyword, the pipeline should perform web search. It means that you will put the original `query` in the output value to pass to the next component (in this case the next component will be the `SerperDevWebSearch`) and the output name will be `go_to_websearch`.\n",
262
        "- Otherwise, the given documents are enough for an answer and pipeline execution ends here. Return the LLM reply in the output named `answer`."
263
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 6,
      "metadata": {
        "id": "qyE9rGcawX3F"
      },
      "outputs": [],
      "source": [
        "from haystack.components.routers import ConditionalRouter\n",
        "\n",
        "routes = [\n",
        "    {\n",
        "        \"condition\": \"{{'no_answer' in replies[0]}}\",\n",
        "        \"output\": \"{{query}}\",\n",
        "        \"output_name\": \"go_to_websearch\",\n",
        "        \"output_type\": str,\n",
        "    },\n",
        "    {\n",
        "        \"condition\": \"{{'no_answer' not in replies[0]}}\",\n",
        "        \"output\": \"{{replies[0]}}\",\n",
        "        \"output_name\": \"answer\",\n",
        "        \"output_type\": str,\n",
        "    },\n",
        "]\n",
        "\n",
        "router = ConditionalRouter(routes)"
      ]
    },
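    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "Before adding the router to a pipeline, you can verify the routing logic by calling the component directly. The cell below is a small optional check; it assumes that `ConditionalRouter.run()` accepts the variables used in the route definitions (`replies` and `query`) as keyword arguments and returns only the output of the matching route:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# Optional: check the routing logic outside the pipeline.\n",
        "# A reply containing 'no_answer' should be routed to web search...\n",
        "print(router.run(replies=[\"no_answer\"], query=\"How many people live in Munich?\"))\n",
        "\n",
        "# ...while any other reply is returned directly as the answer.\n",
        "print(router.run(replies=[\"Munich is in southern Germany.\"], query=\"Where is Munich?\"))"
      ]
    },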
293
    {
294
      "cell_type": "markdown",
295
      "metadata": {
296
        "id": "Wdyko78oXb5a"
297
      },
298
      "source": [
299
        "## Building the Pipeline\n",
300
        "\n",
301
        "Add all components to your pipeline and connect them. `go_to_websearch` output of the `router` should be connected to the `websearch` to retrieve documents from the web and also to `prompt_builder_for_websearch` to use in the prompt."
302
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 7,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "4sCyBwc0oTVs",
        "outputId": "fd2347d4-9363-45e0-e734-87e4a160f741"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "<haystack.pipeline.Pipeline at 0x127609e50>"
            ]
          },
          "execution_count": 7,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "from haystack import Pipeline\n",
        "\n",
        "pipe = Pipeline()\n",
        "pipe.add_component(\"prompt_builder\", prompt_builder)\n",
        "pipe.add_component(\"llm\", llm)\n",
        "pipe.add_component(\"router\", router)\n",
        "pipe.add_component(\"websearch\", websearch)\n",
        "pipe.add_component(\"prompt_builder_for_websearch\", prompt_builder_for_websearch)\n",
        "pipe.add_component(\"llm_for_websearch\", llm_for_websearch)\n",
        "\n",
        "pipe.connect(\"prompt_builder\", \"llm\")\n",
        "pipe.connect(\"llm.replies\", \"router.replies\")\n",
        "pipe.connect(\"router.go_to_websearch\", \"websearch.query\")\n",
        "pipe.connect(\"router.go_to_websearch\", \"prompt_builder_for_websearch.query\")\n",
        "pipe.connect(\"websearch.documents\", \"prompt_builder_for_websearch.documents\")\n",
        "pipe.connect(\"prompt_builder_for_websearch\", \"llm_for_websearch\")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "d0HmdbUJKJ_9"
      },
      "source": [
        "### Visualize the Pipeline\n",
        "\n",
        "To understand how you formed this pipeline with conditional routing, use [draw()](https://docs.haystack.deepset.ai/v2.0/docs/drawing-pipeline-graphs) method of the pipeline. If you're running this notebook on Google Colab, the generated file will be saved in \"Files\" section on the sidebar or you can call `Image.open()`:"
354
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 12,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 1000
        },
        "id": "svF_SUK4rFwv",
        "outputId": "60894eea-2cec-4be8-d13c-83d2c81656f4"
      },
      "outputs": [],
      "source": [
        "from PIL import Image\n",
        "\n",
        "pipe.draw(\"pipe.png\")\n",
        "Image.open(\"pipe.png\")"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "jgk1z6GGYH6J"
      },
      "source": [
        "## Running the Pipeline!\n",
        "\n",
        "In the `run()`, pass the query to the `prompt_builder` and the `router`. In real life applications, `documents` will be provided by a [Retriever](https://docs.haystack.deepset.ai/v2.0/docs/retrievers) but to keep this example simple, you will provide the defined `documents` to the `prompt_builder`."
384
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 9,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "d_l4rYmCoVki",
        "outputId": "3bd7956a-7612-4bc1-c3e5-a7a51be8981f"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "Munich is in southern Germany.\n"
          ]
        }
      ],
      "source": [
        "query = \"Where is Munich?\"\n",
        "\n",
        "result = pipe.run({\"prompt_builder\": {\"query\": query, \"documents\": documents}, \"router\": {\"query\": query}})\n",
        "\n",
        "# Print the `answer` coming from the ConditionalRouter\n",
        "print(result[\"router\"][\"answer\"])"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "dBN8eLSKgb16"
      },
      "source": [
        "✅ The answer to this query can be found in the defined document.\n",
        "\n",
        "Now, try a different query that doesn't have an answer in the given document and test if the web search works as expected:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 10,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "_v-WdlSy365M",
        "outputId": "603c9346-8718-427e-d232-4cc71799a2bb"
      },
      "outputs": [
        {
          "name": "stdout",
          "output_type": "stream",
          "text": [
            "['According to the documents retrieved from the web, the population of Munich is approximately 1.47 million as of 2019. However, the most recent estimates suggest that the population has grown to about 1.58 million as of May 31, 2022. Additionally, the current estimated population of Munich is around 1.46 million, with the urban area being much larger at 2.65 million.']\n"
          ]
        }
      ],
      "source": [
        "query = \"How many people live in Munich?\"\n",
        "\n",
        "result = pipe.run({\"prompt_builder\": {\"query\": query, \"documents\": documents}, \"router\": {\"query\": query}})\n",
        "\n",
        "# Print the `replies` generated using the web searched Documents\n",
        "print(result[\"llm_for_websearch\"][\"replies\"])"
      ]
    },
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "wUkuXoWnHa5c"
      },
      "source": [
        "If you check the whole result, you will see that `websearch` component also provides links to Documents retrieved from the web:"
460
      ]
    },
    {
      "cell_type": "code",
      "execution_count": 11,
      "metadata": {
        "colab": {
          "base_uri": "https://localhost:8080/"
        },
        "id": "_EYLZguZGznY",
        "outputId": "df49a576-9961-44b4-e89d-2c5195869360"
      },
      "outputs": [
        {
          "data": {
            "text/plain": [
              "{'llm': {'meta': [{'model': 'gpt-3.5-turbo-0613',\n",
              "    'index': 0,\n",
              "    'finish_reason': 'stop',\n",
              "    'usage': {'completion_tokens': 2,\n",
              "     'prompt_tokens': 271,\n",
              "     'total_tokens': 273}}]},\n",
              " 'websearch': {'links': ['https://en.wikipedia.org/wiki/Munich',\n",
              "   'https://worldpopulationreview.com/world-cities/munich-population',\n",
              "   'https://en.wikipedia.org/wiki/Demographics_of_Munich',\n",
              "   'https://www.macrotrends.net/cities/204371/munich/population',\n",
              "   'https://www.britannica.com/place/Munich-Bavaria-Germany',\n",
              "   'https://www.statista.com/statistics/519723/munich-population-by-age-group/',\n",
              "   'https://www.citypopulation.de/en/germany/bayern/m%C3%BCnchen_stadt/09162000__m%C3%BCnchen/',\n",
              "   'https://www.quora.com/How-many-people-live-in-Munich',\n",
              "   'https://earth.esa.int/web/earth-watching/image-of-the-week/content/-/article/munich-germany/']},\n",
              " 'llm_for_websearch': {'replies': ['According to the documents retrieved from the web, the population of Munich is approximately 1.47 million as of 2019. However, the most recent estimates suggest that the population has grown to about 1.58 million as of May 31, 2022. Additionally, the current estimated population of Munich is around 1.46 million, with the urban area being much larger at 2.65 million.'],\n",
              "  'meta': [{'model': 'gpt-3.5-turbo-0613',\n",
              "    'index': 0,\n",
              "    'finish_reason': 'stop',\n",
              "    'usage': {'completion_tokens': 85,\n",
              "     'prompt_tokens': 436,\n",
              "     'total_tokens': 521}}]}}"
            ]
          },
          "execution_count": 11,
          "metadata": {},
          "output_type": "execute_result"
        }
      ],
      "source": [
        "result"
      ]
    },
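    {
      "cell_type": "markdown",
      "metadata": {},
      "source": [
        "For example, to print only the source links returned by the `websearch` component:"
      ]
    },
    {
      "cell_type": "code",
      "execution_count": null,
      "metadata": {},
      "outputs": [],
      "source": [
        "# The links of the web pages that were used to generate the answer\n",
        "print(result[\"websearch\"][\"links\"])"
      ]
    },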
    {
      "cell_type": "markdown",
      "metadata": {
        "id": "6nhdYK-vHpNM"
      },
      "source": [
        "## What's next\n",
        "\n",
        "🎉 Congratulations! You've built a pipeline with conditional routing! You can now customize the condition for your specific use case and create a custom Haystack 2.0 pipeline to meet your needs.\n",
        "\n",
        "If you liked this tutorial, there's more to learn about Haystack 2.0:\n",
        "- [Creating Your First QA Pipeline with Retrieval-Augmentation](https://haystack.deepset.ai/tutorials/27_first_rag_pipeline)\n",
        "- [Model-Based Evaluation of RAG Pipelines](https://haystack.deepset.ai/tutorials/35_model_based_evaluation_of_rag_pipelines)\n",
        "\n",
        "To stay up to date on the latest Haystack developments, you can [sign up for our newsletter](https://landing.deepset.ai/haystack-community-updates?utm_campaign=developer-relations&utm_source=tutorial&utm_medium=conditional-router) or [join Haystack discord community](https://discord.gg/haystack).\n",
524
        "\n",
525
        "Thanks for reading!"
526
      ]
527
    }
528
  ],
529
  "metadata": {
530
    "colab": {
531
      "provenance": []
532
    },
533
    "kernelspec": {
534
      "display_name": "Python 3",
535
      "name": "python3"
536
    },
537
    "language_info": {
538
      "codemirror_mode": {
539
        "name": "ipython",
540
        "version": 3
541
      },
542
      "file_extension": ".py",
543
      "mimetype": "text/x-python",
544
      "name": "python",
545
      "nbconvert_exporter": "python",
546
      "pygments_lexer": "ipython3",
547
      "version": "3.9.6"
548
    }
549
  },
550
  "nbformat": 4,
551
  "nbformat_minor": 0
552
}
553
