{
"cells": [
{
"cell_type": "code",
"execution_count": null,
"id": "93f59dcb-c588-41b8-a792-55d88ade739c",
"metadata": {},
"outputs": [],
"source": [
"# Download and run the Ollama Linux install script\n",
"!curl -fsSL https://ollama.com/install.sh | sh\n",
"# Stop the systemd service (if one was installed) so ollama can run in the foreground below\n",
"!command -v systemctl >/dev/null && sudo systemctl stop ollama"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "658c147e-c7f8-490e-910e-62b80f577dda",
"metadata": {},
"outputs": [],
"source": [
"!pip install pyngrok\n",
"\n",
"import asyncio\n",
"import os\n",
"\n",
"# Set LD_LIBRARY_PATH so the system NVIDIA library is preferred\n",
"# over the built-in library. This is particularly important for\n",
"# Google Colab, which installs older drivers.\n",
"os.environ.update({'LD_LIBRARY_PATH': '/usr/lib64-nvidia'})\n",
"\n",
"async def run(cmd):\n",
"    '''\n",
"    run is a helper function to run subcommands asynchronously.\n",
"    '''\n",
"    print('>>> starting', *cmd)\n",
"    p = await asyncio.create_subprocess_exec(\n",
"        *cmd,\n",
"        stdout=asyncio.subprocess.PIPE,\n",
"        stderr=asyncio.subprocess.PIPE,\n",
"    )\n",
"\n",
"    async def pipe(lines):\n",
"        async for line in lines:\n",
"            print(line.strip().decode('utf-8'))\n",
"\n",
"    await asyncio.gather(\n",
"        pipe(p.stdout),\n",
"        pipe(p.stderr),\n",
"    )\n",
"\n",
"\n",
"await asyncio.gather(\n",
"    run(['ollama', 'serve']),\n",
"    run(['ngrok', 'http', '--log', 'stderr', '11434']),\n",
")"
]
},
{
"cell_type": "markdown",
"id": "e7735a55-9aad-4caf-8683-52e2163ba53b",
"metadata": {},
"source": [
"The previous cell starts two processes, `ollama` and `ngrok`. The log output will include a line like the following, which reports the tunnel's external address:\n",
"\n",
"```\n",
"t=2023-11-12T22:55:56+0000 lvl=info msg=\"started tunnel\" obj=tunnels name=command_line addr=http://localhost:11434 url=https://8249-34-125-179-11.ngrok.io\n",
"```\n",
"\n",
"The external address in this case is `https://8249-34-125-179-11.ngrok.io`, which can be set as `OLLAMA_HOST` to access this instance from another machine:\n",
"\n",
"```bash\n",
"export OLLAMA_HOST=https://8249-34-125-179-11.ngrok.io\n",
"ollama list\n",
"ollama run mistral\n",
"```"
]
}
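,
{
"cell_type": "markdown",
"id": "f1c0a9d2-5e71-4b38-9c40-7a2d6e1b8f03",
"metadata": {},
"source": [
"The instance can also be reached over Ollama's HTTP API. The cell below is a minimal sketch using only the standard library; the `host` value is a placeholder copied from the example log above and should be replaced with the `url` from your own ngrok output."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b4e2c7a8-1d93-4f06-8a5b-3c9e0d6f2a17",
"metadata": {},
"outputs": [],
"source": [
"import json\n",
"from urllib.request import Request, urlopen\n",
"\n",
"# Placeholder: replace with the url printed in the ngrok log above\n",
"host = 'https://8249-34-125-179-11.ngrok.io'\n",
"\n",
"# Non-streaming generate request against the Ollama REST API\n",
"body = json.dumps({'model': 'mistral', 'prompt': 'Why is the sky blue?', 'stream': False}).encode()\n",
"req = Request(f'{host}/api/generate', data=body, headers={'Content-Type': 'application/json'})\n",
"with urlopen(req) as resp:\n",
"    print(json.load(resp)['response'])"
]
}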
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}