Ollama on Windows: Download, Install, and Run Large Models Locally
1. Related Links

Ollama website: https://ollama.com/
PowerShell one-line install: irm https://ollama.com/install.ps1 | iex (paste this into PowerShell, or download the installer instead)
Download Ollama: https://ollama.com/download (latest version: 0.18.3)
Search models: https://ollama.com/search — e.g. search for deepseek, Qwen, or gemma
https://ollama.com/library/deepseek-r1 — the same model is available at different parameter sizes
Ollama's documentation: https://docs.ollama.com/
Ollama documentation in Chinese: https://ollama.it-docs.cn/

2. Install and Configure Ollama

[Installation steps 1–8 were shown as screenshots in the original post.]
9. Change the model storage location
10. Launch

3. Run a Model

In cmd, pull and start the model in one step:

C:\Users\admin>ollama run deepseek-r1:1.5b
pulling manifest
pulling aabd4debf0c8: 100% ▕██████████████████▏ 1.1 GB
pulling c5ad996bda6e: 100% ▕██████████████████▏ 556 B
pulling 6e4c38e1172f: 100% ▕██████████████████▏ 1.1 KB
pulling f4d24e9138dd: 100% ▕██████████████████▏ 148 B
pulling a85fe2a2e58e: 100% ▕██████████████████▏ 487 B
verifying sha256 digest
writing manifest
success
>>> 你好
你好!很高兴见到你,有什么我可以帮忙的吗?无论是问题、建议还是闲聊,我都在这儿呢!
>>> /?
Available Commands:
  /set            Set session variables
  /show           Show model information
  /load <model>   Load a session or model
  /save <model>   Save your current session
  /clear          Clear session context
  /bye            Exit
  /?, /help       Help for a command
  /? shortcuts    Help for keyboard shortcuts

Use """ to begin a multi-line message.

>>> Send a message (/? 
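Once `ollama run` works in the terminal, the same model can also be queried from code. Below is a minimal sketch, assuming a local Ollama server is running on the default port 11434 and `deepseek-r1:1.5b` has already been pulled; it uses only the Python standard library and Ollama's documented `/api/generate` endpoint:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama server address


def build_payload(prompt: str, model: str = "deepseek-r1:1.5b") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks the server for one complete JSON reply
    instead of a stream of chunks."""
    return {"model": model, "prompt": prompt, "stream": False}


def generate(prompt: str, model: str = "deepseek-r1:1.5b") -> str:
    """Send the prompt to the local Ollama server and return the reply text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server running (via the desktop app or `ollama serve`), `generate("你好")` returns the model's answer as a single string.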
for help)
>>> /bye

You can also pull first and then run:

ollama pull deepseek-r1:1.5b
ollama run deepseek-r1:1.5b

(The three models shown below were already installed locally.)

4. Ollama Command Help

Ollama command-line help:

C:\Users\admin>ollama --help
Large language model runner

Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start Ollama
  create      Create a model
  show        Show information for a model
  run         Run a model
  stop        Stop a running model
  pull        Pull a model from a registry
  push        Push a model to a registry
  signin      Sign in to ollama.com
  signout     Sign out from ollama.com
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  launch      Launch the Ollama menu or an integration
  help        Help about any command

Flags:
  -h, --help         help for ollama
      --nowordwrap   Don't wrap words to the next line automatically
      --verbose      Show timings for response
  -v, --version      Show version information

Use "ollama [command] --help" for more information about a command.

Interactive Ollama menu:

C:\Users\admin>ollama
Ollama 0.20.0
▸ Chat with a model                  Start an interactive chat with a model
  Launch Claude Code (not installed) Anthropic's coding tool with subagents
  Launch Codex (not installed)       OpenAI's open-source coding agent
  Launch OpenClaw (install)          Personal AI with 100 skills
  More...                            Show additional integrations
↑/↓ navigate • enter launch • → configure • esc quit
-------------------------------------------------------------------------------
C:\Users\admin>ollama
Select model to run:
Type to filter...
Recommended
▸ glm-4.7-flash       Reasoning and code generation locally, ~25GB (not downloaded)
  qwen3.5             Reasoning, coding, and visual understanding locally, ~11GB (not downloaded)
  kimi-k2.5:cloud     Multimodal reasoning with subagents
  qwen3.5:cloud       Reasoning, coding, and agentic tool use with vision
  glm-5:cloud         Reasoning and code generation
  minimax-m2.7:cloud  Fast, efficient coding and real-world productivity
More
  deepseek-r1:1.5b
  qwen3.5:27b
  qwen3.5:35b
↑/↓ navigate • enter select • ← back

5. Common Commands

List local models:

C:\Users\admin>ollama list
NAME              ID            SIZE    MODIFIED
deepseek-r1:1.5b  e0979632db5a  1.1 GB  9 minutes ago
qwen3.5:35b       3460ffeede54  23 GB   3 days ago
qwen3.5:27b       7653528ba5cb  17 GB   4 days ago

Show running models:

C:\Users\admin>ollama ps
NAME              ID            SIZE    PROCESSOR  CONTEXT  UNTIL
deepseek-r1:1.5b  e0979632db5a  2.8 GB  100% GPU   32768    21 seconds from now

# Stop a running model
ollama stop deepseek-r1:1.5b

# Start the server from the command line (not quite the same as launching the
# desktop app, and rarely needed; the model storage location must be set again)
ollama serve

# Run again with --verbose to show timing statistics. If no custom storage
# location is set, models are downloaded to the default location.
C:\Users\admin>ollama run --verbose deepseek-r1:1.5b
>>> 你好
你好!很高兴见到你,有什么我可以帮忙的吗?

total duration:       303.208ms
load duration:        42.5447ms
prompt eval count:    4 token(s)
prompt eval duration: 32.7748ms
prompt eval rate:     122.04 tokens/s
eval count:           17 token(s)
eval duration:        213.5049ms
eval rate:            79.62 tokens/s
>>> /bye

6. Setting the Model Storage Location

1) When launched from the desktop app: [set in the app's settings, shown as a screenshot in the original post]
2) When launched from the command line, either set it temporarily:

set OLLAMA_MODELS=E:\OllamaModels
ollama serve

or configure the environment variable permanently: OLLAMA_MODELS=E:\OllamaModels

3) Related directories:

Log directory:                  C:\Users\admin\AppData\Local\Ollama
Application directory:          C:\Users\admin\AppData\Local\Programs\Ollama
Default model storage location: C:\Users\admin\.ollama\models
New model storage location:     E:\OllamaModels

7. Viewing Timing Statistics

Method 1: from the command line with ollama run --verbose (most direct):

ollama run --verbose deepseek-r1:1.5b

Statistics are printed after every response.

Method 2: toggle inside an interactive session. After entering ollama run, use the built-in commands:

/set verbose   # enable statistics
/set quiet     # disable statistics (the default)

8. Disabling "Thinking"

Disabling DeepSeek's thinking mode, both in Ollama and in code: https://blog.csdn.net/haveqing/article/details/151162448

9. Other

Ollama's new app: https://ollama.com/blog/new-app
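As a programmatic companion to the linked post on disabling thinking: recent Ollama versions accept a `think` field in the request body for reasoning models, which turns the chain-of-thought off per request. A hedged sketch, assuming a local server on the default port, a pulled reasoning model, and an Ollama version whose `/api/generate` endpoint supports `think`:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama server address


def build_payload(prompt: str, think: bool,
                  model: str = "deepseek-r1:1.5b") -> dict:
    """Request body for /api/generate. think=False asks a reasoning model
    to skip its thinking phase (supported in recent Ollama versions)."""
    return {"model": model, "prompt": prompt, "stream": False, "think": think}


def ask(prompt: str, think: bool = False,
        model: str = "deepseek-r1:1.5b") -> str:
    """Query the local Ollama server, with thinking disabled by default."""
    data = json.dumps(build_payload(prompt, think, model)).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server running, `ask("你好")` returns only the final answer; pass `think=True` to let the model reason first. (Depending on the Ollama version, an interactive session may also expose a similar toggle via `/set` — check `/?` in your build.)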