# A 72-Hour Hands-On Python Guide for Java Programmers
If you have written Java for a few years, switching to Python does not mean starting from scratch. One table covers the core syntax; after that you learn by doing. This article gives you that table plus one hands-on exercise: rewriting a Java `HttpClient` call as an async, streaming Python request module that talks directly to an LLM API. All of the LLM interaction code later in this column builds on this module.

## 2.1 The core syntax cheat sheet

The table below covers about 95% of what this column uses. A quick glance is enough; there is no need to memorize it.

**Java → Python common correspondences**

| Scenario | Java | Python |
| --- | --- | --- |
| Variable declaration | `String name = "CSDN";` | `name: str = "CSDN"` |
| Constant | `static final int MAX = 100;` | `MAX: int = 100` (uppercase by convention) |
| List | `List<String> list = new ArrayList<>();` | `list: list[str] = []` |
| Map | `Map<String, Integer> map = new HashMap<>();` | `m: dict[str, int] = {}` |
| for loop | `for (int i = 0; i < n; i++)` | `for i in range(n):` |
| for-each | `for (String s : list)` | `for s in list:` |
| if-else | `if (x > 0) { ... } else { ... }` | `if x > 0:\n    ...\nelse:\n    ...` |
| Method | `public String greet(String name) { return "Hi " + name; }` | `def greet(name: str) -> str:\n    return f"Hi {name}"` |
| null | `null` | `None` |
| Class | `public class Dog { ... }` | `class Dog:\n    def __init__(self, name: str):\n        self.name = name` |
| Constructor | `public Dog(String name) { this.name = name; }` | `__init__`, as above |
| Static method | `public static void util() { ... }` | `@staticmethod\ndef util():\n    ...` |
| try-catch | `try { ... } catch (Exception e) { ... }` | `try:\n    ...\nexcept Exception as e:\n    ...` |
| try-with-resources | `try (FileReader fr = new FileReader(...))` | `with open(file) as f:\n    ...` |
| import | `import java.util.List;` | `from typing import List` (on Python 3.9+, built-in `list` works in annotations) |
| main | `public static void main(String[] args)` | `if __name__ == "__main__":` |
| String formatting | `String.format("val: %d", val)` | `f"val: {val}"` |
| List pipeline | `list.stream().map(...).collect(...)` | `[item.process() for item in items]` |
| Lambda | `(x) -> x * 2` | `lambda x: x * 2` |

## 2.2 Hands-on: rewriting a Java HttpClient call as an async streaming Python client

Later in this column we will call LLM APIs constantly, and they usually return streaming responses over SSE: data is pushed chunk by chunk. That requires an HTTP client with async streaming support. It can be done in Java, but not in few lines; in Python it is very concise.

### 2.2.1 The Java version, for reference

```java
// Java 11+ async streaming request (sketch; API_KEY and jsonBody defined elsewhere)
HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder()
    .uri(URI.create("https://api.openai.com/v1/chat/completions"))
    .header("Authorization", "Bearer " + API_KEY)
    .header("Content-Type", "application/json")
    .POST(HttpRequest.BodyPublishers.ofString(jsonBody))
    .build();
client.sendAsync(request, HttpResponse.BodyHandlers.ofLines())
    .thenAccept(response -> {
        response.body().forEach(line -> {
            if (line.startsWith("data: ")) {
                String data = line.substring(6);
                if (!data.equals("[DONE]")) {
                    // parse and handle the chunk
                }
            }
        });
    });
```

Not complicated, but the chained calls are easy to get lost in once a few pile up, and every round of conversation means assembling the whole request again.

### 2.2.2 The same thing in Python

We will do this in two steps: first a working synchronous version, then a rewrite into an async generator that handles the streaming response while keeping the call site clean.

**Step 1: install the dependency**

```bash
pip install httpx
```

`httpx` is one of the more modern HTTP clients in Python: it supports sync and async usage as well as HTTP/2, and its API design is clean.

**Step 2: a synchronous version, to get things working first**

```python
import httpx

API_KEY = "your-api-key"
BASE_URL = "https://api.openai.com/v1"

def chat_completion_sync(messages: list[dict]) -> str:
    """Synchronous request; returns the complete answer text."""
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": "gpt-3.5-turbo",
        "messages": messages,
        "stream": False,
    }
    with httpx.Client(timeout=60.0) as client:
        response = client.post(
            f"{BASE_URL}/chat/completions",
            headers=headers,
            json=payload,
        )
        response.raise_for_status()
        data = response.json()
        return data["choices"][0]["message"]["content"]
```

Quick test:

```python
if __name__ == "__main__":
    msgs = [{"role": "user", "content": "Explain the advantages of Spring Boot in one sentence"}]
    print(chat_completion_sync(msgs))
```

**Step 3: the async streaming version, the one every later scenario plugs into**

```python
import json
from typing import AsyncIterator

import httpx

API_KEY = "your-api-key"
BASE_URL = "https://api.openai.com/v1"

async def chat_completion_stream(
    messages: list[dict], model: str = "gpt-3.5-turbo"
) -> AsyncIterator[str]:
    """Async streaming request.

    Each text fragment is returned via yield as it is produced.
    The caller consumes it with `async for`.
    """
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": messages,
        "stream": True,
    }
    async with httpx.AsyncClient(timeout=120.0) as client:
        async with client.stream(
            "POST",
            f"{BASE_URL}/chat/completions",
            headers=headers,
            json=payload,
        ) as response:
            response.raise_for_status()
            async for line in response.aiter_lines():
                if not line.startswith("data: "):
                    continue
                data_str = line[6:].strip()
                if data_str == "[DONE]":
                    break
                try:
                    chunk = json.loads(data_str)
                    delta = chunk["choices"][0].get("delta", {})
                    content = delta.get("content", "")
                    if content:
                        yield content
                except Exception:
                    # ignore lines that fail to parse
                    continue
```

Usage:

```python
import asyncio

async def main():
    msgs = [{"role": "user", "content": "Briefly describe Java's GC mechanism"}]
    full_response = ""
    async for token in chat_completion_stream(msgs):
        print(token, end="", flush=True)  # print token by token as it arrives
        full_response += token
    print("\n\n--- full response received ---")

asyncio.run(main())
```
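The `data:`-line handling inside the `async for` loop is plain string-and-JSON work, so it can be factored into a small pure function and unit-tested without any network access. A minimal sketch; the helper name `parse_sse_line` is ours, not part of the article's source file:

```python
import json
from typing import Optional

def parse_sse_line(line: str) -> Optional[str]:
    """Return the text fragment carried by one SSE line, or None if it has none."""
    if not line.startswith("data: "):
        return None  # comments, keep-alives, blank lines
    data_str = line[6:].strip()
    if data_str == "[DONE]":
        return None  # end-of-stream marker
    try:
        chunk = json.loads(data_str)
        delta = chunk["choices"][0].get("delta", {})
        return delta.get("content") or None
    except (json.JSONDecodeError, KeyError, IndexError):
        return None  # skip malformed lines, mirroring the generator above

print(parse_sse_line('data: {"choices": [{"delta": {"content": "Hi"}}]}'))  # Hi
```

With this in place, the body of the `async for` loop shrinks to a `parse_sse_line` call plus a `yield`, and the parsing rules can be verified with plain asserts.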
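If you want to watch the `async for` mechanics without spending an API key, a stub generator with the same shape as `chat_completion_stream` is enough. `fake_stream` below is a hypothetical stand-in, not part of the article's client:

```python
import asyncio
from typing import AsyncIterator

async def fake_stream(text: str) -> AsyncIterator[str]:
    # Hypothetical stand-in for chat_completion_stream: yields one character at a time.
    for ch in text:
        await asyncio.sleep(0)  # hand control back to the event loop, like a real network wait
        yield ch

async def collect(stream: AsyncIterator[str]) -> str:
    parts = []
    async for token in stream:  # the same consumption pattern as with the real client
        parts.append(token)
    return "".join(parts)

result = asyncio.run(collect(fake_stream("hello")))
print(result)  # hello
```

Swapping `fake_stream` for `chat_completion_stream` changes nothing on the consumer side, which is exactly the point of exposing the client as an async generator.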
### 2.2.3 A quick comparison

For the same streaming request, the Python caller needs only a single `async for`, and the code stays readable. `httpx`'s `stream()` hands you an async line iterator directly, so you do not manage connections or read the raw response stream by hand. All LLM interaction later in this column will build on this wrapper.

## 2.3 On the cost of migrating

Honestly, Python and Java differ mostly in the thickness of their syntactic sugar. The design patterns, layering habits, and exception-handling mindset you built in Java carry over to Python unchanged; you just write less. What actually takes time is domain knowledge: RAG fundamentals, embedding selection, model performance evaluation. None of that is language-specific, and it is what this column is really about.

In the next article we will deploy our first open-source LLM locally and compare how models of different sizes perform on real question answering.

Source code for this article: stage01_env/simple_chat_client.py