Hi everyone! In this post I'd like to share how to collect custom metrics when load testing an API with Locust — specifically, how to measure the time to the first token of a streaming response: the overall approach, the problems I ran into and how I solved them, and some thoughts on getting custom metrics to show up in the Locust Web UI. I hope it helps!
While load testing a chat endpoint `/chat/`, I wanted to measure not only the total response time of each request but also the latency of the first returned chunk (the first token). This gives a much better picture of perceived responsiveness, which matters especially for streaming endpoints. The request is captured with Locust's `catch_response=True`, and the streaming response is read with `stream=True`:
```python
from locust import HttpUser, task, between
import time
import random

questions = [
    "您好",
    "我这个月的通话时间还剩多少",
    # ... more questions omitted
]

styles = ["默认风格", "幽默风格", "正式风格", "友好风格", "简洁风格"]


class ChatUser(HttpUser):
    wait_time = between(1, 3)

    @task
    def send_chat_request(self):
        question = random.choice(questions)
        style = random.choice(styles)
        data = {
            "currentQuestion": question,
            "title": "用户",
            "phone": "13800000000",
            "historyQuestion": [{"question": ""}, {"question": ""}],
            "style": style,
        }
        headers = {"Content-Type": "application/json"}

        start_time = time.time()
        first_token_time = None
        answer_content = ""

        try:
            with self.client.post(
                "/chat/",
                headers=headers,
                json=data,
                stream=True,
                catch_response=True,
                timeout=15,
                name="POST /chat",
            ) as response:
                response.raise_for_status()
                # Read the streamed body line by line and record the moment
                # the first non-empty chunk (the first token) arrives.
                for line in response.iter_lines(decode_unicode=True):
                    if line:
                        if first_token_time is None:
                            first_token_time = time.time()
                        answer_content += line
                response.success()

            total_time = time.time() - start_time
            first_token_delay = (first_token_time - start_time) if first_token_time else 0
            print(f"Question: '{question}' - {style} | first token: {first_token_delay:.3f}s, total: {total_time:.3f}s")
        except Exception as e:
            # If the context manager was entered, mark the request as failed.
            if "response" in locals():
                response.failure(f"Request error: {e}")
            print(f"Request failed: question '{question}' - {style}, error: {e}")
```
Out of the box, Locust only records the total response time of each request. To track the first-token latency as a separate metric, you have to report it yourself through Locust's event mechanism.
First problem: `from locust.events import request_success` fails with `ModuleNotFoundError: No module named 'locust.events'`. In newer Locust versions the old global `request_success` hook is gone, and the events object is meant to be accessed as `self.environment.events` from inside a user rather than imported as a module.
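For reference, a minimal sketch of how the events object can be reached in Locust 2.x — at module level for registering listeners, and via the environment from inside a running user:

```python
from locust import events  # module-level Events instance, handy for registering listeners


@events.test_start.add_listener
def on_test_start(environment, **kwargs):
    # Listeners receive the fired event's arguments as keyword arguments.
    print("Load test starting")


# Inside a running User/TaskSet, the same Events instance is available as
# self.environment.events, so firing a custom metric needs no extra import.
```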
Second problem: calling `self.environment.events.request_success.fire(...)` raises `'Events' object has no attribute 'request_success'`. That's because the event is now simply named `request`, not `request_success`, so the correct call is `self.environment.events.request.fire(...)`.
```python
if first_token_time:
    first_token_delay = (first_token_time - start_time) * 1000  # ms
    # Fire a custom request event so Locust records the first-token latency
    # as its own entry in the statistics.
    self.environment.events.request.fire(
        request_type="http",              # request type; "http" keeps it next to the real requests
        name="chat_first_token_delay",    # name of the custom metric
        response_time=first_token_delay,  # in milliseconds
        response_length=0,                # no payload size to report
        exception=None,                   # explicitly report success
        context=self,                     # pass the user instance along for listeners
    )
```
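Because the custom metric travels through the same `request` event as the real HTTP calls, you can also attach your own listener to it, for example to keep the raw first-token samples for offline analysis. A minimal sketch, assuming the `chat_first_token_delay` name used above (`first_token_delays` is just an illustrative module-level list):

```python
from locust import events

first_token_delays = []  # raw first-token samples in milliseconds


@events.request.add_listener
def collect_first_token_delay(request_type, name, response_time, exception=None, **kwargs):
    # Only keep the custom metric fired above; ignore the real HTTP requests.
    if name == "chat_first_token_delay" and exception is None:
        first_token_delays.append(response_time)
```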
```python
from locust import HttpUser, task, between
import time
import random

questions = [
    # ... question list omitted
]

styles = ["默认风格", "幽默风格", "正式风格", "友好风格", "简洁风格"]


class ChatUser(HttpUser):
    wait_time = between(1, 3)

    @task
    def send_chat_request(self):
        question = random.choice(questions)
        style = random.choice(styles)
        data = {
            "currentQuestion": question,
            "title": "用户",
            "phone": "13800000000",
            "historyQuestion": [{"question": ""}, {"question": ""}],
            "style": style,
        }
        headers = {"Content-Type": "application/json"}

        start_time = time.time()
        first_token_time = None
        answer_content = ""

        try:
            with self.client.post(
                "/chat/",
                headers=headers,
                json=data,
                stream=True,
                catch_response=True,
                timeout=15,
                name="POST /chat",
            ) as response:
                response.raise_for_status()
                for line in response.iter_lines(decode_unicode=True):
                    if line:
                        if first_token_time is None:
                            first_token_time = time.time()
                        answer_content += line
                response.success()

            total_time = time.time() - start_time
            first_token_delay = (first_token_time - start_time) if first_token_time else 0

            # Report the first-token latency to Locust's statistics.
            if first_token_time:
                self.environment.events.request.fire(
                    request_type="http",
                    name="chat_first_token_delay",
                    response_time=first_token_delay * 1000,  # ms
                    response_length=0,
                    exception=None,
                    context=self,
                )

            print(
                f"Question: '{question}' - {style} | "
                f"first token: {first_token_delay:.3f}s, "
                f"total: {total_time:.3f}s, chars: {len(answer_content)}"
            )
        except Exception as e:
            if "response" in locals():
                response.failure(f"Request error: {e}")
            print(f"Request failed: question '{question}' - {style}, error: {e}")
```
| Problem you may hit | Fix |
| --- | --- |
| Cannot import `locust.events` | Access the events object through `self.environment.events` |
| `request_success` does not exist | The event is named `request`; fire it with `self.environment.events.request.fire()` |
| The custom metric does not show up on the chart | This is by design: the chart only plots real request timings, so the custom metric has to be surfaced some other way — e.g. read it from `environment.stats` at test stop (see the sketch below the table) |
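For that last point, one possible workaround is to read the custom entry back out of Locust's stats registry when the test stops and print a summary. A minimal sketch, assuming the `chat_first_token_delay` / `"http"` pair fired above (the percentile choice is arbitrary):

```python
from locust import events


@events.test_stop.add_listener
def print_first_token_summary(environment, **kwargs):
    # Entries are keyed by (name, request_type), matching the fire() call above.
    entry = environment.stats.get("chat_first_token_delay", "http")
    if entry.num_requests:
        print(
            f"first token delay: avg {entry.avg_response_time:.0f} ms, "
            f"median {entry.median_response_time} ms, "
            f"p95 {entry.get_response_time_percentile(0.95)} ms "
            f"({entry.num_requests} samples)"
        )
```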
I hope this write-up helps if you're also using Locust for API load testing.
If you found it useful, feel free to like, bookmark, or leave a comment!