字节笔记本
February 22, 2026
Displaying a Python OpenCV Video Stream in Flutter
This article walks through a complete setup for displaying a Python OpenCV video stream in a Flutter app in real time, using FastAPI as the backend service and HTTP streaming for cross-platform display.
Overview
The setup has two core components:
- Python backend (FastAPI + OpenCV): generates video frames and serves them via streaming responses
- Flutter frontend: fetches frames over HTTP, displays them in real time, and computes the FPS
Backend (Python)
Use FastAPI to build a lightweight video-stream service:
```python
from fastapi import FastAPI, responses
from uvicorn import run
import cv2
import numpy as np

app = FastAPI()

def _get_frame():
    # Generate a random 480x640 BGR image; in a real application,
    # replace this with camera capture or video-file reads.
    frame = np.random.randint(low=0, high=255, size=(480, 640, 3), dtype='uint8')
    return frame

def generate():
    # imencode returns (success, buffer); [1] is the encoded image.
    encodedImage = cv2.imencode('.png', _get_frame())[1]
    yield encodedImage.tobytes()

@app.get("/video_frame")
async def video_feed():
    return responses.StreamingResponse(generate(), media_type="image/png")

if __name__ == '__main__':
    run("app:app", host="127.0.0.1", port=8001, reload=True)
```

Backend code explained
- _get_frame(): generates a random color image (480x640); in a real application, replace this with camera capture or video-file reads
- generate(): encodes the OpenCV frame as PNG and yields it as a byte stream
- The /video_frame endpoint returns a streaming response, suited to delivering successive frames
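Note that generate() here yields a single encoded frame per request; the Flutter client below compensates by polling. An alternative worth knowing is a true MJPEG stream, where one response carries many frames separated by multipart boundaries and is served with `media_type="multipart/x-mixed-replace; boundary=frame"`. A minimal sketch of just the framing, independent of FastAPI and OpenCV (`frame_source` is a hypothetical iterable of already-encoded JPEG bytes):

```python
def mjpeg_parts(frame_source):
    """Wrap already-encoded JPEG frames in multipart/x-mixed-replace parts.

    Served with media_type="multipart/x-mixed-replace; boundary=frame",
    this lets a browser or MJPEG-capable client render a continuous
    stream from a single request instead of polling per frame.
    """
    for jpg in frame_source:
        yield (b"--frame\r\n"
               b"Content-Type: image/jpeg\r\n\r\n" + jpg + b"\r\n")

# Example: two fake "frames" become two boundary-delimited parts.
parts = list(mjpeg_parts([b"fake-jpeg-1", b"fake-jpeg-2"]))
```

Passing `mjpeg_parts(...)` to StreamingResponse in place of generate() would turn the endpoint into a continuous stream, at the cost of the simple one-frame-per-request model used by the Flutter code below.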
Frontend (Flutter)
The Flutter side uses a StreamBuilder to display the video in real time:
```dart
import 'dart:async';
import 'dart:convert';
import 'package:flutter/material.dart';
import 'package:http/http.dart' as http;

void main() {
  runApp(MyApp());
}

/// Decodes a base64-encoded frame into a widget (helper; not used below,
/// but handy if the backend sends base64 instead of raw bytes).
Image base64ToImage(String base64String) {
  return Image.memory(
    base64Decode(base64String),
    gaplessPlayback: true,
  );
}

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      title: 'Flutter Demo',
      theme: ThemeData(
        primarySwatch: Colors.blue,
      ),
      home: MyHomePage(title: 'Flutter Demo Home Page'),
    );
  }
}

class MyHomePage extends StatefulWidget {
  MyHomePage({Key? key, required this.title}) : super(key: key);
  final String title;

  @override
  _MyHomePageState createState() => _MyHomePageState();
}

class _MyHomePageState extends State<MyHomePage> {
  int frameCounter = 0;
  int lastTime = DateTime.now().millisecondsSinceEpoch;
  final fpsValueNotifier = ValueNotifier(0);
  final pollingRate = 10; // time between requests in ms
  final url = 'http://127.0.0.1:8001/video_frame';

  // Create the stream once; building a new Stream.periodic on every
  // rebuild would restart the polling.
  late final Stream<Image> videoStream = getVideoFrame();

  bool _timeDifferenceBiggerThanSecond() {
    return DateTime.now().millisecondsSinceEpoch - lastTime > 1000;
  }

  Future<Image> _fetchVideoFrame() async {
    final response = await http.get(Uri.parse(url));
    // Once per second, publish the number of frames received and reset.
    if (_timeDifferenceBiggerThanSecond()) {
      fpsValueNotifier.value = frameCounter;
      lastTime = DateTime.now().millisecondsSinceEpoch;
      frameCounter = 0;
    } else {
      frameCounter++;
    }
    return Image.memory(
      response.bodyBytes,
      gaplessPlayback: true, // avoids flicker between frames
    );
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text(widget.title),
      ),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: <Widget>[
            ValueListenableBuilder(
                valueListenable: fpsValueNotifier,
                builder: (context, value, child) {
                  return Text('FPS $value');
                }),
            StreamBuilder<Image>(
              stream: videoStream,
              builder: (context, snapshot) {
                if (snapshot.hasData) {
                  return snapshot.data!;
                } else if (snapshot.hasError) {
                  return Text("${snapshot.error}");
                }
                return CircularProgressIndicator();
              },
            ),
          ],
        ),
      ),
    );
  }

  Stream<Image> getVideoFrame() =>
      Stream.periodic(Duration(milliseconds: pollingRate))
          .asyncMap((_) => _fetchVideoFrame());
}
```

Frontend code explained
- _fetchVideoFrame(): fetches a single frame asynchronously and updates the FPS counter as a side effect
- getVideoFrame(): creates a periodic stream that requests a new frame every 10 ms
- StreamBuilder: listens to the stream and updates the UI in real time
- gaplessPlayback: true: prevents flicker when one frame replaces another
- ValueListenableBuilder: displays the current FPS as it changes
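The FPS bookkeeping inside _fetchVideoFrame() is the subtle part: count frames, and once more than a second has elapsed, publish the count and reset. Here is the same logic as a small, testable Python sketch (the FpsCounter name and the injected now_ms clock are illustrative, not part of the Flutter code):

```python
class FpsCounter:
    """Mirrors the frame-counting logic of _fetchVideoFrame().

    Time is passed in as now_ms (milliseconds) instead of being read
    from a wall clock, so the logic is deterministic and testable.
    """

    def __init__(self, now_ms):
        self.frame_counter = 0
        self.last_time = now_ms
        self.fps = 0  # last published frames-per-second value

    def on_frame(self, now_ms):
        if now_ms - self.last_time > 1000:
            # More than a second has passed: publish and reset.
            self.fps = self.frame_counter
            self.last_time = now_ms
            self.frame_counter = 0
        else:
            self.frame_counter += 1
```

Feeding it one frame every 100 ms for a second and then one more frame publishes an FPS of 10, matching what the Dart code shows in the Text('FPS $value') widget.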
How to run
1. Start the backend service

```bash
# Install dependencies
pip install fastapi uvicorn opencv-python numpy
# Run the service
python app.py
```

The service starts at http://127.0.0.1:8001.
2. Run the Flutter app

```bash
# Make sure the http dependency is declared in pubspec.yaml:
#   dependencies:
#     http: ^1.0.0
# Run the app
flutter run
```

Extending the setup
This setup extends naturally to:
- Live monitoring: attach a camera and display its stream in real time
- Video processing: add image processing on the Python side (filters, object detection, and so on)
- Remote monitoring: deploy the backend to a server to view the video remotely
- AI inference: integrate TensorFlow/PyTorch models for real-time inference
Notes
- Network permission: on Android, the Flutter app needs the INTERNET permission
- CORS configuration: if the frontend and backend are deployed separately, configure CORS in FastAPI
- Performance tuning: adjust pollingRate to the network conditions to avoid overly frequent requests
- Memory management: keep an eye on image memory being released during long-running sessions
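For the CORS point above, a minimal FastAPI configuration might look like the following sketch; `allow_origins=["*"]` is a development-only convenience and should be pinned to the real frontend origin in production:

```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

# Allow cross-origin requests, e.g. from Flutter web served on another port.
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],   # development only; restrict to real origins in production
    allow_methods=["GET"],
    allow_headers=["*"],
)
```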