To get a live video stream from the camera, you can use the AVCaptureSession class provided by the AVFoundation framework. Here is a basic example that displays the camera's live video feed:
import SwiftUI
import AVFoundation

// A UIView whose backing layer is an AVCaptureVideoPreviewLayer, so the preview always fills the view.
final class PreviewView: UIView {
    override class var layerClass: AnyClass { AVCaptureVideoPreviewLayer.self }
    var previewLayer: AVCaptureVideoPreviewLayer { layer as! AVCaptureVideoPreviewLayer }
}

struct CameraView: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> UIViewController {
        let viewController = UIViewController()
        let captureSession = AVCaptureSession()
        // Wrap the default video device in an input and attach it to the session.
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              captureSession.canAddInput(input) else { return viewController }
        captureSession.addInput(input)
        // Back the controller's view with the preview layer; it is resized automatically.
        let previewView = PreviewView()
        previewView.previewLayer.session = captureSession
        previewView.previewLayer.videoGravity = .resizeAspectFill
        viewController.view = previewView
        // startRunning() blocks, so call it off the main thread.
        DispatchQueue.global(qos: .userInitiated).async {
            captureSession.startRunning()
        }
        return viewController
    }

    func updateUIViewController(_ uiViewController: UIViewController, context: Context) {}
}
In this example, we create an AVCaptureSession and add an AVCaptureDeviceInput to capture video from the default camera. The controller's view is backed by an AVCaptureVideoPreviewLayer, which renders the live feed and automatically tracks the view's size. Finally, we start the capture session on a background queue, since startRunning() blocks the calling thread.
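One thing the example glosses over: camera access requires user permission. The app must declare an NSCameraUsageDescription entry in its Info.plist, and it is good practice to check the authorization status before starting the session. Here is a minimal sketch; the helper name checkCameraAccess is our own, not part of any Apple API:

import AVFoundation

// Hypothetical helper: confirm camera permission before starting capture.
func checkCameraAccess(completion: @escaping (Bool) -> Void) {
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        // Prompts the user; the prompt text comes from NSCameraUsageDescription.
        AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
    default:
        completion(false) // .denied or .restricted
    }
}

You would call this before startRunning() and only start the session when the completion handler reports true.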
To use this view in SwiftUI, simply include it in your view hierarchy:
struct ContentView: View {
    var body: some View {
        CameraView()
    }
}
This is a basic example; you can build on it with further customization, such as adding controls or processing the video data, as sketched below.
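For instance, to process frames you could attach an AVCaptureVideoDataOutput with a sample-buffer delegate. Below is a minimal sketch of that approach; the FrameHandler class, the addFrameOutput helper, and the queue label are illustrative names of our own, not part of the example above:

import AVFoundation

// Hypothetical delegate that receives each video frame from the session.
final class FrameHandler: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Each frame arrives as a CMSampleBuffer; extract the pixel buffer to work with it.
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        _ = pixelBuffer // feed into Vision, Core Image, etc.
    }
}

// Hypothetical helper: attach a frame output to an existing session.
func addFrameOutput(to session: AVCaptureSession, delegate: FrameHandler) {
    let output = AVCaptureVideoDataOutput()
    // Frames are delivered on this queue; keep heavy work off the main thread.
    output.setSampleBufferDelegate(delegate, queue: DispatchQueue(label: "camera.frames"))
    if session.canAddOutput(output) {
        session.addOutput(output)
    }
}

To be safe, keep a strong reference to the FrameHandler instance yourself for as long as frames should be delivered.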