Add a Camera Feed To SwiftUI

Daily Coding Tip 011

If you want to see what the camera sees, you first need the user's permission to use the camera. This requires a valid reason, and your app may be rejected from the App Store if the reason isn't convincing. Right-click on the Info.plist file in your Xcode project and select Open As > Source Code. This is the easiest way to copy and paste the following right before the closing </dict> and </plist> tags:

<key>NSCameraUsageDescription</key>
<string>The camera is needed in order to see what is around you</string>

This isn't a great explanation of why we need the camera, as I don't know what your app is going to be doing, but it nevertheless stops the app from crashing when it requests camera access.

Now that the Info.plist is out of the way, we need to create a UIView that will show our camera feed.

import UIKit
import AVFoundation

/// A UIView whose backing layer is an AVCaptureVideoPreviewLayer,
/// so the camera feed can be drawn directly onto it.
class PreviewView: UIView {

    var videoPreviewLayer: AVCaptureVideoPreviewLayer {
        guard let layer = layer as? AVCaptureVideoPreviewLayer else {
            fatalError("Could not get layer")
        }
        layer.videoGravity = .resizeAspectFill
        layer.connection?.videoOrientation = .portrait
        return layer
    }

    // Make AVCaptureVideoPreviewLayer the backing layer of this view.
    override class var layerClass: AnyClass {
        return AVCaptureVideoPreviewLayer.self
    }
}

extension AVCaptureSession {
    /// Creates a session with the given camera type attached as its video input,
    /// then starts it running.
    convenience init(device: AVCaptureDevice.DeviceType) {
        self.init()
        beginConfiguration()
        guard
            let videoDevice = AVCaptureDevice.default(device,
                                                      for: .video,
                                                      position: .back),
            let videoDeviceInput = try? AVCaptureDeviceInput(device: videoDevice),
            canAddInput(videoDeviceInput)
        else { return }
        addInput(videoDeviceInput)
        commitConfiguration()
        startRunning()
    }
}

SwiftUI does not include camera functionality, so we are relying on UIKit to do the work. The process is relatively simple: we create a UIView backed by an AVCaptureVideoPreviewLayer, onto which the camera feed will be drawn. I've also added a convenience initialiser for AVCaptureSession, which does the work of getting a feed from whichever camera we select later.

Finally, let's wrap this for SwiftUI so that we can add it to our app.

import SwiftUI
import AVFoundation

struct CameraView: UIViewRepresentable {

    let preview = PreviewView()

    func makeUIView(context: Context) -> PreviewView {
        // Ask for camera permission; the first call presents the system alert
        // using the NSCameraUsageDescription string from Info.plist.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            if granted {
                DispatchQueue.main.async {
                    // Attach a running session fed by the back wide-angle camera.
                    preview.videoPreviewLayer.session = AVCaptureSession(device: .builtInWideAngleCamera)
                }
            }
        }
        return preview
    }

    func updateUIView(_ uiView: PreviewView, context: Context) {}
}

We’re requesting access to the camera and, if we are given permission, creating an AVCaptureSession with the convenience initialiser we made earlier.

This can now be used anywhere in SwiftUI, simply by creating it as CameraView().
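
As a quick sketch of how that looks in practice, a screen that shows the feed edge-to-edge could be as simple as the following (ContentView is just an illustrative name for whatever view you put it in):

import SwiftUI

struct ContentView: View {
    var body: some View {
        // The wrapped camera view expands to fill the space SwiftUI proposes,
        // so ignoring the safe area gives an edge-to-edge feed.
        CameraView()
            .edgesIgnoringSafeArea(.all)
    }
}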


Get more Daily Coding Tips in your inbox!