How to use broadcast sharing in iOS

Screen sharing in iOS-based clients comes in two types: in-app and outside the application (the whole device screen, including menus, widgets, and everything else). While the first type works similarly to other SDKs, screen sharing outside the application requires Broadcast sharing. This article explains how to use it.

Please note that Broadcast sharing is available only in conference mode due to the way frames are obtained in iOS: they are captured in a separate operating system process.

What Broadcast in iOS is

Sharing a device's screen outside the application in iOS is implemented via the Broadcast Upload app extension, available in iOS 11.0+. It can be launched in two ways: via the Control Center or via an RPSystemBroadcastPickerView button integrated into your iOS application.

Once launched, broadcast sharing sends either the whole device screen or only the selected application. Both modes use the H.264 codec.

For more details on how these two broadcast sharing modes compare, check this article.

Start Sharing in a Conference

Let's imagine there's a conference and one of its participants wants to share their device screen. We're going to use an additional, second call for this; see the example in our GitHub repo.

H.264 + calls = ❤️

Since this approach works only with the H.264 codec, use this codec in conference calls.

According to Apple's developer guidelines, such functionality should be implemented as an application extension. To do so, add a new target to your Xcode project by following this path: File -> New -> Target -> Broadcast Upload Extension

Click Next and a form will appear. Enter a name for the new target and unselect the Include UI Extension checkbox, as you won't need it.

After the BroadcastUploadExtension is added to your project, it can be launched in two ways (as mentioned before):

  • Via the Control Center (iOS 11+)
  • In the app, via the RPSystemBroadcastPickerView button (iOS 12+); see the sketch after this list
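
For the in-app option, a minimal sketch of embedding the picker could look like this (the frame size and the extension bundle identifier are assumptions to adapt to your project):

```swift
import ReplayKit
import UIKit

// Somewhere in your view controller (iOS 12+).
let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
// Hypothetical bundle ID; point it at your Broadcast Upload Extension so the
// picker doesn't list every installed extension.
picker.preferredExtension = "com.example.app.BroadcastUploadExtension"
picker.showsMicrophoneButton = false // audio isn't sent in this approach
view.addSubview(picker)
```
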
Control Center Requirements

To make launching from the Control Center possible, first add the Screen Recording control by going to Settings -> Control Center -> Customize Controls -> Screen Recording. To launch the extension, tap and hold Screen Recording in the Control Center and select BroadcastUploadExtension.

After the user gives permission for broadcasting, a separate process (the app extension) launches, and control is passed to the [RPBroadcastSampleHandler broadcastStartedWithSetupInfo:(nullable NSDictionary <NSString *, NSObject *> *)setupInfo]; method.
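
In Swift, the entry point is the broadcastStarted(withSetupInfo:) override; a minimal sketch:

```swift
import ReplayKit

class SampleHandler: RPBroadcastSampleHandler {
    // Runs in the extension process, not in your main app.
    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // Connect to the Voximplant cloud and log the user in here (see below).
    }
}
```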

Next, your application needs to connect to the Voximplant cloud and authorize the user; token-based authorization is preferred.

You can pass the credentials and the room ID from your app to the app extension via, e.g., Keychain or UserDefaults. Bear in mind that this requires setting up an App Group.
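
With UserDefaults, the handover could look like the following sketch (the App Group identifier and keys are hypothetical):

```swift
import Foundation

// "group.com.example.app" is a hypothetical App Group identifier.
let appGroup = "group.com.example.app"

// Main app: store the credentials before the broadcast starts.
// accessToken and roomId are assumed to be obtained by your app earlier.
let shared = UserDefaults(suiteName: appGroup)
shared?.set(accessToken, forKey: "accessToken")
shared?.set(roomId, forKey: "roomId")

// Broadcast extension: read the values back before connecting to the cloud.
let token = UserDefaults(suiteName: appGroup)?.string(forKey: "accessToken")
```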

clientSessionDidConnect
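
A sketch of the connect-and-login flow in the extension, assuming the Voximplant iOS SDK (the module name may differ between SDK versions, and user/token are assumptions read from the shared App Group storage):

```swift
import VoxImplantSDK // the module name may differ between SDK versions

extension SampleHandler: VIClientSessionDelegate {
    // Called once client.connect() succeeds.
    func clientSessionDidConnect(_ client: VIClient) {
        // `user` and `token` are assumptions read from the App Group storage.
        client.login(withUser: user, token: token,
                     success: { _, _ in
                         // Create and start the conference call (see Call Settings below).
                     },
                     failure: { error in
                         self.finishBroadcastWithError(error)
                     })
    }

    func clientSessionDidDisconnect(_ client: VIClient) {}

    func client(_ client: VIClient, sessionDidFailConnectWithError error: Error) {}
}
```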

Now the application should create a new call, connect it to the conference, and start sending frames to the conference via this call. The screen frames are processed with the VICustomVideoSource class.

There are nuances due to the RAM usage restrictions for app extensions (no more than 50 MB):

  • Receiving of audio and video should be turned off.
  • Audio sending should be turned off as well.
  • The codec should be H.264.

Call Settings
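
A sketch of call settings that satisfies these restrictions, assuming the Voximplant iOS SDK (the conference name is a hypothetical placeholder, and client is your connected VIClient instance):

```swift
// Configure the additional screen-sharing call inside the extension.
let settings = VICallSettings()
settings.preferredVideoCodec = .H264 // mandatory for this approach
// Send video only; receiving video is turned off to save memory.
settings.videoFlags = VIVideoFlags.videoFlags(receiveVideo: false, sendVideo: true)

let call = client.callConference("myConference", settings: settings) // hypothetical name
call?.sendAudio = false // audio sending stays off as well
```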

To pass the frames from the screen, you need to implement a custom video source initialized with the initScreenCastFormat initializer.
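
For example (the exact Swift spelling of the Objective-C initScreenCastFormat initializer may vary between SDK versions, so check the generated interface):

```swift
// Create a screencast-format video source and attach it to the call.
let screenSource = VICustomVideoSource(screenCastFormat: ())
call?.videoSource = screenSource
```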

The broadcast extension receives the frames through the RPBroadcastSampleHandler.processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) method in Swift and the [RPBroadcastSampleHandler processSampleBuffer:(CMSampleBufferRef)sampleBuffer withType:(RPSampleBufferType)sampleBufferType]; method in Objective-C.

There are three types of frames: .video, .audioApp, and .audioMic. Only .video frames must be passed to the custom video source using the sendVideoFrame:rotation: API; the rotation should be obtained from the frame itself.

Received frames processing
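
A sketch of the handler, assuming screenSource is the custom video source created earlier; the orientation-to-rotation mapping via ReplayKit's RPVideoSampleOrientationKey attachment is an assumption to verify against your SDK version:

```swift
import ReplayKit
import ImageIO

// Inside your SampleHandler; screenSource is assumed to be a stored property.
override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                  with sampleBufferType: RPSampleBufferType) {
    switch sampleBufferType {
    case .video:
        // Read the frame orientation (iOS 11+) and map it to the SDK's rotation type.
        var rotation = VIRotation._0
        if let orientation = CMGetAttachment(sampleBuffer,
                                             key: RPVideoSampleOrientationKey as CFString,
                                             attachmentModeOut: nil) as? NSNumber,
           let cgOrientation = CGImagePropertyOrientation(rawValue: orientation.uint32Value) {
            switch cgOrientation {
            case .left:  rotation = ._90
            case .down:  rotation = ._180
            case .right: rotation = ._270
            default:     rotation = ._0
            }
        }
        screenSource.sendVideoFrame(sampleBuffer, rotation: rotation)
    case .audioApp, .audioMic:
        break // audio is not sent in this approach
    @unknown default:
        break
    }
}
```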

Synchronizing call statuses

With this approach (conference mode with two calls), you will need to synchronize the statuses of the two calls (main and additional). For example, you'll have to pass the information that one call has ended to the other, across operating system processes. To do this, you can use the CFNotificationCenterGetDarwinNotifyCenter API.
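
A sketch of that cross-process signaling (the notification name is hypothetical):

```swift
import Foundation

let center = CFNotificationCenterGetDarwinNotifyCenter()

// Sender (e.g., the main app when its call ends):
CFNotificationCenterPostNotification(center,
                                     CFNotificationName("com.example.app.callEnded" as CFString),
                                     nil, nil, true)

// Receiver (e.g., the broadcast extension):
CFNotificationCenterAddObserver(center, nil, { _, _, _, _, _ in
    // Hang up the screen-sharing call and finish the broadcast here.
    // Note: the callback is a C function pointer, so it cannot capture context.
}, "com.example.app.callEnded" as CFString, nil, .deliverImmediately)
```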

Known Issues

  1. RPSystemBroadcastPickerView may not display or work when added in Xcode's Storyboard. It's a bug. If you want to add this widget to the Storyboard, you have to initialize it in code first: let _ = RPSystemBroadcastPickerView().

  2. You may encounter this kind of problem:

App extension process failed with Error: Error code: Thread 7: EXC_RESOURCE RESOURCE_TYPE_MEMORY (limit = 50 MB, unused = 0x0)

This most likely means you are using the VP8 codec in your call. Switch to H.264 to solve the problem: VICallSettings.preferredVideoCodec = .H264.