This guide walks you through implementing the "Make a Call" feature using the AtomicXCore SDK, leveraging the DeviceStore, CallStore, and the core UI component CallCoreView.
Core Features
To build multi-party audio/video calling scenarios with AtomicXCore, you’ll use the following three core modules:
Module | Description |
CallCoreView | Core call UI component. Automatically observes CallStore data and renders video streams, with support for UI customization such as layout switching, avatar and icon configuration. |
CallStore | Manages the call lifecycle: make, answer, reject, and hang up calls. Provides real-time access to participant audio/video status, call duration, call history, and more. |
DeviceStore | Controls audio/video devices: microphone (toggle/on/off, volume), camera (toggle/on/off, switch, quality), screen sharing, and real-time device status monitoring. |
Getting Started
Step 1: Activate the Service
Step 2: Integrate the SDK
1. Add Pod dependency: Add pod 'AtomicXCore' to your project's Podfile.
target 'YourProjectTarget' do
pod 'AtomicXCore'
end
Tips:
If your project doesn’t have a Podfile, navigate to your .xcodeproj directory in the terminal and run pod init to create one.
2. Install the component: In the terminal, go to the directory containing your Podfile and run:
pod install --repo-update
Tips:
After installation, open your project using the YourProjectName.xcworkspace file.
Step 3: Initialize and Log In
To start the call service, initialize CallStore and log in the user in sequence. CallStore automatically syncs user information after a successful login and enters the ready state. The following sample code illustrates the process:
import UIKit
import AtomicXCore
import Combine
class ViewController: UIViewController {
var cancellables = Set<AnyCancellable>()
override func viewDidLoad() {
super.viewDidLoad()
let _ = CallStore.shared // touch the singleton to initialize CallStore
let userID = "test_001"
let sdkAppID: Int = 1400000001
let secretKey = "**************"
let userSig = GenerateTestUserSig.genTestUserSig(
userID: userID,
sdkAppID: sdkAppID,
secretKey: secretKey
)
LoginStore.shared.login(
sdkAppID: sdkAppID,
userID: userID,
userSig: userSig
) { result in
switch result {
case .success:
Log.info("login success")
TUICallEngine.createInstance().`init`(Int32(sdkAppID), userId: userID, userSig: userSig) {
Log.info("TUICallEngine init success")
} fail: { code, message in
Log.error("TUICallEngine init failed, code: \(code), message: \(message ?? "")")
}
case .failure(let error):
Log.error("login failed, code: \(error.code), error: \(error.message)")
}
}
}
}
Parameter | Type | Description |
userID | String | Unique identifier for the current user. Only letters, numbers, hyphens, and underscores are allowed. Avoid using simple IDs like 1 or 123 to prevent multi-device login conflicts. |
sdkAppID | int | The SDKAppID of your audio/video application, created in the console. |
secretKey | String | SDKSecretKey for your audio/video application, created in the console. |
userSig | String | Authentication token for TRTC. |
Implementation Steps
Before making a call, ensure the user is logged in—this is required for the service to function. The following five steps outline how to implement the "Make a Call" feature.
Step 1: Create the Call Interface
You need to create a call screen that is displayed when a call is initiated.
1. Create the call screen: Implement a new UIViewController to serve as the call interface. This will be used for both outgoing and incoming calls.
2. Attach CallCoreView: Add the core call view component to your call screen. CallCoreView automatically observes CallStore data and renders video streams, with options for customizing layout, avatars, and icons.
import UIKit
import AtomicXCore
class CallViewController: UIViewController {
private var callCoreView: CallCoreView?
override func viewDidLoad() {
super.viewDidLoad()
view.backgroundColor = .black
callCoreView = CallCoreView(frame: view.bounds)
callCoreView?.autoresizingMask = [.flexibleWidth, .flexibleHeight]
if let callCoreView = callCoreView {
view.addSubview(callCoreView)
}
}
}
CallCoreView Feature Overview:
Feature | Description | Reference |
Set Layout Mode | Switch between layout modes. If not set, layout adapts automatically based on participant count. | Switch Layout Mode |
Set Avatar | Customize avatars for specific users by providing avatar resource paths. | Customize Default Avatar |
Set Volume Indicator Icon | Set custom volume indicator icons for different volume levels. | Customize Volume Indicator Icon |
Set Network Indicator Icon | Set network status indicator icons based on real-time network quality. | Customize Network Indicator Icon |
Set Waiting Animation for Users | Support GIF animations for users in waiting state during multi-party calls. | |
Step 2: Add Call Control Buttons
DeviceStore controls: microphone (toggle/on/off, volume), camera (toggle/on/off, switch, quality), screen sharing, and real-time device status monitoring. Bind button actions to the corresponding methods, and update button UI in real time by observing device status changes.
CallStore controls: answer, hang up, reject, and other core call actions. Bind button actions to the appropriate methods and observe call status to keep the UI in sync.
Icon resources: Download button icons from GitHub. These icons are designed for TUICallKit and are free to use.
Example: Adding Hang Up, Microphone, and Camera Buttons
1.1 Add Hang Up Button: Create and add a hang up button. When tapped, call hangup and close the call screen.
import UIKit
import AtomicXCore
import Combine
class CallViewController: UIViewController {
private lazy var buttonHangup: UIButton = {
let buttonWidth: CGFloat = 80
let buttonHeight: CGFloat = 80
let spacing: CGFloat = 30
let bottomMargin: CGFloat = 80
let totalWidth = buttonWidth * 3 + spacing * 2
let startX = (view.bounds.width - totalWidth) / 2
let buttonY = view.bounds.height - bottomMargin - buttonHeight
let button = createButton(
frame: CGRect(x: startX + (buttonWidth + spacing) * 2, y: buttonY, width: buttonWidth, height: buttonHeight),
title: "hangup"
)
button.backgroundColor = .systemRed
button.addTarget(self, action: #selector(touchHangupButton), for: .touchUpInside)
return button
}()
override func viewDidLoad() {
super.viewDidLoad()
view.addSubview(buttonHangup)
}
@objc private func touchHangupButton() {
CallStore.shared.hangup(completion: nil)
}
private func createButton(frame: CGRect, title: String) -> UIButton {
let button = UIButton(type: .system)
button.frame = frame
button.setTitle(title, for: .normal)
button.setTitleColor(.white, for: .normal)
button.backgroundColor = UIColor(white: 0.3, alpha: 0.8)
button.layer.cornerRadius = frame.width / 2
button.titleLabel?.font = UIFont.systemFont(ofSize: 14)
return button
}
}
1.2 Add Microphone Button: Create and add a microphone toggle button. When tapped, toggle the local microphone on or off.
import UIKit
import AtomicXCore
import Combine
class CallViewController: UIViewController {
private lazy var buttonMicrophone: UIButton = {
let buttonWidth: CGFloat = 80
let buttonHeight: CGFloat = 80
let spacing: CGFloat = 30
let bottomMargin: CGFloat = 80
let totalWidth = buttonWidth * 3 + spacing * 2
let startX = (view.bounds.width - totalWidth) / 2
let buttonY = view.bounds.height - bottomMargin - buttonHeight
let button = createButton(
frame: CGRect(x: startX + buttonWidth + spacing, y: buttonY, width: buttonWidth, height: buttonHeight),
title: "Microphone"
)
button.addTarget(self, action: #selector(touchMicrophoneButton), for: .touchUpInside)
return button
}()
override func viewDidLoad() {
super.viewDidLoad()
view.addSubview(buttonMicrophone)
}
@objc private func touchMicrophoneButton() {
let microphoneStatus = DeviceStore.shared.state.value.microphoneStatus
if microphoneStatus == .on {
DeviceStore.shared.closeLocalMicrophone()
} else {
DeviceStore.shared.openLocalMicrophone(completion: nil)
}
}
private func createButton(frame: CGRect, title: String) -> UIButton {
let button = UIButton(type: .system)
button.frame = frame
button.setTitle(title, for: .normal)
button.setTitleColor(.white, for: .normal)
button.backgroundColor = UIColor(white: 0.3, alpha: 0.8)
button.layer.cornerRadius = frame.width / 2
button.titleLabel?.font = UIFont.systemFont(ofSize: 14)
return button
}
}
1.3 Add Camera Button: Create and add a camera toggle button. When tapped, toggle the local camera on or off.
import UIKit
import AtomicXCore
import Combine
class CallViewController: UIViewController {
private lazy var buttonCamera: UIButton = {
let buttonWidth: CGFloat = 80
let buttonHeight: CGFloat = 80
let spacing: CGFloat = 30
let bottomMargin: CGFloat = 80
let totalWidth = buttonWidth * 3 + spacing * 2
let startX = (view.bounds.width - totalWidth) / 2
let buttonY = view.bounds.height - bottomMargin - buttonHeight
let button = createButton(
frame: CGRect(x: startX, y: buttonY, width: buttonWidth, height: buttonHeight),
title: "Camera"
)
button.addTarget(self, action: #selector(touchCameraButton), for: .touchUpInside)
return button
}()
override func viewDidLoad() {
super.viewDidLoad()
view.addSubview(buttonCamera)
}
@objc private func touchCameraButton() {
let cameraStatus = DeviceStore.shared.state.value.cameraStatus
if cameraStatus == .on {
DeviceStore.shared.closeLocalCamera()
} else {
let isFront = DeviceStore.shared.state.value.isFrontCamera
DeviceStore.shared.openLocalCamera(isFront: isFront, completion: nil)
}
}
private func createButton(frame: CGRect, title: String) -> UIButton {
let button = UIButton(type: .system)
button.frame = frame
button.setTitle(title, for: .normal)
button.setTitleColor(.white, for: .normal)
button.backgroundColor = UIColor(white: 0.3, alpha: 0.8)
button.layer.cornerRadius = frame.width / 2
button.titleLabel?.font = UIFont.systemFont(ofSize: 14)
return button
}
}
1.4 Update Button Labels in Real Time: Observe microphone and camera status and update button labels accordingly.
import UIKit
import AtomicXCore
import Combine
class CallViewController: UIViewController {
private var cancellables = Set<AnyCancellable>()
override func viewDidLoad() {
super.viewDidLoad()
observeDeviceState()
}
private func observeDeviceState() {
DeviceStore.shared.state.subscribe()
.map { $0.cameraStatus }
.removeDuplicates()
.receive(on: DispatchQueue.main)
.sink { [weak self] cameraStatus in
let title = cameraStatus == .on ? "Turn Off Camera" : "Turn On Camera"
self?.buttonCamera?.setTitle(title, for: .normal)
}
.store(in: &cancellables)
DeviceStore.shared.state.subscribe()
.map { $0.microphoneStatus }
.removeDuplicates()
.receive(on: DispatchQueue.main)
.sink { [weak self] microphoneStatus in
let title = microphoneStatus == .on ? "Turn Off Mic" : "Turn On Mic"
self?.buttonMicrophone?.setTitle(title, for: .normal)
}
.store(in: &cancellables)
}
}
Step 3: Request Microphone/Camera Permissions
Check for audio/video permissions before starting a call. If permissions are missing, prompt the user to grant them.
1. Declare permissions: Add the following keys to your app’s Info.plist with appropriate usage descriptions. These will be shown to users when the system requests permissions:
<key>NSCameraUsageDescription</key>
<string>Camera access is required for video calls and group video calls.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone access is required for audio calls, group audio calls, video calls, and group video calls.</string>
2. Request permissions dynamically: Request audio/video permissions based on the call media type when initiating a call.
import AVFoundation
import UIKit
extension UIViewController {
func checkMicrophonePermission(completion: @escaping (Bool) -> Void) {
let status = AVCaptureDevice.authorizationStatus(for: .audio)
switch status {
case .authorized:
completion(true)
case .notDetermined:
AVCaptureDevice.requestAccess(for: .audio) { granted in
DispatchQueue.main.async {
completion(granted)
}
}
case .denied, .restricted:
completion(false)
@unknown default:
completion(false)
}
}
func checkCameraPermission(completion: @escaping (Bool) -> Void) {
let status = AVCaptureDevice.authorizationStatus(for: .video)
switch status {
case .authorized:
completion(true)
case .notDetermined:
AVCaptureDevice.requestAccess(for: .video) { granted in
DispatchQueue.main.async {
completion(granted)
}
}
case .denied, .restricted:
completion(false)
@unknown default:
completion(false)
}
}
func showPermissionAlert(message: String) {
let alert = UIAlertController(
title: "Permission Required",
message: message,
preferredStyle: .alert
)
alert.addAction(UIAlertAction(title: "Settings", style: .default) { _ in
if let url = URL(string: UIApplication.openSettingsURLString) {
UIApplication.shared.open(url)
}
})
alert.addAction(UIAlertAction(title: "Cancel", style: .cancel))
present(alert, animated: true)
}
}
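Before initiating a call, you can chain the two permission checks above. The following is a minimal sketch, assuming the checkMicrophonePermission/checkCameraPermission extension above and the startCall method from Step 4 (names are illustrative, and the extension is assumed to live in the same file as startCall):

```swift
import UIKit
import AtomicXCore

extension MainViewController {
    // Check the permissions required by the media type, then start the call.
    // Audio calls need only the microphone; video calls need camera access too.
    func startCallWithPermissionCheck(userIdList: [String], mediaType: CallMediaType) {
        checkMicrophonePermission { [weak self] micGranted in
            guard micGranted else {
                self?.showPermissionAlert(message: "Microphone access is required for calls.")
                return
            }
            guard mediaType == .video else {
                self?.startCall(userIdList: userIdList, mediaType: mediaType)
                return
            }
            self?.checkCameraPermission { cameraGranted in
                guard cameraGranted else {
                    self?.showPermissionAlert(message: "Camera access is required for video calls.")
                    return
                }
                self?.startCall(userIdList: userIdList, mediaType: mediaType)
            }
        }
    }
}
```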
Step 4: Make a Call
After the call is initiated, navigate to the call screen. For the best user experience, automatically enable the microphone or camera based on the media type.
1. Initiate the call: Call the calls API to start the call.
2. Enable media devices: After the call is initiated, enable the microphone. If it’s a video call, also enable the camera.
3. Present the call screen: On successful call initiation, present the call screen.
import UIKit
import AtomicXCore
import Combine
class MainViewController: UIViewController {
private func startCall(userIdList: [String], mediaType: CallMediaType) {
var params = CallParams()
params.timeout = 30
CallStore.shared.calls(
participantIds: userIdList,
callMediaType: mediaType,
params: params
) { [weak self] result in
switch result {
case .success:
self?.openDevices(for: mediaType)
DispatchQueue.main.async {
let callVC = CallViewController()
callVC.modalPresentationStyle = .fullScreen
self?.present(callVC, animated: true)
}
case .failure(let error):
Log.error("Failed to initiate call: \(error)")
}
}
}
private func openDevices(for mediaType: CallMediaType) {
DeviceStore.shared.openLocalMicrophone(completion: nil)
if mediaType == .video {
let isFront = DeviceStore.shared.state.value.isFrontCamera
DeviceStore.shared.openLocalCamera(isFront: isFront, completion: nil)
}
}
}
calls API Parameter Reference:
Parameter | Type | Required | Description |
participantIds | List<String> | Yes | A list of target user IDs. |
callMediaType | CallMediaType | Yes | The media type of the call, used to specify whether to initiate an audio or video call. CallMediaType.video: Video call. CallMediaType.audio: Audio call. |
params | CallParams | No | Extended call parameters. roomId (String): Room ID; optional, assigned automatically by the server if not specified. timeout (Int): Call timeout, in seconds. userData (String): Custom user data for application-specific information. chatGroupId (String): Chat group ID, used for group call scenarios. isEphemeralCall (Boolean): Ephemeral call; if set to true, no call history record is generated. |
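As a concrete illustration of the optional fields above, the sketch below fills in several CallParams fields before initiating an audio call. The field names are taken from the table; treat them as assumptions to verify against the actual CallParams definition in your SDK version:

```swift
import AtomicXCore

var params = CallParams()
params.timeout = 45                                 // give callees 45 seconds to answer
params.userData = "{\"source\":\"contact_list\"}"   // app-specific payload
params.isEphemeralCall = true                       // do not record this call in history
// roomId is omitted so the server assigns one automatically

CallStore.shared.calls(
    participantIds: ["user_A", "user_B"],
    callMediaType: .audio,
    params: params,
    completion: nil
)
```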
Step 5: End the Call
When you call hangup or the remote party ends the call, the onCallEnded event is triggered. Listen for this event and close the call screen when the call ends.
1. Listen for the call end event: Observe the onCallEnded event.
2. Close the call screen: When onCallEnded is triggered, dismiss the call screen.
import UIKit
import AtomicXCore
import Combine
class CallViewController: UIViewController {
private var cancellables = Set<AnyCancellable>()
override func viewDidLoad() {
super.viewDidLoad()
addListener()
}
private func addListener() {
CallStore.shared.callEventPublisher
.receive(on: DispatchQueue.main)
.sink { [weak self] event in
if case .onCallEnded = event {
self?.dismiss(animated: true)
}
}
.store(in: &cancellables)
}
}
onCallEnded Event Parameters:
Params | Type | Description |
callId | String | Unique ID for this call. |
mediaType | CallMediaType | The media type of the call. CallMediaType.video: Video call. CallMediaType.audio: Audio call. |
reason | | The reason why the call ended. unknown: Unable to determine the reason for termination. hangup: Normal termination; a user actively ended the call. reject: The callee declined the incoming call. noResponse: The callee did not answer within the timeout period. offline: The callee is currently offline. lineBusy: The callee is already in another call. canceled: The caller canceled the call before it was answered. otherDeviceAccepted: The call was answered on another logged-in device. otherDeviceReject: The call was declined on another logged-in device. endByServer: The call was forced to end by the server. |
userId | String | The User ID of the person who triggered the call termination. |
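To surface a human-readable message when a call ends, you can map the reason to a string. This is a sketch: the case names come from the table above, but the enum's type name (CallEndReason here) is not shown in this guide and is an assumption:

```swift
import AtomicXCore

// Map a call-end reason to a user-facing message.
// The CallEndReason type name is assumed; the cases mirror the table above.
func endMessage(for reason: CallEndReason) -> String {
    switch reason {
    case .hangup:              return "Call ended"
    case .reject:              return "The callee declined the call"
    case .noResponse:          return "No answer"
    case .offline:             return "The callee is offline"
    case .lineBusy:            return "The callee is busy"
    case .canceled:            return "Call canceled"
    case .otherDeviceAccepted: return "Answered on another device"
    case .otherDeviceReject:   return "Declined on another device"
    case .endByServer:         return "Call ended by the server"
    default:                   return "Call ended"
    }
}
```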
Demo
After completing these five steps, the "Make a Call" feature will look like this:
Customization
CallCoreView provides extensive UI customization, including support for custom avatars and volume indicator icons. To speed up your integration, you can download ready-to-use icons from GitHub. All icons are designed for TUICallKit and are free to use.
Customizing Volume Indicator Icons
Use the CallCoreView setVolumeLevelIcons method to assign different icons for each volume level.
let volumeLevelIcons: [VolumeLevel: String] = [
.mute: "Path to the corresponding icon resource"
]
callCoreView.setVolumeLevelIcons(icons: volumeLevelIcons)
setVolumeLevelIcons API Parameters:
Parameter | Type | Required | Description |
icons | [VolumeLevel: String] | Yes | A mapping table of volume levels to icon resources. Key (VolumeLevel) represents the volume intensity level: VolumeLevel.mute: Microphone is off or muted. VolumeLevel.low: Volume range (0, 25]. VolumeLevel.medium: Volume range (25, 50]. VolumeLevel.high: Volume range (50, 75]. VolumeLevel.peak: Volume range (75, 100]. Value (String) is the resource path or name of the icon corresponding to the volume level. |
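A fuller mapping covering every volume level from the table might look like the sketch below; the icon names are placeholders you would replace with your own bundled resources:

```swift
import AtomicXCore

let volumeLevelIcons: [VolumeLevel: String] = [
    .mute:   "icon_volume_mute",   // microphone off or muted
    .low:    "icon_volume_low",    // volume in (0, 25]
    .medium: "icon_volume_medium", // volume in (25, 50]
    .high:   "icon_volume_high",   // volume in (50, 75]
    .peak:   "icon_volume_peak"    // volume in (75, 100]
]
callCoreView.setVolumeLevelIcons(icons: volumeLevelIcons)
```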
Icon Configuration Guide:
Icons | Description | Download Links |
| Volume Indicator Icon. You can set this icon for VolumeLevel.low or VolumeLevel.medium. It will be displayed when the user's volume exceeds the specified level. | |
| Volume Indicator Icon. You can set this icon for VolumeLevel.mute. It will be displayed when the user is currently muted. | |
Customizing Network Indicator Icons
Use the CallCoreView setNetworkQualityIcons method to assign icons for different network quality levels.
let networkQualityIcons: [NetworkQuality: String] = [
.bad: "Path to the corresponding icon"
]
callCoreView.setNetworkQualityIcons(icons: networkQualityIcons)
setNetworkQualityIcons API Parameters:
Parameter | Type | Required | Description |
icons | [NetworkQuality: String] | Yes | Network quality icon mapping table. Key (NetworkQuality) represents the network status: NetworkQuality.unknown: Network status is undetermined. NetworkQuality.excellent: Outstanding network connection. NetworkQuality.good: Stable and good network connection. NetworkQuality.poor: Weak network signal. NetworkQuality.bad: Very weak or unstable network. NetworkQuality.veryBad: Extremely poor network, near disconnection. NetworkQuality.down: Network is disconnected. Value (String) is the absolute path or resource name of the icon corresponding to the network status. |
Network Warning Icon:
Icons | Description | Download Links |
| Poor Network Indicator. You can set this icon for NetworkQuality.bad, NetworkQuality.veryBad, or NetworkQuality.down. It will be displayed when the network quality is poor. | |
Customizing Default Avatars
Use the CallCoreView setParticipantAvatars API to set user avatars. Listen to the allParticipants reactive data: when you have a user's avatar, set and display it; if not, show the default avatar.
var avatars: [String: String] = [:]
let userId = ""
let avatarPath = ""
avatars[userId] = avatarPath
callCoreView.setParticipantAvatars(avatars: avatars)
setParticipantAvatars API Parameters:
Parameter | Type | Required | Description |
avatars | [String: String] | Yes | User Avatar Mapping Table. The dictionary structure is described as follows: Key : The userID of the user. Value : The absolute path to the user's avatar resource. |
Default Avatar:
Icons | Description | Download Links |
| Default Profile Picture. You can set this as the default avatar for a user when their profile image fails to load or if no avatar is provided. | |
Customizing Waiting Animations
Use the CallCoreView setWaitingAnimation API to set a waiting animation for users who are waiting to answer.
let waitingAnimationPath = ""
callCoreView.setWaitingAnimation(path: waitingAnimationPath)
setWaitingAnimation API Parameters:
Parameter | Type | Required | Description |
path | String | Yes | Absolute path to a GIF format image resource. |
User Waiting Animation:
Icons | Description | Download Links |
| Waiting animation for group calls. Once configured, this animation is displayed when the user's status is "Waiting to Answer" (Pending). | |
Adding a Call Duration Indicator
To display the call duration in real time, subscribe to the activeCall duration field.
1. Subscribe to the data layer: Observe activeCall in the CallStore state for changes.
2. Bind the duration to your UI: The activeCall.duration field is reactive and will automatically update your UI.
import UIKit
import AtomicXCore
import Combine
class TimerView: UILabel {
private var cancellables = Set<AnyCancellable>()
override init(frame: CGRect) {
super.init(frame: frame)
setupView()
}
required init?(coder: NSCoder) {
super.init(coder: coder)
setupView()
}
private func setupView() {
textColor = .white
textAlignment = .center
font = .systemFont(ofSize: 16)
}
override func didMoveToWindow() {
super.didMoveToWindow()
if window != nil {
registerActiveCallObserver()
} else {
cancellables.removeAll()
}
}
private func registerActiveCallObserver() {
CallStore.shared.state.subscribe()
.map { $0.activeCall }
.removeDuplicates { $0.duration == $1.duration }
.receive(on: DispatchQueue.main)
.sink { [weak self] activeCall in
self?.updateDurationView(activeCall: activeCall)
}
.store(in: &cancellables)
}
private func updateDurationView(activeCall: CallInfo) {
let currentDuration = activeCall.duration
let minutes = currentDuration / 60
let seconds = currentDuration % 60
text = String(format: "%02d:%02d", minutes, seconds)
}
}
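The mm:ss format in updateDurationView becomes awkward once a call runs past an hour. A small pure helper (a sketch, not part of the SDK) that switches to h:mm:ss after sixty minutes:

```swift
import Foundation

// Format a duration in seconds as "mm:ss", or "h:mm:ss" once it reaches an hour.
func formatCallDuration(_ seconds: Int) -> String {
    let hours = seconds / 3600
    let minutes = (seconds % 3600) / 60
    let secs = seconds % 60
    if hours > 0 {
        return String(format: "%d:%02d:%02d", hours, minutes, secs)
    }
    return String(format: "%02d:%02d", minutes, secs)
}
```

For example, formatCallDuration(75) produces "01:15" and formatCallDuration(3661) produces "1:01:01".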
Note:
For more reactive call status data, see CallState.
More Features
Customizing User Avatar and Nickname
Before a call starts, set your own nickname and avatar using setSelfInfo.
var userProfile = UserProfile()
userProfile.userID = ""
userProfile.avatarURL = ""
userProfile.nickname = ""
LoginStore.shared.setSelfInfo(userProfile: userProfile) { result in
switch result {
case .success:
break // profile updated successfully
case .failure(let error):
Log.error("setSelfInfo failed: \(error)")
}
}
setSelfInfo API Parameters:
Parameter | Type | Required | Description |
userProfile | UserProfile | Yes | User info struct: userID: user ID. avatarURL: user avatar URL. nickname: user nickname. For more field details, refer to the UserProfile class. |
Switching Layout Modes
CallCoreView supports three built-in layout modes. Use setLayoutTemplate to set the layout. If not set, CallCoreView automatically uses Float mode for 1-on-1 calls and Grid mode for multi-party calls.
Float Mode | Grid Mode | PiP Mode |
Layout: While waiting, display your own video full screen. After answering, show the remote video full screen and your own video as a floating window. Interaction: Drag the small window or tap to swap the big and small videos. | Layout: All participant videos are tiled in a grid. Best for 2+ participants. Interaction: Tap a participant to enlarge their video. | Layout: In 1v1 calls, the remote video is fixed; in multi-party calls, the active speaker is shown full screen. Interaction: Shows your own video while waiting, and displays the call timer after answering. |
setLayoutTemplate API signature:
func setLayoutTemplate(_ template: CallLayoutTemplate)
Sample code:
callCoreView.setLayoutTemplate(.grid)
setLayoutTemplate API Parameters:
Parameter | Type | Description |
template | CallLayoutTemplate | CallCoreView's layout mode. CallLayoutTemplate.float: While waiting, your own video is shown full screen; after answering, the remote video is full screen with your own video as a draggable floating window. CallLayoutTemplate.grid: All participant videos are tiled in a grid; tap a participant to enlarge their video. CallLayoutTemplate.pip: In 1v1 calls, the remote video is fixed; in multi-party calls, the active speaker is shown full screen. |
Setting Default Call Timeout
When making a call using calls, set the timeout field in CallParams to specify the call invitation timeout.
var callParams = CallParams()
callParams.timeout = 30
CallStore.shared.calls(
participantIds: userIdList,
callMediaType: .video,
params: callParams,
completion: nil
)
Parameter | Type | Required | Description |
participantIds | List<String> | Yes | A list of user IDs for the target participants. |
callMediaType | CallMediaType | Yes | The media type of the call. CallMediaType.video: Video call. CallMediaType.audio: Audio call. |
params | CallParams | No | Extended call parameters. roomId (String): Room ID; optional, assigned automatically by the server if not specified. timeout (Int): Call timeout, in seconds. userData (String): Custom user data for app-specific logic. chatGroupId (String): Chat group ID, used for group call scenarios. isEphemeralCall (Boolean): Ephemeral call; if set to true, no call history record is generated. |
Implementing In-App Floating Window
The AtomicXCore SDK provides the CallPipView component to enable in-app floating windows. When the call interface is covered by another screen (e.g., the user navigates away but the call is ongoing), a floating window displays call status and lets users quickly return to the call.
Step 1: Create the Floating Window Controller.
import UIKit
import AtomicXCore
import Combine
class FloatWindowViewController: UIViewController {
var tapGestureAction: (() -> Void)?
private var cancellables = Set<AnyCancellable>()
private lazy var callCoreView: CallCoreView = {
let view = CallCoreView(frame: self.view.bounds)
view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
view.setLayoutTemplate(.pip)
view.isUserInteractionEnabled = false
return view
}()
override func viewDidLoad() {
super.viewDidLoad()
view.backgroundColor = UIColor(white: 0.1, alpha: 1.0)
view.layer.cornerRadius = 10
view.layer.masksToBounds = true
view.addSubview(callCoreView)
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(handleTap))
view.addGestureRecognizer(tapGesture)
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
self?.observeCallStatus()
}
}
@objc private func handleTap() {
tapGestureAction?()
}
private func observeCallStatus() {
CallStore.shared.state
.subscribe(StatePublisherSelector<CallState, CallParticipantStatus>(keyPath: \.selfInfo.status))
.removeDuplicates()
.receive(on: DispatchQueue.main)
.sink { [weak self] status in
if status == .none {
NotificationCenter.default.post(name: NSNotification.Name("HideFloatingWindow"), object: nil)
}
}
.store(in: &cancellables)
}
deinit {
cancellables.removeAll()
}
}
Step 2: Implement floating window management logic in the main interface.
import UIKit
import AtomicXCore
class MainViewController: UIViewController {
private var floatWindow: UIWindow?
override func viewDidLoad() {
super.viewDidLoad()
NotificationCenter.default.addObserver(
self,
selector: #selector(showFloatingWindow),
name: NSNotification.Name("ShowFloatingWindow"),
object: nil
)
NotificationCenter.default.addObserver(
self,
selector: #selector(hideFloatingWindow),
name: NSNotification.Name("HideFloatingWindow"),
object: nil
)
}
@objc private func showFloatingWindow() {
let selfStatus = CallStore.shared.state.value.selfInfo.status
guard selfStatus == .accept else {
return
}
guard floatWindow == nil else { return }
guard let windowScene = UIApplication.shared.connectedScenes.first as? UIWindowScene else {
return
}
let pipWidth: CGFloat = 100
let pipHeight: CGFloat = pipWidth * 16 / 9
let pipX = UIScreen.main.bounds.width - pipWidth - 20
let pipY: CGFloat = 100
let window = UIWindow(windowScene: windowScene)
window.windowLevel = .alert + 1
window.backgroundColor = .clear
window.frame = CGRect(x: pipX, y: pipY, width: pipWidth, height: pipHeight)
let floatVC = FloatWindowViewController()
floatVC.tapGestureAction = { [weak self] in
self?.openCallViewController()
}
window.rootViewController = floatVC
self.floatWindow = window
window.isHidden = false
window.makeKeyAndVisible()
if let mainWindow = windowScene.windows.first(where: { $0 != window }) {
mainWindow.makeKey()
}
}
@objc private func hideFloatingWindow() {
floatWindow?.isHidden = true
floatWindow = nil
}
private func openCallViewController() {
hideFloatingWindow()
guard let topVC = getTopViewController() else {
return
}
let callVC = CallViewController()
callVC.modalPresentationStyle = .fullScreen
topVC.present(callVC, animated: true)
}
private func getTopViewController() -> UIViewController? {
guard let windowScene = UIApplication.shared.connectedScenes.first as? UIWindowScene,
let keyWindow = windowScene.windows.first(where: { $0.isKeyWindow }),
let rootVC = keyWindow.rootViewController else {
return nil
}
var topVC = rootVC
while let presentedVC = topVC.presentedViewController {
topVC = presentedVC
}
return topVC
}
deinit {
NotificationCenter.default.removeObserver(self)
}
}
Step 3: Add the floating window trigger logic to the Call Interface.
import UIKit
import AtomicXCore
class CallViewController: UIViewController {
override func viewWillAppear(_ animated: Bool) {
super.viewWillAppear(animated)
NotificationCenter.default.post(name: NSNotification.Name("HideFloatingWindow"), object: nil)
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
let selfStatus = CallStore.shared.state.value.selfInfo.status
if selfStatus == .accept {
NotificationCenter.default.post(name: NSNotification.Name("ShowFloatingWindow"), object: nil)
}
}
}
Enabling System Picture-in-Picture (PiP) Outside the App
AtomicXCore SDK supports system-level PiP via the underlying TRTC engine. When your app goes to the background, the call video can float above other apps as a system PiP window, so users can continue their video call while multitasking.
Note:
1. In Xcode, add Background Modes under Signing & Capabilities and enable Audio, AirPlay, and Picture in Picture.
2. Requires iOS 15.0 or later.
1. Configure PiP Parameters: You need to set parameters such as the fill mode for the PiP window, user video regions, and canvas configurations.
import Foundation
import AtomicXCore
enum PictureInPictureFillMode: Int, Codable {
case fill = 0
case fit = 1
}
struct PictureInPictureRegion: Codable {
let userId: String
let width: Double
let height: Double
let x: Double
let y: Double
let fillMode: PictureInPictureFillMode
let streamType: String
let backgroundColor: String
}
struct PictureInPictureCanvas: Codable {
let width: Int
let height: Int
let backgroundColor: String
}
struct PictureInPictureParams: Codable {
let enable: Bool
let cameraBackgroundCapture: Bool?
let canvas: PictureInPictureCanvas?
let regions: [PictureInPictureRegion]?
}
struct PictureInPictureRequest: Codable {
let api: String
let params: PictureInPictureParams
}
2. Enable Picture-in-Picture: You can enable or disable the PiP feature using the configPictureInPicture method.
let params = PictureInPictureParams(
enable: true,
cameraBackgroundCapture: true,
canvas: nil,
regions: nil
)
let request = PictureInPictureRequest(
api: "configPictureInPicture",
params: params
)
let encoder = JSONEncoder()
if let data = try? encoder.encode(request),
let jsonString = String(data: data, encoding: .utf8) {
TUICallEngine.createInstance().callExperimentalAPI(jsonObject: jsonString)
}
Keeping the Screen Awake During Calls
To prevent the screen from dimming or locking during a call, set UIApplication.shared.isIdleTimerDisabled = true when the call starts, and restore it when the call ends.
class CallViewController: UIViewController {
override func viewDidLoad() {
super.viewDidLoad()
UIApplication.shared.isIdleTimerDisabled = true
}
override func viewWillDisappear(_ animated: Bool) {
super.viewWillDisappear(animated)
UIApplication.shared.isIdleTimerDisabled = false
}
}
Playing a Ringtone While Waiting for Answer
Listen to your own call status to play a ringtone while waiting for an answer, and stop the ringtone when the call is answered or ends.
import Combine
private var cancellables = Set<AnyCancellable>()
private func observeSelfCallStatus() {
CallStore.shared.state.subscribe()
.map { $0.selfInfo.status }
.removeDuplicates()
.receive(on: DispatchQueue.main)
.sink { [weak self] status in
if status == .accept || status == .none {
// Call answered or ended: stop the ringtone here.
return
}
if status == .waiting {
// Waiting for the callee to answer: start playing the ringtone here.
}
}
.store(in: &cancellables)
}
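The status branches above are where ringtone playback goes. Below is a sketch using AVAudioPlayer, assuming a ringtone.mp3 file bundled with the app (the file name is illustrative):

```swift
import AVFoundation

final class RingtonePlayer {
    private var player: AVAudioPlayer?

    // Start looping the bundled ringtone (assumed to ship as ringtone.mp3).
    func start() {
        guard let url = Bundle.main.url(forResource: "ringtone", withExtension: "mp3") else { return }
        player = try? AVAudioPlayer(contentsOf: url)
        player?.numberOfLoops = -1 // loop until explicitly stopped
        player?.play()
    }

    // Stop and release the player when the call is answered or ends.
    func stop() {
        player?.stop()
        player = nil
    }
}
```

Call start() when the self status becomes .waiting, and stop() when it changes to .accept or .none.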
Enabling Background Audio/Video Capture
To allow your app to capture audio and video while in the background (e.g., when the user locks the screen or switches apps), configure iOS background mode permissions and set up the audio session.
Configuration Steps:
1. In Xcode, select your project Target → Signing & Capabilities.
2. Click + Capability.
3. Add Background Modes.
4. Enable:
Audio, AirPlay, and Picture in Picture (for audio capture and PiP)
Voice over IP (for VoIP calls)
Remote notifications (optional, for offline push)
Your Info.plist will then include:
<key>UIBackgroundModes</key>
<array>
<string>audio</string>
<string>voip</string>
<string>remote-notification</string>
</array>
Configure Audio Session (AVAudioSession):
Set up the audio session before the call starts, ideally in the call interface’s viewDidLoad or before making/answering a call.
import AVFoundation
private func setupAudioSession() {
let audioSession = AVAudioSession.sharedInstance()
do {
try audioSession.setCategory(.playAndRecord, options: [.allowBluetooth, .allowBluetoothA2DP])
try audioSession.setActive(true)
} catch {
print("Failed to configure Audio Session: \(error)")
}
}
Special Handling for Ringtone Playback (Optional):
To play a ringtone through the speaker while waiting for an answer, temporarily switch the audio session to .playback mode.
private func setAudioSessionForRingtone() {
let audioSession = AVAudioSession.sharedInstance()
do {
try audioSession.setCategory(.playback)
try audioSession.setActive(true)
} catch {
print("Ringtone audio session error: \(error)")
}
}
private func restoreAudioSessionForCall() {
let audioSession = AVAudioSession.sharedInstance()
do {
try audioSession.setCategory(.playAndRecord, options: [.allowBluetooth, .allowBluetoothA2DP])
try audioSession.setActive(true)
} catch {
print("Failed to restore call audio session: \(error)")
}
}
Next Steps
Congratulations! You’ve completed the "Make a Call" feature. Next, see Answer Your First Call to implement the answering functionality.
FAQs
If the callee is offline and comes online within the call invitation timeout, will they receive the incoming call event?
For one-on-one calls, if the callee comes online within the timeout, they will receive an incoming call invitation. For group calls, if the callee comes online within the timeout, up to 20 pending group messages will be retrieved. If there is a call invitation, the incoming call event will be triggered.