Implementing WebRTC in iOS Apps
Posted By : Sumit Chahar | 27-Dec-2018
WebRTC is an open-source project that brings real-time communication to web browsers and mobile applications. It is an exciting and powerful technology that exposes free APIs usable from both desktop and mobile browsers, and it is supported by all major modern browser vendors. Functionality that previously required external plugins is now offered natively by WebRTC.
WebRTC uses various standards and protocols, some of which we will discuss in this article.
Oodles Technologies is a renowned WebRTC Application Development company in India. We offer holistic WebRTC integration solutions for a wide range of applications and use cases.
Key Features of WebRTC
- Peer-to-Peer Conferencing
- Video Calling
- Voice Calling
- Peer-to-Peer File Transfers
- Chat Support
- Desktop Sharing
Technical Highlights of WebRTC
Let's have a look at what is expected technically from a WebRTC application:
- Stream audio, video, and other data
- Gather network information such as IP addresses and ports, and exchange it with other WebRTC clients (peers) so that connections can be made even through NATs and firewalls
- Coordinate signaling communication to report errors and to initiate or close sessions
- Communicate with streaming audio, video, or data
- Exchange information about media and client capabilities, such as resolution and codecs
Oodles Technologies implements the above-listed functions using the main WebRTC APIs:
- MediaStream (aka getUserMedia)
- RTCPeerConnection
- RTCDataChannel
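To give a feel for the second of these APIs, here is a minimal Objective-C sketch of creating an RTCPeerConnection with the framework shipped in the 'WebRTC' CocoaPod. The STUN server URL is an example, and `delegate:self` assumes the enclosing class adopts RTCPeerConnectionDelegate:

```objectivec
#import <WebRTC/WebRTC.h>

// Factory that produces peer connections, tracks, and sources.
RTCPeerConnectionFactory *factory = [[RTCPeerConnectionFactory alloc] init];

// Configure ICE with a public STUN server (example URL).
RTCConfiguration *config = [[RTCConfiguration alloc] init];
config.iceServers =
    @[ [[RTCIceServer alloc] initWithURLStrings:@[ @"stun:stun.l.google.com:19302" ]] ];

RTCMediaConstraints *constraints =
    [[RTCMediaConstraints alloc] initWithMandatoryConstraints:nil
                                          optionalConstraints:nil];

// 'self' must conform to RTCPeerConnectionDelegate.
RTCPeerConnection *peerConnection =
    [factory peerConnectionWithConfiguration:config
                                 constraints:constraints
                                    delegate:self];
```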
Let's have a look at how Oodles implements WebRTC in iOS.
CocoaPods is an application dependency manager for Objective-C and other languages that run on the Objective-C runtime. It offers a standard format for managing external libraries.
Let's look at the installation of CocoaPods.
Step 1: Download CocoaPods
CocoaPods is distributed as a Ruby gem and is installed by running the following commands in Terminal.app:
$ sudo gem install cocoapods
$ pod setup
Step 2: Creating a Podfile
The next step is to create a Podfile, the file through which CocoaPods manages your project's dependencies. Create it in the same directory as your Xcode project (.xcodeproj) file:
$ touch Podfile
$ open -e Podfile
TextEdit opens the empty Podfile. Copy and paste the following lines into the TextEdit window:
platform :ios, '9.0'
pod 'WebRTC'
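Since recent CocoaPods versions expect dependencies to be scoped to a target, a complete Podfile would typically look like the sketch below, where 'YourApp' is a placeholder for your actual Xcode target name:

```ruby
# Podfile sketch -- replace 'YourApp' with your Xcode target name.
platform :ios, '9.0'

target 'YourApp' do
  pod 'WebRTC'
end
```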
Step 3: Installing Dependencies
Now you can install the dependencies in your project:
$ pod install
From now on, be sure to always open the generated Xcode workspace (.xcworkspace) instead of the project file when building your project:
$ open ProjectName.xcworkspace
Step 4: Link Binary With Library Frameworks
Click on the project → select the target of interest → choose the Build Phases tab → expand Link Binary With Libraries → hit + at the bottom of the list to add frameworks.
Here is the List of Required Apple Library frameworks:
- ReplayKit.framework
- CoreGraphics.framework
- AVFoundation.framework
- CoreMedia.framework
- CoreVideo.framework
- CoreImage.framework
- GLKit.framework
- AudioToolbox.framework
- VideoToolbox.framework
3. Create a Room for Video Calling.
- (void)connectToRoomWithId:(NSString *)roomId
                   settings:(ARDSettingsModel *)settings
                 isLoopback:(BOOL)isLoopback
                isAudioOnly:(BOOL)isAudioOnly
          shouldMakeAecDump:(BOOL)shouldMakeAecDump
      shouldUseLevelControl:(BOOL)shouldUseLevelControl {
  NSParameterAssert(roomId.length);
  NSParameterAssert(_state == kARDAppClientStateDisconnected);
  _settings = settings;
  _isLoopback = isLoopback;
  _isAudioOnly = isAudioOnly;
  _shouldMakeAecDump = shouldMakeAecDump;
  _shouldUseLevelControl = shouldUseLevelControl;
  self.state = kARDAppClientStateConnecting;

#if defined(WEBRTC_IOS)
  if (kARDAppClientEnableTracing) {
    NSString *filePath = [self documentsFilePathForFileName:@"webrtc-trace.txt"];
    RTCStartInternalCapture(filePath);
  }
#endif

  [[NSNotificationCenter defaultCenter] addObserver:self
                                           selector:@selector(showMainMenu:)
                                               name:@"loginComplete"
                                             object:nil];

  // Request TURN servers.
  __weak ARDAppClient *weakSelf = self;
  [_turnClient requestServersWithCompletionHandler:^(NSArray *turnServers,
                                                     NSError *error) {
    if (error) {
      RTCLogError("Error retrieving TURN servers: %@", error.localizedDescription);
    }
    ARDAppClient *strongSelf = weakSelf;
    [strongSelf.iceServers addObjectsFromArray:turnServers];
    strongSelf.isTurnComplete = YES;
    [strongSelf startSignalingIfReady];
  }];

  // Join room on room server.
  [_roomServerClient joinRoomWithRoomId:roomId
                             isLoopback:isLoopback
                      completionHandler:^(ARDJoinResponse *response, NSError *error) {
    ARDAppClient *strongSelf = weakSelf;
    if (error) {
      [strongSelf.delegate appClient:strongSelf didError:error];
      return;
    }
    NSError *joinError =
        [[strongSelf class] errorForJoinResultType:response.result];
    if (joinError) {
      RTCLogError(@"Failed to join room:%@ on room server.", roomId);
      [strongSelf disconnect];
      [strongSelf.delegate appClient:strongSelf didError:joinError];
      return;
    }
    RTCLog(@"Joined room:%@ on room server.", roomId);
    strongSelf.roomId = response.roomId;
    strongSelf.clientId = response.clientId;
    strongSelf.isInitiator = response.isInitiator;
    // Offers and answers must be processed before any other messages.
    for (ARDSignalingMessage *message in response.messages) {
      if (message.type == kARDSignalingMessageTypeOffer ||
          message.type == kARDSignalingMessageTypeAnswer) {
        strongSelf.hasReceivedSdp = YES;
        [strongSelf.messageQueue insertObject:message atIndex:0];
      } else {
        [strongSelf.messageQueue addObject:message];
      }
    }
    strongSelf.webSocketURL = response.webSocketURL;
    strongSelf.webSocketRestURL = response.webSocketRestURL;
    [strongSelf registerWithColliderIfReady];
    [strongSelf startSignalingIfReady];
  }];
}
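A hypothetical call site for the method above (the room ID, the delegate, and the settings object are placeholders; ARDAppClient comes from the AppRTC demo code this snippet is based on) might look like:

```objectivec
// 'self' is assumed to adopt the ARDAppClientDelegate protocol.
ARDAppClient *client = [[ARDAppClient alloc] initWithDelegate:self];
ARDSettingsModel *settings = [[ARDSettingsModel alloc] init];

[client connectToRoomWithId:@"my-room-id"   // placeholder room name
                   settings:settings
                 isLoopback:NO
                isAudioOnly:NO
          shouldMakeAecDump:NO
      shouldUseLevelControl:NO];
```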
4. Manage Local Video Track.
- (RTCVideoTrack *)createLocalVideoTrack {
  RTCVideoTrack *localVideoTrack = nil;
  // The iOS simulator doesn't provide any sort of camera capture
  // support or emulation (http://goo.gl/rHAnC1) so don't bother
  // trying to open a local video stream.
#if !TARGET_IPHONE_SIMULATOR
  if (!_isAudioOnly) {
    RTCVideoSource *source = [_factory videoSource];
    RTCCameraVideoCapturer *capturer =
        [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
    [_delegate appClient:self didCreateLocalCapturer:capturer];
    localVideoTrack = [_factory videoTrackWithSource:source
                                             trackId:kARDVideoTrackId];
  }
#endif
  return localVideoTrack;
}
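Once created, the local video track is typically attached to an on-screen renderer. A minimal sketch, assuming RTCEAGLVideoView from the WebRTC pod and a view controller context:

```objectivec
// Render the local video track into a full-screen view.
RTCEAGLVideoView *localView =
    [[RTCEAGLVideoView alloc] initWithFrame:self.view.bounds];
[self.view addSubview:localView];
[localVideoTrack addRenderer:localView];
```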
5. Start Camera Capture.
- (void)startCapture {
  AVCaptureDevicePosition position =
      _usingFrontCamera ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack;
  AVCaptureDevice *device = [self findDeviceForPosition:position];
  AVCaptureDeviceFormat *format = [self selectFormatForDevice:device];
  int fps = [self selectFpsForFormat:format];
  [_capturer startCaptureWithDevice:device format:format fps:fps];
}
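The helper `-findDeviceForPosition:` used above is not shown in the snippet. One possible implementation, using the capture-device list that RTCCameraVideoCapturer exposes and falling back to the first available camera:

```objectivec
- (AVCaptureDevice *)findDeviceForPosition:(AVCaptureDevicePosition)position {
  // Enumerate the devices the WebRTC capturer can use and pick the
  // one whose position (front/back) matches the request.
  NSArray<AVCaptureDevice *> *devices = [RTCCameraVideoCapturer captureDevices];
  for (AVCaptureDevice *device in devices) {
    if (device.position == position) {
      return device;
    }
  }
  // Fall back to the first available camera (nil if none).
  return devices.firstObject;
}
```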
6. Stop Camera Capture.
- (void)stopCapture {
  [_capturer stopCapture];
}
7. Switch Camera.
- (void)switchCamera {
  _usingFrontCamera = !_usingFrontCamera;
  [self startCapture];
}
Are you planning to implement WebRTC in iOS apps? Get in touch with Oodles Technologies to experience excellent web application development and WebRTC software development services at the best prices. Our experts offer the best WebRTC solutions to businesses across the globe. Contact us now for complete details.
About Author
Sumit Chahar
Sumit Chahar works as an iOS Developer. He is very dedicated to his work.