WebRTC on Android: how to enable hardware encoding on multiple devices

Ivan Dyatlov · Bumble Tech · Jun 16, 2020


For video calls in the Badoo and Bumble apps, we use WebRTC with the H.264 codec. Judging by the documentation, you would expect this codec to work seamlessly on any Android device starting with Android 5.0. In practice, however, it didn’t quite turn out this way. This article covers the implementation details of hardware encoding for the H.264 codec in WebRTC and how we enabled it on a wide range of devices.

Why H.264?

When a WebRTC session is established, all participating devices exchange their communication parameters, including the video and audio codecs they support. If a device supports multiple codecs (e.g. VP8 and H.264), the codecs with the highest priority for the platform are listed first. These lists are used during the WebRTC negotiation stage, after which only the codecs supported by all devices remain. See this document for a detailed example of such data.

For video calls, if one of the devices does not support the H.264 codec, both devices can switch to, for example, the VP8 codec, which does not depend on a hardware implementation. However, our application runs on all kinds of devices, including previous-generation smartphones. This is why we wanted to use hardware encoding for video calls wherever possible: it reduces the load on the processor and consumes less battery power, which is essential for older devices. In addition, H.264 hardware encoding is supported on far more devices than VP8.

H.264 support on Android

According to the description of supported media formats, H.264 Baseline Profile decoding should work on all Android devices, while encoding should work on devices running Android 3.0 and above. In Badoo and Bumble, we support devices starting with Android 5.0, so we should not have encountered any problems. But it turned out not to be that simple: we discovered a lot of device-specific quirks, even on 5.x versions.

What might be the reason?

As we know, any manufacturer developing a new Android device has to run it through the Compatibility Test Suite (CTS). The suite runs on a PC connected to the device, and the results must be sent to Google to confirm that the device complies with Android OS requirements. Only then can the device be launched on the market.

As far as this suite is concerned, we are interested in the multimedia tests and, more specifically, in the tests for video encoding and decoding. I chose EncodeDecodeTest, MediaCodecTest, DecoderTest and EncoderTest, as they have been included in all Android versions starting with 4.3. The SLOC count of these tests looks as follows:

Before version 4.3, most of these tests simply did not exist, while in versions 5 and 7 their number grew considerably. It is therefore safe to say that before Android 4.3 Google was not testing devices’ compliance with its own specification for video encoding or decoding at all, and only started implementing these checks in earnest in version 5.0.

One would think, then, that from 5.0 onwards everything should have been fine with encoding. However, in light of my previous experience with video stream decoding on Android, this was not the case. The number of topics on encoding in the discuss-webrtc Google Group was proof enough.

In our search for hidden pitfalls, we relied on the open-sourced WebRTC code. Let’s have a closer look at it.

H.264 support in WebRTC

Let’s start with HardwareVideoEncoderFactory.

Here we see the method with a self-explanatory name isHardwareSupportedInCurrentSdkH264:
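At the time of writing it looked roughly like this (reconstructed from the open-source WebRTC sources; QCOM_PREFIX and EXYNOS_PREFIX are the "OMX.qcom." and "OMX.Exynos." name prefixes, and the exact version checks may differ between revisions):

private boolean isHardwareSupportedInCurrentSdkH264(MediaCodecInfo info) {
  // Some models are excluded outright because their H.264 encoders misbehave.
  if (H264_HW_EXCEPTION_MODELS.contains(Build.MODEL)) {
    return false;
  }
  String name = info.getName();
  // Qualcomm H.264 encoders are accepted from KitKat, Exynos ones from Lollipop.
  return (name.startsWith(QCOM_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.KITKAT)
      || (name.startsWith(EXYNOS_PREFIX) && Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP);
}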

As we can see, hardware encoding on Android is enabled only for Qualcomm and Exynos chipsets. So why does the standard WebRTC implementation not support other chipsets? Most probably, it is because of the peculiarities of the hardware codec implementations of the various manufacturers. These peculiarities can often only be discovered in production, as it is not always possible to get hold of a particular device for testing.

All codec descriptions are stored in the media_codecs.xml file. For example, here is this file for the Pixel XL and for the HUAWEI P8 lite. When the list of codecs is requested via the getCodecInfos() method of the MediaCodecList object, this file is parsed and the codecs declared in it are returned. This process, and how accurately the manufacturer fills in this file, are covered in CTS by MediaCodecListTest, whose SLOC count has grown from 160 in Android 4.3 to 740 in Android 10.
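As a side note, an application can inspect which H.264 encoders a device exposes by iterating this list itself. A minimal sketch (MediaCodecList.ALL_CODECS requires API 21, which our minimum of Android 5.0 satisfies):

for (MediaCodecInfo info : new MediaCodecList(MediaCodecList.ALL_CODECS).getCodecInfos()) {
  if (info.isEncoder()) {
    for (String type : info.getSupportedTypes()) {
      if ("video/avc".equalsIgnoreCase(type)) {
        // e.g. "OMX.qcom.video.encoder.avc" on Qualcomm devices
        Log.d("CodecList", info.getName());
      }
    }
  }
}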

In Badoo we changed the code of isHardwareSupportedInCurrentSdkH264, replacing the allowlist of chipsets with a blocklist based on the software codec prefixes already listed in WebRTC:

static final String[] SOFTWARE_IMPLEMENTATION_PREFIXES = {"OMX.google.", "OMX.SEC."};
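The resulting check looks roughly like this (a simplified sketch of our change, not the exact production code): a codec is treated as hardware-backed unless its name starts with a known software prefix.

private boolean isHardwareSupportedInCurrentSdkH264(MediaCodecInfo info) {
  String name = info.getName();
  for (String prefix : SOFTWARE_IMPLEMENTATION_PREFIXES) {
    if (name.startsWith(prefix)) {
      return false; // Software implementation: let WebRTC's software path handle it.
    }
  }
  return true; // Anything else is assumed to be a hardware encoder.
}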

But one does not simply enable support for all codecs without taking manufacturer-specific quirks into account. It is apparent from the titles of the discuss-webrtc topics devoted to hardware encoding on Android that something was bound to go wrong. In most cases, the errors appear at the codec configuration stage.

Codec configuration parameters

Codec initialization for encoding looks as follows:
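(A simplified sketch of what WebRTC’s HardwareVideoEncoder does at this point; the variable names here are illustrative, and VIDEO_ControlRateConstant is WebRTC’s constant for constant-bitrate mode.)

MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_BIT_RATE, targetBitrateBps);
format.setInteger(MediaFormat.KEY_BITRATE_MODE, VIDEO_ControlRateConstant);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT, selectedColorFormat);
format.setInteger(MediaFormat.KEY_FRAME_RATE, targetFrameRate);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, keyFrameIntervalSec);
codec.configure(format, null /* surface */, null /* crypto */, MediaCodec.CONFIGURE_FLAG_ENCODE);
codec.start();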

It’s easy to make a mistake in one of these parameters, which leads to an exception during codec configuration and breaks the application. One might also need to adjust the bitrate depending on various factors, as the codec itself won’t do it properly. In WebRTC this task is performed by BaseBitrateAdjuster, which has two subclasses: FramerateBitrateAdjuster and DynamicBitrateAdjuster.

Thus, we need to choose a suitable means of bitrate adjustment for each codec. Let’s take a closer look at the specifics of setting initialization parameters for hardware codecs.

Stream resolution

After obtaining the MediaCodecInfo object for each codec, we can explore the codec in more detail via its CodecCapabilities. In particular, we can find out whether the codec supports the selected resolution and framerate. If it does, these parameters can, in theory, be safely set.
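The check itself can look like this (a minimal sketch; VideoCapabilities has been available since API 21):

MediaCodecInfo.VideoCapabilities videoCaps =
    codecInfo.getCapabilitiesForType("video/avc").getVideoCapabilities();
boolean supported = videoCaps.areSizeAndRateSupported(width, height, framerate);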

However, in some cases this rule does not hold. We discovered that codecs with the "OMX.MARVELL." prefix encoded video incorrectly, producing green strips at the screen edges in certain cases when the stream’s aspect ratio differed from 4:3, even though the codec itself claimed to support the selected resolution and framerate.

Bitrate mode

The standard mode for all video codecs is constant bitrate. However, in one case we had to use variable bitrate:

format.setInteger(MediaFormat.KEY_BITRATE_MODE, VIDEO_ControlRateVariable);

This happened on the Lenovo A1000, a device with a Spreadtrum chipset (the company is now known as Unisoc) whose codec names begin with "OMX.sprd.". An Internet search led us to a six-year-old post about Firefox OS in which both the problem and the solution were described.
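In our code this ends up as a prefix-dependent choice of bitrate mode, roughly like this (a simplified sketch, using WebRTC’s VIDEO_ControlRateConstant/VIDEO_ControlRateVariable constants):

int bitrateMode = codecName.startsWith("OMX.sprd.")
    ? VIDEO_ControlRateVariable  // Spreadtrum/Unisoc encoders fail with constant bitrate.
    : VIDEO_ControlRateConstant; // Everyone else keeps the default.
format.setInteger(MediaFormat.KEY_BITRATE_MODE, bitrateMode);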

Color format

A proper color format needs to be chosen when using the codec in ByteBuffer mode. This is usually done with the help of a function like the following:
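(A sketch similar to WebRTC’s MediaCodecUtils.selectColorFormat: walk WebRTC’s preference-ordered list of formats and return the first one the codec also reports.)

static Integer selectColorFormat(
    int[] supportedColorFormats, MediaCodecInfo.CodecCapabilities capabilities) {
  for (int supportedColorFormat : supportedColorFormats) {
    for (int codecColorFormat : capabilities.colorFormats) {
      if (codecColorFormat == supportedColorFormat) {
        return codecColorFormat;
      }
    }
  }
  return null; // No mutually supported color format.
}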

Roughly speaking, we always choose the first of the color formats supported by both WebRTC and the codec.

However, for HUAWEI codecs with the "OMX.IMG.TOPAZ.", "OMX.hisi." and "OMX.k3." prefixes this did not work. After a long search we found the solution: regardless of the format reported by these codecs, the COLOR_FormatYUV420SemiPlanar format should be used. A thread on a Chinese forum helped us puzzle this out.
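A sketch of the workaround (simplified from our code; the helper name is ours): for these HUAWEI encoders the reported formats are ignored and semi-planar YUV is always used.

static final String[] HUAWEI_SEMIPLANAR_PREFIXES = {"OMX.IMG.TOPAZ.", "OMX.hisi.", "OMX.k3."};

static int resolveColorFormat(String codecName, int selectedColorFormat) {
  for (String prefix : HUAWEI_SEMIPLANAR_PREFIXES) {
    if (codecName.startsWith(prefix)) {
      return MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar;
    }
  }
  return selectedColorFormat;
}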

Bitrate adjustment

Standard WebRTC code contains the following:
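(Reconstructed from the open-source code at the time of writing; the exact type and constant names may differ between revisions.)

private BitrateAdjuster createBitrateAdjuster(VideoCodecType type, String codecName) {
  if (codecName.startsWith(EXYNOS_PREFIX)) {
    // Exynos VP8 encoders get dynamic adjustment based on actual output size;
    // the other Exynos encoders (including H.264) get framerate-based adjustment.
    return type == VideoCodecType.VP8
        ? new DynamicBitrateAdjuster()
        : new FramerateBitrateAdjuster();
  }
  // Every other chipset gets the no-op base adjuster.
  return new BaseBitrateAdjuster();
}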

As can be seen from this code, bitrate adjustment is disabled for all chipsets apart from Exynos; since the standard code only supports Exynos and Qualcomm, this effectively means Qualcomm. After experimenting with different values of this setting and doing some Internet searches, we found that adjustment should also be enabled for codecs with the "OMX.MTK." prefix, as well as for the HUAWEI codecs with the "OMX.IMG.TOPAZ.", "OMX.hisi." and "OMX.k3." prefixes. The reason is that these codecs do not use frame timestamps for bitrate control: they assume that all frames arrive at exactly the framerate set during codec configuration.
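Our change therefore boils down to extending the list of prefixes that get framerate-based adjustment (a simplified sketch, not the exact production code):

private static final String[] BITRATE_ADJUSTMENT_PREFIXES = {
    "OMX.Exynos.", "OMX.MTK.", "OMX.IMG.TOPAZ.", "OMX.hisi.", "OMX.k3."};

private BitrateAdjuster createBitrateAdjuster(String codecName) {
  for (String prefix : BITRATE_ADJUSTMENT_PREFIXES) {
    if (codecName.startsWith(prefix)) {
      return new FramerateBitrateAdjuster();
    }
  }
  return new BaseBitrateAdjuster();
}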

In conclusion, let’s look at the list of encoders we received from devices running Android 5.0 and 5.1. These were of primary interest to us because on later Android versions things improve and the number of atypical codecs shrinks.

This is shown in the graph below. To make rare cases more visible, a logarithmic scale is used.

As we can see, most of these devices have Spreadtrum, MediaTek, HUAWEI and MARVELL chipsets, which is why our changes helped enable hardware encoding on them.

Conclusion

Although we predicted that there would be problems working with H.264 on some devices, Android managed to surprise us once again. As our user statistics show, plenty of devices manufactured in 2014–2016 are still in use, and they either won’t or can’t be updated. Although the OS update situation for new Android devices is much better now than it was a few years ago, the percentage of previous-generation devices still in use is decreasing fairly slowly, so we will clearly have to support them for a long time to come.

Nowadays, WebRTC is being actively developed by Google, since it is used in the Stadia project (here’s a detailed video about it). For this reason, it will continue to improve and will most likely become a standard for video call implementation. I hope this article helps you understand the specifics of working with H.264 and WebRTC, and that this knowledge comes in useful in your own projects.
