Live Streaming Case Study: Focus Groups

I've been working with a local lawyer who regularly hosts focus groups.

This client's needs are:

1. Stream privately to YouTube, switching between two cameras and a two-camera multi-view.

2. Output a program feed to a client monitor.

3. Record a local backup copy of each meeting, since he occasionally needs to pull out one specific section of video.

After discovering that he has a fairly recent iMac to work with, I immediately thought of mimoLive, my favorite Mac-based switching/streaming software.

The challenges:

1. He has two old Canon Vixia cameras that only output 1080i60 via HDMI. [We don't want to stream interlaced video, because computer monitors are progressive displays.]

2. Will his iMac be able to process two camera streams, render two picture-in-picture effects, encode a live stream, output to a client monitor, and encode a file for recording––all at the same time?

3. Will the 100' HDMI cables the client already purchased and installed be reliable?

To meet the first challenge of dealing with interlaced video signals, I contacted the Breidenbach Brothers of Boinx™, Oliver and Achim, and asked the following question:

"How does mimoLive deal with video sources that don't match the document? Let's say you have a 1080i60 video source, but mimoLive is set to 1080p30. What happens?"

If the document is 1080p30, every 1/30 of a second, a frame is rendered by the layer stack. If a Placer layer is supposed to display a source that is 1080i60, it will ask the source for the image that is available at the time the layer stack renders its frame. If you do not use the deinterlace filter, the source will provide the last field that it got from the device. The deinterlace filter will combine the last two interlace fields and provide a full frame to the layer stack.

Bottom line: mimoLive converts all the sources to the document resolution and frame rate when they are rendered in the layer stack.

Best regards,

Oliver.
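
To make Oliver's description concrete, here is a minimal sketch (my own illustration, not mimoLive's actual code) of the "weave" style of deinterlacing he describes: the last two fields are combined into one full frame, so 60 interlaced fields per second become 30 progressive frames per second.

```python
# Minimal sketch of weave deinterlacing, assuming each field arrives as a
# half-height numpy array at 60 fields per second (1080i60).
import numpy as np

WIDTH, HEIGHT = 1920, 1080

def weave(top_field: np.ndarray, bottom_field: np.ndarray) -> np.ndarray:
    """Combine two consecutive 540-line fields into one 1080-line frame."""
    frame = np.empty((HEIGHT, WIDTH, 3), dtype=np.uint8)
    frame[0::2] = top_field      # even lines come from the top field
    frame[1::2] = bottom_field   # odd lines come from the bottom field
    return frame

# Pairing fields this way yields 30 full frames per second,
# which matches a 1080p30 mimoLive document.
top = np.zeros((HEIGHT // 2, WIDTH, 3), dtype=np.uint8)
bottom = np.ones((HEIGHT // 2, WIDTH, 3), dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920, 3)
```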


So, the question now becomes: will the iMac have enough processing power to deinterlace two 60i video streams while streaming, recording, and playing out? After some testing, it became clear that this was asking too much of the machine. Time for some adjustments.

My first thought, to ease the burden, was to introduce an ATEM Television Studio HD switcher into the mix to handle the incoming video streams. That way, mimoLive would only have to deal with one video stream at a time. Though well-intentioned, this turned out to be a flawed plan because of the client's requirement for two simultaneous picture-in-picture effects. All of the ATEMs are limited to a single DVE effect!

Our second effort was to convert both incoming camera feeds to 1080p30 with Decimator MD-HXs. That way, mimoLive doesn't have to do any frame rate conversions or deinterlacing. With this setup, we can switch both cameras, show two PinP effects simultaneously, stream, and output to the client monitor via the SDI Playout. We are able to record as well, but we're dropping some frames. Recording to H.264 is more CPU-intensive than recording to ProRes, but even with ProRes, the iMac isn't quite able to muster the power to render all the frames to the hard drive.
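
To put that second effort in perspective, here is a rough back-of-the-envelope calculation (my own numbers, not a measurement) of how much pixel pushing the Decimator MD-HXs take off the iMac by delivering ready-made 1080p30:

```python
# Rough estimate of the deinterlace/scale throughput offloaded to the
# Decimator MD-HXs, assuming two cameras delivered as 1920x1080 at 30 fps.
WIDTH, HEIGHT = 1920, 1080
FPS = 30
CAMERAS = 2

pixels_per_second = WIDTH * HEIGHT * FPS * CAMERAS
print(f"Work offloaded: ~{pixels_per_second / 1e6:.0f} Mpixels/s")
# -> roughly 124 Mpixels/s that mimoLive no longer has to deinterlace or
#    scale before compositing, streaming, playout, and recording.
```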

But wait! mimoLive also offers a Full Screen Playout function. What if we use that instead of the SDI Playout? Another question for the Boinx Brothers™:

"Is there a difference in CPU/GPU requirements between Full Screen Playout and SDI Playout? In other words, does one require fewer resources than the other?"

If you can use an HDMI signal in your setup, the Full Screen Playout is way more efficient than the SDI Playout. Usually the HDMI video signal comes from the same graphics card in your Mac that also drives your main screen. Because mimoLive processes all the video data on the graphics card, the final video frame is stored there too. For the graphics card, it doesn't matter whether it outputs the video frame only on the main screen or also on the secondary HDMI port.

On the other hand, if you use the SDI Playout, the Mac needs to download the video image from the graphics card into CPU memory and process it there in order to push it to the hardware device that generates the SDI signal. This costs some CPU performance, of course.

So why would one use the SDI Playout anyway? If your setup is working with SDI only, then you need the Mac to deliver an SDI signal. Also, the Blackmagic Design products are capable of generating video signals at different frame rates; it could be that your Mac can't generate the frame rate your production environment needs over HDMI.

-Achim
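
Achim's point about the GPU-to-CPU copy is easy to quantify. Here is a rough estimate (assuming 8-bit BGRA frames at 4 bytes per pixel, which may not match mimoLive's internal format) of the extra readback traffic SDI Playout adds at 1080p30:

```python
# Rough estimate of the GPU-to-CPU readback traffic that SDI Playout adds.
# Assumption: the rendered frame is 8-bit BGRA (4 bytes per pixel).
WIDTH, HEIGHT = 1920, 1080
FPS = 30
BYTES_PER_PIXEL = 4

bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
print(f"Readback traffic: ~{bytes_per_second / 1e6:.0f} MB/s")
# -> roughly 249 MB/s pulled back from the graphics card every second,
#    plus the CPU time to repack frames for the SDI hardware. Full Screen
#    Playout skips this copy entirely.
```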


Very interesting. I will be testing this out for my client to see if we can free up enough CPU cycles to get a good backup recording while streaming! I will follow up here :)

[Fade to black to indicate passage of time.]

I did end up switching to the Full Screen Playout in lieu of the SDI Playout. A simple Mini DisplayPort to HDMI adapter did the trick for that. Definitely easier on the system! A couple of other problems came up in the meantime.

Problem 1: One of the Canon Vixias broke down. To replace it, we picked up a Vixia HF R700. This one actually puts out 1080p60!

Problem 2: One of the 100' HDMI cables just quit. No signal at all. Then the second HDMI cable started blinking out intermittently. Being the genius I am, I seized the opportunity to recommend using SDI cables instead. I hope the reader learns a lesson here. Never use HDMI cables over 50' long. Just don't do it!

We used the Decimators in the ceiling above the wall-mounted cameras to convert the HDMI outputs to SDI while simultaneously scaling to 1080p30.

In the 'control room', two AJA U-Taps happily accepted the SDI signals and converted them to USB 3 for ingest into mimoLive. I could have used a Blackmagic UltraStudio Mini Recorder instead, but we were short on open Thunderbolt ports!
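
As a quick sanity check (assuming 8-bit 4:2:2 capture at 2 bytes per pixel, a common uncompressed capture format rather than a confirmed U-Tap spec), the 1080p30 feeds fit comfortably within USB 3 bandwidth:

```python
# Sanity check that a 1080p30 camera feed fits easily inside USB 3.
# Assumption: 8-bit 4:2:2 (2 bytes per pixel) uncompressed capture.
WIDTH, HEIGHT = 1920, 1080
FPS = 30
BYTES_PER_PIXEL = 2

bits_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS * 8
print(f"Per-camera ingest: ~{bits_per_second / 1e9:.1f} Gbps")
# -> roughly 1 Gbps per camera, well under USB 3's nominal 5 Gbps,
#    so each U-Tap has plenty of headroom on its own port.
```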

The customer is happy with the way the system is functioning, and that makes me happy as well. After all, my goal is to design a multi-camera system that does what you need––and just works!

I will leave you with the wiring schematic for your viewing pleasure.

8th Jun 2017 Jason Jenkins
