As I mentioned in my initial post, the one thing I really didn’t enjoy about Astro Hack Week was turning down so many potential participants. In order to give some of them a chance to participate remotely, and also to keep the lectures for future use, we decided to live-stream them. Aside from that, education is an integral part of the Moore-Sloan Data Science Environment at NYU, which Astro Hack Week was part of. Setting up a live stream turns out to be both tricky and a considerable amount of work. Looking back on it, I think we could have gotten away with a somewhat simpler set-up, but I also wanted there to be the potential to use this for future workshops and talks at the Center for Data Science. This blog post will describe our set-up in some detail, in the hope of sparing you some of the research. I’d also like to point out here how instrumental the Digital Studio of the NYU Bobst Library was in making this happen. Without their advice and training, it would have been much harder to pull off. The other invaluable help was the friendly blog of the Thayer School of Engineering here.
Please note that I’m definitely not a video professional. All the information below is derived from my experiences in organizing the live stream for Astro Hack Week and sort-of-up-to-date as of October 2015 (when Astro Hack Week happened).
There are many ways to set up a live stream, from professional options (which would have cost $10k-$15k for 5 days of Astro Hack Week) to a webcam on top of a laptop. I wanted something that preferably cost less than the professional option, and was happy to make compromises, but I also wanted to make sure that anyone watching would be able to (1) see the slides being presented and (2) hear the speaker. The other, slightly less crucial constraint I imposed was that it had to be portable: since Astro Hack Week events were scheduled in three different locations, I had to be sure that I could quickly transport the entire gear without too much trouble. I think I found a reasonable middle ground, which I’m going to describe below.
There are two essential parts of live streaming: the technical set-up and the broadcasting. This blog post will be mostly about the technical set-up. New York University has its own streaming service called NYU Stream, and the Digital Studio helped me set up the streaming so that all I had to do was embed the video player into the Astro Hack Week website. If your university (and specifically the library) has a team that helps lecturers with digitizing their lectures, it’s worth asking them for advice and help!
The technical set-up itself requires several components. One decision to make is whether the camera will be pointed at the slides or whether you want to grab the slides directly from the presenter’s computer. The latter will look better and be more readable, but it is also somewhat more error-prone, because it relies on correct communication between the presenter’s computer, the projector, and the device used to divert the data sent to the screen to the capturing laptop. In practice, this sounds scarier than it is, which is to say it worked in every case we tried it.
Here’s what you’ll need (depending on your set-up + hardware, you might get away with leaving stuff out):
(1) a camera: we chose a high-end consumer camcorder (Canon Vixia HF20) instead of the professional camera the Digital Studio recommended to us. The primary reason was that I planned not on filming the slides, but on grabbing them directly from the presenter’s computer. This meant it was less crucial to have a camera capable of dealing with a high dynamic range (= large brightness contrast). The other advantage of a professional camera is that it often (always?) has XLR inputs, an audio connector for high-end microphones. Since all I wanted was reasonably understandable audio, I figured this wasn’t essential. All in all, this brought the cost of the camera down from $2500 to $800.
(2) a capture card: this is a device that connects (usually via HDMI) to the camera. It reads the information directly off the camera’s CCD and sends it to the encoding program (the one that will likely do the broadcasting) in a format the software can use. Depending on your camera and encoding software, you might not need a capture card. However, be aware that in that case, you will *have* to be recording, which means you need to be sure the SD cards in your camera are large enough for your entire lecture (this would have been a challenge for the ~3 hours of near-continuous Astro Hack Week tutorials we ran). With a capture card, you don’t need the camera to actually record anything to its internal memory or SD card. All you need to do is turn on the camera, point it at the presenter and watch the video appear on your laptop! This means you’ll be able to use the camera longer without being connected to power (not recommended), and you don’t have to buy large SD cards for long lectures. We used a Blackmagic Intensity Shuttle (with Thunderbolt). This was the Digital Studio’s recommendation and worked really well, as in, it was essentially plug-and-play. An important note: when choosing a capture card, check with the software you’ll be using to do the encoding and broadcasting first (see below)! They have to be able to talk to each other, so make sure they do.
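To put a number on the SD card concern above, here is a quick back-of-the-envelope calculation. The 24 Mbps bitrate is an assumption (a typical high-quality AVCHD recording mode on consumer camcorders; check your own camera’s manual), not a figure from our actual set-up:

```python
# Rough estimate of SD card space needed to record a lecture in-camera,
# for the case where you have no capture card and must record to SD.
# The 24 Mbps AVCHD bitrate is an assumption; check your camera's manual.

def sd_card_gb_needed(hours, bitrate_mbps=24.0):
    """Return approximate gigabytes of storage for `hours` of video."""
    seconds = hours * 3600
    megabits = bitrate_mbps * seconds
    gigabytes = megabits / 8 / 1000  # megabits -> megabytes -> gigabytes
    return gigabytes

# A ~3-hour tutorial block at 24 Mbps:
print(round(sd_card_gb_needed(3), 1))  # prints 32.4, i.e. more than a 32 GB card
```

So a single 32 GB card would not even have covered one of our tutorial mornings at that quality setting, which is exactly why the capture card route was so convenient.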
(3) a computer/laptop: if the set-up is permanent, a computer is fine. Since we carried around the equipment a lot, we used a laptop (mine, actually). Choose a laptop that’s fairly high-powered in terms of CPU and memory. You’ll need it. Especially if you intend to record to disk as well as live-stream at the same time, CPU power becomes a serious issue. If there aren’t enough computing resources, the frame rate will drop and the video will not be smooth. Do not skimp on this! We used a Mid-2014 13″ MacBook Pro with a 2.8 GHz Intel Core i5 and 16GB of RAM, and that managed to just about live-stream without recording and without me trying to do anything else with the laptop.
(4) a microphone for the presenter: this is absolutely essential, because depending on where you place your camera, the internal microphone will not pick up enough sound to be understandable. We bought an Azden WMS-Pro, which turned out to be an excellent choice for the price. It clipped onto the presenter’s shirt and transmitted the signal wirelessly to a receiver that I connected to the camera’s audio input. Note: make sure that your audio input matches the audio output on the microphone. We noticed soon after starting the live stream that while the microphone did a terrific job of recording the speaker, questions from the audience were impossible to hear. It turned out that the camera itself allowed me to activate both the external and internal microphones at the same time and mix their relative amplitudes, which improved the situation. For any similar workshop, I would probably buy a second (omnidirectional) microphone for the shoe on top of the camera. For the situation we were in, a hand-held microphone connecting to the same receiver as the presenter’s mic would have been another option, though with the way discussions evolved, it would have been impossible to keep passing a hand-held microphone around. I think a good omnidirectional mic should do the trick.
(5) a frame grabber: this was the trickiest part of the set-up. It is essentially a device that connects to one of the video outputs (VGA, DVI or HDMI) of the presenter’s laptop as well as to the live-streaming computer. We picked an Epiphan DVI2USB 3.0, which has the advantage that it accepts both DVI and VGA (and in some cases even HDMI) signals. If you have a choice, choose DVI for better image quality. In addition, I bought VGA and DVI splitters (basically cables with one DVI or VGA input and two outputs), which, as the name suggests, split the signal coming from the presenter’s laptop into two. One of these two signals goes into the projector as usual, the other goes into the frame grabber for live-streaming. This somewhat degrades the quality of the image, but not enough to be really noticeable. In practice, connecting both is a bit finicky. You might need to figure out the magic procedure of connecting either the projector first or the frame grabber first. Don’t give up! Sometimes it only works on the fifth try. What worked surprisingly well with more modern laptops was to connect the projector via VGA (that’s all the projector allowed) and the frame grabber via HDMI at the same time. Note that the frame grabber needs to be supported by your streaming software, too! Check that this is true. I spent some time chatting with the people from Epiphan, who were very helpful in making sure this would all work the way I wanted it to. It’s also worth pointing out that Telestream Wirecast comes with an app that allows Wirecast to talk to the presenter’s laptop directly via WiFi or Bluetooth (I think). I decided against that because it seemed intrusive to ask presenters to install specialized software on their computers, and unfeasible for outside speakers I might not have time to talk to much in advance of the tutorial.
(6) the encoding/broadcasting software: encoding and broadcasting are two separate tasks, though you’ll probably do best with software that does both. We chose Telestream Wirecast on the Digital Studio’s recommendation, and though expensive, it is worth the money. For fairly straightforward things like live-streaming, it is very simple to use (I learned most of what I needed in about a day). One thing I really wanted was a kind of picture-in-picture, with the slides from the frame grabber filling the screen and a smaller window in one corner showing the presenter. Wirecast allowed me to do that very easily and made it simple to transition between e.g. camera-only and slides+camera. It was also really straightforward to add things like titles.
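For the curious: the same slides+camera picture-in-picture composition that Wirecast does in its GUI can also be expressed as an ffmpeg filter graph. We did not use ffmpeg, and the input file names below are placeholders, so treat this purely as a sketch of what the compositing step does under the hood:

```python
# Sketch (not our actual set-up) of the picture-in-picture composition
# as an ffmpeg command: the slides fill the frame, and a scaled-down
# camera feed is overlaid in the bottom-right corner.
# "slides.mkv" and "camera.mkv" are placeholder input names.

def pip_command(slides_input, camera_input, output, pip_width=320, margin=20):
    """Assemble an ffmpeg argument list for a picture-in-picture overlay."""
    filtergraph = (
        f"[1:v]scale={pip_width}:-1[cam];"          # shrink the camera feed
        f"[0:v][cam]overlay=W-w-{margin}:H-h-{margin}"  # pin it bottom-right
    )
    return ["ffmpeg",
            "-i", slides_input,
            "-i", camera_input,
            "-filter_complex", filtergraph,
            output]

cmd = pip_command("slides.mkv", "camera.mkv", "out.mp4")
print(" ".join(cmd))
```

The point of showing this is just that “picture-in-picture” is a scale plus an overlay; Wirecast hides this behind a drag-and-drop interface, which is a large part of what you pay for.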
(7) an ethernet connection: this was recommended to me by the library, and I would recommend it to anyone trying to live stream. Depending on where you are, and how many people are there, the WiFi connection is just not reliable enough. Nothing worse than having the live stream break in the middle because the WiFi dropped out! Having an ethernet connection also allows for faster transmission, which helps minimize delays between recording and the live stream and helps with keeping the image smooth. That said, we did do a surprise live stream via WiFi, and it worked reasonably well. It’s just not something you can count on.
(8) a tripod: the camera needs to sit on something, preferably something large and visible and sturdy enough that people won’t knock it over. Useful fact: camcorders and photo cameras use the same standard tripod screw mount, so you can use the same tripod for both. The photo/video store really wanted to sell me an expensive video tripod, but I decided I didn’t see the advantage over my MeFoto RoadTrip Tripod. When considering a tripod, look at the weight it supports and make sure that is more than your camera weighs (which should be true for any reasonable DSLR tripod and a consumer camcorder, though it might not be true for professional cameras!).
(9) cables: lots and lots and lots of cables and connectors! Many of the devices listed above come with their own cables, but in many cases, they’ll also be too short! If at all possible, think in advance about (a) where the presenter+their laptop will be, (b) where the camera will be in relation to them, (c) where your ethernet port is, and (d) where the laptop doing the live-streaming will be. If you can, measure distances, but if you might move around your equipment a lot, just buy the longest cables you can find. They’re not that expensive. We bought a 100ft ethernet cable and made sure that all other cables were at least 10ft. The cables we bought for our set-up:
- a 100ft ethernet cable
- a 10ft HDMI cable (to connect camera + capture card)
- a 10ft VGA cable (to connect frame grabber to presenter’s laptop)
- a 10ft DVI cable (to connect frame grabber to presenter’s laptop)
- a 10ft HDMI cable (to connect frame grabber to presenter’s laptop)
- a VGA splitter
- a DVI splitter
- a Thunderbolt to Ethernet converter (for the MacBook Pro we used)
(10) feedback: this, I learned, is perhaps the most important part of your set-up. Giving the live stream viewers an opportunity to provide immediate feedback on how the live stream was doing helped significantly with troubleshooting and improvements. We encouraged viewers to tweet both feedback and questions. Despite planning and trying it out ten times, stuff happens. Getting feedback via Twitter is how we discovered early on that the microphone wasn’t enough to hear questions from the room. We also learned very quickly whenever the live stream went down (this happened only twice, I think), often before we could tell from the software ourselves. If possible, I’d always publicize a way to give feedback along with publicizing the live stream itself. It is invaluable for keeping things running smoothly and for learning how to improve.
To set up, connect the microphone to the camera. Then connect the camera via HDMI to the capture card, and the capture card via Thunderbolt or USB 3 to the laptop. In Wirecast, you should be able to import the capture card and, once you turn on the camera, see the image from the CCD. Note: you don’t need to press record. Just make sure you turn off your camera’s screen saver! Now connect the presenter’s laptop to the frame grabber. How you do this depends on how the projector connects to the laptop and what kind of outputs the laptop has. The projector we used only supported VGA. If there were no other outputs on the laptop, I connected the VGA output of the laptop to the VGA splitter. Then I connected one output of the VGA splitter to the projector’s VGA cable and the other to a VGA cable going into the frame grabber. The frame grabber connects via USB 3 to the laptop doing the broadcasting. Again, you should be able to import the frame grabber in Wirecast and then see the image. As mentioned above, this is the most fragile part of the whole set-up. Getting a laptop and a projector to talk to each other can be a fiddly task at best, and adding yet another component certainly does not make it easier. I’m happy to report it worked in all cases. At this point, the presenter’s laptop should be connected to yours via the frame grabber, and the camera via the capture card. Make sure both the camera and the broadcasting computer are connected to power, and that the computer is also connected to the ethernet.
That’s your basic set-up. I will not explain in detail here how Wirecast works (partly because I know how to do about five things in Wirecast). The encoding and broadcasting details depend on the service you use, which means said service should probably provide you with specific instructions. We used the NYU Stream service, which allowed us to embed the video directly on the Astro Hack Week website. It also allowed us to record the video on the NYU Stream server as we were broadcasting rather than saving to disk, which would probably have gotten us into trouble in terms of available CPU resources on the laptop we were using.
When live-streaming, it helps to monitor the live stream if you can. I used my normal headphones (Klipsch X10) to monitor sound and kept an eye on the video and the frame rate. Despite the temptation, don’t use the laptop for anything else while live streaming! Anything that uses resources will cause the frame rate to drop and degrade the quality of the broadcast.
And that’s it! If you have any questions, comments or suggestions, please feel free to comment or send me a message!
PS: The recordings of the Astro Hack Week Live Stream are here.