The blog that makes technology tangible.

Technical challenges at the Wings for Life World Run 2014

Felix and Nathan from #netacad Team at #worldrun

How does the Wings for Life World Run team master all the technical challenges of such a huge event?

Global Race Control Center – Datacenter Setup

Basically, every competing country uses a satellite uplink to send one HD video stream to Spielberg, Austria. That is why there are a lot of antennas outside the Wings for Life World Run Control Center.

All those antennas are connected to the main datacenter with a fiber channel link. The datacenter is set up in a racing-car pit box, which looks like the picture on the right side. In the background you can see the network and system administrators sitting there, observing and managing the network. On the right side are the racks that contain a large part of the network and storage used for all the video streams.

Let’s take a closer look at the datacenter setup.

Global Race Control Center – Datacenter view

The first rack holds all the administration and control parts of the setup.

As you can see, the second rack contains a lot of the networking devices as well as the storage arrays.

The third rack provides a lot of video processing power, and the fourth rack holds a lot of ASIC components that decode the incoming HD video streams and store them on the storage arrays.

Global Race Control Center – 7.4 km of copper cable

@Network Redundancy – The whole setup is redundant, which means they have two synchronous 500 Mbit/s Internet connections. One connection leaves the building in the opposite direction of the other. These connections are used for social media, webpages, small clips and media purposes. The main TV stream is NOT transferred via them.

There are 7.4 km of copper cable in use at the Global Race Control Center, and more than 700 switch ports are in use for copper and fiber.

Looking behind the datacenter shown in the picture above gives us the view shown in the picture on the right side.

Data Center satellite antenna park

@34 HD Streams – the satellite feeds terminate here in the datacenter; the video is decoded directly by hardware ASIC components in the fourth rack and then written to the storage arrays. The streams are MPEG encoded with strong compression, but with almost no quality loss. Each stream produces about 15 Mbit/s of traffic, so we estimate the permanent incoming traffic on the satellite connections at roughly 550 Mbit/s (34 × 15 Mbit/s ≈ 510 Mbit/s, plus overhead). After decoding, each stream is uncompressed to about 50 Mbit/s, which adds up to roughly 220 MByte/s in total.
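As a quick sanity check of these figures, here is a minimal Python sketch of the arithmetic; the stream count and per-stream bitrates are the numbers quoted above, and the results are rough estimates only.

```python
# Back-of-the-envelope bandwidth estimate for the 34 incoming HD streams.
# Figures are taken from the article; results are approximations only.

NUM_STREAMS = 34
COMPRESSED_MBIT_S = 15   # MPEG-compressed satellite feed per stream
DECODED_MBIT_S = 50      # uncompressed stream after the ASIC decoders

incoming_mbit_s = NUM_STREAMS * COMPRESSED_MBIT_S   # ~510 Mbit/s
decoded_mbit_s = NUM_STREAMS * DECODED_MBIT_S       # ~1700 Mbit/s
decoded_mbyte_s = decoded_mbit_s / 8                # ~212 MByte/s

print(f"Incoming satellite traffic: ~{incoming_mbit_s} Mbit/s")
print(f"Decoded traffic to storage: ~{decoded_mbyte_s:.0f} MByte/s")
```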

@Storage – there are 8 × 50 TB storage arrays, 400 TB in total, used only for storing the 34 HD video streams from all over the world and cutting them locally into short clips. The storage arrays are redundantly connected to the system.
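To put the 400 TB into perspective, here is a small sketch that estimates how long the arrays could keep up with the decoded ~220 MByte/s aggregate write rate from above – a rough estimate of ours, not an official figure.

```python
# Rough estimate of how long 400 TB lasts at the decoded aggregate write rate.
# All figures are approximations based on the numbers quoted in the article.

TOTAL_STORAGE_TB = 8 * 50        # eight 50 TB arrays
AGGREGATE_WRITE_MBYTE_S = 220    # decoded streams, estimated above

total_bytes = TOTAL_STORAGE_TB * 1e12
seconds = total_bytes / (AGGREGATE_WRITE_MBYTE_S * 1e6)
hours = seconds / 3600

print(f"{TOTAL_STORAGE_TB} TB / {AGGREGATE_WRITE_MBYTE_S} MByte/s "
      f"≈ {hours:.0f} hours of recording headroom")  # ~500 hours
```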

@Video Cutting Performance – There are several servers. We received the information that each server has an 8-core 3.6 GHz Intel processor and about 48 GB of RAM. With this data, and taking Hyper-Threading into account, we estimate an overall x86_64 CPU capacity of about 700 GHz and about 600 GB of RAM. These servers are used for cutting the streams together, both in real time and offline.
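The exact number of servers was not told to us; the sketch below shows how the 700 GHz / 600 GB figures could come together if we assume roughly a dozen servers and count Hyper-Threading as simply doubling the clock sum (a crude simplification).

```python
# Hypothetical reconstruction of the aggregate compute estimate.
# The article does not state the number of servers; 12 is our assumption.

ASSUMED_SERVERS = 12
CORES_PER_SERVER = 8
CLOCK_GHZ = 3.6
RAM_GB_PER_SERVER = 48
HT_FACTOR = 2  # crude: count Hyper-Threading as doubling the clock sum

aggregate_ghz = ASSUMED_SERVERS * CORES_PER_SERVER * CLOCK_GHZ * HT_FACTOR
aggregate_ram_gb = ASSUMED_SERVERS * RAM_GB_PER_SERVER

print(f"~{aggregate_ghz:.0f} GHz aggregate clock, ~{aggregate_ram_gb} GB RAM")
# -> ~691 GHz and 576 GB, close to the 700 GHz / 600 GB quoted above
```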

@Video Cutting Workstations – the video cutting of all those streams is performed via a thin-client-like architecture. The workstations used for this are connected to the datacenter via 1 Gbit/s fiber. Several servers are used to calculate and encode all those streams.

@Backup – They do not host a second datacenter like this one, but all the streams have a decentralized backup in every country. There is enough redundancy in this setup.

@Connection lost?

What happens when the connection to a certain country or car is lost? There is local storage in every car that records all the video. When the transmission stops for some reason and is later resumed, the previously recorded video is not sent to the datacenter – only the current live stream. The older footage can be transferred from the local storage after the event. The same applies to the video stream that is uploaded from each country to the satellite. The technology used to transmit the HD streams locally is Digital Video Broadcasting (DVB). There is a minimum of 6 to 8 hops before you can see the video at home.
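To make the behavior on a dropped link more concrete, here is a minimal Python sketch of that policy – our own illustration, not the actual World Run software: frames recorded while the uplink is down stay on the car's local storage and are only fetched after the event, while the live feed resumes with the current frames.

```python
# Illustrative model of the "live stream resumes, backlog stays local" policy.
# This is our own sketch, not the software actually used at the event.

class CarUplink:
    def __init__(self):
        self.link_up = True
        self.local_storage = []   # every frame is always kept in the car

    def send_frame(self, frame):
        self.local_storage.append(frame)   # always record locally
        if self.link_up:
            return frame                   # forwarded to the datacenter live
        return None                        # dropped from the live feed

    def backlog_for_post_event_transfer(self, sent_frames):
        # Frames that never reached the datacenter are fetched after the race.
        return [f for f in self.local_storage if f not in sent_frames]


uplink = CarUplink()
sent = []
for i in range(6):
    if i == 2:
        uplink.link_up = False   # connection lost
    if i == 4:
        uplink.link_up = True    # connection restored, live feed only
    frame = uplink.send_frame(f"frame-{i}")
    if frame:
        sent.append(frame)

print("live:", sent)                                                  # 0, 1, 4, 5
print("post-event:", uplink.backlog_for_post_event_transfer(sent))    # 2, 3
```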

What we found out describes only the setup in the datacenter. There is also a huge truck where the live-stream director and producer sit. They have local video processing power, but they also rely on the storage in the datacenter.

For further questions, send us an e-mail at netacad.worldrun@gmail.com
