Archive for ‘Video’ Category

NAB2014 Report

datePosted on 15:11, April 18th, 2014 by Many Ayromlou

DEKTEC: DekTec introduced the DTA-2180, a low-profile PCIe H.264 encoder. The DTA-2180 is a low-latency (150 to 600 ms) H.264 hardware encoder based on the Magnum chipset. It supports MPEG-2 and H.264 and up to 16 channels of audio, which can be encoded as AC-3, AAC or MPEG-1 Layer 2. The DTA-2180 offers a 10-bit 4:2:2 option for contribution encoding. It has 3G-SDI and HDMI inputs and an ASI output. The compressed stream output (TS-encapsulated H.264 or MPEG-2) is also available over PCIe for real-time streaming, processing and recording.1

NIMBUS: The WiMi6400T and WiMi6400R provide high-quality Full HD encoding and decoding with low latency: 40 ms each for encoding and decoding. They support a wide range of encoding rates, from 1 Mbps to 30 Mbps, for high-quality video broadcasting. The WiMi6400T provides RTSP streaming server functionality and can also be used as a real-time MPEG-2 TS/UDP streaming server with linear PCM audio for IPTV networks. It supports one-to-many multicasting over an Ethernet LAN or IP network, so there is no restriction on the number of receivers.2
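The one-to-many model described here is ordinary IP multicast: the sender pushes a single MPEG-2 TS/UDP stream to a group address and any number of receivers join it. A hedged sketch with stock ffmpeg (the source file and group address are placeholders of mine, nothing NIMBUS-specific):

```shell
# Sketch: send one MPEG-2 TS to a multicast group; the number of receivers
# does not affect sender load. SRC and GROUP are placeholder values.
SRC=input.mp4
GROUP=udp://239.255.0.1:1234   # administratively scoped multicast range
if [ -f "$SRC" ]; then
    ffmpeg -re -i "$SRC" -c copy -f mpegts "$GROUP"
fi
# A receiver on the LAN could join with e.g.: ffplay udp://239.255.0.1:1234
```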

VIOLIN MEMORY: Violin Memory’s 6000 Series Flash Memory Arrays are all-silicon shared storage systems built from the ground up, harnessing the power of flash memory and delivering industry-leading performance and ultra-low data access latencies. A single 3U array delivers more than 1 million IOPS with consistent, spike-free latencies measured in microseconds. Violin Memory is uniquely positioned to deliver flash memory systems that can compete with performance disk on cost per raw capacity, even before taking into account the potential benefits of features like deduplication. This is possible because the 6000 Series arrays are purpose-built with flash components sourced through Violin Memory’s strategic alliance with industry leader Toshiba. The core of the 6000 is the Flash Memory Fabric: a resilient, highly available deep mesh of thousands of flash dies that work in concert to continuously optimize performance, latency, and longevity. All of the active components of the Flash Memory Fabric are hot-swappable for enterprise-grade reliability and serviceability. The 6000 Series connects natively to existing 8Gb/s Fibre Channel, 10GE iSCSI, and 40Gb/s InfiniBand network infrastructures.3

TOSHIBA: ExaEdge™ by Toshiba is a next-generation SSD-based edge streaming server with extra-low power consumption. It allows you to serve large numbers of concurrent high-quality video streaming sessions with low host CPU and memory utilization. ExaEdge adopts Toshiba’s NPEngine™, the world’s first direct SSD-to-IP embedded hardware technology, offering direct storage access from SSD as an embedded hardware solution in a compact 2RU server. The resulting performance is capable of sending up to 64,000 simultaneous sessions with total host CPU usage at less than 12%. Modern video distribution over IP, like OTT streaming, leverages existing HTTP-based caching: unlike traditional IPTV networks, which adopt specialized network architectures, in adaptive bitrate scenarios HTTP chunks can be cached by an ordinary cache server at the edge and redistributed with lower latency.4 5

NHK: NHK was at NAB this week, quietly showing off footage shot with a Super Hi-Vision 8K camera, affectionately known as the Cube. The Cube is surprisingly compact at 2 kg, since it records to an external unit (one of the only 8K HEVC real-time encoders in the world). It’s essentially a housing where the mammoth sensor and lens mount live, along with the necessary connections. But even though it’s a simple design, it delivers an amazing resolution of 7680 x 4320 pixels. 8K is a great format that could rival IMAX and would be excellent for big events beamed around the world, giving spectators who can’t make an event the chance to experience it in a way all formats before it could only dream of. And NHK is planning to broadcast the 2016 Summer Olympics in Rio in 8K.6 7

4EVER: 4Ever showed demos of MPEG-DASH adaptive bit-rate streaming at NAB 2014. The demo used four different HEVC encodes of original 4K content at several bit rates: 14.5 and 11.5 Mbps for the 4K version, 5.8 and 3.7 Mbps for a 1080p version, and a 720p version streaming at 2.9 or 1.8 Mbps. The monitor runs a Chrome browser with HTML5 support, which can only show a 4K/30 fps image. To show adaptive streaming, they randomly switched from one bit stream to another, displaying this data on the monitor. The changes were seamless, but you do see a change in picture quality.8
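A ladder like the one described here can be approximated with common open-source tools. A hedged sketch (it assumes an ffmpeg build with libx265 and GPAC’s MP4Box on the PATH; file names and segment length are made up, and this is not 4Ever’s actual pipeline):

```shell
# Sketch: produce two of the quoted rungs and package them for MPEG-DASH.
SRC=master_4k.mp4   # hypothetical 4K source file
if [ -f "$SRC" ]; then
    # top 4K rung and top 1080p rung from the ladder above
    ffmpeg -i "$SRC" -c:v libx265 -b:v 14500k -s 3840x2160 uhd.mp4
    ffmpeg -i "$SRC" -c:v libx265 -b:v 5800k  -s 1920x1080 hd.mp4
    # MP4Box segments both encodes and writes the DASH manifest
    # that the player switches between
    MP4Box -dash 4000 -rap uhd.mp4 hd.mp4 -out stream.mpd
fi
```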

VISION 3 IMAGING: Vision III Imaging demonstrated 4K 60p parallax scanned imagery and its Real Shot™ parallax induction technology. Parallax scanning is a technique for capturing three-dimensional depth information over time using one camera and one lens. V3 imagery can be displayed on a standard display without 3D glasses or special screens. Real Shot is a parallax induction technique that also embeds three-dimensional parallax information into Internet or mobile digital advertising. Parallax scanning is accomplished using a digital parallax scanner (DPS). The DPS is a moving iris mechanism that is inserted into the optical path of a lens. When the iris is moved off the center of the lens, it records a different point of view at the plane of focus. The DPS iris scans in a circle around the center of the lens, making it possible to capture 360° of parallax information using a single lens.

RENEWED VISION: With its new Multiple Screen functionality, ProVideoPlayer 2 ($999) makes it easier than ever to create multi-screen presentations from a single computer with support for multiple graphics cards and easy mapping within each card and across multiple cards. Users can also add external graphics processors to each one of these graphics card outputs for even more screens, as well as add outputs that are not yet connected to a physical output, allowing shows to be pre-built off-site prior to the event. PVP 2 supports Multiple Layers, which afford the flexibility to create unique looks and allow the user to take full advantage of multiple screens. A layer is merely a video channel, so multiple layers are also great for a single screen environment where layering, textures, or PIPs are desired.10

THUNDERBOLT 2 Mobile 4K Workflow: HP showed 4K real-time streaming off a BMDC and 4K real-time playback from a Thunderbolt 2 Little Big Disk, all run through HP’s new Z series laptops on a 21:9 screen.11

SILICON POWER: Silicon Power’s Thunder T11 is not only the lightest but also the smallest Thunderbolt™ SSD on the market. With its extremely small, featherweight design, the Thunder T11 is half the size of ordinary storage devices and weighs only 65 g. Pairing its storage with Thunderbolt™ I/O, it is three times the speed of a USB 3.0 HDD and delivers transfer rates of up to 380 MB/s read and 340 MB/s write.12

360HEROS: A hexacopter for 360-degree shooting, using 3D-printed GoPro 3 mounts.13

ERICSSON: Showing a 100 Mb/s (4x25 Mb/s) live UHDTV broadcast using DVB-S2 extensions to deliver true 4Kp60 over the air.35

LACIE: The LaCie 8big Rack is the company’s first Thunderbolt 2 rackmount storage solution, featuring up to eight 6TB 7200RPM hard drives and delivering speeds of up to 1330 MB/s. The 8big Rack also features easy access to components and tool-free maintenance of the included power supply units, fans, and disks, with a three-fan cooling system that conducts heat away from vital components. It will be offered in 4-disk (12TB) or 8-disk (24TB and 48TB) configurations.16

SKYPE: Skype has been an essential tool in the production of podcasts and newscasts for years, and today Microsoft has announced a professional-grade version of the app designed specifically for the media industry. It’s called Skype TX and is intended to be used in studio environments; you won’t be using this to record a podcast in your bedroom. Skype TX is described as an “easy-to-use hardware and software combination that allows Skype video calls from anywhere in the world to be seamlessly integrated into any production.” It plays nice with industry standards by outputting calls in full-frame HD-SDI formats.

LIVESTREAM: Livestream announced a pair of production switchers: the HD510 and HD1710. The HD510 is a portable version with an integrated touch display, yet it’s still full-featured with 5 SDI inputs. The rack-mounted HD1710 is at the other end of the spectrum: it features up to 17 inputs and can drive 4 displays. They also announced the Livestream Studio Control Surface, a modular control surface with 5 assignable tracks, a T-Bar, an audio mixer, and a USB connection to Livestream Studio.19

AJA: CION™ is the new 4K/UHD and 2K/HD production camera from AJA. It records directly to Apple ProRes 422 and 444 at up to 4K 60fps, or outputs AJA Raw at up to 4K 120fps.20

DIGITAL BOLEX: Digital Bolex’s new monochrome 16mm camera, dubbed the D16M, has the same form factor as the original D16, but there’s a significant change under the hood: the D16M sports a native black-and-white sensor for the highest-quality monochromatic capture without the need to debayer, retaining a higher sensitivity to light and preserving the full dynamic range of the sensor.

Here are the technical specs:

  • Kodak native monochrome sensor
  • Same resolution options as D16: Super 16mm (2K), 16mm (HD), and Super 8 (720p)
  • No OLPF (optical low-pass filter), to further maximize fine detail
  • ISO 100, 200, 400, 800
  • 500GB Hard Drive21 22



BLACKMAGIC: The new Blackmagic 4K URSA camera is weird, featuring a 4K Super 35mm global shutter sensor, a real camera form factor, a built-in 10.1″ 1920 x 1200 fold-out display, and two 5” 800 x 480 displays. Not only that, but it has both interchangeable lenses and sensors, meaning you’ll be able to upgrade to a better sensor at home by removing a few screws when one becomes available. Here are the specs:

  • 21.12mm x 11.88mm — Super 35mm Global Shutter 4K CMOS Sensor (Probably the same as current Blackmagic Production Camera 4K)
  • Interchangeable Lens Block
  • 3840 x 2160 — 24/25/60fps
  • 1920 x 1080 — 24/25/30/50/60fps
  • ProRes HQ and Lossless Compressed RAW
  • 12 Stops Dynamic Range
  • EF/PL/B4/ or No Mount
  • Two CFast 2.0 Slots for Media Recording
  • 1 x 10.1” 1920 x 1200 Fold Out Non-Touch Screen
  • 2 x 5” 800 x 480 Touch Screens
  • SDI Video Output: 1 x 12G-SDI 10-bit 4:2:2. 1 x 3G-SDI down converted for external monitoring
  • SDI Video Input: 1 x 12G-SDI
  • Ref Input: 1 x Reference Input
  • Timecode In/Out
  • 2 XLR Inputs
  • 2 SDI Audio Out
  • Headphone Jack
  • 1 x 2.5mm LANC for Rec Start/Stop, Iris Control and Focus.
  • Power: 12V 4-pin XLR In/Out (Can take battery plates for Gold Mount and V Mount)
  • Availability: July?
  • Price: $6,000 for EF, $6,500 for PL23 24

Blackmagic also seeks entry into the broadcast-camera market with its newly announced Studio Camera, available in Full HD and 4K (Ultra HD) models. Designed for live broadcast applications, the Blackmagic Studio Camera sports a unique design with a massive 10″ LCD screen, a built-in 4-hour battery, and a set of features you’d expect to see in large studio cameras, such as built-in talkback and tally indicators. Intended to meet the needs of a variety of live broadcast applications, it provides the connections those environments require: SDI (3G on the HD version and 12G on the 4K version) and optical fiber video inputs/outputs, XLR audio connections, reference, LANC remote control, and a 4-pin XLR power input. The camera features an active Micro Four Thirds lens mount that is compatible with a wide range of glass via third-party adapters, opening the door to everything from common DSLR lenses to PL-mount cinema lenses and even B4 ENG lenses.25

SOLOSHOT: The surprisingly affordable SoloShot 2 ($399) will follow a tracker that someone can wear, or that you can slap on something, so you don’t have to do a thing. Put on the tracker, set up your camera with the SoloShot 2, and catch a wave with the perfect video. It features vertical tracking, automatic zoom, and the kit even includes a tripod to get you started. It’s got a range of up to 2,000 feet and 360-degree horizontal tracking.26


BRUSHLESS GIMBALS: Gimbi is a lightweight, easy-to-carry, simple-to-use, power-and-go, 2-axis handheld brushless gimbal for the GoPro. With Gimbi™, you can shoot videos and photos as smooth as the pros.
Key Features
– Adjustable cellphone stand permits use of cellphone as monitor
– Super-smooth tilt control with thumb pad (Controllable pitch 90 degrees)
– Increased auto leveling accuracy and battery efficiency due to built-in brushless motor encoders.
– 2 hour use time on one charge
– Includes four rechargeable batteries and battery charger27


JIGABOT: Jigabot’s AIMe is a pill-shaped tripod mount that automatically follows your subject—keeping it in frame—in case you’re shooting video by yourself. It uses infrared markers and swivels and tilts using complex algorithms powered by a quad-core ARM processor.28


CEREVO: Cerevo’s LiveWedge ($999) provides easy control via a smartphone/tablet app. The app’s rotary control enables slow transitions, which are more difficult with a physical T-Bar. LiveWedge supports PiP and chroma key as well as all the basic transitions such as wipe, fade and cut. LiveWedge has an SD card slot and users can record 1080/30p (H.264) Full HD video to it while switching! You can also use videos and images from the SD card as a video source. Streaming is built in: 720/30p HD live streaming and 1080p HD video switching are available in one device! Ustream, YouTube Live and your own servers are all supported.29


PESA: PESA showed their brand new Xstream Live Streaming mobile solution, co-developed by Ryerson students. They also received the NewBay Media Best of Show Award at NAB.


COMREX: Comrex LiveShot™ delivers live video over a range of IP networks. LiveShot is used by TV stations and networks to deliver high-quality, low-latency video from anywhere Internet access is available, and it is especially optimized to perform well on challenging IP networks like 3G, 4G and satellite links. For optimal video quality, LiveShot encodes with H.264 High profile. In addition to standard AAC audio coding, LiveShot utilizes HE-AAC and AAC-ELD, both reducing network bandwidth and lowering delay; it can encode and decode an audio/video stream with less than 200 ms delay. LiveShot delivers full-duplex video and stereo audio between the field portable and studio rackmount systems. In addition, a full-duplex cue channel is available between the portable and studio units. On the portable, the return audio/video channel is delivered via output connectors, and the cue channel is accessible via a wired headset or Bluetooth to a wireless headset.30


PANASONIC: The Lumix GH4 camera body, with its 16MP CMOS Micro Four Thirds sensor, will cost $1700, while the optional YAGH pro audio/video interface unit is an extra $2,000. The GH4 can shoot 4K at 30/25/24fps at 100Mbps using ALL-Intra compression; at 1080p that rises way beyond broadcast standard to 200Mbps. There are two 4K formats available too: the standard 3840 x 2160 resolution at 30/25/24p, or the cinema widescreen 4096 x 2160 resolution at 24p only. When writing to SD card the camera captures 4K video with 8-bit colour and the data rate is limited to 100Mbps. Use an optional accessory (the Panasonic DMW-YAGH, which is about as big as the GH4 body) and its four SDI ports can be used in tandem to extract uncompressed 4K at 10-bit colour. Power input, independent volume adjustment and twin XLR sockets ensure everything a broadcast pro needs is here, but only via the DMW-YAGH.31

The HX-A500 shoots at a resolution of 3840×2160, i.e. Ultra HD. Sub-4K resolutions include 1080p up to 50p and 720p up to 100p. Unsurprisingly, it shoots to an MPEG-4 AVC/H.264 codec in an .mp4 wrapper.

The camera has a perhaps slightly disappointing variable bit rate, half that of the GoPro Hero 3+. Here’s the breakdown:

  • 3840×2160/25p (Max. 72Mbps / VBR)
  • 1920×1080/50p (Max. 28Mbps / VBR)
  • 1920×1080/25p (Average 15Mbps / VBR)
  • 1280×720/50p (Average 15Mbps / VBR)
  • 1280×720/25p (Average 9Mbps / VBR)

The camera has a fixed-focal-length, fixed f/2.8 aperture lens. It has a few white balance presets, including Auto / Indoor1 / Indoor2 / Sunny / Cloudy / White set. The shutter is listed as variable, from 1/25th to 1/12000. The HX-A500 has a built-in image stabilizer, with an angle of view currently listed as only 160°.32


JVC: JVC has now also entered the large-sensor market, and this intriguing little camera covers Super 35mm on an MFT mount. In terms of specs the JVC GY-LSX2 has some really intriguing figures to offer. Not only is it very small and looks very ergonomic to handle, but it offers 4K at frame rates up to 30p, as well as a slow-motion feature at 2K resolution that goes up to 240fps. The footage is recorded internally with an H.264-type codec. The GY-LSX2 is announced with a price point “under $6000″ and is due at the end of 2014.33


The bigger brother, called the GY-LSX1, will feature a higher frame rate (60p) at 4K resolution, offer a shoulder-mount form factor and seems to come in at around twice the price of the smaller one.34


That’s it for now……This year’s buzz words: 4K, UHDTV, HEVC, H.265, OTT (Over The Top)….see you all next year :-)

Streaming 1080P video using Raspberry Pi (or BeagleBone Black)

datePosted on 21:24, November 9th, 2013 by Many Ayromlou

I’ve finally got this project to a point where I can do a write-up on it. The following hardware is needed:

  1. Raspberry Pi 512MB version (or BeagleBone Black)
  2. Logitech C920 Webcam
  3. 16 GB micro SDHC card (can probably do it on 8GB too)
  4. Wireless dongle supported by linux (I’m using a TrendNet TEW-645UB which was pretty much plug and play)

The goal of this project is to get the following installed and configured:

  1. CRTMP streaming server
  2. C920 install and config (v4l2), ffmpeg installation, boneCV installation from Derek Molloy’s site
  3. configuring ddclient for dynamic DNS (optional)
  4. putting it all together and creating a webpage with embedded JWplayer to view the stream
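Step 4’s player page can be mocked up ahead of time. A minimal sketch (the jwplayer.js library has to be downloaded from jwplayer.com separately, and the host name and stream key below are placeholders, not values from this setup):

```shell
# Sketch of step 4: write a bare-bones page that embeds JW Player and points
# it at crtmpserver's RTMP endpoint. Host name and stream key are placeholders.
cat > player.html <<'EOF'
<html>
  <head><script src="jwplayer.js"></script></head>
  <body>
    <div id="player">Loading the player...</div>
    <script>
      jwplayer("player").setup({
        file: "rtmp://myhost.example.com/flvplayback/mystream",
        width: 1280, height: 720
      });
    </script>
  </body>
</html>
EOF
```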

UPDATE: Sound works now on BeagleBone Black. On Raspberry you will run into alsa buffer xruns. See below for updated streamVideoRTSP script.

What I still need to figure out is the sound off the camera. At the moment I got buttery smooth 1080P video off the Pi (on wired or wireless connection) running at 5Mb/s but the sound is yet to come.
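Since the C920 compresses to H.264 in hardware, the capture-and-push side of the pipeline can be sketched with ffmpeg alone. This is a rough sketch, not Derek Molloy’s boneCV script: the device path and host are assumptions for your setup, and the target is crtmpserver’s inboundTcpTs acceptor (port 9999 in the services table).

```shell
# Sketch: take the camera's onboard H.264 stream as-is (-c:v copy, so the Pi
# does no transcoding) and hand it to crtmpserver as an MPEG-2 TS over TCP.
DEVICE=/dev/video0
TARGET=tcp://localhost:9999
if [ -c "$DEVICE" ]; then
    ffmpeg -f v4l2 -input_format h264 -video_size 1920x1080 -framerate 30 \
           -i "$DEVICE" -c:v copy -f mpegts "$TARGET"
fi
```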

0) Preparation:

So to prepare, you need to get Linux installed on your Pi or BBB (BeagleBone Black). I used the latest Raspbian for the Pi and the BeagleBone Black Ubuntu Raring 13.04 image for the BBB. Get it installed onto your SD card. If you use a larger-than-8GB SD card, you can follow the procedure below to expand the partition from 8GB to whatever your SD card can hold (mine is a 16GB card) (NOTE: almost all commands need to be executed as root, so do a sudo -i to start with):

  • Use fdisk to see the partition table
    root@debian-armhf:/# fdisk /dev/mmcblk0
    Command (m for help): p
    Disk /dev/mmcblk0: 3947 MB, 3947888640 bytes
    4 heads, 16 sectors/track, 120480 cylinders, total 7710720 sectors
    Units = sectors of 1 * 512 = 512 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0x80000000
            Device Boot      Start         End      Blocks   Id  System
    /dev/mmcblk0p1   *        2048        4095        1024    1  FAT12
    /dev/mmcblk0p2            4096     3751935     1873920   83  Linux
  • In this case we’re expanding partition 2 by first deleting it and then, without writing the partition table, recreating it to span the entire disk (and then writing the new partition table to the SD card). This in effect expands the partition. We will expand the filesystem after reboot.
    Command (m for help): d
    Partition number (1-4): 2
    Command (m for help): p
    Disk /dev/mmcblk0: 3947 MB, 3947888640 bytes
    4 heads, 16 sectors/track, 120480 cylinders, total 7710720 sectors
    Units = sectors of 1 * 512 = 512 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0x80000000
            Device Boot      Start         End      Blocks   Id  System
    /dev/mmcblk0p1   *        2048        4095        1024    1  FAT12
    Command (m for help): 
    Command (m for help): n
    Partition type:
       p   primary (1 primary, 0 extended, 3 free)
       e   extended
    Select (default p): p
    Partition number (1-4, default 2): 2
    First sector (4096-7710719, default 4096): 
    Using default value 4096
    Last sector, +sectors or +size{K,M,G} (4096-7710719, default 7710719): 
    Using default value 7710719
    Command (m for help): p
    Disk /dev/mmcblk0: 3947 MB, 3947888640 bytes
    4 heads, 16 sectors/track, 120480 cylinders, total 7710720 sectors
    Units = sectors of 1 * 512 = 512 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0x80000000
            Device Boot      Start         End      Blocks   Id  System
    /dev/mmcblk0p1   *        2048        4095        1024    1  FAT12
    /dev/mmcblk0p2            4096     7710719     3853312   83  Linux
    Command (m for help): 
    Command (m for help): w
    The partition table has been altered!
    Calling ioctl() to re-read partition table.
    WARNING: Re-reading the partition table failed with error 16: Device or resource busy.
    The kernel still uses the old table. The new table will be used at
    the next reboot or after you run partprobe(8) or kpartx(8)
    Syncing disks.
    root@debian-armhf:/# reboot
  • Once you’re back from reboot we need to expand the filesystem to cover the new partition. Don’t forget to sudo -i to become root.
    root@debian-armhf:/# df
    Filesystem     1K-blocks   Used Available Use% Mounted on
    rootfs           1811704 740184    977824  44% /
    /dev/root        1811704 740184    977824  44% /
    devtmpfs          253920      0    253920   0% /dev
    tmpfs              50816    216     50600   1% /run
    tmpfs               5120      0      5120   0% /run/lock
    tmpfs             101620      0    101620   0% /run/shm
    /dev/mmcblk0p1      1004    474       530  48% /boot/uboot
    root@debian-armhf:/# resize2fs /dev/mmcblk0p2 
    resize2fs 1.42.5 (29-Jul-2012)
    Filesystem at /dev/mmcblk0p2 is mounted on /; on-line resizing required
    old_desc_blocks = 1, new_desc_blocks = 1
    The filesystem on /dev/mmcblk0p2 is now 963328 blocks long.
    root@debian-armhf:/# df
    Filesystem     1K-blocks   Used Available Use% Mounted on
    rootfs           3761680 741096   2851404  21% /
    /dev/root        3761680 741096   2851404  21% /
    devtmpfs          253920      0    253920   0% /dev
    tmpfs              50816    216     50600   1% /run
    tmpfs               5120      0      5120   0% /run/lock
    tmpfs             101620      0    101620   0% /run/shm
    /dev/mmcblk0p1      1004    474       530  48% /boot/uboot

1) CRTMP Server installation

CRTMP server is a streaming media server very similar to Wowza. I use it as the target for ffmpeg and for playback through JWplayer. Grab the source code and follow the instructions for installing it on Ubuntu. You will need to create your build environment before you start building, so run the commands below to set up your environment and get the prerequisites installed:

  • Install needed packages
    root@ubuntu-armhf:/# apt-get install g++ subversion cmake make libssl-dev
  • Run these two additional commands
    root@ubuntu-armhf:/# apt-get install libcurl4-openssl-dev pkg-config
    root@ubuntu-armhf:/# pkg-config openssl --cflags --libs
  • Make sure that the last command’s output was “-lssl -lcrypto”
  • Fetch the latest repo version of CRTMP
    cd /opt
    svn co --username anonymous --password "" crtmpserver
    cd /opt/crtmpserver/builders/cmake/cmake_find_modules
  • Edit the file Find_openssl.cmake and add the following path to the PATHS sections of ssl, crypto and z, above the /usr/lib64 line:
  • Edit the file Find_pcap.cmake and add the following path to the PATHS section of pcap, above the /usr/lib64 line:
  • Edit the file Find_dl.cmake and add the following path to the PATHS section of dl, above the /usr/lib64 line:
  • Edit the file Find_lua.cmake and add the following path to the PATHS section of lua, above the /usr/lib64 line:
  • Might have to do this if locale is giving you grief (Note: I’ve picked en_CA since I live in Canada, you need to pick the right one for your country/region):
    root@ubuntu-armhf:/opt/crtmpserver/builders/cmake/cmake_find_modules# sudo locale-gen en_CA en_CA.UTF-8
    Generating locales...
    en_CA.ISO-8859-1... done
    en_CA.UTF-8... done
    Generation complete.
    root@ubuntu-armhf:/opt/crtmpserver/builders/cmake/cmake_find_modules# dpkg-reconfigure locales
    Generating locales...
    en_CA.ISO-8859-1... up-to-date
    en_CA.UTF-8... up-to-date
    en_US.UTF-8... done
    Generation complete.
  • Start building crtmp
    root@ubuntu-armhf:/opt/crtmpserver/builders/cmake# ./run
  • This process will take a while……go have a couple of coffees and/or snacks

Once this process is finished you’ll end up with the executable in /opt/crtmpserver/builders/cmake/crtmpserver, but that’s not how you run it. First you need a config file: you can edit crtmpserver.lua in /opt/crtmpserver/builders/cmake/crtmpserver, or save a copy of the original and create a new one with the content below (I’ve just cleaned up the original a tiny bit).

-- Start of the configuration. This is the only node in the config file.
-- The rest of them are sub-nodes
configuration=
{
	-- if true, the server will run as a daemon.
	-- NOTE: all console appenders will be ignored if this is a daemon
	daemon=false,
	-- the OS's path separator. Used in composing paths
	pathSeparator="/",

	-- this is the place where all the logging facilities are setted up
	-- you can add/remove any number of locations
	logAppenders=
	{
		{
			-- name of the appender. Not too important, but is mandatory
			name="console appender",
			-- type of the appender. We can have the following values:
			-- console, coloredConsole and file
			-- NOTE: console appenders will be ignored if we run the server
			-- as a daemon
			type="coloredConsole",
			-- the level of logging. 6 is the FINEST message, 0 is FATAL message.
			-- The appender will "catch" all the messages below or equal to this level
			-- bigger the level, more messages are recorded
			level=6
		},
		{
			name="file appender",
			type="file",
			level=6,
			-- the file where the log messages are going to land
			fileName="/tmp/crtmpserver"
		}
	},

	-- this node holds all the RTMP applications
	applications=
	{
		-- this is the root directory of all applications
		-- usually this is relative to the binary executable
		rootDirectory="applications",

		--this is where the applications array starts
		{
			-- The name of the application. It is mandatory and must be unique
			name="appselector",
			-- Short description of the application. Optional
			description="Application for selecting the rest of the applications",
			-- The type of the application. Possible values are:
			-- dynamiclinklibrary - the application is a shared library
			protocol="dynamiclinklibrary",
			-- Tells the server to validate the client's handshake before going further.
			-- It is optional, defaulted to true
			validateHandshake=false,
			-- This flag designates the default application. The default application
			-- is responsable of analyzing the "connect" request and distribute
			-- the future connection to the correct application.
			default=true,
			acceptors=
			{
				{ip="0.0.0.0", port=1935, protocol="inboundRtmp"},
				{ip="0.0.0.0", port=8081, protocol="inboundRtmps"},
				{ip="0.0.0.0", port=8080, protocol="inboundRtmpt"}
			}
		},
		{
			name="flvplayback",
			description="FLV Playback Sample",
			protocol="dynamiclinklibrary",
			validateHandshake=false,
			-- this is the folder from where the current application gets its content.
			-- It is optional. If not specified, it will be defaulted to:
			-- <rootDirectory>/<name>/mediaFolder
			-- mediaFolder="/some/directory/where/media/files/are/stored",
			-- the application will also be known by these names. It is optional
			aliases={"simpleLive", "vod", "live"},
			acceptors=
			{
				{ip="0.0.0.0", port=6666, protocol="inboundLiveFlv", waitForMetadata=true},
				{ip="0.0.0.0", port=9999, protocol="inboundTcpTs"},
				{ip="0.0.0.0", port=10000, protocol="inboundUdpTs"},
				{ip="0.0.0.0", port=554, protocol="inboundRtsp"}
			}
		},
		{
			name="samplefactory",
			description="Sample custom protocol factory",
			protocol="dynamiclinklibrary",
			acceptors=
			{
				{ip="0.0.0.0", port=8989, protocol="httpEchoProtocol"},
				{ip="0.0.0.0", port=8988, protocol="echoProtocol"}
			}
		},
		{
			name="vptests",
			description="Variant protocol tests",
			protocol="dynamiclinklibrary"
		},
		{
			name="admin",
			description="Application for administering",
			protocol="dynamiclinklibrary",
			acceptors=
			{
				{ip="0.0.0.0", port=1112, protocol="inboundJsonCli", useLengthPadding=true}
			}
		},
		{
			name="proxypublish",
			description="Application for forwarding streams to another RTMP server",
			protocol="dynamiclinklibrary",
			acceptors=
			{
				{ip="0.0.0.0", port=6665, protocol="inboundLiveFlv"}
			}
		},
		{
			name="stresstest",
			description="Application for stressing a streaming server",
			protocol="dynamiclinklibrary"
		},
		{
			name="vmtests",
			description="An application demonstrating the use of virtual machines",
			protocol="dynamiclinklibrary"
		}
	}
}
Once you have this saved (or modified yours to look like this) you can go ahead and try to start the server with the following command (NOTE: you need to be in the cmake directory (rather than crtmpserver) and reference the files with partial paths……not sure why……something to do with cmake’s base directory).

cd /opt/crtmpserver/builders/cmake/
./crtmpserver/crtmpserver ./crtmpserver/crtmpserver.lua

You should get output similar to this:

/crtmpserver/src/crtmpserver.cpp:216 C++ RTMP Media Server ( version 1.1_rc1 build 808 - Gladiator - (built on 2013-09-28T21:19:24.000)
/crtmpserver/src/crtmpserver.cpp:219 OS files descriptors count limits: 4096/4096
/crtmpserver/src/crtmpserver.cpp:221 Initialize I/O handlers manager: epoll without timerfd_XXXX support
/crtmpserver/src/crtmpserver.cpp:224 Configure modules
/crtmpserver/src/crtmpserver.cpp:230 Plug in the default protocol factory
/crtmpserver/src/crtmpserver.cpp:237 Configure factories
/thelib/src/configuration/module.cpp:97 Loaded factory from application samplefactory
/crtmpserver/src/crtmpserver.cpp:243 Configure acceptors
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 0->1 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 1->2 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 2->3 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 3->4 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 4->5 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 5->6 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 6->7 IOHT_UDP_CARRIER
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 7->8 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 8->9 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 9->10 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 10->11 IOHT_ACCEPTOR
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 11->12 IOHT_ACCEPTOR
/crtmpserver/src/crtmpserver.cpp:249 Configure instances
/crtmpserver/src/crtmpserver.cpp:255 Start I/O handlers manager: epoll without timerfd_XXXX support
/crtmpserver/src/crtmpserver.cpp:258 Configure applications
/thelib/src/configuration/module.cpp:177 Application admin instantiated
/thelib/src/configuration/module.cpp:177 Application appselector instantiated
/thelib/src/configuration/module.cpp:177 Application flvplayback instantiated
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 12->13 IOHT_TIMER
/thelib/src/configuration/module.cpp:177 Application proxypublish instantiated
/thelib/src/netio/epoll/iohandlermanager.cpp:120 Handlers count changed: 13->14 IOHT_TIMER
/thelib/src/configuration/module.cpp:177 Application samplefactory instantiated
/thelib/src/configuration/module.cpp:177 Application stresstest instantiated
/thelib/src/configuration/module.cpp:177 Application vptests instantiated
/crtmpserver/src/crtmpserver.cpp:264 Install the quit signal
|                                                                     Services|
| c |      ip       | port|   protocol stack name   |     application name    |
|tcp|| 1112|           inboundJsonCli|                    admin|
|tcp|| 1935|              inboundRtmp|              appselector|
|tcp|| 8081|             inboundRtmps|              appselector|
|tcp|| 8080|             inboundRtmpt|              appselector|
|tcp|| 6666|           inboundLiveFlv|              flvplayback|
|tcp|| 9999|             inboundTcpTs|              flvplayback|
|udp||10000|             inboundUdpTs|              flvplayback|
|tcp||  554|              inboundRtsp|              flvplayback|
|tcp|| 6665|           inboundLiveFlv|             proxypublish|
|tcp|| 8989|         httpEchoProtocol|            samplefactory|
|tcp|| 8988|             echoProtocol|            samplefactory|
|tcp|| 1111|    inboundHttpXmlVariant|                  vptests|
/crtmpserver/src/crtmpserver.cpp:276 GO! GO! GO! (2368)

So far so good……the server is up and running now. You can stop it using Ctrl-C. Let's continue……
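
Since each of the services in that table binds its own port, a quick way to confirm from a second shell that the server really came up is to probe a few of the ports crtmpserver.lua declares. This is just a sketch; adjust the port list to whatever your config actually uses.

```shell
# Probe some of the TCP ports from the services table above
# (RTMP, RTSP, RTMPT, LiveFLV, TS-over-TCP).
for port in 1935 554 8080 6666 9999; do
    if netstat -lnt 2>/dev/null | grep -q ":$port "; then
        echo "port $port: listening"
    else
        echo "port $port: NOT listening"
    fi
done
```

If any port reports NOT listening, recheck the lua config and the startup log before moving on.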

2) C920 install and config (v4l2), ffmpeg installation, boneCV installation from Derek Molloy’s site

The Logitech C920 is a really nice camera. A bit on the expensive side, but incredibly capable, as we'll see. I came across a post from Derek Molloy where he talks about UDP streaming h.264 off the C920 using a BeagleBone. That's when I discovered the magic of this little camera. You see, the camera can provide image data via USB just like any other webcam, but it also has the built-in capability of producing a 3.5 Mb/s CBR video stream encoded in h.264 at 640×480, 1280×720 or 1920×1080. I literally jumped out of my seat when I read this and picked one up from the local Best Buy (about $100). So now, to get this signal in, we need some of the tools that come with the v4l (video4linux) utility package. Here is how you go about it:

  • First we need to install v4l-utils, ffmpeg and git (we need git to grab the code in the next step)
    apt-get install v4l-utils ffmpeg git
  • Then we need to pull down some code Derek has modified (and/or written) from his git repo
    cd /opt
    git clone git://
  • Next we need to get into the boneCV directory and recompile Derek’s capture.c program
    cd /opt/boneCV
    gcc -o capture capture.c
  • Now that we have a fresh capture program, maybe we should stop and let me explain. capture.c is a V4L2 video capture example, modified by Derek Molloy for the Logitech C920 camera. He's added the -F mode for H264 capture and the associated help detail, plus an option to allow capture to grab an infinite number of frames. Before we continue to the next step it's worth trying to visualize the chain we're creating. Capture (capture.c) will be called to put the camera in -F mode (1080p h.264 pre-encoded 3.5Mb/s CBR stream over USB) and to continuously pass the frames to a pipe feeding avconv (a program that comes with ffmpeg), which will not touch the video encoding but will transmit it to an RTSP destination (our crtmp streaming server). Once the stream is running we will use JWplayer to view the RTMP stream. The reason I decided to use JWplayer is that various incarnations of VLC say they support RTMP, but their implementation is really bad. For the longest time while using VLC to view the stream (I think it was version 2.1.x and 2.2.x nightly builds) I had freezes and breakups in the stream and I thought the poor Pi was not doing its job. NO, it was the player; the Pi (and BeagleBone Black) worked wonderfully. So now we need to modify the streamVideoRTSP file Derek has to look like the following (you might want to save the original as .bak or something).
    echo "Video Streaming for the Beaglebone -"
    echo "Piping the output of capture to avconv"
    #1080P mode 
    v4l2-ctl --set-fmt-video=width=1920,height=1080,pixelformat=1
    #720P mode
    #v4l2-ctl --set-fmt-video=width=1280,height=720,pixelformat=1
    # Pipe the output of capture into avconv/ffmpeg
    # capture "-F"   My H264 passthrough mode
    #         "-o"   Output the video (to be passed to avconv via pipe)
    #         "-c0"  Capture 0 frames, which means infinite frames in my program
    # avconv "-i -"  Take the input from the pipe
    #        "-vcodec copy" Do not transcode the video
    #1080P mode
    ./capture -F -o -c0|avconv -re -i - -f alsa -ac 2 -i hw:1,0 -strict experimental -threads 0 -acodec aac -ab 64k -ac 2 -vcodec copy -f rtsp -metadata title=teststream rtsp://
    #720P mode
    #./capture -F -o -c0|avconv -re -i - -f alsa -ac 2 -i hw:1,0 -strict experimental -threads 0 -acodec aac -ab 64k -ac 2 -vcodec copy -f rtsp -metadata title=teststream rtsp://
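
Before wiring the whole chain together, it's worth confirming that your camera really advertises the on-board H.264 format. A hedged sketch using v4l2-ctl (the /dev/video0 node is an assumption; check `ls /dev/video*` on your box):

```shell
# List the pixel formats the camera offers and look for H264.
# /dev/video0 is an assumption; yours may differ.
DEV=${DEV:-/dev/video0}
if v4l2-ctl -d "$DEV" --list-formats 2>/dev/null | grep -q "H264"; then
    echo "H264 available on $DEV"
else
    echo "no H264 format on $DEV (wrong node, or camera lacks the encoder?)"
fi
```

If H264 doesn't show up, the -F passthrough mode in capture.c has nothing to pass through, so sort that out first.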

3) Configuring ddclient for dynamic DNS (optional)

This is totally optional and has no effect on the final product (it just makes life a bit simpler). I'm just going to provide the bare minimum explanation and my config. This process is very dependent on your DNS provider (if you have one); my example config is for my provider. Your mileage will vary :-).

  • First we need to get the ddclient program installed (this is one of the dynamic DNS tools available in linux)
    apt-get install ddclient
  • Then we need to edit the config file located in /etc called ddclient.conf. Here is mine, which is specific to my provider. YOU WILL HAVE TO MODIFY THIS TO SUIT YOUR DNS PROVIDER.
    # Configuration file for ddclient generated by debconf
    # /etc/ddclient.conf
    # updates internet ip on wired
    # Use this if you want to register the interface ip address (ie: you're not behind a NAT, or you don't care)
    use=if, if=eth0,
    # Use this if you want to register your external ip address (ie: you're behind a NAT and want to register your outside IP address, not the internal 192.168.x.x one)
    #use=web,, web-skip='IP Address',
  • If you’ve enabled two-factor authentication on easyDNS (or maybe even if you have not) there is a token you’ll need to get called the “Dynamic Authentication Token”. You can grab yours under the dynamic records page by enabling “Dynamic Authentication Token” and viewing your code. I will use XXXXXXXXXXXXXXXX as my code in the following example (NOTE: the server and password clauses need to be changed):
    # Configuration file for ddclient generated by debconf
    # /etc/ddclient.conf
    # updates internet ip on wired
    # Use this if you want to register the interface ip address (ie: you're not behind a NAT, or you don't care)
    use=if, if=eth0,
    # Use this if you want to register your external ip address (ie: you're behind a NAT and want to register your outside IP address, not the internal 192.168.x.x one)
    #use=web,, web-skip='IP Address',
  • Once the config is there we can restart the ddclient service
    service ddclient restart
  • If you want to check/test/debug your ddclient config, first stop the daemon that’s running in the background, then start it from the command line in the foreground like below
    service ddclient stop
    ddclient -daemon=0 -debug -verbose -noquiet
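
To make the if-vs-web choice above concrete, here's a tiny hypothetical helper that emits the address-detection line for ddclient.conf depending on whether the box sits behind NAT (the provider-specific server/login/password clauses, and the web= checker URL, still have to come from your DNS provider):

```shell
# Emit the address-detection line for /etc/ddclient.conf.
# behind_nat=yes -> ask a web IP checker (provider-specific web= URL
# omitted here); behind_nat=no -> just report eth0's address.
behind_nat=yes
if [ "$behind_nat" = "yes" ]; then
    echo "use=web, web-skip='IP Address'"
else
    echo "use=if, if=eth0"
fi
```

In other words: registering eth0's address is pointless behind NAT, because you'd be publishing a 192.168.x.x address to the world.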

4) Putting it all together and creating a webpage with embedded JWplayer to view the stream

So now you need to grab JWplayer (the free version) and install it (copy the jwplayer folder) into a folder on your webserver (I copied mine into a folder on my blog server). The main file here is the HTML file that has the specifications for the stream in it. You need to create this to suit your needs (ie: if you have dynamic DNS use the DNS name; if you don’t, use the IP address of the Pi/BeagleBone). Your mileage will vary :-).

Screen Shot 2013-11-09 at 4.42.17 PM

The above code is an example; you will need to substitute your own data to get it to work. We need two ssh windows on the Pi (one to run crtmpserver and the other to start capturing and feeding it via the streamVideoRTSP script). So go ahead and start crtmp (see above……we did this as a test in step 1) and in the other window start the streamVideoRTSP script. Those two windows should look like this:

Screen Shot 2013-11-09 at 4.59.39 PM

That’s pretty much it. If you now load the html file for jwplayer and press play (assuming you’ve done everything correctly) the stream should start playing in about 4-5 seconds. The encoding delay in the entire chain is about 2-3 seconds, the quality (considering it’s a webcam feeding a $35 computer) is really good, and given proper power the Pi can stream this 1080p/30 stream without an issue. Just for the fun of it I also (at the same time) fed the crtmp server (on the Pi) a separate quarter-res HD stream (640×360) encoded by ffmpeg on my desktop and yep, no problems (although the Pi is on medium overclocking settings). These RTMP streams can also be very easily scaled by passing them to larger crtmp installations and/or ustream/wowza for rebroadcast. Below you’ll find a bunch of ffmpeg command line entries I used for this second stream and also a quick (optional) write-up on how I got the wireless dongle from TRENDnet to work and configured from the CLI.

./ffmpeg -re -i /Volumes/Qmultimedia/1217209\(73\).avi -vcodec libx264  -b 500000 -s 640x360 -strict experimental -g 25 -me_method zero -acodec aac -ab 96000 -ar 48000 -ac 2 -vbsf h264_mp4toannexb -f mpegts -metadata title=xxx udp://
./ffmpeg -re -i /Volumes/Qmultimedia/1217209\(73\).avi -vcodec libx264  -b 500000 -s 640x360 -strict experimental -g 25 -me_method zero -acodec aac -ab 96000 -ar 48000 -ac 2 -f flv rtmp://
./ffmpeg -re -i /Volumes/Qmultimedia/1217209\(73\).avi -vcodec libx264  -b 500000 -s 640x360 -strict experimental -g 25 -me_method zero -acodec aac -ab 96000 -ar 48000 -ac 2 -f rtsp -metadata title=xxx rtsp://

For wireless I’m using a TrendNet TEW-645UB which is directly supported under linux. Initially I used wpa_cli to get things configured and once the system was configured I massaged the files a bit. Here is a log of the whole thing:

root@picrtmp:~# lsusb
Bus 001 Device 002: ID 0424:9512 Standard Microsystems Corp. 
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
Bus 001 Device 003: ID 0424:ec00 Standard Microsystems Corp. 
Bus 001 Device 004: ID 157e:3013 TRENDnet 
Bus 001 Device 006: ID 046d:082d Logitech, Inc. 
root@picrtmp:~# wpa_cli 
wpa_cli v1.0
Copyright (c) 2004-2012, Jouni Malinen <> and contributors

This program is free software. You can distribute it and/or modify it
under the terms of the GNU General Public License version 2.

Alternatively, this software may be distributed under the terms of the
BSD license. See README and COPYING for more details.

Selected interface 'wlan0'

Interactive mode

> scan
> scan_results
bssid / frequency / signal level / flags / ssid
7c:d1:c3:zz:yy:xx	2411	-50	[WPA2-PSK-CCMP][ESS]	Nerdlogger
7c:d1:c3:zz:yy:xx	2412	-51	[WPA2-PSK-CCMP][ESS]	MaNiAc 2Ghz
8c:7c:b5:zz:yy:xx	2437	-64	[WPA-PSK-CCMP][ESS]	PS3-3313551
> add_network
> set_network 0 ssid "Nerdlogger"
> set_network 0 psk "supersecretpassword"
> enable_network 0
> add_network
> set_network 1 ssid "MaNiAc 2Ghz"
> set_network 1 psk "supersecretpassword"
> enable_network 1
> save_config
> quit
root@picrtmp:~# iwconfig
wlan0     IEEE 802.11bgn  ESSID:"MaNiAc 2Ghz"  
          Mode:Managed  Frequency:2.412 GHz  Access Point: 7C:D1:C3:CA:0F:7A   
          Bit Rate=43.3 Mb/s   Tx-Power=20 dBm   
          Retry  long limit:7   RTS thr:off   Fragment thr:off
          Encryption key:off
          Power Management:on
          Link Quality=59/70  Signal level=-51 dBm  
          Rx invalid nwid:0  Rx invalid crypt:0  Rx invalid frag:0
          Tx excessive retries:0  Invalid misc:1   Missed beacon:0

lo        no wireless extensions.

eth0      no wireless extensions.

root@picrtmp:~# ifconfig
eth0      Link encap:Ethernet  HWaddr b8:27:eb:37:a6:b3  
          inet addr:  Bcast:  Mask:
          RX packets:10397 errors:0 dropped:1 overruns:0 frame:0
          TX packets:5361 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:726543 (709.5 KiB)  TX bytes:918179 (896.6 KiB)

lo        Link encap:Local Loopback  
          inet addr:  Mask:
          UP LOOPBACK RUNNING  MTU:16436  Metric:1
          RX packets:0 errors:0 dropped:0 overruns:0 frame:0
          TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:0 
          RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)

wlan0     Link encap:Ethernet  HWaddr 00:14:d1:cc:16:d2  
          inet addr:  Bcast:  Mask:
          RX packets:1389 errors:0 dropped:41 overruns:0 frame:0
          TX packets:36 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:466517 (455.5 KiB)  TX bytes:4773 (4.6 KiB)

root@picrtmp:~# cat /etc/network/interfaces 
auto lo

iface lo inet loopback
auto eth0
iface eth0 inet dhcp

auto wlan0
allow-hotplug wlan0
iface wlan0 inet manual
	wpa-driver wext
	wpa-roam /etc/wpa_supplicant/wpa_supplicant.conf

iface default inet dhcp

iface work inet dhcp

iface home inet static

root@picrtmp:~# cat /etc/wpa_supplicant/wpa_supplicant.conf 
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev

network={
	ssid="MaNiAc 2Ghz"
	psk="supersecretpassword"
}

NAB2013 Recap….

datePosted on 12:03, April 14th, 2013 by Many Ayromlou
  • DAY 1 (South Hall Upper Floor):
  • Capella:
    • Carina 1 frame latency DMA (Direct Memory Access) Camera and Video processing card
    • Cambria Live Series software
  • NTT Electronics:
    • Realtime H.264/AVC HD codec with 60p and 33 msec latency
    • AVC/H.264 Codec for Contribution and Distribution with 1 frame latency and 16ch/8 PES audio
    • Ultra Low Latency H.264 software Codec using Capella Carina 1 frame latency DMA (Direct Memory Access) Camera and Video processing card
  • Elemental:
    • HEVC H.265 software/hardware decoder
    • 4K Encoding for On Demand and Real-time Delivery
  • Sonic Notify
  • Nimbus
    • Full HD 1080p60 Zero delay (10ms) H.264 Encoder for remote desktop
  • Panopto Video Platform: vCMS with E-learning recorder
  • Angry Moose :-)
  • VanguardVideo:
    • V.265 – World’s first pure software H.265/HEVC real-time encoder
    • H.265/HEVC 4K UltraHD Encode/Decode
  • Epson Curved Edgeblending Projection:
    • Concave or Convex multi projector curved edgeblending
    • HDBaseT based projectors PowerLite Pro G6900WU
    • BrightLink Pro 1410Wi instant Whiteboard solution (no PC required)
  • Middle Atlantic
    • Presentation/Video Conferencing Monitor Pedestal
    • Rack Link Intelligent Power Management
    • New additions to ViewPoint Line of operator consoles
  • Brightcove Video Cloud:
    • Cloud based online video platform (vCMS)
  • Streamstar streaming Innovations:
    • Webcast appliance with instant replay function
  • LiveU
    • Portable Uplink Solution
    • iPhone Uplink Solution with MiFi stick
  • Toshiba
    • On-Air Max Flash – Flash Memory playout server
    • ExaEdge – World’s first SSD-to-IP Direct Streaming Server w/ 64K concurrent sessions
  • DAY 2 (South Hall Lower Floor):
  • Black Magic:
    • Pocket Cinema ($995)
    • Cinema Camera ($2995)
    • Production 4K ($3995)
    • BMCC Rigs
    • BMCC4K Rigs
  • RED labs
  • Christie:
    • World’s first 4K 60Hz DLP Projector
  • AJA:
    • Hi5-4K – 4K SDI to 4K HDMI converter
    • T-Tap – HD-SDI to Thunderbolt converter
    • Avio KVM Extenders – Dual DVI Fiber KVM Extender up to 4Km/2.5mi (Zero Compression, Zero Latency)
    • Maevex Video Over IP Digital Signage solution
    • Matrox Mojito 4K editing solution
    • Matrox MicroQuad – Quad SDI to HDMI multiviewer for 3G/HD/SD
  • Ventuz
    • Real-time graphics content creation for professional presentations, video wall setups, multitouch applications or broadcast graphics
    • Spotted Zebra Ltd
    • MultiTaction Cell Displays – unlimited touch points, including hands, fingers, fingertips, 2D markers, real-life objects 
    • 3Monkeys
  • Haivision
    • Viper
  • Darim
    • Microstation and standLED lights
  • CalDigit
    • VR mini 2
    • Thunderbolt Station
  • Promise Technology
    • FiberChannel to Thunderbolt Interface
    • V-Trak zero server workgroup GFS (VTrakFS) Fiberchannel Storage
  • Samsung Transparent Display
  • LiveStream
  • DIT Station
    • Rogue4 on-set data management and playback workstation
  • SGI (Silicon Graphics)
  • DAY 3 (Central Hall)
  • The PadCaster
  • Portalis
    • The Pro-xi workstation integrator
  • Panasonic
    • Ultra Wide Camera Solutions
    • 4K Camcorder
  • JVC
    • GY-HM650 2.0 ProHD mobile news camera – Dual codec with wifi and optional LTE modem for realtime streaming
  • Codex – The Vault digital recorder
  • DJI
    • Phantom
  • Fraunhofer IIS:
    • Computational Imaging – HDR Camera/Camera Array
    • Multi-view capturing for lightfield computation, HDR and higher frame rates
    • Multi-view capturing for lightfield computation
    • HDR video capturing with a single shot
  • Tascam:
    • DR-60D DSLR External Audio Recorder
    • H2S HDMI to HD-SDI (or vice versa)
    • Samurai Blade
  • Freefly
    • Movi M10
  • Teradek
    • VidiU portable RTMP streamer
  • Syrp genie motion controller for time lapse photography
  • Sony 4K Stitch with Virtual Camera
  • DAY 4 (North Hall)
  • IBM
    • PureData System for Analytics
  • NHK
    • Over-the-Air Transmission of Super Hi-Vision (8K)
    • Super Hi-Vision Cameras
    • Hybrid Cast – Accurate Graphic Rendering synchronized with frame and second screen

Stream your Windows desktop using ffmpeg

datePosted on 10:26, November 3rd, 2011 by Many Ayromlou

I’ve already covered how to do this with vlc a while back in parts 1 followed by part 2. I just found out that something very similar in results can be done with ffmpeg. ffmpeg has recently added support for directshow filters which now allows one to capture the screen and stream and/or save it. Here is how you can do this:

1.) Grab a copy of the Screen Capture DirectShow source filter from Unreal Streaming Technologies. It’s about half way down that page. They have both the UScreenCapture X86 Edition and the X64 Edition (depending on your OS installation). I used the 64 bit filter on a Windows 7 64 bit installation.

2.) Install the filter and make sure you make the following changes to your windows registry using regedit. The default frame rate for UScreenCapture filter is 10 f/s and we need to boost this to 30 frames/sec. You need to find the key HKLM\SOFTWARE\UNREAL\Live\UScreenCapture and insert a DWORD value of 30 for FrameRate (You have to create FrameRate, it does not exist by default). Once you’ve done the registry tweak, reboot.

3.) Install the latest greatest version of ffmpeg for your windows version from Zeranoe. I grabbed the 64 bit Static build since I didn’t want to deal with libraries and such. Extract it and stick it somewhere on your hard drive. Remember the path to this folder since we will need it later.

4.) Open a command line window and cd to the directory where you extracted ffmpeg into, find the bin directory and cd into it. This is where the ffmpeg executable resides. In my case (I extracted the ffmpeg files into “Program Files” directory) it is C:\Program Files\ffmpeg-git-059707e-win64-static\bin.

5.) If you’ve made it this far, hang in there, we’re almost home. Now you need to issue the command that gets the screen streaming going. But first we need to find out the name of the screen capture device. So issue the following command:

ffmpeg -list_devices true -f dshow -i dummy

In the output look for a device called “UScreenCapture”. Hopefully, if everything is working with the directshow filter, you have an entry in the list. That’s the name of the device that we need to pass on to ffmpeg. While you’re there, also look for your audio device entry. Mine was the truncated string “Stereo Mix (Realtek High Defini” (yes, mine was missing the end of that line). Jot that down somewhere as well. I will show you how to get audio going too.

6.) So first step is to get video going. Assuming you have a “UScreenCapture” device (You could use another directshow filter if you like, this will work with most of them. I just used the Unreal filter for the heck of it), here is the command to start encoding and sending video:

ffmpeg -f dshow  -i video="UScreenCapture"  -r 30 -vcodec mpeg4 -q 12 -f mpegts udp://aaa.bbb.ccc.ddd:6666?pkt_size=188?buffer_size=65535
  • -f dshow specifies that you’re going to be using a directshow device as your input.
  • -i video=”UScreenCapture” is the name of the input directshow device which we picked up in step 5.
  • -r 30 is the frame rate.
  • -vcodec mpeg4 is our video codec of choice.
  • -q 12 is a quality measure for the encoding process (1 is the best and 30 the worst). We’re doing VBR encoding so this measures the compression ratio vs. picture quality.
  • -f mpegts is our output filetype. In this case mpeg-2 transport stream. Yes, we’re encapsulating mpeg4 video inside a mpeg-2 transport stream…..why?….google it.
  • udp://aaa.bbb.ccc.ddd:6666?pkt_size=188?buffer_size=65535 this last bit specifies the address and port number of the recipient machine (aaa.bbb.ccc.ddd is the ip address of that machine and 6666 is my arbitrary port number). We’re also instructing ffmpeg to create smaller 188 byte size udp packets (which is the size of the transport stream packets) to decrease latency and our buffer size is 64kb.
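
On the pkt_size point: 188 bytes is exactly one transport-stream packet, which minimizes latency but maximizes per-packet overhead. If you can tolerate slightly more buffering, a common alternative is to batch as many whole TS packets as fit in one Ethernet frame. A quick back-of-the-envelope:

```shell
# How many whole 188-byte TS packets fit in a standard 1500-byte MTU
# once the IP (20 bytes) and UDP (8 bytes) headers are subtracted?
mtu=1500
payload=$((mtu - 20 - 8))        # 1472 bytes of UDP payload
pkts=$((payload / 188))          # 7 whole TS packets
echo "pkt_size=$((pkts * 188))"  # the common low-overhead choice: 1316
```

Trade-off: 1316-byte datagrams cut the header overhead roughly sevenfold, at the cost of buffering up to 7 packets before sending.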

7.) On the receiving machine you should be able to use vlc, ffmpeg or mplayer to catch the stream. In vlc simply open the network stream udp://@:6666 , with ffmpeg you can use the command ffplay -i udp://:6666 , or using mplayer you can issue the command mplayer -framedrop -double udp://:6666 .

8.) Now to optionally add sound to the whole thing we can use this command on the encoding machine (instead of step 6). You need to know the device name for your sound card and you probably want to turn the volume down (at least initially) on the decoding machine.

ffmpeg -f dshow  -i video="UScreenCapture" -f dshow -i audio="Stereo Mix (Realtek High Defini" -r 30 -vcodec mpeg4 -q 20 -acodec libmp3lame -ab 128k -f mpegts udp://
  • -f dshow specifies that you’re going to be using a directshow device as your input (VIDEO).
  • -i video=”UScreenCapture” is the name of the input directshow video device which we picked up in step 5.
  • -f dshow specifies that you’re going to be using a directshow device as your input (AUDIO).
  • -i audio=”Stereo Mix (Realtek High Defini” is the name of the input directshow audio device which we picked up in step 5.
  • -r 30 is the frame rate.
  • -vcodec mpeg4 is our video codec of choice.
  • -q 20 is a quality measure for the encoding process (1 is the best and 30 the worst). We’re doing VBR encoding so this measures the compression ratio vs. picture quality. I went with 20 instead of 12 from step 6 since the audio encoding slows the machine down a bit.
  • -acodec libmp3lame is our audio codec of choice
  • -f mpegts is our output filetype. In this case mpeg-2 transport stream. Yes, we’re encapsulating mpeg4 video inside a mpeg-2 transport stream…..why?….google it.
  • udp://aaa.bbb.ccc.ddd:6666?pkt_size=188?buffer_size=65535 this last bit specifies the address and port number of the recipient machine (aaa.bbb.ccc.ddd is the ip address of that machine and 6666 is my arbitrary port number). We’re also instructing ffmpeg to create smaller 188 byte size udp packets (which is the size of the transport stream packets) to decrease latency and our buffer size is 64kb.

How to stream live HDV/DV to iphone…..

datePosted on 13:36, March 5th, 2010 by Many Ayromlou

In this guide I’ll show you how to stream live HDV/DV video to your iPhone using a linux box (Ubuntu 9.10) with firewire input running vlc/ffmpeg and an iMac with OS X 10.6.2 running mediastreamsegmenter and apache2.

Start out with the iPhone streaming media overview. Without understanding this document you’ll have a hard time getting things working.

First things first, you need to have a working Ubuntu 9.10 machine. I’m using a small footprint 2.4Ghz Core2Duo machine with a PCI FireWire 400 card in it. For video input I’m using a Canon HV30 set to HDV mode (1080i/60) connected via firewire.

Next you need to follow the instructions on this page (steps 0-5) to get a working ffmpeg with x264 and aac encoding. Without this working you’re not going anywhere….sorry. If you’re trying this on a different Ubuntu installation follow the other links to get a working ffmpeg setup.

Then install vlc using “sudo apt-get install vlc“. I used vlc as my encoder frontend as I understand it better than ffmpeg. You can use just straight ffmpeg as well if you can figure out how to get it to encode the live HDV stream over firewire.

You’ll also need the dvgrab utility. Install it using “sudo apt-get install dvgrab“.

Now we want to make sure the internal firewire module is working so type this command and see if you get a vlc window with the camera output in it (make sure you turn the camera ON and hook it up first).
sudo dvgrab -f hdv -noavc -nostop -|vlc -
You have to use sudo under ubuntu to get proper access to the firewire device. The above command runs dvgrab with the hdv format and makes sure that 1394 AV/Device control is turned off (this way you can be in Camera mode and get a live feed). The nostop switch prevents dvgrab from sending stop commands to the camera every time you stop it via Ctrl-C, which I thought was a good thing. The last dash forces dvgrab to output to stdout, which we then pipe into vlc (the dash for vlc tells it to use stdin as input).

Next we need to create a media stream out of our linux box and ship it over UDP to the Imac. The vlc command below gets the job done. Remember you’re sudo’ing and need to provide the password after you enter the command.
sudo dvgrab -f hdv -noavc -nostop -|vlc - --sout '#transcode{vcodec=h264,vb=256,venc=x264{aud,profile=baseline,level=30,keyint=30,bframes=0,ref=1,nocabac},acodec=mp4a,ab=64,scale=0.25,deinterlace,width=320,height=240}:duplicate{dst=std{access=udp,mux=ts,dst=}}'
The IP address toward the end of the command is the IP of the iMac receiving the stream. Port 1234 is arbitrary. The stream is comprised of h.264 video @ 256K and AAC audio @ 64K. Those elementary streams are then packaged in an MPEG-2 transport stream before being shipped to the iMac. This is the standard way of doing HTML5 video (from what I understand).

So now we can go over to the mac and see if we receive the video stream. For that just run VLC for OSX and open a UDP network stream on port 1234 (udp://). If things are working nicely you should see a 320×240 video from your HDV camera on the iMac.

Now that we have the video on the mac, we need to use the “mediastreamsegmenter” command line tool to create an HTML5 video stream out of it. mediastreamsegmenter listens on a UDP port for an incoming transport stream, chops it (by default) into 10 sec. “mini” transport stream files, and writes these mini files to wherever you tell it. This location is important since it needs to be accessible to your webserver. Remember, at the end of the chain (day), the webserver is doing all the heavy lifting of delivering the mini transport stream files to your iPhone. mediastreamsegmenter also produces a file of type .m3u8, which is basically a continuously updated playlist.
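
To make the playlist idea concrete, here's a hypothetical snapshot of a live .m3u8 (the segment names and sequence number are illustrative, not copied from a real run); the segmenter rewrites this file as new mini .ts files land:

```
#EXTM3U
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:240
#EXTINF:10,
fileSequence240.ts
#EXTINF:10,
fileSequence241.ts
#EXTINF:10,
fileSequence242.ts
```

The player keeps re-fetching this small text file and pulls whatever segments it lists, which is why a plain webserver is all the delivery infrastructure you need.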

Something you might not know is that Apple ships standard OS X with Apache built in. All you have to do is use the following command to get it started.
apachectl start
Now point your browser on the mac to localhost and see if it loads a page. Now that apache is working we need to modify it so it knows how to deal with .ts and .m3u8 files. This involves adding a couple of lines to /etc/apache2/httpd.conf
AddType application/x-mpegURL .m3u8
AddType video/MP2T .ts

and /etc/apache2/mime.types
.m3u8 application/x-mpegURL
.ts video/MP2T

Next we need to restart apache
apachectl restart
By default OS X Apache is set up to load its documents from /Library/WebServer/Documents, so I created a directory called “stream” to contain the media stuff (.ts files and the .m3u8 file) and put the following into the index.html file in /Library/WebServer/Documents.
<html>
<head>
<title>Video Test</title>
<meta name="viewport" content="width=320; initial-scale=1.0; maximum-scale=1.0; user-scalable=0;"/>
</head>
<body style="background-color:#FFFFFF; ">
<video width='320' height='240' src="prog_index.m3u8" controls autoplay> </video>
</body>
</html>

Now that we’ve got all the prep done on the mac, we issue the following command from terminal window to get the mediastreamsegmenter going (details can be found by using man mediastreamsegmenter).
mediastreamsegmenter -b -f /Library/WebServer/Documents/stream
Here -b specifies the base of the URL that will be encoded into the .m3u8 file (this is the IP address of your Imac, stream is the folder in /Library/WebServer/Documents/ where the mini .ts files and the .m3u8 file are). The -f switch specifies the output directory for the mini .ts files and the .m3u8 file. and the last IP address:port is from your Linux box.

Now you should be able to open up your browser on your iPhone/iPod touch and punch in (assuming the iMac is reachable from your phone) and see the streaming video (you might have to turn on the “Plugins” feature under Settings/Safari on your device; mine was turned off and drove me crazy until I figured it out). If Plugins is turned off, the index.html page will load, but no video.

Hopefully there is enough meat here to get you guys started……btw. I hear the following command (or variations of it) can be used on the linux side (instead of vlc). I haven’t tried it and can’t confirm if it works.
ffmpeg -i <in file> -f mpegts -acodec libmp3lame -ar 48000 -ab 64k -s 320x240 -vcodec libx264 -b 96k -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate 96k -bufsize 96k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 320:240 -g 30 -async 2 <output file>

Some excellent information can also be found on Carson McDonald's blog.

Movist…..will it be the next VLC for mac….

datePosted on 13:51, February 5th, 2010 by Many Ayromlou

The answer to that is maybe; we'll see. But all that aside, if you're interested in a minimalist video player that can handle more codecs than VLC and is generally faster, give Movist a try.

The unique thing about the player is that it lets you switch codec engines between FFmpeg and QuickTime based on file extension. Oh, and did I mention it's FREE as well :-).

DisplayPort does true 4K video….plus other stuff….

datePosted on 12:16, January 19th, 2010 by Many Ayromlou

Well, it's taken a bit of time, but I think with the announcement of the DP 1.2 spec last December, the standard (and hopefully soon the vendors) is ready for true Digital Cinema in the home. The DisplayPort connector supports 1, 2, or 4 data pairs in a main link that also carries clock and optional audio signals, each with a symbol rate of 1.62, 2.7, or 5.4 Gbit/s. The video signal path supports 6 to 16 bits per color channel. This allows the updated DisplayPort 1.2 specification to drive a 4K x 2K display (3840×2160) with 30 bits per pixel, and 3D, over a single 2 m cable.

DP 1.2 supports a maximum of 5.4 Gbps per lane (double the 2.7 Gbps of DP 1.1a), and with the 4 lanes in a standard cable that's a whopping 21.6 Gbps of throughput, more than enough for 10-bit 4xHD resolution (3840×2160). For a single display, this enables up to 3840 x 2400 maximum resolution at 60 Hz, or a 3D display (120 Hz) at 2560 x 1600.
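A quick sanity check of those numbers (a back-of-the-envelope sketch; the 8b/10b line-coding factor and the decision to ignore blanking overhead are my assumptions, not figures from the spec text above):

```python
# DP 1.2 link: 4 lanes at 5.4 Gbit/s each, 8b/10b coded.
lanes, raw_per_lane = 4, 5.4
raw_total = lanes * raw_per_lane            # 21.6 Gbit/s raw
effective = raw_total * 8 / 10              # ~17.28 Gbit/s of payload

# 3840x2160 at 60 Hz with 30 bits per pixel (blanking ignored).
payload = 3840 * 2160 * 60 * 30 / 1e9       # ~14.93 Gbit/s
print(round(raw_total, 1), round(effective, 2), round(payload, 2))
# -> 21.6 17.28 14.93 : the 30-bit 4K/60 Hz stream fits on the link
```

Even after the coding overhead, the 4K stream leaves a couple of Gbit/s to spare, which is consistent with the claim.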

Display Port 1.2’s massive data rate will enable Multiple-Streaming, support for stereoscopic images beyond full HD, a high-speed data channel, and support for mini connectors.

Multi-Streaming is the ability to transport multiple independent uncompressed display and audio streams over a single cable. This enables the use of multiple monitors connected by cable in a daisy-chain or hub configuration. Whereas the current DisplayPort v1.1a standard can support one 2560 x 1600 monitor at 60Hz, DisplayPort v1.2 can support two such monitors with one port, or four 1920 x 1200 monitors.
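The monitor counts can be sanity-checked the same way (again a sketch: 24-bit colour, 60 Hz, 8b/10b coding, and no blanking overhead are my assumptions, which is why the payload-only math allows one more 1920 x 1200 stream than VESA quotes):

```python
effective = 4 * 5.4 * 8 / 10                # ~17.28 Gbit/s of payload

def stream_gbps(w, h, hz, bpp=24):
    """Uncompressed video payload in Gbit/s, blanking ignored."""
    return w * h * hz * bpp / 1e9

per_big = stream_gbps(2560, 1600, 60)       # ~5.90 Gbit/s per monitor
per_small = stream_gbps(1920, 1200, 60)     # ~3.32 Gbit/s per monitor
print(int(effective // per_big), int(effective // per_small))
# -> 2 5; blanking overhead trims the second figure to the quoted four
```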

Another new feature is the ability to support high-speed, bi-directional data transfer, allowing USB 2.0 or Ethernet data to be carried within a standard DisplayPort cable. For DisplayPort v1.2, the maximum data rate of this “AUX” channel has been increased from 1 Mbps to 720 Mbps, providing suitable bandwidth for USB 2.0. The DisplayPort cable can therefore carry USB data to and from the display, in addition to the video and audio information. Standard Ethernet can also be transported in the DisplayPort cable.

On the audio front DisplayPort v1.2 adds the following new enhancements:

  • Audio Copy Protection and category codes
  • High definition audio formats such as Dolby MAT, DTS HD, all Blu-Ray formats, and the DRA standard from China
  • Synchronization assist between audio and video, multiple audio channels, and multiple audio sink devices using Global Time Code (GTC)

DisplayPort v1.2 also includes improved support for Full HD 3D Stereoscopic displays:

  • Life-like motion using up to 240 frames-per-second in full HD, providing 120 frames-per-second for each eye
  • 3D Stereo transmission format support 
    • Field sequential
    • Side by side
    • Pixel interleaved
    • Dual interface
    • Stacked
  • 3D Stereo display capability declaration
    • Mono
    • Stereo
    • 3D Glasses

The only thing on my wish-list that they (VESA) omitted is support for true 4K DCinema (4096×2160) resolution. But I guess you can't have everything… and there is always DP 1.3 :-).

OpenShot Video Editor 1.0 released…..iMovie for Linux is here.

datePosted on 13:24, January 14th, 2010 by Many Ayromlou

For those of you who don’t know OpenShot Video Editor(TM) is an open-source program that creates, modifies, and edits video files. OpenShot provides extensive editing and compositing features, and has been designed as a practical tool for working with high-definition video including HDV and AVCHD.

Jonathan Thomas and crew have reached their 1.0 milestone (congrats :-)). The program is rock solid and is running beautifully on my Ubuntu 9.10 installation.

OpenShot’s Features include:

  • Support for many video, audio, and image formats (based on FFmpeg)
  • Gnome integration (drag and drop support)
  • Multiple tracks
  • Clip resizing, trimming, snapping, and cutting
  • Video transitions with real-time previews
  • Compositing, image overlays, watermarks
  • Title templates, title creation
  • SVG friendly, to create and include titles and credits
  • Scrolling motion picture credits
  • Solid color clips (including alpha compositing)
  • Support for Rotoscoping / Image sequences
  • Drag and drop timeline
  • Frame stepping, key-mappings: J,K, and L keys
  • Video encoding (based on FFmpeg)
  • Key Frame animation
  • Digital zooming of video clips
  • Speed changes on clips (slow motion etc)
  • Custom transition lumas and masks
  • Re-sizing of clips (frame size)
  • Audio mixing and editing
  • Presets for key frame animations and layout
  • Ken Burns effect (making video by panning over an image)
  • Digital video effects, including brightness, gamma, hue, greyscale, chroma key (bluescreen/greenscreen), and over 20 other video effects.

There are 4 ways to install OpenShot: LiveDVD, PPA, DEB Installer, and the Build Wizard. Grab it here.

iPod/iPhone controlled CAR!!!!

datePosted on 23:42, November 11th, 2009 by Many Ayromlou

iPod Touch Tiled Display…..

datePosted on 23:38, November 11th, 2009 by Many Ayromlou