Archive for ‘November, 2007’

Zoho Writer makes Web 2.0 Word Processing a breeze…..

Posted on 18:39, November 27th, 2007 by Many Ayromlou

…..Not only that — yes, we know about Google Docs and all the other online word processors out there — it also allows you to go offline while editing your documents and sync when you get back online. All this is done through the magic of the Google Gears browser plugin (thanks, G). So throw away that old copy of MS Office, uninstall it from your hard drive and start using Zoho Writer. While you’re at it you might also want to send them a “thank you” note for taking another MS shackle off your computer/ankle :-). More details + movie….

Cool Hand Tracking Video….

Posted on 17:36, November 25th, 2007 by Many Ayromlou

Speaking of hand tracking, here is a video of a guy playing around with an unknown system (looks a bit like Linux). Very cool demo and almost perfect tracking. Not sure if it’s IR-based or not; you can see him in the corner of the screen, but can’t quite tell how it’s done. Anyways, I’m posting it since it’s one of the better ones I’ve seen. From the description:

A C++ computer vision application to emulate the mouse and the keyboard in any application using hand gestures and a low-cost webcam.

Tracking fingers with the Wii Remote

Posted on 16:55, November 25th, 2007 by Many Ayromlou

Great tutorial video by Johnny Lee from Carnegie Mellon University showing how, using an IR LED array and some reflective tape, you can track fingers in thin air with the Wii Remote. A great alternative to those FTIR (Frustrated Total Internal Reflection) tables.

OSX Escher Screen Saver

Posted on 16:51, November 25th, 2007 by Many Ayromlou

This is one of those gotta-haves. Beautifully done and free. Grab it here.

Sony gets dethroned….JVC joins the 4K Projection club

Posted on 14:46, November 25th, 2007 by Many Ayromlou

If you’ve been tuned into digital cinema projection for the past couple of years, you’d know that when it comes to 4K projection (4Kx2K images), Sony’s SXRD series was pretty much the only game in town. DLP is limited to 2K, and most of the projectors out there (Christie, Barco, NEC) are 2K projectors. A downside of Sony’s projector is that, although it is as hefty as a small car, it only has a 2000:1 contrast ratio (and measures less than that once calibrated). It’s aggressively rated for 40 ft screens, which is not nearly big enough for true cinema applications.

That was true until JVC announced their 1.27-inch 4Kx2K D-ILA (Direct-drive Image Light Amplifier) chip at InfoComm 2007. The chip can produce a 4096×2400 pixel image with a 20,000:1 contrast ratio. That’s ten times the contrast ratio of the Sony behemoth.

Major Specifications:

  • Device size: 1.27-inch diagonal
  • No. of pixels (H x V): 4096 x 2400
  • Pixel pitch: 6.8 μm
  • Gap between pixels: 0.25 μm
  • Aperture ratio:
  • Device contrast ratio: 20,000:1
  • Response time (tr+tf): 4.5 ms
  • LC mode: Vertical Aligned LC
  • LC alignment film: Light stabilized inorganic alignment film

The DLA-SH4K, which packs the 4K D-ILA chip, touts a 4,096 x 2,400 resolution, 10,000:1 contrast ratio, 3,500 lumens, a dual-link DVI input, multiscreen mode, an Ethernet port for remote operation, and RS-232/USB connectors. It measures 660 x 827 x 340 mm and is slated for launch in the first half of 2008.

How to live transcode and stream HDV to MP4 using VLC and Linux

Posted on 22:20, November 21st, 2007 by Many Ayromlou

I’ve been trying to figure out a way to do this on the cheap for a long time, and I finally figured it out today. This process allows you to grab HDV from an HDV camera via FireWire, feed it into Linux, and transcode the 25 Mb/s MPEG-TS stream to a 4 Mb/s MPEG-4 stream (inside a TS). This MPEG-4 stream in turn can be viewed at full resolution (1920×1080) on a remote client running just VLC. Here are the prerequisites:

  1. A decent machine with a working FireWire port (anything from the past 2-3 years should do). Laptops might work as well, although I have not tried it yet. My machine is an Athlon 4200+ w/ 2GB of RAM and a 512 MB NVIDIA 7900.
  2. Ubuntu 7.10 (Gutsy Gibbon) installation CD.
  3. 4-pin to 6-pin Firewire cable.
  4. HDV Camera with Firewire out (I use a Canon HV20).
Okay, so here we go; follow the steps below to get set up:
  1. Get Ubuntu 7.10 installed on your machine. This should be a standard installation from the Live CD.
  2. Reboot, do all adjustments to your display, get your network set up, and install all the updates.
  3. Using the Synaptic package manager, install the following extra packages: ubuntu-extras, ffmpeg, dvgrab (version 3) and VLC, plus anything else you might want.
  4. Connect the FireWire camera to the computer and check /var/log/messages to make sure it gets recognized.
Now that you have the chain set up, it’s time to do a quick test and see if the system is working. Issue the following command from an xterm, making sure that the camera is turned on and in “Camera” mode.

sudo dvgrab -f hdv -noavc -nostop -|vlc -
You have to use sudo under Ubuntu to get proper access to the FireWire device. The above command runs dvgrab in HDV format and makes sure that 1394 AV/C device control is turned off (this way you can be in Camera mode and get a live feed). The -nostop switch prevents dvgrab from sending stop commands to the camera every time you stop it via Ctrl-C, which I thought was a good thing. The last dash forces dvgrab to output to stdout, which we then pipe into vlc (the dash for vlc tells it to use stdin as input).
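If you also want to keep the raw HDV stream while previewing it, tee can be spliced into the same pipeline. Here is a small dry-run helper (my own addition, not part of the original recipe) that assembles the command as a string so you can inspect it before pointing it at the camera; “capture.m2t” is just an example filename:

```shell
#!/bin/sh
# Assemble the dvgrab -> vlc preview pipeline as a string (dry run).
# With an argument, tee is spliced in so the raw 25 Mb/s HDV stream is
# also written to disk while you watch it in vlc.
build_pipeline() {
  SAVE="$1"    # optional: file to keep the raw HDV in, e.g. capture.m2t
  if [ -n "$SAVE" ]; then
    echo "sudo dvgrab -f hdv -noavc -nostop - | tee $SAVE | vlc -"
  else
    echo "sudo dvgrab -f hdv -noavc -nostop -|vlc -"
  fi
}

build_pipeline               # the test command from above
build_pipeline capture.m2t   # same, but keeping the raw stream on disk
```

Once the printed command looks right, run it with sh -c "$(build_pipeline)".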

If this works you should get a vlc window and be able to see live video from your HDV camera. If you didn’t, stop here and make sure you get this working first.
So now that we have dvgrab working, let’s grab that 25 Mb/s HDV stream and squish it down to a 4 Mb/s MPEG-4 stream using the following command:

sudo dvgrab -f hdv -noavc -nostop -|vlc - --sout '#transcode{vcodec=mp4v,vb=4096,acodec=mpga,ab=256,scale=0.5,deinterlace,width=1920,height=1080}:duplicate{dst=std{access=udp,mux=ts,dst=receiver_ip_address:1234}}'
This is one massive command. The first part we already discussed, so let’s take a look at the second half:

  • --sout is our output chain.
  • #transcode tells vlc that we first want to transcode the input using parameters to follow
  • {} contains our transcoding parameters
  • vcodec=mp4v sets the video codec to mpeg4 video
  • vb=4096 sets the bitrate of the transcoded video (4Mb/s)
  • acodec=mpga sets the audio codec to MPEG audio
  • ab=256 sets the bitrate of the transcoded audio (256 Kb/s)
  • scale=0.5 sets the scaling value
  • deinterlace enables deinterlacing (guess what?)
  • width=1920 sets the width of the transcoded video to 1920 pixels
  • height=1080 sets the height of the transcoded video to 1080 pixels
  • :duplicate tells VLC that we want to duplicate the transcoded signal and send a copy of it to our receiver machine.
  • dst is the destination string
  • access=udp specifies UDP protocol
  • mux=ts sets multiplexing to mpeg-2 Transport stream
  • dst=receiver_ip_address:1234 is the ip address and port number of the receiving machine
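Since that one-liner is easy to mistype, here is a sketch that assembles the same --sout chain from shell variables, so individual parameters can be tweaked in one place; receiver_ip_address stays a placeholder, as above:

```shell
#!/bin/sh
# Build the --sout transcode chain discussed above from variables.
VB=4096                          # video bitrate (kb/s)
AB=256                           # audio bitrate (kb/s)
DST="receiver_ip_address:1234"   # receiver's IP and port (placeholder)

build_sout() {
  echo "#transcode{vcodec=mp4v,vb=$VB,acodec=mpga,ab=$AB,scale=0.5,deinterlace,width=1920,height=1080}:duplicate{dst=std{access=udp,mux=ts,dst=$DST}}"
}

# The full command then becomes:
#   sudo dvgrab -f hdv -noavc -nostop -|vlc - --sout "$(build_sout)"
build_sout
```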
So now you should be able to open up vlc on the receiver machine, go to the File/Open Network menu, select UDP/RTP and specify port number 1234. Once you press OK, you should see the video stream on your receiver machine. Audio works as well and is perfectly synced, since it’s captured by the HDV camera at the source and travels together with the video at all times. The delay is about 3 seconds.
This is a great way to quickly set up an HD video conference between a couple of locations. You could even modify the network portion of the chain to have VLC multicast the HD stream onto your network…..lots of possibilities. Enjoy :-)
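On the receiver side, the File/Open Network dialog also has a command-line equivalent using VLC’s classic udp://@ syntax. A quick sketch; the 239.255.12.42 group is just an example address from the administratively scoped multicast range, and the multicast case assumes the sender’s dst= was pointed at the same group:

```shell
#!/bin/sh
# Receiver-side command for the stream: with no argument, listen for the
# unicast stream on port 1234; with a multicast group, join that group.
receiver_cmd() {
  GROUP="$1"    # empty for unicast, e.g. 239.255.12.42 for multicast
  echo "vlc udp://@$GROUP:1234"
}

receiver_cmd                 # unicast: vlc udp://@:1234
receiver_cmd 239.255.12.42   # multicast: vlc udp://@239.255.12.42:1234
```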

Get the full path displayed in Finder

Posted on 23:15, November 14th, 2007 by Many Ayromlou

Here is a quick way to enable full path display in Finder windows under Leopard. You can turn this on by issuing the following two commands in a terminal window:

defaults write com.apple.finder _FXShowPosixPathInTitle -bool YES
killall Finder

You can also undo this by issuing the following two commands in a terminal window:

defaults write com.apple.finder _FXShowPosixPathInTitle -bool NO
killall Finder
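To avoid remembering both command pairs, the toggle can be wrapped in a tiny shell function (the Finder preference lives in the com.apple.finder domain). This sketch only prints the commands (a dry run), so you can eyeball them before piping the output to sh:

```shell
#!/bin/sh
# Print the Finder path-in-title commands for a given state (dry run).
# Pipe the output to sh to actually apply it: finder_path on | sh
finder_path() {
  case "$1" in
    on)  FLAG=YES ;;
    off) FLAG=NO ;;
    *)   echo "usage: finder_path on|off" >&2; return 1 ;;
  esac
  echo "defaults write com.apple.finder _FXShowPosixPathInTitle -bool $FLAG"
  echo "killall Finder"
}

finder_path on
```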

Avid is going to miss NAB2008

Posted on 22:53, November 14th, 2007 by Many Ayromlou

It’s official: Avid will not have an exhibit booth during NAB2008. According to the press release:

Avid Technology, Inc. (NASDAQ: AVID) today announced that it will introduce a major shift in its approach to serving industry professionals in the digital content creation, management, and distribution industries. Based on extensive market research, Avid plans to announce a series of customer-focused initiatives in 2008 – all of which will be designed to make it easier for customers, prospects and the media to interact with the company. The company said it would reveal the full details of its 2008 plan to the public in February, which will set the stage for a blitz of new user-community initiatives, technical support programs, highly-personalized events, and innovative product announcements throughout the year. The company also announced that it will not have an exhibition booth at the 2008 National Association of Broadcasters (NAB) Convention, but plans to be in Las Vegas next April to meet with customers.

Wow…now, why would Avid do this? I think that Apple and Sony might just have Avid cornered. Apple’s pounding them in the high-end editing market and Sony’s bringing up the rear with Vegas. I guess some companies never learn: proprietary never pays…..anyone remember SGI? They used to have monster booths at NAB and would only talk to you if you had Hollywood written all over your face…..Now they are next to nonexistent. It’s time for me to listen to my favourite Queen song….”Another One Bites the Dust”.

More aka.iPhone and Quartz Composer Experiments…..

Posted on 16:55, November 11th, 2007 by Many Ayromlou

So now that I have a basic OSC receiver for aka.iPhone’s XY controller, I’ve been going through Apple’s demo compositions — under /Developer/Examples/Quartz Composer/Compositions — and adding my portion of the OSC receiver to them. Here is the latest one, akaRemote-Caterpillar, which is an adaptation of “Caterpillar.qtz” under /Developer/Examples/Quartz Composer/Compositions/Interactive. Again, I need to remind you to read the first article to get started, and that these QC compositions are for Leopard/QC 3.0 only and require a jailbroken iPod touch or iPhone.

aka.iPhone and Quartz Composer Experiments…..

Posted on 15:05, November 11th, 2007 by Many Ayromlou

I assume you know what aka.iPhone is and what it does. If you don’t, please see this article over at Create Digital Motion. I’ve got aka.iPhone 2.1 installed on my iPod touch, and while I enjoyed playing around with the accompanying Max/MSP patch — via the free runtime — I wanted to see if I could get it working with Quartz Composer.

Well, here are my two attempts (akaRemote, akaRemote-Particle) at QC compositions that work really well with the XY controller of aka.iPhone. The XY controller surface is the only thing I’ve been able to get working with QC, since Masayuki Akamatsu (the author of aka.iPhone) tends to use the same basic “/event” OSC message with a custom number of arguments. The limitation is actually in QC: you can only have one OSC receiver on a UDP port at a time, and an OSC receiver cannot receive the same message with different argument signatures (int, float, float array). The author does mention that his protocol might change without notice, so hopefully he’ll read this post and change the messages to cascading/two-level OSC messages, both to signify which buttons are activated and to get more diversity in the base message string (e.g. a “/event/Pad/buttonB1” message of type boolean, which would signify a toggle button on the Pad screen being fired). I don’t pretend to be an OSC god, but I think it makes the protocol more readable/adaptable, which might not be the author’s intent.
I decided that for my own use the XY controller was the most useful to reverse engineer (and also the easiest). The OSC command is “/event a b c”, where “a” is the trigger, “b” is the x-coordinate and “c” is the y-coordinate. X and Y coordinates range from (0,0) at the bottom left of the iPod touch screen to (1,1) at the top right.
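The compositions translate those 0-to-1 device coordinates to the viewer’s -1-to-1 range; assuming a plain linear map (qc = 2 × value − 1), here is a quick shell sanity check:

```shell
#!/bin/sh
# Remap a 0..1 touch coordinate from the "/event" message to Quartz
# Composer's -1..1 viewer coordinates (with the 1:1 aspect preset).
to_qc() {
  awk -v v="$1" 'BEGIN { printf "%g\n", 2 * v - 1 }'
}

to_qc 0     # bottom/left edge -> -1
to_qc 0.5   # centre           -> 0
to_qc 1     # top/right edge   -> 1
```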
Now here is how you get it all going:
  1. Get Masayuki Akamatsu’s aka.iPhone loaded onto your iPod touch and/or iPhone (I’m not going to tell you how to do this…..let’s just say that if you have jailbroken your device you can just sftp his application to /Applications on your device).
  2. Download the akaRemote and akaRemote-Particle files and unzip them somewhere on your Mac (I’m assuming you have OSX 10.5 and Quartz Composer 3.0, as they are required).
  3. Start with akaRemote by double-clicking on its icon to load it into Quartz Composer, go to Preferences/Viewer and click the + to add a new preset for the viewer. Call it 1:1 (or something), give it a width and height of 1, and make sure aspect ratio is selected.
  4. Now go to the bottom left side of the viewer and select 1:1 from the pull-down. This guarantees that the coordinate-system translation I use (0 to 1 from the device is translated to -1 to 1 on the screen) works. Now resize the window to whatever size you want (not too small).
  5. The QC OSC listener is configured for aka.iPhone’s default port (5600), so you don’t have to change anything. Load up the akaRemote application on your device and change the Host Address to the IP address of the machine running QC.
  6. Now if you switch to the XY tab on the device you should be able to see a dot move around the screen when you touch the surface (the video cube in the middle also rotates).
  7. Optionally, if you’d like to see a nicer example, I’ve put together akaRemote-Particle, a particle-system viz using one of Apple’s demos (“table particle.qtz”) as a base.
  8. The idea here is the same, except that your touch on the XY surface produces particle systems in the viewer (make sure you have the 1:1 thing set at the bottom left of the viewer screen).