Wednesday, February 27, 2013

Raspberry Pi: Remote Canon Camera Viewing and Control

This is my ongoing project to stream wireless video from my Canon T2i using a Raspberry Pi.

NOTE: I will organize this better when I have more time, I promise.

One possible option: http://www.auvidea.com/index.php/theme-styles/2013-06-22-21-23-34/raspberry-pi-hdmi-in
This would allow HDMI input (non-encrypted) and also provides a WiFi system, which could potentially encompass everything I wanted to do, and do it better.

Another non-Pi option: http://www.auvidea.com/index.php/theme-styles/2013-06-22-21-23-36/encoder-e100

Interesting Reading: https://plus.google.com/+RyanMatthews/posts/btdqMZqCzEA

In the Beginning:
I thought it would be really nice to have some way to view the live video feed from my Canon T2i remotely, from a mobile phone or tablet. It has also crossed my mind that with this setup I could potentially remote control the camera using PTP (Picture Transfer Protocol) and MTP (Media Transfer Protocol). I realize there will be a need for a computer to mediate between the camera and the remote viewing device. I've only found one other commercial device that does this; it's aimed at serious cameras and sells for $1,500. I'd like to do it for less than $100 using the Raspberry Pi.

Some similar projects in regards to still photography acquisition and transfer:
http://islandinthenet.com/2012/08/23/hdr-photography-with-raspberry-pi-and-gphoto2/
http://davidhunt.ie/?p=2641

References:
http://blog.waynehartman.com/archive/2009/01/07/edsdk-to-remote-control-your-canon-camera.aspx
http://www.graphics.cornell.edu/~westin/canon/index.html (powershot)
http://gphoto.org/doc/manual/quickstart.html#using-gtkam
http://islandinthenet.com/2012/08/23/hdr-photography-with-raspberry-pi-and-gphoto2/

10/01/2012 (or so) 
I purchased the Raspberry PI model B (256MB) (512 wasn't available yet) online from Allied Electronics for $35 plus about $8 shipping.

10/05/2012 (or so) 
I determined the design and hardware requirements if this is to be obtainable. Canon publishes an EDSDK which lets a developer write programs that interface with the Canon firmware and hardware through its APIs; one of those controls exposes the live view data. The EDSDK, however, only appears to be supported on Mac and Windows (Linux is not officially supported). A USB connection between the RPi (USB 2.0) and the Canon (USB 2.0) will let the Pi establish a connection with the camera and control it. A WiFi dongle will turn the Pi into a WiFi hotspot, and the Pi will use something like VLC or ffserver to stream the image data over a wireless protocol to the listening device. That device will run a custom-built solution to render the media stream. Using a socket (or even RESTfully), commands could be sent back to the Pi to change focus, contrast, etc.
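The control side of that design can be sketched with gphoto2 instead of the EDSDK (gphoto2 runs on Linux, which the EDSDK does not). This is only a sketch: the config names differ from camera to camera, and iso=400 is a made-up example.

```shell
# Sketch of camera control over USB/PTP with gphoto2 (not the EDSDK).
# Setting names like "iso" vary per camera body; list the real ones first.
gphoto2 --auto-detect          # confirm the camera shows up on USB
gphoto2 --list-config          # enumerate the settings this body exposes
gphoto2 --set-config iso=400   # hypothetical example of a remote setting change
gphoto2 --capture-image        # trigger the shutter from the Pi
```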

10/19/2012
The Raspberry Pi shows up, but I can't really do anything with it yet. Bought an SD card ($8 / 16 GB Class 10) and an HDMI male to DVI female adapter (about $5; who knows if this will work).

10/24/2012 
Adapter and SD card arrive, switched SD card with wife for 8 GB Class 4. Adapter still needs to be tested while OS formatted SD is in the Pi.

10/25/2012
Did some testing with the Pi and research about Canon cameras and USB control. Applied for and obtained the EDSDK from Canon. The SDK appears to support control of the live view along with histogram data, etc. The Pi boots up wheezy Raspbian and is working great; still haven't tried it with the HDMI/DVI adapter. It's running quick though, much better than it did on the emulated version.

Preparing myself for adventures in C. Planning to remotely access the camera data from my Windows 7 desktop first, to ensure it works and that I can get the data I want from the Canon. If all goes well, I will then move the programming onto the Pi and begin setting up the hotspot and streaming services.

10/30/2012
Been thinking a lot about how to get the RPi setup to work better and did some more research into gphoto2. Found out gphoto2 will allow the live view to be captured on some cameras. Decided to set up a Fedora box as a sandbox for testing this. VirtualBox Fedora wouldn't recognize the USB devices (I might not have had it set up right, and I'd been looking for an excuse to finish setting up my old Dell with Linux).

11/01/2012
I began doing some testing now that I have a sandbox set up. gphoto2 successfully recognizes my camera and can control it over USB. I am able to take the capture data and send it to stdout. With ffmpeg I have figured out how to accept the stream and convert it for ffserver to send over the network.

My commands look like this (ffserver starts first and listens; gphoto2 then pipes the live view into ffmpeg, which publishes to the feed):
ffserver -f /etc/ffserver.conf &
gphoto2 --stdout --capture-movie | ffmpeg -f mjpeg -r 2 -i - http://localhost:80/canon.ffm

This appears to be working and ffserver is reachable over my local network. However, when I first went to open the stream in the browser (which supports playing mjpeg streams) I got the error: "This feed is already being received." The solution was simple: the feed URL (canon.ffm) is only for ffmpeg to publish to; viewers need to open the stream URL (canon.mjpeg) instead.


My ffserver.conf file looks like this:

Port 80
BindAddress 0.0.0.0
MaxClients 10
MaxBandwidth 50000
NoDaemon

<Feed canon.ffm>
  File /tmp/canon.ffm
  FileMaxSize 10M
</Feed>

<Stream canon.mjpeg>
  Feed canon.ffm
  Format mjpeg
  VideoSize 640x480
  VideoFrameRate 10
  VideoBitRate 2000
</Stream>

As you can see, there is no conversion happening other than resizing the frame. These settings produce a stream with the following attributes:

  • fps: 13
  • bitrate: 233.7 kbits/s


I assume changing my ffserver.conf will let me compress the output format and raise the frame rate.
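For instance, a second <Stream> block along these lines might trade quality for bandwidth. The numbers are guesses to tune; VideoQMin/VideoQMax bound the JPEG quantizer, so higher values mean more compression:

```
<Stream canon-small.mjpeg>
  Feed canon.ffm
  Format mjpeg
  VideoSize 320x240
  VideoFrameRate 15
  VideoQMin 5
  VideoQMax 20
</Stream>
```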

11/03/12
Today I finally saw my camera stream over the network in real time. This proves it can be viewed from (and could definitely be controlled by) a remote device on a wireless network. I'm about to buy a USB WiFi dongle so that I can begin porting this functionality to the Raspberry Pi. I'm really excited to be seeing this working so well. I ended up buying a case for the Pi for about $10, just to prevent damage from static electricity and dirt. Who knows when this will arrive though. The stream today has some lag, which I doubt comes from the wired network, so I'm going to look into how to optimize the image transfer. Maybe I will add some hooks to allow the client to control the output stream. Currently it has the following specs:

  • MJPEG uncompressed output
  • fps=15
  • q=0
  • bitrate 292.6 kbits/s


I'm going to optimize the output format to be compressed and see if I can get 720p resolution with a decent frame rate.
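As a sketch (untested), one way to experiment is to write a short compressed clip to disk first, so different quantizer settings can be compared by file size before wiring them into ffserver. The -q:v value and the 720p size are guesses:

```shell
# Sketch: test MJPEG compression offline before changing the server config.
# -q:v is the JPEG quantizer (2 = best quality, 31 = smallest frames);
# -t 10 captures ten seconds so output sizes are easy to compare.
gphoto2 --stdout --capture-movie \
  | ffmpeg -f mjpeg -i - -q:v 10 -s 1280x720 -t 10 -f mjpeg test-720p.mjpeg
```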

The stats page from ffserver looks like this:

[Screen capture of the video streaming]

h264 links:
http://www.dexmac.com/index.php/how-to/74-streaming-with-ffserver
http://ffmpeg.org/sample.html
https://lists.libav.org/pipermail/ffserver-user/2011-January/000248.html


USB Dongle Options:
http://elinux.org/RPi_VerifiedPeripherals#Working_USB_Wifi_Adapters
http://www.edimax.co.uk/en/produce_detail.php?pd_id=328&pl1_id=1&pl2_id=44
http://dx.com/p/ultra-mini-nano-usb-2-0-802-11n-150mbps-wifi-wlan-wireless-network-adapter-48166?item=1&Utm_rid=24958662&Utm_source=affiliate
http://thepihut.com/products/usb-wifi-adapter-for-the-raspberry-pi
http://www.buyraspberrypi.com.au/raspberry-pi-802-11bgn-usb-wireless-dongle/ (w/ antenna)

11/16/2012
Been very busy and haven't been able to work on this much. Finally at 11 PM on Friday night, I am making some time (to take a break from AMS suffering).

I determined that my goals from before were somewhat mixed: I wanted to find a good output format and get low latency with very little processing power. To make this simpler I have divided the goals, hoping I can eliminate format conversion altogether.

My initial thought for achieving fairly low latency while keeping the resolution higher is to try to implement three things:
  1. A decent but not incredible frame rate; I think keeping it under 18 will help and still look decent
  2. Interlacing the video frames to send less data overall
  3. A connectionless, loss-tolerant transport: UDP (of course)

Lowering the frame rate actually hasn't helped much, because ffmpeg won't let me get to more than 17 or so anyway. Also, I note that my bit rate is incredibly high (3857 kbits/s) at a resolution of 640x480.
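Goal 3 could be sketched without ffserver at all: push an MPEG-TS stream straight over UDP to the viewer. The address below is a placeholder for the viewing device, which would open udp://@:1234 in something like VLC.

```shell
# Sketch: bypass ffserver and send MPEG-TS over UDP (lost packets are
# simply dropped rather than retransmitted, which should help latency).
# 192.168.1.20:1234 is a made-up address for the viewing device.
gphoto2 --stdout --capture-movie \
  | ffmpeg -f mjpeg -r 15 -i - -s 640x480 -f mpegts udp://192.168.1.20:1234
```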

11/17/2012
Tried to set up the same pipeline on the Pi but ran into issues getting the stream captured and transmitted. Going to install everything from source, if I can, to see whether the packages available through the Pi repos are the problem.

8 comments:

Kim said...

This is awesome work! I'd love to do this too: stream the live video feed over wifi using a Raspberry Pi and receive it e.g. in an Android app, complete with the ability to control the camera and possibly a motorized mount.
A bit more detail on how to set things up (hardware and software) would be great...

Michael Corrigan said...

@Kim, right now it is just Raspbian Linux connected via USB to the Canon T2i; the RPi has gphoto2 and ffmpeg to stream it outbound on my local network. VLC picks up the stream to display the image data but it's pretty slow. I'll update this more, probably in April. Thanks for reading!

Brent Smith said...

@Michael, excellent work. Have you been able to get a stable/fast live stream from the canon to the raspberry pi? I haven't been able to get ffmpeg to stream it properly yet using a canon 5d mark III.

Michael Corrigan said...

@Brent, thanks for reading! I haven't had a lot of time to work on this project, but I haven't seen anything stream without some serious latency (maybe 2-3 seconds). Basically, I think I just need to fine tune the settings to compress the format as much as I can, as fast as I can. Lately, I've been feeling like the 256 MB of RAM might be more of a bottleneck than I had realized. I'll update this as I have some more free time and let you know what I figure out.

vzdr3v said...

Hi Michael, I'm able to use the Raspberry as a usbip server, and so, with a USB wireless dongle, share the 5D Mark III to a remote PC (Windows or Mac); then I can control the camera with EOS Utility as if it were connected locally to the PC. I'm looking to control it from an Android smartphone (Google Nexus 4) but I don't know how to compile usbip (client side) on Android. Do you have any experience with it?

Michael Corrigan said...

@vzdr3v, that sounds awesome! I don't have any experience with the usb/ip project, but I would like to tinker around with it when I have some time. (since the project is written in C, I'd start with running a makefile for android: https://groups.google.com/forum/?fromgroups#!topic/android-building/iwWeai3cELw)

Good luck!

Rupin Chheda said...

I have been wanting to do this since some months now, and have been following your posts on the gphoto2 forum.

My choice of hardware was a wifi router tplink mr3020 which runs openwrt, and can run both ffmpeg and gphoto2.

This does have an advantage over using an RPi: the wifi module is available internally.

It recognised my T2i, and I got the stream working. I did run into memory and performance issues (400 MHz processor, 32 MB RAM). The router used to reset automatically at 100% CPU (maybe an odd watchdog timer).

I finally realized that another way to do it was to use Node.js, which is better at event based computations. There is a gphoto2 library for Node.js available for Openwrt as well.

I am currently building images for the base system so that the node package manager can be installed.

I am definitely following your blog for updates you may have.

Thanks for the great work!

Rudi Zo said...

Hi Michael,
Did you spend time on usbip for Android? Or, better, did you build it successfully? :)
Please let me know: gsus24@gmx.de. Or, if I could contribute to Android support, contact me as well. Thanks!