
= OpenCV-Android

Dedicated to providing an optimized port of OpenCV for the Google Android OS.

== Requirements

In order to use OpenCV-Android, you will need to download and install both the Android SDK 1.6 and the Android NDK r5. Newer versions may or may not work. It has been confirmed that older versions of the Android SDK do not work, so please use SDK 1.6 or higher.

In addition to the SDK and NDK, you will also need either an Android device with a camera (such as a Dev Phone) or the emulator together with a webcam attached to your development machine.

For those of you running on the emulator, I built a very simple socket-based camera server that sends webcam images over a socket connection. It uses the QuickTime libraries, which are present by default on Mac OS X, but I haven't tested it on other OSes. If it doesn't work for you, you can always try the original WebcamBroadcaster that mine is derived from, which uses the JMF (the JMF doesn't work with Mac OS X, hence the QTWebcamBroadcaster): http://www.tomgibara.com/android/camera-source

== Build

Building is now even easier with NDK r5. Since you are presumably familiar with Android development if you are reading this, I will keep the instructions short.
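The exact build invocation isn't spelled out here; assuming the standard NDK r5 workflow with ndk-build on your PATH, it would look something like this (adjust if this project's makefiles expect a different entry point):

----
cd [OPENCV_ANDROID_ROOT]
ndk-build
----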

By default, this will build the opencv library and push it into: [OPENCV_ANDROID_ROOT]/tests/VideoEmulation/libs

You can change where the lib is delivered by modifying the APP_PROJECT_PATH in: [OPENCV_ANDROID_ROOT]/Application.mk
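As a rough sketch (the actual Application.mk in this repository may define additional variables), the relevant line is the APP_PROJECT_PATH assignment:

----
# Sketch only: point APP_PROJECT_PATH at the project whose libs/
# directory should receive the built library.
APP_PROJECT_PATH := $(call my-dir)/tests/VideoEmulation
----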

Once you have built the OpenCV library, you can build and run [OPENCV_ANDROID_ROOT]/tests/VideoEmulation.
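If you prefer the command line over Eclipse (see the note below), an SDK 1.6-era Android project can usually be built and installed with the SDK's Ant support; roughly along these lines, assuming the SDK tools are on your PATH:

----
cd [OPENCV_ANDROID_ROOT]/tests/VideoEmulation
android update project -p .                    # generate build.xml if it is missing
ant debug
adb install -r bin/VideoEmulation-debug.apk    # apk name is an assumption
----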

NOTE: If you plan to use the Socket Camera, you will need to build and run the [OPENCV_ANDROID_ROOT]/tests/QTWebcamBroadcaster first. Also, if you use Eclipse to develop Android, there are already projects defined for both of these applications. You can simply import them into your workspace.

IMPORTANT NOTE: In order for the QTWebcamBroadcaster to work, you must run it in a 32-bit JVM. Failure to run it in 32-bit mode will result in an empty screen in your emulator (no image will appear). You can force the VM that runs this application into 32-bit mode by adding the VM flag -d32.
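In Eclipse, -d32 goes in the VM arguments field of the run configuration. If you launch the broadcaster from the command line instead, it would look roughly like the following; the main class name and classpath here are guesses, and you may also need QuickTime for Java (QTJava.zip) on the classpath:

----
# Illustrative only: main class and classpath are assumptions.
java -d32 -cp bin QTWebcamBroadcaster
----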

== Setup

If you want to test face tracking, then you need to have a Haar Classifier Cascade XML. I have provided one for use and it is stored in: tests/haarcascade_frontalface_alt.xml

Before attempting to run the VideoEmulator application, you must first copy this XML file into the emulator in the following location: /data/data/org.siprop.opencv/files/haarcascade_frontalface_alt.xml
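Assuming adb is on your PATH and the emulator is running, a push along these lines (run from [OPENCV_ANDROID_ROOT]) will place the file where the application expects it:

----
adb push tests/haarcascade_frontalface_alt.xml \
    /data/data/org.siprop.opencv/files/haarcascade_frontalface_alt.xml
----

If the push fails because the target directory does not exist yet, run the application once (or create the directory with adb shell mkdir) and retry.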

Currently, this path is hard-coded in the application. Hopefully, this can be remedied in a future version.

== Run

In order to use the VideoEmulator, you have to use the emulator (hence the name). If you have a Dev Phone, you can play around with the old 'OpenCVSample' test or modify the VideoEmulator to support a real camera. This is something we will work on resolving in the future.

When using the emulator, there are two slightly different 'flavors' of running. Both are socket-based cameras, but one is written in C++ and emulates a real OpenCV capture, while the other (loosely) emulates a camera implementation in Java. The C++ version is the default, as it is slightly faster and uses a little less memory. The ultimate goal is also to hook up a real camera in C++ so that we don't have to pass huge amounts of image data back and forth through the JNI interface.

NOTE: For all of these examples, you cannot use localhost or 127.0.0.1 as the address for the socket camera. When the client is running on the Android emulator, both of these resolve to the emulator's own loopback interface, not the machine the emulator is running on. This means you have to be connected to a network (and use your machine's network IP address) in order to use the socket camera, which is a limitation.

=== C++

Since this is the default, we have made it pretty easy...

=== Java

To use the Java flavor, you have to make a small code change. Eventually this will become a configurable option that does not require editing code. The reason for the difference is that when data is sent over the socket from Java to Java, it is faster to send serialized buffered images, whereas when data is sent to C++, it has to be sent as a raw byte array.