chaquo / chaquopy

Chaquopy: the Python SDK for Android
https://chaquo.com/chaquopy/
MIT License

How to access camera with OpenCV #303

Closed mhsmith closed 4 years ago

mhsmith commented 4 years ago

Originally posted by @chaotianjiao in https://github.com/chaquo/chaquopy/issues/140#issuecomment-637399770

Hi, how do you pass the frames to Python? Can you give some examples? Actually, I use the opencv-android-sdk to get a camera frame, like this:

public void onCameraViewStarted(int width, int height) {
    mRgba = new Mat(height, width, CvType.CV_8UC4);
}

public void onCameraViewStopped() {
    mRgba.release();
}

public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
    return inputFrame.rgba();
}

And I have a test.py like this:

def test_video(path):
    cap = cv2.VideoCapture(path)
    while cap.isOpened():
        ret, frame = cap.read()
        start = timer()
        pro_frame = process(frame, size=4)
        end = timer()

If we can't use cv2.VideoCapture or cv2.imshow(), how can the Java frame be passed to test_video()? And if the previous step succeeds, how can we show the result in an Android app (using a SurfaceView or something else)?

I would really appreciate it if you could answer my questions above.

chaotianjiao commented 4 years ago

@mhsmith Thank you! Do you have any ideas to solve this?

mhsmith commented 4 years ago

Currently the easiest way to pass images between Java/Kotlin and Python is to pass them as JPG or PNG data. Here's an example of a project that does that:

Or if you want to pass an image from Python to Java/Kotlin and display it on the screen, see the chaquopy-matplotlib project.
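As a rough sketch of the Java-to-Python direction (illustrative only; the module name new_test, the function test_image and the variable names are placeholders consistent with the rest of this thread), the frame can be encoded to PNG in memory with the OpenCV Java API and the resulting byte[] passed straight to Chaquopy:

// Requires org.opencv.core.MatOfByte, org.opencv.imgcodecs.Imgcodecs,
// com.chaquo.python.Python and com.chaquo.python.PyObject.
MatOfByte pngBuffer = new MatOfByte();
Imgcodecs.imencode(".png", rgbaFrame, pngBuffer);   // rgbaFrame: the Mat from onCameraFrame
byte[] pngBytes = pngBuffer.toArray();

Python py = Python.getInstance();
PyObject result = py.getModule("new_test").callAttr("test_image", pngBytes);

On the Python side the bytes can then be decoded with cv2.imdecode, as discussed below.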

chaotianjiao commented 4 years ago

@mhsmith First, thanks for your ideas, but I'm new to Kotlin and don't want to use YUV. Is there any solution for passing a Mat to cv2.imencode, cv2.imdecode or cv2.imread? I have tried this in my MainActivity.java:

public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
    Python py = Python.getInstance();
    PyObject image = py.getModule("new_test").callAttr("test_image", mRgba);
    return inputFrame.rgba();
}

In the new_test.py:

def test_image(java_mat):
    image = cv2.imdecode(java_mat, cv2.IMREAD_COLOR)
    while image:
        start = timer()
        pro_frame = process(image, size=4)

And I got this error:

com.chaquo.python.PyException: TypeError: Expected Ptr for argument '%s'
    at <python>.new_test.test_image(new_test.py:57)

How can I solve this? Or maybe I should use Android's camera view to get the frame and then code it as you said before?

chaotianjiao commented 4 years ago

I also tried this:

        MatOfByte matOfByte = new MatOfByte();
        // encoding to png, so that your image does not lose information like with jpeg.
        Imgcodecs.imencode(".png", mRgba, matOfByte);
        byte[] byteArray = matOfByte.toArray();
        Python py = Python.getInstance();
        PyObject image = py.getModule("new_test").callAttr("test_image", byteArray);

and new_test.py is:

def test_image(java_bytes):
    image = cv2.imdecode(".png", java_bytes)
    while image:
        start = timer()

got another error:

com.chaquo.python.PyException: TypeError: an integer is required (got type [B)
    at <python>.new_test.test_image(new_test.py:57)

mhsmith commented 4 years ago

In Python, the arguments to imdecode are a Numpy byte array, and some flags. So try this:


import numpy as np
image = cv2.imdecode(np.asarray(java_bytes), cv2.IMREAD_COLOR)

chaotianjiao commented 4 years ago

I've tried converting a Mat to byte[] in Java like this:

MatOfByte matOfByte = new MatOfByte();
Imgcodecs.imencode(".png", mRgba, matOfByte);
byte[] byteArray = matOfByte.toArray();

and passing byteArray to Python like this:

Python py = Python.getInstance();
PyObject image = py.getModule("new_test").callAttr("test_image", byteArray);

in Python code like this:

def test_image(java_bytes):
    image = cv2.imdecode(np.asarray(java_bytes), cv2.IMREAD_COLOR)
    while image:
        start = timer()
        pro_frame = process(image, size=4)
        end = timer()

I still got an error:

com.chaquo.python.PyException: ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
    at <python>.new_test.test_image(new_test.py:58)

Is there any way to convert the Java bytes to Python bytes so that cv2 can read them? Actually, if the conversion is difficult, I would like to use this.

mhsmith commented 4 years ago

This is a common Numpy error message: please search and you'll find plenty of explanations of it.

I think you have successfully converted the image, but you should test it with image is not None, rather than simply image.

mhsmith commented 4 years ago

If this is still a problem, please reopen the issue and give details.

chaotianjiao commented 4 years ago

@mhsmith Hello, now I can pass Java's MatOfByte to Python and get data like this (see screenshot). Does that mean the frame has 4 channels? Usually, when I use cv2.imread('demo.jpg'), we get data like this (see screenshot). So which method should I call to read the frame like [[[[1,2,3,4 ...]]]] in Python? I use this to read the frame (see screenshot), but got an error like this:

E/AndroidRuntime: FATAL EXCEPTION: Thread-2
    Process: com.byd.callpython2, PID: 9884
    com.chaquo.python.PyException: ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
        at <python>.new_test.test_image(new_test.py:80)
        at <python>.chaquopy_java.call(chaquopy_java.pyx:281)
        at <python>.chaquopy_java.Java_com_chaquo_python_PyObject_callAttrThrows(chaquopy_java.pyx:253)
        at com.chaquo.python.PyObject.callAttrThrows(Native Method)
        at com.chaquo.python.PyObject.callAttr(PyObject.java:209)
        at com.byd.callpython2.MainActivity.onCameraFrame(MainActivity.java:145)
        at org.opencv.android.CameraBridgeViewBase.deliverAndDrawFrame(CameraBridgeViewBase.java:408)
        at org.opencv.android.JavaCameraView$CameraWorker.run(JavaCameraView.java:373)
        at java.lang.Thread.run(Thread.java:761)

mhsmith commented 4 years ago

If the first channel is always 255, then it's probably an alpha channel. You can remove it like this:

frame = frame[:, :, 1:]
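An alternative, sketched here as a suggestion rather than part of the original reply, is to drop the alpha channel on the Java side before encoding, so Python only ever receives a 3-channel image:

// Convert the 4-channel RGBA preview frame to 3-channel BGR before encoding it,
// so the Python side never sees an alpha channel.
Mat bgr = new Mat();
Imgproc.cvtColor(mRgba, bgr, Imgproc.COLOR_RGBA2BGR);
MatOfByte pngBuffer = new MatOfByte();
Imgcodecs.imencode(".png", bgr, pngBuffer);
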
mhsmith commented 4 years ago

As for the exception, I don't think it can come from the code you posted. If you still can't work it out, please post the code around the indicated line number (new_test.py:80).

chaotianjiao commented 4 years ago

This is my Python code (see screenshot). I have changed it to frame = frame[:, :, 1:], but I still get this error (see screenshot).

mhsmith commented 4 years ago

Unlike most Python containers, NumPy arrays can't usually be converted to a boolean value. If you want to compare frame against None, you'll have to explicitly write frame is not None.

chaotianjiao commented 4 years ago

@mhsmith It works, thank you! Now I have changed my code. MainActivity.java:

public void onCameraViewStarted(int width, int height) {
    mRgba = new Mat(height, width, CV_8UC4);
}

public void onCameraViewStopped() {
    mRgba.release();
}

public Mat onCameraFrame(CameraBridgeViewBase.CvCameraViewFrame inputFrame) {
    mRgba = inputFrame.rgba();
    Mat mat = new Mat();
    Imgproc.cvtColor(mRgba, mat, Imgproc.COLOR_RGBA2BGR);

    MatOfByte matOfByte = new MatOfByte();
    // encoding to png, so that your image does not lose information like with jpeg.
    Imgcodecs.imencode(".png", mat, matOfByte);
    byte[] img_byte = matOfByte.toArray();
    // passing to python
    Python py = Python.getInstance();
    PyObject image = py.getModule("new_test").callAttr("test_image", img_byte);
    Log.d("sss", "image==" + image);
    byte[] py_result = image.toJava(byte[].class);
    Mat result = new Mat();
    result.put(0, 0, py_result);
    return result;
}

and in new_test.py:

def test_image(java_bytes):
    frame = Image.open(io.BytesIO(bytes(java_bytes)))
    frame = np.array(frame,dtype=np.uint8)
    # Convert RGB to BGR
    # frame = frame[:, :, 1:].copy()
    if frame is not None:
        start = timer()
        pro_frame = process(frame, size=4)
        end = timer()
        fps = 1.0 / (end - start)
        cv2.putText(pro_frame, 'fps:' + str(fps), (10, 50), cv2.FONT_HERSHEY_SIMPLEX, 1, (87, 220, 217), 3)
        is_success, im_buf_arr = cv2.imencode(".png", pro_frame)
        byte_im = im_buf_arr.tobytes()
        return byte_im

The app keeps running, but I get a black screen. I found this in the log:

D/JavaCameraView: Preview Frame received. Frame size: 1382400
D/sss: image==b'\x89PNG\r\n\x1a\n\x00\x00\x00\rIHDR ... (PNG byte data truncated)
E/cv::error(): OpenCV(4.3.0) Error: Assertion failed (src.dims == 2 && info.height == (uint32_t)src.rows && info.width == (uint32_t)src.cols) in Java_org_opencv_android_Utils_nMatToBitmap2, file /build/master_pack-android/opencv/modules/java/generator/src/cpp/utils.cpp, line 101
E/org.opencv.android.Utils: nMatToBitmap caught cv::Exception: OpenCV(4.3.0) /build/master_pack-android/opencv/modules/java/generator/src/cpp/utils.cpp:101: error: (-215:Assertion failed) src.dims == 2 && info.height == (uint32_t)src.rows && info.width == (uint32_t)src.cols in function 'Java_org_opencv_android_Utils_nMatToBitmap2'
E/CameraBridge: Mat type: Mat [ -1*-1*CV_8UC1, isCont=false, isSubmat=false, nativeObj=0x7f731f3240, dataAddr=0x0 ]
E/CameraBridge: Bitmap type: 1280*720
    Utils.matToBitmap() throws an exception: OpenCV(4.3.0) /build/master_pack-android/opencv/modules/java/generator/src/cpp/utils.cpp:101: error: (-215:Assertion failed) src.dims == 2 && info.height == (uint32_t)src.rows && info.width == (uint32_t)src.cols in function 'Java_org_opencv_android_Utils_nMatToBitmap2'
D/JavaCameraView: Preview Frame received. Frame size: 1382400

Is my code for passing the frame incorrect, or is the problem in how the Java code decodes the image data?

mhsmith commented 4 years ago

The image returned from Python is in PNG format, so you'll have to decode it using the OpenCV Java API. I'm sure you can find examples of this by searching.
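For example (a sketch only, reusing the variable names from the onCameraFrame code above), the bytes returned by test_image can be decoded back into a Mat with Imgcodecs.imdecode instead of result.put():

byte[] py_result = image.toJava(byte[].class);

// Decode the PNG bytes produced by Python back into a BGR Mat.
Mat decoded = Imgcodecs.imdecode(new MatOfByte(py_result), Imgcodecs.IMREAD_COLOR);

// Convert back to RGBA so the preview's matToBitmap call displays the colours correctly.
Mat display = new Mat();
Imgproc.cvtColor(decoded, display, Imgproc.COLOR_BGR2RGBA);
return display;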

chaotianjiao commented 4 years ago

Hi, I tried the Camera2 API like you mentioned. This is my code (see screenshot), the Python side (see screenshot), and I always get this error (see screenshot). What should I do? Thanks a lot~

mhsmith commented 4 years ago

It looks like the Camera2 API implementation in that app is incomplete, as it doesn't encode the image to JPEG before passing it to Python. You'll have to do that using something equivalent to the Camera API example.

chaotianjiao commented 4 years ago

@mhsmith W/python.stderr: Unlicensed copy of Chaquopy: app will now shut down.
I have sent you an email. Could you contact me?

mhsmith commented 4 years ago

You didn't mention your GitHub username, so I don't know which email was yours, but I've answered all the emails currently in my inbox.

chaotianjiao commented 4 years ago

@mhsmith Thanks, I've received your email. The last problem in our project is with these three lines of code:

             Python py = Python.getInstance();
             long start = System.currentTimeMillis();

             PyObject image = py.getModule("new_test").callAttr("test_image", yuv_byte);
             long end = System.currentTimeMillis();
             long using_time = (end - start);
             Log.e(TAG,  String.valueOf(using_time)+"ms" );


It's very slow to run these lines. Is there any solution to speed it up?

mhsmith commented 4 years ago

Your test_image function does a lot of things. If you want to find out which part is slow, add more logging to it.

chaotianjiao commented 4 years ago

@mhsmith Actually, I have tried doing nothing except passing the data to Python and then decoding it in Java, just like this:

def gray_image(java_bytes):
    frame = Image.open(io.BytesIO(bytes(java_bytes)))
    frame = np.array(frame)
    # Convert RGB to BGR
    frame = frame[:, :, ::-1].copy()

    if frame is not None:
        # start = timer()
        #
        # end = timer()
        # fps = 1.0 / (end - start)
        # cv2.putText(frame, 'fps:' + str(fps), (10, 50), cv2.FONT_HERSHEY_SIMPLEX, 1, (87, 220, 217), 3)
        _, im_buf_arr = cv2.imencode(".jpg", frame)
        return str(base64.b64encode(im_buf_arr))

and it still takes 1075 ms. Is there any solution?

mhsmith commented 4 years ago

I think the slow part is probably the call to b64encode, but you should add more logging to confirm that. Anything you print in Python will appear in the logcat.

The fastest way to return bytes from a Python function is to return a bytes object, and then convert it using .toJava(byte[].class). There's no need to go via base64.
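For instance (a sketch, not from the original reply; jpegInput is a placeholder for the encoded frame bytes): if the Python function ends with return im_buf_arr.tobytes(), the Java side can consume the result directly, with no base64 step:

// test_image now returns a Python bytes object (e.g. im_buf_arr.tobytes()).
Python py = Python.getInstance();
PyObject result = py.getModule("new_test").callAttr("test_image", jpegInput);
byte[] jpegOutput = result.toJava(byte[].class);
Bitmap bitmap = BitmapFactory.decodeByteArray(jpegOutput, 0, jpegOutput.length);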

chaotianjiao commented 4 years ago

@mhsmith Hi, I'm sorry for the late reply. We have talked by email. These are my boss's questions:

1. If we don't use base64 and use .toJava(byte[].class) instead, we get an error.
2. This is our Python code:

    import io
    from top_layer import process
    import cv2
    from timeit import default_timer as timer
    from PIL import Image
    import numpy as np
    import base64

    def test_image(java_bytes):
        frame = Image.open(io.BytesIO(bytes(java_bytes)))
        frame = np.array(frame)
        # Convert RGB to BGR
        frame = frame[:, :, ::-1].copy()

        if frame is not None:
            # timer() returns seconds
            start = timer()
            pro_frame = process(frame, size=4)
            end = timer()
            print("process using time:" + str((end - start)))
            fps = 1.0 / (end - start)
            cv2.putText(pro_frame, 'fps:' + str(fps), (10, 50), cv2.FONT_HERSHEY_SIMPLEX, 1, (87, 220, 217), 3)
            _, im_buf_arr = cv2.imencode(".jpg", pro_frame)
            start_01 = timer()
            result = str(base64.b64encode(im_buf_arr))
            end_01 = timer()
            print("imencode using time:" + str((end_01 - start_01)))
            return result

    and this is the stdout in Android Studio (times are in seconds):

    
    D/skia: onFlyCompress
    I/python.stdout: process using time:0.21592109400080517
    I/python.stdout: imencode using time:0.0024337499926332384
    D/skia: onFlyCompress
    I/python.stdout: process using time:0.19944093700905796
    I/python.stdout: imencode using time:0.0018794269999489188
    D/skia: onFlyCompress
    I/python.stdout: process using time:0.22260770801221952
    I/python.stdout: imencode using time:0.002014166006119922
    D/skia: onFlyCompress
    I/python.stdout: process using time:0.191485677001765
    I/python.stdout: imencode using time:0.00230302001000382
    D/skia: onFlyCompress
    I/python.stdout: process using time:0.20210078099626116
    I/python.stdout: imencode using time:0.002043593005510047

But when it is displayed on the screen, it is really slow (according to our eyes, maybe only 0.1~1 fps). How can we speed it up? And if we change

pro_frame = process(frame, size=8)

we get this in Android Studio:

D/skia: onFlyCompress
I/python.stdout: process using time:0.055712603993015364
I/python.stdout: imencode using time:0.0007316139963222668
D/skia: onFlyCompress
I/python.stdout: process using time:0.05203546799020842
I/python.stdout: imencode using time:0.000723749995813705
D/skia: onFlyCompress
I/python.stdout: process using time:0.04847999999765307


The screen shows the fps as 20, but to our eyes the fps is only 1~2.

mhsmith commented 4 years ago

If we don't use base64 and use .toJava(byte[].class) instead, we get an error.

What error? If im_buf_arr is a Numpy array, you may need to convert it to a bytes object using the tobytes method before returning it.

The screen shows the fps as 20, but to our eyes the fps is only 1~2.

Then the slow part must be somewhere other than the process function.

In your previous comment you found that test_image as a whole was taking about 1000 ms. Is that still true? If so, then add more logging within test_image to measure the time of each part, and narrow it down until you find the specific function call or calls which are slow. It looks like I was wrong about b64encode being the cause, so I'm not going to try making another guess.

chaotianjiao commented 4 years ago

@mhsmith Oh, we didn't convert it to bytes, so we got an error; now we can use it without base64. And now we have found that the slow part is this:

private ImageReader.OnImageAvailableListener mOnImageAvailableListener = new ImageReader.OnImageAvailableListener() {

        @Override
        public void onImageAvailable(ImageReader reader) {
            // Run on a worker thread so the preview doesn't stutter
            childHandler.post(() -> {
             Image img = reader.acquireLatestImage();
             // Get the YUV byte data from the Image
             float start = System.currentTimeMillis();

             byte[] original_image_data = YuvProcess.getDataFromImage(img, YuvProcess.COLOR_FormatNV21);
             YuvImage yuv_image =  new YuvImage(original_image_data,ImageFormat.NV21, 1920, 1080,null);
             ByteArrayOutputStream byte_yuv_image = new ByteArrayOutputStream();
             yuv_image.compressToJpeg(new Rect(0,0,1920,1080),100,byte_yuv_image);
             byte[] yuv_byte = byte_yuv_image.toByteArray();

             // Pass the bytes to Python
             Python py = Python.getInstance();

             PyObject image = py.getModule("new_test").callAttr("test_image", yuv_byte);

             // Python-side decoding
//             String py_result = image.toJava(String.class);
            byte[] py_result = image.toJava(byte[].class);
             // Java-side decoding
//             byte[] decodedString = Base64.decode(py_result.substring(2, py_result.length() - 1), Base64.DEFAULT);
//             Bitmap final_bmp = BitmapFactory.decodeByteArray(decodedString, 0, decodedString.length).copy(Bitmap.Config.ARGB_8888, true);
            Bitmap final_bmp = BitmapFactory.decodeByteArray(py_result, 0, py_result.length).copy(Bitmap.Config.ARGB_8888, true);
            result.drawBitmap(final_bmp);
            float end = System.currentTimeMillis();
            Log.d("using time is:",String.valueOf(end-start));
            img.close();
            });
        }
    };

and the log is:

D/using time is:: 0.0
D/skia: onFlyCompress
D/using time is:: 0.0
D/skia: onFlyCompress
D/using time is:: 0.0
D/skia: onFlyCompress
D/using time is:: 0.0
D/skia: onFlyCompress
D/using time is:: 0.0
D/skia: onFlyCompress
D/using time is:: 0.0

but the screen is still really slow. What is the problem?

mhsmith commented 4 years ago

float only has 24 bits of precision, which is probably why it's rounding to zero. You should go back to storing times as long, as you did in your previous comment. Then you'll be able to find the problem using more detailed logging, like I've said before.

Actually, you don't even need to use System.currentTimeMillis or timeit. Just add Log calls in Java, or print calls in Python, and look at the timestamps in the logcat to see how much time passes between them. If you've hidden the timestamps in Android Studio, you can re-enable them by clicking the settings button next to the logcat window.

chaotianjiao commented 4 years ago

@mhsmith Thanks! Using timestamps, we found that:

byte[] original_image_data = YuvProcess.getDataFromImage(img, YuvProcess.COLOR_FormatNV21);
YuvImage yuv_image = new YuvImage(original_image_data, ImageFormat.NV21, 1920, 1080, null);
ByteArrayOutputStream byte_yuv_image = new ByteArrayOutputStream();
yuv_image.compressToJpeg(new Rect(0, 0, 1920, 1080), 100, byte_yuv_image);
byte[] yuv_byte = byte_yuv_image.toByteArray();

This code takes 150~170 ms.

  Log.e(TAG, "onImageAvailable: start" );
             PyObject image = py.getModule("new_test").callAttr("test_image", yuv_byte);
                Log.e(TAG, "onImageAvailable: end" );

This code takes 0.8~1.2 s.

We tested our Python code like this and finally found that:

def test_image(java_bytes):
    frame = Image.open(io.BytesIO(bytes(java_bytes)))
    print("python:start")
    frame = np.array(frame)
    print("python:end")

frame = np.array(frame) takes too much time, about 600 ms~800 ms (see screenshot), just as my previous comment said. How could we optimize our code?

mhsmith commented 4 years ago

The most obvious idea I can think of is to try using a resolution lower than 1920x1080. That would speed up all of your image processing code.
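As an illustration (a sketch only, assuming the Camera2/ImageReader setup shown earlier in this thread), the capture size can be reduced when the ImageReader is created; the hard-coded 1920x1080 values in the YuvImage and Rect calls would then need to change to match:

// Request 1280x720 frames instead of 1920x1080. The chosen size must be one the
// camera actually supports (see StreamConfigurationMap.getOutputSizes()).
ImageReader reader = ImageReader.newInstance(1280, 720, ImageFormat.YUV_420_888, 2);
reader.setOnImageAvailableListener(mOnImageAvailableListener, childHandler);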

Having said that, 600-800 ms is surprisingly slow for loading and copying a JPEG image, even if it is such high resolution. If you find that this code runs significantly slower on Chaquopy than on other platforms, with identical image files, then please send me a test file along with its performance measurements on both platforms, and I'll look into it.

Otherwise, your problem is probably with Pillow or Numpy rather than Chaquopy, so if you need any further help on this, I think you should look elsewhere.

chaotianjiao commented 4 years ago

@mhsmith Thanks, our engineer will track down the cause of the slowness. When we can run it smoothly, we will ask for a license. Hope we can keep in touch. Best wishes!

Oskar-H commented 2 years ago

@chaotianjiao Hello, have you found a solution? I'm facing the same problem of passing a Mat to Python now... Could you please share your solution with me?

mhsmith commented 2 years ago

Unfortunately I don't know any easy way for the OpenCV Java and Python APIs to interact directly. The simplest solution is to pass data between them as a PNG file, as described above.