michaeljenkin / unityros

a unity ros tool

Publishing image to ROS #3

Open michaelprosario opened 8 years ago

michaelprosario commented 8 years ago

Hi. Thanks very much for putting together this awesome library to connect Unity 3D to ROS. I have enjoyed using it to connect SLAM algorithms to Unity 3D. I had a quick question regarding publishing images. I'm trying to publish an image to ROS bridge every 2 seconds.

Does this use case feel feasible? Any help you can offer would be appreciated.

ROS currently reports that the message is missing an op code.

using UnityEngine;
using System;
using System.Collections;
using ROSBridgeLib.sensor_msgs;
using ROSBridgeLib.std_msgs;
using ROSBridgeLib;
using System.Text;

public class VideoTexture : MonoBehaviour {

// Use this for initialization
ROSBridgeWebSocketConnection ros;
WebCamTexture webcamTexture;
int count;
DateTime lastFrame;
DateTime camStart;
void Start () {

    webcamTexture = new WebCamTexture();
    Renderer renderer = GetComponent<Renderer>();
    renderer.material.mainTexture = webcamTexture;
    webcamTexture.Play();

    ros = new ROSBridgeWebSocketConnection ("ws://192.168.2.12", 9090);
    ros.AddPublisher(typeof(ImagePublisher));

    ros.Connect ();

    count = 0;
    lastFrame = DateTime.Now;
    camStart = DateTime.Now;
}

void OnApplicationQuit() {
    if(ros!=null)   
        ros.Disconnect ();

}

// Update is called once per frame
void Update () {

    //get jpeg of current frame ..
    Texture2D snap = new Texture2D(webcamTexture.width, webcamTexture.height);
    snap.SetPixels(webcamTexture.GetPixels());
    snap.Apply();

    byte[] data = snap.EncodeToJPG();
    string picString = Convert.ToBase64String(data);

    //picString = picString.Replace("data:image/jpeg;base64,", "");

    // set format
    string format = "jpeg";

    //How do you get a header message???
    // public TimeMsg(int secs, int nsecs)

    var now = DateTime.Now;
    var timeSpan = now - lastFrame;
    var timeSinceStart = now - camStart;

    if(timeSpan.Seconds > 2)
    {
        var timeMessage = new TimeMsg(timeSinceStart.Seconds, timeSinceStart.Milliseconds);

        // public HeaderMsg(int seq, TimeMsg stamp, string frame_id) 
        var headerMessage = new HeaderMsg(count, timeMessage, "camera");

        //CompressedImageMsg(HeaderMsg header, string format, byte[] data)
        byte[] array = Encoding.ASCII.GetBytes(picString);
        var compressedImageMsg = new CompressedImageMsg(headerMessage, format, array);

        ros.Publish(ImagePublisher.GetMessageTopic(), compressedImageMsg);

        lastFrame = now;
        count++;

    }

    ros.Render ();

}

}


michaeljenkin commented 8 years ago

Can you provide a screen dump of the error message from ROS? Also, please remember that, as currently implemented, the ROSBridge protocol is only 7-bit clean, so I am not sure that the compressedImageMsg (as a compressed image) is going to be that happy.

We normally do not send streamed images through the ROSBridge linkage. It is too easy to saturate the bandwidth of the channel.

Michael
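
To put a rough number on the saturation concern: base64 inflates the JPEG bytes by about 4/3 before the JSON envelope is added, so even a modest webcam frame becomes a few hundred kilobytes per publish. A minimal sketch of a size check on the Unity side, reusing the Texture2D snapshot idea from the code above (the class name and the 200 kB budget are illustrative, not part of ROSBridgeLib):

using UnityEngine;

// Back-of-the-envelope check of how large one frame becomes on the rosbridge link:
// base64 expands the JPEG bytes by roughly 4/3, before the JSON envelope is added.
public static class RosBridgePayload
{
    // Size of the base64 string produced for a given byte[] payload.
    public static int EstimateBase64Bytes(byte[] raw)
    {
        return 4 * ((raw.Length + 2) / 3);
    }

    // True when a frame is small enough to send; the budget is an arbitrary
    // illustrative value, not a limit imposed by rosbridge.
    public static bool FitsBudget(Texture2D snap, int budgetBytes = 200 * 1024)
    {
        return EstimateBase64Bytes(snap.EncodeToJPG()) <= budgetBytes;
    }
}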


michaelprosario commented 8 years ago

Is there a method that you would recommend for streaming images from Unity 3D back to ROS?

Please know that I appreciate the quick response! I really enjoyed getting to know your library. It's pretty fun!

michaeljenkin commented 8 years ago

I’ve never done it in the Unity->ROS direction. In the other direction, ROS->Unity, we have tried both services and establishing a separate web channel to send them. The problem is that if you just send image sequences through the ROSBridge channel you can easily saturate the ROSBridge link, to the point where responsiveness suffers.

On the ROS side you can throttle the image (or any large data stream) in many ways. The idea of sampling images from the Unity side should work too, although I would be careful to only restart the clock once the image has been drained through the link.

Michael

PS If you send along the bug output, I can have a look to see where it is coming from. Glad to hear the library is useful.
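
A minimal sketch of that Unity-side sampling idea, built from the same ROSBridgeLib pieces the first post uses (ImagePublisher, CompressedImageMsg, HeaderMsg, TimeMsg); the class name, the Time.time-based timer, and the 2-second interval are illustrative choices, not the library's own API:

using System;
using System.Text;
using UnityEngine;
using ROSBridgeLib;
using ROSBridgeLib.sensor_msgs;
using ROSBridgeLib.std_msgs;

public class SampledCameraPublisher : MonoBehaviour {

    public float intervalSeconds = 2.0f;          // sample a frame this often
    ROSBridgeWebSocketConnection ros;
    WebCamTexture cam;
    float nextPublishTime;
    int seq;

    void Start() {
        cam = new WebCamTexture();
        cam.Play();
        ros = new ROSBridgeWebSocketConnection("ws://192.168.2.12", 9090);
        ros.AddPublisher(typeof(ImagePublisher));
        ros.Connect();
        nextPublishTime = Time.time + intervalSeconds;
    }

    void Update() {
        if (Time.time >= nextPublishTime) {
            // Grab one frame, JPEG-encode it, and base64 it so the payload survives the text channel.
            Texture2D snap = new Texture2D(cam.width, cam.height, TextureFormat.RGB24, false);
            snap.SetPixels(cam.GetPixels());
            snap.Apply();
            string base64 = Convert.ToBase64String(snap.EncodeToJPG());
            Destroy(snap);                        // avoid leaking one Texture2D per sample

            var header = new HeaderMsg(seq++, new TimeMsg((int)Time.time, 0), "camera");
            var msg = new CompressedImageMsg(header, "jpeg", Encoding.ASCII.GetBytes(base64));
            ros.Publish(ImagePublisher.GetMessageTopic(), msg);

            // Restart the clock only after the frame has been handed off, per the caveat above.
            nextPublishTime = Time.time + intervalSeconds;
        }
        ros.Render();
    }

    void OnApplicationQuit() {
        if (ros != null) ros.Disconnect();
    }
}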


ghost commented 7 years ago

Hi, Did you solve the problem with the op code error? I am trying to send a Float32 to ROS and get the same op error.

[ERROR] [1506036825.816606]: [Client 1] Received a message without an op. All messages require 'op' field with value one of: ['service_response', 'unadvertise_service', 'call_service', 'publish', 'fragment', 'subscribe', 'advertise_service', 'unsubscribe', 'unadvertise', 'advertise']. Original message was: {"op": "publish", "topic": "/unityx", "msg": {"header" :{"seq": 0, "stamp": {"data" : {"secs" : 1, "nsecs" : 129}}, "frame_id": "xvariable"}"data" : 0}}

All the best Robert

michaeljenkin commented 7 years ago

Can you send a bit of the code that leads to this? It looks as though a comma is missing before "data", which is invalid:

{"op": "publish”,

"topic": "/unityx”,

"msg": {"header” :

        {"seq": 0, 

         "stamp": {"data" : {"secs" : 1, "nsecs" : 129}}, 

         "frame_id": "xvariable”}. <- no comma

"data" : 0}}. <- no “data” associated with the message??? Strange

Michael


ghost commented 6 years ago

using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System;
using ROSBridgeLib.std_msgs;
using ROSBridgeLib;
using System.Text;
using ROSBridgeLib.geometry_msgs;

public class pubtest : MonoBehaviour {
private ROSBridgeWebSocketConnection ros = null;
int count;
DateTime lastFrame;

void Start() {

    ros = new ROSBridgeWebSocketConnection ("ws://192.168.1.217", 9090);
    ros.AddPublisher (typeof(pubbridge));
    //ros.AddServiceResponse (typeof(RealsenseServiceResponse));
    ros.Connect ();
    count = 0;
    lastFrame = DateTime.Now;
}

// Extremely important to disconnect from ROS. Otherwise packets continue to flow
void OnApplicationQuit() {
    if(ros!=null) {
        ros.Disconnect ();
    }
}

// Update is called once per frame in Unity
void Update () {
    var now = DateTime.Now;
    var timeSpan = now - lastFrame;
    var timeMessage = new TimeMsg(timeSpan.Seconds, timeSpan.Milliseconds);
    var headerMessage = new HeaderMsg(count, timeMessage, "xvariable");

    Float32Msg msg = new Float32Msg (headerMessage , transform.position.x);
    ros.Publish(pubbridge.GetMessageTopic(), msg);

    ros.Render ();
}

}

And the code for the Float32 is

using System.Collections;
using System.Text;
using SimpleJSON;
using UnityEngine;
using ROSBridgeLib.std_msgs;

namespace ROSBridgeLib {
    namespace std_msgs {
        public class Float32Msg : ROSBridgeMsg {
            private HeaderMsg _header;
            private float _data;

            public Float32Msg(JSONNode msg) {
                _header = new HeaderMsg (msg ["header"]);
                _data = float.Parse(msg ["data"]);   // parse the "data" field, not the whole node
            }

        public Float32Msg(HeaderMsg header, float data) {
            _header = header;
            _data = data;
        }

        public HeaderMsg Getheader()
        {
            return _header;
        }
        public static string GetMessageType() {
            return "beginner_tutorials/Float32_header";
        }

        public float GetData() {
            return _data;
        }

        public override string ToString() {
            return "Float32_header [header=" + _header.ToString() +
                "data=" + _data.ToString() + "]";
        }

        public override string ToYAMLString() {
            return "{\"header\" :" + _header.ToYAMLString() +
                "\"data\" : " + _data.ToString() + "}";
        }
    }
}

}

ghost commented 6 years ago

You were right! There was a missing ",".

public override string ToYAMLString() {
    return "{\"header\" :" + _header.ToYAMLString() +
           ", \"data\" : " + _data.ToString() + "}";
}

Thank You!

pushkalkatara commented 6 years ago

Is it possible to send an image from Unity -> ROS, with the texture converted to .png or .jpeg?

michaeljenkin commented 6 years ago

Yes, but it will have to be converted to JSON and the receiving ROS message type will have to match. Michael


pushkalkatara commented 6 years ago

There are two image message formats under sensor_msgs in ROS: sensor_msgs/Image and sensor_msgs/CompressedImage.

When I try to send message type sensor_msgs/Image, ROS gives the error "step does not match". The message string I build is:

\"op\" : \"publish\", \"topic\" : \"/image_raw1\", \"msg\" : {\"width\" : "+width+" ,\"height\" : "+height+",\"encoding\" : \"rgb8\" , \"data\": \""+base64+"\"}}

And when I try to send message type sensor_msgs/CompressedImage, ROS gives the error "this message contains an uncompressed image":

{\"op\" : \"publish\", \"topic\" : \"/image\", \"msg\" : {\"format\": \"png\", \"data\": \""+base64+"\"}}

http://docs.ros.org/api/sensor_msgs/html/msg/Image.html http://docs.ros.org/api/sensor_msgs/html/msg/CompressedImage.html

I think I am missing the header info.
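
On the "step does not match" error: for an uncompressed sensor_msgs/Image, step is the row length in bytes, so an rgb8 image needs step = width * 3 and exactly height * step bytes of data. A rough sketch of assembling those fields in the same hand-built JSON style as the strings quoted above (the helper name is made up, and it assumes a TextureFormat.RGB24 texture):

using System;
using UnityEngine;

// Illustrative helper (not part of ROSBridgeLib): builds the msg fields for an
// uncompressed rgb8 sensor_msgs/Image. rosbridge checks that step == width * 3
// and that data holds exactly height * step bytes.
public static class RawImageJson
{
    public static string Build(Texture2D tex, string frameId)
    {
        // Assumes a TextureFormat.RGB24 texture with tightly packed rows.
        // Note: Unity textures are bottom-up, so the image may look vertically flipped in ROS.
        byte[] rgb = tex.GetRawTextureData();
        int step = tex.width * 3;
        string base64 = Convert.ToBase64String(rgb);   // the thread's examples send uint8[] data as base64

        return "{\"header\": {\"frame_id\": \"" + frameId + "\"}," +
               " \"height\": " + tex.height + "," +
               " \"width\": " + tex.width + "," +
               " \"encoding\": \"rgb8\"," +
               " \"is_bigendian\": 0," +
               " \"step\": " + step + "," +
               " \"data\": \"" + base64 + "\"}";
    }
}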

pushkalkatara commented 6 years ago

After correcting the JSON, ROS receives the message but still gives an error (screenshot: screenshot_from_2017_10_03_23_01_53).

Also on rostopic echo /image/compressed (screenshot: screenshot_from_2017_10_03_23_02_26).

hermonir commented 6 years ago

Any success on sending an image from Unity to ROS?

I'm using this code:

void SendJPEGImage(Texture2D _image)
{
    var timeMessage = new TimeMsg(timeSinceStart.Seconds, timeSinceStart.Milliseconds);
    var headerMessage = new HeaderMsg(count, timeMessage, "camera");
    byte[] data = _image.EncodeToJPG();

    string picString = Convert.ToBase64String(data);
    Debug.Log ("data length: " + data.Length);

    // set format
    string format = "jpeg";

    //CompressedImageMsg(HeaderMsg header, string format, byte[] data)
    byte[] array = Encoding.ASCII.GetBytes(picString);
    var compressedImageMsg = new CompressedImageMsg(headerMessage, format, array);

    ros.Publish(CompressedImagePublisher.GetMessageTopic(), compressedImageMsg);

}

ROS or rosbridge do not complain, but the image does not open in MATLAB or in rqt. rqt says it doesn't have a compressedImage plugin (see screenshot).

The MATLAB error is: Could not find a semicolon in format character vector jpeg. Modify the "format" property of the message object appropriately.

I tried putting in "jpeg; jpeg uncompressed", and after looking at a working image from another source, also "bgr8; jpeg uncompressed bgr8" to no avail.

Thanks :-)

michaeljenkin commented 6 years ago

It can be made to work, but it is not recommended. The problem is that the base64 encoding of the image is going to result in a huge package. Such packages can easily saturate the rosbridge connection.

I think that the problem you are having is at the other end?

Michael


pushkalkatara commented 6 years ago

Is there any other way to convert a Unity Texture2D byte stream to a ROS topic? Maybe without using rosbridge: sockets or WWWForm in Unity to send the bytes, and requests in Python to parse the bytes and display them using OpenCV?

hermonir1 commented 6 years ago

@michaeljenkin I couldn't find any other way to send an image to ROS. When I tried sending an uncompressed image, rosbridge kept complaining about the width, height and offset. Do you have working sample code for publishing an uncompressed image?

Cheers

DapperFactory commented 6 years ago

You have to send the image in a compressed format (well, that's how I have been able to get Unity->ROS working); otherwise you cannot get a byte[] array to ship out. You also have to use RGB24 when using ReadPixels() to convert the RenderTexture to a Texture2D before encoding as png or jpeg (see the sketch below). Then you can use rosrun image_view image_view image:=<your/topic/url> _image_transport:=compressed. From there you can use cv_bridge to modify the format and uncompress the image.

Note that if you are also shipping out a depth map from Unity you still need to do the same process as above. I have not found an easier way, because ReadPixels() only supports textures of at least 3 channels (RGB). So you will end up with a 3-channel depth image where R, G and B are just repeated values. Again, using cv_bridge::toCvShare(sensor_msgs::Image, "mono16") you can convert the 3-channel depth into a single-channel depth.

I'm now struggling to get the correct projection matrix conversion from the Unity camera. I've been using cam.projectionM * cam.worldToCameraM to get a 4x4 projection matrix, but the visual odometry I'm running has a hard time tracking points.
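
A minimal sketch of the RenderTexture -> Texture2D step described above, assuming the camera already renders into a RenderTexture assigned as its targetTexture (the class and field names are illustrative):

using UnityEngine;

// Illustrative sketch: grab the camera's RenderTexture into an RGB24 Texture2D so
// EncodeToJPG()/EncodeToPNG() produce a byte[] that can be shipped as a CompressedImage.
public class RenderTextureGrabber : MonoBehaviour
{
    public Camera sourceCamera;          // assumed to have a RenderTexture assigned as targetTexture

    public byte[] GrabJpeg()
    {
        RenderTexture rt = sourceCamera.targetTexture;
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = rt;

        // ReadPixels copies from the active RenderTexture; RGB24 keeps it 3 bytes per pixel.
        Texture2D tex = new Texture2D(rt.width, rt.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        tex.Apply();

        RenderTexture.active = previous;
        byte[] jpeg = tex.EncodeToJPG();
        Destroy(tex);                     // avoid leaking a texture per grab
        return jpeg;
    }
}

The returned bytes then feed the same CompressedImageMsg path already shown in this thread, and on the ROS side the rosrun image_view command above (with _image_transport:=compressed) can display the result.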

hermonir1 commented 6 years ago

Do you mean the render texture needs to be in RGB24? There is no such format...

(screenshot: image_formats)

This is my pre-sending code:

RenderTexture.active = camTexture;

//get jpeg of current frame ..
Texture2D image = new Texture2D(camTexture.width, camTexture.height);
image.ReadPixels(new Rect(0, 0, camTexture.width, camTexture.height), 0, 0);
image.Apply();

SendJPEGImage (image);

Thanks

pushkalkatara commented 6 years ago

@hermonir @hermonir1 any luck?

hermonir1 commented 6 years ago

@pushkalkatara No, I'm actually waiting for your answer... What did you mean about the RGB24 format? Where did you set this?

hermonir1 commented 6 years ago

Oh, RGB24 is used when creating the new texture2d. I'll try it tomorrow.

hermonir1 commented 6 years ago

@pushkalkatara @michaeljenkin Still no go :-\

Texture encoding code:

Texture2D image = new Texture2D(camTexture.width, camTexture.height, TextureFormat.RGB24, false);
image.ReadPixels(new Rect(0, 0, camTexture.width, camTexture.height), 0, 0);
image.Apply();

SendJPEGImage (image);

Actual publishing code:

void SendJPEGImage(Texture2D _image)
    {
        var timeMessage = new TimeMsg(timeSinceStart.Seconds, timeSinceStart.Milliseconds);
        var headerMessage = new HeaderMsg(count, timeMessage, "camera");
        byte[] data = _image.EncodeToJPG();
        string picString = Convert.ToBase64String(data);
        Debug.Log ("data length: " + data.Length);

        // set format
        string format = "jpeg";

        //CompressedImageMsg(HeaderMsg header, string format, byte[] data)
        byte[] array = Encoding.ASCII.GetBytes(picString);
        var compressedImageMsg = new CompressedImageMsg(headerMessage, format, array);

        ros.Publish(CompressedImagePublisher.GetMessageTopic(), compressedImageMsg);

    }

Can someone please send a working code sample?

michaeljenkin commented 6 years ago

It is not clear where the error is: at the Unity side or at the MATLAB side. All the Unity side does is fire out a JSON string that encodes the image. It's a 7-bit clean stream, so you should be able to put it in a file, give it to your favourite JSON decoder, and see if you are actually getting a jpeg. Btw, the JSON structure will be really simple, so except for the really large Base64-encoded image it should be quite easy to view and inspect visually.

But to repeat, firing images (repeatedly) through the rosbridge link is not recommended. Open up some other stream and pump the jpeg through that; HTTP is usually really easy.

Michael
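
For the "open up some other stream" suggestion, here is a sketch of pushing the JPEG bytes over plain HTTP with UnityWebRequest instead of rosbridge; the endpoint URL and the receiving service are placeholders (a small HTTP server on the ROS side would have to decode the bytes and republish them as a CompressedImage):

using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

// Sketch of shipping a JPEG over a separate HTTP channel instead of rosbridge.
// The endpoint is hypothetical, not something this repository provides.
public class HttpImageSender : MonoBehaviour
{
    public string endpoint = "http://192.168.2.12:8000/upload";   // placeholder URL

    public IEnumerator SendJpeg(byte[] jpeg)
    {
        UnityWebRequest req = UnityWebRequest.Put(endpoint, jpeg);
        req.method = UnityWebRequest.kHttpVerbPOST;
        req.SetRequestHeader("Content-Type", "image/jpeg");
        yield return req.SendWebRequest();

        if (req.isNetworkError || req.isHttpError)
            Debug.Log("Image upload failed: " + req.error);
    }
}

It would be started with StartCoroutine(SendJpeg(tex.EncodeToJPG())); the point is simply that the bulky image bypasses the rosbridge websocket entirely.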


hermonir1 commented 6 years ago

Thanks!

Just solved the problem. It actually was on the MATLAB side - it didn't convert from the base64 string back to byte[].

Will look at sending the images with HTTP. At least now we have a working solution.

pushkalkatara commented 6 years ago

Thanks! I was trying to display the image using image_transport but it was not working; a short OpenCV workaround helped. A subscriber with a callback:

        np_arr = np.fromstring(msg.data, np.uint8)   # msg is the incoming CompressedImage; decode its byte buffer
        image_np = cv2.imdecode(np_arr, cv2.IMREAD_COLOR)
        cv2.imshow('cv_img', image_np)
        cv2.waitKey(2)

arunavanag591 commented 6 years ago

I am aware this issue is about the image type; I am unable to send even a string from Unity to ROS, though I can hear back from ROS to Unity.

public class ROSBridgeComm : MonoBehaviour {
    private ROSBridgeLib.ROSBridgeWebSocketConnection ros = null;

    void Start()
    {
        ros = new ROSBridgeLib.ROSBridgeWebSocketConnection("ws://192.168.137.175", 9090);
        ros.AddPublisher(typeof(PubSliderValue));
        ros.Connect();
    }

    void OnApplicationQuit()
    {
        if (ros != null)
        {
            ros.Disconnect();
        }
    }

    // Update is called once per frame
    void Update()
    {
        ROSBridgeLib.std_msgs.StringMsg msg = new ROSBridgeLib.std_msgs.StringMsg("_my_message ");
        ros.Publish("joint_values", msg);
        Debug.Log(msg);
        ros.Render();
    }
}

@pushkalkatara
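
One thing worth checking against the rest of this thread: every working publish above passes the topic reported by the registered publisher class (ImagePublisher.GetMessageTopic(), pubbridge.GetMessageTopic()) rather than a literal string. A hedged variant of the Update above, assuming PubSliderValue exposes the same static methods as those classes:

    // Update is called once per frame
    void Update()
    {
        var msg = new ROSBridgeLib.std_msgs.StringMsg("_my_message ");
        // Assumption: PubSliderValue defines GetMessageTopic(), like the other
        // publisher classes in this thread, so the publish goes to the topic
        // that AddPublisher() actually advertised.
        ros.Publish(PubSliderValue.GetMessageTopic(), msg);
        ros.Render();
    }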