Hey @pthomasj
GStreamer, by default, assumes 4-byte alignment of the width. This means that if a width is not a multiple of 4, it assumes the frame contains a stride such that the alignment is met. That's why the acA720-290gm does not expose the problem: its 720-pixel width is already a multiple of 4, while the acA640-90gm's 659 is not. Try specifying the stride to be the same as the width in the video meta, as shown below.
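To see what GStreamer assumes by default, here is a minimal sketch (using GstVideoInfo from gst/video/video.h; the 659 x 494 frame size is your acA640-90gm's):

#include <gst/video/video.h>

/* Ask GStreamer which stride it assumes for a GRAY8 frame 659 pixels wide. */
GstVideoInfo info;
gst_video_info_set_format(&info, GST_VIDEO_FORMAT_GRAY8, 659, 494);
g_print("default stride: %d\n", GST_VIDEO_INFO_PLANE_STRIDE(&info, 0)); /* prints 660, not 659 */

Since your Pylon buffer is packed at 659 bytes per row, reading it with a 660-byte stride skews every row by one byte, which is the diagonal split you see.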
Instead of:
gst_buffer_add_video_meta( buffer, GST_VIDEO_FRAME_FLAG_NONE, GST_VIDEO_FORMAT_GRAY8, pylon_image.GetWidth(), pylon_image.GetHeight() );
use
const guint n_planes = 1;
gsize offset[n_planes] = { 0 };
gint stride[n_planes] = { (gint) pylon_image.GetWidth() };
gst_buffer_add_video_meta_full (buffer,
GST_VIDEO_FRAME_FLAG_NONE,
GST_VIDEO_FORMAT_GRAY8,
pylon_image.GetWidth(),
pylon_image.GetHeight(),
n_planes,
offset,
stride);
Note that the stride is specified in bytes, while the width is in pixels. The only reason they match in this example is that your format is GRAY8, which uses a single byte per pixel. You'll need to adjust for other formats.
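For example, here is a sketch of the same call for a hypothetical 16-bit mono pipeline (GRAY16_LE); the factor of 2 is just that format's bytes-per-pixel and would change with whatever format you actually use:

const guint n_planes = 1;
gsize offset[n_planes] = { 0 };
/* GRAY16_LE packs two bytes per pixel, so a tightly packed row is width * 2 bytes. */
gint stride[n_planes] = { (gint) (pylon_image.GetWidth() * 2) };
gst_buffer_add_video_meta_full (buffer,
GST_VIDEO_FRAME_FLAG_NONE,
GST_VIDEO_FORMAT_GRAY16_LE,
pylon_image.GetWidth(),
pylon_image.GetHeight(),
n_planes,
offset,
stride);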
As a side question, if you don't mind me asking: why did you end up using Pylon + appsrc? Did the plug-in not fulfill your needs? What limitation did you encounter that made this approach easier for you?
On 15 Mar 2023, at 12:01, pthomasj @.***> wrote:
This is not an issue with this project, but I am hoping that I can ask for help from your expertise in combining Pylon and GStreamer.
I am trying to find the cause of an issue that I am seeing in my attempt to use Pylon as an appsrc in GStreamer. This occurs when using a Basler camera model acA640-90gm, but not an acA720-290gm. I suspect that it is happening in the conversion of the Pylon grabbed images to GStreamer buffers, since if I save one of the grabbed images to a png file before it is converted, then it looks fine. The issue is that the video from GStreamer has a diagonal line that the image seems to be split at. I have attached a screenshot and a copy of the code. The hardcoded image width and height are changed to match that of the camera being tested. For the acA640-90gm it is 659 x 494. For the acA720-290gm it is 720 x 540.
#include <iostream>
#include <mutex>
#include <thread>
#include <unistd.h>

extern "C" {
#include <gst/gst.h>
#include <gst/video/video-frame.h>
}

#include <pylon/PylonIncludes.h>
std::mutex image_mutex;
Pylon::CPylonImage pylon_image;
struct gstreamer_data {
    guint source_id;
    GstElement* appsrc;
    GstElement* capsfilter;
    GstElement* queue;
    GstElement* videoconvert;
    GstElement* autovideosink;
    GstElement* pipeline;
    GMainLoop* main_loop;
};
static gboolean push_data(gstreamer_data* data) {
    if (image_mutex.try_lock()) {
        if (pylon_image.IsValid()) {
            GstBuffer* buffer = gst_buffer_new_wrapped_full(
                (GstMemoryFlags) GST_MEMORY_FLAG_PHYSICALLY_CONTIGUOUS,
                (gpointer) pylon_image.GetBuffer(),
                pylon_image.GetImageSize(),
                0,
                pylon_image.GetImageSize(),
                NULL,
                NULL
            );
            gst_buffer_add_video_meta(
                buffer,
                GST_VIDEO_FRAME_FLAG_NONE,
                GST_VIDEO_FORMAT_GRAY8,
                pylon_image.GetWidth(),
                pylon_image.GetHeight()
            );
            GstFlowReturn ret;
            g_signal_emit_by_name(data->appsrc, "push-buffer", buffer, &ret);
            if (ret != GST_FLOW_OK) {
                g_printerr("Error\n");
                gst_buffer_unref(buffer);
                pylon_image.Release();
                image_mutex.unlock();
                return false;
            }
            gst_buffer_unref(buffer);
            pylon_image.Release();
        }
        image_mutex.unlock();
    }
    usleep(1);
    return true;
}
static void start_feed(GstElement* source, guint size, gstreamer_data* data) {
    (void) source;
    (void) size;
    if (data->source_id == 0) {
        //g_print("Start feeding\n");
        data->source_id = g_idle_add((GSourceFunc) push_data, data);
    }
}
static void stop_feed(GstElement* source, gstreamer_data* data) {
    (void) source;
    if (data->source_id != 0) {
        //g_print("Stop feeding\n");
        g_source_remove(data->source_id);
        data->source_id = 0;
    }
}
gboolean bus_callback(GstBus* bus, GstMessage* message, gpointer user_data) {
    (void) bus;
    GMainLoop* main_loop = ((gstreamer_data*) user_data)->main_loop;
    switch (GST_MESSAGE_TYPE(message)) {
        case GST_MESSAGE_EOS: {
            g_print("End of stream\n");
            g_main_loop_quit(main_loop);
            break;
        }
        case GST_MESSAGE_ERROR: {
            GError* error;
            gchar* debug;
            gst_message_parse_error(message, &error, &debug);
            g_printerr("Error received from element %s: %s\n", GST_OBJECT_NAME(message->src), error->message);
            g_error_free(error);
            g_printerr("Debugging information: %s\n", debug ? debug : "none");
            g_free(debug);
            g_main_loop_quit(main_loop);
            break;
        }
        case GST_MESSAGE_WARNING: {
            GError* error;
            gchar* debug;
            gst_message_parse_warning(message, &error, &debug);
            g_printerr("Warning received from element %s: %s\n", GST_OBJECT_NAME(message->src), error->message);
            g_error_free(error);
            g_printerr("Debugging information: %s\n", debug ? debug : "none");
            g_free(debug);
            break;
        }
        default:
            //g_print("%s %s\n", GST_MESSAGE_SRC_NAME(message), GST_MESSAGE_TYPE_NAME(message));
            break;
    }
    return TRUE;
}
static GstFlowReturn overrun_callback(GstElement* queue, gpointer udata) { g_printerr("%s overrun\n", queue->object.name);
return GST_FLOW_OK; }
void process_pylon() {
    try {
        Pylon::CInstantCamera camera(Pylon::CTlFactory::GetInstance().CreateFirstDevice());
        GenApi::INodeMap& node_map = camera.GetNodeMap();
        camera.Open();
        GenApi::CIntegerPtr width = node_map.GetNode("Width");
        std::cout << "Width: " << width->GetValue() << std::endl;
        GenApi::CIntegerPtr height = node_map.GetNode("Height");
        std::cout << "Height: " << height->GetValue() << std::endl;
        camera.StartGrabbing(Pylon::GrabStrategy_LatestImageOnly);
        Pylon::CGrabResultPtr ptrGrabResult;
        int loop = 0;
        while (camera.IsGrabbing()) {
            loop++;
            camera.RetrieveResult(5000, ptrGrabResult, Pylon::TimeoutHandling_ThrowException);
            if (ptrGrabResult->GrabSucceeded()) {
                if (image_mutex.try_lock()) {
                    pylon_image.AttachGrabResultBuffer(ptrGrabResult);
                    if (loop == 20) {
                        pylon_image.Save(Pylon::EImageFileFormat::ImageFileFormat_Png, "test.png");
                    }
                    image_mutex.unlock();
                }
            } else {
                std::cout << "Error: " << std::hex << ptrGrabResult->GetErrorCode() << std::dec << " " << ptrGrabResult->GetErrorDescription() << std::endl;
            }
        }
    } catch (const Pylon::GenericException& e) {
        std::cerr << "An exception occurred." << std::endl << e.GetDescription() << std::endl;
        exit(-1);
    }
}
int main(int argc, char *argv[]) { gst_init(NULL, NULL);
Pylon::PylonAutoInitTerm autoInitTerm;
gstreamer_data data;
data.appsrc = gst_element_factory_make("appsrc", "appsrc"); if (!data.appsrc) { g_printerr("Failed to create element 'appsrc'\n"); return -1; }
data.capsfilter = gst_element_factory_make("capsfilter", "capsfilter"); if (!data.capsfilter) { g_printerr("Failed to create element 'capsfilter'\n"); return -1; }
data.queue = gst_element_factory_make("queue", "queue"); if (!data.queue) { g_printerr("Failed to create element 'appsink_queue'\n"); return -1; }
data.videoconvert = gst_element_factory_make("videoconvert", "videoconvert"); if (!data.videoconvert) { g_printerr("Failed to create element 'videoconvert'\n"); return -1; }
data.autovideosink = gst_element_factory_make("autovideosink", "autovideosink"); if (!data.autovideosink) { g_printerr("Failed to create element 'autovideosink'\n"); return -1; }
// This had to be explicitly set in order to get the 'start_feed' and 'stop_feed' functions to be called more than once.
data.source_id = 0;

g_object_set(G_OBJECT(data.appsrc),
    "stream-type", 0, // GST_APP_STREAM_TYPE_STREAM
    "format", GST_FORMAT_TIME,
    "is-live", TRUE,
    "do-timestamp", TRUE, // needed for the pipeline to run
    NULL
);
g_signal_connect(data.appsrc, "need-data", G_CALLBACK(start_feed), &data); g_signal_connect(data.appsrc, "enough-data", G_CALLBACK(stop_feed), &data);
g_object_set(G_OBJECT(data.capsfilter), "caps", gst_caps_new_simple("video/x-raw", "width", G_TYPE_INT, 659, "height", G_TYPE_INT, 494, "framerate", GST_TYPE_FRACTION, 0, 1, "format", G_TYPE_STRING, "GRAY8", NULL ), NULL );
g_signal_connect(data.queue, "overrun", G_CALLBACK(overrun_callback), NULL);
data.pipeline = gst_pipeline_new("pipeline"); if (!data.pipeline) { g_printerr("Failed to create pipeline\n"); return -1; }
gst_bin_add_many( GST_BIN(data.pipeline), data.appsrc, data.capsfilter, data.queue, data.videoconvert, data.autovideosink, NULL );
if ( gst_element_link_many( data.appsrc, data.capsfilter, data.queue, data.videoconvert, data.autovideosink, NULL ) != TRUE ) { g_printerr("Elements could not be linked.\n"); gst_object_unref(data.pipeline); return -1; }
data.main_loop = g_main_loop_new(NULL, FALSE);
GstBus* bus = gst_element_get_bus(data.pipeline); guint bus_watch_id = gst_bus_add_watch(bus, (GstBusFunc) bus_callback, &data); gst_object_unref(bus);
GstStateChangeReturn ret = gst_element_set_state(data.pipeline, GST_STATE_PLAYING); if (ret == GST_STATE_CHANGE_FAILURE) { g_printerr("Unable to set the pipeline to the playing state.\n"); gst_object_unref(data.pipeline); g_source_remove(bus_watch_id); g_main_loop_unref(data.main_loop); return -1; }
std::thread pylon_thread(process_pylon);
g_print("Starting main loop.\n");
g_main_loop_run(data.main_loop);
pylon_thread.join();
gst_element_set_state(data.pipeline, GST_STATE_NULL); gst_object_unref(data.pipeline); g_source_remove(bus_watch_id); g_main_loop_unref(data.main_loop);
return 0; }
Yes, that fixed it. Thank you! I had been struggling with this for some time.
It has been a while since I made the decision, but I think one of the features that I had been missing was the ability to handle disconnects without restarting the program (https://github.com/basler/gst-plugin-pylon/issues/7). (The code I posted does not do this because it is a minimum working example.)