micro-manager / futureMMCore

Place to discuss the future design of a universal microscope hardware interface

What do people use LabView for? #22

henrypinkard opened this issue 3 years ago

I know that LabVIEW is often the go-to software for building customized systems from scratch that require complex synchronization. Having never used it myself, my big-picture understanding is: 1) it provides some essential functionality, and 2) in spite of this, people seem to perpetually complain about having to use it.

It would be really helpful if someone could explain it. What do you use it for? Why is it so essential? What are its most important features? What (if any) alternatives are there? Why is it so frustrating to use?

Rory-Power commented 3 years ago

I have been a heavy LabVIEW user for the last 10 years or so, for control of fairly complicated microscopes and optical systems generally. A few points:

A lot of the complaining about LabVIEW comes either from people who are using it for a quick and dirty solution and expect it to be extraordinarily stable and powerful without proper optimisation, or from those able to write appropriate code for instrument control, who find that it sacrifices a lot of performance in pursuit of general usability. Done properly, LabVIEW can be very powerful, e.g. with the use of OOP (which does exist in LabVIEW!), but will of course sacrifice something relative to syntax-based programming languages. Others will instinctively dislike having to pay for it.

What LabVIEW provides is a shallow learning curve and quick (and often dirty!) results that do not require any knowledge of programming languages. There is very little need to consider management of memory, multi-tasking/processing, UI design, etc. The device support from manufacturers is generally very good as well. To answer your questions specifically:

What is it used for: Rapid prototyping and quick/dirty microscope control (I have used it with cameras/DAQ devices/stages/lasers/SLMs/DMs/filter wheels/shutters etc.). It can also be used for control at a very professional and powerful level but generally the former dominates use cases.

Why is it so essential: Because it offers a solution for instrument developers without a specific programming background.

What are its most important features: Device support from manufacturers, ecosystem of modules available (e.g. Vision/DAQ/FPGA packages), as above. The debugging features are limited but generally helpful.

What alternatives: The closest thing would be Micro-Manager. On one hand this makes things even simpler than LabVIEW, but only if the desired device is supported, and assuming that one does not need to do anything particularly complicated. I have used MM2 quite a bit and feel that it does not provide the same introduction to instrument-control logic that one naturally acquires while learning LabVIEW. LabVIEW is really a very good learning tool for this.

Why is it so frustrating to use: Honestly, I have never found it to be particularly frustrating, or at least its benefits far outweigh its issues for me. I can think of only one issue that the National Instruments engineers (whom I have always found to be helpful and happy to look for solutions) were unable to resolve. As instruments become more and more complex, perhaps at some point requiring feedback control or real-time analysis, it is probably not the best solution; for example, having the Python numpy/scipy/image analysis/machine learning/napari etc. toolbox available would be very helpful for this. With the aforementioned quick/dirty use, software will probably crash, fail to recover from errors, and so on.

In summary: on the basis of its small barrier to entry (cost aside, perhaps), the speed of finding a workable solution, manufacturer device support, and its capabilities, it remains a very powerful tool.

Hope that helps!

jondaniels commented 3 years ago

I was a heavy LabVIEW user 8-20 years ago but not in the specific context of microscopy. I largely agree with Rory.

Because LabVIEW code is visual rather than textual, it is easy for novices to grasp, but it can be frustrating and slow to program in: e.g. instead of typing a variable name you have to create a wire between nodes, and to add two numbers you use the "+" icon with two wires in and one wire out. Change tracking is a nightmare in LabVIEW. Writing modular code is possible in LabVIEW, but it's really easy to write spaghetti code (which literally looks like spaghetti).

Reasons for LabVIEW appeal:

These days I would hesitate to use LabVIEW for a new project. I often suggest that people building custom microscope setups try out Micro-Manager, but admittedly LabVIEW still has the above advantages if you are making something new that you and your labmates will use and that doesn't need to be scalable or particularly shareable. To really compete, Micro-Manager needs to lower the barrier to creating plugins, and part of that is overcoming Java's weaknesses.

Rory-Power commented 3 years ago

Agree with Jon, although some of the points noted, such as spaghetti wiring, are a result of lazy use, i.e. trailing wires to connect things up rather than defining/using local variables etc., and a failure to use formula nodes for calculations rather than a series of operators. Absolutely, tracking changes is a nightmare! Modularity requires thought and care - LabVIEW is not magic, and despite its ease of entry the same effort should be expended as with any programming language. Jon's point about scalability/shareability is generally correct, although there have been some great efforts tailored to a specific type of microscope, e.g. https://github.com/Gurdon-Super-Res-Lab/Microscope-Control. LabVIEW's issue, as much as anything, is that because it is so easy to get started with, users have no good practices 'baked in', hence the unrealistic expectations of what it can do at times.

Rory-Power commented 3 years ago

Also, code readability is generally horrible, but it can be OK if good practices are followed.

FilipDominec commented 3 years ago

I have used LabVIEW to control a temperature cell (Linkam) and acquire photodiode waveforms (Tektronix) in a liquid crystal laboratory. As the project grew more complex and the procedure started to contain several nested conditional loops, as well as on-the-fly curve fitting by means of Scilab code, it became very hard to modify or debug.

I eventually abandoned LabVIEW's graphical programming and rewrote the measurement application in Python+VISA+GTK+Matplotlib. Now I believe I should never have started with LabVIEW.

SK-UK commented 3 years ago

> Agree with Jon, although some of the points noted, such as spaghetti wiring, are a result of lazy use, i.e. trailing wires to connect things up rather than defining/using local variables etc., and a failure to use formula nodes for calculations rather than a series of operators. Absolutely, tracking changes is a nightmare! Modularity requires thought and care - LabVIEW is not magic, and despite its ease of entry the same effort should be expended as with any programming language. Jon's point about scalability/shareability is generally correct, although there have been some great efforts tailored to a specific type of microscope, e.g. https://github.com/Gurdon-Super-Res-Lab/Microscope-Control. LabVIEW's issue, as much as anything, is that because it is so easy to get started with, users have no good practices 'baked in', hence the unrealistic expectations of what it can do at times.

This is pretty much exactly what I'd say. I'd perhaps add that there is decent, straightforward support for simple communication between multiple computers, e.g. through network-published shared variables, in the rare cases that's needed.

rwb27 commented 3 years ago

I used LabVIEW a lot during my PhD and still fire it up occasionally to use code that I wrote back then. Others have done a great job of explaining its appeal in terms of ease of use, and the tendency to do things that are quick and dirty. A couple of points I'd add:

I've never seen a more effortless way to handle threading, which in my experience is the steepest part of the learning curve for lab scientists controlling equipment, particularly when you want a GUI. I aspire to be able to do this with a non-LabVIEW toolkit!
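
As a rough illustration (not LabVIEW itself, which has no textual form): the C++ sketch below shows the kind of producer/consumer plumbing you end up writing by hand outside LabVIEW - an acquisition thread pushing frames into a locked queue that a display or analysis loop drains. `Frame` and `acquire_frame()` are stand-ins, not a real camera API.

```cpp
// Sketch of the producer/consumer boilerplate that LabVIEW's queued
// parallel loops hide. Frame and acquire_frame() are placeholders,
// not part of any real camera API.
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Frame { std::vector<unsigned short> pixels; };

// Stand-in for a blocking camera call.
Frame acquire_frame() { return Frame{std::vector<unsigned short>(512 * 512)}; }

int main() {
    std::queue<Frame> frames;
    std::mutex m;
    std::condition_variable cv;
    bool done = false;

    // Producer: the acquisition loop, running in its own thread.
    std::thread producer([&] {
        for (int i = 0; i < 10; ++i) {
            Frame f = acquire_frame();
            {
                std::lock_guard<std::mutex> lock(m);
                frames.push(std::move(f));
            }
            cv.notify_one();
        }
        {
            std::lock_guard<std::mutex> lock(m);
            done = true;
        }
        cv.notify_one();
    });

    // Consumer: the GUI/display or analysis loop.
    while (true) {
        std::unique_lock<std::mutex> lock(m);
        cv.wait(lock, [&] { return !frames.empty() || done; });
        if (frames.empty() && done) break;
        Frame f = std::move(frames.front());
        frames.pop();
        lock.unlock();
        std::cout << "processed a " << f.pixels.size() << "-pixel frame\n";
    }

    producer.join();
}
```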

LabVIEW is great for hacking together a GUI. It's then a vast amount of work to make said GUI "nice", and I'd recommend almost any other language for that. But if you need to write a GUI, use it for a week, and then forget about it, LabVIEW makes that super easy.

If you are using National Instruments hardware, LabVIEW is suggested as the go-to way to control it and all the manuals make it very easy to set up. I don't know how well NI devices are already supported by MMCore; this may be a moot point.

I'm pretty sure a nice set of MMCore bindings for LabVIEW would get used, though I'm not sure they'd get used enough to justify the maintenance effort. Calling the relevant functions from LabVIEW isn't hard with a "call library function" node, but it would take someone (with a LabVIEW license) a non-trivial amount of effort to make and maintain a nice set of VI wrappers. Perhaps there's a semi-automatic way to generate a set of VIs to wrap the relevant functions, which is a higher up-front investment of time but more sustainable.
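
To make that concrete (my sketch, not anything that exists today): a Call Library Function node can only call C-style exports, so the bindings would most likely sit on a thin flat-C shim over the CMMCore class, roughly like the following. The header name and the exact MMCore method signatures are from memory and would need checking against MMCore.h in mmCoreAndDevices.

```cpp
// Hypothetical flat-C shim around MMCore, callable from a LabVIEW
// "Call Library Function" node. Method names/signatures should be
// verified against the real MMCore.h; exceptions are reduced to
// integer return codes because LabVIEW cannot catch C++ exceptions.
#include "MMCore.h"   // CMMCore (header name assumed)
#include <cstring>

#ifdef _WIN32
#define SHIM_EXPORT extern "C" __declspec(dllexport)
#else
#define SHIM_EXPORT extern "C"
#endif

SHIM_EXPORT void* mmc_create() { return new CMMCore(); }

SHIM_EXPORT void mmc_destroy(void* h) { delete static_cast<CMMCore*>(h); }

SHIM_EXPORT int mmc_load_system_configuration(void* h, const char* path) {
    try { static_cast<CMMCore*>(h)->loadSystemConfiguration(path); return 0; }
    catch (...) { return -1; }
}

SHIM_EXPORT int mmc_set_exposure(void* h, double ms) {
    try { static_cast<CMMCore*>(h)->setExposure(ms); return 0; }
    catch (...) { return -1; }
}

// Snap a frame and copy it into a caller-allocated buffer so that
// LabVIEW owns the memory (bufSize in bytes).
SHIM_EXPORT int mmc_snap_image(void* h, unsigned char* buf, int bufSize) {
    try {
        CMMCore* core = static_cast<CMMCore*>(h);
        core->snapImage();
        void* img = core->getImage();
        long nBytes = static_cast<long>(core->getImageWidth()) *
                      static_cast<long>(core->getImageHeight()) *
                      static_cast<long>(core->getBytesPerPixel());
        if (nBytes > bufSize) return -2;
        std::memcpy(buf, img, static_cast<size_t>(nBytes));
        return 0;
    } catch (...) { return -1; }
}
```

Each export would then need a small wrapper VI on the LabVIEW side, which is where most of the maintenance effort would go; a generator that walks the MMCore headers and emits both the shim and the VIs would be one "semi-automatic" route.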

In the other direction, writing a device adapter using LabVIEW is an intriguing possibility - there is a lot of existing support for many different bits of hardware (especially NI), and tapping into that is attractive. However, compiling LabVIEW into a device adapter sounds to me like it would be far from straightforward, and would result in some moderate-to-horrible binary badness that becomes very much not portable between systems because of the very large number of binary runtime files required.
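
As a rough illustration of why (again, just a sketch with made-up names): a LabVIEW-built DLL exposes plain C entry points, so the C++ side of such an adapter would boil down to dynamic loading along these lines, and the matching LabVIEW run-time engine would also have to be installed on every target machine.

```cpp
// Hypothetical loading of a LabVIEW-built DLL from C++ (Windows-only sketch).
// "labview_stage.dll" and "LV_MoveStage" are made-up names; the matching
// LabVIEW run-time engine must also be present for the load to succeed.
#include <windows.h>
#include <iostream>

typedef int (*MoveStageFn)(double positionUm);

int main() {
    HMODULE dll = LoadLibraryA("labview_stage.dll");
    if (!dll) {
        std::cerr << "Could not load the LabVIEW-built DLL (run-time engine missing?)\n";
        return 1;
    }
    auto moveStage = reinterpret_cast<MoveStageFn>(
        GetProcAddress(dll, "LV_MoveStage"));
    if (!moveStage) {
        std::cerr << "Exported function not found\n";
        FreeLibrary(dll);
        return 1;
    }
    int err = moveStage(100.0);   // call into the LabVIEW code
    std::cout << "LV_MoveStage returned " << err << "\n";
    FreeLibrary(dll);
    return 0;
}
```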

henrypinkard commented 2 years ago

A belated thank you to all of you for all this helpful feedback!

We're finally moving forward on new features that address many of these limitations. Please feel free to chime in and get involved at: https://github.com/micro-manager/mmCoreAndDevices/issues