CDSoft / pp

PP - Generic preprocessor (with pandoc in mind) - macros, literate programming, diagrams, scripts...
http://cdelord.fr/pp
GNU General Public License v3.0

Q: Handling OS Awareness in Macros? #15

Closed tajmone closed 7 years ago

tajmone commented 7 years ago

Some of the macros in the library I'm building rely on OS-specific macros like !cmd.

I would like to make the macros library cross-platform, and create a custom macro like !clicommand that would use the appropriate command invocation macro for the host OS (!bash or !cmd).

How can I check the OS using pp macros? Are there some env-vars I could evaluate that would allow me to establish with 100% certainty which OS the macro is being run under?

Or should I set an environment variable from the calling batch/shell script (eg: GuestOS) that can then be queried from the macros? (ie: I know that the macros library will be invoked by different scripts on Linux and Windows.)

Which solution would be best for both worlds (*nix & Win)?

When dealing with external command line tools that are available on both Linux and Windows (eg: Highlight), it would be convenient to have an agnostic command-line invocation macro capable of behaving as either !bash or !cmd, according to the host OS.

CDSoft commented 7 years ago

I would say that if you don't find some Windows-specific environment variable, you can assume that you are running on a Linux/Unix OS (%LOCALAPPDATA%?). I don't think that an agnostic command line invocation is a good idea; the languages are too different.

IMO, using Cygwin, MSYS or even busybox on Windows, together with relative paths, is a far better alternative.

tajmone commented 7 years ago

If the test should be carried out on a Windows-specific env var, these are two good candidates:

WINDIR: Holds the location of the OS directory. Every Windows version must have this var set, and it's likely that it will be kept in future editions of Windows too. Checking its actual value doesn't make sense, but its presence should be a very strong indicator that we are running under Windows (a small shell sketch of this check follows below).

OS: Holds the name of the operating system. It returns Windows_NT from XP upward (Win 10 included). Older versions (Win 95/98) set it to different values, and future versions of Windows might change it. So, if one is not worried about pre-XP Windows versions, this one allows testing its actual value.
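As a rough sketch of the WINDIR idea from within a shell script (this assumes a bash is available on the Windows side too, e.g. Git Bash, and is only an illustration, not a tested recipe):

# WINDIR is only defined by Windows, so its mere presence hints at a Windows host
if [ -n "${WINDIR:-}" ]; then
    echo "Running on Windows"
else
    echo "Running on a Unix-like OS"
fi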

tajmone commented 7 years ago

You are right: I've tried running the following Bash macro under Git Bash:

\bash
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
echo "Hi, I'm $SHELL $BASH_VERSION"
RANDOM=42 # seed
echo "Here are a few random numbers: $RANDOM, $RANDOM, $RANDOM"
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

... and it worked!

Chances are that many Windows users of PP will also have Git installed, so they can benefit from using Bash scripts instead of Cmd (not 100%, because Git Bash is not a full-blown Bash, but a good number of features and commands are supported).

Maybe it's worth mentioning this in the documentation?

CDSoft commented 7 years ago

There is no mention of the target OS in the documentation for scripts. PP just calls the interpreter associated with the script language. The doc just says that for a bash script, pp calls bash or bash.exe. On Windows, bash.exe can be a native port of bash or MSYS, Cygwin or any derivative (Git Bash is a derivative of MSYS and its bash is "100%" bash, ported to Windows thanks to MinGW and MSYS). When I had to use pp on Windows I used MSYS or Cygwin. So if the user provides a bash.exe or runhaskell.exe, he/she can run bash or Haskell scripts.

I'll add a note mentioning this.

CDSoft commented 7 years ago

I can add a macro that returns the OS (it's known at compile time):

\ifeq(\os)(linux)
~~~~~~~~~~~~~~~~~
this is Linux
~~~~~~~~~~~~~~~~~
\ifeq(\os)(windows)
~~~~~~~~~~~~~~~~~
this is Windows
~~~~~~~~~~~~~~~~~

But is there a use case for that?

The user can also define some macros:

\def(win)(\ifne(\env(WINDIR))()(\1))

\win(this text appears in windows only)

You can for instance add something like this in your library:

\quiet
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
\ifeq(\env(WINDIR))()
`````````````````````
\def(win)()
\def(linux)(\1)
`````````````````````
\ifne(\env(WINDIR))()
`````````````````````
\def(win)(\1)
\def(linux)()
`````````````````````
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

CDSoft commented 7 years ago

Added to the documentation. Will be released later.

tajmone commented 7 years ago

Yes, I agree that testing for the OS by checking whether WINDIR is defined should work almost 100% of the time (it might fail on a Windows machine running andLinux, for example, because WINDIR might be visible even from Bash there).

A macro for OS awareness (like GuestOS) would be great (that is: if it doesn't entangle things or become bloatware). The use cases I envisage are related to a scenario of a macros collection where modules are contributed by different people (and for different dialects and formats). Ideally, a task like checking the guest OS should go through some hard-coded check which is bullet-proof.

Of course, knowing the guest OS doesn't imply the script is actually being run on the OS's default shell: the guest OS might be Windows while the script runs under Bash for Windows. Still, knowing the OS is a good start, and further checks could then be done differently according to the OS.

CDSoft commented 7 years ago

All I can do is a macro that returns Linux or Windows. It's pretty simple: the Linux binary returns Linux and the Windows binary returns Windows. Any other corner case cannot be handled (Cygwin, MSYS, ...). But once you have the OS, you still need to write scripts for that OS.

My motto: use standards, use POSIX ;-)

BTW, andLinux is actually Linux. The kernel is just recompiled to work as a Windows process, but it runs Linux binaries (ELF), not Windows binaries. It communicates with Windows through the network (NFS, X11). You may not have the Windows variables there.

Maybe what's missing is a means to start a command (not a script). This requires no language; it's just a matter of starting an executable with some arguments. At the very beginning I had a macro \exec that ran sh on Linux and cmd on Windows to run a single command (not a script). This macro has changed several times and is now deprecated. It was maybe a bad idea.

So to sum up: for any other specific needs, it will be necessary to use specific user-defined macros.

tajmone commented 7 years ago

Thanks for the andLinux clarification (I thought that Windows env-vars would be visible to scripts running therein).

!exec: yes, this would be great to have (back). If I remember correctly, it didn't work on Windows (see #7). Most definitely, there are times when one wishes to execute a bare command in no particular scripting environment: just call what's available on the system command line. So, theoretically, this would also include installed RubyGems and global Node.js/Python apps/commands, etc.? (eg: pygments, sass, asciidoctor, etc.)

!os: my only question is about the name choice. Is there any chance that this name might already be in use by a given OS? Windows already uses it, and SS64 lists it as a volatile (read-only) Standard (built-in) Environment Variable indicating the "Operating system on the user's workstation":

On my Win 10, echo %OS% gives me: Windows_NT.

I'm not sure about the "read only" part, because I can do SET OS=something and the value is preserved, but there might be a distinction between user vars and system vars at play here.

So ... if you set its value to Linux, MacOS or Windows from pp, it should not change a native env-var (Windows env vars being case-insensitive, it would end up trying to overwrite a sys var). This is why I suggested GuestOS, a spelling rather unlikely to be used by an operating system, yet still intuitive and easy to remember.

CDSoft commented 7 years ago

By \os (or whatever the name) I mean a built-in pp macro, not an environment variable (i.e. not \env(OS)). It will not be related to $OS or %OS% in scripts. It will be just Linux or Windows, with no further information about the version.

tajmone commented 7 years ago

Right. I apologise for the mistake, I got entangled with the previous discussion on WINDIR (due to lack of sleep lately :sleeping:).

Yes, it seems a great solution indeed.

CDSoft commented 7 years ago

Release 1.7:

\exec is \sh on Linux and \cmd on Windows.
\os and \arch return the OS name and the architecture.

tajmone commented 7 years ago

Great! And thanks a lot ...

I just pushed and made a PR for a PP-pandoc examples folder to be merged into Highlight's repo and distribution:

https://github.com/tajmone/highlight/tree/pandoc/examples/pandoc

Live HTML preview:

http://htmlpreview.github.io/?https://github.com/tajmone/highlight/blob/pandoc/examples/pandoc/README.html

After lunch I'll get to work to make the Highlight macro OS aware and cross-platform, so I can test it and update both the macros library and the Highlight example!

tajmone commented 7 years ago

Today I've managed to:

  1. Install Ubuntu 15 (x64) on another PC
  2. Setup Git, Haskell and various required tools
  3. Clone and build PP

So I've finally managed to test the macros library cross-platform: they work!

(bear in mind that I'm still not very confident in the Linux environment, and rather clumsy)

I would strongly appreciate your opinion on how I went about the task. Specifically, on a dev branch I've created two short bash scripts that do the same tasks as their pre-existing batch equivalents:

The first one sets the PP_MACROS_PATH env var to the path of the macros folder. This env var is used by some macros to locate resources like CSS snippets, which the macros then inject into the processed document. Here I'm trying to replicate on Linux the way I've been using PP on Windows, and so far it has seemed to me the best way to interact with external files.
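Something along these lines (just a hypothetical sketch of the idea, not the actual script; the folder and file names here are placeholders):

#!/bin/bash
# Point PP_MACROS_PATH at the macros folder, resolved relative to this
# script's own location, so the macros can find their resources.
export PP_MACROS_PATH="$(cd "$(dirname "$0")" && pwd)/pp-macros"
pp source.md > output.md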

What do you think about this approach, is it the right direction for building a cross-platform macros library? Or are there better ways?

The second script just iterates through all markdown test files and processes them. Again: I'm trying to replicate the way the batch script approaches the task, so it might be «Win-biased».
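Its core is essentially a loop of this shape (again only a sketch; the actual file names and output locations are placeholders):

#!/bin/bash
# Preprocess every markdown test file with pp, writing each result
# next to its source.
for src in *.md; do
    pp "$src" > "${src%.md}.out.md"
done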

Do you think these two scripts are good for merging into master? Or am I missing something?

Thanks.

CDSoft commented 7 years ago

Hello, I'm rather busy these days... I think your approach is good.

There are several ways to get the path of the running script. BASH_SOURCE is specific to bash and may not work with other shells (dash, ksh, ...). $0 may be more portable. I also like the realpath command, but it's not installed everywhere.
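For the record, a rough sketch of those alternatives (variable names are just for illustration):

# bash-specific: also works when the script is sourced
dir_bashsource="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
# more portable: $0 is defined by POSIX, but may be a relative path
dir_dollar0="$(cd "$(dirname "$0")" && pwd)"
# realpath resolves symlinks too, but is not installed everywhere
dir_realpath="$(dirname "$(realpath "$0")")"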