JacquesCarette / Drasil

Generate all the things (focusing on research software)
https://jacquescarette.github.io/Drasil
BSD 2-Clause "Simplified" License

Investigate simplifying/rewriting `Makefile` #3557

Open balacij opened 1 year ago

balacij commented 1 year ago

The `Makefile` has some issues, for example:

  1. the Stack version check is superfluous since Stack already performs it,
  2. it constantly rebuilds artifacts for various build targets (e.g., see #3548),
  3. the target names are confusing (e.g., `analysis`, `convertAnalyzed`, `graphs`, `tracegraphs`, `deploy_code_path`), some targets could be folded into one another via environment variables (e.g., `hlint` vs. `hot_hlint`, `debug` vs. `*`, `deploy` vs. `deploy_lite`) to simplify the scripts (see the sketch after the `make help` output below), and some just aren't helpful (e.g., `install`),
  4. the file is difficult to read and maintain,
  5. the X-parameterized targets (e.g., `X_build`, `X_gen`) aren't extensible,
  6. its system integrity check (i.e., the script that checks for system dependencies) is incomplete, and
  7. the targets are very finicky at times.
Current output of `make help`:

```
λ ~/Programming/Drasil/code/ logNone make help
HLint:
  hlint               Run HLint through the drasil packages. Uses a local HLint installation.
  hot_hlint           Run HLint through the drasil packages. Uses the latest HLint version, downloading the binary each time.
Documentation:
  docs                Create Haddock documentation of all Drasil modules. Set FULL=1 if you want to additionally test Haddock generation for website deployment.
GOOL:
  gool                Generate code from examples and test each one.
  doxygen             Generate doxygen documentation for all examples.
  deploy_code_path    Find all code file paths for all examples.
Checks:
  check_stack         Check that we are using the right stack version.
  check_dot           Download the most recent version of graphviz.
  graphmod            Check that we can generate dot graphs.
Help:
  help                Show this help.
  help_examples       Lists all examples.
  help_packages       Lists all packages.
Deploy:
  website             First builds all Drasil packages, and then executes the drasil-website.
  deploy_lite         Deploy the Drasil website without regenerating artifacts. For local development.
  deploy              Generates all artifacts and the Drasil website locally.
Analysis:
  graphs              Generate all module dependency graphs.
  analysis            Generate a table and some graphs to analyze Drasil's class, datatype, and instance structures.
  convertAnalyzed     Convert analyzed dot graphs into SVGs.
General:
  pr_ready            Check if your current work is ready to for a PR via `all` and `hot_hlint`.
Examples:
  all                 Run examples and test against stable.
  install             Install all example project binaries into your local binary path (see $(stack path) for local-bin-path).
  debug               Run test target with better debugging tools.
  code                Build all Drasil packages.
  examples            Run all examples (no traceability graphs).
  tracegraphs         Run examples with traceability graphs.
  test                Run all examples and compare against the stable folder examples.
  stabilize           Overwrites the stable folder with up-to-date artifacts.
  tex                 Generate all example pdfs. Needs Graphviz to work.
Cleaning:
  clean_artifacts     Remove generated artifacts & folders. (alt.: cleanArtifacts)
  clean               Fully clean all generated builds, artifacts, & folders.
Dependencies:
  deps                Build only dependencies.
Build-specific targets where X is a drasil package name. Run "make help_packages" to see a list of possible packages:
  X_build             Builds a given "drasil-" package.
  X_doc               Currently unused.
  X_graph             Creates a package dependency graph.
Example-specific targets where X is the example. Run "make help_examples" to see a list of possible examples:
  X_gen               Generate individual example in HTML and LaTeX format.
  X_diff              Generate individual example and create a comparison against stable in the log folder.
  X_install           Install individual example executable(s) into your local binary path.
  X_stabilize         Generate individual example and overwrite the contents of stable with the new version.
  X_tex               Generate individual example as a PDF (from LaTeX files).
  X_trace_graph       Generate individual example in HTML and LaTeX format with traceability graphs filled in.
  X_build_clean       Remove specific example from the build folder.
  X_gool              Generate individual example code.
  X_deploy_code_path  Get all generated code file paths in the example.
To get started with Drasil, try running "make".
```
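
To make item 3 concrete, here is a minimal sketch of folding `hlint` and `hot_hlint` into one target selected by an environment variable. This is not the current Makefile: the `HOT` variable name and the `drasil-*/` glob are assumptions, and the curl one-liner is HLint's documented way of running the latest release.

```make
# Hypothetical sketch (not the current Makefile): fold hlint and hot_hlint
# into one target, selected by an assumed HOT environment variable.
HLINT := hlint
ifdef HOT
  # HLint's documented one-liner for fetching and running the latest release.
  HLINT := curl -sSL https://raw.github.com/ndmitchell/hlint/master/misc/run.sh | sh -s
endif

.PHONY: hlint
hlint:
	$(HLINT) drasil-*/
```

The same pattern would apply to the other pairs in item 3 (`debug`, `deploy` vs. `deploy_lite`), shrinking the target list without losing any behaviour.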

EDIT: Along the way, we're going to need to toy around with and fix the CI, so:

  1. we should add a weekly scheduled CI run to keep the cache alive (@JacquesCarette found this, and it seems generally helpful for us too; a minimal workflow sketch follows this list), and
  2. we should allow for slim build interactions with the CI (i.e., not constantly rebuilding things!).
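
As a rough illustration of point 1, a scheduled trigger is a small addition to a GitHub Actions workflow. This is a sketch, not our actual CI configuration: the workflow name, cron time, and build step are all assumptions.

```yaml
# Hypothetical sketch: a weekly scheduled run to keep the CI cache alive.
name: weekly-cache-keepalive
on:
  schedule:
    - cron: '0 3 * * 0'   # every Sunday at 03:00 UTC
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: make code    # build the Drasil packages so the cache stays warm
        working-directory: code
```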
B-rando1 commented 6 months ago

I'm not sure if this is the right place to mention it (I can open a separate issue if that's better), but I noticed that `make help` does not list `codegenTest_gen` as an option. This is what's used to generate code from the GOOL programs in `drasil-code/test`, as was discussed in #3414.

balacij commented 6 months ago

I would say it's the right place. We can use this ticket to track general, unimportant issues too.

B-rando1 commented 4 months ago

I was thinking about how to better test generated GOOL code, prompted by the discussion in #3820. An idea that came to me was that we could add assert statements to GOOL.

For anyone who doesn't know: in Python, an `assert` statement takes a boolean expression and raises an `AssertionError` if the expression evaluates to `False`. It is meant to be used for testing/debugging.

This should be easy to add to GOOL, and it would allow us to generate code that tests itself. For example, right now we are able to generate Python code that looks like this:

```python
a = len(myOtherList)
print("Size of myOtherList.\n\tExpected: 2\n\tActual  : ", end="")
print(a)
```

But we could replace the print statements with asserts, to make:

```python
a = len(myOtherList)
print("Checking size of myOtherList.")
assert a == 2
```

Then we could easily set up a script to run this code; as long as it doesn't throw any errors, we know that all the properties we tested hold. A sketch of such a runner is below.
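
As a rough sketch, the runner could be a Makefile target. The target name and the layout of the generated files under `build/` are assumptions, not the current setup:

```make
# Hypothetical sketch: run every generated Python file; a failing assert
# raises AssertionError, Python exits nonzero, and the target fails.
GENERATED_PY := $(wildcard build/*/python/*.py)  # assumed output layout

.PHONY: assert_tests
assert_tests:
	@for f in $(GENERATED_PY); do \
		echo "Running $$f"; \
		python3 "$$f" || exit 1; \
	done
```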

@samm82 what are your thoughts? Does this seem like a good way to add testing to GOOL?

samm82 commented 4 months ago

Building the infrastructure to capture the "expected" values more consistently (I'm not really sure how this is done now), as well as the ability to generate assert statements (and their equivalents in other languages), would definitely be a step in the right direction (although that should be its own issue or set of issues)! Once that's done, we could incrementally build this up.

Then the results of my research would lend themselves nicely to informing how to derive these test cases, both for GOOL and for our generated code, at which point we can start designing how to derive important test cases automatically and integrate this into our examples!

JacquesCarette commented 4 months ago

It was indeed amusing to see @B-rando1 describe all the infrastructure needed to enable what @samm82's research would rely on!