SciML / NeuralPDE.jl

Physics-Informed Neural Networks (PINN) Solvers of (Partial) Differential Equations for Scientific Machine Learning (SciML) accelerated simulation
https://docs.sciml.ai/NeuralPDE/stable/

Deep learning for symbolic mathematics #44

Closed ChrisRackauckas closed 3 years ago

ChrisRackauckas commented 4 years ago

https://arxiv.org/pdf/1912.01412.pdf

Could make use of ModelingToolkit for the representation and build a solver!

chinglamchoi commented 4 years ago

The code and data generation methods for this paper have finally been released (a few hours ago)! https://github.com/facebookresearch/SymbolicMathematics

zzj0402 commented 4 years ago

Discord Instructions

I have 3 students dividing up that repo right now taking most of the low-hanging fruit: general physics-informed neural network generators, improved physics-informed neural network training, and deep SDE-based methods
[9:38 AM] so as not to overlap with what's going on at MIT, I'd probably stick to the symbolic stuff
[9:38 AM] https://github.com/SciML/NeuralNetDiffEq.jl/issues/44
[9:40 AM] so https://arxiv.org/pdf/1912.01412.pdf might be an interesting paper to work from
[9:40 AM] you can use the https://github.com/SciML/ModelingToolkit.jl symbolic system as the base
[9:41 AM] Start by building a system to generate a bunch of random symbolic mathematical expressions and take derivatives of them (ModelingToolkit has a derivative expansion mechanism)
[9:41 AM] Then use those pairs as training data to train a model like in the paper that can do symbolic integration.
[9:42 AM] you'll have to dig through how they represented everything in a trainable fashion.
[9:42 AM] The neural networks they used were graph neural networks, so you can make use of https://github.com/yuehhua/GeometricFlux.jl
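The generate-then-differentiate pipeline described above can be sketched in a few lines. This is a minimal, dependency-free illustration, not the proposed implementation: the real version would use ModelingToolkit.jl's derivative expansion, and the tiny grammar here (add/mul, the variable x, small integer leaves) is an assumption for demonstration only.

```python
import random

def random_expr(depth):
    """Build a random expression tree; leaves are x or small integers."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(["x", 1, 2, 3])
    op = random.choice(["add", "mul"])
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def diff(e):
    """d/dx via the sum and product rules."""
    if e == "x":
        return 1
    if isinstance(e, int):
        return 0
    op, a, b = e
    if op == "add":
        return ("add", diff(a), diff(b))
    # product rule: (ab)' = a'b + ab'
    return ("add", ("mul", diff(a), b), ("mul", a, diff(b)))

def prefix(e):
    """Serialize a tree to space-separated prefix tokens, as in the paper."""
    if isinstance(e, tuple):
        return " ".join([e[0]] + [prefix(c) for c in e[1:]])
    return str(e)

random.seed(0)
F = random_expr(3)
# (input, target) pair: the model sees F' and must recover F, i.e. integrate.
pair = (prefix(diff(F)), prefix(F))
print(pair)
```

Pairs produced this way are consistent by construction, which is exactly why differentiating random expressions is a cheap way to build an integration training set.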

To-Do

Working on reading the paper and replicating their results in the codebase.

WIP log

Replication of training on the prim_fwd task

(torch) zz199@quatern:~/Project/SymbolicMathematics$ python main.py --exp_name first_train --fp16 true --amp 2 --tasks "prim_fwd" --reload_data "prim_fwd,prim_fwd.train,prim_fwd.valid,prim_fwd.test" --reload_size 40000000 --emb_dim 1024 --n_enc_layers 6 --n_dec_layers 6 --n_heads 8 --optimizer "adam,lr=0.0001" --batch_size 32 --epoch_size 300000 --validation_metrics valid_prim_fwd_acc
SLURM job: False
0 - Number of nodes: 1
0 - Node ID        : 0
0 - Local rank     : 0
0 - Global rank    : 0
0 - World size     : 1
0 - GPUs per node  : 1
0 - Master         : True
0 - Multi-node     : False
0 - Multi-GPU      : False
0 - Hostname       : quatern
INFO - 06/20/20 11:16:26 - 0:00:00 - ============ Initialized logger ============
INFO - 06/20/20 11:16:26 - 0:00:00 - accumulate_gradients: 1
                                     amp: 2
                                     attention_dropout: 0
                                     balanced: False
                                     batch_size: 32
                                     beam_early_stopping: True
                                     beam_eval: False
                                     beam_length_penalty: 1
                                     beam_size: 1
                                     clean_prefix_expr: True
                                     clip_grad_norm: 5
                                     command: python main.py --exp_name first_train --fp16 true --amp 2 --tasks prim_fwd --reload_data 'prim_fwd,prim_fwd.train,prim_fwd.valid,prim_fwd.test' --reload_size 40000000 --emb_dim 1024 --n_enc_layers 6 --n_dec_layers 6 --n_heads 8 --optimizer 'adam,lr=0.0001' --batch_size 32 --epoch_size 300000 --validation_metrics valid_prim_fwd_acc --exp_id "igwas1cka6"
                                     cpu: False
                                     debug: False
                                     debug_slurm: False
                                     dropout: 0
                                     dump_path: ./dumped/first_train/igwas1cka6
                                     emb_dim: 1024
                                     env_base_seed: 0
                                     env_name: char_sp
                                     epoch_size: 300000
                                     eval_only: False
                                     eval_verbose: 0
                                     eval_verbose_print: False
                                     exp_id: igwas1cka6
                                     exp_name: first_train
                                     export_data: False
                                     fp16: True
                                     global_rank: 0
                                     int_base: 10
                                     is_master: True
                                     is_slurm_job: False
                                     leaf_probs: 0.75,0,0.25,0
                                     local_rank: 0
                                     master_port: -1
                                     max_epoch: 100000
                                     max_int: 10000
                                     max_len: 512
                                     max_ops: 10
                                     max_ops_G: 4
                                     multi_gpu: False
                                     multi_node: False
                                     n_coefficients: 0
                                     n_dec_layers: 6
                                     n_enc_layers: 6
                                     n_gpu_per_node: 1
                                     n_heads: 8
                                     n_nodes: 1
                                     n_variables: 1
                                     node_id: 0
                                     num_workers: 10
                                     operators: add:2,sub:1
                                     optimizer: adam,lr=0.0001
                                     positive: False
                                     precision: 10
                                     reload_checkpoint: 
                                     reload_data: prim_fwd,prim_fwd.train,prim_fwd.valid,prim_fwd.test
                                     reload_model: 
                                     reload_size: 40000000
                                     rewrite_functions: 
                                     same_nb_ops_per_batch: False
                                     save_periodic: 0
                                     share_inout_emb: True
                                     sinusoidal_embeddings: False
                                     stopping_criterion: 
                                     tasks: prim_fwd
                                     validation_metrics: valid_prim_fwd_acc
                                     world_size: 1
INFO - 06/20/20 11:16:26 - 0:00:00 - The experiment will be stored in ./dumped/first_train/igwas1cka6

INFO - 06/20/20 11:16:26 - 0:00:00 - Running command: python main.py --exp_name first_train --fp16 true --amp 2 --tasks prim_fwd --reload_data 'prim_fwd,prim_fwd.train,prim_fwd.valid,prim_fwd.test' --reload_size 40000000 --emb_dim 1024 --n_enc_layers 6 --n_dec_layers 6 --n_heads 8 --optimizer 'adam,lr=0.0001' --batch_size 32 --epoch_size 300000 --validation_metrics valid_prim_fwd_acc

WARNING - 06/20/20 11:16:26 - 0:00:00 - Signal handler installed.
INFO - 06/20/20 11:16:26 - 0:00:00 - Unary operators: []
INFO - 06/20/20 11:16:26 - 0:00:00 - Binary operators: ['add', 'sub']
INFO - 06/20/20 11:16:26 - 0:00:00 - words: {'<s>': 0, '</s>': 1, '<pad>': 2, '(': 3, ')': 4, '<SPECIAL_5>': 5, '<SPECIAL_6>': 6, '<SPECIAL_7>': 7, '<SPECIAL_8>': 8, '<SPECIAL_9>': 9, 'pi': 10, 'E': 11, 'x': 12, 'y': 13, 'z': 14, 't': 15, 'a0': 16, 'a1': 17, 'a2': 18, 'a3': 19, 'a4': 20, 'a5': 21, 'a6': 22, 'a7': 23, 'a8': 24, 'a9': 25, 'abs': 26, 'acos': 27, 'acosh': 28, 'acot': 29, 'acoth': 30, 'acsc': 31, 'acsch': 32, 'add': 33, 'asec': 34, 'asech': 35, 'asin': 36, 'asinh': 37, 'atan': 38, 'atanh': 39, 'cos': 40, 'cosh': 41, 'cot': 42, 'coth': 43, 'csc': 44, 'csch': 45, 'derivative': 46, 'div': 47, 'exp': 48, 'f': 49, 'g': 50, 'h': 51, 'inv': 52, 'ln': 53, 'mul': 54, 'pow': 55, 'pow2': 56, 'pow3': 57, 'pow4': 58, 'pow5': 59, 'rac': 60, 'sec': 61, 'sech': 62, 'sign': 63, 'sin': 64, 'sinh': 65, 'sqrt': 66, 'sub': 67, 'tan': 68, 'tanh': 69, 'I': 70, 'INT+': 71, 'INT-': 72, 'INT': 73, 'FLOAT': 74, '-': 75, '.': 76, '10^': 77, 'Y': 78, "Y'": 79, "Y''": 80, '0': 81, '1': 82, '2': 83, '3': 84, '4': 85, '5': 86, '6': 87, '7': 88, '8': 89, '9': 90}
INFO - 06/20/20 11:16:26 - 0:00:00 - 20001 possible leaves.
INFO - 06/20/20 11:16:26 - 0:00:00 - Checking expressions in [0.01, 0.1, 0.3, 0.5, 0.7, 0.9, 1.1, 2.1, 3.1, -0.01, -0.1, -0.3, -0.5, -0.7, -0.9, -1.1, -2.1, -3.1]
INFO - 06/20/20 11:16:26 - 0:00:00 - Training tasks: prim_fwd
INFO - 06/20/20 11:16:26 - 0:00:01 - Number of parameters (encoder): 79866880
INFO - 06/20/20 11:16:26 - 0:00:01 - Number of parameters (decoder): 105069659
INFO - 06/20/20 11:16:31 - 0:00:05 - Found 261 parameters in model.
INFO - 06/20/20 11:16:31 - 0:00:05 - Optimizers: model
Selected optimization level O2:  FP16 training with FP32 batchnorm and FP32 master weights.

Defaults for this optimization level are:
enabled                : True
opt_level              : O2
cast_model_type        : torch.float16
patch_torch_functions  : False
keep_batchnorm_fp32    : True
master_weights         : True
loss_scale             : dynamic
Processing user overrides (additional kwargs that are not None)...
After processing overrides, optimization options are:
enabled                : True
opt_level              : O2
cast_model_type        : torch.float16
patch_torch_functions  : False
keep_batchnorm_fp32    : True
master_weights         : True
loss_scale             : dynamic
INFO - 06/20/20 11:16:31 - 0:00:05 - Creating train iterator for prim_fwd ...
INFO - 06/20/20 11:16:31 - 0:00:05 - Loading data from prim_fwd.train ...
INFO - 06/20/20 11:25:49 - 0:09:23 - Loaded 39999979 equations from the disk.
INFO - 06/20/20 11:25:55 - 0:09:30 - ============ Starting epoch 0 ... ============
INFO - 06/20/20 11:25:55 - 0:09:30 - Initialized random generator for worker 0, with seed [0, 0, 0] (base seed=0).
/home/zz199/anaconda3/envs/torch/lib/python3.8/site-packages/apex/amp/_initialize.py:25: UserWarning: An input tensor was not cuda.
  warnings.warn("An input tensor was not cuda.")
Gradient overflow.  Skipping step, loss scaler 0 reducing loss scale to 32768.0
INFO - 06/20/20 11:26:14 - 0:09:49 -      20 -    1.10 equations/s -    87.28 words/s - PRIM-FWD:  4.5701 - model LR: 1.0000e-04
INFO - 06/20/20 11:26:26 - 0:10:00 -      40 -   54.68 equations/s -  4443.13 words/s - PRIM-FWD:  2.8008 - model LR: 1.0000e-04
INFO - 06/20/20 11:26:38 - 0:10:13 -      60 -   52.76 equations/s -  4384.13 words/s - PRIM-FWD:  2.8392 - model LR: 1.0000e-04
INFO - 06/20/20 11:26:51 - 0:10:25 -      80 -   51.42 equations/s -  4298.36 words/s - PRIM-FWD:  2.8075 - model LR: 1.0000e-04
INFO - 06/20/20 11:27:02 - 0:10:37 -     100 -   55.36 equations/s -  4438.87 words/s - PRIM-FWD:  2.7966 - model LR: 1.0000e-04
INFO - 06/20/20 11:27:14 - 0:10:49 -     120 -   51.66 equations/s -  4079.64 words/s - PRIM-FWD:  2.3692 - model LR: 1.0000e-04
INFO - 06/20/20 11:27:26 - 0:11:01 -     140 -   53.91 equations/s -  4398.30 words/s - PRIM-FWD:  1.9535 - model LR: 1.0000e-04
INFO - 06/20/20 11:27:39 - 0:11:13 -     160 -   52.04 equations/s -  4252.21 words/s - PRIM-FWD:  1.8256 - model LR: 1.0000e-04
INFO - 06/20/20 11:27:50 - 0:11:24 -     180 -   56.92 equations/s -  4501.49 words/s - PRIM-FWD:  1.7446 - model LR: 1.0000e-04
INFO - 06/20/20 11:28:02 - 0:11:37 -     200 -   51.59 equations/s -  4094.86 words/s - PRIM-FWD:  1.7159 - model LR: 1.0000e-04
INFO - 06/20/20 11:28:13 - 0:11:47 -     220 -   62.50 equations/s -  5064.16 words/s - PRIM-FWD:  1.6465 - model LR: 1.0000e-04
INFO - 06/20/20 11:28:22 - 0:11:57 -     240 -   66.72 equations/s -  5378.79 words/s - PRIM-FWD:  1.6264 - model LR: 1.0000e-04
INFO - 06/20/20 11:28:34 - 0:12:09 -     260 -   53.14 equations/s -  4451.99 words/s - PRIM-FWD:  1.6008 - model LR: 1.0000e-04
INFO - 06/20/20 11:28:46 - 0:12:20 -     280 -   54.17 equations/s -  4507.31 words/s - PRIM-FWD:  1.5709 - model LR: 1.0000e-04
INFO - 06/20/20 11:28:58 - 0:12:32 -     300 -   54.70 equations/s -  4278.03 words/s - PRIM-FWD:  1.4920 - model LR: 1.0000e-04
INFO - 06/20/20 11:29:10 - 0:12:44 -     320 -   53.61 equations/s -  4295.23 words/s - PRIM-FWD:  1.4913 - model LR: 1.0000e-04
INFO - 06/20/20 11:29:21 - 0:12:56 -     340 -   55.41 equations/s -  4366.27 words/s - PRIM-FWD:  1.4300 - model LR: 1.0000e-04
INFO - 06/20/20 11:29:33 - 0:13:07 -     360 -   55.26 equations/s -  4189.66 words/s - PRIM-FWD:  1.4104 - model LR: 1.0000e-04
INFO - 06/20/20 11:29:44 - 0:13:18 -     380 -   58.53 equations/s -  4476.86 words/s - PRIM-FWD:  1.4049 - model LR: 1.0000e-04
INFO - 06/20/20 11:29:57 - 0:13:31 -     400 -   49.37 equations/s -  4020.01 words/s - PRIM-FWD:  1.4269 - model LR: 1.0000e-04
INFO - 06/20/20 11:30:08 - 0:13:43 -     420 -   55.73 equations/s -  4533.87 words/s - PRIM-FWD:  1.3905 - model LR: 1.0000e-04
INFO - 06/20/20 11:30:20 - 0:13:54 -     440 -   54.59 equations/s -  4383.42 words/s - PRIM-FWD:  1.3772 - model LR: 1.0000e-04
INFO - 06/20/20 11:30:31 - 0:14:05 -     460 -   58.46 equations/s -  4490.91 words/s - PRIM-FWD:  1.3546 - model LR: 1.0000e-04
INFO - 06/20/20 11:30:42 - 0:14:17 -     480 -   55.79 equations/s -  4524.04 words/s - PRIM-FWD:  1.3286 - model LR: 1.0000e-04
INFO - 06/20/20 11:30:54 - 0:14:28 -     500 -   55.60 equations/s -  4384.97 words/s - PRIM-FWD:  1.2861 - model LR: 1.0000e-04
INFO - 06/20/20 11:31:05 - 0:14:40 -     520 -   55.59 equations/s -  4474.77 words/s - PRIM-FWD:  1.3405 - model LR: 1.0000e-04
INFO - 06/20/20 11:31:16 - 0:14:51 -     540 -   57.49 equations/s -  4432.98 words/s - PRIM-FWD:  1.2843 - model LR: 1.0000e-04
INFO - 06/20/20 11:31:29 - 0:15:03 -     560 -   51.95 equations/s -  4309.73 words/s - PRIM-FWD:  1.2901 - model LR: 1.0000e-04
INFO - 06/20/20 11:31:40 - 0:15:14 -     580 -   57.18 equations/s -  4472.85 words/s - PRIM-FWD:  1.2819 - model LR: 1.0000e-04
INFO - 06/20/20 11:31:48 - 0:15:22 -     600 -   79.82 equations/s -  6497.11 words/s - PRIM-FWD:  1.2780 - model LR: 1.0000e-04
INFO - 06/20/20 11:31:59 - 0:15:34 -     620 -   56.68 equations/s -  4412.18 words/s - PRIM-FWD:  1.2430 - model LR: 1.0000e-04
INFO - 06/20/20 11:32:11 - 0:15:45 -     640 -   56.33 equations/s -  4484.59 words/s - PRIM-FWD:  1.2386 - model LR: 1.0000e-04
INFO - 06/20/20 11:32:23 - 0:15:57 -     660 -   53.71 equations/s -  4427.40 words/s - PRIM-FWD:  1.2544 - model LR: 1.0000e-04
INFO - 06/20/20 11:32:35 - 0:16:09 -     680 -   52.93 equations/s -  4260.39 words/s - PRIM-FWD:  1.2537 - model LR: 1.0000e-04
INFO - 06/20/20 11:32:47 - 0:16:22 -     700 -   50.66 equations/s -  4088.17 words/s - PRIM-FWD:  1.2419 - model LR: 1.0000e-04
INFO - 06/20/20 11:33:00 - 0:16:34 -     720 -   52.07 equations/s -  4174.89 words/s - PRIM-FWD:  1.2456 - model LR: 1.0000e-04
INFO - 06/20/20 11:33:12 - 0:16:47 -     740 -   49.73 equations/s -  4076.14 words/s - PRIM-FWD:  1.2089 - model LR: 1.0000e-04
INFO - 06/20/20 11:33:24 - 0:16:58 -     760 -   55.88 equations/s -  4467.64 words/s - PRIM-FWD:  1.2074 - model LR: 1.0000e-04
INFO - 06/20/20 11:33:36 - 0:17:11 -     780 -   51.60 equations/s -  4114.93 words/s - PRIM-FWD:  1.1783 - model LR: 1.0000e-04
INFO - 06/20/20 11:33:47 - 0:17:22 -     800 -   58.98 equations/s -  4600.33 words/s - PRIM-FWD:  1.1655 - model LR: 1.0000e-04
INFO - 06/20/20 11:33:58 - 0:17:33 -     820 -   56.72 equations/s -  4462.36 words/s - PRIM-FWD:  1.1864 - model LR: 1.0000e-04
INFO - 06/20/20 11:34:11 - 0:17:45 -     840 -   52.34 equations/s -  4370.07 words/s - PRIM-FWD:  1.1730 - model LR: 1.0000e-04
INFO - 06/20/20 11:34:23 - 0:17:57 -     860 -   52.13 equations/s -  4359.20 words/s - PRIM-FWD:  1.1608 - model LR: 1.0000e-04
INFO - 06/20/20 11:34:35 - 0:18:09 -     880 -   53.57 equations/s -  4216.33 words/s - PRIM-FWD:  1.1555 - model LR: 1.0000e-04
INFO - 06/20/20 11:34:46 - 0:18:20 -     900 -   59.98 equations/s -  4686.62 words/s - PRIM-FWD:  1.1047 - model LR: 1.0000e-04
INFO - 06/20/20 11:34:57 - 0:18:32 -     920 -   54.85 equations/s -  4360.40 words/s - PRIM-FWD:  1.1157 - model LR: 1.0000e-04
INFO - 06/20/20 11:35:09 - 0:18:43 -     940 -   55.56 equations/s -  4735.11 words/s - PRIM-FWD:  1.1610 - model LR: 1.0000e-04
INFO - 06/20/20 11:35:18 - 0:18:52 -     960 -   69.29 equations/s -  5503.90 words/s - PRIM-FWD:  1.1267 - model LR: 1.0000e-04
INFO - 06/20/20 11:35:30 - 0:19:05 -     980 -   51.36 equations/s -  4119.40 words/s - PRIM-FWD:  1.1064 - model LR: 1.0000e-04
INFO - 06/20/20 11:35:42 - 0:19:17 -    1000 -   53.61 equations/s -  4385.64 words/s - PRIM-FWD:  1.0643 - model LR: 1.0000e-04
INFO - 06/20/20 11:35:54 - 0:19:28 -    1020 -   56.78 equations/s -  4460.24 words/s - PRIM-FWD:  1.0605 - model LR: 1.0000e-04
INFO - 06/20/20 11:36:05 - 0:19:40 -    1040 -   55.88 equations/s -  4477.55 words/s - PRIM-FWD:  1.0733 - model LR: 1.0000e-04
INFO - 06/20/20 11:36:18 - 0:19:52 -    1060 -   50.46 equations/s -  4206.24 words/s - PRIM-FWD:  1.0759 - model LR: 1.0000e-04
INFO - 06/20/20 11:36:29 - 0:20:04 -    1080 -   56.53 equations/s -  4536.44 words/s - PRIM-FWD:  1.0291 - model LR: 1.0000e-04
INFO - 06/20/20 11:36:41 - 0:20:15 -    1100 -   55.41 equations/s -  4279.33 words/s - PRIM-FWD:  1.0145 - model LR: 1.0000e-04
INFO - 06/20/20 11:36:52 - 0:20:27 -    1120 -   55.33 equations/s -  4521.44 words/s - PRIM-FWD:  1.0304 - model LR: 1.0000e-04
INFO - 06/20/20 11:37:04 - 0:20:39 -    1140 -   53.31 equations/s -  4183.06 words/s - PRIM-FWD:  1.0435 - model LR: 1.0000e-04
INFO - 06/20/20 11:37:16 - 0:20:50 -    1160 -   54.15 equations/s -  4355.94 words/s - PRIM-FWD:  1.0270 - model LR: 1.0000e-04
INFO - 06/20/20 11:37:28 - 0:21:03 -    1180 -   51.46 equations/s -  4113.32 words/s - PRIM-FWD:  1.0338 - model LR: 1.0000e-04
INFO - 06/20/20 11:37:40 - 0:21:15 -    1200 -   53.98 equations/s -  4204.74 words/s - PRIM-FWD:  1.0104 - model LR: 1.0000e-04
INFO - 06/20/20 11:37:52 - 0:21:27 -    1220 -   53.36 equations/s -  4320.03 words/s - PRIM-FWD:  0.9798 - model LR: 1.0000e-04
INFO - 06/20/20 11:38:04 - 0:21:38 -    1240 -   55.14 equations/s -  4524.91 words/s - PRIM-FWD:  0.9904 - model LR: 1.0000e-04
INFO - 06/20/20 11:38:17 - 0:21:51 -    1260 -   49.25 equations/s -  4170.14 words/s - PRIM-FWD:  1.0186 - model LR: 1.0000e-04
INFO - 06/20/20 11:38:29 - 0:22:04 -    1280 -   51.74 equations/s -  4232.69 words/s - PRIM-FWD:  0.9779 - model LR: 1.0000e-04
INFO - 06/20/20 11:38:37 - 0:22:12 -    1300 -   78.08 equations/s -  6219.90 words/s - PRIM-FWD:  1.0016 - model LR: 1.0000e-04
INFO - 06/20/20 11:38:49 - 0:22:23 -    1320 -   55.58 equations/s -  4616.37 words/s - PRIM-FWD:  0.9794 - model LR: 1.0000e-04
INFO - 06/20/20 11:39:01 - 0:22:35 -    1340 -   54.98 equations/s -  4320.82 words/s - PRIM-FWD:  0.9407 - model LR: 1.0000e-04
INFO - 06/20/20 11:39:12 - 0:22:47 -    1360 -   54.68 equations/s -  4239.37 words/s - PRIM-FWD:  0.9584 - model LR: 1.0000e-04
INFO - 06/20/20 11:39:26 - 0:23:00 -    1380 -   47.96 equations/s -  3956.55 words/s - PRIM-FWD:  0.9835 - model LR: 1.0000e-04
INFO - 06/20/20 11:39:38 - 0:23:12 -    1400 -   52.94 equations/s -  4353.62 words/s - PRIM-FWD:  0.9750 - model LR: 1.0000e-04
INFO - 06/20/20 11:39:50 - 0:23:25 -    1420 -   51.48 equations/s -  4255.92 words/s - PRIM-FWD:  0.9817 - model LR: 1.0000e-04

Training sample

641556|add mul INT- 2 Y mul x Y'        mul a8 pow x INT+ 2
406394|add INT- 2 Y'    add a8 mul INT+ 2 x
265543|add mul INT- 3 Y mul x Y'        mul a8 pow x INT+ 3
238156|add INT+ 1 Y'    add a8 mul INT- 1 x
201768|add pow x INT+ 2 add mul INT- 1 mul x Y' Y       mul x add a8 x
161143|add x add mul INT- 2 Y mul x Y'  add x mul a8 pow x INT+ 2
160067|add INT- 3 Y'    add a8 mul INT+ 3 x
132622|add INT- 1 add mul INT- 2 x Y'   add a8 add x pow x INT+ 2
122995|add mul INT- 4 Y mul x Y'        mul a8 pow x INT+ 4
115829|add mul INT+ 2 Y mul x Y'        mul a8 pow x INT- 2
100162|add mul INT- 2 x add mul x Y' Y  add x mul a8 pow x INT- 1
98197|add mul INT+ 2 pow x INT+ 2 add mul INT- 1 mul x Y' Y     mul x add a8 mul INT+ 2 x
93535|add pow x INT+ 3 add mul INT+ 2 Y mul INT- 1 mul x Y'     mul pow x INT+ 2 add a8 x
82497|add mul x Y' mul INT- 1 mul Y ln Y        exp mul a8 x
72339|add pow Y INT+ 2 add mul INT- 1 Y mul x Y'        mul x pow add a8 x INT- 1
70919|add INT- 4 Y'     add a8 mul INT+ 4 x
70747|add INT- 1 mul x Y'       ln mul a8 x
70486|add mul INT- 1 atan Y add mul x Y' mul INT- 1 mul pow Y INT+ 2 atan Y     tan mul a8 x
70196|add mul INT- 1 Y Y'       mul a8 exp x
69567|add mul INT- 2 Y mul x Y' mul x add x mul a8 x
66853|add mul Y' sin x mul INT- 1 mul cos x Y   mul a8 sin x
66847|add mul Y' cos x mul Y sin x      mul a8 cos x
61712|add mul INT- 1 exp x Y'   add a8 exp x
61466|add Y' sin x      add a8 cos x
61220|add INT- 1 mul x Y'       add a8 ln x
61149|add mul INT- 1 cos x Y'   add a8 sin x
60665|add mul INT- 3 pow x INT+ 2 Y'    add a8 pow x INT+ 3
60579|add INT- 1 mul Y' exp Y   ln add a8 x
57739|add INT- 1 mul INT+ 2 mul Y' Y    sqrt add a8 x
57232|add mul INT- 1 Y Y'       exp add a8 x
57152|add INT- 1 add mul INT- 1 pow Y INT+ 2 Y' tan add a8 x
53385|add mul INT- 1 Y mul x Y' add mul INT- 1 x mul a8 x
53368|add pow x INT+ 2 add mul INT- 1 mul x Y' Y        add x mul x add a8 x
51688|add pow x INT+ 2 add mul INT- 1 mul x Y' Y        add pow x INT+ 2 mul a8 x
49537|add mul INT- 1 Y add mul x Y' mul INT- 1 mul x Y  mul a8 mul x exp x
zzj0402 commented 4 years ago

https://julialang.slack.com/archives/C6A044SQH/p1593504459281700

zzj0402 commented 4 years ago

Prerequisites

Todo

zzj0402 commented 4 years ago

https://github.com/chengchingwen/Transformers.jl/issues/11

ChrisRackauckas commented 3 years ago

Moved to Symbolics.jl