MilesCranmer opened 2 years ago
For this to work, I think the frozen nodes would have to be at the root of the tree. I don't think you could have non-frozen parent nodes with frozen children.
If you assume that all the frozen nodes are at the root, and frozen, then it becomes reasonable to implement. You would basically only do crossovers between non-frozen nodes, which would guarantee that none of the frozen nodes are copied during crossover.
During mutation, one would make it so the prepend mutation cannot operate if any nodes are frozen, and the insertion mutation only inserts after (i.e., below) the frozen nodes.
For example:
```mermaid
graph TD;
    A["+ (frozen)"]
    B["× (frozen)"]
    C["×"]
    D[3.2]
    E["a"]
    F["b"]
    G["c (frozen)"]
    A-->B
    A-->C
    C-->D
    C-->E
    B-->F
    B-->G
```
This would represent the expression $+(\times(b, c), \times(3.2, a))$, or, $b\cdot c + 3.2 a$. Since the $+$, $\times$, and $c$ are frozen, you could only mutate and apply crossovers to the nodes below them. This would essentially guarantee that those nodes are always kept in-place and with the same connectivity.
For example, the following series of mutations could happen:
```mermaid
graph TD;
    A["+ (frozen)"]
    B["× (frozen)"]
    C["a"]
    F["cos"]
    G["c (frozen)"]
    H["d"]
    A-->B
    A-->C
    B-->F
    B-->G
    F-->H
```
which still has the same nodes frozen in place. This is equal to $\cos(d) \cdot c + a$.
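The crossover/mutation restriction described above can be sketched as follows. This is a minimal toy illustration, not the SymbolicRegression.jl API: `ToyNode` and its hypothetical `frozen` flag are stand-ins, and the point is only that frozen nodes are excluded from the pool of candidate crossover/mutation points.

```julia
# Toy tree node with a hypothetical `frozen` flag (not the real Node type).
mutable struct ToyNode
    op::String
    frozen::Bool
    children::Vector{ToyNode}
end
ToyNode(op; frozen=false, children=ToyNode[]) = ToyNode(op, frozen, children)

# Collect every node in the subtree that is allowed to participate in
# crossover or mutation, i.e. every node that is NOT frozen.
function mutable_nodes(node::ToyNode, acc=ToyNode[])
    node.frozen || push!(acc, node)
    for child in node.children
        mutable_nodes(child, acc)
    end
    return acc
end

# Pick a random crossover point among the non-frozen nodes;
# returns `nothing` if the whole tree is frozen.
function crossover_point(tree::ToyNode)
    candidates = mutable_nodes(tree)
    isempty(candidates) && return nothing
    return rand(candidates)
end
```

Since frozen nodes never enter the candidate pool, any subtree swap or node replacement drawn from it is guaranteed to leave the frozen skeleton intact.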
@ChrisRackauckas - I think this might be a direction to get SINDy-like models within SymbolicRegression.jl? Much slower since it has to traverse a tree rather than do a dot product, but the flexibility might be interesting to exploit somehow.
Dear MilesCranmer,
We in the ATOMS Lab are also interested in freezing nodes, and we've been looking into this for a while. Our approach divides the system into a "frozen" part and a "flexible" part handled by symbolic regression (PySR), with the aim of combining both parts in one expression for better predictive results.

Here's our plan:

- Add a frozen-tree option to Options.jl, so the user can turn it on or off.
- Pass the `frozen_tree` into a new frozen loss function in LossFunctions.jl, which uses the frozen-tree expression together with the flexible expression to compute the loss.
- Use the frozen loss to score expressions in HallOfFame.jl.
- Optionally, optimize the constants in the frozen part using ConstantOptimization.jl, but never change its tree structure.

Currently, we're implementing these changes in a fork of the SymbolicRegression.jl repository. We aim to avoid high-level changes, such as modifying the population, mutation, and crossover Julia files, to prevent disruptions to the symbolic regression pipeline. We'd love to discuss the work plan with you and ask some technical questions about our current approach.
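The frozen-loss idea in the plan above might look roughly like the following. All names here (`frozen_expr`, `flexible_expr`, `frozen_loss`, the `+` combination) are illustrative assumptions, not the SymbolicRegression.jl or LossFunctions.jl API: the fixed part is evaluated alongside the evolving part, and only the combination is scored against the data.

```julia
# Hypothetical fixed part: its structure is never mutated during search.
frozen_expr(x) = 3.2 * x[1]

# Stand-in for the evolving expression produced by symbolic regression.
flexible_expr(x) = x[2] * x[3]

# Score the COMBINED prediction (frozen + flexible) against targets `y`.
# `X` holds one data point per column; `combine` is how the two parts
# are joined (here additively, as in a SINDy-style decomposition).
function frozen_loss(X, y; combine = +)
    total = 0.0
    for i in axes(X, 2)
        pred = combine(frozen_expr(X[:, i]), flexible_expr(X[:, i]))
        total += (pred - y[i])^2
    end
    return total / size(X, 2)
end
```

The key property is that the search only ever rewrites `flexible_expr`, while the loss always reflects the full combined model.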
Hi @charishma13,
Thanks for reaching out, I'd be very happy to discuss this! If you email me I would be happy to join a zoom call.
For implementation, I think the most general approach would be to fork DynamicExpressions.jl and add an extra field to the `Node` type for storing metadata, which would allow for an extra boolean parameter declaring a node as frozen or not. For example, you could add an extra field as follows:
```diff
-mutable struct Node{T} <: AbstractExpressionNode{T}
+mutable struct Node{T,E} <: AbstractExpressionNode{T}
     degree::UInt8    # 0 for constant/variable, 1 for cos/sin, 2 for +/* etc.
     constant::Bool   # false if variable
     val::T           # If a constant, this stores the actual value
     # ------------------- (possibly undefined below)
     feature::UInt16  # If a variable (e.g., x in cos(x)), this stores the feature index
     op::UInt8        # If an operator, the index into operators.binops or operators.unaops
+    extra::E         # NEW: metadata (e.g., a frozen flag)
-    l::Node{T}       # Left child node. Only defined for degree=1 or degree=2.
-    r::Node{T}       # Right child node. Only defined for degree=2.
+    l::Node{T,E}     # Left child node. Only defined for degree=1 or degree=2.
+    r::Node{T,E}     # Right child node. Only defined for degree=2.

     #################
     ## Constructors:
     #################
-    Node{_T}() where {_T} = new{_T}()
+    Node{_T}() where {_T} = new{_T,Nothing}()
+    Node{_T,_E}() where {_T,_E} = new{_T,_E}()
 end
```
So the default `extra` field would just be `nothing`. Then, to create a frozen node, it would be something like:
```julia
tree = Node{Float64,@NamedTuple{frozen::Bool}}()
tree.extra = (; frozen = true)  # NamedTuples are immutable, so assign the whole field
# Other properties here:
tree.degree = 0
tree.constant = true
tree.val = 1.0
...
```
Then the various mutation functions could have their random samplers modified to filter out any nodes with a frozen attribute.
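That filtering step might look like the following sketch. `MetaNode`, `is_frozen`, and `sample_pool` are hypothetical names, not the DynamicExpressions.jl API; the point is only the dispatch pattern: an `extra` of `nothing` (the default) is never frozen, while a `NamedTuple` is checked for a `frozen` field.

```julia
# Stand-in for a node carrying the proposed `extra` metadata field.
struct MetaNode{E}
    op::String
    extra::E
end

# Default metadata (`nothing`) is never frozen.
is_frozen(::Nothing) = false
# NamedTuple metadata is frozen iff it has `frozen = true`.
is_frozen(extra::NamedTuple) = haskey(extra, :frozen) && extra.frozen

# Restrict a mutation's random sampler to the non-frozen nodes.
sample_pool(nodes) = [n for n in nodes if !is_frozen(n.extra)]
```

Dispatching on the type of `extra` means trees without metadata pay no cost and need no code changes: the `nothing` method short-circuits to `false`.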
Cheers, Miles
I was thinking about how to impose a specific functional form on expressions. One way to do this is to write a custom constraint, so that any expression breaking it receives an infinite loss. But this seems very inefficient.

One potentially elegant way is to add a `frozen` attribute to the `Node` struct. If a specific node is "frozen," then mutations and crossovers would know not to mutate that particular node, and skip over it. This way, you could have SymbolicRegression.jl flexibly construct expressions based on existing functional forms.