Replies: 1 comment 2 replies
-
Hi @GFODK,

Unfortunately, you cannot have a dynamic loss function: the loss is cached at various points in the search rather than re-computed each time, so a loss function that changes mid-search would break some assumptions in the code. What you can do, though, is adjust the loss function between calls to `fit!`:

```julia
using SymbolicRegression: SRRegressor
using MLJ: machine, fit!

model = SRRegressor(
    niterations=1,
    binary_operators=[+, -, *, /],
    unary_operators=[sin, cos, exp, log],
    population_size=50,
    loss_function=loss_functions[1],
)
mach = machine(model, x, y)
fit!(mach)

# Swap in the next loss function and run one more
# iteration of the search each time through the loop.
for i in 2:100
    mach.model.loss_function = loss_functions[i]
    mach.model.niterations += 1
    fit!(mach)
end
```

or something like this. You might also want to build a callable struct that carries parameters you can set, like:

```julia
using SymbolicRegression: Node, Dataset, Options

struct LossContext <: Function
    w_a::Float64
    w_b::Float64
    w_c::Float64
end

function (ctx::LossContext)(tree::Node, dataset::Dataset{T,L}, options::Options) where {T,L}
    # ... compute the individual loss terms L_a, L_b, L_c for `tree` on `dataset` ...
    return L_a * ctx.w_a + L_b * ctx.w_b + L_c * ctx.w_c
end
```

and then you can just define the loss like `SRRegressor(loss_function=LossContext(0.1, 0.4, 0.4))`.
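Combining the two ideas, you could ramp the weights across the outer loop. A minimal sketch, where the linear schedule and the specific weight values are placeholders, not a recommendation:

```julia
# Hypothetical schedule: shift weight from the first term
# to the third as the outer loop advances.
for i in 2:100
    t = i / 100
    mach.model.loss_function = LossContext(0.1 * (1 - t), 0.4, 0.4 + 0.1 * t)
    mach.model.niterations += 1
    fit!(mach)
end
```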
-
Hi,
I'm training a symbolic regression model in Julia (not via PySR) and defining a custom loss function. My loss includes two additional conditions, and I manually assign a different weight to each term.
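Schematically it's something like this (a sketch; `penalty_a` and `penalty_b` are hypothetical stand-ins for the two extra conditions, and the weights are hard-coded):

```julia
using SymbolicRegression: Dataset, Options, eval_tree_array

# Sketch only: penalty_a and penalty_b stand in for the two
# extra conditions; the weights are fixed by hand.
function custom_loss(tree, dataset::Dataset{T,L}, options::Options)::L where {T,L}
    prediction, completed = eval_tree_array(tree, dataset.X, options)
    !completed && return L(Inf)
    base = sum(abs2, prediction .- dataset.y) / dataset.n
    return base + 0.4 * penalty_a(prediction) + 0.1 * penalty_b(tree)
end
```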
I’d like to adjust the weights of my loss terms dynamically based on the current iteration.
Is there a built-in variable that keeps track of the iteration count inside the `custom_loss` function? Or is there another way to access the current iteration during training?
Any help would be much appreciated! Thanks! 😊