Custom loss function #1069
Unanswered
DorofeevKirillDev asked this question in Q&A
Replies: 1 comment 2 replies
Can you try a template expression? See https://ai.damtp.cam.ac.uk/pysr/dev/examples#_13-vector-valued-expressions for a more complex example. So you could do, e.g.:

```python
import numpy as np
from pysr import PySRRegressor, TemplateExpressionSpec

spec = TemplateExpressionSpec(
    expressions=["f"],
    variable_names=["x", "y_target"],
    combine="""
        # Compute model outputs:
        f1 = f(x)
        f2 = f(x + 0.01)
        # Plateau penalty: only apply when x < 0
        plateau_penalty = (x < 0) * abs(f1 - f2)
        # Standard residual:
        residual = abs2(f1 - y_target)
        # Combine residual and plateau regularization:
        residual + 0.1 * plateau_penalty
    """,
)

model = PySRRegressor(
    expression_spec=spec,
    binary_operators=["+", "-", "*", "/", "^"],
    unary_operators=["sin", "cos", "exp"],
    maxsize=20,
    niterations=100,
)

# Stack the target into the features so the template can see it,
# then fit against zeros so the combined expression itself is minimized:
X_with_y = np.column_stack([X, y])
model.fit(X_with_y, np.zeros_like(y))
```
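If you'd rather keep a plain feature matrix (no stacked target), PySR also accepts a fully custom objective via `loss_function`, which takes Julia code; inside it you can evaluate the candidate expression at arbitrary inputs with `eval_tree_array`. A hedged sketch — the function name `plateau_loss`, the 0.01 shift, and the assumption that `x` is the first feature column are all illustrative:

```python
# Julia objective passed to PySR as a string. It evaluates the candidate
# expression at x and at x + 0.01, then adds the plateau penalty for x < 0.
julia_loss = """
function plateau_loss(tree, dataset::Dataset{T,L}, options) where {T,L}
    # Prediction at the original inputs:
    pred, ok = eval_tree_array(tree, dataset.X, options)
    !ok && return L(Inf)

    # Prediction with the first feature shifted by epsilon = 0.01:
    X_shift = copy(dataset.X)
    X_shift[1, :] .+= T(0.01)
    pred_shift, ok2 = eval_tree_array(tree, X_shift, options)
    !ok2 && return L(Inf)

    # Mean squared residual plus the plateau penalty on negative x:
    residual = sum(abs2, pred .- dataset.y) / dataset.n
    mask = dataset.X[1, :] .< 0
    penalty = sum(abs.(pred .- pred_shift) .* mask) / dataset.n
    return L(residual + 0.1 * penalty)
end
"""
```

You would then pass this as `PySRRegressor(loss_function=julia_loss, ...)` and call `fit(X, y)` with the ordinary target, so no zero-stacking is needed.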
Original question:
I need to enforce a plateau in my model's output for negative inputs. My idea is to add a custom loss term, specifically w * |f(x) - f(x + ε)|, where x is negative. I've seen examples using predefined data points, but how can I calculate the model's current prediction at specific x values inside a custom loss function?
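To make the penalty term concrete before wiring it into PySR, here is a tiny pure-Python sketch of w * |f(x) - f(x + ε)| applied only at negative x; `f`, `w`, and `eps` are illustrative stand-ins, not PySR API:

```python
# Toy sketch of the plateau penalty w * |f(x) - f(x + eps)|,
# summed only over points where x < 0.

def plateau_penalty(f, xs, w=0.1, eps=0.01):
    """Finite-difference penalty that encourages f to be flat for x < 0."""
    return sum(w * abs(f(x) - f(x + eps)) for x in xs if x < 0)

flat = lambda x: 1.0          # already a plateau: zero penalty
steep = lambda x: 10.0 * x    # slope 10: penalty of w * 10 * eps per point

xs = [-1.0, -0.5, 0.5, 1.0]   # only the two negative points contribute
print(plateau_penalty(flat, xs))   # -> 0.0
print(plateau_penalty(steep, xs))  # close to 2 * 0.1 * 0.1 = 0.02
```

This is exactly the shape of the term inside the template's `combine` string: a finite-difference flatness measure, masked to x < 0 and scaled by w.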