Replies: 2 comments 1 reply
-
There's an example here: https://astroautomata.com/PySR/examples/#9-custom-objectives

Also see:
- #291
- #465
- #528
- #557

Other potentially related:
- #401
-
Hi Miles, thank you for your swift reply and thorough answers. I read the later discussions, especially #557, and applied the approach to the following function:

y = f(x4, x5) * g(x3, x4) + h(x1, x2)

where f = 3 * np.sin(X[:, 3] + X[:, 4]), g = np.sin(X[:, 2] * 2.0) + X[:, 3] ** 2, and h = 2.0 * X[:, 0] * X[:, 1] ** 2 + X[:, 1].

But it fails to find the right function. Any further recommendations would be very appreciated!
```python
import numpy as np
from pysr import PySRRegressor

# Create a synthetic dataset
np.random.seed(0)
X = np.random.rand(100, 5) * 5.0  # 100 samples, 5 features

# Target: y = f(x4, x5) * g(x3, x4) + h(x1, x2)
# where f = 3 * np.sin(X[:, 3] + X[:, 4]),
#       g = np.sin(X[:, 2] * 2.0) + X[:, 3] ** 2,
#       h = 2.0 * X[:, 0] * X[:, 1] ** 2 + X[:, 1]
y = (
    (3 * np.sin(X[:, 3] + X[:, 4])) * (np.sin(X[:, 2] * 2.0) + X[:, 3] ** 2)
    + (2.0 * X[:, 0] * X[:, 1] ** 2 + X[:, 1])
    + np.random.randn(100) * 0.01
)

# Define the custom objective function
objective = """
function contains(t, features)
    if t.degree == 0
        return !t.constant && t.feature in features
    elseif t.degree == 1
        return contains(t.l, features)
    else
        return contains(t.l, features) || contains(t.r, features)
    end
end

function my_custom_objective(tree, dataset::Dataset{T,L}, options) where {T,L}
    # Require the top-level structure (a * b) + c
    tree.degree != 2 && return L(Inf)
    left = tree.l
    right = tree.r
    left.degree != 2 && return L(Inf)
    bot_left = left.l
    bot_right = left.r

    bot_left_pred, flag = eval_tree_array(bot_left, dataset.X, options)
    !flag && return L(Inf)
    bot_right_pred, flag = eval_tree_array(bot_right, dataset.X, options)
    !flag && return L(Inf)
    right_pred, flag = eval_tree_array(right, dataset.X, options)
    !flag && return L(Inf)

    prediction = bot_left_pred .* bot_right_pred .+ right_pred

    # h(x1, x2): penalize if it contains x3, x4, or x5, or is missing x1 and x2
    right_violating = Int(contains(right, (3, 4, 5))) + Int(!contains(right, (1, 2)))
    # f(x4, x5): penalize if it contains x1 or x2, or is missing x4 and x5
    bot_left_violating = Int(contains(bot_left, (1, 2))) + Int(!contains(bot_left, (4, 5)))
    # g(x3, x4): penalize if it contains x1, x2, or x5, or is missing x3 and x4
    bot_right_violating = Int(contains(bot_right, (1, 2, 5))) + Int(!contains(bot_right, (3, 4)))

    regularization = L(100) * (right_violating + bot_left_violating + bot_right_violating)

    diffs = (prediction .- dataset.y) .^ 2
    rmse = sqrt(sum(diffs) / length(dataset.y))
    return rmse + regularization
end

my_custom_objective
"""

# Initialize and train the PySRRegressor
model = PySRRegressor(
    procs=16,
    maxsize=30,
    maxdepth=7,
    populations=48,
    population_size=65,
    niterations=500,
    ncyclesperiteration=500,
    binary_operators=["+", "*", "-", "/", "^"],
    unary_operators=["sin"],
    parsimony=0.05,
    adaptive_parsimony_scaling=1000,
    precision=64,
    loss_function=objective,
)

# Train the model
model.fit(X, y)

# Predict on the training set
y_pred = model.predict(X)
print(model.equations_)
```
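The recursive `contains` helper inside the Julia objective is the piece doing the structural enforcement. As a sketch of the same logic outside Julia (the `Node` class here is a hypothetical stand-in, assuming the same fields `degree`, `feature`, `constant`, `l`, `r` used by the objective, with degree 0 = leaf, 1 = unary, 2 = binary):

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Node:
    degree: int                    # 0 = leaf, 1 = unary op, 2 = binary op
    feature: Optional[int] = None  # 1-based feature index for variable leaves
    constant: bool = False         # True if the leaf is a constant, not a variable
    l: Optional["Node"] = None
    r: Optional["Node"] = None

def contains(t: Node, features: Tuple[int, ...]) -> bool:
    """Mirror of the Julia helper: does any leaf of t use one of `features`?"""
    if t.degree == 0:
        return (not t.constant) and t.feature in features
    elif t.degree == 1:
        return contains(t.l, features)
    else:
        return contains(t.l, features) or contains(t.r, features)

# Example: the subtree sin(x4 + x5)
leaf4 = Node(degree=0, feature=4)
leaf5 = Node(degree=0, feature=5)
plus = Node(degree=2, l=leaf4, r=leaf5)
tree = Node(degree=1, l=plus)

print(contains(tree, (4, 5)))  # True: the subtree uses x4 and x5
print(contains(tree, (1, 2)))  # False: no x1 or x2 anywhere in it
```

The penalty terms in the objective are then just these boolean checks converted to 0/1 and scaled by a large constant, so any candidate whose subtrees touch the wrong variables is pushed out of the population.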
…On Mon, Jun 3, 2024 at 1:48 AM Miles Cranmer wrote:

See
- #291
- #465
- #528
- #557

Other potentially related:
- #401
-
Hello everyone,
I want to fit my equation as a sum of two sub-equations, each a function of particular variables, for example:

my function = f(x, y) + g(z, u)

That is, the function is a sum of two functions: the first depends only on the variables x and y, and the second only on z and u. I am also interested in other shapes, like 1/f(x, y) + 1/g(z, u).
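Independent of how the search is constrained, a function of the form f(x, y) + g(z, u) has the property that mixed partial derivatives across the two variable groups vanish (e.g. ∂²F/∂x∂z = 0). Here is a minimal sketch, not from this thread, of checking that numerically with central finite differences; the function names and the tolerance are illustrative assumptions:

```python
import numpy as np

def cross_mixed_partial(F, p, i, j, eps=1e-4):
    """Central finite-difference estimate of d^2 F / (dp_i dp_j) at point p."""
    p = np.asarray(p, dtype=float)
    def shift(di, dj):
        q = p.copy()
        q[i] += di
        q[j] += dj
        return F(q)
    return (shift(eps, eps) - shift(eps, -eps)
            - shift(-eps, eps) + shift(-eps, -eps)) / (4 * eps ** 2)

# Separable example: F = f(x, y) + g(z, u)
def F_sep(p):
    x, y, z, u = p
    return np.sin(x * y) + z ** 2 * u

# Non-separable example: x multiplies z, coupling the two groups
def F_mixed(p):
    x, y, z, u = p
    return np.sin(x * y) + x * z

point = np.array([0.5, 1.2, -0.7, 2.0])
print(abs(cross_mixed_partial(F_sep, point, 0, 2)) < 1e-5)   # True
print(abs(cross_mixed_partial(F_mixed, point, 0, 2)) < 1e-5) # False
```

A check like this, applied to a fitted model's prediction function at a few sample points, can tell you whether the discovered expression actually respects the intended additive split before you inspect its printed form.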