from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet
# root_mean_squared_error requires scikit-learn >= 1.4
from sklearn.metrics import mean_absolute_error, root_mean_squared_error

# X_train, y_train, X_valid, y_valid are assumed to be defined upstream
models = {
    'linear': LinearRegression(),
    'ridge': Ridge(alpha=1.0),
    'lasso': Lasso(alpha=0.01),
    'elastic_net': ElasticNet(alpha=0.01, l1_ratio=0.5),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    predictions = model.predict(X_valid)
    mae = mean_absolute_error(y_valid, predictions)
    rmse = root_mean_squared_error(y_valid, predictions)
    print(f"{name}: MAE={mae:.3f}, RMSE={rmse:.3f}")
For numeric targets I usually start simple and make regularization earn its keep. Ridge is stable under collinearity, Lasso drives uninformative coefficients to zero, and ElasticNet is a practical compromise when groups of correlated features exist. The goal is not just minimizing RMSE but understanding which variables carry usable signal.
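That last point — which variables carry signal — is something Lasso answers almost for free by zeroing out coefficients. A minimal sketch of inspecting that, using synthetic data (`make_regression` here is an assumption standing in for the real `X_train`/`y_train`, with only 3 of 10 features informative):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in data: 10 features, only 3 carry signal.
X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)
# Standardize so coefficient magnitudes are comparable across features.
X = StandardScaler().fit_transform(X)

lasso = Lasso(alpha=1.0).fit(X, y)
for i, coef in enumerate(lasso.coef_):
    print(f"feature_{i}: {coef:.3f}")
print("zeroed out:", int(np.sum(np.abs(lasso.coef_) < 1e-8)))
```

Features with exactly-zero coefficients can usually be dropped; in practice the alpha would be chosen by cross-validation (e.g. `LassoCV`) rather than fixed.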