  • [Purpose]
    1. ElasticNet
    • Use a regularized linear model to prevent overfitting
    • Tune the hyperparameter lambda (alpha) not only with a for loop but also with GridSearchCV
      1. Regularized linear models require the X features to be scaled (see the sketch below)
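
All later snippets fit on X_scal, so here is a minimal sketch of how the scaled feature matrix could be produced. The names X and X_scal and the choice of StandardScaler are assumptions for illustration; strictly speaking, the scaler should be fit on the training rows only to avoid leakage into the validation set.

# Minimal sketch (assumption): standardize X before fitting regularized models.
# X is assumed to be a pandas DataFrame; preserving index/columns keeps the
# .iloc[train_idx] / .iloc[valid_idx] selections below working unchanged.
import pandas as pd
from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_scal = pd.DataFrame(scaler.fit_transform(X), index=X.index, columns=X.columns)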

[ElasticNet Regression]

  • Hyperparameter Tuning using for Loop
  • Hyperparameter Tuning using GridSearchCV

[ElasticNet Regression Parameters]

  • Package : https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.ElasticNet.html
  • alpha : constant that multiplies the penalty terms
    • alpha = 0 is equivalent to plain linear regression (OLS)
  • l1_ratio : mixing parameter between the L1 and L2 penalties
    • 0 <= l1_ratio <= 1
    • l1_ratio = 1 is pure Lasso; l1_ratio = 0 is pure Ridge
  • fit_intercept : whether to fit the intercept beta_0
    • If set to False, no intercept is used, i.e. the data is assumed to be already centered at zero (beta_0 is the constant term)
  • max_iter : maximum number of iterations
    • The LASSO penalty term in the loss function is an absolute value, which is not differentiable at zero, so an iterative optimizer is required (scikit-learn uses coordinate descent)
  • Penalty Term (the objective ElasticNet minimizes)
    • 1 / (2 * n_samples) * ||y - Xw||^2_2 + alpha * l1_ratio * ||w||_1 + 0.5 * alpha * (1 - l1_ratio) * ||w||^2_2
    • alpha = 0 -> plain linear regression
    • l1_ratio = 1 -> pure Lasso; l1_ratio = 0 -> pure Ridge
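The limiting cases can be checked directly: with l1_ratio = 1 the objective above is exactly the Lasso objective with the same alpha, so the two estimators learn identical coefficients. A minimal sketch on synthetic data (shapes and values are illustrative only):

# Sketch: ElasticNet with l1_ratio=1 coincides with Lasso at the same alpha.
import numpy as np
from sklearn.linear_model import ElasticNet, Lasso

rng = np.random.RandomState(0)
X_demo = rng.randn(100, 5)
y_demo = X_demo @ np.array([1.5, 0.0, -2.0, 0.0, 0.5]) + 0.1 * rng.randn(100)

enet = ElasticNet(alpha=0.1, l1_ratio=1.0).fit(X_demo, y_demo)
lasso = Lasso(alpha=0.1).fit(X_demo, y_demo)
print(np.allclose(enet.coef_, lasso.coef_))  # True: same objective, same solver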
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import GridSearchCV

# Candidate alphas; alpha = 0 is equivalent to ordinary least squares, as solved by the LinearRegression object.
alphas = [0.000001, 0.000005, 0.00001, 0.00005, 0.0001, 0.001, 0.005, 0.01, 0.05]
# Candidate l1_ratio values in (0, 1); it is common to place more values close to 1 (Lasso-like) than close to 0 (Ridge-like).
l1_ratio = [0.9, 0.7, 0.5, 0.3, 0.1]

# ElasticNet Regression
# select alpha and l1_ratio by checking R2, MSE, RMSE on the validation set
for a in alphas:
    for b in l1_ratio:
        model = ElasticNet(alpha=a, l1_ratio=b).fit(X_scal.iloc[train_idx], Y.iloc[train_idx])
        score = model.score(X_scal.iloc[valid_idx], Y.iloc[valid_idx])
        pred_y = model.predict(X_scal.iloc[valid_idx])
        mse = mean_squared_error(Y.iloc[valid_idx], pred_y)
        print("Alpha:{0:.7f}, l1_ratio: {1:.7f}, R2:{2:.7f}, MSE:{3:.7f}, RMSE:{4:.7f}".format(a, b, score, mse, np.sqrt(mse)))
# Cross Validation for ElasticNet
grid = dict()  # a dict, since there are two hyperparameters to search over
grid['alpha'] = alphas
grid['l1_ratio'] = l1_ratio
print(grid)
# define model
model = ElasticNet()
# define search
search = GridSearchCV(model, grid, scoring='neg_root_mean_squared_error', cv=5, n_jobs=-1)
# perform the search (GridSearchCV runs its own 5-fold CV inside the supplied data)
results = search.fit(X_scal.iloc[valid_idx], Y.iloc[valid_idx])
# summarize
print('RMSE: {:.4f}'.format(-results.best_score_))
print('Config: {}'.format(results.best_params_))
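# (Added sketch) Beyond best_score_, GridSearchCV stores every combination it
# evaluated in cv_results_; the keys used below are standard cv_results_ fields.
import pandas as pd

cv_table = pd.DataFrame(results.cv_results_)[['param_alpha', 'param_l1_ratio', 'mean_test_score']]
cv_table['mean_rmse'] = -cv_table['mean_test_score']  # undo the 'neg_' in the scorer
print(cv_table.sort_values('mean_rmse').head())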
model_best = ElasticNet(alpha=results.best_params_['alpha'],
                        l1_ratio=results.best_params_['l1_ratio']).fit(X_scal.iloc[train_idx], Y.iloc[train_idx])
score = model_best.score(X_scal.iloc[valid_idx], Y.iloc[valid_idx])
pred_y = model_best.predict(X_scal.iloc[valid_idx])
mse = mean_squared_error(Y.iloc[valid_idx], pred_y)
print("Alpha:{0:.7f}, l1_ratio: {1:.7f}, R2:{2:.7f}, MSE:{3:.7f}, RMSE:{4:.7f}".format(results.best_params_['alpha'],
                                                                                   results.best_params_['l1_ratio'],
                                                                                   score, mse, np.sqrt(mse)))
summary(model_best, X_scal.iloc[valid_idx], Y.iloc[valid_idx], xlabels=X.columns)
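Because the L1 part of the penalty can drive coefficients exactly to zero, it is worth checking which features the tuned model actually keeps. A small sketch (assuming the custom summary helper above does not already report this):

# Sketch: inspect the sparsity induced by the L1 part of the penalty.
import pandas as pd

coef = pd.Series(model_best.coef_, index=X.columns)
print("Dropped features:", coef[coef == 0].index.tolist())
print(coef[coef != 0].sort_values())  # surviving features and their weights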
