ML / PLiR / Glmnet: Solving ElasticNet Regression Problems with the Glmnet Algorithm (Real-Valued Score Prediction)
Contents
Output
1. The Glmnet Algorithm
Implementation Code
Output
(Each line below is a pair "iStep iterStep": the lambda-step index and the number of inner coordinate-descent sweeps needed to converge at that step; the final list shows the attributes in the order their coefficients became non-zero.)

0 2
1 2
2 2
3 3
4 3
5 3
6 3
7 3
8 3
9 2
10 2
...
95 1
96 1
97 1
98 1
99 1
['"alcohol"', '"volatile acidity"', '"sulphates"', '"total sulfur dioxide"', '"chlorides"', '"fixed acidity"', '"pH"', '"free sulfur dioxide"', '"residual sugar"', '"citric acid"', '"density"']
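The attribute ordering above comes from coordinate descent with soft-thresholding. The listing below relies on a soft-thresholding function S(z, gamma) that is not shown in this excerpt; here is a minimal sketch of the standard definition it presumably uses:

```python
def S(z, gamma):
    """Soft-thresholding operator used in elastic net coordinate descent.

    Shrinks z toward zero by gamma, returning exactly zero when |z| <= gamma.
    This is what drives small coefficients to exactly 0 along the lambda path.
    """
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0
```

Coefficients whose (partial) correlation with the residual stays below the current threshold lam * alpha are held at exactly zero, which is why attributes enter the model one at a time as lambda shrinks.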
1. The Glmnet Algorithm
Implementation Code
#calculate starting value for lambda
#(assumes xNormalized, labelNormalized, nrows, ncols, alpha, maxXY and the
#soft-thresholding function S() are defined earlier in the program)
lam = maxXY/alpha
#this value of lambda corresponds to beta = list of 0's
#initialize a vector of coefficients beta
beta = [0.0] * ncols
#initialize matrix of betas at each step
betaMat = []
betaMat.append(list(beta))

#begin iteration
nSteps = 100
lamMult = 0.93  #100 steps gives reduction by factor of 1000 in lambda
                #(recommended by authors)
nzList = []
for iStep in range(nSteps):
    #make lambda smaller so that some coefficient becomes non-zero
    lam = lam * lamMult
    deltaBeta = 100.0
    eps = 0.01
    iterStep = 0
    betaInner = list(beta)
    while deltaBeta > eps:
        iterStep += 1
        if iterStep > 100:
            break
        #cycle through attributes and update one-at-a-time
        #record starting value for comparison
        betaStart = list(betaInner)
        for iCol in range(ncols):
            xyj = 0.0
            for i in range(nrows):
                #calculate residual with current value of beta
                labelHat = sum([xNormalized[i][k] * betaInner[k]
                                for k in range(ncols)])
                residual = labelNormalized[i] - labelHat
                xyj += xNormalized[i][iCol] * residual
            uncBeta = xyj/nrows + betaInner[iCol]
            betaInner[iCol] = S(uncBeta, lam * alpha) / (1 + lam * (1 - alpha))
        sumDiff = sum([abs(betaInner[n] - betaStart[n]) for n in range(ncols)])
        sumBeta = sum([abs(betaInner[n]) for n in range(ncols)])
        deltaBeta = sumDiff/sumBeta
    print(iStep, iterStep)
    beta = betaInner
    #add newly determined beta to list
    betaMat.append(beta)
    #keep track of the order in which the betas become non-zero
    nzBeta = [index for index in range(ncols) if beta[index] != 0.0]
    for q in nzBeta:
        if q not in nzList:
            nzList.append(q)
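The listing above depends on variables prepared elsewhere in the program. To show the algorithm end to end, here is a self-contained sketch that wraps the same coordinate-descent loop in a function and runs it on synthetic data; the glmnet_path name, the S() helper, the starting-lambda computation, and the synthetic dataset are illustrative assumptions, not part of the original post (which uses the normalized wine-quality data):

```python
import random

def S(z, gamma):
    # soft-thresholding operator (assumed helper, standard in glmnet)
    if z > gamma:
        return z - gamma
    if z < -gamma:
        return z + gamma
    return 0.0

def glmnet_path(xNormalized, labelNormalized, alpha=1.0, nSteps=100, lamMult=0.93):
    """Return (betaMat, nzList): the coefficient path over the lambda
    sequence and the order in which attributes become non-zero."""
    nrows = len(xNormalized)
    ncols = len(xNormalized[0])
    # smallest lambda for which every coefficient is exactly zero
    maxXY = max(abs(sum(xNormalized[i][j] * labelNormalized[i]
                        for i in range(nrows))) / nrows
                for j in range(ncols))
    lam = maxXY / alpha
    beta = [0.0] * ncols
    betaMat = [list(beta)]
    nzList = []
    for _ in range(nSteps):
        lam *= lamMult                      # shrink lambda each step
        deltaBeta, eps, iterStep = 100.0, 0.01, 0
        betaInner = list(beta)              # warm start from previous step
        while deltaBeta > eps:
            iterStep += 1
            if iterStep > 100:
                break
            betaStart = list(betaInner)
            for iCol in range(ncols):       # one-at-a-time coordinate updates
                xyj = 0.0
                for i in range(nrows):
                    labelHat = sum(xNormalized[i][k] * betaInner[k]
                                   for k in range(ncols))
                    xyj += xNormalized[i][iCol] * (labelNormalized[i] - labelHat)
                uncBeta = xyj / nrows + betaInner[iCol]
                betaInner[iCol] = S(uncBeta, lam * alpha) / (1 + lam * (1 - alpha))
            sumDiff = sum(abs(betaInner[n] - betaStart[n]) for n in range(ncols))
            sumBeta = sum(abs(b) for b in betaInner)
            # guard against all-zero beta (the book code divides directly)
            deltaBeta = sumDiff / sumBeta if sumBeta > 0 else 0.0
        beta = betaInner
        betaMat.append(list(beta))
        for q in [i for i in range(ncols) if beta[i] != 0.0]:
            if q not in nzList:
                nzList.append(q)
    return betaMat, nzList

# demo on synthetic data: the label depends almost entirely on column 0,
# so that attribute should be the first to enter the model
random.seed(3)
nrows, ncols = 60, 3
x = [[random.gauss(0.0, 1.0) for _ in range(ncols)] for _ in range(nrows)]
y = [2.0 * row[0] + 0.1 * random.gauss(0.0, 1.0) for row in x]
betaMat, nzList = glmnet_path(x, y, alpha=1.0)
print("order attributes entered:", nzList)
print("final coefficients:", [round(b, 2) for b in betaMat[-1]])
```

The synthetic columns are drawn approximately standardized, matching the algorithm's assumption of normalized inputs; with real data the columns and labels would first be centered and scaled, as in the original post.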