
Python Implementation of Linear Regression with Batch Gradient Descent (Code Example)

2020-02-16 11:26:50
Source: reprinted; contributed by a reader

After studying linear regression and gradient descent in the Stanford open course, I referred to other people's code and ran my own test. I wrapped it in a class; I'll extend it and add more code comments later when I have time, since I have a lot of homework:

import numpy as np
import matplotlib.pyplot as plt
import random


class dataMinning:
  def __init__(self, addressD, addressL, theta, numIterations, alpha, datasets=None):
    self.datasets = [] if datasets is None else datasets
    self.labelsets = []
    self.addressD = addressD  # data file path
    self.addressL = addressL  # label file path
    self.npDatasets = np.zeros(1)
    self.npLabelsets = np.zeros(1)
    self.cost = []
    self.theta = theta
    self.numIterations = numIterations
    self.alpha = alpha

  def readFrom(self):
    # Read whitespace-separated integers from the data and label files
    with open(self.addressD, 'r') as fd:
      for line in fd:
        self.datasets.append([int(i) for i in line.split()])
    self.npDatasets = np.array(self.datasets)
    with open(self.addressL, 'r') as fl:
      for line in fl:
        self.labelsets.append([int(i) for i in line.split()])
    # Flatten the label rows into one array
    tm = []
    for item in self.labelsets:
      tm = tm + item
    self.npLabelsets = np.array(tm)

  def genData(self, numPoints, bias, variance):
    # Generate y = x + bias + uniform noise, with a column of ones for the intercept
    self.genx = np.zeros(shape=(numPoints, 2))
    self.geny = np.zeros(shape=numPoints)
    for i in range(numPoints):
      self.genx[i][0] = 1
      self.genx[i][1] = i
      self.geny[i] = (i + bias) + random.uniform(0, 1) * variance

  def gradientDescent(self):
    xTrans = self.genx.transpose()
    for _ in range(self.numIterations):
      hypothesis = np.dot(self.genx, self.theta)
      loss = hypothesis - self.geny
      # record the cost (sum of squared errors)
      self.cost.append(np.sum(loss ** 2))
      # full-batch gradient of the squared-error cost
      gradient = np.dot(xTrans, loss)
      # update step of gradient descent
      self.theta = self.theta - self.alpha * gradient

  def show(self):
    print('yes')


if __name__ == "__main__":
  c = dataMinning('c://city.txt', 'c://st.txt', np.ones(2), 100000, 0.000005)
  c.genData(100, 25, 10)
  c.gradientDescent()
  cx = range(len(c.cost))
  plt.figure(1)
  plt.plot(cx, c.cost)
  plt.ylim(0, 25000)
  plt.figure(2)
  plt.plot(c.genx[:, 1], c.geny, 'b.')
  x = np.arange(0, 100, 0.1)
  y = x * c.theta[1] + c.theta[0]
  plt.plot(x, y)
  plt.margins(0.2)
  plt.show()
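The core update above can be isolated into a small standalone sketch. Note that the function and variable names here are my own for illustration, not from the original post, and this variant averages the gradient over the batch (dividing by the number of samples), which is why it can use a larger learning rate than the 0.000005 in the class above:

```python
import numpy as np

def batch_gradient_descent(X, y, alpha, num_iters):
    """Full-batch gradient descent on the sum-of-squares cost ||X @ theta - y||^2."""
    theta = np.ones(X.shape[1])
    cost = []
    for _ in range(num_iters):
        loss = X @ theta - y             # residuals over the whole batch
        cost.append(float(loss @ loss))  # sum of squared errors
        gradient = X.T @ loss / len(y)   # batch gradient, averaged per sample
        theta = theta - alpha * gradient
    return theta, cost

# Synthetic data in the spirit of genData(100, 25, 10): y = x + 25 + noise
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), np.arange(100.0)])
y = np.arange(100.0) + 25 + rng.uniform(0, 1, 100) * 10

theta, cost = batch_gradient_descent(X, y, alpha=0.0003, num_iters=20000)
```

With a stable learning rate the recorded cost decreases monotonically, which is exactly the curve Figure 1 below shows; the fitted slope approaches 1, matching the generating line.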

Figure 1. Cost over the iterations

Figure 2. Scatter plot of the data and the fitted line

Summary

以上就是本文關于Python編程實現線性回歸和批量梯度下降法代碼實例的全部內容,希望對大家有所幫助。感興趣的朋友可以繼續參閱本站:
