Multiple independent variables x
Hours studied: x₁
Number of private lessons: x₂
Score: y
y = a₁x₁ + a₂x₂ + b
The slopes a₁ and a₂ are found via gradient descent.
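For reference, the update rule in the code below can be read off from a mean-squared-error cost. A sketch, assuming the cost carries a 1/2 factor so that its gradients match the 1/n sums used in the code:

$$E(a_1,a_2,b) = \frac{1}{2n}\sum_{i=1}^{n}\bigl(y_i - (a_1 x_{1i} + a_2 x_{2i} + b)\bigr)^2$$

$$\frac{\partial E}{\partial a_1} = -\frac{1}{n}\sum_i x_{1i}(y_i-\hat{y}_i),\qquad \frac{\partial E}{\partial a_2} = -\frac{1}{n}\sum_i x_{2i}(y_i-\hat{y}_i),\qquad \frac{\partial E}{\partial b} = -\frac{1}{n}\sum_i (y_i-\hat{y}_i)$$

Each parameter is then updated as a₁ ← a₁ − lr·∂E/∂a₁, and likewise for a₂ and b.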
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from mpl_toolkits import mplot3d  # library for drawing 3D graphs
# x, y data values
data=[[2,0,81],[4,4,93],[6,2,91],[8,3,97]]
x1= [i[0] for i in data]
x2= [i[1] for i in data]
y = [i[2] for i in data]
ax = plt.axes(projection='3d')
ax.set_xlabel('study_H')
ax.set_ylabel('private_class')
ax.set_zlabel('score')
ax.scatter(x1,x2,y)
plt.show()
# convert the lists to numpy arrays (so elements can be pulled out by index for the calculation)
xdata = np.array(x1)
x2data = np.array(x2)
ydata = np.array(y)
# initialize the coefficients
a = 0
a2 = 0
b = 0
# set the learning rate
lr = 0.05
# number of iterations (counting starts at 0, so add 1)
epochs = 2001
# gradient descent
for i in range(epochs):
    y_pred = a * xdata + a2 * x2data + b
    error = ydata - y_pred  # error
    a_diff = -(1 / len(xdata)) * sum(xdata * error)    # error function differentiated with respect to a
    a2diff = -(1 / len(x2data)) * sum(x2data * error)  # error function differentiated with respect to a2
    b_diff = -(1 / len(xdata)) * sum(ydata - y_pred)   # error function differentiated with respect to b
    a = a - lr * a_diff    # update the current value, scaled by the learning rate
    a2 = a2 - lr * a2diff
    b = b - lr * b_diff
    if i % 100 == 0:
        print("epoch=%.f, slope=%.04f, slope2=%.04f, intercept=%.04f" % (i, a, a2, b))  # print the current a, a2, b every 100 epochs
With one more dimension added to the model, slightly more precise predictions become possible.
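As a usage sketch, the fitted a, a2, b from the loop above can be plugged back into y = a₁x₁ + a₂x₂ + b to predict a score for new inputs (the values below are made up):

# hypothetical prediction with the fitted a, a2, b (made-up input values)
new_x1, new_x2 = 7, 4  # e.g. 7 hours of study and 4 private lessons
predicted = a * new_x1 + a2 * new_x2 + b
print("predicted score: %.2f" % predicted)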