Description
- Predicting the final exam score from quiz 1, quiz 2, and midterm scores (the hypothesis and cost are sketched just below)
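The hypothesis being fit is a weighted sum of the three scores plus a bias, and the cost is the mean squared error. As a rough NumPy sketch of what the TensorFlow code below computes (the function names here are only illustrative, not from the original post):

import numpy as np

# H(x1, x2, x3) = w1*x1 + w2*x2 + w3*x3 + b
def hypothesis(x1, x2, x3, w1, w2, w3, b):
    return w1 * x1 + w2 * x2 + w3 * x3 + b

# cost = mean((H - Y)^2), i.e. mean squared error over the five students
def cost(pred, y):
    return np.mean((np.asarray(pred) - np.asarray(y)) ** 2)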
Example Code
import tensorflow as tf

# Training data: quiz 1, quiz 2, midterm scores and the final score to predict
x1 = [73., 93., 89., 96., 73.]
x2 = [80., 88., 91., 98., 66.]
x3 = [75., 93., 90., 100., 70.]
Y = [152., 185., 180., 196., 142.]

# One weight per input feature plus a bias, all initialized to 10
w1 = tf.Variable(10.)
w2 = tf.Variable(10.)
w3 = tf.Variable(10.)
b = tf.Variable(10.)

learning_rate = 0.000001

for i in range(1000 + 1):
    # Record the forward pass so gradients can be computed
    with tf.GradientTape() as tape:
        hypothesis = w1 * x1 + w2 * x2 + w3 * x3 + b
        cost = tf.reduce_mean(tf.square(hypothesis - Y))
    w1_grad, w2_grad, w3_grad, b_grad = tape.gradient(cost, [w1, w2, w3, b])

    # Gradient descent: move each parameter against its gradient
    w1.assign_sub(learning_rate * w1_grad)
    w2.assign_sub(learning_rate * w2_grad)
    w3.assign_sub(learning_rate * w3_grad)
    b.assign_sub(learning_rate * b_grad)

    if i % 50 == 0:
        print("{:5} | {:12.4f}".format(i, cost.numpy()))

print("{:.4f} | {:.4f} | {:.4f} | {:.4f}".format(w1.numpy(), w2.numpy(), w3.numpy(), b.numpy()))

Output
0 | 5793889.5000
50 | 64291.1484
100 | 715.2902
150 | 9.8462
200 | 2.0152
250 | 1.9252
300 | 1.9210
350 | 1.9177
400 | 1.9145
450 | 1.9114
500 | 1.9081
550 | 1.9050
600 | 1.9018
650 | 1.8986
700 | 1.8955
750 | 1.8923
800 | 1.8892
850 | 1.8861
900 | 1.8829
950 | 1.8798
1000 | 1.8767
0.6694 | 0.6683 | 0.5595 | 9.8914
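To actually use the trained model, the learned parameters printed above can be plugged back into the hypothesis. A minimal prediction sketch (not part of the original post; the predict helper is only illustrative):

# Learned values taken from the final output line above
w1, w2, w3, b = 0.6694, 0.6683, 0.5595, 9.8914

def predict(quiz1, quiz2, midterm):
    return w1 * quiz1 + w2 * quiz2 + w3 * midterm + b

# Third training example (89, 91, 90): prints roughly 180.6, close to the true score of 180
print(predict(89., 91., 90.))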
Example Code Using a Matrix
import numpy as np
import tensorflow as tf

# Each row: quiz 1, quiz 2, midterm, final score
data = np.array([
    [73., 80., 75., 152.],
    [93., 88., 93., 185.],
    [89., 91., 90., 180.],
    [96., 98., 100., 196.],
    [73., 66., 70., 142.],
], dtype=np.float32)

X = data[:, :-1]   # first three columns are the inputs
y = data[:, [-1]]  # last column is the target, kept as a (5, 1) matrix

W = tf.Variable(tf.compat.v1.random_normal([3, 1]))  # 3 input features
b = tf.Variable(tf.compat.v1.random_normal([1]))     # 1 output

learning_rate = 0.000001

for i in range(2000 + 1):
    with tf.GradientTape() as tape:
        hypothesis = tf.matmul(X, W) + b
        cost = tf.reduce_mean(tf.square(hypothesis - y))
    W_grad, b_grad = tape.gradient(cost, [W, b])

    # Gradient descent update
    W.assign_sub(learning_rate * W_grad)
    b.assign_sub(learning_rate * b_grad)

    if i % 100 == 0:
        print("{:5} | {:10.4f}".format(i, cost.numpy()))

print(W.numpy())
print(b.numpy())

Output
0 | 37.5109
100 | 19.2034
200 | 19.1517
300 | 19.1025
400 | 19.0534
500 | 19.0046
600 | 18.9560
700 | 18.9077
800 | 18.8596
900 | 18.8118
1000 | 18.7643
1100 | 18.7170
1200 | 18.6700
1300 | 18.6232
1400 | 18.5767
1500 | 18.5304
1600 | 18.4843
1700 | 18.4384
1800 | 18.3926
1900 | 18.3472
2000 | 18.3019
[[ 0.02031493]
[-0.79357684]
[ 2.7809958 ]]
[-1.9658151]
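Instead of calling assign_sub by hand, the parameter update can also be delegated to a Keras optimizer. A sketch of the same loop using tf.keras.optimizers.SGD, assuming X, y, W, and b from the matrix example above are still defined (this variant is not in the original post):

optimizer = tf.keras.optimizers.SGD(learning_rate=0.000001)

for i in range(2000 + 1):
    with tf.GradientTape() as tape:
        hypothesis = tf.matmul(X, W) + b
        cost = tf.reduce_mean(tf.square(hypothesis - y))
    grads = tape.gradient(cost, [W, b])
    # apply_gradients performs the same "parameter -= learning_rate * gradient" step
    optimizer.apply_gradients(zip(grads, [W, b]))
    if i % 100 == 0:
        print("{:5} | {:10.4f}".format(i, cost.numpy()))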