Notes from reviewing Season 1 of the "Deep Learning for Everyone" lecture series.
Using numpy + slicing
import numpy as np

b = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8],
              [9, 10, 11, 12]])

b[:, 1]
# array([ 2,  6, 10])

b[-1]
# array([ 9, 10, 11, 12])

b[-1, :]
# array([ 9, 10, 11, 12])

b[-1, ...]
# array([ 9, 10, 11, 12])

b[0:2, :]
# array([[1, 2, 3, 4],
#        [5, 6, 7, 8]])
# 'data.csv'
# EXAM1, EXAM2, EXAM3, FINAL
# 73, 80, 75, 152
# 93, 88, 93, 185
# 89, 91, 90, 180
# skiprows=1 skips the header row so every remaining value parses as float32
xy = np.loadtxt('data.csv', delimiter=',', dtype=np.float32, skiprows=1)
x_data = xy[:, 0:-1]
y_data = xy[:, [-1]]
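The same split can be checked without a file on disk. A minimal sketch, using an in-memory array in place of data.csv (the three sample rows shown above):

```python
import numpy as np

# Stand-in for np.loadtxt('data.csv', ...): the three sample rows above
xy = np.array([[73, 80, 75, 152],
               [93, 88, 93, 185],
               [89, 91, 90, 180]], dtype=np.float32)

x_data = xy[:, 0:-1]   # all rows, every column except the last
y_data = xy[:, [-1]]   # all rows, last column; [-1] keeps it a (n, 1) matrix

print(x_data.shape)  # (3, 3)
print(y_data.shape)  # (3, 1)
```

Note that `xy[:, [-1]]` (fancy indexing with a list) yields shape (3, 1), while `xy[:, -1]` would collapse to shape (3,).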
TensorFlow code
import tensorflow as tf
import numpy as np
xy = np.loadtxt('../data-01-test-score.csv', delimiter=',', dtype=np.float32)
x_data = xy[:, 0:-1]
y_data = xy[:, [-1]]
# Make sure the shape and data are OK
print(x_data, "\nx_data shape:", x_data.shape)
print(y_data, "\ny_data shape:", y_data.shape)
# data output
'''
[[ 73. 80. 75.]
[ 93. 88. 93.]
...
[ 76. 83. 71.]
[ 96. 93. 95.]]
x_data shape: (25, 3)
[[152.]
[185.]
...
[149.]
[192.]]
y_data shape: (25, 1)
'''
model = tf.keras.Sequential()
# The activation doesn't have to be added as a separate layer; pass it as an argument to the Dense layer
model.add(tf.keras.layers.Dense(units=1, input_dim=3, activation='linear'))
# model.add(tf.keras.layers.Activation('linear'))
model.summary()

# 'lr' is deprecated in recent Keras versions; use 'learning_rate'
model.compile(loss='mse', optimizer=tf.keras.optimizers.SGD(learning_rate=1e-5))
history = model.fit(x_data, y_data, epochs=2000)

# Ask my score
print("Your score will be ", model.predict(np.array([[100, 70, 101]])))
print("Other scores will be ", model.predict(np.array([[60, 70, 110], [90, 100, 80]])))
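What SGD on an MSE loss is doing under the hood can be sketched in plain NumPy. This is a minimal illustration on the three sample rows shown earlier, not the course's code; the learning rate and iteration count are the same assumptions as above:

```python
import numpy as np

# Three training samples: exam scores -> final score
x = np.array([[73, 80, 75],
              [93, 88, 93],
              [89, 91, 90]], dtype=np.float32)
y = np.array([[152], [185], [180]], dtype=np.float32)

w = np.zeros((3, 1), dtype=np.float32)  # one weight per input feature
b = 0.0
lr = 1e-5

for _ in range(2000):
    pred = x @ w + b                      # hypothesis: XW + b
    err = pred - y
    # gradients of mean squared error with respect to w and b
    w -= lr * (2.0 / len(x)) * (x.T @ err)
    b -= lr * (2.0 / len(x)) * err.sum()

print(pred.ravel())  # should approach [152, 185, 180]
```

This mirrors what `model.compile(loss='mse', optimizer=SGD(...))` plus `model.fit(...)` does for a single linear unit.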