Linear Regression
Logistic Regression
Outlook
• Part 1: Introduction to Python and TensorFlow
• Part 2: Regression Analysis and Logistic Regression
• Part 3: Neural Network Algorithms
• Part 4: Convolutional Neural Networks
2
Last time...
3
List
• Initialize with square brackets
a = [0, 1]
• Append an element to the list
a.append(2)
• Length of the list
len(a)
• Slice the list
a[0:2]
Dictionary
• Has unordered keys instead of positions (strings can be keys)
• Initialize with curly braces
b = {'sun': 0}
• Add an element by assigning to a key
b['mon'] = 1
• Look up an item the same way as in a list
b['mon']
Python: data types
4
if i == 10:
    print(10)
elif i < 10:
    print(0)
else:
    print(100)

for a in lst:
    print(a)

for k in dct:
    print(dct[k])

for k, v in dct.items():
    print(k, v)
Python: if, for
5
TensorFlow Graph and Session
• Define the computation structure using TensorFlow operators.
a = tf.constant(2)
b = tf.constant(3)
x = tf.add(a, b)
• Create a session object to run the constructed computation graph.
x
<tf.Tensor 'Add:0' shape=() dtype=int32>
tf.Session().run(x)
5
6
zeros(), ones()
• Creates a tensor filled with zeros.
e = tf.zeros([2, 3])
tf.Session().run(e)
array([[ 0., 0., 0.], [ 0., 0., 0.]], dtype=float32)
• Creates a tensor filled with ones.
f = tf.ones([2, 3], dtype=tf.int32)
tf.Session().run(f)
array([[1, 1, 1], [1, 1, 1]], dtype=int32)
7
tf.Variable()
• Holds a value that changes within the computation graph, unlike a constant.
• A variable must be given an initial value and must be initialized before use.
a = tf.Variable(tf.constant(2))
a
<tensorflow.python.ops.variables.Variable at ...>
init = tf.global_variables_initializer()
sess = tf.Session()
sess.run(init)
sess.run(a)
2
8
Matrix
• A 2×3 matrix
[[ 1 -2  2]
 [ 3 -1  1]]
a = tf.Variable([[1, -2, 2], [3, -1, 1]])
sess = tf.Session()
sess.run(tf.global_variables_initializer())
sess.run(a)
[[1 -2 2], [3 -1 1]]
9
Matrix dot product
• Matrix addition
• [2×3] + [2×3] = [2×3]
[[ 1 -2  2]   [[-1  3  2]   [[ 0  1  4]
 [ 3 -1  1]] + [ 2  4  1]] = [ 5  3  2]]
• Matrix multiplication: inner product, dot product
• [2×3] ⋅ [3×2] = [2×2]
[[ 1 -2  2]   [[ 2 -1]   [[-4 -3]
 [ 3 -1  1]] ⋅ [ 4  3]  = [ 3 -4]]
               [ 1  2]]
10
tf.matmul()
โ€ข ๋‘๊ฐœ์˜ ํ…์„œ๋ฅผ ์ž…๋ ฅ ๋ฐ›์•„ ํ–‰๋ ฌ ๋‚ด์ ์„ ๊ณ„์‚ฐํ•ฉ๋‹ˆ๋‹ค.
a = tf.Variable([[1, -2, 2], [3, -1, 1]])
b = tf.Variable([[2, -1], [4, 3], [1, 2]])
dot = tf.matmul(a, b)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
sess.run(dot)
array([[-4, -3],
[ 3, -4]], dtype=int32)
11
Linear Regression
12
Regression Analysis
• Predicts a numeric result.
• The output is a continuous value.
• Regression Analysis
ex)
• Predict how far a patient's diabetes will have progressed a year later
• Predict tomorrow's stock price from past stock market data
• Predict house prices from location, number of rooms, floor area, and so on
13
A first-order linear function
ŷ = w × x + b
w: weight, b: bias
14
Hyperplane
[3-D plot with axes TV, Radio, Sales]
Sales = a₁ × Radio + a₂ × TV + b
• A good baseline model
• Handles large datasets
• Works when there are relatively many features
15
Generalization
• The general equation of linear regression with n features:
ŷ = β₁x₁ + β₂x₂ + ⋯ + βₙxₙ + β₀
• Add a term with x₀ = 1:
ŷ₁ = β₁x₁₁ + β₂x₁₂ + ⋯ + βₙx₁ₙ + β₀x₁₀
⋮
ŷₘ = β₁xₘ₁ + β₂xₘ₂ + ⋯ + βₙxₘₙ + β₀xₘ₀
    ⎡x₁₁ ⋯ x₁₀⎤ ⎡β₁⎤
ŷ = ⎢ ⋮   ⋱  ⋮ ⎥ ⎢ ⋮ ⎥ ,  m = number of samples  →  ŷ = Xβ
    ⎣xₘ₁ ⋯ xₘ₀⎦ ⎣β₀⎦
16
Solution
• Use Ordinary Least Squares to find the parameters that minimize the Mean
Squared Error.
• Mean squared error
J = (1/m) Σᵢ₌₁ᵐ (y − ŷ)² ,  ŷ = Xβ
(square each error, sum the squared errors over all training samples,
divide by the number of training samples)
• Ordinary least squares
β̂ = (XᵀX)⁻¹ Xᵀ y
• Problematic when there is a very large amount of data
• Problematic when the inverse matrix cannot be computed
17
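The normal-equation solution above can be sketched in NumPy. The synthetic one-feature data (y = 0.1x + 0.3 plus noise) is an assumption chosen to match the TensorFlow example used later in the deck.

```python
import numpy as np

# Synthetic 1-feature data (an assumption): y = 0.1 * x + 0.3 plus Gaussian noise.
rng = np.random.RandomState(42)
x = rng.normal(0.0, 0.55, size=(1000, 1))
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, size=(1000, 1))

# Append the x0 = 1 column so the bias becomes just another coefficient.
X = np.hstack([x, np.ones((1000, 1))])

# Normal equation: beta_hat = (X^T X)^-1 X^T y
beta_hat = np.linalg.inv(X.T @ X) @ X.T @ y
print(beta_hat.ravel())  # approximately [0.1, 0.3]
```

With only two parameters this is cheap; the slide's warnings apply when X is huge or XᵀX is not invertible.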
Gradient Descent
• An optimization method that moves toward a low point of the error function.
• To find the downhill direction, differentiate the error function at the current position.
J = (1/2m) Σᵢ₌₁ᵐ (y − ŷ)² ,  ∇J = (1/m) Σᵢ₌₁ᵐ (y − ŷ)
18
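As a quick numeric check of the gradient above, the sketch below takes one step against it and confirms the error J drops. The data and the 0.5 learning rate are assumptions, not values from the slides.

```python
import numpy as np

# Hypothetical data for y = 0.1 * x + 0.3; the learning rate 0.5 is also an assumption.
rng = np.random.RandomState(0)
x = rng.normal(0.0, 0.55, 1000)
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, 1000)

def J(w, b):
    # J = 1/(2m) * sum((y - y_hat)^2)
    return np.sum((y - (w * x + b)) ** 2) / (2 * len(x))

w, b, lr = 0.0, 0.0, 0.5
grad_w = -np.mean((y - (w * x + b)) * x)  # dJ/dw
grad_b = -np.mean(y - (w * x + b))        # dJ/db
stepped = J(w - lr * grad_w, b - lr * grad_b)
print(J(w, b) > stepped)  # True: moving against the gradient lowers the error
```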
A complex error function (a neural network)
[figure: surface plot of the loss J]
19
Neuron
20
Making it look like a neuron
[diagram: input x and bias b pass through × (with weight w) and + nodes to produce ŷ; target y]
ŷ = w × x + b
21
๋‚ฎ์€ ๊ณณ์œผ๋กœ
Neuron
๐‘ฆ;
๐‘ค
๐‘ฅ
๐‘
ร—
+
๐’š
๐œ•๐ฝ
๐œ•๐‘ค
=
1
๐‘š
๐‘ฆ โˆ’ ๐‘ฆ;
๐œ•๐‘ฆ;
๐œ•๐‘ค
=
1
๐‘š
(๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ
๐œ•๐ฝ
๐œ•๐‘
=
1
๐‘š
๐‘ฆ โˆ’ ๐‘ฆ;
๐œ•๐‘ฆ;
๐œ•๐‘
=
1
๐‘š
(๐‘ฆ โˆ’ ๐‘ฆ;)
๐ฝ =
1
2๐‘š
Z ๐‘ฆ โˆ’ ๐‘ฆ; I
Q
]D
22
ํŒŒ๋ผ๋ฏธํ„ฐ ์—…๋ฐ์ดํŠธ
Neuron
๐‘ฆ;
๐‘ค = ๐‘ค + โˆ†๐‘ค = ๐‘ค +
1
๐‘š
(๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ
๐‘ฅ ร—
+
๐’š
๐‘ = ๐‘ + โˆ†๐‘ = ๐‘ +
1
๐‘š
(๐‘ฆ โˆ’ ๐‘ฆ;)
(๐‘ฆ โˆ’ ๐‘ฆ;)
23
A suitable speed
• If the updates to the parameters w and b are too large, they can overshoot the
minimum (local minima).
• The learning rate scales the gradient update.
w = w + α (1/m) (y − ŷ) x        b = b + α (1/m) (y − ŷ)
24
ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ
โ€ข ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ(Hyperparameter)๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๋ฐ์ดํ„ฐ๋กœ๋ถ€ํ„ฐ ํ•™์Šตํ•  ์ˆ˜
์—†๋Š” ํŒŒ๋ผ๋ฏธํ„ฐ์ž…๋‹ˆ๋‹ค.
โ€ข ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๋ฐ์ดํ„ฐ๋กœ ๋ถ€ํ„ฐ ํ•™์Šตํ•˜๋Š” ํŒŒ๋ผ๋ฏธํ„ฐ์ž…๋‹ˆ๋‹ค. ์˜ˆ
๋ฅผ ๋“ค๋ฉด, w, b ์ž…๋‹ˆ๋‹ค.
โ€ข ํ•™์Šต์†๋„(learning rate)์€ ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ์ž…๋‹ˆ๋‹ค.
โ€ข ์ด ์™ธ์™ธ์—๋„ ์‹ ๊ฒฝ๋ง์˜ ๋ ˆ์ด์–ด์ˆ˜๋‚˜ ์œ ๋‹›์ˆ˜, k-NN ์•Œ๊ณ ๋ฆฌ์ฆ˜์˜ k ๊ฐ’ ๋“ฑ ์•Œ๊ณ 
๋ฆฌ์ฆ˜๋งˆ๋‹ค ์—ฌ๋Ÿฌ๊ฐ€์ง€์˜ ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค.
โ€ข ์ตœ์ ์˜ ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ์ฐพ๊ธฐ์œ„ํ•ด์„œ ๋ฐ˜๋ณต์ ์ธ ํ•™์Šต, ๊ฒ€์ฆ ๊ณผ์ •์„ ๊ฑฐ์ณ์•ผ
ํ•ฉ๋‹ˆ๋‹ค.
25
TensorFlow
Implementing Linear Regression
26
๋ฐ์ดํ„ฐ ์ƒ์„ฑ
์„ธ์…˜ ๊ฐ์ฒด ์ƒ์„ฑ
ํ‰๊ท  0, ํ‘œ์ค€ํŽธ์ฐจ 0.55 ์ธ x ์ƒ˜ํ”Œ 1000๊ฐœ ์ƒ์„ฑ
0.1*x + 0.3 ๋ฐฉ์ •์‹์„ ๋งŒ์กฑํ•˜๋Š” y ๋ฐ์ดํ„ฐ๋ฅผ ์ƒ์„ฑํ•˜๋˜,
ํ‰๊ท  0, ํ‘œ์ค€ํŽธ์ฐจ 0.03์„ ๊ฐ€์ง€๋„๋ก ํ•จ.
27
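The captions above describe the data step; here is a NumPy sketch of the same recipe (variable names are assumptions, since the original notebook code is not in this transcript):

```python
import numpy as np

# 1000 x samples with mean 0 and standard deviation 0.55.
rng = np.random.RandomState(42)
x_data = rng.normal(0.0, 0.55, 1000)

# y = 0.1 * x + 0.3, plus noise with mean 0 and standard deviation 0.03.
y_data = 0.1 * x_data + 0.3 + rng.normal(0.0, 0.03, 1000)
print(x_data.shape, y_data.shape)
```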
Visualizing the sample data
28
๊ณ„์‚ฐ ๊ทธ๋ž˜ํ”„ ์ƒ์„ฑ
๊ฐ€์ค‘์น˜ W, b ๋ณ€์ˆ˜๋ฅผ 0์œผ๋กœ ์ดˆ๊ธฐํ™”
y_hat ๊ณ„์‚ฐ
์†์‹ค ํ•จ์ˆ˜์ธ MSE ๊ณ„์‚ฐ
๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ• ๊ฐ์ฒด ์ƒ์„ฑ์†์‹คํ•จ์ˆ˜ ๋…ธ๋“œ๋ฅผ ์ตœ์ ํ™”
ํ•˜๋Š” ํ•™์Šต๋…ธ๋“œ ์ƒ์„ฑ
๐ฝ =
1
2๐‘š
Z ๐‘ฆ โˆ’ ๐‘ฆ; I
Q
]D
train
loss
y_hat
W x b
29
Run the computation graph
Initialize the variables
Run the training node
Print the learned parameters
and the loss value
30
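The slide's TensorFlow code itself is not in this transcript, so here is a NumPy stand-in for the whole graph: W and b start at 0, the loss is the MSE above, and plain gradient descent does the updates. The 0.5 learning rate and the step count are assumptions.

```python
import numpy as np

# Same synthetic data as the slides: y = 0.1 * x + 0.3 plus a little noise.
rng = np.random.RandomState(42)
x = rng.normal(0.0, 0.55, 1000)
y = 0.1 * x + 0.3 + rng.normal(0.0, 0.03, 1000)

W, b, lr = 0.0, 0.0, 0.5
for _ in range(200):
    y_hat = W * x + b
    W += lr * np.mean((y - y_hat) * x)  # the slide's w-update, scaled by the learning rate
    b += lr * np.mean(y - y_hat)        # the slide's b-update
print(round(W, 2), round(b, 2))  # close to the slide's w = 0.099, b = 0.298
```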
Result graph
w = 0.099, b = 0.298
31
Linear Regression: Recap
• Linear regression predicts a continuous result using a linear function.
• Its most common cost function is the MSE (mean squared error) function.
• Instead of ordinary least squares, we used gradient descent to find the optimal
parameters progressively.
• It can achieve high performance when there are many features; in that case you
often have to constrain the model instead.
• It also works well on fairly large datasets.
• It is a good first model to try when analyzing data.
32
Logistic Regression
33
Classification
• Predicts a class label.
• The output is discrete.
• Binary Classification, Multiclass Classification
ex)
• Spam filtering
• Cancer diagnosis
• Identifying an iris flower's species
• Classifying handwritten digits
34
Logistic Regression (Binary Classification)
• Binary classification labels each sample as True (1) or False (0).
• It uses the same linear function as regression.
• The linear function's output is converted to a probability between 0 and 1.
• A probability of 0.5 or more is classified as True, otherwise False.
ŷ = w × x + b
35
The logistic function
• The logistic (sigmoid) function maps any input from −∞ to +∞ to a value
between 0 and 1.
ŷ = 1 / (1 + e^−(w × x + b)) = 1 / (1 + e^−z) ,  z = w × x + b
36
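A minimal NumPy sketch of the sigmoid, checking the endpoint behavior described above:

```python
import numpy as np

def sigmoid(z):
    # 1 / (1 + e^-z): squashes any real z into (0, 1), with sigma(0) = 0.5.
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))           # 0.5
print(sigmoid(10.0) > 0.99)   # True: large z approaches 1
print(sigmoid(-10.0) < 0.01)  # True: very negative z approaches 0
```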
Making it look like a neuron
[diagram: x, w, b pass through × and + to give z, then a Sigmoid node gives ŷ; target y]
z = w × x + b   (ranges over −∞ to +∞)
ŷ = σ(z) = 1 / (1 + e^−z)   (ranges over 0 to 1)
37
๋ถ„๋ฅ˜์—์„œ์˜ ์†์‹ค ํ•จ์ˆ˜๋Š”
โ€ข ๋ถ„๋ฅ˜๋Š” ํฌ๋กœ์Šค ์—”ํŠธ๋กœํ”ผ(cross-entropy) ์†์‹ค ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค.
โ€ข ํฌ๋กœ์Šค ์—”ํŠธ๋กœํ”ผ ์†์‹คํ•จ์ˆ˜๋ฅผ ๋ฏธ๋ถ„ํ•˜๋ฉด
โ€ข ์„ ํ˜•ํšŒ๊ท€์˜ MSE ์†์‹คํ•จ์ˆ˜์˜ ๋ฏธ๋ถ„ ๊ฒฐ๊ณผ์™€ ๋™์ผํ•ฉ๋‹ˆ๋‹ค.
๐ฝ = โˆ’	
1
๐‘š
Z ๐‘ฆ๐‘™๐‘œ๐‘” ๐‘ฆ;
Q
]D
= โˆ’	
1
๐‘š
Z[๐‘ฆ๐‘™๐‘œ๐‘” ๐‘ฆ; + 1 โˆ’ ๐‘ฆ log	(1 โˆ’ ๐‘ฆ;)]
Q
]D
๐œ•๐ฝ
๐œ•๐‘ค
= โˆ’
1
๐‘š
	Z ๐‘ฆ โˆ’ ๐‘ฆ; ๐‘ฅ
Q
]D
๐œ•๐ฝ
๐œ•๐‘
= โˆ’
1
๐‘š
	Z ๐‘ฆ โˆ’ ๐‘ฆ;
Q
]D
38
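The claim that ∂J/∂w = −(1/m) Σ (y − ŷ)x can be sanity-checked against a finite-difference gradient; the toy data and parameter values below are made up for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Made-up toy data and parameters.
x = np.array([0.5, -1.2, 2.0, 0.3])
y = np.array([1.0, 0.0, 1.0, 0.0])
w, b = 0.4, -0.1

def loss(w_, b_):
    # Cross-entropy: J = -(1/m) * sum(y*log(y_hat) + (1-y)*log(1-y_hat))
    p = sigmoid(w_ * x + b_)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# Analytic gradient from the slide.
y_hat = sigmoid(w * x + b)
dJ_dw = -np.mean((y - y_hat) * x)

# Finite-difference gradient for comparison.
eps = 1e-6
num_dw = (loss(w + eps, b) - loss(w - eps, b)) / (2 * eps)
print(abs(dJ_dw - num_dw) < 1e-5)  # True: the formulas agree
```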
๋‚ฎ์€ ๊ณณ์œผ๋กœ
Neuron Sigmoid
๐‘ค
๐‘ฅ
๐‘
ร—
+
๐’š
๐‘ฆ; = ๐œŽ(๐‘ง) =	
1
1 +	 ๐‘’`k
๐œ•๐ฝ
๐œ•๐‘ค
=
1
๐‘š
๐‘ฆ โˆ’ ๐‘ฆ;
๐œ•๐‘ฆ;
๐œ•๐‘ค
=
1
๐‘š
(๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ
๐œ•๐ฝ
๐œ•๐‘
=
1
๐‘š
๐‘ฆ โˆ’ ๐‘ฆ;
๐œ•๐‘ฆ;
๐œ•๐‘
=
1
๐‘š
(๐‘ฆ โˆ’ ๐‘ฆ;)
๐ฝ = โˆ’	
1
๐‘š
Z[๐‘ฆ๐‘™๐‘œ๐‘” ๐‘ฆ; + 1 โˆ’ ๐‘ฆ log	(1 โˆ’ ๐‘ฆ;)]
Q
]D
๐‘ฆ;๐‘ง
39
Gradient update
[diagram: the error (y − ŷ) is fed back through the sigmoid neuron to w and b]
w = w + Δw = w + (1/m) (y − ŷ) x
b = b + Δb = b + (1/m) (y − ŷ)
40
Logistic Regression: Recap
• A model used for classification.
• The linear function's output is squashed into the 0~1 range with the sigmoid function.
• For binary classification, the model is trained to output True above 0.5 and
False at or below it.
• The derivative of the cross-entropy cost with a sigmoid equals the derivative
of the MSE cost with a linear function.
• Logistic regression also supports multiclass classification.
41
TensorFlow
Implementing Logistic Regression
42
Wisconsin breast cancer data
Using a scikit-learn dataset
NumPy
Load the Wisconsin breast cancer data
A scikit-learn Bunch object
holding the sample data
43
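Loading the dataset with scikit-learn looks like this; the Bunch attribute names `data` and `target` are scikit-learn's own.

```python
from sklearn.datasets import load_breast_cancer

# Returns a Bunch object: .data holds the samples, .target the labels.
cancer = load_breast_cancer()
print(cancer.data.shape)    # (569, 30)
print(cancer.target.shape)  # (569,)
```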
๋„˜ํŒŒ์ด(NumPy)
โ€ข ๋ฐ์ดํ„ฐ ๊ณผํ•™์„ ์œ„ํ•œ ๋‹ค์ฐจ์› ๋ฐฐ์—ด ํŒจํ‚ค์ง€๋กœ ๋งŽ์€ ๋ฐฐ์—ด ์—ฐ์‚ฐ์„ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค.
โ€ข ๋„˜ํŒŒ์ด๋Š” ํŒŒ์ด์ฌ ๋ฆฌ์ŠคํŠธ์™€๋Š” ๋‹ฌ๋ฆฌ ๋‹ค๋ฅธ ์ข…๋ฅ˜์˜ ๋ฐ์ดํ„ฐ ํƒ€์ž…์„ ๋‹ด์„ ์ˆ˜ ์—†
์Šต๋‹ˆ๋‹ค.
โ€ข scikit-learn, tensorflow ๋“ฑ ๋งŽ์€ ๋จธ์‹  ๋Ÿฌ๋‹ ํŒจํ‚ค์ง€๋“ค์ด ์ž…๋ ฅ ๊ฐ’์œผ๋กœ ๋„˜ํŒŒ์ด
๋ฐฐ์—ด์„ ๋ฐ›์„ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.
44
cancer features
30 features
and their names
ŷ = w₁x₁ + w₂x₂ + ⋯ + w₃₀x₃₀ + b
45
cancer data
X: 569 samples, one row vector per sample
y: 569 labels, reshaped to a column vector
Unless there is a special reason,
float32 is recommended.
46
Computing the linear function
x × W + b = ŷ
[569, 30] x [30, 1] = [569, 1] + [1] = [569, 1]
30 weights, applied to all 569 samples
1 bias, applied to all 569 samples (broadcasting)
569 samples, 30 features → 569 results (logits)
47
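The shape arithmetic above can be replayed in NumPy with random placeholder values:

```python
import numpy as np

rng = np.random.RandomState(0)
X = rng.rand(569, 30).astype(np.float32)  # 569 samples, 30 features
W = rng.rand(30, 1).astype(np.float32)    # 30 weights, one per feature
b = np.float32(0.1)                       # a single bias

# [569, 30] x [30, 1] = [569, 1]; the scalar bias broadcasts over all rows.
logits = X @ W + b
print(logits.shape)  # (569, 1)
```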
Loss function and optimization
Logistic (sigmoid)
cross-entropy loss
Very low learning rate
Initialize the variables
48
Training
• To keep the example simple, we evaluated the model on the training data;
in practice you should never do this.
Applied to every element of prediction:
True if greater than 0.5, otherwise False ([569, 1] shape)
Train for 5000 iterations,
recording the loss value
92% accuracy
49
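The prediction step described above, sketched on made-up logits (not the trained model's actual output, so the accuracy here is just the toy example's):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Made-up logits and labels standing in for the real [569, 1] tensors.
logits = np.array([[2.0], [-1.0], [0.3], [-0.2]])
y = np.array([[1.0], [0.0], [1.0], [1.0]])

# True where the predicted probability exceeds 0.5, False otherwise.
pred = (sigmoid(logits) > 0.5).astype(np.float32)
accuracy = float(np.mean(pred == y))
print(accuracy)  # 0.75
```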
Recap
• We trained regression and classification models using linear models.
• Classification converts the linear output to a probability with the logistic
function to predict a label.
• For regression we used 1000 random samples with 1 feature; for classification
we used 569 samples with 30 features.
• The model parameters learned for regression were 1 weight w and 1 bias b.
• The model parameters learned for classification were 30 weights w and 1 bias b.
• We built a computation graph for the linear function and used a loss function
provided by TensorFlow.
• We applied the gradient descent optimization algorithm to find the optimum.
50
Materials
โ€ข Github :
https://github.com/rickiepark/tfk-notebooks/tree/master/tensorflow_for_beginners
โ€ข Slideshare :
https://www.slideshare.net/RickyPark3/
51
๊ฐ์‚ฌํ•ฉ๋‹ˆ๋‹ค.
52
6.algorithm chains and piplinesHaesun Park
ย 
5.model evaluation and improvement
5.model evaluation and improvement5.model evaluation and improvement
5.model evaluation and improvementHaesun Park
ย 
4.representing data and engineering features
4.representing data and engineering features4.representing data and engineering features
4.representing data and engineering featuresHaesun Park
ย 
๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?
๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?
๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?Haesun Park
ย 

Mehr von Haesun Park (20)

์‚ฌ์ดํ‚ท๋Ÿฐ ์ตœ์‹  ๋ณ€๊ฒฝ ์‚ฌํ•ญ ์Šคํ„ฐ๋””
์‚ฌ์ดํ‚ท๋Ÿฐ ์ตœ์‹  ๋ณ€๊ฒฝ ์‚ฌํ•ญ ์Šคํ„ฐ๋””์‚ฌ์ดํ‚ท๋Ÿฐ ์ตœ์‹  ๋ณ€๊ฒฝ ์‚ฌํ•ญ ์Šคํ„ฐ๋””
์‚ฌ์ดํ‚ท๋Ÿฐ ์ตœ์‹  ๋ณ€๊ฒฝ ์‚ฌํ•ญ ์Šคํ„ฐ๋””
ย 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 9์žฅ ํ…์„œํ”Œ๋กœ ์‹œ์ž‘ํ•˜๊ธฐ
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 9์žฅ ํ…์„œํ”Œ๋กœ ์‹œ์ž‘ํ•˜๊ธฐ[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 9์žฅ ํ…์„œํ”Œ๋กœ ์‹œ์ž‘ํ•˜๊ธฐ
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 9์žฅ ํ…์„œํ”Œ๋กœ ์‹œ์ž‘ํ•˜๊ธฐ
ย 
(Handson ml)ch.8-dimensionality reduction
(Handson ml)ch.8-dimensionality reduction(Handson ml)ch.8-dimensionality reduction
(Handson ml)ch.8-dimensionality reduction
ย 
(Handson ml)ch.7-ensemble learning and random forest
(Handson ml)ch.7-ensemble learning and random forest(Handson ml)ch.7-ensemble learning and random forest
(Handson ml)ch.7-ensemble learning and random forest
ย 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 6์žฅ ๊ฒฐ์ • ํŠธ๋ฆฌ
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 6์žฅ ๊ฒฐ์ • ํŠธ๋ฆฌ[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 6์žฅ ๊ฒฐ์ • ํŠธ๋ฆฌ
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 6์žฅ ๊ฒฐ์ • ํŠธ๋ฆฌ
ย 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 5์žฅ. ์„œํฌํŠธ ๋ฒกํ„ฐ ๋จธ์‹ 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 5์žฅ. ์„œํฌํŠธ ๋ฒกํ„ฐ ๋จธ์‹ [ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 5์žฅ. ์„œํฌํŠธ ๋ฒกํ„ฐ ๋จธ์‹ 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 5์žฅ. ์„œํฌํŠธ ๋ฒกํ„ฐ ๋จธ์‹ 
ย 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 4์žฅ. ๋ชจ๋ธ ํ›ˆ๋ จ
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 4์žฅ. ๋ชจ๋ธ ํ›ˆ๋ จ[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 4์žฅ. ๋ชจ๋ธ ํ›ˆ๋ จ
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 4์žฅ. ๋ชจ๋ธ ํ›ˆ๋ จ
ย 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 3์žฅ. ๋ถ„๋ฅ˜
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 3์žฅ. ๋ถ„๋ฅ˜[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 3์žฅ. ๋ถ„๋ฅ˜
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 3์žฅ. ๋ถ„๋ฅ˜
ย 
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 1์žฅ. ํ•œ๋ˆˆ์— ๋ณด๋Š” ๋จธ์‹ ๋Ÿฌ๋‹
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 1์žฅ. ํ•œ๋ˆˆ์— ๋ณด๋Š” ๋จธ์‹ ๋Ÿฌ๋‹[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 1์žฅ. ํ•œ๋ˆˆ์— ๋ณด๋Š” ๋จธ์‹ ๋Ÿฌ๋‹
[ํ™๋Œ€ ๋จธ์‹ ๋Ÿฌ๋‹ ์Šคํ„ฐ๋”” - ํ•ธ์ฆˆ์˜จ ๋จธ์‹ ๋Ÿฌ๋‹] 1์žฅ. ํ•œ๋ˆˆ์— ๋ณด๋Š” ๋จธ์‹ ๋Ÿฌ๋‹
ย 
7.woring with text data(epoch#2)
7.woring with text data(epoch#2)7.woring with text data(epoch#2)
7.woring with text data(epoch#2)
ย 
6.algorithm chains and piplines(epoch#2)
6.algorithm chains and piplines(epoch#2)6.algorithm chains and piplines(epoch#2)
6.algorithm chains and piplines(epoch#2)
ย 
5.model evaluation and improvement(epoch#2) 2
5.model evaluation and improvement(epoch#2) 25.model evaluation and improvement(epoch#2) 2
5.model evaluation and improvement(epoch#2) 2
ย 
5.model evaluation and improvement(epoch#2) 1
5.model evaluation and improvement(epoch#2) 15.model evaluation and improvement(epoch#2) 1
5.model evaluation and improvement(epoch#2) 1
ย 
4.representing data and engineering features(epoch#2)
4.representing data and engineering features(epoch#2)4.representing data and engineering features(epoch#2)
4.representing data and engineering features(epoch#2)
ย 
3.unsupervised learing(epoch#2)
3.unsupervised learing(epoch#2)3.unsupervised learing(epoch#2)
3.unsupervised learing(epoch#2)
ย 
7.woring with text data
7.woring with text data7.woring with text data
7.woring with text data
ย 
6.algorithm chains and piplines
6.algorithm chains and piplines6.algorithm chains and piplines
6.algorithm chains and piplines
ย 
5.model evaluation and improvement
5.model evaluation and improvement5.model evaluation and improvement
5.model evaluation and improvement
ย 
4.representing data and engineering features
4.representing data and engineering features4.representing data and engineering features
4.representing data and engineering features
ย 
๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?
๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?
๊ธฐ๊ณ„๋„ ํ•™๊ต์— ๊ฐ€๋‚˜์š”?
ย 

Kรผrzlich hochgeladen

Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)
Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)
Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)Wonjun Hwang
ย 
Continual Active Learning for Efficient Adaptation of Machine LearningModels ...
Continual Active Learning for Efficient Adaptation of Machine LearningModels ...Continual Active Learning for Efficient Adaptation of Machine LearningModels ...
Continual Active Learning for Efficient Adaptation of Machine LearningModels ...Kim Daeun
ย 
Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)
Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)
Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)Wonjun Hwang
ย 
์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ
์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ
์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค
ย 
MOODv2 : Masked Image Modeling for Out-of-Distribution Detection
MOODv2 : Masked Image Modeling for Out-of-Distribution DetectionMOODv2 : Masked Image Modeling for Out-of-Distribution Detection
MOODv2 : Masked Image Modeling for Out-of-Distribution DetectionKim Daeun
ย 
A future that integrates LLMs and LAMs (Symposium)
A future that integrates LLMs and LAMs (Symposium)A future that integrates LLMs and LAMs (Symposium)
A future that integrates LLMs and LAMs (Symposium)Tae Young Lee
ย 

Kรผrzlich hochgeladen (6)

Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)
Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)
Merge (Kitworks Team Study ์ด์„ฑ์ˆ˜ ๋ฐœํ‘œ์ž๋ฃŒ 240426)
ย 
Continual Active Learning for Efficient Adaptation of Machine LearningModels ...
Continual Active Learning for Efficient Adaptation of Machine LearningModels ...Continual Active Learning for Efficient Adaptation of Machine LearningModels ...
Continual Active Learning for Efficient Adaptation of Machine LearningModels ...
ย 
Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)
Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)
Console API (Kitworks Team Study ๋ฐฑํ˜œ์ธ ๋ฐœํ‘œ์ž๋ฃŒ)
ย 
์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ
์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ
์บ๋“œ์•ค๊ทธ๋ž˜ํ”ฝ์Šค 2024๋…„ 5์›”ํ˜ธ ๋ชฉ์ฐจ
ย 
MOODv2 : Masked Image Modeling for Out-of-Distribution Detection
MOODv2 : Masked Image Modeling for Out-of-Distribution DetectionMOODv2 : Masked Image Modeling for Out-of-Distribution Detection
MOODv2 : Masked Image Modeling for Out-of-Distribution Detection
ย 
A future that integrates LLMs and LAMs (Symposium)
A future that integrates LLMs and LAMs (Symposium)A future that integrates LLMs and LAMs (Symposium)
A future that integrates LLMs and LAMs (Symposium)
ย 

2.linear regression and logistic regression

  • 9. ํ–‰๋ ฌ(matrix) โ€ข 2ร—3 ํ–‰๋ ฌ 1 โˆ’2 2 3 โˆ’1 1 a = tf.Variable([[1, -2, 2], [3, -1, 1]]) sess = tf.Session() sess.run(tf.global_variables_initializer()) sess.run(a) [[1 -2 2], [3 -1 1]] 9 ํ–‰ (row) ์—ด(column)
  • 10. ํ–‰๋ ฌ ๋‚ด์  โ€ข ํ–‰๋ ฌ์˜ ๋ง์…ˆ โ€ข 2ร—3 + 2ร—3 = [2ร—3] 1 โˆ’2 2 3 โˆ’1 1 + โˆ’1 3 2 2 4 1 = 0 1 4 5 3 2 โ€ข ํ–‰๋ ฌ์˜ ๊ณฑ์…ˆ: ๋‚ด์ , ์ ๊ณฑ(dot product) โ€ข 2ร—3 โ‹… 3ร—2 = [2ร—2] 1 โˆ’2 2 3 โˆ’1 1 2 โˆ’1 4 3 1 2 = โˆ’4 โˆ’3 3 โˆ’4 10 ํ–‰ (row) ์—ด(column)
tf.matmul()
• Takes two tensors as input and computes their matrix product.
a = tf.Variable([[1, -2, 2], [3, -1, 1]])
b = tf.Variable([[2, -1], [4, 3], [1, 2]])
dot = tf.matmul(a, b)
sess = tf.Session()
sess.run(tf.global_variables_initializer())
sess.run(dot)
array([[-4, -3],
       [ 3, -4]], dtype=int32)
11
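The product computed by `tf.matmul` above can be checked by hand. This is a minimal plain-Python sketch (not from the slides) of the same 2×3 by 3×2 multiplication:

```python
# Plain-Python matrix multiplication, verifying the tf.matmul example.
def matmul(A, B):
    rows, inner, cols = len(A), len(B), len(B[0])
    # element (i, j) is the dot product of row i of A and column j of B
    return [[sum(A[i][k] * B[k][j] for k in range(inner))
             for j in range(cols)] for i in range(rows)]

a = [[1, -2, 2], [3, -1, 1]]
b = [[2, -1], [4, 3], [1, 2]]
dot = matmul(a, b)
print(dot)  # [[-4, -3], [3, -4]], matching tf.matmul's output
```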
  • 13. ํšŒ๊ท€ ๋ถ„์„ โ€ข ์ˆซ์ž ๊ฒฐ๊ณผ๋ฅผ ์˜ˆ์ธกํ•ฉ๋‹ˆ๋‹ค. โ€ข ์ถœ๋ ฅ ๊ฐ’์€ ์—ฐ์†์ ์ธ ์†์„ฑ์„ ๊ฐ€์ง‘๋‹ˆ๋‹ค. โ€ข Regression Analysis ex) โ€ข ํ™˜์ž์˜ ๋‹น๋‡จ๋ณ‘ ๋ฐ์ดํ„ฐ๋ฅผ ์ด์šฉํ•˜์—ฌ 1๋…„๋’ค ์•…ํ™” ์ •๋„๋ฅผ ์ธก์ • โ€ข ๊ณผ๊ฑฐ ์ฃผ์‹์‹œ์žฅ์˜ ๋ฐ์ดํ„ฐ๋ฅผ ์ด์šฉํ•˜์—ฌ ๋‚ด์ผ ์ฃผ๊ฐ€๋ฅผ ์˜ˆ์ธก โ€ข ์ง€์—ญ, ๋ฐฉ ๊ฐœ์ˆ˜, ํ‰์ˆ˜ ๋“ฑ์˜ ๋ฐ์ดํ„ฐ๋ฅผ ์ด์šฉํ•˜์—ฌ ์ฃผํƒ ๊ฐ€๊ฒฉ ์˜ˆ์ธก 13
  • 14. 1์ฐจ ์„ ํ˜• ํ•จ์ˆ˜ ๐‘ฆ; = ๐‘ค ร— ๐‘ฅ + ๐‘ ๊ฐ€์ค‘์น˜ ํŽธํ–ฅ 14
Hyperplane
(3-D plot of Sales over the TV and Radio axes)
Sales = a₁ × Radio + a₂ × TV + b
• A basic baseline model
• Works with large datasets
• Works when there are relatively many features
15
  • 16. ์ผ๋ฐ˜ํ™” โ€ข n ๊ฐœ์˜ ํŠน์„ฑ์ด ์žˆ์„ ๋•Œ ์„ ํ˜• ํšŒ๊ท€์˜ ์ผ๋ฐ˜ ๋ฐฉ์ •์‹ ๐‘ฆ; = ๐›ฝD ๐‘ฅD + ๐›ฝI ๐‘ฅI + โ‹ฏ + ๐›ฝN ๐‘ฅN + ๐›ฝO โ€ข ๐‘ฅO = 1 ์ธ ํ•ญ์„ ์ถ”๊ฐ€ ๐‘ฆ;D = ๐›ฝD ๐‘ฅD + ๐›ฝI ๐‘ฅI + โ‹ฏ + ๐›ฝN ๐‘ฅN + ๐›ฝO ๐‘ฅO โ‹ฎ ๐‘ฆ;Q = ๐›ฝD ๐‘ฅQD + ๐›ฝI ๐‘ฅQI + โ‹ฏ + ๐›ฝN ๐‘ฅQN + ๐›ฝO ๐‘ฅQO ๐‘ฆ; = ๐‘ฅDD โ‹ฏ ๐‘ฅDO โ‹ฎ โ‹ฑ โ‹ฎ ๐‘ฅQD โ‹ฏ ๐‘ฅQO ๐›ฝD โ‹ฎ ๐›ฝO , ๐‘š = ๋ฐ์ดํ„ฐ๊ฐœ์ˆ˜ โ†’ ๐’šV = ๐‘ฟ๐œทY 16
  • 17. ์†”๋ฃจ์…˜ โ€ข ์ตœ์†Œ์ œ๊ณฑ๋ฒ•(Ordinary Least Squares)๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ํ‰๊ท ์ œ๊ณฑ์˜ค์ฐจ(Mean Squared Error)๋ฅผ ์ตœ์†Œํ™”ํ•˜๋Š” ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ์ฐพ์Œ. โ€ข ํ‰๊ท ์ œ๊ณฑ์˜ค์ฐจ 1 ๐‘š Z ๐‘ฆ โˆ’ ๐‘ฆ; I , ๐‘ฆ; = ๐‘‹๐›ฝ Q ]D โ€ข ์ตœ์†Œ์ œ๊ณฑ๋ฒ• ๐›ฝ^ = ๐‘‹_ ๐‘‹ `D ๐‘‹_ ๐‘ฆ ์˜ค์ฐจ์˜ ์ œ๊ณฑ ๋ชจ๋“  ํ›ˆ๋ จ ๋ฐ์ดํ„ฐ์˜ ์˜ค์ฐจ ์ œ๊ณฑ์„ ๋”ํ•จ ํ›ˆ๋ จ ๋ฐ์ดํ„ฐ ๊ฐฏ์ˆ˜๋กœ ๋‚˜๋ˆ” โ€ข ๋ฐ์ดํ„ฐ๊ฐ€ ์•„์ฃผ ๋งŽ์€ ๊ฒฝ์šฐ ๋ฌธ์ œ โ€ข ์—ญํ–‰๋ ฌ์„ ๊ตฌํ•  ์ˆ˜ ์—†๋Š” ๊ฒฝ์šฐ ๋ฌธ์ œ 17
  • 18. ๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ•(Gradient Descent) โ€ข ์˜ค์ฐจํ•จ์ˆ˜์˜ ๋‚ฎ์€ ์ง€์ ์„ ์ฐพ์•„๊ฐ€๋Š” ์ตœ์ ํ™” ๋ฐฉ๋ฒ• โ€ข ๋‚ฎ์€ ์ชฝ์˜ ๋ฐฉํ–ฅ์„ ์ฐพ๊ธฐ ์œ„ํ•ด ์˜ค์ฐจํ•จ์ˆ˜๋ฅผ ํ˜„์žฌ ์œ„์น˜์—์„œ ๋ฏธ๋ถ„ํ•จ ๐ฝ = 1 2๐‘š Z ๐‘ฆ โˆ’ ๐‘ฆ; I , โˆ‡๐ฝ = 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;) Q ]D 18
  • 21. ๋‰ด๋Ÿฐ์ฒ˜๋Ÿผ ๋ณด์ด๊ฒŒ Neuron ๐‘ฆ; ๐‘ค ๐‘ฆ; = ๐‘ค ร— ๐‘ฅ + ๐‘ ๐‘ฅ ๐‘ ร— + ๐’š 21
  • 22. ๋‚ฎ์€ ๊ณณ์œผ๋กœ Neuron ๐‘ฆ; ๐‘ค ๐‘ฅ ๐‘ ร— + ๐’š ๐œ•๐ฝ ๐œ•๐‘ค = 1 ๐‘š ๐‘ฆ โˆ’ ๐‘ฆ; ๐œ•๐‘ฆ; ๐œ•๐‘ค = 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ ๐œ•๐ฝ ๐œ•๐‘ = 1 ๐‘š ๐‘ฆ โˆ’ ๐‘ฆ; ๐œ•๐‘ฆ; ๐œ•๐‘ = 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;) ๐ฝ = 1 2๐‘š Z ๐‘ฆ โˆ’ ๐‘ฆ; I Q ]D 22
  • 23. ํŒŒ๋ผ๋ฏธํ„ฐ ์—…๋ฐ์ดํŠธ Neuron ๐‘ฆ; ๐‘ค = ๐‘ค + โˆ†๐‘ค = ๐‘ค + 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ ๐‘ฅ ร— + ๐’š ๐‘ = ๐‘ + โˆ†๐‘ = ๐‘ + 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;) (๐‘ฆ โˆ’ ๐‘ฆ;) 23
  • 24. ์ ๋‹นํ•œ ์†๋„ โ€ข ํŒŒ๋ผ๋ฏธํ„ฐ w, b ์˜ ์—…๋ฐ์ดํŠธ๊ฐ€ ํด ๊ฒฝ์šฐ ์ตœ์ €์ (local minima)์„ ์ง€๋‚˜์น  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. โ€ข ํ•™์Šต ์†๋„(learning rate)๋กœ ๊ทธ๋ž˜๋””์–ธํŠธ ์—…๋ฐ์ดํŠธ๋ฅผ ์กฐ์ ˆํ•ฉ๋‹ˆ๋‹ค. ๐‘ค = ๐‘ค + ๐œถ 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ ๐‘ = ๐‘ + ๐œถ 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;) 24
  • 25. ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ โ€ข ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ(Hyperparameter)๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๋ฐ์ดํ„ฐ๋กœ๋ถ€ํ„ฐ ํ•™์Šตํ•  ์ˆ˜ ์—†๋Š” ํŒŒ๋ผ๋ฏธํ„ฐ์ž…๋‹ˆ๋‹ค. โ€ข ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ๋Š” ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๋ฐ์ดํ„ฐ๋กœ ๋ถ€ํ„ฐ ํ•™์Šตํ•˜๋Š” ํŒŒ๋ผ๋ฏธํ„ฐ์ž…๋‹ˆ๋‹ค. ์˜ˆ ๋ฅผ ๋“ค๋ฉด, w, b ์ž…๋‹ˆ๋‹ค. โ€ข ํ•™์Šต์†๋„(learning rate)์€ ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ์ž…๋‹ˆ๋‹ค. โ€ข ์ด ์™ธ์™ธ์—๋„ ์‹ ๊ฒฝ๋ง์˜ ๋ ˆ์ด์–ด์ˆ˜๋‚˜ ์œ ๋‹›์ˆ˜, k-NN ์•Œ๊ณ ๋ฆฌ์ฆ˜์˜ k ๊ฐ’ ๋“ฑ ์•Œ๊ณ  ๋ฆฌ์ฆ˜๋งˆ๋‹ค ์—ฌ๋Ÿฌ๊ฐ€์ง€์˜ ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค. โ€ข ์ตœ์ ์˜ ํ•˜์ดํผํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ์ฐพ๊ธฐ์œ„ํ•ด์„œ ๋ฐ˜๋ณต์ ์ธ ํ•™์Šต, ๊ฒ€์ฆ ๊ณผ์ •์„ ๊ฑฐ์ณ์•ผ ํ•ฉ๋‹ˆ๋‹ค. 25
  • 27. ๋ฐ์ดํ„ฐ ์ƒ์„ฑ ์„ธ์…˜ ๊ฐ์ฒด ์ƒ์„ฑ ํ‰๊ท  0, ํ‘œ์ค€ํŽธ์ฐจ 0.55 ์ธ x ์ƒ˜ํ”Œ 1000๊ฐœ ์ƒ์„ฑ 0.1*x + 0.3 ๋ฐฉ์ •์‹์„ ๋งŒ์กฑํ•˜๋Š” y ๋ฐ์ดํ„ฐ๋ฅผ ์ƒ์„ฑํ•˜๋˜, ํ‰๊ท  0, ํ‘œ์ค€ํŽธ์ฐจ 0.03์„ ๊ฐ€์ง€๋„๋ก ํ•จ. 27
  • 29. ๊ณ„์‚ฐ ๊ทธ๋ž˜ํ”„ ์ƒ์„ฑ ๊ฐ€์ค‘์น˜ W, b ๋ณ€์ˆ˜๋ฅผ 0์œผ๋กœ ์ดˆ๊ธฐํ™” y_hat ๊ณ„์‚ฐ ์†์‹ค ํ•จ์ˆ˜์ธ MSE ๊ณ„์‚ฐ ๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ• ๊ฐ์ฒด ์ƒ์„ฑ์†์‹คํ•จ์ˆ˜ ๋…ธ๋“œ๋ฅผ ์ตœ์ ํ™” ํ•˜๋Š” ํ•™์Šต๋…ธ๋“œ ์ƒ์„ฑ ๐ฝ = 1 2๐‘š Z ๐‘ฆ โˆ’ ๐‘ฆ; I Q ]D train loss y_hat W x b 29
  • 30. ๊ณ„์‚ฐ ๊ทธ๋ž˜ํ”„ ์‹คํ–‰ ๋ณ€์ˆ˜ ์ดˆ๊ธฐํ™” ํ•™์Šต ๋…ธ๋“œ ์‹คํ–‰ ํ•™์Šต๋œ ํŒŒ๋ผ๋ฏธํ„ฐ์™€ ์†์‹ค ํ•จ์ˆ˜ ๊ฐ’ ์ถœ๋ ฅ 30
  • 32. ์„ ํ˜• ํšŒ๊ท€ ์ •๋ฆฌ โ€ข ์„ ํ˜• ํšŒ๊ท€ ๋ถ„์„์€ ์„ ํ˜• ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ ์—ฐ์†์ ์ธ ๊ฒฐ๊ณผ๋ฅผ ์˜ˆ์ธกํ•ฉ๋‹ˆ๋‹ค. โ€ข ์„ ํ˜• ํšŒ๊ท€์˜ ๋Œ€ํ‘œ์ ์ธ ๋น„์šฉํ•จ์ˆ˜๋Š” MSE(mean square error) ํ•จ์ˆ˜์ž…๋‹ˆ๋‹ค. โ€ข ์ตœ์†Œ์ œ๊ณฑ๋ฒ• ๋Œ€์‹  ๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ•์„ ์‚ฌ์šฉํ•˜์—ฌ ์ ์ง„์ ์œผ๋กœ ์ตœ์ ์˜ ํŒŒ๋ผ๋ฏธํ„ฐ๋ฅผ ์ฐพ์•˜์Šต๋‹ˆ๋‹ค. โ€ข ํŠน์„ฑ์ด ๋งŽ์„ ๊ฒฝ์šฐ ๋†’์€ ์„ฑ๋Šฅ์„ ๋‚ผ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์ด๋Ÿด ๊ฒฝ์šฐ ์˜คํžˆ๋ ค ์„ฑ๋Šฅ์„ ์ œํ•œํ•ด์•ผ ํ•  ๋•Œ๊ฐ€ ๋งŽ์Šต๋‹ˆ๋‹ค. โ€ข ๋น„๊ต์  ๋Œ€๋Ÿ‰์˜ ๋ฐ์ดํ„ฐ์…‹์—์„œ๋„ ์ž˜ ์ž‘๋™ํ•ฉ๋‹ˆ๋‹ค. โ€ข ๋ฐ์ดํ„ฐ ๋ถ„์„์„ ํ•  ๋•Œ ์ฒ˜์Œ ์‹œ๋„ํ•  ๋ชจ๋ธ๋กœ์„œ ์ข‹์Šต๋‹ˆ๋‹ค. 32
  • 34. ๋ถ„๋ฅ˜(Classification) โ€ข ํด๋ž˜์Šค ๋ ˆ์ด๋ธ”์„ ์˜ˆ์ธกํ•ฉ๋‹ˆ๋‹ค. โ€ข ์ถœ๋ ฅ ๊ฒฐ๊ณผ๋Š” ์ด์‚ฐ์ ์ž…๋‹ˆ๋‹ค. โ€ข Binary Classification(์ด์ง„ ๋ถ„๋ฅ˜), Multiclass Classification(๋‹ค์ค‘ ๋ถ„๋ฅ˜) ex) โ€ข ์ŠคํŒธ ๋ถ„๋ฅ˜ โ€ข ์•” ์ง„๋‹จ โ€ข ๋ถ“๊ฝƒ์˜ ํ’ˆ์ข… ํŒ๋ณ„ โ€ข ์†๊ธ€์”จ ์ˆซ์ž ๋ถ„๋ฅ˜ 34
  • 35. ๋กœ์ง€์Šคํ‹ฑ ํšŒ๊ท€ (์ด์ง„ ๋ถ„๋ฅ˜) โ€ข ์ด์ง„ ๋ถ„๋ฅ˜๋Š” ์ƒ˜ํ”Œ์„ True(1), ๋˜๋Š” False(0)์œผ๋กœ ๋ถ„๋ฅ˜ํ•ฉ๋‹ˆ๋‹ค. โ€ข ํšŒ๊ท€์˜ ์„ ํ˜• ํ•จ์ˆ˜๋ฅผ ๊ทธ๋Œ€๋กœ ์ด์šฉํ•ฉ๋‹ˆ๋‹ค. โ€ข ์„ ํ˜• ํ•จ์ˆ˜์˜ ๊ฒฐ๊ณผ๋ฅผ 0~1 ์‚ฌ์ด์˜ ํ™•๋ฅ ๋กœ ๋ณ€ํ™˜ํ•ฉ๋‹ˆ๋‹ค. โ€ข 0.5 ์ด์ƒ์ผ ๊ฒฝ์šฐ True, ์•„๋‹ˆ๋ฉด False ๋กœ ๋ถ„๋ฅ˜ํ•ฉ๋‹ˆ๋‹ค. ๐‘ฆ; = ๐‘ค ร— ๐‘ฅ + ๐‘ 35
  • 36. ๋กœ์ง€์Šคํ‹ฑ ํ•จ์ˆ˜ โ€ข ๋กœ์ง€์Šคํ‹ฑ(logistic) ๋˜๋Š” ์‹œ๊ทธ๋ชจ์ด๋“œ(sigmoid) ํ•จ์ˆ˜๋Š” -โˆž~+โˆž์ž…๋ ฅ์— ๋Œ€ํ•ด 0~1 ์‚ฌ์ด์˜ ๊ฐ’์„ ์ถœ๋ ฅํ•ฉ๋‹ˆ๋‹ค. ๐‘ฆ; = 1 1 + ๐‘’`(g ร— h i j) = 1 1 + ๐‘’`k ๐‘ง = ๐‘ค ร— ๐‘ฅ + ๐‘ 36
  • 37. ๋‰ด๋Ÿฐ์ฒ˜๋Ÿผ ๋ณด์ด๊ฒŒ Neuron Sigmoid 0 ~ 1 ๐‘ค ๐‘ฆ; = ๐œŽ(๐‘ง) ๐‘ฅ ๐‘ ร— + ๐’š ๐‘ง = ๐‘ค ร— ๐‘ฅ + ๐‘ -โˆž~+โˆž ๐œŽ(๐‘ง) = 1 1 + ๐‘’`k 37
  • 38. ๋ถ„๋ฅ˜์—์„œ์˜ ์†์‹ค ํ•จ์ˆ˜๋Š” โ€ข ๋ถ„๋ฅ˜๋Š” ํฌ๋กœ์Šค ์—”ํŠธ๋กœํ”ผ(cross-entropy) ์†์‹ค ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค. โ€ข ํฌ๋กœ์Šค ์—”ํŠธ๋กœํ”ผ ์†์‹คํ•จ์ˆ˜๋ฅผ ๋ฏธ๋ถ„ํ•˜๋ฉด โ€ข ์„ ํ˜•ํšŒ๊ท€์˜ MSE ์†์‹คํ•จ์ˆ˜์˜ ๋ฏธ๋ถ„ ๊ฒฐ๊ณผ์™€ ๋™์ผํ•ฉ๋‹ˆ๋‹ค. ๐ฝ = โˆ’ 1 ๐‘š Z ๐‘ฆ๐‘™๐‘œ๐‘” ๐‘ฆ; Q ]D = โˆ’ 1 ๐‘š Z[๐‘ฆ๐‘™๐‘œ๐‘” ๐‘ฆ; + 1 โˆ’ ๐‘ฆ log (1 โˆ’ ๐‘ฆ;)] Q ]D ๐œ•๐ฝ ๐œ•๐‘ค = โˆ’ 1 ๐‘š Z ๐‘ฆ โˆ’ ๐‘ฆ; ๐‘ฅ Q ]D ๐œ•๐ฝ ๐œ•๐‘ = โˆ’ 1 ๐‘š Z ๐‘ฆ โˆ’ ๐‘ฆ; Q ]D 38
  • 39. ๋‚ฎ์€ ๊ณณ์œผ๋กœ Neuron Sigmoid ๐‘ค ๐‘ฅ ๐‘ ร— + ๐’š ๐‘ฆ; = ๐œŽ(๐‘ง) = 1 1 + ๐‘’`k ๐œ•๐ฝ ๐œ•๐‘ค = 1 ๐‘š ๐‘ฆ โˆ’ ๐‘ฆ; ๐œ•๐‘ฆ; ๐œ•๐‘ค = 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ ๐œ•๐ฝ ๐œ•๐‘ = 1 ๐‘š ๐‘ฆ โˆ’ ๐‘ฆ; ๐œ•๐‘ฆ; ๐œ•๐‘ = 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;) ๐ฝ = โˆ’ 1 ๐‘š Z[๐‘ฆ๐‘™๐‘œ๐‘” ๐‘ฆ; + 1 โˆ’ ๐‘ฆ log (1 โˆ’ ๐‘ฆ;)] Q ]D ๐‘ฆ;๐‘ง 39
  • 40. ๊ทธ๋ž˜๋””์–ธํŠธ ์—…๋ฐ์ดํŠธ Neuron Sigmoid ๐‘ฅ ร— + ๐’š๐‘ฆ; ๐‘ค = ๐‘ค + โˆ†๐‘ค = ๐‘ค + 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;)๐‘ฅ ๐‘ = ๐‘ + โˆ†๐‘ = ๐‘ + 1 ๐‘š (๐‘ฆ โˆ’ ๐‘ฆ;) (๐‘ฆ โˆ’ ๐‘ฆ;) 40
  • 41. ๋กœ์ง€์Šคํ‹ฑ ์ •๋ฆฌ โ€ข ๋ถ„๋ฅ˜์— ์‚ฌ์šฉํ•˜๋Š” ๋ชจ๋ธ์ž…๋‹ˆ๋‹ค. โ€ข ์„ ํ˜• ํ•จ์ˆ˜ ๊ฒฐ๊ณผ๋ฅผ ์‹œ๊ทธ๋ชจ์ด๋“œ ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•˜์—ฌ 0~1 ์‚ฌ์ด๋กœ ์••์ถ•ํ•ฉ๋‹ˆ๋‹ค. โ€ข ์ด์ง„ ๋ถ„๋ฅ˜๋Š” 0.5 ๋ณด๋‹ค ๋†’์„ ๋•Œ๋Š” True ๋กœํ•˜๊ณ  ๊ทธ ์ดํ•˜๋Š” False ๋กœ ํ•˜์—ฌ ๋ชจ๋ธ ์„ ํ•™์Šต์‹œํ‚ต๋‹ˆ๋‹ค. โ€ข ์‹œ๊ทธ๋ชจ์ด๋“œ ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•œ ํฌ๋กœ์Šค ์—”ํŠธ๋กœํ”ผ ๋น„์šฉํ•จ์ˆ˜์˜ ๋ฏธ๋ถ„ ๊ฒฐ๊ณผ๋Š” ์„ ํ˜• ํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ•œ MSE ๋น„์šฉํ•จ์ˆ˜์˜ ๋ฏธ๋ถ„๊ณผ ๋™์ผํ•ฉ๋‹ˆ๋‹ค. โ€ข ๋กœ์ง€์Šคํ‹ฑ ํšŒ๊ท€๋Š” ๋‹ค์ค‘ ๋ถ„๋ฅ˜๋„ ์ง€์›ํ•ฉ๋‹ˆ๋‹ค. 41
  • 43. ์œ„์Šค์ฝ˜์‹  ์œ ๋ฐฉ์•” ๋ฐ์ดํ„ฐ ์‚ฌ์ดํ‚ท๋Ÿฐ์˜ ๋ฐ์ดํ„ฐ์…‹ ์ด์šฉ ๋„˜ํŒŒ์ด ์œ„์Šค์ฝ˜์‹  ์œ ๋ฐฉ์•” ๋ฐ์ดํ„ฐ ๋กœ๋“œ ์ƒ˜ํ”Œ ๋ฐ์ดํ„ฐ๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ๋Š” scikit-learn์˜ Bunch ์˜ค๋ธŒ์ ํŠธ 43
  • 44. ๋„˜ํŒŒ์ด(NumPy) โ€ข ๋ฐ์ดํ„ฐ ๊ณผํ•™์„ ์œ„ํ•œ ๋‹ค์ฐจ์› ๋ฐฐ์—ด ํŒจํ‚ค์ง€๋กœ ๋งŽ์€ ๋ฐฐ์—ด ์—ฐ์‚ฐ์„ ์ œ๊ณตํ•ฉ๋‹ˆ๋‹ค. โ€ข ๋„˜ํŒŒ์ด๋Š” ํŒŒ์ด์ฌ ๋ฆฌ์ŠคํŠธ์™€๋Š” ๋‹ฌ๋ฆฌ ๋‹ค๋ฅธ ์ข…๋ฅ˜์˜ ๋ฐ์ดํ„ฐ ํƒ€์ž…์„ ๋‹ด์„ ์ˆ˜ ์—† ์Šต๋‹ˆ๋‹ค. โ€ข scikit-learn, tensorflow ๋“ฑ ๋งŽ์€ ๋จธ์‹  ๋Ÿฌ๋‹ ํŒจํ‚ค์ง€๋“ค์ด ์ž…๋ ฅ ๊ฐ’์œผ๋กœ ๋„˜ํŒŒ์ด ๋ฐฐ์—ด์„ ๋ฐ›์„ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. 44
  • 45. cancer ํŠน์„ฑ 30๊ฐœ์˜ ํŠน์„ฑ ํŠน์„ฑ์˜ ์ด๋ฆ„ ๐‘ฆ; = ๐‘คD ๐‘ฅD + ๐‘คI ๐‘ฅI + โ‹ฏ + ๐‘คqO ๐‘ฅqO + ๐‘ 45
  • 47. ์„ ํ˜• ํ•จ์ˆ˜ ๊ณ„์‚ฐ 0.2 0.6 โ‹ฏ 0.1 0.2 โ‹ฎ โ‹ฑ โ‹ฎ 0.5 โ‹ฏ 0.4 โ‹… 0.1 โ‹ฎ 0.3 = 1.5 5.9 โ‹ฎ 0.7 + 0.1 = 1.6 6.0 โ‹ฎ 0.8 30๊ฐœ ๊ฐ€์ค‘์น˜: 569๊ฐœ ์ƒ˜ํ”Œ์— ๋ชจ๋‘ ์ ์šฉ 569๊ฐœ ์ƒ˜ํ”Œ ๐‘ฅ ร— ๐‘Š + ๐‘ = ๐‘ฆ; [569, 30] x [30, 1] = [569, 1] + [1] = [569, 1] 1๊ฐœ ํŽธํ–ฅ(bias): 569๊ฐœ ์ƒ˜ํ”Œ์— ๋ชจ๋‘ ์ ์šฉ (๋ธŒ๋กœ๋“œ์บ์ŠคํŒ…) 30๊ฐœ ํŠน์„ฑ 569๊ฐœ ๊ฒฐ๊ณผ (logits) 47
  • 48. ์†์‹ค ํ•จ์ˆ˜์™€ ์ตœ์ ํ™” ๋กœ์ง€์Šคํ‹ฑ(์‹œ๊ทธ๋ชจ์ด๋“œ) ํฌ ๋กœ์Šค ์—”ํŠธ๋กœํ”ผ ์†์‹คํ•จ์ˆ˜ ํ•™์Šต์†๋„ ๋งค์šฐ ๋‚ฎ๊ฒŒ ๋ณ€์ˆ˜ ์ดˆ๊ธฐํ™” 48
  • 49. ํ•™์Šต โ€ข ์—ฌ๊ธฐ์„œ๋Š” ์˜ˆ๋ฅผ ๊ฐ„๋‹จํžˆ ํ•˜๊ธฐ์œ„ํ•ด ํ•™์Šต ๋ฐ์ดํ„ฐ๋กœ ๋ชจ๋ธ์˜ ์„ฑ๋Šฅ์„ ํ‰๊ฐ€ํ–ˆ์Šต๋‹ˆ๋‹ค๋งŒ ์‹ค์ „์—์„œ๋Š” ์ด๋ ‡๊ฒŒ ํ•ด์„œ๋Š” ์•ˆ๋ฉ๋‹ˆ๋‹ค prediction ์˜ ๋ชจ๋“  ์› ์†Œ์— ์ ์šฉ, 0.5๋ณด๋‹ค ํฌ ๋ฉด True, ์ž‘์œผ๋ฉด False[569, 1] ํฌ๊ธฐ 5000๋ฒˆ ํ•™์Šตํ•˜๋ฉด์„œ ์†์‹คํ•จ์ˆ˜ ๊ฐ’ ๊ธฐ๋ก 92% ์ •ํ™•๋„ 49
  • 50. ์ •๋ฆฌ โ€ข ์„ ํ˜• ๋ชจ๋ธ์„ ์ด์šฉํ•ด ํšŒ๊ท€์™€ ๋ถ„๋ฅ˜ ํ•™์Šต์„ ํ–ˆ์Šต๋‹ˆ๋‹ค. โ€ข ๋ถ„๋ฅ˜๋Š” ๋กœ์ง€์Šคํ‹ฑ ํ•จ์ˆ˜๋ฅผ ์ด์šฉํ•ด ํ™•๋ฅ ๋กœ ๋ณ€ํ™˜ํ•˜์—ฌ ๋ ˆ์ด๋ธ”์„ ์˜ˆ์ธกํ•ฉ๋‹ˆ๋‹ค. โ€ข ํšŒ๊ท€์—์„œ๋Š” ์ž„์˜์˜ ์ƒ˜ํ”Œ ๋ฐ์ดํ„ฐ 1000๊ฐœ, ํŠน์„ฑ 1๊ฐœ๋ฅผ ์‚ฌ์šฉํ–ˆ๊ณ  ๋ถ„๋ฅ˜์—์„œ๋Š” ์ƒ˜ํ”Œ ๋ฐ์ดํ„ฐ 569๊ฐœ, ํŠน์„ฑ 30๊ฐœ๋ฅผ ์‚ฌ์šฉํ–ˆ์Šต๋‹ˆ๋‹ค. โ€ข ํšŒ๊ท€์—์„œ ํ•™์Šตํ•œ ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ๋Š” ๊ฐ€์ค‘์น˜ w 1๊ฐœ, ํŽธํ–ฅ b 1๊ฐœ ์ž…๋‹ˆ๋‹ค. โ€ข ๋ถ„๋ฅ˜์—์„œ ํ•™์Šตํ•œ ๋ชจ๋ธ ํŒŒ๋ผ๋ฏธํ„ฐ๋Š” ๊ฐ€์ค‘์น˜ w 30๊ฐœ, ํŽธํ–ฅ b 1๊ฐœ ์ž…๋‹ˆ๋‹ค. โ€ข ์„ ํ˜• ํ•จ์ˆ˜๋ฅผ ํ‘œํ˜„ํ•˜๋Š” ๊ณ„์‚ฐ ๊ทธ๋ž˜ํ”„๋ฅผ ๋งŒ๋“ค๊ณ  ํ…์„œํ”Œ๋กœ์šฐ์—์„œ ์ œ๊ณตํ•˜๋Š” ์† ์‹คํ•จ์ˆ˜๋ฅผ ์‚ฌ์šฉํ–ˆ์Šต๋‹ˆ๋‹ค. โ€ข ๊ฒฝ์‚ฌํ•˜๊ฐ•๋ฒ• ์ตœ์ ํ™” ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์ ์šฉํ•˜์—ฌ ์ตœ์ ๊ฐ’์„ ์ฐพ์•˜์Šต๋‹ˆ๋‹ค. 50