Coursera lecture summary
Cost function
We can measure the accuracy of our hypothesis function by using a cost function. This takes an average difference (actually a fancier version of an average) of all the results of the hypothesis with inputs from the x's and the actual output y's.
To break it apart, it is $\frac{1}{2}\bar{x}$, where $\bar{x}$ is the mean of the squares of $h_\theta(x_i) - y_i$, i.e. the difference between the predicted value and the actual value:

$$J(\theta_0, \theta_1) = \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x_i) - y_i\right)^2$$
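As a minimal sketch (not from the lecture itself), this cost could be computed with NumPy, assuming a single-variable hypothesis $h_\theta(x) = \theta_0 + \theta_1 x$ and arrays `x`, `y` holding the training examples:

```python
import numpy as np

def compute_cost(x, y, theta0, theta1):
    """Squared error cost J(theta0, theta1) for single-variable linear regression."""
    m = len(y)                          # number of training examples
    predictions = theta0 + theta1 * x   # h_theta(x_i) for every example
    squared_errors = (predictions - y) ** 2
    return squared_errors.sum() / (2 * m)  # halved mean of the squared errors

# A perfect fit gives zero cost; a worse hypothesis gives a larger cost.
x = np.array([1.0, 2.0, 3.0])
y = np.array([1.0, 2.0, 3.0])
print(compute_cost(x, y, 0.0, 1.0))  # 0.0     (h(x) = x matches y exactly)
print(compute_cost(x, y, 0.0, 0.5))  # 0.5833  (predictions fall short of y)
```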
This function is otherwise called the "squared error function" or "mean squared error".
The mean is halved (1/2) as a convenience for the computation of gradient descent, as the derivative term of the square function will cancel out the 1/2 term.
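As a worked step (not written out in the text above), taking the partial derivative with respect to $\theta_1$, with $h_\theta(x) = \theta_0 + \theta_1 x$, shows the 2 from the power rule cancelling the 1/2:

$$\frac{\partial}{\partial \theta_1} \left[ \frac{1}{2m}\sum_{i=1}^{m}\left(h_\theta(x_i) - y_i\right)^2 \right] = \frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x_i) - y_i\right)x_i$$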