9/29/2008

Perceptron Learning Rule

Here are a few points I originally didn't understand.

1. It only works for "linearly separable" data; if the data are not linearly separable, the algorithm never converges.

2. Why augment the vectors?
Since the target function is g(x) = w‧x - θ,
to simplify the computation, extend X = (-1, x_1, x_2, ..., x_d) and W = (θ, w_1, w_2, ..., w_d), so X‧W = Σ_{i=1}^d w_i x_i - θ = w‧x - θ (see the sketch below).

3. How is the update rule simplified?
The original rule is:
if X ∈ P and W_{t-1}‧X <= 0, then W_t = W_{t-1} + X
else if X ∈ N and W_{t-1}‧X > 0, then W_t = W_{t-1} - X
else W_t = W_{t-1}

We can merge the first two cases by encoding the class as a label y (y = 1 for X ∈ P, y = -1 for X ∈ N):
if y = 1 and W_{t-1}‧X <= 0, W_t = W_{t-1} + X = W_{t-1} + y‧X
else if y = -1 and W_{t-1}‧X > 0, W_t = W_{t-1} - X = W_{t-1} + y‧X
→ if sign(W‧X) != y, then W_t = W_{t-1} + y‧X (a runnable sketch follows)
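A minimal NumPy sketch of the simplified update rule, run on a tiny hypothetical linearly separable data set; the function and variable names are my own:

```python
import numpy as np

def perceptron_train(X, y, max_epochs=100):
    """Train on augmented inputs X (each row starts with -1) and labels y in {+1, -1}."""
    n_samples, n_features = X.shape
    W = np.zeros(n_features)                  # W = (θ, w_1, ..., w_d), initialized to zero
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            # if sign(W·X) != y, update W_t = W_{t-1} + y·X
            # (note: np.sign(0) = 0, so W·X = 0 counts as a mistake for both classes)
            if np.sign(np.dot(W, xi)) != yi:
                W = W + yi * xi
                errors += 1
        if errors == 0:                       # converged: every sample classified correctly
            break
    return W

# Toy linearly separable data (hypothetical), with the -1 column already prepended
X = np.array([[-1.0,  2.0,  1.0],
              [-1.0,  3.0,  4.0],
              [-1.0, -1.0, -2.0],
              [-1.0, -2.0, -1.0]])
y = np.array([1, 1, -1, -1])

W = perceptron_train(X, y)
print(W)               # learned (θ, w_1, w_2)
print(np.sign(X @ W))  # predictions on the training points
```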

