Coding the Perceptron Algorithm
Personal understanding:
It's still the same high-school math problem as before: in general the given data won't let every point satisfy the requirement, so we have to adjust the line a little. The idea is to use the points that landed on the wrong side to pull the dividing line toward them, nudging it step by step into the right territory. Concretely, we compare each point's known class (its label) with the side of the boundary it currently falls on (the classification/prediction), and move the line whenever they disagree.
Of course, the result is only so good: it's just a single straight dividing line, and no straight line can separate everything cleanly, which is why there's still a lot more to learn later on.
Time to write some code! In this exercise, you will implement the perceptron algorithm to classify the data below (found in the file data.csv).
The perceptron steps are as follows. For a point with coordinates (p, q), label y, and a prediction given by the equation ŷ = step(w_1 x_1 + w_2 x_2 + b) (a small worked example follows this list):
- If the point is correctly classified, do nothing.
- If the point is classified as positive but the label is negative, subtract αp, αq, and α from w_1, w_2, and b respectively.
- If the point is classified as negative but the label is positive, add αp, αq, and α to w_1, w_2, and b respectively.
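To make the update rule concrete, here is a tiny hand-traced sketch of a single step. The weights, bias, point, and learning rate below are made-up numbers for illustration, not values from data.csv.

import numpy as np

# Made-up numbers, just to trace one perceptron update by hand.
W = np.array([3.0, 4.0])   # current weights w_1, w_2
b = -10.0                  # current bias
p, q = 1.0, 1.0            # a point whose label is y = 1
alpha = 0.1                # learning rate

# Prediction: step(w_1*p + w_2*q + b) = step(3 + 4 - 10) = step(-3) = 0
y_hat = 1 if W[0]*p + W[1]*q + b >= 0 else 0   # -> 0, but the label is 1

# Classified negative, label positive: add alpha*p, alpha*q, and alpha.
W[0] += alpha * p   # 3.1
W[1] += alpha * q   # 4.1
b += alpha          # -9.9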
Then click Run.
This plots the solution found by the perceptron algorithm. It actually draws a series of dashed lines showing how the algorithm approaches the best solution (shown as a solid black line).
Feel free to play with the algorithm's parameters (the number of epochs, the learning rate, even the random initialization of the parameters) and see how the initial conditions affect the solution!
import numpy as np
# Setting the random seed, feel free to change it and see different solutions.
np.random.seed(42)
def stepFunction(t):
    if t >= 0:
        return 1
    return 0
def prediction(X, W, b):
    return stepFunction((np.matmul(X, W) + b)[0])
# TODO: Fill in the code below to implement the perceptron trick.
# The function should receive as inputs the data X, the labels y,
# the weights W (as an array), and the bias b,
# update the weights and bias W, b, according to the perceptron algorithm,
# and return W and b.
def perceptronStep(X, y, W, b, learn_rate = 0.01):
    for i in range(len(X)):
        y_hat = prediction(X[i], W, b)  # predict one point at a time
        if y[i] - y_hat == 1:    # label = 1 but prediction = 0: add
            W[0] += X[i][0] * learn_rate
            W[1] += X[i][1] * learn_rate
            b += learn_rate
        elif y[i] - y_hat == -1: # label = 0 but prediction = 1: subtract
            W[0] -= X[i][0] * learn_rate
            W[1] -= X[i][1] * learn_rate
            b -= learn_rate
    return W, b
# This function runs the perceptron algorithm repeatedly on the dataset,
# and returns a few of the boundary lines obtained in the iterations,
# for plotting purposes.
# Feel free to play with the learning rate and the num_epochs,
# and see your results plotted below.
def trainPerceptronAlgorithm(X, y, learn_rate = 0.01, num_epochs = 25):
    x_min, x_max = min(X.T[0]), max(X.T[0])
    y_min, y_max = min(X.T[1]), max(X.T[1])
    W = np.array(np.random.rand(2, 1))
    b = np.random.rand(1)[0] + x_max
    # These are the solution lines that get plotted below.
    boundary_lines = []
    for i in range(num_epochs):
        # In each epoch, we apply the perceptron step.
        W, b = perceptronStep(X, y, W, b, learn_rate)
        boundary_lines.append((-W[0]/W[1], -b/W[1]))
    return boundary_lines
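For a rough idea of how this might be run end to end (not part of the graded exercise), here is a minimal sketch. It assumes data.csv holds three comma-separated columns, x1, x2, and a 0/1 label, with no header row; the loading and matplotlib plotting code are assumptions, not the course's own plotting utility.

import numpy as np
import matplotlib.pyplot as plt

# Assumed format: each row of data.csv is "x1,x2,label" (no header row).
data = np.loadtxt('data.csv', delimiter=',')
X, y = data[:, :2], data[:, 2]

# Uses trainPerceptronAlgorithm defined above.
boundary_lines = trainPerceptronAlgorithm(X, y, learn_rate=0.01, num_epochs=25)

plt.scatter(X[:, 0], X[:, 1], c=y)
xs = np.linspace(X[:, 0].min(), X[:, 0].max(), 100)
# Earlier epochs as dashed lines, the final boundary as a solid black line.
for slope, intercept in boundary_lines[:-1]:
    plt.plot(xs, slope * xs + intercept, 'g--', alpha=0.3)
slope, intercept = boundary_lines[-1]
plt.plot(xs, slope * xs + intercept, 'k-')
plt.show()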