
Detecting Handwritten Prime Digits with Neural Networks

November 20, 2019

100 points

1 Introduction

Prime numbers are fantastic. In this assignment we will use Multi-Layer Perceptrons (MLPs) to detect handwritten prime digits. But before attempting such a difficult task, I suggest trying to solve an easier problem with an MLP. If you succeed, which I know you will, you can proceed to tackle the challenging problem of detecting handwritten prime digits.

2 Regression - Toy Data

The first task is to learn the function that generated the following data, using a simple neural network. The function that produced this data is actually y = x^2 + ε, where ε ~ N(0, σ) is random noise drawn from a normal distribution with a small variance. We are going to use an MLP with one hidden layer of 5 neurons to learn an approximation of this function using the data that we have. This assignment comes with starter code, which is incomplete; you are supposed to complete it.

2.1 Technical Details

2.1.1 Code

The code that comes with this assignment has multiple files, including:

assignment
    toy_example_regressor.py
    layers.py
    neural_network.py
    utils.py

• toy_example_regressor.py contains most of the code related to the training procedure, including loading data and iteratively feeding mini-batches of data to the neural network, plotting the approximated function and the data, etc. Please read this file and understand it, but you don't need to modify it.
• layers.py contains the definitions of the layers that we use in this assignment, including DenseLayer, SigmoidLayer, and L2LossLayer. Your main responsibility is to implement the forward and backward functions for these layers.
• neural_network.py contains the definition of a neural network (the NeuralNet class), which is an abstract class.
The NeuralNet class basically takes care of running the forward pass and propagating the gradients backwards, from the loss to the first layer.
• utils.py contains some useful functions.

2.1.2 Data

The training data for this problem, which consists of input data and labels, can be generated by the function get_data(), which you can find in the main file, toy_example_regressor.py.

2.1.3 Network Structure

For the regression problem (i.e. the first task) we defined a new class, SimpleNet, which inherits from NeuralNet. SimpleNet contains two DenseLayers, one of which has hidden neurons with sigmoid activation functions. The network definition can be found in toy_example_regressor.py.

2.2 Your Task (total: 80 points)

2.2.1 Implementing the compute_activations, compute_gradients, and update_weights functions

There are three types of layers in the layers.py file: DenseLayer, Sigmoid, and L2LossLayer. However, the implementation of DenseLayer is incomplete. You are supposed to implement the following functions:

• DenseLayer: This is a simple dense (or linear, or fully connected) layer that has two types of parameters: weights w and biases b.
  – compute_activations (15 points): The value of every output neuron is o_i = x · w_i + b_i. The numbers of input and output neurons are specified in the __init__ function.
  – compute_gradients (20 points): Assume that the gradient of the loss with respect to the output neurons, self._output_error_gradient, has already been computed by the next layer. You need to compute the gradients of the loss with respect to all the parameters of this layer (i.e. b and w) and store them in self.dw and self.db so that you can use them to update the parameters later. Needless to say, the shape of dw should equal that of w, and the same goes for db and b. In addition, you should compute the gradient of the loss with respect to the input, which is the output of the previous layer, and store it in self._input_error_gradient.
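The dense-layer forward and backward math above can be sketched with numpy matrix operations. This is only an illustrative sketch, not the required implementation: the class skeleton, the constructor, and the cached _x attribute are assumptions, while the names w, b, dw, db, _output_error_gradient, and _input_error_gradient follow the handout.

```python
import numpy as np

class DenseSketch:
    """Illustrative sketch of the DenseLayer math (not the assignment's skeleton)."""

    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.1, size=(n_in, n_out))  # weights, one column per output neuron
        self.b = np.zeros(n_out)                           # biases

    def compute_activations(self, x):
        self._x = x                 # cache the input for the backward pass
        return x @ self.w + self.b  # o_i = x . w_i + b_i, for a whole mini-batch at once

    def compute_gradients(self, output_error_gradient):
        g = output_error_gradient                  # dL/d(output), computed by the next layer
        self._output_error_gradient = g
        self.dw = self._x.T @ g                    # dL/dw, same shape as w
        self.db = g.sum(axis=0)                    # dL/db, same shape as b
        self._input_error_gradient = g @ self.w.T  # dL/d(input), passed to the previous layer
        return self._input_error_gradient
```

With a mini-batch of N rows, a single matrix product accumulates all N per-example contributions to dw, which is why the numpy formulation is cheaper than an explicit loop over examples.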
This value (self._input_error_gradient) will be passed on to the previous layer in the network, which will use it to compute its own gradients recursively (backpropagation).
  – update_weights (10 points): You should perform Stochastic Gradient Descent and update the weights using the current weights, the gradients, and the given learning rate (new_weights = current_weights − learning_rate ∗ gradient).

You can refer to the implementations of Sigmoid and L2LossLayer. Note: It's up to you how to implement these functions. However, it is computationally less expensive to use numpy matrix operations to compute the values of the neurons and the gradients.

Your next task is to implement the same functions for the NeuralNet class:

• compute_activations (10 points): Iterate over all layers, starting from the first one, and compute the activations of each layer. At the end, return the output of the last layer along with the value of the loss.
• compute_gradients (10 points): You are supposed to perform backpropagation. In other words, first compute the gradient of the loss layer and pass it to the last layer. Then, starting from the last layer, iterate over all layers backwards: for each layer, first compute its gradients and then pass them to the previous layer.
• update_weights (5 points): You should update the parameters of all the layers (i.e. those that have parameters). You can use the update_weights function of each layer.

2.2.2 Training the model (10 points)

Once you are done with implementing and testing the correctness of these functions, you are ready to build a multi-layer Perceptron and train it. There is already starter code for you in toy_example_regressor.py. It is a script that contains the definition of a simple two-layer MLP, along with code that loads the data and trains the MLP. At the end of training, the code plots the data and the approximated function. The network weights will also be saved to a file named simple_net_weights-{timestamp}.pkl.
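The three NeuralNet methods described in Section 2.2.1 boil down to three loops over the layers. The sketch below illustrates them under explicit assumptions: TinyDense and TinyL2Loss are hypothetical stand-ins for the real classes in layers.py, and NetSketch is not the required NeuralNet implementation.

```python
import numpy as np

class TinyDense:
    """Hypothetical stand-in for a dense layer (see layers.py for the real one)."""
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.b = np.zeros(n_out)
    def compute_activations(self, x):
        self._x = x
        return x @ self.w + self.b
    def compute_gradients(self, g):
        self.dw = self._x.T @ g
        self.db = g.sum(axis=0)
        return g @ self.w.T                      # error gradient for the previous layer
    def update_weights(self, lr):
        self.w -= lr * self.dw                   # new_weights = current_weights - lr * gradient
        self.b -= lr * self.db

class TinyL2Loss:
    """Hypothetical stand-in for L2LossLayer."""
    def compute_activations(self, pred, target):
        self._diff = pred - target
        return 0.5 * np.mean(self._diff ** 2)
    def compute_gradients(self):
        return self._diff / self._diff.shape[0]  # dL/d(prediction)

class NetSketch:
    """Sketch of the three NeuralNet driver loops."""
    def __init__(self, layers, loss):
        self.layers, self.loss = layers, loss
    def compute_activations(self, x, target):
        out = x
        for layer in self.layers:                # forward pass, first layer to last
            out = layer.compute_activations(out)
        return out, self.loss.compute_activations(out, target)
    def compute_gradients(self):
        g = self.loss.compute_gradients()        # start from the loss layer
        for layer in reversed(self.layers):      # back-propagate, last layer to first
            g = layer.compute_gradients(g)
    def update_weights(self, lr):
        for layer in self.layers:                # update only layers that have parameters
            if hasattr(layer, "update_weights"):
                layer.update_weights(lr)
```

Running a few SGD steps of this sketch on a tiny linear-regression problem drives the loss down; mirroring that experiment with the real classes is a quick sanity check before full training.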
You should change the name of this file to simple_net_weights.pkl and include it in your submission. The timestamp is added to make sure you don't overwrite a previously well-trained model.

In addition, you should check the loss and the saved image: check whether the predicted function is similar to f(x) = x^2 and matches the validation data. You should include the saved image, data_function.png, both in your report and in your submission.

Note that this is a regression task, so the last layer of the MLP has only one neuron, without any activation function.

3 Detecting Prime Digits

Now we can use the same layers to distinguish the one-digit prime numbers (i.e. 2, 3, 5, and 7) from the remaining one-digit numbers (i.e. 1, 4, 6, 8, and 9). This is a binary classification problem, so the MLP that we are going to use will have only one neuron in the last layer, with a sigmoid activation function.

3.1 Data

The dataset that you will be using for this task is the MNIST dataset, which contains grayscale images of handwritten digits. The images are 28 by 28 pixels. We have already preprocessed it for you: the labels for prime digits are set to 1, and to 0 otherwise, and the pixel values are normalized (i.e. they lie in the interval [0, 1]).

3.2 Network Structure

The input of the network is a vector with 784 neurons (28 ∗ 28). The network has one hidden layer with 20 neurons, each with a sigmoid activation function. The output of the network is one neuron with a sigmoid activation function.

3.3 Your Task (20 points)

For this task, all you need to do is read and understand the prime_classifier.py code and then run it. Over the course of training, the loss values and the accuracies on the validation set are printed. At the end of training, the network parameters will be saved in a file with the following name format: prime_net_weights-{timestamp}.pkl. You should change its name to prime_net_weights.pkl and include it in your submission.
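For reference, the preprocessing described in Section 3.1 amounts to something like the following sketch. The function names are hypothetical, and the actual preprocessing has already been done for you in the provided dataset.

```python
import numpy as np

def prime_labels(digits):
    """Map one-digit class labels to 1 for the primes (2, 3, 5, 7) and 0 otherwise."""
    return np.isin(digits, [2, 3, 5, 7]).astype(np.float64)

def normalize_pixels(images):
    """Scale 8-bit pixel values into the interval [0, 1]."""
    return images.astype(np.float64) / 255.0
```

With labels in {0, 1} and a single sigmoid output neuron, the network's output can be read directly as the predicted probability that a digit is prime.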
The timestamp in the file name is, again, there to make sure you don't overwrite a previously well-trained model.

4 Testing Your Code

To help you test your code, a number of tester files have been included. You can use them to test your implementations. For example, if you run:

$ python test_layers.py

you can see whether the three functions (DenseLayer.compute_activations(), DenseLayer.compute_gradients(), and DenseLayer.update_weights()) that you implemented in the layers.py file work properly. We recommend using all four of the included test files:

testers
    test_layers.py
    test_neural_network.py
    test_toy_example_regressor.py
    test_prime_classifier.py

Also, you can estimate the total points that you might get for this assignment by running:

$ python evaluate_assignment.py

Note: We will use stronger test cases to test your code and grade your assignment, so passing these tests does not guarantee anything. These tests are only meant to help you with this assignment.

5 What to submit

You should include all of the following files in a tar.gz or zip file named with your student id (either YOUR_STUDENT_ID.tar.gz or YOUR_STUDENT_ID.zip):

1. Your code. Do not change the signatures of the functions that you were supposed to implement. Do NOT include the dataset (assignment4-mnist.pkl).
2. The following files, which are saved automatically:
   • prime_net_weights.pkl
   • simple_net_weights.pkl
   • data_function.png
3. A report.pdf file concisely explaining what you did in this assignment. Also include in your report your model's loss (for both problems) and accuracy (only for prime digit detection).

Note: your code will be evaluated with an automated script, so if you don't follow the steps above, you will lose all or a portion of your points for this assignment.
