Detecting Handwritten Prime Digits with Neural Networks
November 20, 2019
100 points

1 Introduction

Prime numbers are fantastic. In this assignment we will use Multi-Layer Perceptrons (MLPs) to detect handwritten prime digits. But before attempting such a difficult task, I suggest you try to solve an easier problem with an MLP. If you succeed, which I know you will, you can proceed to tackle the challenging problem of detecting handwritten prime digits.

2 Regression - Toy Data

The first task is to learn the function that generated the following data, using a simple neural network. The function that produced this data is actually y = x^2 + ε, where ε ∼ N(0, σ) is random noise from a normal distribution with a small variance. We are going to use an MLP with one hidden layer of 5 neurons to learn an approximation of this function from the data that we have. This assignment comes with starter code, which is incomplete; you are supposed to complete it.

2.1 Technical Details

2.1.1 Code

The code that comes with this assignment has multiple files, including:

assignment
    toy_example_regressor.py
    layers.py
    neural_network.py
    utils.py

• toy_example_regressor.py contains most of the code related to the training procedure, including loading data, iteratively feeding mini-batches of data to the neural network, plotting the approximated function and the data, etc. Please read this file and understand it, but you don't need to modify it.

• layers.py contains the definitions of the layers that we use in this assignment, including DenseLayer, SigmoidLayer, and L2LossLayer. Your main responsibility is to implement the forward and backward functions for these layers.

• neural_network.py contains the definition of a neural network (the NeuralNet class), which is an abstract class.
This class basically takes care of running the forward pass and propagating the gradients backwards, from the loss to the first layer.

• utils.py contains some useful functions.

2.1.2 Data

The training data for this problem, which consists of input data and labels, can be generated by the function get_data(), which you can find in the main file, toy_example_regressor.py.

2.1.3 Network Structure

For the regression problem (i.e. the first task) we defined a new class, SimpleNet, which inherits from NeuralNet. SimpleNet contains two DenseLayers, one of which has hidden neurons with Sigmoid activation functions. The network definition can be found in toy_example_regressor.py.

2.2 Your Task (total: 80 points)

2.2.1 Implementing compute_activations, compute_gradients, and update_weights functions

There are three types of layers in the layers.py file: DenseLayer, Sigmoid, and L2LossLayer. However, the implementation of DenseLayer is incomplete. You are supposed to implement the following functions:

• DenseLayer: This is a simple dense (or linear, or fully connected) layer that has two types of parameters: weights w and biases b.

– compute_activations (15 points): The value of every output neuron is o_i = x · w_i + b_i. The numbers of input and output neurons are specified in the __init__ function.

– compute_gradients (20 points): Assume that the gradients of the loss with respect to the output neurons, self._output_error_gradient, have already been computed by the next layer. You need to compute the gradients of the loss with respect to all the parameters of this layer (i.e. b and w) and store them in self.dw and self.db so that you can use them to update the parameters later. Needless to say, the shape of dw should equal that of w, and the same goes for db and b. In addition, you should compute the gradient of the loss with respect to the input, which is the output of the previous layer, and store it in self._input_error_gradient.
This value will be passed on to the previous layer in the network, where it will be used to compute the gradients recursively (backpropagation).

– update_weights (10 points): You should perform Stochastic Gradient Descent and update the weights using the current weights, the gradients, and the given learning rate (new_weights = current_weights − learning_rate ∗ gradient).

You can refer to the implementations of Sigmoid and L2LossLayer. Note: It's up to you how to implement these functions. However, it will be computationally less expensive if you use numpy matrix operations to compute the values of the neurons and the gradients.

Your next task is to implement the same functions for the NeuralNet class.

• compute_activations (10 points): Iterate over all layers, starting from the first one, and compute the activations for each layer. At the end, return the output of the last layer along with the value of the loss.

• compute_gradients (10 points): You are supposed to perform backpropagation. In other words, first compute the gradient of the loss layer and pass it to the last layer. Then, starting from the last layer, iterate over all layers backwards; for each layer, first compute its gradient and then pass it to the previous layer.

• update_weights (5 points): You should update the parameters of all the layers (i.e. those that have parameters). You can use the update_weights function of each layer.

2.2.2 Training the model (10 points)

Once you are done implementing and testing the correctness of the implemented functions, you are ready to build a multi-layer Perceptron and train it. There is already starter code for you in toy_example_regressor.py. It is a script that contains the definition of a simple two-layer MLP, along with code that loads the data and trains the MLP. At the end of training, the code plots the data and the approximated function. The network weights will also be saved to a file named simple_net_weights-{timestamp}.pkl.
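To make the DenseLayer requirements above concrete, here is a minimal numpy sketch of the three functions. This is only an illustration, not the assignment's actual code: the class name DenseSketch and the input-caching attribute _input are hypothetical, while w, b, dw, db, and _input_error_gradient mirror the names used in the handout.

```python
import numpy as np

class DenseSketch:
    """Illustrative dense layer; the real layers.py interface may differ."""

    def __init__(self, n_in, n_out):
        self.w = np.random.randn(n_in, n_out) * 0.01  # weights
        self.b = np.zeros(n_out)                      # biases

    def compute_activations(self, x):
        self._input = x                  # cache input for the backward pass
        return x @ self.w + self.b       # o_i = x . w_i + b_i

    def compute_gradients(self, output_error_gradient):
        g = output_error_gradient                    # dL/do, shape (batch, n_out)
        self.dw = self._input.T @ g                  # dL/dw, same shape as w
        self.db = g.sum(axis=0)                      # dL/db, same shape as b
        self._input_error_gradient = g @ self.w.T    # dL/dx, passed to previous layer

    def update_weights(self, learning_rate):
        self.w -= learning_rate * self.dw            # SGD step
        self.b -= learning_rate * self.db
```

Note how a single matrix product per gradient replaces any explicit loop over neurons, which is the "computationally less expensive" route the handout recommends.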
You should change its name to simple_net_weights.pkl and include it in your submission. The timestamp is added to make sure you don't overwrite a previously well-trained model.

In addition, you should check the loss and the saved image. Check whether the predicted function is similar to f(x) = x^2 and matches the validation data. You should include the saved image, data_function.png, both in your report and in your submission.

Note that this is a regression task, so the last layer of the MLP has only one neuron, without any activation function.

3 Detecting Prime Digits

Now we can use the same layers to distinguish one-digit prime numbers (i.e. 2, 3, 5, and 7) from one-digit non-prime numbers (i.e. 1, 4, 6, 8, and 9). This is a binary classification problem, so the MLP that we are going to use will have only one neuron in the last layer, with a sigmoid activation function.

3.1 Data

The dataset that you will be using for this task is the MNIST dataset, which contains grayscale images of handwritten digits. Images are 28 by 28 pixels. We have already preprocessed the dataset for you: the labels of prime digits are set to 1, and 0 otherwise, and the pixel values are normalized to the interval [0, 1].

3.2 Network Structure

The input of the network is a vector with 784 neurons (28 ∗ 28). The network has one hidden layer with 20 neurons with sigmoid activation functions. The output of the network is one neuron with a sigmoid activation function.

3.3 Your Task (20 points)

For this task, all you need to do is read and understand the prime_classifier.py code and then run it. Over the course of training, loss values and accuracies on the validation set are printed. At the end of training, the network parameters will be saved in a file named prime_net_weights-{timestamp}.pkl. You should change its name to prime_net_weights.pkl and include it in your submission.
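For intuition, the forward pass of the 784 → 20 (sigmoid) → 1 (sigmoid) structure described in section 3.2 can be sketched as below. This is a standalone illustration with randomly initialized placeholder weights, not a trained model; the names sigmoid and predict_prime are hypothetical and do not come from the assignment code.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
w1, b1 = rng.normal(0, 0.01, (784, 20)), np.zeros(20)  # hidden layer: 20 neurons
w2, b2 = rng.normal(0, 0.01, (20, 1)), np.zeros(1)     # output layer: 1 neuron

def predict_prime(x):
    """x: (batch, 784) normalized pixels -> probability the digit is prime."""
    h = sigmoid(x @ w1 + b1)        # hidden activations, shape (batch, 20)
    return sigmoid(h @ w2 + b2)     # output in (0, 1), shape (batch, 1)

probs = predict_prime(rng.random((5, 784)))
```

Because the output sigmoid squashes everything into (0, 1), thresholding at 0.5 turns the single output neuron into the binary prime/non-prime decision.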
The timestamp is added to make sure you don't overwrite a previously well-trained model.

4 Testing Your Code

To help you test your code, a number of tester files have been included. You can use them to test your implementations. For example, if you run:

$ python test_layers.py

you can see whether the three functions (DenseLayer.compute_activations(), DenseLayer.compute_gradients(), and DenseLayer.update_weights()) that you implemented in the layers.py file work properly. We recommend using all four test files that are included:

testers
    test_layers.py
    test_neural_network.py
    test_toy_example_regressor.py
    test_prime_classifier.py

Also, you can estimate the total points that you might get for this assignment by running:

$ python evaluate_assignment.py

Note: We will use stronger test cases to test your code and grade your assignment, so passing these tests does not guarantee anything. They are only meant to help you with this assignment.

5 What to submit

You should include all of the following files in a tar.gz or zip file named with your student id (either YOUR_STUDENT_ID.tar.gz or YOUR_STUDENT_ID.zip).

1. Your code. Do not change the signatures of the functions that you were supposed to implement. Do NOT include the dataset (assignment4-mnist.pkl).

2. The following files, which are saved automatically:
    • prime_net_weights.pkl
    • simple_net_weights.pkl
    • data_function.png

3. A report.pdf file concisely explaining what you did in this assignment. In your report, also include your model's loss (for both problems) and accuracy (only for prime digit detection).

Note: your code will be evaluated with an automated script, so if you don't follow the above steps, you will lose all or a portion of your points for this assignment.
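As a supplement to the provided testers in section 4, a finite-difference gradient check is a common way to catch backward-pass bugs before training. The helper below is generic and not part of the assignment code; numerical_gradient is a hypothetical name. The idea: perturb each parameter by ±eps, recompute the loss, and compare the resulting slope against your analytic gradient.

```python
import numpy as np

def numerical_gradient(f, x, eps=1e-6):
    """Central-difference gradient of a scalar function f at array x."""
    grad = np.zeros_like(x)
    it = np.nditer(x, flags=["multi_index"])
    while not it.finished:
        i = it.multi_index
        orig = x[i]
        x[i] = orig + eps
        f_plus = f(x)
        x[i] = orig - eps
        f_minus = f(x)
        x[i] = orig                      # restore the entry
        grad[i] = (f_plus - f_minus) / (2 * eps)
        it.iternext()
    return grad

# Sanity check: f(w) = 0.5 * ||w||^2 has analytic gradient w.
w = np.array([1.0, -2.0, 3.0])
num = numerical_gradient(lambda v: 0.5 * np.sum(v ** 2), w.copy())
```

If the numerical and analytic gradients of a layer disagree by more than a small tolerance, the backward implementation (not the forward one) is usually the culprit.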