
CSCI 4155/6505 – Winter 2020, Assignment 2 (Full) [pre-release summary of certain relevant parts only!]

Submission Note (as explained in class): In order to help you prepare for the midterm, this assignment was released in two parts:

• Part 1 should be submitted by Saturday 8 Feb 2020 at 9pm. As long as you submit a functioning "Option 1" for Part 1(a) and Part 1(c) by this date, you will get at least 50% on the entire Part 1 (see below for exactly what the different options include). Note that, as mentioned in class, more aspects will be added to Part 1 as well, but they are not relevant to the Saturday submission opportunity.

• The rest of the assignment is now due on Sun 23 Feb 2020, 9pm (extended from Tues 18 Feb 2020, 9pm). At that time, your entire assignment will be marked (so even if you did not complete Part 1 previously, you could still get 100%).

• New: Due to the delays in releasing the final version, there will not be a "Part 2" to this assignment involving PyTorch.

Introduction

In Part 1 of this assignment you will implement a basic neural net in numpy. You are not to use any libraries such as sklearn or keras, although you are always welcome to check if there is a certain library that you think would make sense for you to use!

[20pts Total] Part 1: A Feedforward Neural Network

As discussed in lab, we will follow some of the examples shown in PlayGround.

a) [3pts] Create the data set. Choose from the following three options:

[1pt] Option 1: Create and visualize a two-class data set consisting of two Gaussians, similarly scaled as the ones in the bottom left data set in playground.

[2pts] Option 2: Create and visualize both: (i) a two-class data set as in Option 1, as well as (ii) a two-class data set of concentric circles, as in the upper left data set in playground.

[3pts] Option 3 [new]: Create and visualize both Gaussian clusters and concentric circles, but this time allow the user to specify how many different clusters/circles the distribution includes. For example, if (numClasses==3) then that will generate three Gaussians, or three concentric circles, corresponding to three distinct classes. The sample function below has been modified accordingly, with a default value, so that if you are only able to handle two classes, it will still be called the same way.

Note that your data sets must be randomly generated, so that running the code multiple times with different seeds will give different (but similar looking) results each time. Also note that the args have a couple of small differences from before.

Your data generation should be run with the function:

X = generateData( numExamples, distributionShape, numClasses=2, numOutputs=1 )
# X is just the name of the matrix containing the generated dataset
# numExamples is the number of examples in the dataset, with each row corresponding to one example
# distributionShape can either be 'g' (gaussian) or 'c' (circles)
# if numClasses==2 then numOutputs can be 1, assuming a sigmoid unit, or 2, corresponding to softmax.
# Otherwise, numOutputs must be the same as numClasses.

b) [1pt] Add noise [new]: Add an option that allows you to add label noise with a given probability.
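As a starting point, here is a minimal sketch (not the required solution) of how parts (a) and (b) might be structured in numpy. The cluster geometry (class centres on a circle of radius 3, ring spacing, noise levels) is an arbitrary choice, visualization (e.g. a matplotlib scatter plot) is omitted, and addLabelNoise is a hypothetical helper name for part (b), not part of the required interface.

import numpy as np

def generateData(numExamples, distributionShape, numClasses=2, numOutputs=1):
    # Sketch: one example per row; the last numOutputs column(s) hold the target(s).
    labels = np.random.randint(numClasses, size=numExamples)
    if distributionShape == 'g':                    # Gaussian blobs
        angles = 2 * np.pi * labels / numClasses    # class centres spread on a circle
        centres = 3.0 * np.column_stack((np.cos(angles), np.sin(angles)))
        points = centres + np.random.randn(numExamples, 2)
    elif distributionShape == 'c':                  # concentric circles
        radii = labels + 1 + 0.1 * np.random.randn(numExamples)   # ring k has radius ~k+1
        theta = 2 * np.pi * np.random.rand(numExamples)
        points = np.column_stack((radii * np.cos(theta), radii * np.sin(theta)))
    else:
        raise ValueError("distributionShape must be 'g' or 'c'")
    if numOutputs == 1:                             # single 0..numClasses-1 target column
        targets = labels.reshape(-1, 1).astype(float)
    else:                                           # one-hot targets for a softmax output
        targets = np.eye(numClasses)[labels]
    return np.hstack((points, targets))

def addLabelNoise(X, p, numClasses=2, numOutputs=1):
    # Hypothetical helper for part (b): with probability p, replace an example's
    # label with a uniformly random class. Targets are the last numOutputs columns.
    X = X.copy()
    flip = np.random.rand(len(X)) < p
    newLabels = np.random.randint(numClasses, size=flip.sum())
    if numOutputs == 1:
        X[flip, -1] = newLabels
    else:
        X[flip, -numOutputs:] = np.eye(numClasses)[newLabels]
    return X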
c) [7pts] Train a small feedforward neural network [new]:

Starting with the example done in lab, you will continue to implement a small feedforward network in numpy. At this stage, you will write two functions:

train( X, numInput, numHiddenUnits, numOutput, activationType, numIter )
// X is the data matrix, one example per row
// numInput is the number of columns of input in X
// numOutput is the number of columns of output in X
// activationType is either 'linear' or 'sigmoid' or 'reLU';
//   it specifies only the activation function of the single hidden layer
// numIter is the number of iterations for which your training algorithm should run.
//
// Return:
//   The function should return (W1, W2), a 2-tuple of weight matrices for the
//   input-to-hidden and hidden-to-output layers, similarly to the sample code
//   done in lab.

If numOutput == 1, then the output unit is a sigmoid (no matter what activation function the hidden units have). If numOutput > 1, then the output units should be a softmax layer.

Your second function should take as input a data matrix (which might contain test data), the trained weights, the same set of architecture parameters (i.e. numInput, numHiddenUnits, numOutput, activationType), and a verbosity parameter, and it should output the results of the trained weights applied to the given data.

predict( X, W1, W2, numInput, numHiddenUnits, numOutput, activationType, verbosity )
// X, W1, W2, numInput, numHiddenUnits, numOutput, activationType are all as before.
//
// Return:
//   This function returns a matrix with the same number of rows as X, and with a
//   total of (numOutput+1) columns: the first numOutput columns contain the
//   predicted values for the given input X. The last column contains the
//   cross-entropy error for that prediction, given the correct answer as defined in X.

This part should work for an arbitrary number of input units, an arbitrary number of hidden units (keeping just a single layer), and for the three activation functions (linear, sigmoid, reLU). Later parts of this question will ask you to write this in an object-oriented format and add additional features. For Saturday, you are simply asked to write this for an arbitrary number of input units, an arbitrary number of hidden units (keeping just a single layer), and for three activation functions (linear, sigmoid, reLU). You should try to allow for multiple output units, but they do not need to function well: there are some numerical precision issues that you might be unable to solve on your own at this point.
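For orientation, a compact sketch of one possible train/predict pair is shown below, covering the single-sigmoid-output case (numOutput == 1) with batch gradient descent on the cross-entropy error. The extra lr keyword argument (step size), the 0.01 weight scale, and the bias handling are assumptions not specified above; the softmax case for numOutput > 1 would replace the output nonlinearity and the error term accordingly.

import numpy as np

def _activate(z, activationType):
    if activationType == 'linear':
        return z
    if activationType == 'sigmoid':
        return 1.0 / (1.0 + np.exp(-z))
    if activationType == 'reLU':
        return np.maximum(0.0, z)
    raise ValueError(activationType)

def _activate_grad(z, a, activationType):
    if activationType == 'linear':
        return np.ones_like(z)
    if activationType == 'sigmoid':
        return a * (1.0 - a)
    return (z > 0).astype(float)                      # reLU

def train(X, numInput, numHiddenUnits, numOutput, activationType, numIter, lr=0.1):
    # Sketch for the numOutput == 1 (sigmoid output) case; lr is an assumed
    # extra step-size argument, not part of the required signature.
    Xin, T = X[:, :numInput], X[:, numInput:numInput + numOutput]
    W1 = 0.01 * np.random.randn(numInput + 1, numHiddenUnits)   # +1 row for the bias
    W2 = 0.01 * np.random.randn(numHiddenUnits + 1, numOutput)
    for _ in range(numIter):
        A0 = np.hstack((Xin, np.ones((len(Xin), 1))))            # append bias input
        Z1 = A0 @ W1
        A1 = _activate(Z1, activationType)
        A1b = np.hstack((A1, np.ones((len(A1), 1))))
        Y = 1.0 / (1.0 + np.exp(-(A1b @ W2)))                    # sigmoid output
        dZ2 = (Y - T) / len(Xin)                                 # dL/dZ2 for cross-entropy + sigmoid
        dW2 = A1b.T @ dZ2
        dA1 = dZ2 @ W2[:-1].T                                    # drop the bias row of W2
        dZ1 = dA1 * _activate_grad(Z1, A1, activationType)
        dW1 = A0.T @ dZ1
        W1 -= lr * dW1
        W2 -= lr * dW2
    return W1, W2

def predict(X, W1, W2, numInput, numHiddenUnits, numOutput, activationType, verbosity=0):
    # Sketch: predictions plus per-example cross-entropy in the last column.
    # numHiddenUnits is implied by the shapes of W1/W2; kept to match the signature.
    Xin, T = X[:, :numInput], X[:, numInput:numInput + numOutput]
    A0 = np.hstack((Xin, np.ones((len(Xin), 1))))
    A1 = _activate(A0 @ W1, activationType)
    A1b = np.hstack((A1, np.ones((len(A1), 1))))
    Y = 1.0 / (1.0 + np.exp(-(A1b @ W2)))
    eps = 1e-12                                                   # guard against log(0)
    ce = -(T * np.log(Y + eps) + (1 - T) * np.log(1 - Y + eps)).sum(axis=1, keepdims=True)
    if verbosity:
        print("mean cross-entropy:", ce.mean())
    return np.hstack((Y, ce))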
d) [5pts] Refactor [new]: Refactor your existing code into a cleaner object-oriented form. For now, you can assume the default values as shown in the listing below. That is, you need to create a class for a model that has a single sigmoid output, with two hidden layers (in addition to the input), with each hidden layer having two ReLU units.

class Model:
    def __init__(self, numInputs=2, numOutputs=1, layerSize=2, numHiddenLayers=2, activationType='R'):
        // numInputs: number of inputs to the net
        // numOutputs: number of output units in the output
        // layerSize: the number of units in each hidden layer
        // activationType: either 'L' (linear), 'S' (sigmoid), 'R' (reLU).
        //   This is the activation type of all hidden layers.
        // Note that the output should automatically be a softmax layer if
        //   numOutputs is greater than 1, and a sigmoid otherwise.
        //   (So the default output is a single sigmoid unit.)
        self.numInputs = numInputs
        self.numOutputs = numOutputs
        self.layerSize = layerSize
        self.numHiddenLayers = numHiddenLayers
        self.activationType = activationType

Create a function setWeights( value ) that allows us to set all the weights to a constant value (for debugging). You should also have a function initWeights( mean, stdev ) that initializes all the weights to be randomly chosen with the provided mean and standard deviation.

We should be able to call your model as follows:

X = generateData( 100, 'g', 2, 1 )  # use 2 gaussians to generate 100 data points, with a single target value (0 or 1)
np.random.shuffle(X)                # make sure the examples are shuffled
X_train = X[:90]                    # create a 90/10 train/test split
X_test = X[90:]
net = Model()
net.setInput(X_train)
net.setTest(X_test)
net.initWeights(0.0, 0.01)          # initialize weights with mean 0 and standard deviation 0.01
trainError = net.train(100, 0.1)    # train for 100 iterations with a step size of 0.1; this should return
                                    # a 100x2 array containing the training and test error at each of the 100 iterations
testError = net.test()              # return the error on the test set
Y = net.predict(X1)                 # assuming X1 is a valid input matrix, return the array of predictions for X1

For this part, you only need to have it work for the given default values.

e) [1pt] [new]: Allow a variable number of hidden units. E.g.

net = Model(2, 1, 5, 2, 'R')  # same as above but with 5 hidden units per layer

f) [1pt] [new]: Allow the various possible activation types. E.g.

net = Model(2, 1, 2, 2, 'S')  # same as above but with sigmoid hidden units

g) [1pt] [new]: Allow multiple output units, using softmax and a cross-entropy error.

net = Model(2, 3, 2, 2, 'S')  # 3 softmax output units

h) [1pt] [new]: Allow multiple hidden layers.

net = Model(2, 1, 2, 5, 'R')  # 5 hidden layers
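A possible skeleton for parts (d) through (h) is sketched below. Only the constructor, setWeights, initWeights, and the forward pass (predict) are shown; setInput, setTest, train, and test would store the data and reuse the same gradient logic as the earlier train() sketch. The per-layer bias handling, the self.sizes list, and the max-subtraction in the softmax are implementation choices, not requirements of the specification.

import numpy as np

class Model:
    # Skeleton sketch of the refactored net; training would mirror the earlier train() sketch.

    def __init__(self, numInputs=2, numOutputs=1, layerSize=2,
                 numHiddenLayers=2, activationType='R'):
        self.numInputs = numInputs
        self.numOutputs = numOutputs
        self.layerSize = layerSize
        self.numHiddenLayers = numHiddenLayers
        self.activationType = activationType
        # layer widths: inputs -> hidden layers -> outputs (bias added per layer below)
        self.sizes = [numInputs] + [layerSize] * numHiddenLayers + [numOutputs]
        self.initWeights(0.0, 0.01)

    def setWeights(self, value):
        # every weight (and bias) set to the same constant, for debugging
        self.W = [np.full((m + 1, n), float(value))
                  for m, n in zip(self.sizes[:-1], self.sizes[1:])]

    def initWeights(self, mean, stdev):
        # random Gaussian initialization with the given mean and standard deviation
        self.W = [mean + stdev * np.random.randn(m + 1, n)
                  for m, n in zip(self.sizes[:-1], self.sizes[1:])]

    def _hidden(self, z):
        if self.activationType == 'L':
            return z
        if self.activationType == 'S':
            return 1.0 / (1.0 + np.exp(-z))
        return np.maximum(0.0, z)                     # 'R'

    def predict(self, X1):
        a = X1[:, :self.numInputs]
        for W in self.W[:-1]:                         # hidden layers
            a = self._hidden(np.hstack((a, np.ones((len(a), 1)))) @ W)
        z = np.hstack((a, np.ones((len(a), 1)))) @ self.W[-1]
        if self.numOutputs == 1:                      # sigmoid output unit
            return 1.0 / (1.0 + np.exp(-z))
        z = z - z.max(axis=1, keepdims=True)          # stabilized softmax output
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)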
In this assignment, your code will primarily be marked by running it. It is possible that the marker will look at the code itself, but this is not guaranteed. The marking will be done based on whether the test scripts run properly and give correct answers. We may provide some additional informal suggestions over the next couple of days, but this document should provide sufficient specification in its current form for completing the assignment.

General Marking Notes and Tips

• In some assignments (such as this one), you will be marked by having the markers run test scripts as specified, and they will examine the output of those test scripts. That means that your code must follow the specified format exactly (e.g. any function templates), or the test scripts may not run.

• You will be marked for accuracy, correctness, and also clarity of presentation where relevant.

• Be selective in the experimental results you present. You don't need to present every single experiment you carried out. It is best to summarize some of your work and present only the interesting results, e.g. where the ML model behaves differently.

• In your answers, you want to demonstrate skill in using any required libraries to experiment with ML models, and a good understanding / interpretation of the results. Make it easy for the marker to see your skill and your understanding.

• Justify your meta-parameter choices where appropriate.

• Except where you are explicitly asked to implement a particular algorithm/step yourself, if there is a procedure in sklearn that does the task, you are free to find it, read its documentation and use it. If you are not sure, it is best to ask on the shared notes or in class!

• You should submit one python notebook with your answers.

• Liberally add markdown cells to explain your code and experimental results. Make good use of the formatting capabilities of markdown to make your notebook highly readable.

• Add links to all online resources you use (markdown syntax is: [anchor text](URL) ). Avoid explicit long URLs in your markdown cells.
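To sanity-check the pieces before submission, a small end-to-end run built on the sketches above might look like the following. It relies on the sketched generateData/train/predict functions, the seed and hyper-parameters are arbitrary, and it is not the actual marking script.

import numpy as np

np.random.seed(0)                                   # any seed; results should look similar across seeds
X = generateData(200, 'g', numClasses=2, numOutputs=1)
np.random.shuffle(X)
X_train, X_test = X[:180], X[180:]                  # 90/10 train/test split

W1, W2 = train(X_train, numInput=2, numHiddenUnits=4, numOutput=1,
               activationType='reLU', numIter=500)
out = predict(X_test, W1, W2, 2, 4, 1, 'reLU', verbosity=1)
accuracy = ((out[:, 0] > 0.5) == X_test[:, 2]).mean()
print("test accuracy:", accuracy)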
