
Deep Learning Toolbox — Functions

Deep Learning with Images

| trainingOptions | Options for training deep learning neural network |
| trainNetwork | Train neural network for deep learning |
| analyzeNetwork | Analyze deep learning network architecture |
| alexnet | Pretrained AlexNet convolutional neural network |
| vgg16 | Pretrained VGG-16 convolutional neural network |
| vgg19 | Pretrained VGG-19 convolutional neural network |
| squeezenet | Pretrained SqueezeNet convolutional neural network |
| googlenet | Pretrained GoogLeNet convolutional neural network |
| inceptionv3 | Pretrained Inception-v3 convolutional neural network |
| resnet18 | Pretrained ResNet-18 convolutional neural network |
| resnet50 | Pretrained ResNet-50 convolutional neural network |
| resnet101 | Pretrained ResNet-101 convolutional neural network |
| inceptionresnetv2 | Pretrained Inception-ResNet-v2 convolutional neural network |
| imageInputLayer | Image input layer |
| convolution2dLayer | 2-D convolutional layer |
| fullyConnectedLayer | Fully connected layer |
| reluLayer | Rectified Linear Unit (ReLU) layer |
| leakyReluLayer | Leaky Rectified Linear Unit (ReLU) layer |
| clippedReluLayer | Clipped Rectified Linear Unit (ReLU) layer |
| batchNormalizationLayer | Batch normalization layer |
| crossChannelNormalizationLayer | Channel-wise local response normalization layer |
| dropoutLayer | Dropout layer |
| averagePooling2dLayer | Average pooling layer |
| maxPooling2dLayer | Max pooling layer |
| maxUnpooling2dLayer | Max unpooling layer |
| additionLayer | Addition layer |
| depthConcatenationLayer | Depth concatenation layer |
| softmaxLayer | Softmax layer |
| transposedConv2dLayer | Transposed 2-D convolution layer |
| classificationLayer | Classification output layer |
| regressionLayer | Create a regression output layer |
| augmentedImageDatastore | Transform batches to augment image data |
| imageDataAugmenter | Configure image data augmentation |
| augment | Apply identical random transformations to multiple images |
| layerGraph | Graph of network layers for deep learning |
| plot | Plot neural network layer graph |
| addLayers | Add layers to layer graph |
| removeLayers | Remove layers from layer graph |
| replaceLayer | Replace layer in layer graph |
| connectLayers | Connect layers in layer graph |
| disconnectLayers | Disconnect layers in layer graph |
| DAGNetwork | Directed acyclic graph (DAG) network for deep learning |
| classify | Classify data using a trained deep learning neural network |
| activations | Compute convolutional neural network layer activations |
| predict | Predict responses using a trained deep learning neural network |
| confusionchart | Create confusion matrix chart for classification problem |
| sortClasses | Sort classes of confusion matrix chart |
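The layer, training, and inference functions above combine as in this minimal sketch (not part of the original reference): a small CNN for image classification. `imds` stands in for any `imageDatastore` you have prepared; the input size and class count are illustrative.

```matlab
% Define a small CNN with the layer functions listed above.
layers = [
    imageInputLayer([28 28 1])
    convolution2dLayer(3,16,'Padding','same')
    batchNormalizationLayer
    reluLayer
    maxPooling2dLayer(2,'Stride',2)
    fullyConnectedLayer(10)
    softmaxLayer
    classificationLayer];

opts = trainingOptions('sgdm','MaxEpochs',4,'Plots','training-progress');
net  = trainNetwork(imds,layers,opts);   % imds: your prepared imageDatastore
YPred = classify(net,imds);              % labels for new images
```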

Deep Learning with Time Series, Sequences, and Text

| trainingOptions | Options for training deep learning neural network |
| trainNetwork | Train neural network for deep learning |
| analyzeNetwork | Analyze deep learning network architecture |
| sequenceInputLayer | Sequence input layer |
| lstmLayer | Long short-term memory (LSTM) layer |
| bilstmLayer | Bidirectional long short-term memory (BiLSTM) layer |
| fullyConnectedLayer | Fully connected layer |
| reluLayer | Rectified Linear Unit (ReLU) layer |
| leakyReluLayer | Leaky Rectified Linear Unit (ReLU) layer |
| clippedReluLayer | Clipped Rectified Linear Unit (ReLU) layer |
| dropoutLayer | Dropout layer |
| softmaxLayer | Softmax layer |
| classificationLayer | Classification output layer |
| regressionLayer | Create a regression output layer |
| predict | Predict responses using a trained deep learning neural network |
| classify | Classify data using a trained deep learning neural network |
| predictAndUpdateState | Predict responses using a trained recurrent neural network and update the network state |
| classifyAndUpdateState | Classify data using a trained recurrent neural network and update the network state |
| resetState | Reset the state of a recurrent neural network |
| confusionchart | Create confusion matrix chart for classification problem |
| sortClasses | Sort classes of confusion matrix chart |
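As a minimal sketch (assumptions: `XTrain`/`YTrain` hold labeled sequences with 12 features and 9 classes, as in the toolbox's Japanese Vowels example), the sequence layers above form an LSTM classifier, and the stateful functions update the network as new data arrives:

```matlab
layers = [
    sequenceInputLayer(12)
    lstmLayer(100,'OutputMode','last')   % one label per sequence
    fullyConnectedLayer(9)
    softmaxLayer
    classificationLayer];

opts = trainingOptions('adam','MaxEpochs',30);
net  = trainNetwork(XTrain,YTrain,layers,opts);

[net,YPred] = classifyAndUpdateState(net,XNew);  % stateful prediction
net = resetState(net);                           % back to the initial state
```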

Deep Learning Tuning and Visualization

| analyzeNetwork | Analyze deep learning network architecture |
| plot | Plot neural network layer graph |
| trainingOptions | Options for training deep learning neural network |
| trainNetwork | Train neural network for deep learning |
| activations | Compute convolutional neural network layer activations |
| predict | Predict responses using a trained deep learning neural network |
| classify | Classify data using a trained deep learning neural network |
| predictAndUpdateState | Predict responses using a trained recurrent neural network and update the network state |
| classifyAndUpdateState | Classify data using a trained recurrent neural network and update the network state |
| resetState | Reset the state of a recurrent neural network |
| deepDreamImage | Visualize network features using deep dream |
| confusionchart | Create confusion matrix chart for classification problem |
| sortClasses | Sort classes of confusion matrix chart |
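A short sketch of the visualization workflow, assuming the AlexNet support package is installed (`'conv1'` and `'fc8'` are layer names specific to AlexNet; `peppers.png` ships with MATLAB):

```matlab
net = alexnet;                          % requires the AlexNet support package
analyzeNetwork(net)                     % inspect the architecture interactively

I = imresize(imread('peppers.png'),[227 227]);
act = activations(net,I,'conv1');       % feature maps from the first conv layer
dream = deepDreamImage(net,'fc8',1:3);  % images that maximize 3 class scores
```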

Deep Learning Import, Export, and Customization

| importKerasNetwork | Import a pretrained Keras network and weights |
| importKerasLayers | Import layers from Keras network |
| importCaffeNetwork | Import pretrained convolutional neural network models from Caffe |
| importCaffeLayers | Import convolutional neural network layers from Caffe |
| importONNXNetwork | Import pretrained ONNX network |
| importONNXLayers | Import layers from ONNX network |
| exportONNXNetwork | Export network to ONNX model format |
| findPlaceholderLayers | Find placeholder layers in network architecture imported from Keras or ONNX |
| replaceLayer | Replace layer in layer graph |
| assembleNetwork | Assemble deep learning network from pretrained layers |
| PlaceholderLayer | Layer replacing an unsupported Keras or ONNX layer |
| setLearnRateFactor | Set learn rate factor of layer learnable parameter |
| setL2Factor | Set L2 regularization factor of layer learnable parameter |
| getLearnRateFactor | Get learn rate factor of layer learnable parameter |
| getL2Factor | Get L2 regularization factor of layer learnable parameter |
| checkLayer | Check validity of custom layer |
| MiniBatchable | Add mini-batch support to datastore |
| BackgroundDispatchable | Add prefetch reading support to datastore |
| PartitionableByIndex | Add parallelization support to datastore |
| Shuffleable | Add shuffling support to datastore |
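The import/export functions round-trip models as in this sketch; `model.h5` and `model.onnx` are placeholder file names for models you supply:

```matlab
% Import a Keras model; unsupported layers become placeholder layers.
lgraph = importKerasLayers('model.h5');
placeholders = findPlaceholderLayers(lgraph);   % inspect, then replaceLayer
net = assembleNetwork(lgraph);                  % once all layers are valid

% ONNX import/export.
onnxNet = importONNXNetwork('model.onnx','OutputLayerType','classification');
exportONNXNetwork(net,'exported.onnx');
```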

Function Approximation and Clustering

Function Approximation and Nonlinear Regression

| nnstart | Neural network getting started GUI |
| view | View neural network |
| fitnet | Function fitting neural network |
| feedforwardnet | Feedforward neural network |
| cascadeforwardnet | Cascade-forward neural network |
| train | Train shallow neural network |
| trainlm | Levenberg-Marquardt backpropagation |
| trainbr | Bayesian regularization backpropagation |
| trainscg | Scaled conjugate gradient backpropagation |
| trainrp | Resilient backpropagation |
| mse | Mean squared normalized error performance function |
| regression | Linear regression |
| ploterrhist | Plot error histogram |
| plotfit | Plot function fit |
| plotperform | Plot network performance |
| plotregression | Plot linear regression |
| plottrainstate | Plot training state values |
| genFunction | Generate MATLAB function for simulating neural network |
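A minimal fitting sketch using these functions (`simplefit_dataset` is a toy dataset shipped with the toolbox; 10 hidden neurons is an arbitrary choice):

```matlab
[x,t] = simplefit_dataset;        % toy 1-D function-fitting data
net = fitnet(10,'trainlm');       % 10 hidden neurons, Levenberg-Marquardt
net = train(net,x,t);
y = net(x);

plotfit(net,x,t);                 % fitted curve vs. targets
plotregression(t,y);              % linear regression of outputs vs. targets
genFunction(net,'myFitFcn');      % standalone simulation function
```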

Pattern Recognition

| Autoencoder | Autoencoder class |
| nnstart | Neural network getting started GUI |
| view | View neural network |
| trainAutoencoder | Train an autoencoder |
| trainSoftmaxLayer | Train a softmax layer for classification |
| decode | Decode encoded data |
| encode | Encode input data |
| predict | Reconstruct the inputs using trained autoencoder |
| stack | Stack encoders from several autoencoders together |
| network | Convert Autoencoder object into network object |
| patternnet | Pattern recognition network |
| lvqnet | Learning vector quantization neural network |
| train | Train shallow neural network |
| trainlm | Levenberg-Marquardt backpropagation |
| trainbr | Bayesian regularization backpropagation |
| trainscg | Scaled conjugate gradient backpropagation |
| trainrp | Resilient backpropagation |
| mse | Mean squared normalized error performance function |
| regression | Linear regression |
| roc | Receiver operating characteristic |
| plotconfusion | Plot classification confusion matrix |
| ploterrhist | Plot error histogram |
| plotperform | Plot network performance |
| plotregression | Plot linear regression |
| plotroc | Plot receiver operating characteristic |
| plottrainstate | Plot training state values |
| crossentropy | Neural network performance |
| genFunction | Generate MATLAB function for simulating neural network |
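As a minimal classification sketch (`iris_dataset` is a toy dataset shipped with the toolbox; the hidden layer size is illustrative):

```matlab
[x,t] = iris_dataset;             % 4 features, 3 classes, 150 samples
net = patternnet(10,'trainscg');  % 10 hidden neurons, scaled conjugate gradient
net = train(net,x,t);
y = net(x);

plotconfusion(t,y);               % confusion matrix
plotroc(t,y);                     % per-class ROC curves
```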

Clustering

Self-Organizing Maps

| nnstart | Neural network getting started GUI |
| view | View neural network |
| selforgmap | Self-organizing map |
| train | Train shallow neural network |
| plotsomhits | Plot self-organizing map sample hits |
| plotsomnc | Plot self-organizing map neighbor connections |
| plotsomnd | Plot self-organizing map neighbor distances |
| plotsomplanes | Plot self-organizing map weight planes |
| plotsompos | Plot self-organizing map weight positions |
| plotsomtop | Plot self-organizing map topology |
| genFunction | Generate MATLAB function for simulating neural network |
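A minimal self-organizing map sketch (the 8-by-8 map size is an arbitrary choice; `iris_dataset` ships with the toolbox):

```matlab
x = iris_dataset;                 % inputs only; SOM training is unsupervised
net = selforgmap([8 8]);          % 8-by-8 neuron grid
net = train(net,x);

plotsomhits(net,x);               % how many samples each neuron wins
plotsomnd(net);                   % neighbor distances (cluster boundaries)
```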

Competitive Layers

| competlayer | Competitive layer |
| view | View neural network |
| train | Train shallow neural network |
| trainru | Unsupervised random order weight/bias training |
| learnk | Kohonen weight learning function |
| learncon | Conscience bias learning function |
| genFunction | Generate MATLAB function for simulating neural network |
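A competitive-layer sketch under the same toy data (choosing 3 neurons, i.e. 3 clusters, is an assumption to match the iris data):

```matlab
x = iris_dataset;
net = competlayer(3);             % 3 competing neurons = 3 clusters
net = train(net,x);               % uses trainru / learnk under the hood

y = net(x);                       % one-hot winning-neuron outputs
classes = vec2ind(y);             % cluster index per sample
```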

Autoencoders

| Autoencoder | Autoencoder class |
| trainAutoencoder | Train an autoencoder |
| trainSoftmaxLayer | Train a softmax layer for classification |
| decode | Decode encoded data |
| encode | Encode input data |
| generateFunction | Generate a MATLAB function to run the autoencoder |
| generateSimulink | Generate a Simulink model for the autoencoder |
| network | Convert Autoencoder object into network object |
| plotWeights | Plot a visualization of the weights for the encoder of an autoencoder |
| predict | Reconstruct the inputs using trained autoencoder |
| stack | Stack encoders from several autoencoders together |
| view | View autoencoder |
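An autoencoder sketch tying these methods together (`abalone_dataset` ships with the toolbox; the hidden size and epoch count are illustrative):

```matlab
x = abalone_dataset;                       % columns are observations
autoenc = trainAutoencoder(x,10,'MaxEpochs',200);

z    = encode(autoenc,x);                  % 10-dimensional latent features
xRec = predict(autoenc,x);                 % reconstruction of the inputs
plotWeights(autoenc);                      % visualize encoder weights
net  = network(autoenc);                   % convert to a standard network object
```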

Define Shallow Neural Network Architectures

| network | Create custom neural network |
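A sketch of building a custom two-layer network from scratch with `network` (the sizes and connections are illustrative):

```matlab
net = network(1,2);               % 1 input, 2 layers, no connections yet
net.inputs{1}.size  = 3;          % 3-element input
net.layers{1}.size  = 5;          % 5 hidden neurons
net.inputConnect(1,1) = 1;        % input feeds layer 1
net.layerConnect(2,1) = 1;        % layer 1 feeds layer 2
net.outputConnect(2)  = 1;        % layer 2 is the output
view(net)                         % inspect the resulting architecture
```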

Time Series and Control Systems

Time Series and Dynamic Systems

Modeling and Prediction with NARX Networks and Time-Delay Networks

| nnstart | Neural network getting started GUI |
| view | View neural network |
| timedelaynet | Time delay neural network |
| narxnet | Nonlinear autoregressive neural network with external input |
| narnet | Nonlinear autoregressive neural network |
| layrecnet | Layer recurrent neural network |
| distdelaynet | Distributed delay network |
| train | Train shallow neural network |
| gensim | Generate Simulink block for neural network simulation |
| adddelay | Add delay to neural network response |
| removedelay | Remove delay from neural network response |
| closeloop | Convert neural network open-loop feedback to closed loop |
| openloop | Convert neural network closed-loop feedback to open loop |
| ploterrhist | Plot error histogram |
| plotinerrcorr | Plot input to error time-series cross-correlation |
| plotregression | Plot linear regression |
| plotresponse | Plot dynamic network time series response |
| ploterrcorr | Plot autocorrelation of error time series |
| genFunction | Generate MATLAB function for simulating neural network |
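A NARX sketch (note it also uses `preparets`, a toolbox function not listed in this table, to shift the time series for the network's delays; `simplenarx_dataset` ships with the toolbox):

```matlab
[X,T] = simplenarx_dataset;
net = narxnet(1:2,1:2,10);                 % input/feedback delays 1:2, 10 hidden
[Xs,Xi,Ai,Ts] = preparets(net,X,{},T);     % align data with the delays
net = train(net,Xs,Ts,Xi,Ai);

netc = closeloop(net);                     % closed loop: multistep prediction
```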

Create Simulink Models

| gensim | Generate Simulink block for neural network simulation |
| setsiminit | Set neural network Simulink block initial conditions |
| getsiminit | Get Simulink neural network block initial input and layer delays states |
| sim2nndata | Convert Simulink time series to neural network data |
| nndata2sim | Convert neural network data to Simulink time series |
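A short sketch of the Simulink workflow, assuming `net` is a trained shallow network and `X` its cell-array input data:

```matlab
gensim(net)                % generate a Simulink model with a block for net
simX = nndata2sim(X);      % neural network cell data -> Simulink time series
Xnn  = sim2nndata(simX);   % and back, for round-tripping simulation results
```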
