Neural Network Implementation

An artificial neural network (ANN) is an algorithm used in artificial intelligence to simulate human thinking. The network works much like the human brain: it consists of neurons that communicate with each other and produce useful output. Although it is only a model, and not even close to real human thought, artificial neural networks are already used in prediction, classification, and decision-support systems, as well as in optical character recognition and many other applications.

Artificial neural networks have mostly been developed in high-level programming languages such as C or C++, but you can also implement a neural network in PHP, which is perhaps the most convenient way to use artificial intelligence in a web application. In this article I will explain how to build one of the most common neural network topologies, the multi-layer perceptron, and create your first neural network in PHP using a PHP neural network class.

Similar to the human thought process, a neural network:

  • receives some input (data), analyzes it, and processes it

  • provides an output value (that is, the result of the computation)

That is why the topology used in this example, a multi-layer perceptron, has three layers:

  • Input layer

  • Hidden layer

  • Output layer

Each layer has a specific number of neurons, depending on your needs, and every neuron is connected to all the neurons in the next layer. The neurons handle a given task by adjusting the weight coefficients of the connections between them. Of course, before it can be applied to a real use case, a neural network usually has to learn the task, and before any of that you have to prepare your data for the network.

Neural Network Input in PHP: Preparing the Data

Because a neural network is a complex mathematical model, you cannot feed arbitrary data to the input neurons. The data must be normalized before the network can use it, which means it should be scaled into the range of -1 to 1. Unfortunately, PHP has no built-in normalization function, so you have to write one yourself using the following formula:

I = Imin + (Imax - Imin) * (D - Dmin) / (Dmax - Dmin)

Imin and Imax are the bounds of the network's input range (-1 to 1), and Dmin and Dmax are the minimum and maximum values of your data.
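
A minimal sketch of such a helper is shown below; the function name normalize() and the assumption that the raw values fall into the 0-255 RGB range are choices made for this example, not part of the neural network class:

<?php
// Sketch of a normalization helper (assumes raw values in the 0-255 RGB range).
// Maps a data value D from [Dmin, Dmax] into the network range [Imin, Imax].
function normalize($value, $dMin = 0, $dMax = 255, $iMin = -1, $iMax = 1) {
    return $iMin + ($iMax - $iMin) * ($value - $dMin) / ($dMax - $dMin);
}

echo normalize(192); // a color component of 192 becomes roughly 0.506
?>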

After the data has been normalized, you have to choose the number of input neurons. For example, if you have an RGB color and want to determine whether red or blue is the dominant color, you will use four input neurons (the red, green, and blue values, plus a fourth one for the bias, which is usually equal to 1). Here is the PHP code for this example:

<?php

require_once("class_neuralnetwork.php");

// the number of neurons in each layer of the network -- 4 input, 4 hidden and 1 output neurons
$n = new NeuralNetwork(4, 4, 1);

// do not display error messages
$n->setVerbose(false);

// test data
// The first array is the input data, and the second is the expected output value (1 means blue and 0 means red)
$n->addTestData(array(0, 0, 255, 1), array(1));
$n->addTestData(array(0, 0, 192, 1), array(1));
$n->addTestData(array(208, 0, 49, 1), array(0));
$n->addTestData(array(228, 105, 116, 1), array(0));
$n->addTestData(array(128, 80, 255, 1), array(1));
$n->addTestData(array(248, 80, 68, 1), array(0));

?>

There is only one output neuron because there are only two possible outcomes. For a more complex problem you can use more than one output neuron, so that the network has many combinations of 0s and 1s as possible outputs.
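
For instance, a hypothetical three-class variant of this task (red, green, or blue dominant) could be encoded on two output neurons; the encoding below is an assumption made for illustration, not something the class prescribes:

<?php
require_once("class_neuralnetwork.php");

// Hypothetical three-class encoding on two output neurons:
// (0, 0) = red dominant, (0, 1) = green dominant, (1, 0) = blue dominant
$n = new NeuralNetwork(4, 4, 2); // 4 input, 4 hidden and 2 output neurons
$n->addTestData(array(220, 30, 40, 1), array(0, 0));  // mostly red
$n->addTestData(array(30, 200, 60, 1), array(0, 1));  // mostly green
$n->addTestData(array(20, 40, 230, 1), array(1, 0));  // mostly blue
?>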

Training the Neural Network in PHP

Before it can solve a problem, an artificial neural network has to learn how to solve it. Think of the network as an equation: you have added the test data and the expected output, and the network has to solve the equation by finding the connections between input and output. This process is called training, and in a neural network those connections are the weights of the neurons. Several algorithms can be used to train a network, but the most common one is backpropagation, which stands for the backward propagation of errors.

After initializing the network weights with random values, the next steps are:

Loop through the test data:

1. Calculate the actual output

2. Calculate the error (desired output - current network output)

3. Calculate the weight deltas, working backwards through the layers

4. Update the weights

This process continues until all the test data has been classified correctly or the algorithm reaches a stopping criterion. In practice you typically retry training at most three times, with a maximum of 1000 training rounds (epochs) per attempt. Every learning algorithm also needs an activation function; for backpropagation the usual choice is the hyperbolic tangent (tanh). A sketch of what a single weight update from steps 3 and 4 looks like is shown below.
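
The following is only an illustrative sketch of one weight update with momentum; the variable names and values are assumptions made for this example, and the real implementation lives in the backpropagate() method of the class listed at the end of this article:

<?php
// Illustrative single weight update (steps 3 and 4 above), not the class's code.
$learningRate = 0.1;        // how strongly each error adjusts the weight
$momentum = 0.8;            // fraction of the previous correction that is reused
$inputValue = 0.5;          // output of the neuron feeding this connection
$errorGradient = 0.02;      // tanh derivative multiplied by the propagated error
$previousCorrection = 0.0;  // weight correction from the previous iteration
$weight = 0.15;             // current weight of the connection

// delta weight = learning rate * input value * error gradient
$correction = $learningRate * $inputValue * $errorGradient;

// momentum learning: also add a fraction of the previous correction
$weight = $weight + $correction + $momentum * $previousCorrection;
$previousCorrection = $correction;

echo $weight; // 0.151
?>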

Let's see how to train a neural network in PHP:

<?php

$max = 3;

// train the network in max 1000 epochs, with a max squared error of 0.01
while (!($success = $n->train(1000, 0.01)) && $max-- > 0) {
    // training failed -- re-initialize the network weights
    $n->initWeights();
}

// training successful
if ($success) {
    $epochs = $n->getEpoch(); // get the number of epochs needed for training
}

?>

The mean squared error (MSE) is the average of the squared differences between the desired output and the actual output. Its maximum value is typically set to 0.01: if the network's error drops below 0.01, the training is considered successful.
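
As a rough by-hand illustration of that error measure (the class itself computes a root-mean-squared variant internally), the snippet below averages the squared differences over two of the test vectors; it assumes the trained network $n from the previous listings:

<?php
// Rough by-hand mean squared error over two test vectors.
// Assumes $n is the trained NeuralNetwork instance from above.
$testSet = array(
    array(array(0, 0, 255, 1), array(1)),
    array(array(208, 0, 49, 1), array(0)),
);

$sumSquaredError = 0.0;
foreach ($testSet as $pair) {
    $output = $n->calculate($pair[0]);   // actual network output
    $error = $pair[1][0] - $output[0];   // desired output - actual output
    $sumSquaredError += $error * $error;
}

echo "mean squared error: " . ($sumSquaredError / count($testSet));
?>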

Before looking at a working example of an artificial neural network in PHP, note that it is good practice to save your neural network to a file or an SQL database. If you do not save it, you will have to train it again every time the application runs. Simple tasks are learned quickly, but training on a more complex problem takes longer, and you want your users to wait as little as possible. Fortunately, the PHP class used in this example has save and load functions:

<?php

$n->save('my_network.ini');

?>

Note that the file extension must be .ini.
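
A common pattern, sketched below with an assumed file name and example thresholds, is to load the saved network when the file exists and only train (and save) it when it does not:

<?php
// Load a previously trained network if available; otherwise train and save it.
require_once("class_neuralnetwork.php");

$n = new NeuralNetwork(4, 4, 1);
$n->setVerbose(false);

if (file_exists('my_network.ini')) {
    $n->load('my_network.ini');
} else {
    // ... addTestData() calls for your training set go here ...
    if ($n->train(1000, 0.01)) {
        $n->save('my_network.ini');
    }
}
?>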

The PHP Code for Our Neural Network

Let's look at the PHP code of a running application that receives red, green, and blue values and determines whether blue or red is the dominant color:

<?php

require_once("class_neuralnetwork.php");

$r = $_POST['red'];
$g = $_POST['green'];
$b = $_POST['blue'];

// initialize the neural network
$n = new NeuralNetwork(4, 4, 1);

$n->setVerbose(false);

// load the saved weights into the initialized neural network. This way you won't
// need to train the network each time the application is executed
$n->load('my_network.ini');

// note that you will have to write a normalize function, depending on your needs;
// scale the values the same way as the training data, and keep the fourth value as the bias
$input = array(normalize($r), normalize($g), normalize($b), 1);

$result = $n->calculate($input);

if ($result[0] > 0.5) {
    // the dominant color is blue
} else {
    // the dominant color is red
}

?>

Limitations of Neural Networks

The classic limitation here is that a single-layer perceptron can only solve linearly separable problems, and many problems are not linearly separable. A multi-layer perceptron with a non-linear activation function, like the one used in this article, can handle some of those cases, but harder problems may still call for other artificial intelligence algorithms. Even so, neural networks solve a great many problems that require computational intelligence and hold an important place among AI algorithms.

Appendix: class_neuralnetwork.php

<?php
/**
  * <b>Multi-layer Neural Network in PHP</b>
  * 
  * Loosely based on source code by {@link http://www.philbrierley.com Phil Brierley},
  * that was translated into PHP by 'dspink' in sep 2005
  * 
  * Algorithm was obtained from the excellent introductory book 
  * "{@link http://www.amazon.com/link/dp/0321204662 Artificial Intelligence - a guide to intelligent systems}"
  * by Michael Negnevitsky (ISBN 0-201-71159-1)
  *
  * <b>Example: learning the 'XOR'-function</b>
  * <code>
  * require_once("class_neuralnetwork.php");
  * 
  * // Create a new neural network with 3 input neurons, 
  * // 4 hidden neurons, and 1 output neuron
  * $n = new NeuralNetwork(3, 4, 1);
  * $n->setVerbose(false);
  * 
  * // Add test-data to the network. In this case, 
  * // we want the network to learn the 'XOR'-function.
  * // The third input-parameter is the 'bias'.
  * $n->addTestData( array (-1, -1, 1), array (-1));
  * $n->addTestData( array (-1,  1, 1), array ( 1));
  * $n->addTestData( array ( 1, -1, 1), array ( 1));
  * $n->addTestData( array ( 1,  1, 1), array (-1));
  * 
  * // we try training the network for at most $max times
  * $max = 3;
  * 
  * // train the network in max 1000 epochs, with a max squared error of 0.01
  * while (!($success=$n->train(1000, 0.01)) && $max-- > 0) {
  *         // training failed:
  *         // 1. re-initialize the weights in the network
  *         $n->initWeights();
  *        
  *         // 2. display message
  *         echo "Nothing found...<hr />"; 
  * }
  * 
  * // print a message if the network was successfully trained
  * if ($success) {
  *         $epochs = $n->getEpoch();
  *         echo "Success in $epochs training rounds!<hr />";
  * }
  * 
  * // in any case, we print the output of the neural network
  * for ($i = 0; $i < count($n->trainInputs); $i ++) {
  *         $output = $n->calculate($n->trainInputs[$i]);
  *         print "<br />Testset $i; ";
  *         print "expected output = (".implode(", ", $n->trainOutput[$i]).") ";
  *         print "output from neural network = (".implode(", ", $output).")\n";
  * }
  * </code>
  * 
  * The resulting output could for example be something along the following lines:
  * 
  * <code>
  * Success in 719 training rounds!
  * Testset 0; expected output = (-1) output from neural network = (-0.986415991978)
  * Testset 1; expected output = (1) output from neural network = (0.992121412998)
  * Testset 2; expected output = (1) output from neural network = (0.992469534962)
  * Testset 3; expected output = (-1) output from neural network = (-0.990224120384)
  * </code>
  * 
  * …which indicates the network has learned the task. 
  *  
  * @author ir. E. Akerboom
  * @author {@link http://www.tremani.nl/ Tremani}, Delft, The Netherlands
  * @since feb 2007
  * @version 1.0
  * @license http://opensource.org/licenses/bsd-license.php BSD License
  */

class NeuralNetwork {

    /**#@+
      * @access private
      */
     var $nodecount = array ();
     var $nodevalue = array ();
     var $nodethreshold = array ();
     var $edgeweight = array ();
     var $learningrate = array (0.1);
     var $layercount = 0;
     var $previous_weightcorrection = array ();
     var $momentum = 0.8;
     var $is_verbose = true;

    var $trainInputs = array ();
     var $trainOutput = array ();
     var $trainDataID = array ();

    var $controlInputs = array ();
     var $controlOutput = array ();
     var $controlDataID = array ();

    var $weightsInitialized = false;

    var $epoch;
     var $error_trainingset;
     var $error_controlset;
     var $success;
     /**#@-*/

    /**
      * Creates a neural network.
      * 
      * Example:
      * <code>
      * // create a network with 4 input nodes, 10 hidden nodes, and 4 output nodes
      * $n = new NeuralNetwork(4, 10, 4);
      * 
      * // create a network with 4 input nodes, 1 hidden layer with 10 nodes, 
      * // another hidden layer with 10 nodes, and 4 output nodes
      * $n = new NeuralNetwork(4, 10, 10, 4); 
      * 
      * // alternative syntax
      * $n = new NeuralNetwork(array(4, 10, 10, 4));
      * </code>
      * 
      * @param array $nodecount The number of nodes in the consecutive layers.
      */
     function NeuralNetwork($nodecount) {
         if (!is_array($nodecount)) {
             $nodecount = func_get_args();
         }
         $this->nodecount = $nodecount;

        // store the number of layers
         $this->layercount = count($this->nodecount);
     }

    /**
      * Sets the learning rate between the different layers. 
      *
      * @param array $learningrate An array containing the learning rates [range 0.0 - 1.0]. 
      * The size of this array is 'layercount - 1'. You might also provide a single number. If that is
      * the case, then this will be the learning rate for the whole network.
      */
     function setLearningRate($learningrate) {
         if (!is_array($learningrate)) {
             $learningrate = func_get_args();
         }

        $this->learningrate = $learningrate;
     }

    /**
      * Gets the learning rate for a specific layer
      * 
      * @param int $layer The layer to obtain the learning rate for
      * @return float The learning rate for that layer
      */
     function getLearningRate($layer) {
         if (array_key_exists($layer, $this->learningrate)) {
             return $this->learningrate[$layer];
         }
         return $this->learningrate[0];
     }

    /**
      * Sets the ‘momentum’ for the learning algorithm. The momentum should 
      * accelerate the learning process and help avoid local minima.
      * 
      * @param float $momentum The momentum. Must be between 0.0 and 1.0; Usually between 0.5 and 0.9
      */
     function setMomentum($momentum) {
         $this->momentum = $momentum;
     }

    /**
      * Gets the momentum.
      * 
      * @return float The momentum
      */
     function getMomentum() {
         return $this->momentum;
     }

    /**
      * Calculate the output of the neural network for a given input vector
      * 
      * @param array $input The vector to calculate
      * @return mixed The output of the network
      */
     function calculate($input) {

        // put the input vector on the input nodes
         foreach ($input as $index => $value) {
             $this->nodevalue[0][$index] = $value;
         }

        // iterate the hidden layers
         for ($layer = 1; $layer < $this->layercount; $layer ++) {

            $prev_layer = $layer -1;

            // iterate each node in this layer
             for ($node = 0; $node < ($this->nodecount[$layer]); $node ++) {
                 $node_value = 0.0;

                // each node in the previous layer has a connection to this node
                 // on basis of this, calculate this node’s value
                 for ($prev_node = 0; $prev_node < ($this->nodecount[$prev_layer]); $prev_node ++) {
                     $inputnode_value = $this->nodevalue[$prev_layer][$prev_node];
                     $edge_weight = $this->edgeweight[$prev_layer][$prev_node][$node];

                    $node_value = $node_value + ($inputnode_value * $edge_weight);
                 }

                // apply the threshold
                 $node_value = $node_value - $this->nodethreshold[$layer][$node];

                // apply the activation function
                 $node_value = $this->activation($node_value);

                // remember the outcome
                 $this->nodevalue[$layer][$node] = $node_value;
             }
         }

        // return the values of the last layer (the output layer)
         return $this->nodevalue[$this->layercount - 1];
     }

    /**
      * Implements the standard (default) activation function for backpropagation networks, 
      * the ‘tanh’ activation function.
      * 
      * @param float $value The preliminary output to apply this function to
      * @return float The final output of the node
      */
     function activation($value) {
         return tanh($value);
         // return (1.0 / (1.0 + exp(- $value)));
     }

    /**
      * Implements the derivative of the activation function. By default, this is the 
      * derivative of the 'tanh' activation function: 1.0 - tanh($value)*tanh($value);
      * 
      * @param float $value 'X'
      * @return float 
      */
     function derivative_activation($value) {
         $tanh = tanh($value);
         return 1.0 - $tanh * $tanh;
         //return $value * (1.0 – $value);
     }

    /**
      * Add a test vector and its output
      * 
      * @param array $input An input vector
      * @param array $output The corresponding output
      * @param int $id (optional) An identifier for this piece of data
      */
     function addTestData($input, $output, $id = null) {
         $index = count($this->trainInputs);
         foreach ($input as $node => $value) {
             $this->trainInputs[$index][$node] = $value;
         }

        foreach ($output as $node => $value) {
             $this->trainOutput[$index][$node] = $value;
         }

        $this->trainDataID[$index] = $id;
     }

    /**
      * Returns the identifiers of the data used to train the network (if available)
      * 
      * @return array An array of identifiers
      */
     function getTestDataIDs() {
         return $this->trainDataID;
     }

    /**
      * Add a set of control data to the network. 
      * 
      * This set of data is used to prevent ‘overlearning’ of the network. The 
      * network will stop training if the results obtained for the control data 
      * are worsening.
      * 
      * The data added as control data is not used for training.
      * 
      * @param array $input An input vector
      * @param array $output The corresponding output
     * @param int $id (optional) An identifier for this piece of data
      */
     function addControlData($input, $output, $id = null) {
         $index = count($this->controlInputs);
         foreach ($input as $node => $value) {
             $this->controlInputs[$index][$node] = $value;
         }

        foreach ($output as $node => $value) {
             $this->controlOutput[$index][$node] = $value;
         }

        $this->controlDataID[$index] = $id;
     }

    /**
      * Returns the identifiers of the control data used during the training 
      * of the network (if available)
      * 
      * @return array An array of identifiers
      */
     function getControlDataIDs() {
         return $this->controlDataID;
     }

    /**
      * Shows the current weights and thresholds
      * 
      * @param boolean $force Force the output, even if the network is {@link setVerbose() not verbose}. 
      */
     function showWeights($force = false) {
         if ($this->isVerbose() || $force) {
             echo "<hr>";
             echo "<br />Weights: <pre>".serialize($this->edgeweight)."</pre>";
             echo "<br />Thresholds: <pre>".serialize($this->nodethreshold)."</pre>";
         }
     }

    /**
      * Determines if the neural network displays status and error messages. By default, it does.
      * 
      * @param boolean $is_verbose ‘true’ if you want to display status and error messages, ‘false’ if you don’t
      */
     function setVerbose($is_verbose) {
         $this->is_verbose = $is_verbose;
     }

    /**
      * Returns whether or not the network displays status and error messages.
      * 
      * @return boolean ‘true’ if status and error messages are displayed, ‘false’ otherwise
      */
     function isVerbose() {
         return $this->is_verbose;
     }

    /**
      * Loads a neural network from a file saved by the ‘save()’ function. Clears 
      * the training and control data added so far.
      * 
      * @param string $filename The filename to load the network from
      * @return boolean ‘true’ on success, ‘false’ otherwise
      */
     function load($filename) {
         if (file_exists($filename)) {
             $data = parse_ini_file($filename);
             if (array_key_exists("edges", $data) && array_key_exists("thresholds", $data)) {
                 // make sure all standard preparations performed
                 $this->initWeights();

                // load data from file
                 $this->edgeweight = unserialize($data['edges']);
                 $this->nodethreshold = unserialize($data['thresholds']);

                $this->weightsInitialized = true;

                // load IDs of training and control set
                 if (array_key_exists("training_data", $data) && array_key_exists("control_data", $data)) {

                    // load the IDs
                     $this->trainDataID = unserialize($data['training_data']);
                     $this->controlDataID = unserialize($data['control_data']);

                    // if we do not reset the training and control data here, then we end up
                     // with a bunch of IDs that do not refer to the actual data we’re training
                     // the network with.
                     $this->controlInputs = array ();
                     $this->controlOutput = array ();

                    $this->trainInputs = array ();
                     $this->trainOutput = array ();
                 }

                return true;
             }
         }


        return false;
     }

    /**
      * Saves a neural network to a file
      * 
      * @param string $filename The filename to save the neural network to
      * @return boolean ‘true’ on success, ‘false’ otherwise
      */
     function save($filename) {
         $f = fopen($filename, "w");
         if ($f) {
             fwrite($f, "[weights]");
             fwrite($f, "\r\nedges = \"".serialize($this->edgeweight)."\"");
             fwrite($f, "\r\nthresholds = \"".serialize($this->nodethreshold)."\"");
             fwrite($f, "\r\n");
             fwrite($f, "[identifiers]");
             fwrite($f, "\r\ntraining_data = \"".serialize($this->trainDataID)."\"");
             fwrite($f, "\r\ncontrol_data = \"".serialize($this->controlDataID)."\"");
             fclose($f);

            return true;
         }

        return false;
     }

    /**
      * Start the training process
      * 
      * @param int $maxEpochs The maximum number of epochs
      * @param float $maxError The maximum squared error in the training data
      * @return bool ‘true’ if the training was successful, ‘false’ otherwise
      */
     function train($maxEpochs = 500, $maxError = 0.01) {

        if (!$this->weightsInitialized) {
             $this->initWeights();
         }

        if ($this->isVerbose()) {
             echo "<table>";
             echo "<tr><th>#</th><th>error(trainingdata)</th><th>error(controldata)</th><th>slope(error(controldata))</th></tr>";
         }

        $epoch = 0;
         $errorControlSet = array ();
         $avgErrorControlSet = array ();
         define('SAMPLE_COUNT', 10);
         do {
//                        echo “<tr><td colspan=10><b>epoch $epoch</b></td></tr>”;
             for ($i = 0; $i < count($this->trainInputs); $i ++) {
                 // select a training pattern at random
                 $index = mt_rand(0, count($this->trainInputs) - 1);

                // determine the input, and the desired output
                 $input = $this->trainInputs[$index];
                 $desired_output = $this->trainOutput[$index];

                // calculate the actual output
                 $output = $this->calculate($input);

//                              echo “<tr><td></td><td>Training set $i</td><td>input = (” . implode(“, “, $input) . “)</td>”;
//                 echo “<td>desired = (” . implode(“, “, $desired_output) . “)</td>”;
//                echo “<td>output = (” . implode(“, “, $output) .”)</td></tr>”;

                // change network weights
                 $this->backpropagate($output, $desired_output);
             }

            // buy some time
             set_time_limit(300);

            //display the overall network error after each epoch
             $squaredError = $this->squaredErrorEpoch();
             if ($epoch % 2 == 0) {
                 $squaredErrorControlSet = $this->squaredErrorControlSet();
                 $errorControlSet[] = $squaredErrorControlSet;

                if (count($errorControlSet) > SAMPLE_COUNT) {
                     $avgErrorControlSet[] = array_sum(array_slice($errorControlSet, -SAMPLE_COUNT)) / SAMPLE_COUNT;
                 }

                list ($slope, $offset) = $this->fitLine($avgErrorControlSet);
                 $controlset_msg = $squaredErrorControlSet;
             } else {
                 $controlset_msg = "";
             }

            if ($this->isVerbose()) {
                 echo "<tr><td><b>$epoch</b></td><td>$squaredError</td><td>$controlset_msg";
                 echo "<script type='text/javascript'>window.scrollBy(0,100);</script>";
                 echo "</td><td>$slope</td></tr>";
                 echo "</td></tr>";

                flush();
                 ob_flush();
             }

            // conditions for a 'successful' stop:
             // 1. the squared error is now lower than the provided maximum error
             $stop_1 = $squaredError <= $maxError || $squaredErrorControlSet <= $maxError;

            // conditions for an 'unsuccessful' stop
             // 1. the maximum number of epochs has been reached
             $stop_2 = $epoch ++ > $maxEpochs;

            // 2. the network's performance on the control data is getting worse
             $stop_3 = $slope > 0;

        } while (!$stop_1 && !$stop_2 && !$stop_3);

        $this->setEpoch($epoch);
         $this->setErrorTrainingSet($squaredError);
         $this->setErrorControlSet($squaredErrorControlSet);
         $this->setTrainingSuccessful($stop_1);

        if ($this->isVerbose()) {
             echo "</table>";
         }

        return $stop_1;
     }

    /**
      * After training, this function is used to store the number of epochs the network 
      * needed for training the network. An epoch is defined as the number of times 
      * the complete trainingset is used for training.
      * 
      * @access private
      * @param int $epoch 
      */
     function setEpoch($epoch) {
         $this->epoch = $epoch;
     }

    /**
      * Gets the number of epochs the network needed for training.
      * 
      * @access private
      * @return int The number of epochs.
      */
     function getEpoch() {
         return $this->epoch;
     }

    /**
      * After training, this function is used to store the squared error between the
      * desired output and the obtained output of the training data.
      * 
      * @access private
      * @param float $error The squared error of the training data
      */
     function setErrorTrainingSet($error) {
         $this->error_trainingset = $error;
     }

    /**
      * Gets the squared error between the desired output and the obtained output of 
      * the training data.
      * 
      * @access private
      * @return float The squared error of the training data
      */
     function getErrorTrainingSet() {
         return $this->error_trainingset;
     }

    /**
      * After training, this function is used to store the squared error between the
      * desired output and the obtained output of the control data.
      * 
      * @access private
      * @param float $error The squared error of the control data
      */
     function setErrorControlSet($error) {
         $this->error_controlset = $error;
     }

    /**
      * Gets the squared error between the desired output and the obtained output of 
      * the control data.
      * 
      * @access private
      * @return float The squared error of the control data
      */
     function getErrorControlSet() {
         return $this->error_controlset;
     }

    /**
      * After training, this function is used to store whether or not the training
      * was successful.
      * 
      * @access private
      * @param bool $success ‘true’ if the training was successful, ‘false’ otherwise
      */
     function setTrainingSuccessful($success) {
         $this->success = $success;
     }

    /**
      * Determines if the training was successful.
      * 
      * @access private
      * @return bool ‘true’ if the training was successful, ‘false’ otherwise
      */
     function getTrainingSuccessful() {
         return $this->success;
     }

    /**
      * Finds the least square fitting line for the given data. 
      * 
      * This function is used to determine if the network is overtraining itself. If 
      * the line through the controlset’s most recent squared errors is going ‘up’, 
      * then it’s time to stop training.
      * 
      * @access private
      * @param array $data The points to fit a line to. The keys of this array represent 
      *                    the ‘x’-value of the point, the corresponding value is the 
      *                    ‘y’-value of the point.
      * @return array An array containing, respectively, the slope and the offset of the fitted line.
      */
     function fitLine($data) {
         // based on 
         //    http://mathworld.wolfram.com/LeastSquaresFitting.html

        $n = count($data);

        if ($n > 1) {
             $sum_y = 0;
             $sum_x = 0;
             $sum_x2 = 0;
             $sum_xy = 0;
             foreach ($data as $x => $y) {
                 $sum_x += $x;
                 $sum_y += $y;
                 $sum_x2 += $x * $x;
                 $sum_xy += $x * $y;
             }

            // implementation of formula (12)
             $offset = ($sum_y * $sum_x2 - $sum_x * $sum_xy) / ($n * $sum_x2 - $sum_x * $sum_x);

            // implementation of formula (13)
             $slope = ($n * $sum_xy - $sum_x * $sum_y) / ($n * $sum_x2 - $sum_x * $sum_x);

            return array ($slope, $offset);
         } else {
             return array (0.0, 0.0);
         }
     }

    /**
      * Gets a random weight between [-0.25 .. 0.25]. Used to initialize the network.
      * 
      * @return float A random weight
      */
     function getRandomWeight($layer) {
         return ((mt_rand(0, 1000) / 1000) - 0.5) / 2;
     }

    /**
      * Randomise the weights in the neural network
      * 
      * @access private
      */
     function initWeights() {
         // assign a random value to each edge between the layers, and randomise each threshold
         //
         // 1. start at layer '1' (so skip the input layer)
         for ($layer = 1; $layer < $this->layercount; $layer ++) {

            $prev_layer = $layer -1;

             // 2. in this layer, walk each node
             for ($node = 0; $node < $this->nodecount[$layer]; $node ++) {

                 // 3. randomise this node's threshold
                 $this->nodethreshold[$layer][$node] = $this->getRandomWeight($layer);

                 // 4. this node is connected to each node of the previous layer
                 for ($prev_index = 0; $prev_index < $this->nodecount[$prev_layer]; $prev_index ++) {

                     // 5. this is the 'edge' that needs to be reset / initialised
                     $this->edgeweight[$prev_layer][$prev_index][$node] = $this->getRandomWeight($prev_layer);

                     // 6. initialize the 'previous weightcorrection' at 0.0
                     $this->previous_weightcorrection[$prev_layer][$prev_index] = 0.0;
                 }
             }
         }
     }

    /**
     * Performs the backpropagation algorithm. This changes the weights and thresholds of the network.
     * 
     * @access private
     * @param array $output The output obtained by the network
     * @param array $desired_output The desired output
     */
     function backpropagate($output, $desired_output) {

        $errorgradient = array ();
         $outputlayer = $this->layercount - 1;

        $momentum = $this->getMomentum();

        // Propagate the difference between output and desired output through the layers.
         for ($layer = $this->layercount - 1; $layer > 0; $layer --) {
             for ($node = 0; $node < $this->nodecount[$layer]; $node ++) {

                // step 1: determine errorgradient
                 if ($layer == $outputlayer) {
                     // for the output layer:
                     // 1a. calculate error between desired output and actual output
                     $error = $desired_output[$node] - $output[$node];

                    // 1b. calculate errorgradient
                     $errorgradient[$layer][$node] = $this->derivative_activation($output[$node]) * $error;
                 } else {
                     // for hidden layers:
                     // 1a. sum the product of edgeweight and errorgradient of the ‘next’ layer
                     $next_layer = $layer +1;

                    $productsum = 0;
                     for ($next_index = 0; $next_index < ($this->nodecount[$next_layer]); $next_index ++) {
                         $_errorgradient = $errorgradient[$next_layer][$next_index];
                         $_edgeweight = $this->edgeweight[$layer][$node][$next_index];

                        $productsum = $productsum + $_errorgradient * $_edgeweight;
                     }

                    // 1b. calculate errorgradient
                     $nodevalue = $this->nodevalue[$layer][$node];
                     $errorgradient[$layer][$node] = $this->derivative_activation($nodevalue) * $productsum;
                 }

                // step 2: use the errorgradient to determine a weight correction for each node
                 $prev_layer = $layer -1;
                 $learning_rate = $this->getLearningRate($prev_layer);

                for ($prev_index = 0; $prev_index < ($this->nodecount[$prev_layer]); $prev_index ++) {

                    // 2a. obtain nodevalue, edgeweight and learning rate
                     $nodevalue = $this->nodevalue[$prev_layer][$prev_index];
                     $edgeweight = $this->edgeweight[$prev_layer][$prev_index][$node];

                    // 2b. calculate weight correction
                     $weight_correction = $learning_rate * $nodevalue * $errorgradient[$layer][$node];

                    // 2c. retrieve previous weight correction
                     $prev_weightcorrection = $this->previous_weightcorrection[$layer][$node];

                    // 2d. combine those (‘momentum learning’) to a new weight
                     $new_weight = $edgeweight + $weight_correction + $momentum * $prev_weightcorrection;

                    // 2e. assign the new weight to this edge
                     $this->edgeweight[$prev_layer][$prev_index][$node] = $new_weight;

                    // 2f. remember this weightcorrection
                     $this->previous_weightcorrection[$layer][$node] = $weight_correction;
                 }

                // step 3: use the errorgradient to determine threshold correction
                 $threshold_correction = $learning_rate * -1 * $errorgradient[$layer][$node];
                 $new_threshold = $this->nodethreshold[$layer][$node] + $threshold_correction;

                $this->nodethreshold[$layer][$node] = $new_threshold;
             }
         }
     }

    /**
      * Calculate the root-mean-squared error of the output, given the
      * trainingdata.
      * 
      * @access private
      * @return float The root-mean-squared error of the output
      */
     function squaredErrorEpoch() {
         $RMSerror = 0.0;
         for ($i = 0; $i < count($this->trainInputs); $i ++) {
             $RMSerror += $this->squaredError($this->trainInputs[$i], $this->trainOutput[$i]);
         }
         $RMSerror = $RMSerror / count($this->trainInputs);

        return sqrt($RMSerror);
     }

    /**
      * Calculate the root-mean-squared error of the output, given the
      * controldata.
      * 
      * @access private
      * @return float The root-mean-squared error of the output
      */
     function squaredErrorControlSet() {

        if (count($this->controlInputs) == 0) {
             return 1.0;
         }

        $RMSerror = 0.0;
         for ($i = 0; $i < count($this->controlInputs); $i ++) {
             $RMSerror += $this->squaredError($this->controlInputs[$i], $this->controlOutput[$i]);
         }
         $RMSerror = $RMSerror / count($this->controlInputs);

        return sqrt($RMSerror);
     }

    /**
      * Calculate the root-mean-squared error of the output, given the
      * desired output.
      * 
      * @access private
      * @param array $input The input to test
      * @param array $desired_output The desired output
      * @return float The root-mean-squared error of the output compared to the desired output
      */
     function squaredError($input, $desired_output) {
         $output = $this->calculate($input);

        $RMSerror = 0.0;
         foreach ($output as $node => $value) {
             //calculate the error
             $error = $output[$node] - $desired_output[$node];

            $RMSerror = $RMSerror + ($error * $error);
         }

        return $RMSerror;
     }
}
?>