
Backpropagation In A TensorFlow.js Neural Network

I have been attempting to use tf.train.sgd(learningRate).minimize(loss) in my code in order to perform backpropagation, but I keep getting multiple errors.

Solution 1:

The error says it all:

The f passed in variableGrads(f) must be a function

optimizer.minimize expects a function as its parameter, not a tensor. Since the code is trying to minimize the mean squared error, the argument of minimize can be a function that computes the meanSquaredError between the predicted value and the expected one.

const loss = (pred, label) => pred.sub(label).square().mean();

for (let f = 0; f < 10; f++) {
    optimizer.minimize(() => tf.losses.meanSquaredError(Y, model));
}
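To see why minimize needs a function rather than a tensor: each call re-evaluates the loss and uses its gradient to update the trainable values. Here is a minimal sketch of that loop in plain Node.js (no TensorFlow.js; all names such as meanSquaredError and gradMSE are illustrative, not the tf.js API), fitting a single weight for y = w * x by gradient descent:

```javascript
const learningRate = 0.1;

// A single trainable "variable": the weight in y = w * x.
let w = 0;

// Training data generated by y = 3 * x.
const xs = [1, 2, 3, 4];
const ys = [3, 6, 9, 12];

// Mean squared error between predictions w * x_i and labels y_i.
function meanSquaredError(weight) {
  let sum = 0;
  for (let i = 0; i < xs.length; i++) {
    const diff = weight * xs[i] - ys[i];
    sum += diff * diff;
  }
  return sum / xs.length;
}

// Analytic gradient d(MSE)/dw = (2/N) * sum(x_i * (w*x_i - y_i)).
function gradMSE(weight) {
  let sum = 0;
  for (let i = 0; i < xs.length; i++) {
    sum += 2 * xs[i] * (weight * xs[i] - ys[i]);
  }
  return sum / xs.length;
}

// The "minimize" loop: each step recomputes the gradient of the
// loss and moves the trainable variable against it.
for (let step = 0; step < 100; step++) {
  w -= learningRate * gradMSE(w);
}

console.log(w.toFixed(4)); // converges toward 3
```

Because the loss is re-evaluated on every step against the current variable values, passing a precomputed tensor (a fixed number) would leave the optimizer nothing to differentiate.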

Does this solve the issue? Not completely. The error changes to something like:

variableGrads() expects at least one of the input variables to be trainable

What does this mean? The optimizer expects the function passed as its argument to contain variables whose values will be updated to minimize the function's output.
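The distinction can be sketched in plain Node.js (illustrative names only, not the tf.js API): the optimizer walks the values used by the loss and only updates those flagged trainable, mirroring the difference between a plain tf.tensor(...) and tf.tensor(...).variable():

```javascript
const learningRate = 0.1;

// Two values used by the loss; only one is marked trainable,
// mirroring tf.tensor(...) vs tf.tensor(...).variable().
const values = [
  { value: 5, trainable: false }, // like a plain tf.tensor
  { value: 0, trainable: true },  // like tensor.variable()
];

// Loss: squared distance between the two values.
const loss = () => (values[0].value - values[1].value) ** 2;

// One "minimize" step: numerical gradient on trainable values only.
function minimizeStep() {
  const trainables = values.filter(v => v.trainable);
  if (trainables.length === 0) {
    // This is the situation the tf.js error message describes.
    throw new Error(
      "variableGrads() expects at least one of the input variables to be trainable"
    );
  }
  const eps = 1e-6;
  for (const v of trainables) {
    const base = loss();
    v.value += eps;
    const grad = (loss() - base) / eps; // finite-difference gradient
    v.value -= eps;
    v.value -= learningRate * grad;
  }
}

for (let step = 0; step < 200; step++) minimizeStep();
console.log(values[1].value.toFixed(2)); // trainable value moves toward 5
```

If every value used by the loss were a plain (non-trainable) tensor, the filter would come up empty and the step would fail, which is exactly what the error reports.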

Here are the changes to be made:

var Y = tf.tensor([[0,0,0],[0,0,0],[1,1,1]]).variable(); // a variable instead
// var loss = tf.losses.meanSquaredError(Y, model) // computed below in the minimize function
const learningRate = 0.01;
var optimizer = tf.train.sgd(learningRate);
var model = RNN_FowardProp(X, a0, parameters);

const loss = (pred, label) => pred.sub(label).square().mean();
for (let f = 0; f < 10; f++) {
    optimizer.minimize(() => tf.losses.meanSquaredError(Y, model));
}
