Equivalent of Neural ODE for discrete-time state-space models

Here is an example of how to train a neural ODE to identify a dynamical system:
https://mathworks.com/help/deeplearning/ug/dynamical-system-modeling-using-neural-ode.html
This example covers continuous-time models. Is there an equivalent tutorial for discrete-time models?

Accepted Answer

Arkadiy Turevskiy on 31 Jan 2023
We added the idNeuralStateSpace object, which supports both continuous- and discrete-time models. Maybe this could be useful. It was created to simplify the code you have to write, so it does not let you write your own training loop, though.
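A minimal sketch of what that workflow might look like for a discrete-time model (the names and option values here are illustrative, not from the original answer; check the idNeuralStateSpace documentation for the exact interface):
% Sketch only: assumes System Identification Toolbox (R2022b or later)
% and measured input/output data u, y sampled at Ts = 0.1.
nx = 2;                                        % number of states (assumed)
nss = idNeuralStateSpace(nx,NumInputs=1,Ts=0.1);  % Ts > 0 => discrete time
opt = nssTrainingOptions("adam");              % training options object
opt.MaxEpochs = 300;                           % illustrative value
z = iddata(y,u,0.1);                           % package the measured data
nss = nlssest(z,nss,opt);                      % estimate the model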
1 Comment
Ben on 2 Feb 2023
Hi M.,
I'm not sure whether this is possible with the shallow network functions, but it can be done with dlnetwork and a custom training loop, since these allow you to write your own model function that reuses the same network on two different inputs. Here's some example code with dummy data; in practice you may need to tweak the training and network hyperparameters to get good performance.
% Share one neural net across multiple calls.
% Create some fake data:
% predict x(t+1) = F(x(1,t),u(1,t)) + F(x(2,t),u(2,t)) for some unknown F.
numSteps = 100;
t = linspace(0,2*pi,numSteps);
F = @(x,u) sqrt(x+u+1);
x = [0;1];
u = [cos(t);sin(t)];
for i = 2:numSteps
    x(:,i) = F(x(1,i-1),u(1,i-1)) + F(x(2,i-1),u(2,i-1));
end
% Create a network to model F.
% It needs two inputs, one for x and one for u.
hiddenSize = 5000;
inputSize = 1;
outputSize = 2;
layers = [
    featureInputLayer(inputSize,Name="x")
    concatenationLayer(1,2,Name="concat")
    fullyConnectedLayer(hiddenSize)
    reluLayer
    fullyConnectedLayer(outputSize)];
net = dlnetwork(layers,Initialize=false);
net = addLayers(net,featureInputLayer(1,Name="u"));
net = connectLayers(net,"u","concat/in2");
net = initialize(net);
% Train with a custom training loop.
numEpochs = 1000;
vel = [];
x = dlarray(x,"CB");
u = dlarray(u,"CB");
learnRate = 0.1;
for epoch = 1:numEpochs
    [loss,gradient] = dlfeval(@modelLoss,x,u,net);
    lossValue = extractdata(loss);
    fprintf("Epoch: %d, Loss %.4f\n",epoch,lossValue);
    [net,vel] = sgdmupdate(net,gradient,vel,learnRate);
end
function [loss,gradient] = modelLoss(x,u,net)
    % Predict x(:,2:end) from x(:,1:end-1) and u(:,1:end-1).
    xtarget = x(:,2:end);
    xpred = model(x(:,1:end-1),u(:,1:end-1),net);
    loss = mse(xtarget,xpred);
    gradient = dlgradient(loss,net.Learnables);
end

function xpred = model(x,u,net)
    % Model xpred = x(t+1) = f(x(1,t),u(1,t)) + f(x(2,t),u(2,t)),
    % where f is the shared neural net.
    xpred = forward(net,x(1,:),u(1,:)) + forward(net,x(2,:),u(2,:));
end
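To sanity-check the fit, one option is to reuse the model function above to roll the network forward from the initial state and compare the closed-loop prediction with the data. A minimal sketch, assuming x, u (as dlarrays), and the trained net are in the workspace; in a script file this would go before the local function definitions:
% Closed-loop rollout sketch: feed each prediction back in as the next state.
xsim = dlarray(zeros(2,numSteps),"CB");
xsim(:,1) = x(:,1);                       % start from the true initial state
for i = 2:numSteps
    xsim(:,i) = model(xsim(:,i-1),u(:,i-1),net);
end
plot(t,extractdata(x)',t,extractdata(xsim)',"--")  % data vs prediction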
Hope that helps.
