Noise parameters in Reinforcement learning DDPG
54 views (last 30 days)
Surya teja Tunuguntla
on 14 Jun 2019
Commented: Atikah Surriani
on 8 May 2023
What should the values of the noise parameters (for the agent) be in DDPG reinforcement learning if my action range is between -0.5 and -5 and I want to explore the whole action range at each sample time? Also, is there any way to make the noise options (for the agent) independent of the sample time?
0 Comments
Accepted Answer
Drew Davis
on 19 Jun 2019
Edited: Drew Davis
on 19 Jun 2019
Hi Surya
It is fairly common to have Variance*sqrt(SampleTime) somewhere between 1% and 10% of your action range for Ornstein-Uhlenbeck (OU) action noise. In your case the action range spans 4.5 (from -5 to -0.5), so the variance can be set between 4.5*0.01/sqrt(SampleTime) and 4.5*0.10/sqrt(SampleTime). The other important factor is the VarianceDecayRate, which dictates how fast the variance decays. You can calculate how many samples it takes for the variance to be halved with this simple formula:
halflife = log(0.5)/log(1-VarianceDecayRate)
It is critically important for your agent to explore while learning, so keeping the VarianceDecayRate small (or even zero) is a good idea. The other noise parameters can usually be left at their defaults.
The noise options inherit the agent's sample time, so you do not need to configure it separately. By default, the noise model is queried at the same rate as the agent.
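A minimal sketch of these settings, assuming the Reinforcement Learning Toolbox rlDDPGAgentOptions interface (the noise property names have changed across releases, so check yours):
Ts = 0.01;                        % agent sample time (example value)
actionRange = 4.5;                % spans -5 to -0.5, from the question
opt = rlDDPGAgentOptions('SampleTime',Ts);
% Target Variance*sqrt(SampleTime) at roughly 1-10% of the action range:
opt.NoiseOptions.Variance = 0.05*actionRange/sqrt(Ts);
% Keep the decay rate small (or zero) so the agent keeps exploring:
opt.NoiseOptions.VarianceDecayRate = 1e-5;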
Hope this helps
Drew
5 Comments
Drew Davis
on 9 Dec 2019
You can derive this formula pretty easily:
decayfactor = 0.5 = (1 - decayrate)^(#steps)
Taking the logarithm of both sides and solving for #steps gives halflife = log(0.5)/log(1 - decayrate).
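As a quick numeric check (using an arbitrary example decay rate):
decayrate = 1e-4;                          % example value
halflife = log(0.5)/log(1 - decayrate)     % roughly 6931 steps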
More Answers (1)
Atikah Surriani
on 30 Apr 2023
Can I change the noise model of DDPG using MATLAB? For example, the original DDPG uses OU noise, while my study aims to change it to Gaussian noise.
3 Comments
Atikah Surriani
on 8 May 2023
Thank you for the answer. So we can change the noise option of DDPG using MATLAB?
For example:
rl.option.OrnsteinUhlenbeckActionNoise
could we change it to something like "rl.option.gaussianActionNoise" or "rl.option.anythingActionNoise", or something else?
Thank you
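For what it's worth, a hedged sketch (not from this thread, and to be verified against your toolbox release): DDPG agents use OU noise through NoiseOptions, while Gaussian exploration noise (rl.option.GaussianActionNoise) is exposed by TD3 agents through their ExplorationModel property, so one route to Gaussian exploration is to switch to a TD3 agent:
opt = rlTD3AgentOptions('SampleTime',0.01);
% The default ExplorationModel is a GaussianActionNoise object:
opt.ExplorationModel.StandardDeviation = 0.1;            % example value
opt.ExplorationModel.StandardDeviationDecayRate = 1e-5;  % keep exploring
% (Older releases use Variance/VarianceDecayRate instead; check your docs.)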