Environment for Reinforcement Learning Project
GCats on 21 Jul 2020
Commented: Alberto Tellaeche on 20 Feb 2023
Hi everyone!
I'm currently looking to work on a small Reinforcement Learning project. Friends have recommended OpenAI Gym (https://gym.openai.com/envs/#classic_control), which provides many classical and non-classical control environments to which one can apply reinforcement learning rules. However, these are Python-based. Being a MATLAB user myself, I was wondering whether anyone knows of something like OpenAI Gym where I can download an environment (I'm interested in the Lunar Lander env, but it's not a strong preference) and apply RL rules easily.
I'd appreciate any tips!
0 Comments
Accepted Answer
Emmanouil Tzorakoleftherakis on 21 Jul 2020
Edited: Emmanouil Tzorakoleftherakis on 21 Jul 2020
Hello,
We are working on providing an interface between OpenAI Gym and Reinforcement Learning Toolbox, but this will take some more time. In the meantime, you could use community posts like this one to get an idea of how this could be accomplished. I have not personally tried the code in the link above, but it seems to be along the lines of what you are looking for.
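For a rough idea of the pattern, here is a minimal sketch (untested, and not the Toolbox interface) of driving a Gym environment from MATLAB through the Python bridge, assuming Python and the gym package are installed and visible to MATLAB via pyenv. CartPole-v0 is used to avoid the extra Box2D dependency that LunarLander-v2 needs; the calling pattern is the same.
% Sketch only: step a Gym environment from MATLAB with random actions.
env = py.gym.make('CartPole-v0');
env.reset();                             % start a new episode
isDone = false;
totalReward = 0;
while ~isDone
    action = int16(randi([0 1]));        % random discrete action: 0 or 1
    result = env.step(action);           % Python tuple: (obs, reward, done, info)
    reward = double(result{2});          % scalar reward
    isDone = result{3};                  % done flag
    totalReward = totalReward + reward;
end
env.close();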
Hope that helps.
2 Comments
John Adams on 29 Nov 2021
Hi Emmanouil,
When will this interface be ready?
I am currently trying to interface using the link you posted above, and it works fine for discrete-action problems, as in the example in the link, using "this.open_env.step(int16(Action));" for the discrete cart pole problem. However, for the continuous cart pole problem I get the following error when calling the step function as [this.open_env.step(double(Action));]:
Python Error: TypeError: 'float' object is not subscriptable
How can this problem be avoided?
Thx!
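A possible explanation (an assumption based on how Gym's continuous-action environments read the action, not verified against this exact setup): environments such as the continuous cart pole and Pendulum index the action internally (action[0]), so a MATLAB scalar that arrives on the Python side as a plain float cannot be subscripted. Passing the action as a one-element sequence may avoid this; a minimal sketch:
% Untested sketch: wrap the continuous action so the Python side receives a
% sequence it can index, instead of a bare float.
pyAction = py.list({double(Action)});    % 1x1 cell -> Python list, e.g. [0.5]
result   = this.open_env.step(pyAction);
If the environment expects a NumPy array specifically, py.numpy.array({double(Action)}) could be tried in the same way.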
Alberto Tellaeche on 20 Feb 2023
The same problem here: when actions are continuous, the "'float' object is not subscriptable" problem appears; whether I use a 'float' or cast the data to 'single', the error remains the same.
Thank you,
More Answers (0)