MATLAB Answers


How to check whether a Markov chain is irreducible?

Asked by Clarisha Nijman on 23 Oct 2018
Latest activity Commented on by Clarisha Nijman on 23 Oct 2018
Now I want to check in MATLAB whether a Markov chain is irreducible or not. I found some instructions on the MathWorks site saying:
tf1 = isreducible(mc1) %returns true if the discrete-time Markov chain mc1 is reducible and false otherwise.
but it seems that is not enough. Then I came across a part saying that the object should first be defined as a Markov chain, e.g. mc1 = dtmc(P).
But it still gives errors. Where can I find simple information/instructions about this topic? I am relatively new to MATLAB, so I do not know the right keywords. Does somebody have some advice for me?
Thank you in advance


That is the same document on MathWorks I referred to. This document wants me to define mc as a dtmc object, and the link on the page says a transition matrix is needed to do that. But is this the only way? Is it possible to give a series as input, and then check whether the series is irreducible or a proper Markov chain?
on 23 Oct 2018
A Markov chain is defined by its transition matrix. If you know how to transform your "series" (I don't know exactly what this series actually is) into a transition matrix, you are done.
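To illustrate, here is a minimal sketch of the workflow once a transition matrix is available. It assumes the Econometrics Toolbox is installed (dtmc and isreducible live there); the matrix P below is made up for the example, and its rows must each sum to 1:

```matlab
% Example 3-state transition matrix (illustrative values; rows sum to 1)
P = [0.5 0.5 0.0;
     0.2 0.6 0.2;
     0.0 0.3 0.7];

mc1 = dtmc(P);           % wrap the matrix in a discrete-time Markov chain object
tf1 = isreducible(mc1)   % logical: true if reducible, false if irreducible
```

If tf1 is false, the chain is irreducible (every state can reach every other state).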
Best wishes


1 Answer

Answer by Torsten
on 23 Oct 2018
 Accepted Answer

  1 Comment

Thanks a lot, Torsten!
So from the series (the sequence of states the Markov chain visited after n transitions), the transition probability matrix is composed, and then it can be checked whether the Markov chain is irreducible or not.
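The step described above can be sketched as follows: count the observed transitions in the series, row-normalize the counts into probabilities, and pass the result to dtmc/isreducible. The sequence x is made-up example data, states are assumed to be labeled 1..K, and the Econometrics Toolbox is assumed to be installed:

```matlab
% Assumed example series: states visited at steps 1..n
x = [1 2 2 3 1 2 3 3 2 1];

K = max(x);                % number of states (assumes labels 1..K all appear)
counts = zeros(K);
for t = 1:numel(x)-1
    counts(x(t), x(t+1)) = counts(x(t), x(t+1)) + 1;   % tally each transition
end

P = counts ./ sum(counts, 2);   % row-normalize counts into probabilities
mc = dtmc(P);                   % build the Markov chain object
tf = isreducible(mc)            % false => the estimated chain is irreducible
```

Note that if some state never appears as a starting point in the series, its row of counts is all zeros and the normalization produces NaN; such rows would need special handling (or more data) before calling dtmc.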
