# Transition probabilities in a Continuous-Time Markov Chain following Poisson processes

4 views (last 30 days)
JESUS on 18 Mar 2024
Answered: Sai Pavan on 25 Mar 2024
Let's imagine we have the following states, corresponding to the number of users in the system, in a continuous-time Markov chain:
[10] -- [11] -- [12] -- ... -- [17]
Arrivals of new users follow a Poisson process with rate \lambda, and service completions (users leaving the system) occur at rate \mu.
I want to compute the probability that, being in state [10], the system jumps to state [17]; that is, the probability that, with 10 users in the system, 7 more users arrive.
Is it possible to compute this?
Thanks!

### Answers (1)

Sai Pavan on 25 Mar 2024
Hello,
I assume you want to calculate the probability of a direct transition from state [10] to state [17], given that arrivals follow a Poisson process with rate \lambda, departures (service completions) occur at rate \mu, and the system's transitions are modeled by these rates.
In a Continuous-Time Markov Chain (CTMC) like the one described, transitions occur one at a time, either a single arrival or a single departure. This is because the Poisson processes for arrivals and departures are memoryless, and the probability of two or more simultaneous events in an infinitesimally small time interval is zero. A direct transition from 10 users to 17 users, without passing through the intermediate states [11] to [16], is therefore not possible in this framework: it would require 7 users to arrive at exactly the same instant, which contradicts the nature of a Poisson process, whose events occur independently and are scattered over time.
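What you can compute instead are two related quantities: (a) the probability that the next 7 transitions of the embedded jump chain are all arrivals (i.e. the chain walks 10 → 11 → ... → 17 before any departure), which for constant per-state rates is (\lambda/(\lambda+\mu))^7, and (b) the transient probability P(X(t) = 17 | X(0) = 10) over a time window t, obtained from the generator matrix. Here is a minimal Python sketch using the standard uniformization method; note that the example rates lam = 2 and mu = 1, the horizon t = 5, and the choice of a constant-rate birth-death chain truncated to states 10..17 are all my assumptions, not part of the original question:

```python
import numpy as np
from math import exp

lam, mu = 2.0, 1.0            # assumed example rates
states = list(range(10, 18))  # states [10] .. [17]
n = len(states)

# Generator Q of a birth-death chain with constant rates,
# truncated at the boundary states 10 and 17.
Q = np.zeros((n, n))
for i in range(n):
    if i + 1 < n:
        Q[i, i + 1] = lam     # arrival: one user joins
    if i - 1 >= 0:
        Q[i, i - 1] = mu      # departure: one user leaves
    Q[i, i] = -Q[i].sum()     # diagonal makes rows sum to zero

# (a) Embedded jump chain: probability that the next 7 jumps
# are all arrivals (valid when every interior state has the
# same arrival and departure rates).
p_seven_arrivals = (lam / (lam + mu)) ** 7

# (b) Transient probability P(X(t) = j | X(0) = i) by
# uniformization: P(t) = sum_k e^{-qt} (qt)^k / k! * P_unif^k.
def transient_prob(Q, t, i, j, K=200):
    q = max(-Q[k, k] for k in range(len(Q)))   # uniformization rate
    P_unif = np.eye(len(Q)) + Q / q
    Pt = np.zeros_like(Q)
    term = np.eye(len(Q))                      # P_unif^k, built iteratively
    w = exp(-q * t)                            # Poisson weight e^{-qt}(qt)^k/k!
    for k in range(K):
        Pt += w * term
        term = term @ P_unif
        w *= q * t / (k + 1)
    return Pt[i, j]

t = 5.0
p_t = transient_prob(Q, t, 0, n - 1)  # index 0 is state 10, index 7 is state 17
print(f"P(next 7 jumps are all arrivals) = {p_seven_arrivals:.4f}")
print(f"P(X({t}) = 17 | X(0) = 10)       = {p_t:.4f}")
```

With lam = 2 and mu = 1, the embedded-chain probability is (2/3)^7, roughly 0.059. Uniformization is used here instead of a matrix-exponential routine so the sketch needs only NumPy; `scipy.linalg.expm(Q * t)` would give the same transition matrix.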
Hope it helps!
