● What is a discrete Markov chain?: A discrete Markov chain is a probabilistic finite automaton that occupies exactly one discrete state at any moment in time.
● What is its key property?: Only the current state matters for the transition to the next state; the earlier history is irrelevant (the Markov property).
● Draw an example of a Markov chain (e.g., flea hopping):
● Draw the transition matrix for your example:
● Why do the rows of the transition matrix need to sum to 1?: The transition matrix gives the probability of each transition from each state to every other state. A transition out of the current state (possibly back to itself) is mandatory, and all possible outcomes are listed. If one event occurs and we know all of its discrete outcomes, their probabilities must sum to one; that is simply how probability works. Therefore each row of the transition matrix sums to 1.
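As a minimal sketch of the ideas above, here is a hypothetical 3-state "flea hopping" chain (the matrix values are illustrative assumptions, not from the notes), with a check that every row sums to 1 and a single simulated transition that depends only on the current state:

```python
import numpy as np

# Hypothetical flea-hopping example: the flea sits on one of three dogs
# (states 0, 1, 2) and hops according to these made-up probabilities.
P = np.array([
    [0.1, 0.6, 0.3],   # transitions out of state 0
    [0.4, 0.4, 0.2],   # transitions out of state 1
    [0.5, 0.0, 0.5],   # transitions out of state 2
])

# Every row must sum to 1: from each state, *some* transition happens.
assert np.allclose(P.sum(axis=1), 1.0)

# Simulate one step: given current state s, the next state is drawn
# from row s only -- the Markov property in action.
rng = np.random.default_rng(0)
state = 0
state = rng.choice(3, p=P[state])
```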
● What does the π vector denote?: The π vector gives the probabilities of starting in the different states, i.e., it contains the distribution of X_0.
● What is the equilibrium distribution?: The equilibrium distribution is a distribution $\pi$ with the property $\pi^\top P = \pi^\top$. That means the distribution does not change over time, and all $X_t$ have the same distribution. Once a Markov chain reaches the equilibrium distribution, it stays there.
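A quick way to see this numerically is to start from an arbitrary distribution and repeatedly apply $\pi^\top \leftarrow \pi^\top P$ until it stops changing (a sketch using the same assumed 3-state matrix as the flea example; power iteration is one way to find an equilibrium, not the only one):

```python
import numpy as np

# Illustrative 3-state transition matrix (an assumption, not from the notes).
P = np.array([
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
    [0.5, 0.0, 0.5],
])

# Start from an arbitrary initial distribution and iterate pi^T <- pi^T P.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    pi = pi @ P

# At equilibrium, pi^T P = pi^T: one more step changes nothing,
# and pi is still a valid probability distribution.
assert np.allclose(pi @ P, pi)
assert np.isclose(pi.sum(), 1.0)
```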
● What is a continuous-time Markov chain?: Transitions/switching between states happen at random points in time, not at discrete clock ticks (as in a CPU, for example)!
● How do we compute a transition matrix for a continuous-time Markov chain, P(t)?: The transition matrix is a function of time, P(t) (a function that returns a matrix for each t). We can compute the derivative of this function: $P'(t) = Q P(t)$, which is solved by $P(t) = \exp(Qt)$, where Q is the jump-rate matrix. Q can be obtained from the limit $Q = \lim_{h \to 0} \frac{P(h) - I}{h}$, i.e., the derivative of P at t = 0.
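A minimal sketch of P(t) = exp(Qt), assuming a hypothetical 2-state jump-rate matrix Q (rows of Q sum to 0, since the rate of leaving a state equals the total rate of arriving elsewhere). The matrix exponential is computed here with a truncated Taylor series for self-containedness; in practice one would use a library routine such as `scipy.linalg.expm`:

```python
import numpy as np

# Hypothetical jump-rate matrix Q for a 2-state chain (illustrative values):
# off-diagonal entries are jump rates; each row sums to 0.
Q = np.array([
    [-2.0,  2.0],
    [ 1.0, -1.0],
])

def P_of_t(t, terms=40):
    """P(t) = exp(Q t), computed via a truncated Taylor series sum_k (Qt)^k / k!."""
    A = Q * t
    result = np.eye(2)
    term = np.eye(2)
    for k in range(1, terms):
        term = term @ A / k
        result = result + term
    return result

# Sanity checks: P(0) is the identity, and every P(t) is a valid
# (row-stochastic) transition matrix, so its rows sum to 1.
assert np.allclose(P_of_t(0.0), np.eye(2))
assert np.allclose(P_of_t(1.5).sum(axis=1), 1.0)
```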