Markov Chain

A Markov chain, named after Andrey Markov, is a mathematical system that experiences transitions from one state to another among a finite or countable number of possible states. A Markov chain is a random process usually characterized as memoryless: the next state depends only on the current state and not on …
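The memoryless property described above can be sketched in a few lines of Python. The two weather states and their transition probabilities below are illustrative assumptions, not taken from the post:

```python
import random

# Illustrative two-state chain; states and probabilities are assumptions.
STATES = ["sunny", "rainy"]
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state, rng):
    """Pick the next state using ONLY the current state (memorylessness)."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n, seed=0):
    """Return a sample path of n transitions starting from `start`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(simulate("sunny", 5))
```

Note that `step` never looks at the earlier history of the path, which is exactly what "the next state depends only on the current state" means.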


Stochastic Processes Introduction (2012)

Before introducing stochastic processes, let us begin with some important definitions from statistics. Experiment: any activity or situation having an uncertain outcome. Sample Space: the set $\Omega$ of all possible outcomes is called the sample space, and every element $\omega$ of $\Omega$ is called a sample …
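These definitions can be made concrete with a small sketch. The experiment chosen here (rolling two dice) is an illustrative assumption:

```python
from itertools import product

# Experiment: roll two dice. The sample space Omega is the set of all
# outcomes omega, i.e. all ordered pairs (die1, die2).
omega_space = set(product(range(1, 7), repeat=2))
print(len(omega_space))  # 36 possible outcomes

# An event is a subset of the sample space, e.g. "the sum is 7":
event = {w for w in omega_space if sum(w) == 7}
print(sorted(event))
```

Here each pair such as `(3, 4)` plays the role of a sample point $\omega$, and the full set of 36 pairs is $\Omega$.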
