Continuity in probability
In probability theory, a stochastic process is said to be continuous in probability, or stochastically continuous, if its values converge in probability whenever the corresponding points of the index set converge.
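Formally, the condition amounts to convergence in probability along the index set. A minimal LaTeX sketch of the statement, assuming a process X = (X_t)_{t in T} with values in R^n and a metric d on the index set T (the symbols X, T, and d are notational assumptions, not taken from the text above):

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Continuity in probability: X_s converges to X_t in probability as s -> t.
Let $X = (X_t)_{t \in T}$ be a stochastic process with values in $\mathbb{R}^n$,
where the index set $T$ carries a metric $d$. The process $X$ is
\emph{continuous in probability} at $t \in T$ if, for every $\varepsilon > 0$,
\[
  \lim_{d(s,t) \to 0} \Pr\bigl(\lVert X_s - X_t \rVert > \varepsilon\bigr) = 0,
\]
that is, $X_s \to X_t$ in probability as $s \to t$. If this holds at every
$t \in T$, the process is said to be continuous in probability.
\end{document}

Since convergence in probability implies convergence in distribution, a process that is continuous in probability also has distributions that converge along the index set, which is the looser reading of the lead sentence.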