What is a Markov process called if it has a finite number of states?
A) Continuous Markov process
B) Discrete Markov process
C) Infinite-state Markov chain
D) Limited transition chain
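To make the terminology concrete, a Markov process with a finite (discrete) set of states can be sketched as a simple simulation. The two weather states and the transition probabilities below are illustrative assumptions, not part of the original question:

```python
import random

# Illustrative finite-state Markov chain with two assumed states.
STATES = ["sunny", "rainy"]

# Transition probabilities P[i][j] = Pr(next = j | current = i); each row sums to 1.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state given the current one."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in P[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Run the chain for n steps from `start` and return the visited states."""
    rng = random.Random(seed)
    state = start
    path = [state]
    for _ in range(n):
        state = step(state, rng)
        path.append(state)
    return path
```

Because the state space is finite, the whole process is described by a fixed transition matrix, which is what distinguishes a discrete (finite-state) Markov chain from a continuous-state process.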