What is a Markov process called if it has a finite number of states?

A) Continuous Markov process

B) Discrete Markov process

C) Infinite-state Markov chain

D) Limited transition chain

Best Answer

B) Discrete Markov process

A Markov process whose state space is finite (or more generally countable) is called a discrete(-state) Markov process, commonly referred to as a Markov chain.
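
For intuition, here is a minimal Python sketch of such a finite-state chain. The weather states and transition probabilities below are made-up assumptions for illustration, not part of the question: the point is simply that the whole process is described by a finite set of states and a transition matrix.

```python
import random

# A minimal sketch of a finite-state (discrete) Markov chain.
# The states and probabilities are illustrative assumptions.
states = ["Sunny", "Cloudy", "Rainy"]   # finite set of states
transition = [                          # row i: P(next state | current state i)
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.5, 0.3],
]

def simulate(start, n_steps, seed=0):
    """Walk the chain for n_steps and return the visited state names."""
    rng = random.Random(seed)
    i = states.index(start)
    path = [start]
    for _ in range(n_steps):
        # Pick the next state according to the current row of the matrix.
        i = rng.choices(range(len(states)), weights=transition[i])[0]
        path.append(states[i])
    return path

print(simulate("Sunny", 5))
# e.g. ['Sunny', 'Sunny', 'Cloudy', 'Rainy', 'Cloudy', 'Cloudy']
```

Because the state space is finite, each row of the transition matrix sums to 1 and the whole dynamics fit in a small table, which is exactly what distinguishes a discrete (finite-state) Markov process from a continuous-state one.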
