Math Is Fun Forum

  Discussion about math, puzzles, games and fun.


#1 Re: Help Me ! » Markov chain versus Markov Process » 2010-06-28 03:50:33

Thanks a lot for the info. I'll grab the Finite Mathematics textbook and go from there.

#2 Help Me ! » Markov chain versus Markov Process » 2010-06-21 03:28:39

Ramze
Replies: 3

Dear Friends,

I was wondering how to distinguish between a Markov chain and a Markov process, with some examples if possible. I was working on building a formulation for rain-occurrence probability (e.g. what is the probability of rain today given that it rained yesterday, and similarly for the other combinations: day 1 rain / day 2 no rain, day 1 no rain / day 2 rain, etc.). From what I have read, this can be modeled with a first-order Markov chain. What about higher orders (second or above)? How can it be solved in that case?
I know this is a lot of questions, but I believe the answers may clarify a lot for me and for others who are confused about this process. Thank you in advance for your help.

Ramze
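For concreteness, the first-order rain model described in the question can be sketched in Python. This is a minimal illustration, not a fitted model: the transition probabilities below are made-up values, and the function names are my own. A second-order chain (conditioning on the last two days) can always be reduced to a first-order chain on the expanded state space of day-pairs, which is one standard way to handle higher orders.

```python
import random

# Illustrative transition probabilities (made-up, not fitted to data).
# States: "rain" and "dry". P[yesterday][today] = P(today | yesterday).
P = {
    "rain": {"rain": 0.7, "dry": 0.3},
    "dry":  {"rain": 0.2, "dry": 0.8},
}

def simulate(start, days, rng=random.random):
    """Simulate a sequence of daily states from the first-order chain."""
    seq = [start]
    for _ in range(days - 1):
        state = seq[-1]
        seq.append("rain" if rng() < P[state]["rain"] else "dry")
    return seq

def stationary_rain_prob(p_rain_given_rain, p_rain_given_dry):
    """Long-run fraction of rainy days for a two-state chain.

    Solving pi = pi*p + (1 - pi)*q with p = P(rain|rain),
    q = P(rain|dry) gives pi = q / (1 - p + q).
    """
    p, q = p_rain_given_rain, p_rain_given_dry
    return q / (1 - p + q)
```

With the example numbers above, `stationary_rain_prob(0.7, 0.2)` gives 0.4, i.e. in the long run it rains on 40% of days, regardless of the starting state. For a second-order model, the same `simulate` idea applies once the state is redefined as the pair (two days ago, yesterday), giving a 4-state first-order chain.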
