
Author Topic: Markov Chains - help! and also a surprise for you all ;)  (Read 600 times)


menashiiii

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 354
  • mega fresh
  • Respect: +1
Markov Chains - help! and also a surprise for you all ;)
« on: November 07, 2010, 04:57:31 pm »
0
seriously screwed if any Markov chain questions come up!
can someone please run through the basics?
please!

and also
guys, check this out ;)

the year 12 video my mate and I made :D

http://www.youtube.com/watch?v=KbECymuAEXM
2011: BSc at UniMelb

Elnino_Gerrard

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 353
  • Respect: +6
Re: Markov Chains - help! and also a surprise for you all ;)
« Reply #1 on: November 07, 2010, 05:10:43 pm »
0
Haha, could defs have done without the last bit tho :P
2010 VCE ATAR : 98.35

lachymm

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 379
  • Respect: +1
Re: Markov Chains - help! and also a surprise for you all ;)
« Reply #2 on: November 07, 2010, 05:18:11 pm »
0
you were having phaggy time?

EDIT: i spose i should tell you that markov chains are simple. from what i understand, in Further they're usually set up as a 2 x 2 transition matrix.

Let's say someone goes to the beach 40% of the time if they went to the pool last week, and goes to the pool 55% of the time if they went to the beach last week... (made up)

Then the top left is the probability of going to the pool given they just went to the pool, while the bottom right is the probability of going to the beach given they just went to the beach. Fill in the rest of each column, since each column must add up to 1 (so here the top left is 1 - 0.4 = 0.6 and the bottom right is 1 - 0.55 = 0.45).

Then add in the initial state, e.g. whether the person went to the beach or the pool initially, raise the transition matrix we figured out before to the appropriate power, multiply by that initial state, and you're right to go :D
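
A minimal sketch of that recipe in Python (assuming numpy, and using the made-up pool/beach numbers above; the matrix layout and variable names are just for illustration):

Code:
import numpy as np

# Transition matrix for the made-up pool/beach example above.
# Columns = last week's activity [pool, beach], rows = this week's.
# P(beach | pool) = 0.40, so P(pool | pool)   = 0.60 (top left)
# P(pool | beach) = 0.55, so P(beach | beach) = 0.45 (bottom right)
T = np.array([[0.60, 0.55],
              [0.40, 0.45]])

# Initial state: say they went to the pool in week 0.
S0 = np.array([1.0, 0.0])          # [pool, beach]

# State after n weeks: S_n = T^n * S_0
n = 3
Sn = np.linalg.matrix_power(T, n) @ S0
print(Sn)                          # [P(pool), P(beach)] after 3 weeks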
« Last Edit: November 07, 2010, 05:26:43 pm by lachymm »
2009 Further Mathematics [41]

Enter 95+

menashiiii

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 354
  • mega fresh
  • Respect: +1
Re: Markov Chains - help! and also a surprise for you all ;)
« Reply #3 on: November 07, 2010, 07:20:01 pm »
0
hahaha
that's awesome but!
ohh boy they were ;)

thanks dude!

but what if the question says over a long period of time / in the long term?
how do i work that out?
2011: BSc at UniMelb

JinXi

  • Victorian
  • Forum Leader
  • ****
  • Posts: 818
  • Respect: +90
  • School: Camberwell High School
  • School Grad Year: 2010
Re: Markov Chains - help! and also a surprise for you all ;)
« Reply #4 on: November 07, 2010, 07:27:56 pm »
0
Raise the 2x2 transition matrix to the power n, and let n be a large value for the long term.
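
Roughly, in the same Python/numpy style as before (same made-up pool/beach matrix, purely illustrative): as n gets large, every column of T^n settles to the same steady-state probabilities, which is what the "long term" questions are after.

Code:
import numpy as np

# Same made-up pool/beach transition matrix as above.
T = np.array([[0.60, 0.55],
              [0.40, 0.45]])

# Raise T to increasingly large powers; the columns converge.
for n in (1, 5, 50):
    print(n)
    print(np.linalg.matrix_power(T, n))

# For a 2x2 matrix the steady state can also be read off directly:
# long-run P(pool) = P(pool|beach) / (P(beach|pool) + P(pool|beach))
p_pool = 0.55 / (0.40 + 0.55)
print([p_pool, 1 - p_pool])        # roughly [0.579, 0.421]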
Monash B.Aero Eng/Sci Discontinued in Sem2 2012 [2011-2015]

"I will always choose a lazy person to do a difficult job… because, he will find an easy way to do it." ~ Bill Gates
^ SNORLAX, I chooosee You!!!