
Author Topic: Markov Chains Help!  (Read 544 times)


pinklemonade

  • Victorian
  • Trendsetter
  • **
  • Posts: 135
  • Respect: 0
  • School Grad Year: 2015
Markov Chains Help!
« on: August 18, 2014, 06:03:32 pm »
0
The manager of a snow resort has noticed that, if it snows on a given day, there is a 70% chance that it will snow the following day. If it does not snow, there is only a 30% chance that it will snow the following day. John arrived on Saturday when the weather was sunny and clear.
a.) What is the probability that he will have fresh snow the following Tuesday?

For this question, I understand how to set up the transition matrix, but I just don't know how to find out what to multiply it by.

If anyone could help, it would be much appreciated!
"Hard work beats talent when talent doesn't work hard"

2015: English [??] | Business Management [??] | Visual Communication and Design [??] | Mathematical Methods (CAS) [??] | Specialist Mathematics [??]

dcc

  • Victorian
  • Part of the furniture
  • *****
  • Posts: 1198
  • Respect: +55
  • School Grad Year: 2008
Re: Markov Chains Help!
« Reply #1 on: August 18, 2014, 06:36:18 pm »
0
What does that matrix actually mean? How should it be interpreted?

Tyleralp1

  • Victorian
  • Forum Obsessive
  • ***
  • Posts: 450
  • Braaaaaaap
  • Respect: +12
Re: Markov Chains Help!
« Reply #2 on: August 18, 2014, 06:42:28 pm »
+1
I'm assuming you've set up your transition matrix with the columns as snow and sunny, and the rows as snow and sunny too.

The initial state matrix represents the probability of the first event. Since we know Saturday was sunny, there's a 100% probability of sunny, so with the (snow, sunny) ordering the initial state matrix is:
0
1

Conversely, if it had snowed on Saturday, there would be a 100% chance of snow, so the initial matrix would be:
1
0

Hope that helps :)
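To sanity-check this setup numerically, here is a quick sketch (using NumPy, which isn't mentioned in the thread). It assumes the states are ordered (snow, sunny), so the transition matrix entry in row i, column j is the probability of moving to state i tomorrow given state j today, and a sunny Saturday gives an initial state vector of (0, 1). Saturday to Tuesday is three transitions.

```python
import numpy as np

# Transition matrix with states ordered (snow, sunny):
# entry [i, j] = P(tomorrow is state i | today is state j).
# From the question: P(snow | snow) = 0.7, P(snow | sunny) = 0.3.
T = np.array([[0.7, 0.3],
              [0.3, 0.7]])

# Saturday was sunny and clear, so the initial state vector is (0, 1).
s0 = np.array([0.0, 1.0])

# Saturday -> Sunday -> Monday -> Tuesday is three steps,
# so apply the transition matrix three times.
s_tuesday = np.linalg.matrix_power(T, 3) @ s0

print(s_tuesday[0])  # P(snow on Tuesday) ≈ 0.468
```

So under this setup the probability of fresh snow on the following Tuesday comes out to about 0.468.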
The GOAL: Attain a RAW study score of 40+ in all my subjects.

Courses I would like to study in order of preference include: Bachelor of Medicine/Bachelor of Surgery (MBBS), Bachelor of Biomedicine or Bachelor of Science.

2014: Biology [42]
2015: English Language [??] | Chemistry [??] | Physics [??] | Mathematical Methods (CAS) [??] | Specialist Mathematics [??]

pinklemonade

  • Victorian
  • Trendsetter
  • **
  • Posts: 135
  • Respect: 0
  • School Grad Year: 2015
Re: Markov Chains Help!
« Reply #3 on: August 18, 2014, 07:18:54 pm »
0
Quote from: Tyleralp1 on August 18, 2014, 06:42:28 pm

Thank you!! Helped a lot :)
"Hard work beats talent when talent doesn't work hard"

2015: English [??] | Business Management [??] | Visual Communication and Design [??] | Mathematical Methods (CAS) [??] | Specialist Mathematics [??]