ATAR Notes: Forum
VCE Stuff => VCE Science => VCE Mathematics/Science/Technology => VCE Subjects + Help => VCE Psychology => Topic started by: betruetoyou22 on September 23, 2012, 03:10:10 pm
-
I completely don't understand what the difference between them is and how they're related to gambling. Can someone please explain it in a simple way?
-
A variable-ratio (VR) schedule is one where the reinforcer is given after an unpredictable number of correct responses, but the number of responses required varies around a constant mean. (Compare this with a random-ratio (RR) schedule, where reinforcement is delivered at random and every response has an equal chance of being reinforced.) This schedule is very effective at maintaining behaviour, because the uncertainty keeps the organism responding in the desired way. Example: poker machines.
-
For RR, what do you mean by "each response has an equal chance of being reinforced"? Can you actually give me an example?
-
Well, VR has a constant mean. For example, out of every 10 responses, an average of 5 will be reinforced no matter what, but those 5 occur in an unpredictable order. With RR there is no fixed mean, so every single response COULD be reinforced at any given time; there is no limit like there is with VR. I think I worded the 'equal chance' thing a bit oddly. Sorry about that. :o
Example: on a poker machine with a VR schedule, the behaviour (playing the slots game) is only reinforced when the person wins, and that win occurs on average 3 times in every 500 games played. A poker machine running on an RR schedule may present a winning spin at ANY time, so instead of there being 3 wins out of 500, there could be any number, say 400 out of 500, or 1 out of 500. Because the ratio is not fixed, each response has a CHANCE of being reinforced.
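If it helps to see the difference concretely, here is a rough simulation sketch (not part of the course material; the function names and the uniform draw used for the VR required-count are my own assumptions). VR reinforces after a random number of responses that averages out to a fixed mean; RR gives every response the same independent chance. Over many responses, both end up reinforcing at roughly the same overall rate:

```python
import random

def variable_ratio(n_responses, mean=5, seed=0):
    """Simulate a VR schedule: the reinforcer comes after an unpredictable
    number of responses, but the required count is drawn so that it
    averages out to `mean` (here: uniform between 1 and 2*mean - 1)."""
    rng = random.Random(seed)
    reinforced = []
    required = rng.randint(1, 2 * mean - 1)
    count = 0
    for _ in range(n_responses):
        count += 1
        if count >= required:           # enough responses made: reinforce
            reinforced.append(True)
            count = 0
            required = rng.randint(1, 2 * mean - 1)  # new random target
        else:
            reinforced.append(False)
    return reinforced

def random_ratio(n_responses, mean=5, seed=0):
    """Simulate an RR schedule: every response independently has the
    same 1-in-`mean` chance of being reinforced, with no fixed mean
    number of responses between reinforcers."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean for _ in range(n_responses)]

vr = variable_ratio(10_000)
rr = random_ratio(10_000)
print("VR reinforcement rate:", sum(vr) / len(vr))  # close to 1/5
print("RR reinforcement rate:", sum(rr) / len(rr))  # also close to 1/5
```

Both rates hover around 1 in 5, but only the RR sequence is made of fully independent trials; in the VR sequence the gaps between wins are tied to the constant mean.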
-
Thanks for that!! :) Actually makes sense now
-
I'm so glad I could help!! :)