Law of Effect
Thorndike suggested that responses closely followed by satisfaction will become firmly attached to the situation and therefore more likely to recur when the situation is repeated. Conversely, if the response is followed by discomfort, its connection to the situation will become weaker, and the response is less likely to occur when the situation is repeated.1 Imagine that you arrive early to work
one day by accident. Your boss notices and praises your diligence. The praise makes you feel good, so it
reinforces the behavior. You start showing up for work a little bit early each day to keep receiving your
boss’s commendations. Because a pleasing consequence followed the behavior, the action became
more likely to be repeated in the future.
More Examples
If you study and then get a good grade on a test, you will be more likely to study for the next exam. If you work hard and then receive a promotion and a pay raise, you will be more likely to continue putting in effort at work. If you run a red light and then get a traffic ticket, you will be less likely to disobey traffic lights in the future.
Discovery
While we
often associate the idea that consequences lead to changes in behavior with the process of operant
conditioning and B.F. Skinner, this notion has its roots in the early work of psychologist Edward
Thorndike.2 In his experiments, Thorndike used what are known as puzzle boxes to study how animals
learn. The boxes were enclosed but contained a small lever that, when pressed, would allow the animal
to escape. Thorndike would place a cat inside the puzzle box and then place a piece of meat outside the
box. He would then observe the animal’s efforts to escape and obtain the food. He recorded how long
each animal took to figure out how to free itself from the box. Eventually, the cats would press the lever,
and the door would open so that the animal could receive the reward. Even though the first lever press occurred simply by accident, the cats became more likely to repeat it because they had received a reward immediately after performing the action. Thorndike noted that with each trial, the cats became
much faster at opening the door. Because pressing the lever had led to a favorable outcome, the cats
were much more likely to perform the behavior again in the future.1 Thorndike termed this the “Law of
Effect,” which suggested that when satisfaction follows an association, it is more likely to be repeated. If
an unfavorable outcome follows an action, then it becomes less likely to be repeated. There are two key aspects of the law of effect:1
1. Behaviors immediately followed by favorable consequences are more likely to occur again. In our earlier example, being praised by a supervisor for showing up early for work made it more likely that the behavior would be repeated.
2. Behaviors followed by unfavorable consequences are less likely to occur again. If you show up late for work and miss an important meeting, you will probably be less likely to show up late again in the future. Because you view the missed meeting as a negative outcome, the behavior is less likely to be repeated.
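To make this trial-and-error strengthening concrete, here is a small, purely illustrative simulation of a Thorndike-style puzzle box. Everything in it is an assumption made for the sketch rather than part of Thorndike's actual procedure: the list of candidate actions, the single "inside the box" stimulus context, the LEARNING_RATE value, and the run_trial helper are all hypothetical. The only point it demonstrates is that rewarding the lever press strengthens its connection to the situation, so escape takes fewer attempts on later trials.

```python
import random

# Purely illustrative sketch: a toy simulation of a Thorndike-style puzzle box.
# The action list, the single "inside the box" context, and the learning rate
# are assumptions made for this example, not part of Thorndike's procedure.

ACTIONS = ["press lever", "paw at door", "meow", "scratch wall", "pace"]
LEVER = "press lever"
LEARNING_RATE = 0.5  # assumed value: how strongly a reward boosts the S-R bond

# S-R associative strengths for the single stimulus context "inside the box"
strength = {action: 1.0 for action in ACTIONS}

def run_trial():
    """Sample actions in proportion to their strength until the cat escapes."""
    attempts = 0
    while True:
        attempts += 1
        action = random.choices(ACTIONS, weights=list(strength.values()))[0]
        if action == LEVER:
            # A satisfying consequence (escape plus food) follows the response,
            # so its connection to the situation is strengthened.
            strength[LEVER] *= 1 + LEARNING_RATE
            return attempts

random.seed(0)
for trial in range(1, 11):
    print(f"trial {trial:2d}: escaped after {run_trial():3d} attempts")
```

With these assumed numbers, the printed attempt counts trend downward across the ten trials, mirroring Thorndike's observation that the cats escaped faster and faster. Weakening a response after an unfavorable consequence would work the same way, with the strength decreased instead of increased.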
Influence on Behaviorism
Thorndike’s discovery had a major influence on the development of behaviorism. B.F. Skinner based his
theory of operant conditioning on the law of effect. Skinner even developed his own version of a puzzle
box, which he referred to as an operant conditioning chamber (also known as a Skinner box).3 In operant
conditioning, behaviors that are reinforced are strengthened, while those that are punished are
weakened. The law of effect thus helped lay the groundwork for behaviorism, which went on to become the dominant school of thought in psychology for much of the 20th century.4
Definition
The Law of Effect is a specific mechanism of goal-directed or instrumental behavior. According to the
Law of Effect, a response that results in a positive or desirable outcome is more likely to occur in the
future because the positive outcome strengthens an association between the response (R) and the
stimulus context (S) in which the response occurred. Importantly, the response outcome or goal is not itself part of the S-R association that is responsible for future occurrences of the response.
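For readers who prefer a compact statement, the definition above can be sketched as a simple update rule. The notation is an illustrative assumption rather than Thorndike's own: V(S, R) stands for the strength of the S-R bond, r indicates whether a satisfying outcome followed the response, and α is a learning-rate parameter.

```latex
% Illustrative notation only; V, r, and \alpha are assumed symbols.
% V(S,R): strength of the association between stimulus context S and response R
% r:      1 if the response was followed by a satisfying outcome, 0 otherwise
% \alpha: learning-rate parameter
\[
  V(S,R) \;\leftarrow\; V(S,R) + \alpha\, r
\]
% The probability of emitting R the next time S recurs grows with V(S,R), e.g.
\[
  P(R \mid S) = \frac{V(S,R)}{\sum_{R'} V(S,R')}
\]
% The outcome r only gates the size of the update; it is not stored in the
% S-R association itself, consistent with the definition above.
```

This is the same bookkeeping used in the toy simulation earlier: the outcome changes how strong the bond becomes, but only the stimulus and the response are part of the learned association.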