Conditioning is one of those terms people understand generally but not specifically. I hear it used incorrectly by ‘dog trainers’ who can’t explain a behaviour, much less change it. Here is a clinical definition of conditioning:
In the experimental study of animal behaviour, conditioning refers to the changes in an animal’s behaviour that arise when it learns the association between two correlated events, which we will call Event 1 and Event 2. As the animal learns the Event 1-Event 2 association, the conditioned response emerges in its behavioural repertoire. This response is then observed to extinguish if the original association between events is broken (i.e. if Events 1 and 2 no longer occur together).
Conditioning has been divided into two main types – classical conditioning (also known as Pavlovian conditioning) and instrumental conditioning (also known as operant conditioning). The two types differ at a basic operational level. In classical conditioning, the animal is exposed to correlations between external events. For example, the animal is presented with a neutral stimulus (a light or a noise) that is followed by a biologically important stimulus (e.g. a noxious stimulus such as a shock, or a positive stimulus such as the delivery of food). The initially neutral stimulus, known as the conditioned stimulus (CS), comes to provoke a response as a consequence of being paired temporally with the intrinsically important unconditioned stimulus (US). The classic example is the preparation studied by Pavlov himself. Pavlov’s dogs would hear a noise (CS) that was reliably followed by delivery of meat powder (US) into the mouth. After several such pairings of the CS and US, the dogs would come to salivate in response to hearing the noise. Salivation would then extinguish if the noise were repeatedly presented to the dogs without the subsequent arrival of food.

In instrumental conditioning, the animal learns to produce a specified response because that response is associated with a positive or negative US (such as shock or food). For example, in the preparation developed by Skinner, hungry rats learn to press a bar (the instrumental or conditioned response) to receive a food pellet (the reward, the US). As in classical conditioning, the response extinguishes if the US is withheld.
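This acquisition-then-extinction pattern can be made concrete with a toy numerical sketch. The following is purely an illustration, not a model drawn from this entry: a simple error-correction (delta) rule of the kind often used to describe classical conditioning, in which an associative strength V grows across CS-US pairings and decays again across CS-alone trials. The learning rate and asymptote are arbitrary illustrative values.

# Illustrative only: a delta-rule sketch of acquisition and extinction.
# V is the strength of the CS-US association; alpha is an arbitrary
# learning rate; lam is the asymptote supported by the US.

def delta_rule_trial(V, us_present, alpha=0.1, lam=1.0):
    """One trial: V moves toward lam if the US follows the CS, toward 0 if not."""
    target = lam if us_present else 0.0
    return V + alpha * (target - V)

V = 0.0
for _ in range(50):                        # acquisition: CS reliably followed by the US
    V = delta_rule_trial(V, us_present=True)
print(f"after acquisition, V = {V:.2f}")   # close to 1.0: conditioned response established

for _ in range(50):                        # extinction: CS presented without the US
    V = delta_rule_trial(V, us_present=False)
print(f"after extinction, V = {V:.2f}")    # back near 0.0: the response extinguishes

On this sketch the conditioned response simply tracks V: it emerges as the pairing is repeated and disappears once the pairing is broken, which is the pattern described in the definition above.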
In addition to learning to make a specified response (like pressing a bar) to obtain a positive reinforcer (such as food), animals can learn to make such responses to prevent the occurrence of a negative US such as a shock (i.e. a punisher). In the early part of the 20th century, the separation of classical and instrumental conditioning had as much to do with geography as science. Pavlov and his disciples in Europe were devoted to the study of classical conditioning, while Thorndike and his disciples in North America were almost exclusively interested in instrumental conditioning.

However, the two share more than just the term ‘conditioning’. In both there must be some contingency between the CS or instrumental response and the US – that is, the two events must be correlated such that the likelihood of the US occurring is increased in the presence of the CS or instrumental response. Thus, conditioning is retarded or prevented if the contingency is reduced by the occurrence of either event without the other (i.e. the CS or instrumental response occurring without the US, or the US occurring without being preceded by the CS or instrumental response). Similarly, conditioning in both cases is sensitive to the contiguity between the events – conditioning is retarded if the temporal gap between the two events is increased. Both are subject to extinction if the US is withheld, and extinguished responding can recover spontaneously if the animal spends time removed from the experimental apparatus, or can be reinstated if the US is presented on its own (without the CS or instrumental response).
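The contingency idea in the preceding paragraph is often expressed as the difference between two probabilities: the probability of the US given the CS (or the instrumental response) and the probability of the US in its absence. The counts below are hypothetical numbers invented purely for illustration; they are not data from this entry.

# Hypothetical trial counts, for illustration only.
cs_with_us    = 18   # CS presentations followed by the US
cs_without_us = 2    # CS presentations not followed by the US
us_alone      = 3    # US deliveries with no preceding CS
neither       = 17   # comparable periods with neither CS nor US

p_us_given_cs    = cs_with_us / (cs_with_us + cs_without_us)   # 0.90
p_us_given_no_cs = us_alone / (us_alone + neither)              # 0.15

delta_p = p_us_given_cs - p_us_given_no_cs
print(f"contingency (delta P) = {delta_p:.2f}")  # 0.75: a strong positive contingency

# Unreinforced CSs raise cs_without_us, and unsignalled USs raise us_alone;
# either pushes delta_p toward zero and, as noted above, conditioning is
# then retarded or prevented.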
Several theorists have argued that classical and instrumental conditioning are functionally the same, differing only in the experimental procedures employed. The early behaviourists in North America believed that both types of conditioning involved the formation of stimulus-response (S-R) links (e.g. Hull, 1943). According to this view, conditioning involves a simple mechanism whereby, if an animal performs a particular response in the presence of a stimulus and this is followed by a positive outcome, a connection between the stimulus and the response is formed, such that the animal is now more likely to produce the same behaviour in response to that stimulus. This S-R mechanism was believed to operate whether the response was instrumental or classically conditioned. However, there is a considerable body of empirical evidence indicating that such an explanation of classically conditioned responses cannot be correct.
One line of evidence rests on the logic that, if these responses were learned because of some putative beneficial outcome, then preventing the animal from producing the response should prevent conditioning. This is necessarily true of instrumental conditioning – preventing a rat from pressing a bar will prevent it from ever learning that the response produces a reward. However, the prediction has been shown to be false in a variety of classical conditioning paradigms (Mackintosh, 1974). For example, injecting dogs with atropine can prevent them from salivating and therefore should, according to an S-R analysis, prevent them from learning to salivate to a CS (e.g. a noise) that is paired with delivery of food. Several experiments have shown that dogs do, nevertheless, learn the CS-US association, and will immediately salivate to the CS if they are tested after the effects of atropine have worn off.

Many studies have since shown S-R theories to be incorrect in their description of most instances of instrumental conditioning as well as classical conditioning. One critical source of evidence is that both instrumental and classically conditioned responses are usually sensitive to changes in the value of the reward (Adams and Dickinson, 1981). For example, if a rat has learned to press a bar for food when hungry, it will be much less inclined to perform this response when it is sated. This shows that instrumental responses are actions produced to obtain a particular outcome, rather than reflexive responses elicited by the sight of the bar (as an S-R account would claim).

Other experiments have identified a more fundamental distinction between instrumental and classically conditioned responses. This evidence relies on the impact of what is known as an ‘omission schedule’, in which the US is delivered only if the animal does not produce the particular response in question (Mackintosh, 1974). Instrumental responses can be eliminated by the introduction of an omission schedule: for example, rats that have learned to press a bar to obtain food can readily learn to stop pressing to obtain the same food once the response-reward contingency is reversed.
Classically conditioned responses, on the other hand, are not so flexible. For example, a dog cannot learn to stop salivating to a noise that signals the delivery of food. Similarly, pigeons cannot learn to suppress pecking at a small light that signals the delivery of grain. In each case, if the omission schedule is continued, the classically conditioned response will eventually extinguish because the CS is no longer followed by the US. Such findings indicate that, whereas instrumental responses are flexible and under voluntary control, classically conditioned responses are reflex-like, elicited automatically in anticipation of the US.
References
Adams, C.D. and Dickinson, A. (1981) Instrumental responding following reinforcer devaluation. Quarterly Journal of Experimental Psychology B 33, 109-121.
Hull, C.L. (1943) Principles of Behavior. Appleton-Century-Crofts, New York.
Mackintosh, N.J. (1974) The Psychology of Animal Learning. Academic Press, London.
Partially excerpted from “The Encyclopedia of Applied Animal Behaviour and Welfare”