The cortex is a recent evolutionary innovation of the mammalian brain and a major focus of modern neuroscience. Yet its role in important functions such as sensory perception has been challenged by numerous cortical inactivation experiments that show little or no effect on stimulus detection or discrimination. This is particularly true in the auditory domain, where the causal requirement for auditory cortex in perception varies heavily across stimulus choices and conditions. This raises the question of which computations make auditory cortical sound representations necessary, or not, for solving a particular auditory task. In this talk, I will describe the results of a systematic comparison of neuronal population representations of diverse sounds across several stages of the auditory system, measured with two-photon calcium imaging and electrophysiology. Our results indicate that auditory cortex decorrelates sound representations and specifically generates population representations in which time-averaging has little effect on the discriminability of time-varying sounds. The implications of these properties for the role of cortex in sound discrimination tasks will be discussed based on a bio-inspired reinforcement-learning model. In addition, I will present results on the emergence of prediction signals and on specific neuronal population signatures of wakefulness in the auditory cortex.
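The core claim, that a decorrelated population code can make time-averaging harmless for discriminability, can be illustrated with a toy simulation. This is purely a sketch of the concept, not the talk's actual analysis: all neuron counts, noise levels, and coding schemes below are invented. It contrasts a purely temporal code (two sounds drive the same neurons with different phases) with a spatial code (each sound recruits its own subset of neurons), and measures how well the two sounds can still be told apart after averaging each trial over time.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons, n_time, n_trials = 50, 20, 200
t = np.arange(n_time)

def simulate(mean_map):
    """Noisy trials of shape (n_trials, n_neurons, n_time) around a mean map."""
    noise = 0.5 * rng.standard_normal((n_trials, n_neurons, n_time))
    return mean_map[None] + noise

def dprime_time_averaged(a, b):
    """Discriminability of two sound responses after averaging over time."""
    am, bm = a.mean(axis=2), b.mean(axis=2)          # (n_trials, n_neurons)
    separation = np.linalg.norm(am.mean(0) - bm.mean(0))
    noise = 0.5 * (am.std(0).mean() + bm.std(0).mean())
    return separation / noise

# Temporal code: both sounds drive all neurons with the same mean rate,
# differing only in phase -> time-averaging erases the difference.
phase_a = 1.0 + np.tile(np.sin(2 * np.pi * t / n_time), (n_neurons, 1))
phase_b = 1.0 + np.tile(np.sin(2 * np.pi * t / n_time + np.pi), (n_neurons, 1))

# Spatial ("decorrelated") code: each sound recruits its own neuron subset,
# so sound identity survives time-averaging.
pat_a = np.repeat(rng.random(n_neurons)[:, None], n_time, axis=1)
pat_b = np.repeat(rng.random(n_neurons)[:, None], n_time, axis=1)

d_temporal = dprime_time_averaged(simulate(phase_a), simulate(phase_b))
d_spatial = dprime_time_averaged(simulate(pat_a), simulate(pat_b))
print(d_temporal, d_spatial)  # the spatial code remains far more discriminable
```

In this toy setting the temporal code's discriminability collapses toward the noise floor after time-averaging, while the spatial code's remains high, mirroring (in a much simplified way) the property the abstract attributes to cortical representations.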