Maestronet Forums

Pennstater

Member (3/5), 127 posts

  1. I saw the Batta Strad cello too, along with his Tourte bow. The cello seems to be in good condition but poorly maintained: the lower strings were on the wrong pegs, and the fine tuner is very old. Maybe those are the original strings and fine tuners. This cello is begging to be played. I mean, it's nice to have Picassos, Manets, etc. as neighbors, but there is so much music wanting to come out of it.
  2. What happened to her hand exactly?
  3. Check out Yellowcards. It's a punk rock band that has a violinist.
  4. Thanks for the ID. I think you guys are right. I just didn't expect a stereo recording from Heifetz. By the way, you can get Heifetz's Beethoven/Mendelssohn and Tchaikovsky/Brahms CDs at Buy.com for $16 shipped for both CDs. Great deal!
  5. A bit off-topic: I think vibrato is way overused in operas. Can't there be a compromise between Bach chorales and operas?
  6. I've used this analogy before. Your chance of becoming a world-class virtuoso is 0.00001%. For someone who started early, got into a conservatory, etc., the chance of becoming a world-class virtuoso is 0.001%.
  7. Kabal, the server only allows one download at a time, so keep trying. As for the debate on the good and evil of MP3s: you see, I am trying to BUY a CD of something that I heard on MP3.
  8. http://hackel.hn.org/classical/composersindex.html Look under Beethoven; there are MP3 files of this concerto, but no track info is available. I own the Perlman and Milstein CD recordings already, but I still prefer this MP3 version. Can you pros identify which recording it is so I can buy a CD of it? Thanks.
  9. I think a better question is: if you can play the Paganini caprices, does that mean you can play Music? (Capitalized for emphasis.) Have you ever wondered how a musician can make a single long note sound so "musical"?
  10. Woohoo, I just got my 6-CD set of "Accardo Plays Paganini." It has all the violin concertos and other showpieces. Ever since I lost Accardo's Paganini #4 & 5 CD, I've been trying to find it again at a good price. Now I just got the 6-CD set for ~$35. The playing is superb and so is the recording quality. Right now I am listening to the 2nd movement of Paganini #4; I think it's one of Paganini's best slow "pieces." I'll check back later. Here's the link if you want to buy it: http://www.walmart.com/catalog/product.gsp...duct_id=1031013
  11. I was digging through my parents' CD collection this Xmas break and came upon a cheap "Encore" label reissue (originally released by EMI) of Mozart violin concertos 1, 3, and 5 played by Menuhin, accompanied by the Bath Festival Orchestra. I brought it back to my apartment and popped it into my stereo. Wow, I was amazed by his interpretation! I admit there were a few places where the intonation was a little "late" (or was it on purpose???). However, the... hmm, what's a good word... the "humanity" of the music came through my speakers! The orchestra's accompaniment was also superb, not too rigid nor too "free-style." I then took out Perlman's recording of the Mozart concertos (the complete concertos on DG), and I felt Perlman's interpretation was a lot more "mechanical." Anybody else who has heard Menuhin's Mozart care to comment?
  12. Ooh ooh ooh, here is a great deal: a used Yamaha RX-V595 receiver for $150: http://cgi.audiogon.com/cgi-bin/cl.pl?misc...mp;3&4& If I were building an entry-level system, I would jump on this. MMG $550, RX-V595 $150, and still $200 left for a DVD player and other speakers.
  13. Western music, to be exact. Two articles from this week's Science magazine show that harmonic chords in different keys trigger specific responses in the brain. Disclaimer: I am not posting the entire articles (figures, tables, etc.), so there shouldn't be copyright issues. If you want the full PDF, email me.

NEUROSCIENCE: Mental Models and Musical Minds
Robert J. Zatorre and Carol L. Krumhansl

Music is found in all cultures and has a remarkable diversity of forms. Cognitive scientists have discovered highly specific and detailed knowledge of musical structure even in individuals without extensive musical training. Brain imaging has proved valuable for investigating the neural basis of a variety of cognitive functions, including how the brain processes music. These developments converge on page 2167 of this issue with the research article by Janata et al. (1). They report that abstract patterns of Western tonal musical structure are mirrored in patterns of brain activity in human subjects.

Using functional magnetic resonance imaging, Janata and colleagues analyzed the brain activity of eight musically experienced listeners as they performed two musical perception tasks requiring them to detect a deviance in timbre (a note played by a flute instead of a clarinet) and notes that violated the local tonality. The investigators found that the auditory cortex as well as a number of other brain areas were activated in their subjects as they undertook the musical perception tasks. The most consistent activation was along the superior temporal gyrus of both hemispheres. Additional regions that were activated included the temporal, parietal, frontal, and limbic lobes as well as the thalamus and cerebellum, indicating that the processing of music is extremely complicated.

Recent research in neuroanatomy, neurophysiology, and functional brain imaging has investigated the location and properties of auditory cortical fields (2). There is an emerging consensus that both the cytoarchitecture and neural connections within auditory cortical fields form the basis of a hierarchy of processing regions in the brain. These regions start in core areas of the primary auditory cortex, and emanate in several processing streams that extend into various portions of the superior and middle temporal gyri (see the top figure). Music offers a way to explore the functional organization of these putative processing streams. Auditory cortical regions have extensive, spatially organized projections to and from frontal and parietal lobes. These extended cortical networks are thought to underlie the complex neural events engaged during the processing of complex sounds. Even a seemingly simple musical task such as perceiving and recognizing a melody entails a variety of cognitive processes, including perceptual, attentional, mnemonic, and affective responses.

[Figure: The sensation of music. (A) Auditory cortical areas in the superior temporal gyrus that respond to musical stimuli. Regions that are most strongly activated are shown in red. (B) Metabolic activity in the ventromedial region of the frontal lobe increases as a tonal stimulus becomes more consonant.]

Music seems to depend on specific brain circuitry, because it can be dissociated from processing of other classes of sounds, including speech, in individuals with certain diseases or brain lesions.
This concept is supported by functional imaging studies, which reveal that there are specialized activity patterns for tonal processing, including those found in the temporal and frontal brain areas that are critical for tonal working memory. In addition, as Janata and colleagues confirm, there are hemispheric asymmetries; for example, the right side of the brain is preferentially activated during the processing of musical pitch (3). Initial studies on the affective response to music implicate a variety of neural structures in limbic and paralimbic brain areas, as well as in midbrain and basal forebrain regions linked to reward and motivation (4). One area consistently modulated by affective responses is the ventromedial frontal cortex (see the top figure), the principal region identified by Janata et al. as showing sensitivity to tonality.

A separate line of investigation originates with research into the psychological reality of theoretical descriptions of music. Are listeners and performers influenced by the tonal, harmonic, melodic, rhythmic, and metric patterns identified in music theory? If so, does this knowledge influence how a listener organizes and remembers music, and how a musician plans and executes a performance? Empirical studies have extensively documented the precise contents of this knowledge and show that it largely conforms to descriptions of music theory. Many aspects of music, such as whether notes are played in tune, appear to be acquired implicitly--that is, without explicit formal instruction--and the influence of this knowledge is manifest in a wide range of musical behaviors.

One of the most developed areas in the cognitive science of music is the description of pitch structures in Western tonal-harmonic music. This includes musical scales, harmonics, and relations between musical keys. Music in many styles is organized around one or more stable reference tones (the tonic, in Western tonal music). Other tones differ in their degree of perceived stability, giving rise to a tonal hierarchy. Tones high in the hierarchy are remembered accurately, are heard as giving a sense of finality or closure, and are more expected within the tonal context--an effect that Janata et al. exploit in their tonality perception task. Keys that are considered closely related, in the sense that modulations (that is, shifts from one key to another) are relatively easy to effect, have similar tonal hierarchies and shared harmonies.

[Figure: Mental key maps. (A) Unfolded version of the key map, with opposite edges to be considered matched. There is one circle of fifths for major keys (red) and one for minor keys (blue), each wrapping the torus three times. In this way, every major key is flanked by its relative minor on one side (for example, C major and a minor) and its parallel minor on the other (for example, C major and c minor). (B) Musical keys as points on the surface of a torus.]

Geometric models, or maps, of these key relations that conform to descriptions in music theory have been generated by computational methods. One such key map in the shape of a torus or ring (5), and similar to that used by Janata et al. in their experiments, was obtained by training a self-organizing neural network model with experimentally quantified tonal hierarchies (see the bottom figure).
This key map provides a visual display of an abstract mental model of key relationships. Results of experiments investigating how the sense of key develops and continuously changes can be projected onto the key map (6).

The strength of the Janata et al. work is that it brings together various techniques in an original combination. Their study probes the neural correlates of tonality perception in an effort to identify brain regions that respond to musically modulating sequences. In so doing, this study raises a variety of questions. Is the topography of the key map reflected directly in the cortical activation pattern? Alternatively, do the Janata et al. findings reflect more general processes of remembering and comparing tones? How are the activation patterns (in particular those of the ventromedial frontal area) observed by Janata and co-workers related to affective responses to music? One might expect sensory-related regions of the superior temporal cortex to be implicated in computing the relationships between tones that result in tonality mapping. However, it is still not known how the perceptual responses of the superior temporal cortex interact with the responses distributed across the activated brain regions reported by Janata and colleagues. Implicit learning of key musical structures may take place over a lifetime of listening to music, so it is possible that tonal maps become widely distributed over the brain. Regardless of the answers to these questions, cognitive neuroscience has benefited from the application of sophisticated cognitive models to explore the correlation between music processing and neuroanatomical regions of the brain.

References and Notes
1. P. Janata et al., Science 298, 2167 (2002).
2. J. H. Kaas, T. A. Hackett, M. J. Tramo, Curr. Opin. Neurobiol. 9, 164 (1999).
3. R. J. Zatorre, I. Peretz, Ann. N.Y. Acad. Sci. 930, 193 (2001).
4. A. J. Blood, R. J. Zatorre, Proc. Natl. Acad. Sci. U.S.A. 98, 11818 (2001).
5. P. Toiviainen, C. L. Krumhansl, Perception, in press.
6. See www.cc.jyu.fi/~ptoiviai/bwv805/index.html for a movie that shows how key strengths change over time as a Bach organ duet is played. The perceptual judgments of listeners (top) are compared with a computer model of key-finding (bottom).

R. J. Zatorre is at the Montreal Neurological Institute, McGill University, Montreal, Quebec H3A 2B4, Canada. E-mail: robert.zatorre@mcgill.ca
C. L. Krumhansl is in the Department of Psychology, Cornell University, Ithaca, NY 14853, USA.

--------------------------------------------------

The Cortical Topography of Tonal Structures Underlying Western Music
Petr Janata(1,2,*), Jeffrey L. Birk(1), John D. Van Horn(2,3), Marc Leman(4), Barbara Tillmann(1,2), Jamshed J. Bharucha(1,2)

Western tonal music relies on a formal geometric structure that determines distance relationships within a harmonic or tonal space. In functional magnetic resonance imaging experiments, we identified an area in the rostromedial prefrontal cortex that tracks activation in tonal space. Different voxels in this area exhibited selectivity for different keys. Within the same set of consistently activated voxels, the topography of tonality selectivity rearranged itself across scanning sessions. The tonality structure was thus maintained as a dynamic topography in cortical areas known to be at a nexus of cognitive, affective, and mnemonic processing.
(1) Department of Psychological and Brain Sciences, (2) Center for Cognitive Neuroscience, (3) Dartmouth Brain Imaging Center, Dartmouth College, Hanover, NH 03755, USA. (4) Institute for Psychoacoustics and Electronic Music, Ghent University, Ghent, Belgium.
* To whom correspondence should be addressed. E-mail: petr.janata@dartmouth.edu

The use of tonal music as a stimulus for probing the cognitive machinery of the human brain has an allure that derives, in part, from the geometric properties of the theoretical and cognitive structures involved in specifying the distance relationships among individual pitches, pitch classes (chroma), pitch combinations (chords), and keys (1-3). These distance relationships shape our perceptions of music and allow us, for example, to notice when a pianist strikes a wrong note. One geometric property of Western tonal music is that the distances among major and minor keys can be represented as a tonality surface that projects onto the doughnut shape of a torus (1, 4). A piece of music elicits activity on the tonality surface, and harmonic motion can be conceptualized as displacements of the activation focus on the tonality surface (3). The distances on the surface also help govern expectations that actively arise while one listens to music. Patterns of expectation elicitation and fulfillment may underlie our affective responses to music (5).

Two lines of evidence indicate that the tonality surface is represented in the human brain. First, when one subjectively rates how well each of 12 probe tones, drawn from the chromatic scale (6), fits into a preceding tonal context that is established by a single chord, chord progression, or melody, the rating depends on the relationship of each tone to the instantiated tonal context. Nondiatonic tones that do not occur in the key are rated as fitting poorly, whereas tones that form part of the tonic triad (the defining chord of the key) are judged as fitting best (2).
Probe-tone profiles obtained in this manner for each key can then be correlated with the probe-tone profile of every other key to obtain a matrix of distances among the 24 major and minor keys. The distance relationships among the keys readily map onto the surface of the torus (4). Thus, there is a direct correspondence between music-theoretic and cognitive descriptions of the harmonic organization of tonal music (7). Second, electroencephalographic studies of musical expectancy (8-11) have examined the effect of melodic and harmonic context violations on one or more components of event-related brain responses that index the presence and magnitude of context violations. Overall, the cognitive distance of the probe event from the established harmonic context correlates positively with the amplitudes of such components. These effects appear even in listeners without any musical training (9, 11). The perceptual and cognitive structures that facilitate listening to music may thus be learned implicitly (2, 12-15).

The prefrontal cortex has been implicated in the manipulation and evaluation of tonal information (10, 11, 16-18). However, the regions that track motion on the tonality surface have not been identified directly. When presented with a stimulus that systematically moves across the entire tonality surface, will some populations of neurons respond selectively to one region of the surface and other populations respond selectively to another region of the surface?

Identification of tonality-tracking brain areas. In order to identify cortical sites that were consistently sensitive to activation changes on the tonality surface, eight musically experienced listeners (see "subjects" in supporting online text) underwent three scanning sessions each, separated by 1 week on average, in which they performed two perceptual tasks during separate runs. During each run, they heard a melody that systematically modulated through all 12 major and 12 minor keys (see "stimuli and tasks" in supporting online text) (Fig. 1 and audio S1). A timbre deviance detection task required listeners to respond whenever they heard a note played by a flute instead of the standard clarinet timbre, whereas a tonality violation detection task required listeners to respond whenever they perceived notes that violated the local tonality (Fig. 1D). The use of two tasks that required attentive listening to the same melody but different perceptual analyses facilitated our primary goal of identifying cortical areas that exhibit tonality tracking that is largely independent of the specific task that is being performed (see "scanning procedures" in supporting online text). Using a regression analysis with separate sets of regressors to distinguish task effects from tonality surface tracking, we identified task- and tonality-sensitive areas (see "fMRI analysis procedures" in supporting online text). Tonality regressors were constructed from the output of a neural network model of the moment-to-moment activation changes on the tonality surface (see "tonality surface estimation" in supporting online text).

Fig. 1. Properties of the tonality surface and behavioral response profiles. In the key names, capital letters indicate major keys and lower-case letters indicate minor keys. (A) Unfolded tori showing the average tonality surfaces for each of the 24 keys in the original melody. The top and bottom edges of each rectangle wrap around to each other, as do the left and right edges. The two angular coordinates refer to the position along each of the circles comprising the torus. The color scale is arbitrary, with red and blue indicating strongest and weakest activation, respectively. Starting with C major and shifting from left to right, the activation peak in each panel reflects the melody's progression through all of the keys. (B) The circle of fifths. Major keys are represented by the outside ring of letters. Neighboring keys have all but one of their notes in common. The inner ring depicts the (relative) minor keys that share the same key signature (number of sharps and flats) with the adjacent major key. The color code refers to the three groups of keys into which tonality-tracking voxels were categorized (Fig. 3). (C) Correlations among the average tonality surface topographies for each key. The topographies of keys that are closely related in a music-theoretic sense are also highly positively correlated, whereas those that are distantly related are negatively correlated. Three groups of related keys, indicated in (B), were identified by singular value decomposition of this correlation matrix. (D) Average response profiles (eight listeners, three sessions each) from the tonality deviance detection task illustrate the propensity of specific test tones to pop out and elicit a response in some keys but not in others over the course of the melody. Error bars reflect 1 SEM.

Our tasks consistently activated several regions in the temporal, parietal, frontal, and limbic lobes as well as the thalamus and cerebellum. The most extensive consistent activation was along the superior temporal gyrus (STG) of both hemispheres, though the extent was greater in the right hemisphere, stretching from the planum temporale to the rostral STG and middle temporal gyrus (Fig. 2A and Table 1). Both the task and the tonality regressors correlated significantly and consistently with activity in the rostromedial prefrontal cortex, primarily in the rostral and ventral reaches of the superior frontal gyrus (SFG) (Figs. 2 and 3). The consistent modulation of this area in all of our listeners led us to focus on this region as a possible site of a tonality map.

Fig. 2. Group conjunction maps showing the consistency with which specific structures were activated across listeners. Conjunction maps of individual listeners, containing the voxels that were activated significantly (P < 0.001) in all scanning sessions for that listener, were normalized into a common space and summed together across listeners (see "spatial normalization" in supporting online text). Voxels that were consistently activated by at least four of the eight listeners are projected onto the group's mean normalized T1 image. (A) Areas sensitive to the two task regressors (Table 1). (B) The only areas whose activity patterns were significantly and consistently correlated with the tonality regressors both within and across listeners were the rostral portion of the ventromedial superior frontal gyrus and the right orbitofrontal gyrus.

Table 1. Loci consistently showing a main effect of task in a majority of listeners. Each entry gives the peak location (x, y, z, in mm), the number of listeners at the peak, and the cluster size in voxels; negative x values lie in the left hemisphere and positive x values in the right hemisphere. MTG, middle temporal gyrus; IFG, inferior frontal gyrus; SPG, superior parietal gyrus.

Temporal
- STG (22): (-64, -11, 10), 6 listeners, 74 voxels
- STG/Heschl's gyrus (41/42): (-56, -19, 9), 7, 74; (52, -11, 5), 8, 163
- STG/planum temporale (22): (-68, -41, 15), 5, 14; (64, -30, 15), 6, 163; (64, -26, 5), 6, 163
- Rostromedial STG: (38, 15, -35), 5, 36
- Rostroventral MTG (21): (52, 0, -35), 6, 36
- Middle MTG/superior temporal sulcus (21): (56, -15, -15), 5, 163
- Ventral MTG (21): (60, -11, -25), 6, 163

Frontal
- Rostroventromedial SFG (10/14): (0, 49, 0), 5, 27; (4, 64, 0), 5, 27
- Superior frontal sulcus/frontopolar gyrus (10): (26, 64, 30), 5, 3
- Lateral orbital gyrus (11): (49, 41, -10), 5, 4
- IFG, pars orbitalis (47): (49, 45, 4), 5, 3
- IFG, pars opercularis (44): (56, 19, 5), 6, 3; (60, 22, 20), 4, 11
- Precentral gyrus (6): (49, 4, 55), 5, 10

Parietal
- Postcentral gyrus (1): (64, -11, 25), 6, 163
- Supramarginal gyrus (40): (64, -30, 35), 6, 3
- Precuneus (7): (0, -45, 55), 5, 42; (0, -45, 55), 5, 42; (-4, -56, 75), 6, 42
- SPG (7): (11, -56, 80), 5, 3; (19, -49, 75), 6, 5
- SPG/transverse parietal sulcus (7): (-4, -71, 60), 6, 22

Limbic
- Collateral sulcus: (-30, -8, -30), 5, 10
- Hippocampus/collateral sulcus: (26, -11, -25), 5, 23

Other
- Cerebellum: (-4, -82, -35), 5, 11; (-38, -79, -25), 6, 19; (26, -86, -30), 5, 10; (45, -64, -45), 5, 8
- Mediodorsal thalamic nucleus: (0, -11, 9), 5, 3

Fig. 3. Topography of tonality sensitivity of rostroventral prefrontal cortex in three listeners across three scanning sessions each. Each voxel's color represents the key group with which the voxel's TSS was maximally correlated (Fig. 1B). The minority of voxels that were maximally correlated with the average tonality surface are shown in white. A TSS represents how sensitive the voxel is to each point on the torus. The TSSs of selected voxels are displayed as unfolded tori. Figure 1A serves as a legend for assigning keys to the individual TSSs. The highlighted voxels were chosen to display both the consistency and heterogeneity of the tonality surfaces across sessions. For each listener, the activity of all voxels shown was significantly correlated with the tonality regressors in all sessions. Thus, what changed between sessions was not the tonality-tracking behavior of these brain areas but rather the region of tonal space (keys) to which they were sensitive. This type of relative representation provides a mechanism by which pieces of music can be transposed from key to key, yet maintain their internal pitch relationships and tonal coherence.

Tonality-specific responses in the rostromedial prefrontal cortex.
At the individual level, we reconstructed and categorized the tonality sensitivity surface (TSS) for each voxel that exhibited significant responses (P < 0.001) in every one of the three scanning sessions (see "tonality surface estimation" in supporting online text). The reconstructed surfaces from each session indicated that the medial prefrontal cortex maintains a distributed topographic representation of the overall tonality surface (Fig. 3). Although some voxels exhibited similar TSSs from session to session, the global tonality topography varied across sessions in each of the listeners. The number of voxels falling into each of the tonality categories (Fig. 1B) was evenly distributed within each session (table S1), but the relative pattern of tonality sensitivity changed. For all listeners, we also found tonality-sensitive voxels outside of the medial prefrontal region (table S2). The precise constellations of sensitive areas differed across listeners. We found tonality-sensitive foci in the orbital and frontal gyri, primarily in the right hemisphere; the temporal pole; the anterior and posterior superior temporal sulci; the precuneus and superior parietal gyrus; the posterior lingual gyrus; and the cerebellum (19).

Discussion. Central to our ability to hear music coherently are cognitive structures that maintain perceptual distance relationships among individual pitches and groups of pitches. These structures shape expectations about pitches we will hear, given a preceding musical input. Given the diversity of the music we hear, the situations in which we hear it, and our affective and motoric responses to it, it is likely that tonal contexts are maintained in cortical regions predisposed to mediating interactions between sensory, cognitive, and affective information. The medial prefrontal cortex is a nexus for such functions (20, 21) and is therefore an ideal region for maintaining a tonality map. In the macaque, connections to the medial prefrontal cortex from unimodal sensory cortices are widespread for the auditory modality and sparse for the other sensory modalities (22). In our experiments, we observed significant task-related activity in auditory association areas and the anterior STG, primarily in the right hemisphere. Reciprocal projections between these areas and the ventral medial prefrontal cortex help explain how and why a tonality map might be maintained in the medial prefrontal cortex. This region has already been implicated in assessing the degree of musical consonance or dissonance caused by a harmonic accompaniment to a melody (23). Our results suggest that the rostromedial prefrontal cortex not only responds to the general degree of consonance but actively maintains a distributed topographic representation of the tonality surface. The perception of consonance and dissonance depends on intact auditory cortices (24, 25). However, even with bilateral auditory cortex ablations, the ability to generate expectancies based on tonal contexts remains, suggesting that the cognitive structures maintaining tonal knowledge largely reside outside of temporal lobe auditory structures (24).

Dynamic topographies. In contrast to distributed cortical representations of classes of complex visual objects that appear to be topographically invariant (26), we found that the mapping of specific keys to specific neural populations in the rostromedial prefrontal cortex is relative rather than absolute.
Within a reliably recruited network, the populations of neurons that represent different regions of the tonality surface are dynamically allocated from one occasion to the next. This type of dynamic topography may be explained by the properties of tonality structures. In contrast to categories of common visual objects that differ in their spatial features, musical keys are abstract constructs that share core properties. The internal relationships among the pitches defining a key are the same in each key, thereby facilitating the transposition of musical themes from one key to another. However, the keys themselves are distributed on a torus at unique distances from one another. A dynamic topography may also arise from the interplay of short-term and long-term memory stores of tonal information and may serve a beneficial role in coupling the moment-to-moment perception of tonal space with cognitive, affective, and motoric associations, which themselves may impose constraints on the activity patterns within rostral prefrontal regions (21, 27-29).

References and Notes
1. R. N. Shepard, Psychol. Rev. 89, 305 (1982).
2. C. L. Krumhansl, Cognitive Foundations of Musical Pitch (Oxford Univ. Press, New York, 1990).
3. F. Lerdahl, Tonal Pitch Space (Oxford Univ. Press, New York, 2001).
4. C. L. Krumhansl, E. J. Kessler, Psychol. Rev. 89, 334 (1982).
5. L. B. Meyer, Emotion and Meaning in Music (Univ. of Chicago Press, Chicago, 1956).
6. The chromatic scale consists of 12 equally sized intervals into which an octave is divided. On a piano, a chromatic scale starting at middle C would be played by striking adjacent keys until the note C, either one octave above or below middle C, was reached.
7. The extent to which tonality representations are maintained in long-term or short-term memory stores, or a combination of the two, is a matter of debate. Self-organizing neural network models of implicit learning accurately mimic results from a wide array of experiments that assess tonal knowledge (15), and harmonic priming experiments directly highlight the influence of learned tonal structures (13, 30). However, models of short-term sensory memory account for significant proportions of the variance in probe-tone experiments (31, 32), and probe-tone ratings depend, partially, on the pitch distribution statistics of the contexts that precede probes (33).
8. P. Janata, J. Cognit. Neurosci. 7, 153 (1995).
9. M. Besson, F. Faïta, J. Exp. Psychol. Hum. Percept. Perf. 21, 1278 (1995).
10. A. D. Patel, E. Gibson, J. Ratner, M. Besson, P. J. Holcomb, J. Cognit. Neurosci. 10, 717 (1998).
11. S. Koelsch, T. Gunter, A. D. Friederici, E. Schröger, J. Cognit. Neurosci. 12, 520 (2000).
12. J. J. Bharucha, K. Stoeckig, J. Exp. Psychol. Hum. Percept. Perf. 12, 403 (1986).
13. H. G. Tekman, J. J. Bharucha, J. Exp. Psychol. Hum. Percept. Perf. 24, 252 (1998).
14. R. Francès, La Perception de la Musique (Vrin, Paris, 1958).
15. B. Tillmann, J. J. Bharucha, E. Bigand, Psychol. Rev. 107, 885 (2000).
16. R. J. Zatorre, A. C. Evans, E. Meyer, A. Gjedde, Science 256, 846 (1992).
17. R. J. Zatorre, A. C. Evans, E. Meyer, J. Neurosci. 14, 1908 (1994).
18. B. Maess, S. Koelsch, T. C. Gunter, A. D. Friederici, Nature Neurosci. 4, 540 (2001).
19. The existence of a tonal map that is distributed within and across cortical areas rather than focused within a small cortical area may seem paradoxical, yet this representational form is predicted by some models of functional brain organization (27).
20. H. Barbas, Brain Res. Bull. 52, 319 (2000).
21. D. Tranel, A. Bechara, A. R. Damasio, in The New Cognitive Neurosciences, M. S. Gazzaniga, Ed. (MIT Press, Cambridge, MA, 2000), pp. 1047-1061.
22. H. Barbas, H. Ghashghaei, S. M. Dombrowski, N. L. Rempel-Clower, J. Comp. Neurol. 410, 343 (1999).
23. A. J. Blood, R. J. Zatorre, P. Bermudez, A. C. Evans, Nature Neurosci. 2, 382 (1999).
24. M. J. Tramo, J. J. Bharucha, F. E. Musiek, J. Cognit. Neurosci. 2, 195 (1990).
25. I. Peretz, A. J. Blood, V. Penhune, R. Zatorre, Brain 124, 928 (2001).
26. J. V. Haxby et al., Science 293, 2425 (2001).
27. A. R. Damasio, Cognition 33, 25 (1989).
28. J. J. Eggermont, Neurosci. Biobehav. Rev. 22, 355 (1998).
29. S. Funahashi, Neurosci. Res. 39, 147 (2001).
30. E. Bigand, B. Poulain, B. Tillmann, D. D'Adamo, J. Exp. Psychol. Hum. Percept. Perf., in press.
31. D. Huron, R. Parncutt, Psychomusicology 12, 154 (1993).
32. M. Leman, Music Percept. 17, 481 (2000).
33. N. Oram, L. L. Cuddy, Psychol. Res. 57, 103 (1995).
34. We thank T. Laroche for assistance with data collection. Supported by NIH grant P50 NS17778-18. The data and stimuli from the experiment are available on request from the fMRI Data Center at Dartmouth College (www.fmridc.org) under accession number 2-2002-1139B.

17 July 2002; accepted 27 September 2002
10.1126/science.1076262 Include this information when citing this paper.
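To make the probe-tone step described above concrete, here is a minimal Python sketch of the inter-key correlation the article mentions: a major and a minor probe-tone profile are rotated to all 12 tonics and correlated pairwise to give a 24 x 24 key-distance matrix. The profile values are approximate Krumhansl-Kessler ratings quoted from memory and are included only for illustration; this is not the authors' code.

import numpy as np

# Probe-tone ratings for the 12 chromatic tones relative to the tonic
# (approximate Krumhansl-Kessler values, listed here only for illustration).
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09, 2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53, 2.54, 4.75, 3.98, 2.69, 3.34, 3.17])
NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

# Rotate each profile to all 12 tonics: row k is the profile of the key whose
# tonic is pitch class k, expressed over absolute pitch classes 0-11.
majors = np.stack([np.roll(MAJOR, k) for k in range(12)])
minors = np.stack([np.roll(MINOR, k) for k in range(12)])
profiles = np.vstack([majors, minors])        # 24 keys x 12 pitch classes
keys = NOTES + [n.lower() for n in NOTES]     # upper case = major, lower case = minor

# 24 x 24 inter-key correlation matrix; closely related keys correlate highly.
corr = np.corrcoef(profiles)

# C major vs. a few other keys: the dominant (G major) and relative minor (a minor)
# sit close, while a distant key such as F# major correlates weakly or negatively.
for other in ["G", "a", "F#"]:
    print(f"r(C, {other}) = {corr[0, keys.index(other)]:+.2f}")

Multidimensional scaling of this correlation matrix is what yields the torus-shaped key map the articles refer to.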
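The circle-of-fifths facts quoted in the figure captions (neighboring keys share all but one note; a major key and its relative minor share a key signature) can also be checked in a few lines. A small sketch, assuming nothing beyond pitch-class arithmetic:

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
MAJOR_STEPS = (0, 2, 4, 5, 7, 9, 11)   # major-scale degrees in semitones
MINOR_STEPS = (0, 2, 3, 5, 7, 8, 10)   # natural-minor degrees in semitones

def scale(tonic, steps):
    """Pitch-class set of a scale built on the given tonic (0-11)."""
    return {(tonic + s) % 12 for s in steps}

# Walk the 12 major keys in perfect fifths (7 semitones): C, G, D, A, E, B, ...
circle = [(7 * i) % 12 for i in range(12)]
print(" -> ".join(NOTES[t] for t in circle))

# Neighboring keys on the circle share 6 of their 7 scale tones.
for a, b in zip(circle, circle[1:] + circle[:1]):
    assert len(scale(a, MAJOR_STEPS) & scale(b, MAJOR_STEPS)) == 6

# Each major key and its relative minor (a minor third below) share the same
# pitch-class set, i.e. the same key signature.
for t in circle:
    assert scale((t - 3) % 12, MINOR_STEPS) == scale(t, MAJOR_STEPS)

print("checked: fifth-related neighbors differ by one note; relative keys share all notes")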
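Finally, a toy version of the regression logic in the Janata et al. analysis: each voxel's time series is fit with a design matrix containing a task regressor plus tonality regressors, and a voxel counts as tonality-sensitive when the tonality regressors explain variance beyond the task regressor alone. Everything below is synthetic and only sketches the idea; it is not the authors' pipeline, and the real study derived its tonality regressors from a neural network model of the tonality surface.

import numpy as np

rng = np.random.default_rng(0)
n_scans = 240

task = (np.arange(n_scans) % 40 < 20).astype(float)   # toy on/off task blocks
tonality = rng.standard_normal((n_scans, 3))          # stand-ins for tonality-surface regressors

def rss(y, X):
    """Residual sum of squares from an ordinary least-squares fit."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

# Two synthetic voxels: one that tracks the tonality regressors, one that only
# follows the task blocks.
voxels = {
    "tonality-tracking": 0.5 * task + tonality @ np.array([1.0, -0.8, 0.6]) + rng.standard_normal(n_scans),
    "task-only": 1.5 * task + rng.standard_normal(n_scans),
}

for name, y in voxels.items():
    reduced = rss(y, task.reshape(-1, 1))              # task regressor only
    full = rss(y, np.column_stack([task, tonality]))   # task + tonality regressors
    # F-like ratio: extra variance explained by the 3 tonality regressors.
    f = ((reduced - full) / 3) / (full / (n_scans - 5))
    print(f"{name:>18}: F ~ {f:.1f}")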
  14. Hoosiergirl, if I had 1/10th of the budget that your husband has for audio equipment, I would probably quit school and stay home to listen to music all day. I guess that's a bad thing. Anyways, I would love to know the specific models of his equipment. Back to the original question: for under $900, I would spend $550 on a pair of Magnepan MMGs and spend the rest on a decent Dolby Digital/DTS receiver. Brands like Yamaha, Onkyo, Rotel, Harman Kardon, Denon, and Marantz (brands that Hoosiergirl's husband would laugh at) make good receivers. Sure, it's a stereo-only system for now, but with an AV receiver you can add speakers later. For classical music, I don't think you can find a better set of speakers in terms of precision and soundstage at this price. I hope this helps. Here's a link to top-rated AV receivers from audioreview.com: http://www.audioreview.com/htp_2718crx.aspx You can go a cheaper route by buying used equipment; www.audiogon.com and eBay are good places for bargains. Just make sure you always check reviews on audioreview first. -Pennstater a.k.a. "ghetto audiophile"
  15. Dang, hoosiergirl, your husband must have over $100,000 worth of equipment there; enough to buy a house. Electrostatic speakers powered by tube amps. Is he a lawyer or something?