9a. Pinker, S. Language Acquisition
A close look at one of the most controversial issues at the heart of cognitive science: Chomsky's view that Universal Grammar has to be inborn because it cannot be learned from the data available to the language-learning child.
Reading: Pinker, S. Language Acquisition. in L. R. Gleitman, M. Liberman, and D. N. Osherson (Eds.), An Invitation to Cognitive Science, 2nd Ed. Volume 1: Language. Cambridge, MA: MIT Press.
Instructions for commenting: Quote the passage on which you are commenting (use italics, indent). Comments can also be on the comments of others. Make sure you first draft your comment in a separate text editor, because if you write it directly in the Blogger window you may lose it and have to write it all over again.
“Any theory that posits too little innate structure, so that its hypothetical child ends up speaking something less than a real language, must be false.”
When Pinker refers to “too little innate structure,” he is critiquing theories that suggest children acquire language solely through experience or imitation. He argues that if this were true, children would produce something less sophisticated than a real language, one lacking the complex, rule-governed patterns characteristic of all human languages. I agree with Pinker’s position. Language is an extraordinarily complex system, and the linguistic input children receive is relatively limited, often consisting of simplified sounds and phrases such as “goo goo ga ga.” If language acquisition depended entirely on environmental exposure, it would be difficult to explain how humans develop such rich and structured forms of communication from such minimal and fragmented input, both directly addressed to them and overheard in their surroundings. Therefore, it seems most plausible that innate mechanisms and environmental factors interact to support language development.
Rachel, to add onto this, one striking aspect of Pinker’s argument is his emphasis on the “poverty of the stimulus”: the idea that children acquire complex grammatical knowledge despite limited and imperfect linguistic input. Pinker uses evidence from children’s rapid language acquisition, the scarcity of negative feedback in parent-child communication, and “overregularization” errors (like “goed” for “went”) to argue for significant innate constraints in the mind. I agree with Pinker that purely behaviorist or statistical learning models cannot fully account for the generativity and creativity seen in language development: for example, children produce sentences they’ve never heard, and sensibly avoid ungrammatical constructions without explicit correction. However, I think Pinker sometimes underestimates the power of rich, structured input and the role of social interaction. It may be that children are sensitive to statistical distributions and use social cues to infer meaning. So while innate grammatical frameworks are likely necessary, language learning may be more dynamic: an intersection between innate mental structures and powerful learning abilities attuned to environmental patterns.
To add onto your comment, Alexa: I think Pinker would actually agree with your conception of language acquisition as a profoundly dynamic system. He notably mentions that children will, in a way, entertain sets of linguistic rules, which they believe to be true, that are much larger than the sets accepted by their language, and will learn to restrict these rules through interaction with others and feedback (a dynamic process: exposure to language -> creation of a large set of possibilities -> feedback -> restriction of the set -> etc.). Furthermore, Pinker highlights that children seem to recognize and encode the set of possible sentence structures quite early on, which lets them learn the categories and uses of words.
I think all of you have picked out important aspects of Pinker's explanation of why language cannot develop solely from language experience. What struck me from the reading is the way Pinker develops his explanation from different angles, for example the poverty-of-the-stimulus argument: children receive input that is broken or incomplete, and get no explicit instruction and only rare correction from their caregivers, yet they are still able to develop grammatical structures and form grammatical sentences.
I also liked how Sofia mentioned the part about how children initially overgeneralize rules, e.g., saying “goed” instead of “went” or “runned” instead of “ran.” Pinker interprets this behavior not as imitation: rather, children are testing their own hypotheses about the rules of language, which are then refined by experience.
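The hypothesis-testing idea above can be sketched in a few lines of code. This is a toy illustration of my own, not Pinker's model: a learner who has induced the productive “-ed” rule but has not yet memorized the irregular exceptions will produce “goed”, and the memorized exception later blocks the rule.

```python
# Toy sketch of overregularization (hypothetical illustration, not Pinker's model).
# The child has induced the productive "-ed" rule; irregular past-tense forms
# must be memorized one by one as exceptions that block the rule.

IRREGULARS = {"go": "went", "run": "ran"}  # exceptions learned from experience

def past_tense(verb, known_exceptions):
    """Use a memorized exception if one is known; otherwise apply the rule."""
    if verb in known_exceptions:
        return known_exceptions[verb]   # "went" blocks "goed"
    return verb + "ed"                  # the productive regular rule

# Early stage: rule acquired, exception not yet memorized -> "goed"
print(past_tense("go", {}))
# Later stage: exception memorized -> "went"
print(past_tense("go", IRREGULARS))
```

On this sketch, the error is not failed imitation (no adult says "goed"); it is the signature of a productive rule being applied until experience supplies the exception.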
I agree with all of you: there definitely is some innateness involved, but the environment also plays a role. Kids are not blank slates, but neither are they robots following a script; they're just little theorists trying to work out language.
***EVERYBODY PLEASE NOTE: I REDUCED THE MINIMUM NUMBER OF SKYWRITINGS. BUT THE READINGS ARE **ALL** RELEVANT TO AN OVERALL UNDERSTANDING OF THE COURSE. SO, EVEN IF YOU DO NOT DO A SKYWRITING ON ALL OF THEM, AT LEAST FEED EACH READING YOU DO NOT READ TO CHATGPT AND ASK IT FOR A SUMMARY, SO YOU KNOW WHAT THE READING SAID — OTHERWISE YOU WILL NOT HAVE A COMPLETE GRASP OF THE COURSE TO INTEGRATE AND INTERCONNECT FOR THE FINAL EXAM.***
Rachael, but do you need more than simple subject-predicate structure to get natural language and many of its benefits launched on its Baldwinian path? What was all the more complex structure for, and how did that evolve, and for what? (Pinker is alluding here to the Chomskian innate features of language: Universal Grammar [UG] and its structure-dependence, but not addressing the harder questions about evolution, such as the Poverty of the Stimulus [POS]. Where did those further complexities come from, and how?)
Alexa, what is POS? It is not the scarcity of corrective feedback. There's plenty of corrective feedback for errors in "Ordinary Grammar" (OG), but literally none for errors in UG. The reason is that children and adults make plenty of OG errors, but none at all in UG. So there's nothing to correct: they know it already. (And goed/went is an OG error, not UG.) And since this paper of Pinker's, LLMs arrived and complicated the picture further, because they make no errors in UG and very few in OG! Why? The controversy is far from over. (The Lilliputians seem to have concluded that Chomsky was wrong, and that the LLMs show UG was learnable after all; but Chomsky is too old and ill to continue to reply, so we will have to wait to see whether another Brobdingnagian is born to take over...)
And generating never-heard, never-spoken sentences is not at all what this is about. (Why not?)
Sofia, kid-sib does not understand what Pinker is saying here, and why, and what it has to do with POS: Do you?
Lorena, I think Pinker seems to make sense only because he conflates OG and UG and misunderstands POS. Do you understand how?
To add to this thread (which actually helped me understand Pinker's points better!), Pinker views language acquisition as a dynamic system that balances innate rules with environmental input, drawing on points like Alexa's emphasis on the Poverty of the Stimulus (POS) and Sofia's view of rule restriction. If I understood correctly, this initial innate framework, which provides the 'large set of possibilities,' is precisely Pinker's mechanism for overcoming the POS: the limited and messy environmental input merely needs to filter and prune this innate set rather than build the complex structure from scratch. Pinker addresses the need for complex structure, such as structure dependence, by suggesting it evolved as a byproduct of general abstract and recursive thought, providing the innate capacity necessary to overcome the POS. So I believe generating a new sentence isn't the key; the true proof of Universal Grammar is the avoidance of ungrammatical structures. This is essential for distinguishing Pinker's broad model from a strict Chomskyan view, especially since Pinker uses evidence like Lorena's example of the overgeneralization "goed", which may conflate fixable Ordinary Grammar errors with the inviolable rules of Universal Grammar.
In the section about context, Pinker says that children of deaf parents, with only a radio as English input, did not learn any speech from it. They would just hear sentences whose content-words were not grounded in real-life referents, since they had never been exposed to the language previously. This goes hand in hand with what we spoke about in class. Before being able to understand propositions, the child needs a certain knowledge of the referents matching the words. Children cannot infer what their parents say without specific context or syntactic knowledge. They need to ground spoken words directly, and then they will be able to give meaning to syntax and to define new categories. Later on, they will learn the different syntactic categories and how sentences are structured, using context and semantics as cues to work out a language's rules.
I like how you tied that example to grounding, because it shows how acquiring language depends on connecting words to real experiences. Pinker expands on this by arguing that our ability to form these connections to the real world is the result of evolution. He says language is a biological adaptation shaped by natural selection, because it allows us to communicate grounded meanings efficiently; it is not just a byproduct of intelligence. The fact that every child can learn any language equally well supports this idea. Language isn't passed down culturally like traditions, but biologically, as part of our shared human capacity. What's also interesting is that this universality explains why children can go beyond imitation to generate entirely new sentences, something only a species with an evolved functional capacity for language can do.
To add on to Sophie's point: I think this is further important because it suggests that feedback and mimesis are necessary conditions for symbol grounding: one can only learn to associate symbols with things in the world if another indicates this relation to them and corrects them when they associate them wrongly.
Your comment, Sophie, made me think about educational TV shows and picture books, or even illustrated dictionaries. I am now wondering to what degree a word must be grounded for it to truly have meaning in our minds. For example, if someone has never heard of fruits in their lifetime and opens up the page with the definition and illustration of an apple, would that be enough to ground the word apple in their mind? Could they truly understand what an apple is, even knowing the definition and being able to describe what it looks like? It sounds a lot like LLMs, which can tell you a lot about apples but do not really understand them.
The nature vs. nurture debate of language acquisition is a topic I have frequently come across in various other courses. Pinker provides a compelling case that language arises from an innate biological capacity, a specialized cognitive adaptation shaped by evolution. He raises some common points in the field, such as the poverty-of-the-stimulus argument, which discusses how children learn more than they could have inferred from experience, the notion of universal grammar, and some developmental stages of language acquisition. Human language's ability to serve as a tool for communicating, a system to transmit complex information and coordinate social behaviour, suggests it was favoured by selection due to its adaptive value. Connecting to neural networks, it thus makes sense that such models can simulate certain surface-level "learning" patterns but fail to capture the rule-based generative structure of human grammar. I also found value in his discussion of the decline in language-acquisition ability with age. This fits with his larger argument, since in childhood brains are highly plastic, and most of this plasticity is lost in adulthood. Since he argues there is an innate ability for language and a critical (sensitive) period of development, similar to many other biological systems, it reinforces that language-learning ability is not entirely based on environment or general effort.
I find the poverty of the stimulus argument particularly impactful. Two points brought up by Pinker provide evidence that we don't learn language solely from the inputs we receive. The first is creoles: children exposed to an incomplete language (a pidgin) will elaborate it into a full language. The second is that children do not need negative feedback to learn, shown through cross-cultural examples where parents do not correct their children's mistakes.
Pinker claims humans are biologically predisposed to learn language in a particular way, with a blueprint of sorts for how language can be learned: there are innate biological principles that children rely upon to learn language. This connects back to AI research in that LLMs learn from correlation and prediction, not from grounding; LLMs lack the principles that human children rely upon to learn language. Doesn't T3 solve this problem? If we design the system to model human anatomy, ensure it receives sensory input, and let it interact with and experience feedback from the physical world, couldn't it develop language in the same way as a human child? Pinker, I'm assuming, would argue no: adding these architectural components wouldn't solve the fundamental difference between AI language learning and human learning. Human language learning is instinctual; it can't be programmed in, since development is key... or can it? Is there really something deeper, this "language instinct," that can't be artificially replicated without our biological evolutionary history? How can we determine if there is something irreducibly biological about human language learning that T3 wouldn't be able to reproduce?
While I am, like you, skeptical that Pinker would believe that T3 robots could acquire the same language capacities as humans, the only way to test whether language learning is irreducibly biological is to build a T3 system and see. More specifically, this system would ideally come pre-installed with universal grammar-like constraints that are innate in humans, enabling it to recognize the allowable operations that all languages are confined to. Moreover, it should likewise be capable of learning language through sensorimotor grounding and trial-and-error (the key feature of a T3 system), follow a similar developmental timeline to that of a child, and be raised in a typical human environment. If these conditions are met, I believe that a T3 system could achieve child-like milestones, such as babbling, first words, and the production and comprehension of meaningful propositions across different contexts.
“Possessing a language is the quintessentially human trait: all normal humans speak, no nonhuman animal does.” I really liked this quote because it sums up how deeply language defines us as a species. Pinker makes it sound like language isn’t just something we do, it’s something that makes us human. I was struck by how he contrasts humans with animals, showing that even though other species communicate, only humans can combine words and rules to create endless new meanings. It made me think about how language connects to our ability to share ideas, plan ahead, and build culture. If language is such a uniquely human ability, what do you think pushed it to evolve: was it mainly about survival and cooperation, or more about expressing thought and identity?
Shireen, I agree with you that language is one of the strongest traits that defines us. It really does set us apart from other species. I also like how you linked it to our ability to share ideas and build culture. But I’m not sure I’d say it’s what makes us human, since some people are born without full access to language because of impairments or neurological conditions (yet they’re still completely human). Maybe it’s not language itself, but the drive to communicate and connect that defines us as a species. Even without words, we find ways to express ourselves and build relationships (through gestures or art and music, and even simply through being together in silence). That shared desire to connect feels just as fundamentally human as language itself.
I think both of you bring up really important points!
Shireen, I agree that Pinker makes a powerful case for language as something that shaped who we are as a species: it's not just communication, it's how we think, plan, and create meaning together.
Jad, I like how you expand this idea by separating language from the deeper need to communicate and connect. The distinction really matters.
Pinker argues that this drive likely offered a major evolutionary advantage. Language made cooperation, teaching, and social bonding far more efficient than learning through trial and error alone. But he also links language to cognition itself, suggesting that once it evolved, it began shaping the very structure of thought and culture. So maybe the answer lies in both: language first emerged out of the need to survive and collaborate, but once established, it became a way to express identity, creativity, and abstract thought, turning communication into a reflection of what it means to be human.
I really like how this thread brings together both sides of Pinker's argument: the evolutionary function of language and the broader human desire to create meaning. Your point about the two stages of language fits well with Pinker's view that language likely emerged because it made cooperation, teaching, and survival more efficient. Once the system existed, though, it did not stay limited to practical coordination. It reshaped how humans reason, imagine, and express identity.
The deep social drive that Jad mentions may have provided the pressure that allowed a linguistic system to evolve in the first place. Language amplified that drive by giving us a much richer way to think and connect.
It might be more accurate to say that the combination of our cognitive capacities and our need for connection created a space where language could flourish.
Pinker’s claim that “all normal humans speak” highlights language as a universal biological capacity, yet his discussion of disorders like Specific Language Impairment and Williams Syndrome complicates that idea. These cases show that language can either break down or remain intact independently of general intelligence, suggesting it depends on a specialized, evolved mental system. This raises deeper questions about what counts as “normal” in human language: if language is an evolved mental organ, then such variations may not challenge its universality but instead reveal the structure, resilience, and fragility of the biological design that makes speech possible.
This is more of a question-esque skywriting. In class we’ve talked about how UG is unlearnable because we do not get enough negative examples of it to be able to learn it categorically. In this paper, Pinker notes that Russian children won’t make nominative/accusative errors, but they will make feminine/masculine errors. Yet (I preface this by saying that I don’t speak Russian, but I do speak French) I don’t think adults make feminine/masculine errors that often, and therefore children are not getting negative examples of the feminine/masculine distinction. However, don’t the feminine/masculine rules fall outside the category of UG? They seem to be OG. So my question is: how do we categorize what falls into UG and what falls into OG in this circumstance?
Hey! I understand the confusion and I think I may be able to help, but please correct me if I’m wrong. Firstly, when it comes to Universal Grammar, it’s not that children aren’t capable of learning it; it’s that learning Universal Grammar (UG) in the same way we learn Ordinary Grammar (OG) is not possible, which demonstrates its innateness (though that might have been what you meant and I misinterpreted your sentence). Another factor reinforcing the idea that Universal Grammar is innate is the fact that it is (as the name suggests) universal: every child growing up, though they may not yet have learned the limits of the grammar proper to their target language, will not tend to make mistakes against Universal Grammar, regardless of their target language. Secondly, in this specific excerpt, Pinker showcases this distinction with the example of Russian children making mistakes in the feminine/masculine rules of OG but not in the nominative/accusative rules, presumably of UG. Feminine/masculine rules are specific to OG because, for one, they don’t apply to every language (e.g., English), which means they aren’t universal, and, for two, feminine/masculine rules do not determine the architecture of the sentence, which relies on innate constraints (unlike nominative/accusative rules). Additionally, though most adults may have a good grasp of the OG rules in French, as a fellow French speaker I can confidently say that other children provide more than enough negative evidence (i.e., I’ve looked up “un ou une trampoline” on Google more times than I can count). I think that’s where the difference lies!
The part of the paper regarding learnability theory is interesting. Pinker mentions many times that language is ‘extraordinarily complex’, and the actual ‘goal’ of language learning (what can be learned?) thus also becomes difficult to define in its own right. Pinker tries to answer this through learnability theory, which explains that if finite assumptions are fixed for three of the four parts of the theory, then the fourth is logically constrained (specifically referring here to how learning occurs and what hypotheses are being made). The issue, though, is that since language is so complex, the logic breaks down in the infinite outputs that language can create, and thus Pinker concludes that language cannot be explained or learned entirely through logical inferences performed on environmental inputs. I found this very interesting within the grander argument of this whole paper, that an innateness of language learning must exist for children to learn languages; it just so happens that this little section I’ve pulled makes for a very convincing argument.
Lucy, I like how you connect learnability theory to the nature versus nurture debate! Pinker frames language learning as an induction problem, where children infer grammar rules from limited input and minimal negative evidence (i.e., information about what is ungrammatical is rarely explicitly provided). As you mentioned, learnability theory breaks down into four components: the class of languages, the environment, the learning strategy, and the success criterion—showing that assumptions about the first three logically constrain the fourth. Because language allows for an infinite set of words and sentences, purely logical inference is insufficient, highlighting that language acquisition cannot be entirely explained by environmental input. Therefore, children must also possess innate constraints or cognitive structures that guide them toward the correct grammar despite limited feedback and minimal corrections.
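To make the four components concrete, here is a toy sketch of my own (the tiny "languages" and names are invented; this is not Pinker's formalism). The class of languages is a small, innately constrained hypothesis set; the environment is a stream of positive examples; the strategy is "guess the smallest consistent hypothesis" (a subset-principle-style constraint); and the success criterion is convergence on the target. With this restricted class, positive evidence alone suffices; with an unrestricted class it would not.

```python
# Toy sketch of the four components of learnability theory (hypothetical
# illustration, not Pinker's formalism). Each "language" is a finite set of
# strings standing in for a grammar; the class below is the learner's
# innately constrained hypothesis space.
HYPOTHESIS_CLASS = [
    {"a"},                       # most restrictive hypothesis
    {"a", "ab"},
    {"a", "ab", "abb"},          # the target language
    {"a", "ab", "abb", "abbb"},  # an overgenerating superset
]

def learn(examples):
    """Subset-principle-style strategy: after each positive example, guess
    the smallest hypothesis consistent with everything seen so far
    (no negative evidence is ever needed)."""
    seen = set()
    guess = None
    for ex in examples:
        seen.add(ex)
        guess = min(
            (h for h in HYPOTHESIS_CLASS if seen <= h),  # consistent hypotheses
            key=len,                                     # prefer the smallest
        )
    return guess

# A stream of positive-only input drawn from the target language converges:
print(learn(["a", "ab", "abb"]))
```

The point of the sketch is the one the comment makes: fixing the class, environment, and strategy logically constrains what can be learned; remove the constraint on the class and no amount of positive input guarantees convergence.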
Children acquiring language is an example of unsupervised learning (as seen in class) - the presence of in-person language entails the acquisition of that language. However, it’s helped along by supervised learning (explaining what a word means, showing pictures of actions to explain transitive verbs). It also relies greatly on context, since listening to the radio/television does not entail language acquisition. Additionally, learning a second language past the critical period in childhood is definitely supervised learning.
When children are acquiring their grammar, they can apply rules in inappropriate contexts, resulting in grammatical errors. However, certain grammatical rules are never violated, even across languages. This could be evidence for Universal Grammar.
I understand that UG is said to be unlearnable because the linguistic input lacks enough information for children to infer its abstract rules, which supports the idea of innate structure. However, I’m still trying to grasp why this isn’t just a form of unsupervised learning. After all, children seem to extract patterns from unlabelled data without explicit instruction. From what I gather, the key difference is that unsupervised learning assumes the data contains the structure to be found, whereas the Poverty of the Stimulus suggests that some principles of UG cannot be derived from input alone. If that’s true, then language learning isn’t simply unsupervised, it’s constrained by built-in cognitive architecture that shapes what can be learned in the first place.
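That distinction can be illustrated with a toy underdetermination example (my own sketch, not from the reading): two candidate grammars are both perfectly consistent with every sentence the learner has heard, so pattern extraction over the input alone cannot choose between them; only a built-in bias, such as preferring the most restrictive hypothesis, breaks the tie.

```python
# Toy sketch (hypothetical illustration): positive-only input underdetermines
# the grammar. Both hypotheses below cover all observed sentences, so an
# unconstrained learner has no data-driven reason to prefer either one.

observed = ["a b", "a a b b"]  # everything the learner has heard (all grammatical)

narrow = {"a b", "a a b b"}                    # conservative grammar
broad = {"a b", "a a b b", "b a", "b b a a"}   # overgenerating grammar

def consistent(hypothesis, data):
    """A hypothesis is consistent if it generates every observed sentence."""
    return all(s in hypothesis for s in data)

# Both fit the data equally well, so the data alone cannot decide:
tie = consistent(narrow, observed) and consistent(broad, observed)

# A built-in bias for the most restrictive consistent grammar breaks the tie,
# so the learner never overgenerates without positive evidence for it.
choice = min((h for h in (narrow, broad) if consistent(h, observed)), key=len)
print(tie, choice == narrow)
```

In this sense the comment's conclusion holds in miniature: the structure to be found is not fully in the data, so the learning cannot be purely unsupervised; it is shaped by what the learner is built to prefer.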
The author argues that children's ability to learn language so quickly and uniformly across different environments points to an innate and biological origin, rather than to the environment alone. As we've seen in class, the evolution of language in humans further supports this. This also reminds me of how LLMs "learn" language from the Big Gulp through patterns, paralleling the idea that exposure to language doesn't equal comprehension, both in LLMs (as we know from Searle's Chinese Room Argument, LLMs don't understand language) and in humans.
I like the connection you make here, Jesse. It helps me see how negative evidence in grammar learning potentially relates to learning in computational models. I think you make a good point that the Big Gulp is sort of a counterpoint to the Poverty of the Stimulus, given the sheer volume of exposure to language found in the Big Gulp compared to what is available to infants and children.
Pinker points out that children don't get formal grammar lessons or clear feedback, but somehow, in their early years, they can produce complex sentences that follow rules they've never been explicitly taught; they've just learned from listening.
He argues that this must mean something about the human mind: that we're designed for using language and for language acquisition. Even though we don't already speak a language when we are born, our brains come pre-wired with the ability to extract rules and patterns from whatever language we are exposed to, whether it is English or Chinese. It is interesting how he frames language less as something to do with memorization and more as a discovery that we will all make at some point. Pinker also talks about the balance between nature and nurture: while we rely on an internal structure and our brains come pre-wired, we still need to grow up in an environment which activates it. One thing I was wondering, though, is how the "pre-wiring" applies to adults; we know that it is easier for us to learn languages early on, but when does this window close?
We never have a decline in language acquisition so complete that we cannot learn at all, but Pinker does explain that it is very rare to reach a native speaker's level of language after puberty. After around six years of age the decline already starts, but there is still a window of opportunity. I've seen this at my workplace (working at a camp with many immigrant children, as a helper for children with disabilities). The eight-year-old children who had just arrived from Ukraine could not verbalize a word of English or French at the start of camp, but were able to form full, complex sentences in a matter of two months. The older children (12 years old) who had freshly arrived from other countries had made less progress in their English and French acquisition in that time frame. This reflects Pinker's discussion of negative evidence too. The younger children were not frequently corrected on their language errors, but mere exposure to English and French was enough for them to get closer to accurate words and sentences. It shows the innateness of language acquisition in children, especially younger ones. The older children were just as frequently exposed and had the same minimal correction, but could not attain the same level, not due to lack of intelligence but to a lack of the mechanisms that are innate at birth and in young childhood.
9.a. Pinker argues that children’s language learning reveals innate capacity for grammar (language isn’t simply taught). But his account sometimes downplays how deeply social, interactive and context-dependent early learning really is. He presents innateness as the main engine, yet the messy realities of exposure, variation, and cultural difference don’t get equal weight. Still, the piece pushes you to value biological underpinnings in acquisition even while reminding you that “just-input” isn’t enough.
Language acquisition starts with being able to tell things apart and group similar things together, like knowing which food is safe to eat and which is not. This is categorization, which came long before language existed but became the foundation for grounding words and giving them meaning. When children start learning a language, they use these categories to connect sounds and symbols to the things they represent. They also categorize words and phrases themselves into categories like nouns and verbs, so they can create structured propositions, allowing infinite combinations of these phrases. This process relies on built-in universal constraints like universal grammar, and also on experience. So as children hear input, they set parameters for the language they are experiencing. Thus, language learning uses categorization, grounding, and innate structure to map meaning onto words.
Pinker says that children are born with a special skill for learning language. He thinks it does not come only from what they hear, because if that were true, they would not learn grammar so fast. I agree, because children do not get much correction but still come to speak correctly after some time. I also found it interesting how he says that it is easier to learn a language when we are young; when people get older, it becomes harder. I think this shows that language is something natural for humans, but it still needs people and practice to grow.
“For example, here are snapshots of the development of one of Brown's longitudinal subjects, Adam, in the year following his first word combinations at the age of 2 years and 3 months…”
I found the real-world example Pinker used to map out the trajectory of language development in infants quite fascinating. The monthly samples of sentences spoken by two-year-old Adam show remarkable development in grammatical complexity and fluency over the course of just one year. What begins as short, terse phrases quickly blossoms into rich, multi-clausal constructions, revealing a brain rapidly learning to express nuance, intention, and imagination. Pinker highlights this transformation not just to illustrate increasing vocabulary, but to show how children intuitively grasp the abstract rules of syntax long before they can consciously articulate them. Adam’s progression from “I got horn” to “I want to have some espresso”, as well as statements like his imaginative comparison between himself and a baby elephant, makes visible the astonishing speed and creativity of early language learning. This development obviously shows a rapid improvement in language abilities, but it also makes me wonder whether it reflects not only language, but also the growing complexity and maturity of infant thought and feeling in general, unfortunately another piece of the hard-problem puzzle we will probably never solve.
“Children learn languages that are governed by highly subtle and abstract principles, and they do so without explicit instruction or any other environmental clues to the nature of such principles.”
ReplyDeletePinker’s point here really stood out to me because it highlights the paradox at the heart of language learning: kids master something incredibly abstract without ever being taught it. It makes me question how much of grammar is actually “learned” at all, versus how much is already shaped by innate constraints in the mind. If children aren’t given explicit rules, and the input they receive is incomplete or noisy, then the fact that they still converge on the same adult grammar seems almost impossible without some built-in structure. But that raises the bigger question of what exactly is innate here. Is it a full inventory of grammatical possibilities, or something closer to a set of cognitive biases that steer the child toward the right hypotheses? And if the constraints are actually biological, how do we reconcile that with the enormous diversity of languages that kids acquire so effortlessly? It makes me wonder where the line is between what the environment provides and what the mind already “expects” to find.
From what I understood, yes, the fact that children reach adult grammar eventually and easily means some linguistic structure is innate. However, it is not that a list of grammatical rules is input into the child. Rather, children are born with abstract categories and constraints that allow them to ignore irrelevant variation in speech and focus only on the patterns that matter in its structure. For example, they expect a sentence to contain a subject and an action, and that expectation is innate. This pattern recognition guides their interpretation of the language they hear and lets them develop the grammar of the language surrounding them. So children have an innate system flexible enough to accommodate any language, yet restrictive enough to prevent incorrect grammar from emerging: through environmental cues and their innate capacity to detect them, children narrow down the grammatical rules their language requires. So, I’d say that cognitive biases are the answer here.
DeletePinker’s article was super helpful in understanding the importance of negative evidence. Through our discussion of UG (and hence my skywriting in 9b, as I wrote it before 9a), I couldn’t grasp why negative evidence was required (except if the knowledge/capacities are innate, i.e. UG). In my 9b skywriting I asked why negative evidence is “stronger” or “more formative” than positive evidence, and Pinker really clarified the concepts. In short, he states that children hypothesize a language that is a superset of the target (desired) language, i.e. their set is composed of the target language plus errors (ungrammatical sentences). This is referred to as the overshooting problem. To correct it, positive evidence isn’t enough, as it would confirm the elements of the target language but would never discredit the ungrammatical sentences. This is the incorrigibility of the superset in the absence of negative evidence. It really helps in understanding where UG comes in, and why something (here, innateness) has to stand in for the negative evidence.
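The overshooting problem described above can be sketched concretely. Here is a toy illustration (the mini-grammars and sentences are my own hypothetical examples, not from Pinker's text): if the child's hypothesis generates a strict superset of the target language, then every grammatical sentence the child hears is consistent with both hypotheses, so positive evidence alone can never reject the superset.

```python
# Toy sketch of the subset problem in learnability theory.
# (Hypothetical mini-"languages" as sets of sentences; not from Pinker's text.)

target = {"I went", "I ran"}                    # the adult (target) language
superset = target | {"I goed", "I runned"}      # child's overgeneral hypothesis

# Positive evidence: sentences the child actually hears (all grammatical).
positive_evidence = ["I went", "I ran", "I went"]

# Every positive example is consistent with BOTH hypotheses, so hearing
# more grammatical sentences never falsifies the overgeneral superset.
consistent_with_target = all(s in target for s in positive_evidence)
consistent_with_superset = all(s in superset for s in positive_evidence)
assert consistent_with_target and consistent_with_superset

# Only negative evidence (being told "I goed" is ungrammatical) could
# distinguish the two hypotheses -- and children rarely receive it.
negative_example = "I goed"
print(negative_example in target)    # False: the target grammar rules it out
print(negative_example in superset)  # True: the child's hypothesis allows it
```

The point of the sketch is just that no amount of positive data shrinks the hypothesis; something else, whether negative evidence or innate constraints, has to do the eliminating.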
ReplyDelete“Children begin to understand words before they can speak them… They must be sorting the sounds directly.”
ReplyDeleteI found this part interesting because Pinker shows that babies already organize speech sounds before they even know what the words mean. For me, this suggests that learning a language doesn’t start with vocabulary or rules, it starts with the brain recognizing patterns automatically.
I agree with Pinker that this early ability can’t come only from experience, because babies don’t get enough clear input to figure everything out just from hearing words around them. But at the same time, I think Pinker doesn’t fully explain how babies know which sound differences matter in their language. They still need people talking to them in real situations for those patterns to make sense.
So I think language learning is both: an innate skill that helps babies notice important sound patterns, and a social process that gives those patterns meaning. Neither one works alone.
Pinker argues that Universal Grammar is an innate cognitive framework that shapes how we acquire and structure language. It’s distinct from the specific grammatical rules (Ordinary Grammar) that are learned through exposure.
ReplyDeleteI think the most interesting argument that UG must be innate is that kids neither produce nor hear UG-violations at all. For example, Pinker points out that nearly all human speakers can recognise that a complex sentence like “Who do you believe the claim that John saw?” is ungrammatical, even without ever saying it themselves and receiving corrective negative evidence. This argument for UG being innate depends on the Poverty of the Stimulus: kids successfully acquire language despite not getting explicit corrective feedback about ungrammatical structures.
Pinker and Chomsky both argue that children’s fast language acquisition must be caused by an innate Universal Grammar. They disagree about its origins - Pinker claims it evolved through natural selection, Chomsky treats it as a biological given - but both assume grammar is built into the mind itself.
ReplyDeleteWhile reflecting on this, I’m thinking that the innate part may not be grammar at all, but the general learning capacities humans already have: sensitivity to patterns, the ability to infer hierarchical structure, limits on memory... If these capacities shape what children can easily learn, then languages that align with them get transmitted successfully, and languages that violate them die out.
In that case, the regularities shared across human languages would arise because languages adapt to the brain, not because the brain contains a fixed grammar. The poverty of the stimulus argument still holds - children need innate biases - but the biases are cognitive rather than grammatical. UG becomes the form that emerges when languages pass through generations of learners constrained by the same cognitive architecture.
“The child must have some mental mechanisms that rule out vast numbers of "reasonable" strings of words without any outside intervention.”
ReplyDeleteComing from various linguistics classes, I found that this quote encompasses the biggest question/paradox in language acquisition. Specifically, how can children attain the intricate complexity of adult grammar without the systematic guidance we might assume is necessary? With this question, Pinker introduces Learnability Theory and the dominant theory of universal grammar. Lots of other classmates have covered the nuances of these theories.
As such, what I found most interesting was the distinction between language acquisition and general intelligence. Colloquially, it seems like they would be aligned. Yet, in this paper, I learned about children with Spina Bifida and Williams Syndrome, which completely challenged my earlier assumptions. With these examples, it seems like language is a "species-specific module" that governs what children accept as possible linguistic structures, excluding certain grammatical possibilities, rather than relying on error feedback.
I really appreciate your point about how this quote captures the core paradox of language acquisition. What stood out to me most was your discussion of dissociations between language and general intelligence. Pinker’s examples of children with Spina Bifida or Williams Syndrome completely disrupt the assumption that language ability simply tracks cognitive ability. Instead, they suggest that children come equipped with a “species-specific module” that filters out impossible grammatical forms long before experience could teach them otherwise. This makes the case for innate constraints even stronger, since, as Pinker notes, children must rule out countless “reasonable” hypotheses without explicit correction.
DeleteI found that part fascinating too: the way cases like Spina Bifida and Williams Syndrome make it so clear that language ability can be sharply dissociated from general intelligence. It makes clear that language functions as a specialized, species-specific module rather than a byproduct of broader cognitive skills.
DeleteWhat struck me about this is how this ties into the concept of “lazy evolution.” If a trait can be learned through general mechanisms, natural selection won’t bother genetically encoding it. But because language requires children to rule out countless impossible grammars without explicit feedback, evolution seems to have stepped in and built dedicated machinery. It’s precisely the kind of complexity that “lazy” Darwinian processes would have no choice but to hard-wire.
“Cognitive psychology has shown that people think not just in words but in images and abstract logical propositions”
ReplyDeletePinker’s quote here shows how there is an intermediate step between sensorimotor experiences and the acquisition of human language in evolution. Sorting things into categories enables humans to think propositionally because one can assign truth values to features once they are sorted (i.e. this mushroom is poisonous, this mushroom is safe to eat). Unlike other primates, humans have acquired the capacity to translate propositional thought into human language, which involves patterns of symbols arranged according to syntactic rules. Pinker argues that humans must have an “organ” in the brain, along with an exceptionally sensitive pattern recognition ability that allows them to rapidly develop an intuitive rule system to arrange symbols and determine whether or not they make up a sentence.
I found the debate of nature vs nurture in this paper interesting. While Pinker acknowledges that the environment plays an important role, he also argues that environmental input alone cannot explain how children acquire language so quickly and uniformly. What stood out most to me was the concept of poverty of the stimulus in his argument. Children hear messy, incomplete, and sometimes ungrammatical speech, yet they still manage to build full adult grammars without being directly taught the rules. That suggests the input alone isn’t enough; there has to be something internal helping them make sense of it. In fact, Pinker suggests that humans are biologically prepared for language. We have brain areas specialized for speech, a vocal tract shaped for producing a wide range of sounds, and even developmental windows where language learning is easiest. All of this supports the idea that children aren’t starting from scratch; rather, they are equipped with innate tools that guide how they learn language, even from imperfect input.
ReplyDeleteI found it interesting how Pinker tackled the Whorf hypothesis in this reading. Whorf suggested that language acquisition is basically learning to think, since the grammar and categories of our specific language shape how we conceptualize reality. Pinker argues against the strong version of this idea, claiming it is false according to virtually all modern cognitive scientists. The strong Whorf hypothesis states that language determines how the world looks to you as opposed to the weak hypothesis where language only influences your world.
ReplyDeleteIt makes sense when you think about his points: babies demonstrate cognitive ability and can think before they can talk. Also, Pinker points out that people think in images and abstract logical propositions, not just in the ambiguous words of natural language. For example, when we think of "spring," we aren't confused whether it's the season or something that goes boing, which shows that internal thought isn’t purely dependent on external words. He emphasizes that children need a considerable amount of nonlinguistic cognitive machinery in place just to get the language acquisition process started. Therefore, Pinker views language as being grafted onto cognition to label thoughts, rather than determining thought itself.
"But it is unlikely that every such speaker has at some point uttered these sentences and benefited from negative feedback. The child must have some mental mechanisms that rule out vast numbers of "reasonable" strings of words without any outside intervention."
ReplyDeleteI found this paper fascinating because it clearly illustrates that the need for Universal Grammar is inevitable. Children don’t get the kind of negative evidence that would be required to eliminate all the “reasonable” but ungrammatical strings they never hear. And parents respond to truth-value, not grammar, yet children still recover from overgeneralization errors and end up with restrictive adult grammars. This strongly suggests internal mechanisms are guiding them. When paired with evidence for modular language circuitry, like aphasia impairing language but not general intelligence, it points to evolution shaping specialized systems. This aligns with the idea Prof. Harnad describes of “lazy evolution”, in the sense that natural selection only hard-codes what cannot be reliably learned, so in the case of language, where external cues alone are insufficient, there must be sophisticated internal mechanisms guiding its acquisition.