1a. What is Computation?
What is (and is not) computation? (Rule-based symbol manipulation.) What is the power and scope of computation? What does it mean to say (or deny) that "cognition is computation"?
Readings:
Computation is Symbol Manipulation
__________________________________________________
https://claude.ai/public/artifacts/9bddf6ce-d2a1-4bcd-848c-777f9d6dfd25
What is Computation? A Foundational Definition
The Core Definition
Computation is the manipulation of formal symbol tokens according to rules (called algorithms) that operate solely on the symbols' shapes or forms, not their meanings. This definition, crystallized in the work of mathematicians like Alonzo Church (1936), Alan Turing (1936), and Emil Post (1936), captures what mathematicians do when they "compute" - whether calculating the roots of a quadratic equation or proving a theorem.
Symbols and Algorithms
A symbol in this context is any distinguishable object whose specific physical form is arbitrary. The symbols "2", "II", "zwei", or even a red block can all represent the same mathematical concept, but for computational purposes, only their distinguishable shapes matter. What makes computation remarkable is that these symbols can have any form whatsoever - they could be marks on paper, electrical states in a circuit, or beads on an abacus.
An algorithm is a finite set of explicit rules that specify exactly how to manipulate these symbols. Consider the familiar algorithm for addition: given symbols arranged as "2 + 3 = ?", the rules tell us to replace "?" with "5". Crucially, this manipulation depends only on recognizing the shapes "2", "+", "3", "=" and "?" - not on understanding what "two" or "addition" means.
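To make this concrete, here is a minimal sketch (an editorial illustration, not part of the original definition) in Python of such a rule: a lookup table that rewrites the shape "2 + 3 = ?" as "2 + 3 = 5". The table and the function match and replace character strings only; nothing in them "knows" what two or addition means.

# A rule table: input shape -> output shape. The entries are arbitrary
# strings; the executor only matches and replaces them.
RULES = {
    "2 + 3 = ?": "2 + 3 = 5",
    "2 + 7 = ?": "2 + 7 = 9",
}

def apply_rule(symbol_string):
    # Purely shape-based: look the string up and return the rewritten string
    # (or return it unchanged if no rule matches).
    return RULES.get(symbol_string, symbol_string)

print(apply_rule("2 + 3 = ?"))  # prints: 2 + 3 = 5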
The Three Levels of Description
Understanding computation requires distinguishing three distinct levels:
1. The Formal Level
At this level, computation consists purely of symbol manipulations according to rules. When a mathematician applies the quadratic formula, they manipulate symbols according to the algorithm x = [-b ± √(b² - 4ac)]/(2a) without necessarily thinking about what these symbols mean. The process operates entirely on the basis of symbol shapes and rule-following.
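As an editorial illustration (not part of the original text), here is a minimal Python sketch of executing that recipe mechanically; the executor just carries out the steps the formula prescribes, with no notion of what a "root" is (the sketch assumes a ≠ 0 and real roots, i.e., b² - 4ac ≥ 0):

import math

def quadratic_roots(a, b, c):
    d = b * b - 4 * a * c                 # form b² - 4ac
    r = math.sqrt(d)                      # apply the square-root step (assumes d >= 0)
    return (-b + r) / (2 * a), (-b - r) / (2 * a)   # [-b ± √d]/(2a)

print(quadratic_roots(1, -5, 6))          # (3.0, 2.0): the roots of x² - 5x + 6 = 0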
2. The Physical Level
Computation must be physically implemented - symbols must be actual physical objects, and something must physically execute the symbol manipulations. This could be a human with pencil and paper, mechanical gears and wheels, or electronic circuits. The specific physical details are irrelevant to the computation itself, which is why we say computation is "implementation-independent."
3. The Interpretive Level
Humans can interpret the results of computations as meaningful. When we see the output "5" from "2 + 3", we interpret this as meaning that combining two objects with three objects yields five objects. But this interpretation exists in our minds, not in the computation itself.
Implementation Independence
One of computation's most important properties is that the same algorithm can be executed by radically different physical systems while remaining the same computation. Whether you solve "2 + 3" with pencil and paper, an electronic calculator, or mechanical gears, you're performing the same computation because you're following the same symbol-manipulation rules. This is why Church's lambda calculus, Turing machines, and Post systems were proven mathematically equivalent - they all capture the same formal process despite their different notation systems.
The Software/Hardware Distinction
This leads directly to the software/hardware distinction that is fundamental to understanding computation:
- Software is the algorithm - the formal rules for symbol manipulation
- Hardware is whatever physical system executes those rules
The algorithm for computing square roots remains the same whether implemented in biological tissue (a mathematician's brain) or silicon circuits. The computation is the rule-following process; the implementer is just the physical machinery that carries out those rules.
What Computation Is Not
It's essential to understand what computation excludes:
Not semantic processing: Computation operates on symbol shapes, not meanings. A computer executing "cat + dog = ?" doesn't know anything about animals - it only manipulates shapes according to rules.
Not analog processes: A sundial "tells time" through continuous physical processes, but this isn't computation in the formal sense. There are no discrete symbol tokens being manipulated according to explicit rules.
Not the interpreted results: When we interpret computational output as meaningful, that interpretation is our cognitive contribution, not part of the computation itself.
Simple Examples
Consider basic arithmetic:
- Symbols: "2", "+", "7", "=", "?"
- Algorithm: "When you see two numbers with '+' between them followed by '=?', replace '?' with the sum"
- Implementation: Could be done by a human, electronic calculator, or abacus
- Interpretation: Humans understand this as representing numerical addition
The computation consists entirely of the rule-based manipulation of the symbol shapes. Everything else - what the symbols "mean" to us, how the physical device works, why we find the result useful - exists outside the computation proper.
Historical Foundations
The mathematical foundations were established independently by Church (1936) with lambda calculus, Turing (1936) with his abstract machines, and Post (1936) with production systems. Despite their different approaches, these systems were proven equivalent in their computational power, establishing that there is indeed a well-defined, universal notion of what formal computation is.
Turing's particular contribution was showing that his simple abstract machine - capable only of reading symbols, writing symbols, moving left or right along a tape, and changing internal states according to a lookup table - could perform any computation that any mathematician could perform. This is the foundation of what we now call the Church-Turing thesis regarding the nature of mathematical computation.
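To see how little machinery that is, here is a minimal sketch of such a machine in Python (an editorial illustration, not Turing's own notation; the tape alphabet and rule table are made up for the example). It does exactly what the paragraph above describes: read the symbol under the head, consult a lookup table, write a symbol, move left or right, change state. This particular table appends a "1" to a string of 1s, i.e., it adds one in unary notation.

def run_turing_machine(tape, table, state="start", head=0):
    tape = list(tape)
    while state != "halt":
        if head >= len(tape):
            tape.append("_")                         # "_" is the blank symbol
        symbol = tape[head]                          # read
        write, move, state = table[(state, symbol)]  # consult the lookup table
        tape[head] = write                           # write
        head += 1 if move == "R" else -1             # move right or left
        # (this sketch does not guard the left edge; this table only moves right)
    return "".join(tape).rstrip("_")

# (state, symbol read) -> (symbol to write, move, next state)
TABLE = {
    ("start", "1"): ("1", "R", "start"),  # scan rightward past the 1s
    ("start", "_"): ("1", "R", "halt"),   # write a 1 on the first blank, halt
}

print(run_turing_machine("111", TABLE))   # prints: 1111 (unary 3 + 1 = 4)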
Why This Definition Matters
This precise definition matters because it establishes clear boundaries. Computation is a specific type of formal process - rule-based manipulation of arbitrarily-shaped symbols. Recognizing what computation is helps us understand both its remarkable power (any mathematical procedure can be formalized this way) and its inherent limitations (it operates only on syntax, never directly on meaning or semantics).
When we see computation applied in various fields - from weather modeling to game-playing programs - we're seeing this same fundamental process of rule-based symbol manipulation, just applied to different domains and implemented in different physical systems.
References
Church, A. (1936). An unsolvable problem of elementary number theory. American Journal of Mathematics, 58(2), 345-363.
Post, E. L. (1936). Finite combinatory processes—formulation 1. Journal of Symbolic Logic, 1(3), 103-105.
Turing, A. M. (1936). On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society, 42(2), 230-265.
Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59(236), 433-460.
Use the prompt below to query other LLMs (ChatGPT, Claude, Gemini...)
__________________________________________________
“What is computation?” Prompt for LLMs
Please create an introductory explanation of computation for undergraduate cognitive science students who need a clear, foundational understanding. Please provide a comprehensive but accessible explanation that addresses the following requirements:
Primary Question
What is computation, in its most fundamental sense, as originally formalized by mathematicians and logicians in the 1930s?
Essential Guidelines
- Focus on the core definition: Emphasize computation as it was originally defined by Church, Turing, and other mathematicians—as a formal process for manipulating symbols according to rules.
- Avoid technical diversions: Do not discuss the halting problem, Turing completeness, computational complexity, uncomputable numbers, or other computer science technicalities that are not essential to understanding what computation fundamentally is.
- Distinguish levels clearly: Separate (a) what computation is as a formal process, (b) how it gets physically implemented, and (c) how humans interpret computational results as meaningful.
- Emphasize arbitrariness: Explain why the physical shapes of symbols are arbitrary and irrelevant to the computational process itself.
- Implementation independence: Clarify why the same computation can be performed by different physical systems (humans, machines, etc.) without changing what the computation is.
- Avoid cognitive speculation: Do not speculate about whether human cognition is computational, whether brains compute, or related questions about minds and consciousness.
- Use concrete examples: Provide simple, clear examples (like basic arithmetic) that illustrate symbol manipulation according to rules.
Required Structure
- Begin with the most basic, foundational definition
- Explain what symbols and algorithms are in this context
- Clarify the role of interpretation versus the computation itself
- Address the software/hardware distinction
- Provide appropriate scholarly citations to foundational sources
- Conclude with why this definition matters for understanding computation as distinct from other processes
Citation Requirements
Include citations to primary sources (Church, Turing, Post) and authoritative secondary sources, but avoid citing popular science articles, technical computer science papers about complexity theory, or speculative pieces about artificial intelligence and consciousness.
Please write as if explaining to students who need conceptual clarity rather than technical expertise, using language that is precise but not unnecessarily technical.
__________________________________________________
Instructions for commenting: Quote the passage on which you are commenting (use italics, indent). Comments can also be on the comments of others. Make sure you first edit your comment in another text processor, because if you do it directly in the Blogger window you may lose it and have to write it all over again.
"The computation consists entirely of the rule-based manipulation of the symbol shapes. Everything else - what the symbols "mean" to us, how the physical device works, why we find the result useful - exists outside the computation proper." This is an interesting passage, as is every passage trying to define what computation is and is not. Although, it seems rather... redundant, uninteresting to define 'computation', to be exactly sure of its meaning without place for nuance. If computation is about representation, then one could have a stratified idea of computation, with lower- and higher-level abstractions of the concept (the functional and imperative models, respectively). Could it be that computation itself, as in the idea, depends solely on the model of the world on which it acts?
Quoting the readings: "The models are judged not by whether they are correct, but by whether they are useful." As agents, ergo models, manipulating representations of reality constantly, is one clear-cut definition useful, compared to thousands of higher-level ones, tailored to subjective experiences or unique contexts?
Camille, Kid-Sib did not understand your comment (and neither did I).
Before getting into the "nuances" of computation: What is computation? What did Turing (and the others who co-invented computation) say it was? How did Claude (the Chatbot) put it together? What happens when you take Claude's prompt text and feed it to another Chatbot? What do they say computation is? It's alright to give your own opinions, but first show that you have grasped the basics, and can explain them to Kid Sib.
***EVERYBODY PLEASE NOTE: I REDUCED THE MINIMUM NUMBER OF SKYWRITINGS. BUT THE READINGS ARE **ALL** RELEVANT TO AN OVERALL UNDERSTANDING OF THE COURSE. SO, EVEN IF YOU DO NOT DO A SKYWRITING ON ALL OF THEM, AT LEAST FEED EACH READING YOU DO NOT READ TO CHATGPT AND ASK IT FOR A SUMMARY, SO YOU KNOW WHAT THE READING SAID — OTHERWISE YOU WILL NOT HAVE A COMPLETE GRASP OF THE COURSE TO INTEGRATE AND INTERCONNECT FOR THE FINAL EXAM.***
There is no clear answer to what computation is yet, but there are important concepts associated with it, such as behavioural equivalence: if something/someone produces the correct answer to a problem, they are considered to have solved the problem regardless of the methods or procedure used. For example, someone who is able to solve a math equation by simply blurting it out after some thought is considered to be equivalent to someone who solves the same equation using pen and paper to write the procedure down. This idea, of the results’ importance over the procedure, is central to computation.
Isabelle, congratulations on being the second student in the course to do a skywriting. But I’m curious to know what you read that made you answer what you did? The answer to “What is computation?” is the opposite of what you said! Computation is a method or procedure. (What is that method or procedure?) What you said is that if someone produces the correct solution to a problem, they have produced the correct solution to the problem regardless of how they did it. That is surely true, but it is a tautology. It's also true that computation is (among other things) a way to solve problems: But what is that way? Do you really think that just blurting out the answer to a problem after some thought is computation? What kind of “thought”? And, while we’re at it, what is thought? What’s certain is that kid-sib would not have any idea what computation was from what you said. Please read the sky reading “What is Computation? A Foundational Definition” above, and try again.
I understand the sentiment, Isabelle. Thank you for your comment! But I think there is some missing complexity. The idea that computation can be produced by differently operating systems does not negate the fact that the computation must follow the same algorithm, regardless of the mechanical process. As stated in the reading, the definition of “computation” is important for establishing its boundaries. In other words, Model 1 trained to respond with a randomly* generated number to any input calculation can output the same result as Model 2, which is trained to manipulate the input symbols following the specific calculation algorithm it has been fed (e.g., if input is “2+5=?”, the output will always be “7”). However, Model 1 isn’t performing a computation; it isn’t manipulating any symbols even if it outputs the correct result.
Delete*note: I acknowledge that in the field of computer science, there is no such thing as a true random number.
Nicole, spot-on on both counts (including the fact that an algorithm cannot generate a random number, only a pseudo-random one). Why not?
Sorry! I based this off a reading that wasn't listed here but was put in the syllabus. Computation is something that is hard for me to grasp so I must have misunderstood the reading. The lecture clarified it a bit for me but I'll reread it soon to better grasp the concept.
Computation is defined as the “rule-based manipulation of arbitrarily-shaped symbols.” But what is a “rule” here? The passage also states that “computation is the manipulation of formal symbol tokens according to rules (algorithms) that operate only on symbol shapes, not meanings,” implying that rules and algorithms are synonymous; and yet, it defines an algorithm as “a finite set of explicit rules that specify exactly how to manipulate these symbols.” Algorithms are said to be built from rules, and yet rules are also described in algorithmic terms. Are they one and the same then? If not, does computation occur on the level of the rule or the algorithm?
Elle, algorithms are the rules for manipulating symbols. The best example is the rule (or “algorithm”) we all learned for how to manipulate the symbols in the “quadratic equation” ax² + bx + c = 0 in order to find out what "x" is (it’s mentioned in Claude’s “Foundational Definition” above).
Do you remember the rule [-b ± √(b² - 4ac)]/(2a)? If you manipulate the symbols blindly and mechanically, following that rule, you'll end up reaching the number that "x" turns out to be.
Of course you can't start with just that: you have to know how to manipulate x +1 = 2 and a lot of other elementary symbol manipulation rules from arithmetic and algebra before you get to quadratic equations.
But the path is all the same: algorithms are just recipes for manipulating symbols according to their shapes. You can do computation like using a cookbook, without having any idea what (if anything) the symbols mean (and a lot of us did it that way before we even learned what “2 + 2 = 4” means).
We eventually learned what the symbols mean, but that “meaning” is not part of the computation that gets you to the right answer (“x”). The computation is just the rule for manipulating the symbol-shapes to get there.
This will all come up again when we get to Searle’s Chinese Room Argument (Week 3) about why cognition is not just computation.
And even before that, the verbal Turing Test (T2) and the robotic Turing Test (T3) (Week 2).
And it will come up again when we get to the symbol grounding problem (Week 5); and then when we get to meaning and language (Weeks 8 and 9).
Computation is the process in which a symbol is manipulated through an algorithm by an active user. In simpler terms, computation occurs when you or a machine perform a series of steps on an object to achieve an outcome. A classic example is addition, where the numbers (symbols) are manipulated by humans (the acting outside component) through a mathematical process like addition, multiplication, etc. (the algorithm). There are a few important notes to make. Firstly, despite the method of implementing the computation, the results should always be the same (i.e. whether you perform it, a computer, a Turing machine, etc). Secondly, the interpretation of the outcome is an external process from the computation itself. Although computation may seem like a complicated notion, it can be as simple as calculating 2+2 and is found in most areas of modern life.
Please let me know if this was kid-sib friendly and accurate to the skyreading!
Question: when we do skywritings, are we allowed to comment on the other readings assigned in the same week, or should we only comment on the reading on the skywriting page?
Jesse, good, kid-sibly reply! And correct -- though [minor point] I wouldn't call a machine doing computation a "user"... although the machine may have a [human] user: I would call a machine, if it is doing computation, the "executor" of the symbol manipulations.
(But of course organisms, including humans, are "machines" too [what is a machine?], so machines can really be users too.)
In skywritings you can comment on any reading (but please include some sign that you have read and understood that week's readings and lectures too). And you can comment on one another's skywritings too!
Great response Jesse! I liked how you made it easy to understand that the interpretation or meaning of the symbols and the outcome is different and external from the computation! I also like how you gave an example and you related what you were sharing with something easy for a reader or LLM to understand! One piece of advice I may give is to use a bit more precise language. For example, when you mentioned "you or a machine PERFORM a series of STEPS on an object", maybe being a bit clearer with what you mean when you say "perform"? Do you mean physically ordering a series of steps? Or computing? And what are the "steps" specifically in this context referring to regarding this "object"? Are "steps" referring to an algorithm or objects? Making the language a bit more exact will help a LLM/reader not draw their own conclusions about what you mean and help them get to the truth more directly without any confusion. But all in all this was a great response and I enjoyed reading it!
DeleteRachel, we'll discuss this soon in the course when we get to the Turing Machine. Meanwhile ask an LLM what a Turing Machine is and what it does.
Computation is the process of recognizing arbitrary symbols to produce a desired result. ‘Symbols’ are any type of information (such as “+”, “Oui”, or a blue balloon) that have distinguishable shapes that represent a mathematical concept and can therefore be recognized to fulfill a computation. An 'algorithm' is a specific set of rules of how these formal symbols could possibly be ordered to carry out the computation. However, it is important to note that computation does not rely on the interpretation of these symbols, only the manipulation of symbols following a certain rule. For example, computation does not rely on the interpretation that the symbol “+” means addition, but rather it focuses on the recognition of the formal symbol to fulfill the desired computation and achieve a result. Additionally, computation is not only the manipulation of symbols on the formal level, but computation must be physically carried out by some implementation method, such as a human with pencil and paper, mechanical hardware, electronic circuits, etc. No matter what physical method is used to implement the formal symbols, the computation stays the same and is unaffected. The physical implementation of the formal symbols is called ‘hardware’, and the algorithm or set of rules of symbols is called the ‘software’.
The definition of computation is vital to acknowledging that formal symbols that look (or seem) different from one another could have the same meaning, but the only thing that matters for the computation is the distinct shapes to create a desired output. This means that all fields and disciplines could use computation to formalize any structure, creating more possibility for efficiency and ease. On the other hand, the definition of computation also shows us the pitfalls of computation, in that it solely relies on symbol recognition devoid of meaning or interpretation, possibly creating a need for intentional interpretation down the line (in the form of occupations, etc).
Also please let me know if this is kid-sib friendly and on the right track/too long/too short, etc! Thank you!
Rachel, that was pretty kid-sibly, but (at 300 words) too long!
“Represent a mathematical concept” contains not one but two WWs! And what do you mean?
Computation is rule-based manipulation of “symbols” based on their arbitrary shapes. (What does “arbitrary” mean here?)
Most of what you said is correct or close, and it will get better after Thursday’s lecture. (But do look up “Turing Machine” already so you get a clearer idea of what “manipulating” symbols means: what does a TM actually do?)
Yes! I will endeavour to shorten it!
I think what I was mentioning when I said symbols can "Represent a mathematical concept" was that symbols can be seen as mathematical symbols and squiggles, and symbols are strung together into rules called algorithms. In retrospect though, I don't think I would use "mathematical concepts" to refer to symbols as symbols could be anything, not just mathematical symbols!
Arbitrary in this case means 'random' in the sense that the symbols don't need to be anything specific, but they can be anything under the sun. This is because the symbols that are manipulated within computation don't need to be a specific symbol, but rather they just have to be able to be read and manipulated by a physical computational machine, regardless of the specific symbol shape, etc.
Rachel, "arbitrary" here means that the shape of symbols does not resemble what (if anything) they can be interpreted (by their users) as meaning.
"Humans can interpret the results of computations as meaningful."
I find it interesting to think about how the formal level of computing differs from the human interpretive level by considering how a Turing machine functions. For example, suppose a Turing machine were programmed to solve a quadratic equation to find x (as x = (-b ± sqrt(b² - 4ac))/(2a)). In that case, the Turing machine only takes as input the currently observed cell and the current state that it’s in before giving output. It simply follows the rules as it is programmed, which is the essence of computing. Meanwhile, humans solving the quadratic equation understand that each step they take to solve it contributes to the greater goal of finding x. So, while humans also follow rules to compute the answer, there is another layer to how we solve the equation.
Jenna, you’re spot on that that’s what Turing Machines (and computers, and mathematicians, and students) do when they are solving quadratic equations, but only mathematicians — and advanced students, once they learned how and why to do the manipulations — actually understand what they are doing in manipulating the symbols, and why, and what a “root of a quadratic equation” really is.
If they go on in maths, students will eventually understand a lot more. But the computer (and the Turing Machine) won’t. Why not? Does ChatGPT understand more? (If no, why not; if yes, how?)
And how do mathematicians go on to ground their symbols?
After going through 1a's readings, my most important takeaway is clearly defining symbols and algorithms. I've always had a very binary view of computing, imagining it as 1s and 0s, so reframing symbols as arbitrary objects helped me understand a few quirks I've experienced with AI. Reframing, combined with viewing algorithms as systematic manipulations, helped me better understand Conery's explanation of a computation as
"a sequence of state transitions, where a state is defined by a set of symbols."
Conery's explanation, based on Turing's work, clearly sets up the argument that since computation can be performed by different physical systems (i.e., software, hardware, humans, biology), the principle of behavioural equivalence suggests that computation is unlimited in scope.
I'm currently working through the "cognition is computation" debate. It seems like the for-argument is largely contingent on Newell and Simon's physical symbol system hypothesis, which states that
"a physical symbol system has the necessary and sufficient means for general intelligent action."
I understand that, following this hypothesis, both cognition and computation share the "knowledge level" and "symbol level." So, cognitive processes can be understood as the manipulation of symbols. However, I'm struggling to accept the pro-"cognition is computation" argument. Frustratingly, I'm not sure how to articulate my counterarguments, other than that I believe that "feelings" and "emotions" and "love" (which are really unfortunate weasel words) must distinguish a living-breathing being's cognition from a machine's computation. I'm still going through this debate, so I would be happy to hear what others make of it!
Audrey, all good points. When we get to Searle (Week 3) you'll get a chance to weigh the first real counterargument against C=C.
What does it mean to say (or deny) that "cognition is computation"?
This entails a large set of conclusions – I was struck by the ones suggesting a certain form of schematizing of our understanding of cognitive processes.
1) If human cognition is computation, this entails that cognition is the manipulation of a series of symbols which compose the different “states” of the human self, through a process which is either automatic (algorithmic) or a more agentified process (where the human has a greater influence).
a. This brings forward the question of what these symbols are. May they be of different kinds? Can two persons use the same algorithms by using different symbols which form different but equivalent states?
b. When we describe phenomena such as cognitive distortions, or delusions, would the metaphor of a “faulty algorithm” be appropriate? Should we rather speak of an “abnormal agent”?
2) “computation is a discrete process, a sequence of distinct transition” – this would suggest that it is possible to fully describe every cognitive process – that these processes are not “infinitely complex”.
Sofia, before you can argue that cognition is or is not just computation you have to be very clear on what is or is not computation. Please see the other skywritings and my replies.
In the reading “What is Computation? / Computation is Symbol Manipulation,” Conery describes computation as a sequence of states that are defined by symbols, where a state transition is “the result of some process or collection of processes, where a process could be an algorithm being executed on a single computer, a human interacting with an application running on a computer, another computer at a remote site on the Internet, or physical or biological systems that have states that can be represented symbolically.”
As a computer science minor, I found the comparison of computations to algorithms like Dijkstra’s helpful to conceptualize cognition as a sequence of physical state transitions. However, where I struggle is in relating these formal notions of computation to the complexity of human cognition, specifically when it comes to our social and emotional processing. For example, I can see how physical processes like neuronal transmission in our brains could be modeled computationally, or memoization (storing results to avoid recomputation) could relate to our brain’s ability to store and access our memories of previous experiences to guide current decision making. However, in social situations, inputs to our perceptual system are not just symbols like written/spoken language, but include body cues, environmental, and contextual factors. Even within language there exists nuance within tone, syntax choice, etc. All these external factors that we constantly receive as input make it hard to map them onto the traditional algorithmic processes described in the article. Can the social and emotional aspects of human cognition truly be modeled computationally?
Hey Catherine! I had a similar reaction/struggle regarding the "cognition is computation" argument and what I felt was missing. I'm not an expert, but yesterday I ended up exploring concepts like embodied cognition and embodied computation, and they helped me put some of my objections into words.
Looking at the syllabus, it seems like we'll also explore counterarguments in week 3 with Searle's Chinese room argument (and later weeks 11/12)! I'm sure our stances will shift and sharpen as time goes on.
Catherine, you are right that cognition is not just verbal, whereas the verbal version of the Turing Test (T2) tries to produce only Turing-indistinguishable verbal capacity.
But in this course so far we are still only at the stage of asking what computation (not cognition) is. And (according to the Strong Church/Turing Thesis), computation can model or simulate just about anything in the universe.
And if so, computation can model cognition.
But to model something — say, climate change, or covid spread (or a cat or a mat) — is not to be climate changing, or covid spreading (or a cat or a mat). The recipe for a vegan cake is not a vegan cake. If you have the right recipe for vegan cake, you still need the 3D printer and the ingredients to produce the cake. And flour and water are not computation. Neither is a 3D printer — although the 3D printer can prove that the computational recipe can be used to produce something Turing-indistinguishable from a vegan cake, including the Taste Test.
But let’s stick with sorting out what computation actually is, before claiming, or denying, that that could be all there is to cognition (or at least to verbal capacity).
(I think, though, as Audrey says in her comment on your comment, that you might be thinking about embodied cognition, which the verbal + robotic Turing Test (T3) calls for. And a robot can no more be just computation than a 3D printer (or a vegan cake) can be (although these can all be modelled or simulated by computation). This is all leading up to the symbol grounding problem in Week 5, but first we have to make sure everyone clearly understands what computation is -- before we can ask whether cognition is just that, and if not, why not?)
Prof Conery in "Computation is Symbol Manipulation" says computation is a process where symbols (called a state) are transformed through a sequence of steps (symbol manipulation), resulting in new symbols (a new state). The steps are carried out by some agent, for example a computer or even biochemical processes (ex. a biological process is the agent for DNA translation). I found this interesting because it applies the idea of computation to describing natural processes and highlights that “not all computations are defined by an algorithm”. This idea was confusing because it feels contradictory to “What is Computation? A Foundational Definition”, which claims analog processes (ex. a sundial) are not computation, since there are “no discrete symbol tokens being manipulated according to explicit rules”, and which emphasizes defined rules. Am I misinterpreting one of the texts? Are there different levels of definition of computation?
Annabelle, you’re right, and right to be confused by Conery (he was not clear enough); and Claude was right. A computation is an algorithm (a recipe) and it has to be physically executed, whether by a physical mathematician with physical pencil and paper, a physical Turing Machine, a physical finite-state machine (i.e., a digital computer) or a physical biological system — although biogenetic programs are a mixture of computation (the blueprint) and 3D printing (the protein-synthesis following the recipe).
And a sundial really is not a finite state digital device doing computation (whereas a digital watch is). Both can be used to tell time (for us). And of course a sundial can be simulated computationally. But a simulated sundial is not a sundial!
I was also confused as to how analog devices are considered. Conery writes that it does not matter if the transition is the result of several, more detailed, symbol manipulations, or a single transition performed by an analog device or some other system best described as continuous. From my limited understanding, I might propose that analog devices may be seen to perform calculations, if the transitions can be broken down into smaller, discrete steps. I understand that a sundial may not fit this definition, as I cannot imagine how its mechanism could be broken down into steps with discrete symbol tokens being manipulated according to explicit rules, which Claude described as computation. I am curious to see what other devices we consider to be “analog”, besides the sundial, and if the seemingly continuous mechanisms could be broken down into steps which do satisfy Claude’s criteria.
Kira, analog devices used by humans include compasses, telescopes, vacuum-cleaners, lawn-mowers, bicycles and cars (though these days some often also contain computational parts). Analog devices are just physical systems that do whatever they happen to do, but we use them for something we need to do (like tell time or get to lecture on time). They can be modelled or simulated computationally, but they are not themselves computing.
Thank you for your comment! From our recent discussions in class and content in 1b, I am beginning to grasp the topic much better. I am still curious as to what Conery envisioned when he said computation could be performed by an analog device, as I agree that all the devices you have listed are not computing.
I also realised the formatting on my original post was erased! Sorry about that, I’ll make sure to italicise in the future.
Kira, an analog device is just any physical device we use. If we build a sundial that casts a shadow to "compute" time, we are not computing time: what does compute mean? Is there symbol-manipulation going on?
Computation is a well-defined, step-by-step process used to manipulate symbols. Computation requires an algorithm, a specific rule, to manipulate symbols which are “distinguishable objects whose specific physical form is arbitrary”. At its formal level, computation purely works on using the shapes to follow the rule to produce an output. The interpretation of the result that the computation produces is based on our perception of it. This is an important distinction to make as it means that the interpretation is based on the perceived meaning of the symbols and not the computation itself.
Maya, that's it. (I don't mind that it's being repeated in various different ways in the skywritings, because there are still some who have not quite got it yet...)
Upon reflection, I have come to realize the utmost importance of the Church–Turing Thesis primarily discussed in “What is Computation? A Foundational Definition”. It proposes that lambda calculus (developed by Church) and the Turing machine (developed by Turing) are equally powerful models of computation: the process where symbols are manipulated according to explicit rules. In other words, both formal systems use different symbols and rules but ultimately arrive at the same outcome. What struck me is how these abstract inventions became the blueprint for modern computers; once we knew any process could be translated into symbols and operations, it made sense that machines could do it too. That said, as Copeland exclaims in “What is a Turing Machine” modern computers are unfortunately still limited by memory and speed. This brings me to wonder whether the ongoing push to close this gap might reveal further practical limits to computation.
ReplyDelete“It proposes that lambda calculus (developed by Church) and the Turing machine (developed by Turing) are equally powerful models of computation.”
I second that point, and for me, the universal independence emphasized in "What is Computation? A Foundational Definition" is the most interesting point uncovered. Computation’s rules-driven manipulation of symbols remains the same regardless of the executor, biological, silicon, or even a pencil. In my opinion, the fact that a system is universal is very strong, but at the same time, it points out the fact that, regardless of the executor’s speed or memory limits, the logic of the system is clearly tied to a syntax that is devoid of any meaning. It is not hard to scrutinize and ask if the symmetrical system of interpretation and manipulation actually marks the real boundary of computation’s scope.
-Gabriel Gatt
Gabriel & Gabe, well done (mostly because you all read Claude and not just the other 2 papers).
Gödel, Church, Turing, Post, Kleene and others all tried, around the same time (the 1930s), independently, to formalize what it is that mathematicians are doing when they are “computing.” Although their formulations looked very different, they were all proved to be equivalent (and Turing’s Machine was the simplest).
Eventually it came to be called the (Weak) Church/Turing Thesis, and the confidence that the thesis was right increased, both because of the proofs that all the versions were equivalent, and because no one, to this day, has yet found a counterexample (neither a human computation that cannot be done by a Turing Machine nor vice versa).
The C/T Thesis nevertheless remains a Thesis, not a Theorem: it cannot be proved to be true, just disproved by one counterexample.
I wouldn’t worry too much about speed and capacity limitations as constraints on what can be done with computation — or, rather, worry about the human damage to the environment that they add considerably to, not about what can’t be done with computation for lack of enough speed and capacity…
As mentioned by other students, computation is the manipulation of symbols according to rules. A “symbol” is any object that can be distinguished from another, regardless of its appearance. According to Claude’s definition, computation is not semantic: “Computation operates on symbol shapes, not meanings. A computer executing ‘cat + dog = ?’ doesn't know anything about animals - it only manipulates shapes according to rules.”
Thus, when we interpret the results of a computation as meaningful (such as cats and dogs being animals in the real world, not just symbols), that interpretation comes from our own minds, not from the computation itself.
To say that “cognition is computation,” would suggest that cognition can be reduced to rule-based symbol manipulation. However, I’m inclined to deny that cognition is merely computation. While the brain can carry out computations, our cognitive ability to tie meaning and interpret outputs seems to require semantic processing.
Adelka, “interpretation comes from our own minds”: but cognitive science is trying to explain what “minds” (a WW) are. All we know is that the interpretation comes from our heads.
And these days, when we want to deny that computation can produce thinking and understanding, we have to face the remarkable performance capacity of ChatGPT.
I found it interesting how much emphasis Claude placed in its explanation on distinguishing computation from meaning and/or semantics. It explains that a limitation of computation is that "it operates only on syntax, never directly on meaning or semantics." While I understand the point it is trying to convey here, a statement like this reminded me of our discussion about "Weasel Words" in lecture (especially since the word “meaning” was specifically mentioned). Do symbols and/or objects not have inherent “meaning” beyond how they are interpreted by a sentient observer? How can a computing agent read and act on them if they have no “meaning”?
Liam, symbols — in both natural language and formal mathematics (which is a syntactic subset of natural language) — are just arbitrary shapes.
You asked: “Do symbols and/or objects not have inherent ‘meaning’ beyond how they are interpreted by a sentient observer? How can a computing agent read and act on them if they have no ‘meaning’?”
What do you mean? Arbitrarily shaped symbols are just arbitrary shapes, like marbles or squiggles on paper. If they “have” any “meaning,” it is in the head of the symbol user, such as the reader of a text covered with English words or the quadratic equation root computation recipe. It is not “in” the symbols, or their manipulations.
Maybe you mean something like: Does “the cat is on the mat” (or “2+2=4”) really mean that the cat is really on the mat (or that two plus two really equals 4)? Or that “the cat is on the mat” (or “2+2=4”) is really true?
Well, yes, if the proposition is really true (and the quadratic recipe is really correct). But these are ontological questions in philosophy. In cognitive science the question is: What is the causal mechanism (in the brain, or anywhere else it occurs) that produces cognitive capacity that is Turing-indistinguishable from the cognitive capacity of an average, neurotypical human being? In the case of Turing’s T2, the question is: What is the causal mechanism (in the brain, or anywhere else it occurs) that produces verbal capacity that is Turing-indistinguishable from the verbal capacity of an average, neurotypical human being?
The article “what is computation” made me think about the importance of functionality in our approach to tackling the computation = cognition question. It seems like there are many useful outcomes of explaining the brain in terms of computation. For example, the problem of a malfunctioning cochlea. We know quite a lot (relatively) about our primary sensory systems like hearing. So, some very smart people were able to mimic the cochlea in a way that allows people who use this device to hear sounds. Those electrical signals eventually disperse throughout the brain, giving rise to a person’s personal experience of sound. Ergo, with a device giving rise to the intensely emotional experience of sound, the end goal is relatively behaviourally equivalent. Of course, there is no device that can actually create that particular experience of feeling; for now we can only transduce those signals to pretty low level areas of the brain. I guess my point is if there are functional applications to this cognition = computation school of thought, like the cochlear implant, then we might as well continue this road of inquiry despite its obvious limits.
Pippa, cochlear implants are sensorimotor prostheses like hearing aids and artificial limbs. They do not hear or feel. (But maybe more parts of the brain will prove to be replaceable by prostheses.)
The three levels of description in computation in the article 'What is Computation? A Foundational Definition' remind me of Marr’s three levels of analysis in neuroscience. The formal level (symbol manipulation by rules) aligns with Marr’s computational level, which specifies what problem is being solved. The physical level (the medium implementing computation) corresponds to Marr’s implementation level, describing how neurons or circuits realize the process. The interpretive level (how humans assign meaning) resonates with Marr’s algorithmic/representational level, which bridges raw computation and meaningful interpretation. Both frameworks stress that in order to understand complex information-processing systems it is crucial to examine the differences between all levels and how they build onto each other.
Alexa:
Turing’s Formal Level (1) is Marr’s Algorithmic Level (2).
Turing’s Physical Level (2) is Marr’s Implementational Level (3).
Marr has no Interpretative Level (3).
Marr’s Computational Level (1) is part of Turing’s Formal Level (1).
Dichotomies rarely align exactly. Trichotomies even less.
“Not semantic processing: Computation operates on symbol shapes, not meanings.” This reminded me of Searle’s Chinese Room argument that I studied in some of my philosophy and cognitive science classes. In that thought experiment, the person inside is just following a rulebook to manipulate Chinese symbols... they can produce the right answers, but they don’t actually understand Chinese. The same idea shows up in the examples. When we compute 2 + 5 = ?, the algorithm tells us to put 7, and it works perfectly, but it doesn’t actually “understand” what seven means. Like if you give a computer cat + dog = ?, it will still manipulate the symbols and maybe give an output, but it has no sense that cats and dogs are animals in the real world. It’s all rule following, not understanding, which is what computation at the essence is. That’s why equating cognition with computation is controversial. Sure, computation is powerful since it lets us formalize and model processes, but cognition seems to go beyond that by attaching real meaning to the symbols in ways a purely computational system never could.
Shireen, we'll get to Searle's Chinese Room Argument that C≠C in Week 3. Here, we're still sorting out what computation is or is not. We know that the propositions of natural language (like "the cat is on the mat") mean something, and when we talk we understand what we are saying and what the other person is saying. In mathematics, the propositions also mean something, e.g., "2 + 2 = 4". We know what that means, and we know that it's true. But mathematical propositions are a subset of natural language. So we're speaking English when we say "2 + 2 = 4". What are we doing when we do computation, say, something somewhat harder than adding two plus two: long division: 6468/7 = ? We do some symbol manipulation, using some memorized recipes, either with pencil and paper or in our heads.
Ask yourself whether any understanding is needed to compute that "1234/3 = 411⅓".
Is it just this:
The algorithm for long division of a 4-digit number by a single digit:
Setup: Write the 4-digit dividend inside the division bracket, single-digit divisor outside to the left.
Step-by-step process:
1 Start with leftmost digit: Can the divisor divide into the first digit?
◦ If yes: divide, write quotient above, multiply back, subtract remainder
◦ If no: move to first two digits
2 For each subsequent digit:
◦ Bring down the next digit to the right of any remainder
◦ Divide the divisor into this new number
◦ Write the quotient digit above the division line
◦ Multiply: quotient digit × divisor
◦ Subtract this product from the number you divided into
◦ The result is the new remainder
3 Repeat until all digits are processed
4 Final remainder (if any) is what's left after the last subtraction
Example: 1,234 ÷ 3
     411
    _____
3 | 1234
    12↓
    ---
     03
      3
     ---
      04
       3
      ---
       1
Result: 411 remainder 1 (or 411⅓)
The algorithm systematically processes digits left-to-right, using the fundamental division/multiplication/subtraction cycle at each step.
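Here, as an editorial illustration (a sketch of the recipe above, with single-digit arithmetic standing in for the memorized one-digit division tables), is the same left-to-right procedure in Python:

def long_divide(dividend_digits, divisor):
    quotient, remainder = "", 0
    for d in dividend_digits:                 # process digits left to right
        current = remainder * 10 + int(d)     # "bring down" the next digit
        quotient += str(current // divisor)   # write the quotient digit
        remainder = current % divisor         # subtract; keep the new remainder
    return quotient.lstrip("0") or "0", remainder

print(long_divide("1234", 3))   # prints: ('411', 1), i.e., 411 remainder 1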
The part of this paper that struck me is that reasonable proof of intelligence lies in a machine's ability to trick humans into thinking it is one of them. This idea is tested by the Turing Test, in which an interrogator is asked to tell apart a human from a computer through a defined set of interactions with both. However, where is the line drawn? Is a computer intelligent if it succeeds in tricking us at a rate of 80%? Must it virtually never fail?
Frank, the example for Shireen above is of what Turing meant by "computation": it's what a Turing Machine does: manipulate symbols according to a recipe based on their shapes.
The (Weak) Church/Turing Thesis is that all computation that mathematicians can do Turing Machines can do, and vice versa.
The Turing Test is Turing's method for testing whether a causal mechanism can do everything a human can do, completely indistinguishably TO any human FROM any other human, lifelong; if it can, the causal mechanism explains the capacity. (Turing called it "intelligence" but today he would say "cognitive capacity.")
Whether computation alone can be the causal mechanism that can produce that is not known, because neither computation alone nor any other causal mechanism has yet been shown to be able to do all that.
But you will learn this week the difference between T2 (a purely verbal Turing Test of verbal capacity) and T3 (a Turing Test of both verbal and sensorimotor [robotic] capacity).
But what's true of both T2 and T3 is that they are testing cognitive capacity, not "imitation". If a mechanism produces cognitive capacity that is indistinguishable from anyone's cognitive capacity, lifelong, what more can you ask?
But is T2 or T3 (or something else) the right T-Test? (And what is a causal mechanism [or "machine"]?)
Computation is considered the manipulation of symbols according to rules. To further simplify the definition, it can be thought of as a method or procedure developed in the foundational work of Church (1936), Post (1936), and Turing (1936). By symbols we are referring to marks, characters, or tokens whose shapes are arbitrary but clearly distinguishable from one another. Symbol appearance does not matter; instead, the importance lies in how the rules operate on them.
An algorithm is the set of step-by-step instructions that determine how symbols are transformed, a procedure at the heart of computation. Consider an algorithm as beginning with the symbol "2" and applying the rule for adding "3" to derive "5". To avoid confusion, it is essential to separate computation and interpretation. Computation itself is blind to meaning and only involves applying rules to symbols without regard for what they represent. Interpretation, by contrast, is the process through which we assign meaning to the results of a computation. Another vital distinction is between software and hardware. Software refers to the abstract algorithm that specifies the rules for manipulating these symbols, while hardware is the physical system that implements said rules. A clear understanding of computation helps to distinguish it from other processes and allows us to study more complex concepts!
Emily, yes, computation, and the Weak (and Strong) Church/Turing Thesis (what are they?) go a long way. And as to "complexity," computation can encode as complex an algorithm as you like. But there's still that niggling problem of interpretation -- not just the interpretation of computational algorithms but of language -- words, propositions! Stay tuned...
Computation is the manipulation of symbols using rules known as algorithms, which operate based on the symbols’ shapes and not their meanings. Conery similarly defines computation as well defined steps that transform one set of symbols to a new set of symbols. So what distinguishes computation from cognition is being able to find meaning in symbols, like the example given in the paper that “dog” + “cat” means nothing to a computation but these words have meaning to us as humans cognitively. At first as a student who has learned a bit about machine learning I was wondering if ML and AI models can find meaning in tokens and are therefore thinking cognitively based on this definition of computation compared to cognition. However, ML models are trained on a dataset of words and learn by breaking text into tokens (symbols) and seeing their sequences and the statistical relationship between them to better predict which words appear near one another. For example a model could learn statistically that the symbols “dog” and “cat” appear near each other just from reading a lot of text data. Comparatively we learn the meaning of words through real world experiences that let us assign concepts to the words. Therefore, it is my understanding that ML and AI models are still computation and not cognition.
Sierra, yes, Machine Learning and Artificial Intelligence are based on computation (symbol manipulation), but parts of robotics are analog, particularly their sensory and motor components; so even though they are synthetic (i.e., human-made, therefore artificial), they are not computational.
When I read Conery’s (2010) definition of computation, I thought of the rules on how to teach Kid-Sib a concept. Explaining to Kid-Sib isn’t exactly solving a problem, but I think it shares a similar structure. Each explanation involves a sequence of well-defined steps that move the listener from one “state” of understanding to another. Additionally, your explanation to them must be unambiguous so that Kid-Sib can understand. Like computation, success teaching Kid-Sib depends on breaking down concepts into manageable parts, sometimes involving an algorithm of sorts. In this way, to me, teaching Kid-Sib parallels computation as a process of "state transitions".
Sannah, not every systematic, structured process is computation (formal symbol manipulation). And although computation is a subset of language, it is a syntactic, formal subset, not a semantic one. Semantics concerns meaning, and that applies to explaining things verbally to kid-sib, and kid-sib understanding them. We'll be looking more at the similarities and differences between computation and language in future weeks.
“This single step is a computation, just as much as the original sifting operation was a computation; this second computation simply maps an integer *n* to a list of primes less than *n* in a single table look-up step.” From the passage, I will dive into memoization: memoization turns a multi-step computation into a single-step lookup by storing the results of prior computations. Last lecture, we defined information as “a decrease in uncertainty”. Linking our definition of information with the reading on computation: each computational step can be seen as acquiring information, i.e. reducing uncertainty about the result. Memoization stores that acquired certainty, letting future computations skip steps. This is explored in the later passage: “The memoized version that looks up the answer in a table is convincing, but only because we have confidence in the computation that produced the table entry in the first place”. The idea of “having confidence” reflects reduced uncertainty. So, memoization preserves and reuses information. In this way, computation/memoization and information are "parallels" in the reduction of uncertainty.
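(Editorial illustration of the memoization point above: a minimal Python sketch, with a hypothetical prime-sieve function, not code from Conery's text. The first call runs the multi-step sifting computation; every later call with the same n is a single table look-up.)

from functools import lru_cache

@lru_cache(maxsize=None)          # the stored look-up table
def primes_below(n):
    sieve = [True] * n            # the multi-step "sifting" computation
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = [False] * len(sieve[i * i::i])
    return tuple(p for p in range(2, n) if sieve[p])

primes_below(100)   # first call: computed step by step
primes_below(100)   # second call: answered by a single look-up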
Emmanuelle, your reflections are good, but a little too impressionistic. You get a clearer and more specific understanding of what computation is from Claude (if you use the prompt) than from Conery. And uncertainty, as well as uncertainty-reduction, concerns the user of the computation, not the computation itself. A computation, based on data from barometric pressure, humidity, temperature, and wind speed and direction (if the algorithm is correct) can reduce the user's uncertainty about whether to plan a (vegan) picnic tomorrow, but that's not part of the computation.
Computation is a form of symbol manipulation where the manipulation is based on a set of rules. The rules I mention answer questions like: what is the outcome if you click the blue button? As Professor Harnad mentioned in class on September 4th, rules are a "recipe for manipulating symbols". However, computation on its own isn't enough to produce understanding or cognition. It's through active implementation, by a human, a calculator, or an abacus, that this manipulation can take place. But implementation isn't a guarantee of cognition either. The example from class about the study with the blinded cats highlights a key part of this equation: the passive cat didn't gain vision, whereas the active one did. The point I am trying to make is that this is not just rule-following; there must be an active agent, interaction, and engagement, even if that agent is a computer or a calculator. Computation is the structure while cognition is the experience, and so without the agent, the recipe just remains symbols on a page.
I hope this is kid-sib friendly!
Duru, you are conflating cognition and computation here. 1a is about what computation is, not about whether C=C (computationalism), which is coming.
Hi Professor, thank you for the feedback. I see now that I was mixing up computation with cognition; I understand that 1a was meant to focus specifically on what computation is. In my paragraph, I was trying to connect some concepts we've been discussing in class, but I realize now that I shifted too quickly rather than defining and explaining computation itself. After discussing with Claude, I was able to better understand that computation is the mechanical manipulation of symbols according to explicit, formal rules, what you called a "recipe for manipulating symbols". It's a process of transforming symbol structures step by step, following predetermined instructions. Looking through my classmates' comments about the Church-Turing thesis and universal computation, I can see how my original answer missed this foundational point about universality: that the same computational process works identically whether executed by biological systems, silicon, or pencil and paper, and this was exactly what the 1930s mathematicians were trying to establish. I think I got carried away connecting this to broader themes about cognition and implementation.
Duru, bravo, you've caught up!
“A Turing machine is an idealised computing device consisting of a read/write head… The tape is divided into squares, each square bearing a single symbol.”
Copeland’s machine shows how computation is just rule-following with no meaning. Conery adds that one “step” can hide layers of sub-steps. But humans often work this way too. For example, students using the quadratic formula: many just plug in numbers and follow steps without really understanding what a root is. Isn’t that also blind symbol-pushing? If humans sometimes compute without meaning, maybe the line between computation and interpretation is blurrier than the readings suggest.
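A minimal Python sketch of the device Copeland describes (the one-rule unary-successor program is invented for the example; the point is that every move depends only on the current state and the shape under the head):

# Rules: (state, symbol read) -> (symbol to write, head move, next state)
rules = {
    ("start", "1"): ("1", +1, "start"),  # scan rightward over the 1s
    ("start", " "): ("1", +1, "halt"),   # append one more 1, then halt
}

tape = list("111 ")  # unary input: the number 3
head, state = 0, "start"

while state != "halt":
    write, move, state = rules[(state, tape[head])]
    tape[head] = write
    head += move

print("".join(tape))  # -> "1111": the successor of 3, computed blindly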
Randala, the C/C line is not blurred: when humans are doing thinking, they are thinking (cognition); when they are just doing symbol manipulation that they don't understand, they are just computing. But whether the causal mechanism in their brains that is producing what they are doing and what they can do is itself just computation (symbol manipulation) is another matter.
“Computation operates on symbol shapes, not meanings.”
I get why this boundary is drawn, but I’m not sure it’s so clean. If computation only cares about shapes, how do we explain why certain algorithms are judged better than others at solving the same problem? For example, two different algorithms for sorting numbers manipulate the same shapes but one might be faster or more efficient. Doesn’t that suggest computation isn’t just blind symbol-pushing, but also involves qualities like speed and structure? Maybe ignoring those aspects makes the definition too narrow.
Randala, computation is just symbol-pushing, but the user cares about the result; so we prefer to use symbol-pushing algorithms that work, because they deliver the results we want.
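A small Python illustration of that exchange: two algorithms pushing the same symbol shapes to the same result, where the user simply prefers the one that takes fewer steps (the step counter is added only to make the difference visible):

def bubble_sort(xs):
    xs, steps = list(xs), 0
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            steps += 1  # one elementary comparison step
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
    return xs, steps

data = [5, 3, 8, 1, 9, 2]
print(bubble_sort(data))  # same sorted symbols, 15 comparison steps
print(sorted(data))       # same result via a more efficient algorithm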
Simply put, computation is carrying out a set of steps (called an algorithm) to change or organize symbols like numbers or letters, in order to produce a clear result that we can then use for something meaningful. The usefulness of a computation comes when people interpret the result for a purpose. For example, when you alphabetize a list of names, the letters are the symbols, the sorting rules are the algorithm, and the purpose is to make it easier to find someone in the list.
Jad
Computation is:
(1) Symbol Manipulation,
(2) Rule-based,
(3) (Arbitrary) Shape-based,
(4) Implementation-independent, and
(5) Semantically interpretable
Do you know how to explain that to kid-sib?
Based on Jack Copeland’s article, Turing’s “universal machine” shows that any computation can be carried out by manipulating symbols according to fixed rules, as long as the right program is supplied on the tape. Modern computers embody this idea, but AI seems to complicate it. Unlike traditional programs with predetermined instructions, AI systems can generate new patterns, rules, and even symbolic structures as they run. This blurs the line between input and output, since the program is no longer only given but partly produced by the system itself. It raises the question of the order of computation. AI seems to fold in not only the formal but also the physical and the interpretive. This makes it unclear what comes first: the symbols, the rules, or the interpretation.
It also prompts us to ask whether semantics is now part of the process rather than just syntax, suggesting that AI’s way of generating its own rules proposes that meaning itself plays a role in computation, not just the manipulation of symbols.
So my question is, if computation in Turing’s model is strictly syntactic, how do we reconcile that with AI’s apparent integration of semantics? And more broadly, what does this mean for our understanding of the very nature and limits of computation?
Ayla, you're better off using Claude's definition. The others mix in the semantics, but the semantics is not in the computation. It's in the head of the human user or interpreter. The symbols that are being manipulated are just arbitrary shapes. The fact that computation is just syntactic, not semantic, is essential to the definition of computation.
After the readings and lectures, I think I am starting to grasp the concepts of computation and Turing machines better. Computation is the manipulation of symbols based on algorithms. Turing machines are idealized computational machines that contain a tape of infinite length for input, output, and memory storage, running through a head that contains the programmed algorithm.
I am studying psychology and have no previous knowledge of computer science, so I would like to use an example that could help me understand better (it might be completely irrelevant and inaccurate). Is coding in RStudio computation, where the symbols would be any characters typed from the keyboard, the tape the console window, and the code and functions the algorithms? If so, would this mean that we are performing computation (coding) within another computation (the computer)?
Anne-Sophie, R is a bunch of cookbook "recipes" for doing statistical analyses on data. Already written and stored, and applied to the data with a few keystrokes from the user. Yes, it is computation.
My 10-year-old cousin told me that ChatGPT is smarter than him. I thought this was the perfect chance to explain computation in a kid-sibly way.
I tried a bunch of different ways, but the one that finally seemed to click went something like this:
“ChatGPT isn't smarter than you, because it is just doing computation. Computation is following rules to move symbols (like numbers or letters) around. Cognition is what you’re doing when you actually understand what those symbols mean. ChatGPT is just moving letters around based on rules that someone has given it. ChatGPT can arrange the squiggles ‘H-A-R-R-I-S-O-N,’ but only you know that it’s your name.”
I’m not totally confident in how well this maps onto ChatGPT, so if I’ve misunderstood something, please correct me!
So in the end, my cousin decided he’s smarter than ChatGPT. Next time I’ll tell him that “smart” is a weasel word, and I’ll try to explain that in a kid-sibly way.
Lucy, good kid-sib explanation, but double-check that your cousin understands with some more probing...
Computation itself is simply the application of rules to manipulate symbols, without regard to what the symbols mean. As stated in the article, "computation is the manipulation of formal symbol tokens according to rules… not their meanings." That's what was shown by Church, Turing, and Post back in the 1930s. The computer may read a "2" as "II" or a red block; it doesn't matter, provided the rules act on it consistently. Take 2 + 3 being calculated: the computer isn't worried about apples or things; it's just replacing forms with other forms. The amazing thing is that this can be done with paper, gears, or silicon chips, and it's the same calculation. The meaning is only realized when we interpret the result.
Lorena, that's right.
Both Turing and Conery say that computation proceeds in discrete steps transforming one state of symbols into another state of symbols. The Turing Machine helps explain the computation of primitive/atomic operations by computers, but Conery broadens computation by showing that it can be realized by many different kinds of agents, not only what we typically think of as ‘machines’: biological processes that are symbolic state transitions are also computations, and computations are broader than just algorithms. To say that cognition and computation are the same would suggest that human thought processes are rule-based symbol manipulations, but human cognition cannot be reduced to symbol manipulation alone. When we compute 2 + 3 = 5, the fact that humans interpret 2, 3, and 5 as numbers and + and = as operations is external to the actual computation; a computation does not produce an understandable meaning on its own, it simply produces symbols which humans can then interpret as meaningful.
(I am doing this later because I added the class to my schedule later)
Lauren: you're more or less right about computation, but why do you say cognition cannot be "reduced" to computation? (We'll look at reasons that might be true in Week 3, but you do need reasons.) The reason some biological processes could be manipulating symbols is that any physical system that executes an algorithm is computation: computation is independent of the physical hardware that is doing the symbol manipulations:
Computation is:
(1) Symbol Manipulation,
(2) Rule-based,
(3) (Arbitrary) Shape-based,
(4) Implementation-independent, and
(5) Semantically interpretable
Computation, as originally formalized, is 'the manipulation of formal symbol tokens according to rules'. This means that computation is a sequence of operations using symbols and rules to resolve a problem and find a solution. In other words, computation relies on an algorithm to interpret symbols. Moreover, the physical shapes of symbols are irrelevant to computation. In fact, a physical system that uses computation and resolves a problem doesn't necessarily know or understand what the symbols mean; it only recognizes the symbols and follows the given rules to solve the problem without further reflection. Thus, as long as the symbols are distinguishable and consistently interpreted, their physical form does not affect the computation. This explains why different physical systems (humans and machines) can do the same computation without changing the computational process or the result. While reading this paper, it struck me how computation is exactly what AI, such as ChatGPT, does. In fact, ChatGPT is fundamentally a computational system that uses an algorithm to manipulate symbols. For example, it doesn’t truly understand language, but it processes text as sequences of symbols and recognizes those sequences to produce coherent responses based on pattern recognition across different questions asked.
Nada, computation does not rely on algorithms to “interpret” symbols but rather to “manipulate” them based on explicit sets of rules. The word interpret here can suggest understanding or deriving meaning from, which computation does not attempt to do. This distinction is central to machines and LLMs (like ChatGPT, as you noted), which follow mathematical algorithms to process and generate human-like language without genuine understanding.
From what I understand, computation is the process of solving problems step by step by following clear rules. This doesn’t just apply to machines. For example, when humans are given a math equation to solve, they are also computing. The common idea is that computation only happens inside computers, but that’s not true. Computation is basically symbol manipulation, taking information, applying rules, and getting an answer.
However, Alan Turing and John Conery give parallel definitions of computation that differ in perspective. Turing, along with other mathematicians in the 1930s, defined computation in its most basic sense as the formal manipulation of symbols according to rules. The shapes of the symbols themselves don’t matter, as long as they can be clearly distinguished and used in the process. Conery, on the other hand, described computation as a sequence of well-defined steps that transform one state of symbols into another until a solution is reached.
Together, their views show that computation is not tied to machines or meaning but is a rule-based process that can be carried out by machines and humans alike.
Computation is not about producing the right answer through some “understanding” of what the question is asking; it is the blind execution of explicit rules on symbol shapes. For computation to be carried out, the symbols must be distinguishable. Their ‘meaning’ as we understand it is irrelevant; their distinctness is what makes the process work.
Seeing it this way makes two points stand out.
First, meaning exists completely outside of computation. When an algorithm returns "48 R1," nothing in the computation understands that this means "48 cookies and one left over"; that interpretation occurs only in our minds, or via some human instruction for the algorithm to contextually frame the output (which is still framing for humans, by humans; see the sketch after this comment).
Second, it makes no difference how the symbols appear physically. Whether the same algorithm is run with paper and pencil, circuits, or a Turing machine, it is the same computation. The arbitrariness of symbol shapes enables computation to be implemented on any physical system.
Framing it as a procedure rather than an understanding or outcome gets at a fundamental aspect of computation, one that distinguishes the nature of algorithms from the nature of humans.
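A minimal Python sketch of the "48 R1" point (the cookie framing is invented; note that even the friendly wording is produced by more rule-based symbol manipulation, not by understanding):

dividend, divisor = 145, 3
quotient, remainder = divmod(dividend, divisor)  # pure shape manipulation

print(f"{quotient} R{remainder}")  # "48 R1": no meaning inside the machine
# Any "friendly" framing is itself just more symbol output:
print(f"{quotient} cookies each, {remainder} left over")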
The reading made me realize that maybe our brains just aren’t built to solve every kind of problem. It’s strange to think that there might be limits to what human understanding can ever reach.
This idea comes from understanding what “uncomputable problems” actually are. These are problems that, no matter how smart a computer or person is, just can’t be solved by following any fixed set of steps. Some numbers, for example, can’t be completely generated by a program; they go beyond what any system of rules can describe. Alan Turing showed this through his Halting Problem, proving that no algorithm can predict, for every program, whether it will ever stop running or go on forever (a sketch of the argument follows this comment).
Then there’s the idea of behavioral equivalence: two systems can count as “the same” computationally if they always give the same answers, even if they’re built differently. In theory, a person can act like a computer by following every instruction one step at a time. But could the reverse happen? Could a computer ever fully replicate a human brain?
Our brains have about 100 billion neurons constantly sending and receiving signals. And since a single neuron can already be simulated by a computer, it seems theoretically possible to simulate the entire network and thus an entire mind. But if that’s true, and if uncomputable problems exist, it means there are things even a perfect brain-simulation could never solve.
That idea really changes how I think about both human and artificial intelligence. It suggests that no matter how far we advance, there will always be limits to our knowledge, not because we aren’t clever enough, but because those problems exist outside the boundaries of what any mind (biological or mechanical) can compute.
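A standard sketch of the diagonal argument behind the Halting Problem, in Python form (the halts oracle below is hypothetical; Turing proved no such function can be written, which is exactly what the stub marks):

def halts(program, arg) -> bool:
    # Hypothetical oracle: True iff program(arg) eventually stops.
    # No real implementation can exist; this stub only marks the assumption.
    raise NotImplementedError("uncomputable")

def paradox(program):
    # If the oracle says paradox(paradox) halts, loop forever;
    # if it says paradox(paradox) loops forever, halt at once.
    if halts(program, program):
        while True:
            pass

# paradox(paradox) would halt if and only if it does not halt,
# a contradiction: so no algorithm 'halts' can exist.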
In “Computation as Symbol Manipulation,” it is stated that computations defined by algorithms are finite, since “algorithms must terminate in order to ensure their outputs are well-defined.” However, the paper also presents an example of a non-terminating computation using an algorithm, specifically one that generates all prime numbers, a list that is theoretically infinite. The algorithm works by breaking the complex problem of finding all primes into simpler subproblems, applying the modulus operator and testing whether the remainder is zero. How, then, can both statements be true at once: the idea that algorithm-defined computations are finite, and the existence of an algorithm that appears to be infinite in nature?
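A minimal Python version of the prime enumeration in question, which suggests one way to reconcile the two statements: the enumeration as a whole never terminates, but each individual step (each modulus test, each next prime produced) is a terminating, well-defined subcomputation:

from itertools import count

def primes():
    # Never terminates as a whole, yet every divisibility test
    # (the modulus step) terminates and yields a well-defined output.
    for n in count(2):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            yield n

gen = primes()
print([next(gen) for _ in range(10)])  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]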
I found it fascinating that Denning points out how all computation is ultimately grounded in continuous physical processes even though we tend to treat it as discrete and symbolic. It is so interesting and true that we “rarely go this far” because we trust the lower-level steps to be correct, and it made me think about how much of computation depends on abstraction and faith in the layers beneath it. I also thought the description of computation as a functional concept that can occur in biological, chemical, or mechanical systems, like gene expression being seen as a series of symbol manipulations, shows how computation can exist at multiple levels of abstraction. Together these points made me wonder where the boundary lies between computation as a framework and computation as something that actually carries meaning or purpose. If we never question the continuous processes that make symbol manipulation possible, are we missing something important about what computation, and in turn, cognition, really is? Or does abstraction itself make computation possible in the first place?
Computation is rule-based manipulation of tokens, which are squiggles that can take form on any medium where they can be inscribed, whether pen and paper or microchips. Computation is limited to just simulating: you can represent anything using tokens, but no matter what, you can’t create the real thing from it. When Descartes said ‘Cogito ergo sum’, he was reflecting on his feeling of cognition. The feeling itself can theoretically be simulated with tokens, but whether the token manipulation alone creates that epiphenomenon is still up for debate.
***EVERYBODY PLEASE NOTE: I REDUCED THE MINIMUM NUMBER OF SKYWRITINGS. BUT THE READINGS ARE **ALL** RELEVANT TO AN OVERALL UNDERSTANDING OF THE COURSE. SO, EVEN IF YOU DO NOT DO A SKYWRITING ON ALL OF THEM, AT LEAST FEED EACH READING YOU DO NOT READ TO CHATGPT AND ASK IT FOR A SUMMARY, SO YOU KNOW WHAT THE READING SAID — OTHERWISE YOU WILL NOT HAVE A COMPLETE GRASP OF THE COURSE TO INTEGRATE AND INTERCONNECT FOR THE FINAL EXAM.***