From 71d45b51e660558161260e36cca641fddfcde8cc Mon Sep 17 00:00:00 2001
From: Yan Mendes <53550620+kings177@users.noreply.github.com>
Date: Mon, 26 Sep 2022 17:59:34 -0300
Subject: [PATCH 1/2] 43

---
 README.md | 20 ++++++++++----------
 1 file changed, 10 insertions(+), 10 deletions(-)

diff --git a/README.md b/README.md
index 9eb85a9..196f752 100644
--- a/README.md
+++ b/README.md
@@ -2,40 +2,40 @@ The Kindelia Manifesto
======================

What is the true nature of computation?

A hundred years ago, humanity answered that very question, twice. In 1936, Alan Turing
-invented the Turing Machine, which, highly inspired by the mechanical trend of the 20th century, distillated the common
+invented the Turing Machine, which, highly inspired by the mechanical trend of the 20th century, distilled the common
components of early computers into a single universal machine that, despite its simplicity, was capable of performing every computation conceivable. From simple numerical calculations to entire operating systems, this small machine could compute anything. Thanks to its elegance and simplicity, the Turing Machine became the most popular model of computation, and served as the main inspiration behind every modern processor and programming language. C, Fortran,
-Java, Python are languages based on a procedural mindset, which is highly inspired by Turing's invention.
+Java, and Python are languages based on a procedural mindset, which is highly inspired by Turing's invention.

Yet, the Turing Machine wasn't the only model of computation that humanity invented. Albeit a lesser-known history, also
-in 1936, and in a completely independent way, Alonzo Church invented the Lambda Calculus, which distillated the common
+in 1936, and in a completely independent way, Alonzo Church invented the Lambda Calculus, which distilled the common
components - not of machines, but of different branches of math - into a single universal language that was capable of modeling every mathematical theory. What was surprising, though, is that this language could also perform computations. The same algorithms that could be computed procedurally by Turing Machines could also be computed by the Lambda Calculus, through symbolic manipulations. The idea of using the Lambda Calculus for computations inspired the
-creation of an entire new branch of programming, which we call the functional paradigm. Haskell, Clojure, Elixir, Agda
+creation of an entire new branch of programming, which we call the functional paradigm. Haskell, Clojure, Elixir, and Agda
are languages based on the functional mindset, which is highly inspired by Church's invention.
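
To make "computation through symbolic manipulation" concrete, here is a minimal sketch in Haskell (an illustration only, not taken from the manifesto or from Kindelia's code): Church numerals encode the number n as "apply a function n times", and addition and multiplication then fall out of nothing but function application.

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church numerals: the number n is "apply a function n times to an argument".
-- There is no machine state here; computing is just rewriting applications.
type Church = forall a. (a -> a) -> a -> a

zero :: Church
zero _ x = x

suc :: Church -> Church
suc n f x = f (n f x)

add :: Church -> Church -> Church
add m n f x = m f (n f x)

mul :: Church -> Church -> Church
mul m n f = m (n f)

-- Read a Church numeral back as an ordinary Int, only to inspect results.
toInt :: Church -> Int
toInt n = n (+ 1) 0

two :: Church
two = suc (suc zero)

three :: Church
three = suc two

main :: IO ()
main = do
  print (toInt (add two three)) -- 5
  print (toInt (mul two three)) -- 6
```

Running it prints 5 and 6 without any notion of memory cells or machine steps; the "program" is nothing but functions applied to functions.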

-If both Turing Machines (and procedural languages), and the Lambda Calculus (and functional languages), are capable of
-computation, which mindset is the "right one"? When it comes to raw capabilities, neither. Still on the 20th century, it
+If both, Turing Machines (procedural languages) and the Lambda Calculus (functional languages), are capable of doing
+computations, then which mindset is the "right one"? When it comes to raw capabilities, neither. Still in the 20th century, it
was proven that, when it comes to computability, Turing Machines and the Lambda Calculus are equivalent. Every problem that one can solve can also be solved by the other. That insight is known as the Church-Turing thesis, which essentially states that computers are capable of emulating each other. If that was completely true, then the choice wouldn't matter. After all, if, for example, every programming language is capable of solving the same set of problems, then what is the point in making a choice?

Yet, while the Church-Turing hypothesis makes a statement about computability, it says nothing about computation. In other words, a model can be inherently less efficient than another. Historically, procedural languages such as C and
-Fortran have have consistently outperformed the fastest functional languages, such as Haskell and Ocaml. Yet, languages
+Fortran have consistently outperformed the fastest functional languages, such as Haskell and OCaml. Yet, languages
like Haskell and Agda provide abstractions that make entire classes of bugs unrepresentable. Historically, the functional paradigm has been more secure. Of course, these factors can vary greatly, but the point is that this notion
-of equivalence is limited, and there are impactiful differences.
+of equivalence is limited, and there are impactful differences.

In 1983, Stephen Wolfram introduced Rule 110, an elementary cellular automaton that has been shown to be as capable as both. Wolfram argues that this model is of fundamental importance for math and physics, and that a new kind of science should emerge from
-its study. These claims were met with harsh scepticism; after all, if all models are equivalent, what is the point?
+its study. These claims were met with harsh skepticism; after all, if all models are equivalent, what is the point?
Yet, we've just established that, while equal in capacity, different models result in different practical outcomes. Perhaps there isn't a new branch of science to emerge from the study of alternative models of computation, but what about the design of processors and programming languages?

From e63c883932dd65ad1ba61a9300e5f4f1662761e9 Mon Sep 17 00:00:00 2001
From: Yan Mendes <53550620+kings177@users.noreply.github.com>
Date: Wed, 28 Sep 2022 00:11:07 -0300
Subject: [PATCH 2/2] pog

---
 README.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/README.md b/README.md
index 196f752..cabb99f 100644
--- a/README.md
+++ b/README.md
@@ -18,7 +18,7 @@ Lambda Calculus, through symbolic manipulations. The idea of using the Lambda Ca
creation of an entire new branch of programming, which we call the functional paradigm. Haskell, Clojure, Elixir, and Agda are languages based on the functional mindset, which is highly inspired by Church's invention.

-If both, Turing Machines (procedural languages) and the Lambda Calculus (functional languages), are capable of doing
+If both Turing Machines (and procedural languages) and the Lambda Calculus (and functional languages) are capable of doing
computations, then which mindset is the "right one"? When it comes to raw capabilities, neither. Still in the 20th century, it was proven that, when it comes to computability, Turing Machines and the Lambda Calculus are equivalent. Every problem that one can solve can also be solved by the other. That insight is known as the Church-Turing thesis, which
@@ -61,8 +61,8 @@ runtimes. Attempts to solve the issue only pushed it into other directions, such
inhibit parallelism, or garbage collection, which isn't atomic. The failure of the functional paradigm to achieve satisfactory efficiency impacted its popularity, which, in turn, led to tools like formal proofs never catching up.

-This raises the question: is there a model of computation which, like Turing Machine, has a reasonable physical
-implementation, yet, like the Lambda Calculus, has a robust logical interpretation? In 1997, Yves Lafont proposed a new
+This raises the question: is there a model of computation that, like the Turing Machine, has a reasonable physical
+implementation and yet, like the Lambda Calculus, has a robust logical interpretation? In 1997, Yves Lafont proposed a new
alternative, the Interaction Combinators, in which substitution is broken down into two fundamental laws: commutation, which creates and copies information, and annihilation, which observes and destroys information. In a sense, this may resemble SKI combinators, but that isn't a good analogy, since SKI combinators still include non-atomic operations: K
@@ -71,7 +71,7 @@ Interaction Combinators is that its reduction laws are truly atomic: each operat
amount of steps, and has a clear physical mapping. Not only that, they're inherently parallel, in the same sense that the Lambda Calculus has been claimed to be, in theory, but without the issues that prevent it from being so in practice.

-Interestingly, every aspect which is considered good in other models of computation is present on Interaction
+Interestingly, every aspect that is considered good in other models of computation is present in Interaction
Combinators, while negative aspects are completely absent. Moreover, both the Lambda Calculus and the Turing Machine can be efficiently emulated by the Interaction Combinators, while the opposite isn't true. This suggests that, while the three systems are equivalent in terms of computability, the Interaction Combinators are more capable in terms of computation.
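
To give a concrete feel for those two laws, here is a small Haskell sketch (an illustration under simplifying assumptions, not Kindelia's actual runtime): it models only the rule dispatch for a single active pair, using the customary three symbols (an eraser and two binary agents, named Era, Con, and Dup here), and reports how much structure each rewrite creates - a small constant, which is the sense in which every step is atomic.

```haskell
-- A toy sketch of Lafont's two reduction laws. This is not an interaction-net
-- runtime: a real implementation rewrites a graph of agents wired through
-- ports, while this code only classifies what rewriting one active pair
-- produces, and how much new structure appears in its place.

data Agent = Era | Con | Dup        -- eraser, constructor, duplicator
  deriving (Eq, Show)

-- Number of auxiliary ports of each symbol.
arity :: Agent -> Int
arity Era = 0
arity Con = 2
arity Dup = 2

-- The outcome of rewriting one active pair. Both laws delete the pair itself;
-- they differ in what, if anything, is created in its place.
data Rewrite = Rewrite
  { law           :: String  -- "annihilation" or "commutation"
  , agentsCreated :: Int     -- bounded by the arities of the two agents
  , wiresCreated  :: Int     -- likewise a small constant
  } deriving Show

step :: Agent -> Agent -> Rewrite
step a b
  | a == b    = Rewrite "annihilation" 0 (arity a)
                -- observe and destroy: auxiliary ports are wired together pairwise
  | otherwise = Rewrite "commutation" (arity a + arity b) (arity a * arity b)
                -- create and copy: each agent is duplicated past the other,
                -- and the copies are interconnected

main :: IO ()
main = do
  print (step Con Con)  -- annihilation: 0 agents created, 2 wires
  print (step Con Dup)  -- commutation:  4 agents created, 4 wires
  print (step Era Dup)  -- commutation:  2 agents created (two erasers), 0 wires
```

Since a rewrite only ever touches the two agents of its own active pair, separate active pairs can be reduced at the same time, which is exactly the sense in which the model is inherently parallel.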