Wednesday, June 13, 2018

Counterfactuals in Programming and Language

A counterfactual is something we discuss as if it existed when it does not. I can talk about the bald King of France and people will follow my flight of fancy, knowing there is no such person. In one sentence I have conjured up a world different from our own. Many talk about the many-worlds hypothesis and attribute to it more tangible meaning than I think it is due. Just because we can discuss an alternative reality does not mean that one can, or should, exist "out there." These counterfactuals hinge on a capacity of human minds to imagine that which is not.

If Speech Act Theory does one thing well, it explains how the very real abstract objects of social life are created. Property, money, and justice all depend upon human-created objects that would have no existence in our universe without people. These are human constructs, yet real and important to us even without any corporeal presence.

Programs too conjure things into existence. The objects instantiated by a program are potential or real energy captured as information on some substrate. Through multiple layers of representation we come to attribute meaning, or social reality, to these strings of bits. Is it accurate to look at these ontologically? To suggest that we are recognizing a new metaphysics? I am beginning to believe we should.

The human mind continually engages in imagining alternatives to the world we perceive. We can see improvements in our mind's eye. We wreak havoc on our enemies to satisfy an urge we must not act on. And we spin tales, fiction that generates many billions of dollars of revenue in the US by making more real something that is not.

When writers invite us into a fictional universe, they do so with conventions. A book is understood to be fictional before the first page is read. A movie, already understood to be at minimum a dramatic retelling of some historical truth, may tip into magical realism with one scene that defies the rules of physics. We willingly enter the consensual hallucination of the story and are entertained by it. But when Searle talks about fiction and its speech acts, it all becomes hopelessly muddled. Recognizing this alternative reality more directly makes the whole enterprise much easier as a theory.

While the multiverse may be entertained by some as a form of cosmology, it is undeniable that we can imagine a multiverse whether or not one exists. Why not admit this human capacity to inhabit multiple realities into the theory of language? By casting myself as an author and speaking as the author of this reality, I am relieved of the usual burden of truth. I am free to populate my universe with whatever rules and objects I like. I take on God-like powers there and can create heroes and demons at will. And these objects become real in the minds of readers, of the audience at a play or movie, and of the players of games. The consumers of these products willingly push the one reality we cannot deny into the background far enough to let this alternative reality consume their attention.

In this way, creating computer systems is very much like taking some small bit of physical space and spinning up an alternative universe for some purpose. Prosaically, we build formal models of the social realities we want. We create accounting systems to reflect our economic systems, land registries, and policy and procedure manuals, and these record the various roles individuals play in the myriad organizations we create. But these systems do more than make tangible the concepts we have; in important ways they become that reality. If the bank shows a balance, that is my money, and there is no point in arguing one way or the other at that point in time. There have been many interesting cases of a bank error enriching an individual where the money could not be clawed back: the accounting error created a reality, and only in the most obvious cases is acting on that windfall understood to be theft.

As our world becomes more determined by these cybernetic realities, it is important to sharpen our philosophical notions to match an encroaching reality in which cyber objects become as real, or even more real, than the older forms of social reality that preceded them. I suppose one hopes that it is still only human speech acts that create the cyber reality that transcends the older one. But like the Buttle/Tuttle mixup in the movie Brazil, that veil is thin. I want to dwell on this metaphysic for a time and come to my own understanding of the relationship between human language and the sociotechnical systems we create.

Saturday, June 9, 2018

The Ontology of a Java Program

To someone who cannot let go of the physical world, ontology is an odd subject. It is the philosophical study within the branch of metaphysics that deals with the nature of being. Or as Wikipedia says, "Ontology is the philosophical study of the nature of being, becoming, existence, or reality, as well as the basic categories of being and their relations." But for my purposes I will go no further than to talk about objects and their existence.

For anyone immersed in the object-oriented school of computer programming, "object" is a weighted term. And it is exactly that overtone I wish to look at. But let me take one step back for a moment and observe that until one has successfully strung together a file of symbols that will pass a Java compiler, there is not yet a software module. The moment the compiler first accepts that text file and generates a bytecode file, a new module is brought into existence in that file namespace. It has been summoned into existence.

The object-oriented school of thought looks at a software module as an object that possesses attributes and behaviors. The bytecode file can be copied, moved, executed, and deleted (destroyed). Yet does the bytecode file itself have any behaviors? Can a purely descriptive object be said to exhibit behavior? I am going to say no. But of course the purpose of a bytecode file is not merely to exist like some artifact lost in the desert of a file system. It is the essence that imparts a special magic to any machine with a Java Virtual Machine (JVM) running on it: the JVM takes that description and uses it to create objects within the memory of that computer. And the original intent of the creator was that this object in the machine's memory take over the capabilities of the machine in a virus-like way and bend it to the hidden will of the creator. It transmits the design of a virtual machine to this physical machine and then animates it.
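A minimal sketch of this summoning in Java, using the reflection API (the class name Greeter is my own illustration, not from the post):

    // Summoning.java -- the bytecode on disk is pure description; only
    // when a running JVM loads it does an object exist in memory.
    public class Summoning {
        public static void main(String[] args) throws Exception {
            // Load the compiled description. Greeter.class must already
            // exist on the classpath (an assumed, illustrative class).
            Class<?> description = Class.forName("Greeter");

            // Instantiate: the description now occupies the machine's
            // memory as a live object, bent to its creator's intent.
            Object living = description.getDeclaredConstructor().newInstance();
            System.out.println(living.getClass().getName());
        }
    }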

 Computer science is no more about computers than astronomy is about telescopes. (E. W. Dijkstra)

Ontology is one of the prime areas of metaphysics. How did I end up reading about metaphysics when I intended to understand what makes programs readable? I am here because, while we blithely talk about instantiating objects in a Java program, I am taking that talk more literally and finding the language of ontology, and later the issues of sense and reference, applicable to these objects. It cannot be mere coincidence that the two fields overlap in this way.

I am not alone, of course. There is a post which directly asks the same question and gives a rather good answer. I found it in June 2018.
http://www.mathema.com/philosophy/metafisica/is-metaphysics-relevant-to-computer-science/

But let me race on to the thought that sparked this entry, the famous sentence in philosophy about the bald King of France. Is it true or false?

In Java terms, we have two predicates. One would answer the question of whether x is the King of France. The other would answer the question of whether x is bald. The first predicate could never be satisfied, since there is no such person. This gives a null referent to the second predicate. And a null referent violates the precondition of any predicate, yielding an indeterminate response rather than true or false. So in programming terms we see exactly the philosophical notion of the presupposed referent.
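A minimal sketch of the two predicates in Java (the Person record and method names are my own illustration), using Optional to stand for the possibly missing referent:

    import java.util.Optional;
    import java.util.function.Predicate;

    public class KingOfFrance {
        record Person(String name, boolean bald) {}

        // There is no present King of France, so the referent is absent.
        static Optional<Person> presentKingOfFrance() {
            return Optional.empty();
        }

        static final Predicate<Person> IS_BALD = Person::bald;

        public static void main(String[] args) {
            // The second predicate never runs: its precondition, an
            // existing referent, fails. The result is neither true nor
            // false but empty -- the indeterminate response.
            Optional<Boolean> bald = presentKingOfFrance().map(IS_BALD::test);
            System.out.println(bald); // Optional.empty
        }
    }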

While listening to lectures on the philosophy of language, and specifically the discussions of fiction, I came to what I think may be a metaphysics different from Searle's. I need to read more to see whether it is a distinct metaphysics and, if so, learn to describe it.


References:
Ontologies: Principles, Methods and Applications, Mike Uschold and Michael Gruninger, 1996, The University of Edinburgh
http://www.aiai.ed.ac.uk/publications/documents/1996/96-ker-intro-ontologies.pdf 

https://www.springer.com/cda/content/document/cda_downloaddocument/9780387370194-c1.pdf?SGWID=0-0-45-495101-p173670217

https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-871-knowledge-based-applications-systems-spring-2005/lecture-notes/lect22_ontolog.pdf

https://plato.stanford.edu/entries/computer-science/



Friday, June 1, 2018

Charles Sanders Peirce - philosopher, logician, mathematician

For anyone who has studied formal logic, Peirce (pronounced like "purse") is remembered for his binary operator, the Peirce arrow, which, as it turns out, is functionally complete: every other logical operator can be derived from it. This mathematical fact means an entire computer can be constructed using only a single kind of logic gate.
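A small Java sketch of that fact, with a boolean method standing in for the gate: NOT, OR, and AND each built from NOR alone.

    public class PeirceArrow {
        // The Peirce arrow: true only when neither input is true.
        static boolean nor(boolean a, boolean b) { return !(a || b); }

        static boolean not(boolean a)            { return nor(a, a); }
        static boolean or(boolean a, boolean b)  { return nor(nor(a, b), nor(a, b)); }
        static boolean and(boolean a, boolean b) { return nor(nor(a, a), nor(b, b)); }

        public static void main(String[] args) {
            boolean[] vals = { false, true };
            for (boolean a : vals)
                for (boolean b : vals)
                    System.out.printf("a=%b b=%b  and=%b or=%b not(a)=%b%n",
                            a, b, and(a, b), or(a, b), not(a));
        }
    }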

Charles Sanders Peirce (/pɜːrs/, like "purse"; 10 September 1839 – 19 April 1914) was an American philosopher, logician, mathematician, and scientist who is sometimes known as "the father of pragmatism". He was educated as a chemist and employed as a scientist for 30 years. Today he is appreciated largely for his contributions to logic, mathematics, philosophy, scientific methodology, and semiotics, and for his founding of pragmatism.


Peirce's most important work in pure mathematics was in logical and foundational areas. He also worked on linear algebra, matrices, various geometries, topology and Listing numbers, Bell numbers, graphs, the four-color problem, and the nature of continuity.
He worked on applied mathematics in economics, engineering, and map projections (such as the Peirce quincuncial projection), and was especially active in probability and statistics.
Discoveries
Peirce made a number of striking discoveries in formal logic and foundational mathematics, nearly all of which came to be appreciated only long after he died:
In 1860 he suggested a cardinal arithmetic for infinite numbers, years before any work by Georg Cantor (who completed his dissertation in 1867) and without access to Bernard Bolzano's 1851 (posthumous) Paradoxien des Unendlichen.
In 1880–81 he showed how Boolean algebra could be done via a single sufficient binary operation, repeatedly applied (logical NOR), anticipating Henry M. Sheffer by 33 years. (See also De Morgan's laws.) This operation is written with the Peirce arrow, the symbol for "(neither) ... nor ...", also called the Quine dagger.






https://en.wikipedia.org/wiki/Charles_Sanders_Peirce

W Ross Ashby

One of the new names that popped up in my research is W Ross Ashby.

W. Ross Ashby (6 September 1903, London – 15 November 1972) was an English psychiatrist and a pioneer in cybernetics, the study of the science of communications and automatic control systems in both machines and living things. His first name was not used: he was known as Ross Ashby. Despite being widely influential within cybernetics, systems theory and, more recently, complex systems, Ashby is not as well known as many of the notable scientists his work influenced, including Herbert A. Simon, Norbert Wiener, Ludwig von Bertalanffy, Stafford Beer, Stanley Milgram, Kevin Warwick, and Stuart Kauffman.


https://en.wikipedia.org/wiki/W._Ross_Ashby



It has been a few years since I used this blog. I had started posting things to Facebook, but I am going into a deep dive on a topic and want to gather some posts on it here. I am currently interested in establishing a foundation for how language is used when dealing with computers and how it is used when computers interact. I am going to make a series of posts on influential thinkers as a form of annotated bibliography, although the posts are about the people and not their works per se. I may go back and fill in the appropriate writings that I need, but for now brief bios will meet my needs.

Saturday, May 4, 2013

An Outline for a New GE Course in Computer Science

I have just completed teaching a course at a community college for the second time. Before all the volatile thoughts I had are completely lost, I want to sketch out the unique ideas I have about how a course like this should be taught. What I find generally lacking is a narrative thread that connects the disparate arcana that must be covered into a motivating course.

I. Introduction
Is this course really about "computer science"? If so, what is "information technology"? "Management information science"? And why is it focused on "computation"? What I teach is really information technology and the central role of information for individuals, groups, organizations, and society. While the motivation for most innovations through history has been computation or communications, we are now looking toward an information convergence where all forms of information will be handled with the same technology. To be an educated and productive member of our society, you must gain certain basic understandings of this technology, how it is used, and the various ways you can approach it in your academic studies.

Show some entertaining Rube Goldberg machines and discuss causality and the design of mechanisms to achieve some end. Emphasize that computers are machines. While details may be daunting, there is nothing going on that a motivated person cannot understand.

I like to present computers in a continuum of mechanized development. The tie between Jacquard and information processing is well known; I think the tie between continued development and the industrial age is not fully explored. This is important simply because students who have mechanical aptitude will find the mechanistic aspect of computers easy to grasp, and dwelling on the more mechanical nature of the machines avoids alienating students who lack this intuitive grasp of machines.



II. History
What is information? Is there information without humans? Does it serve human needs? If so, which ones? Who historically funded the innovations in information technology and communications? Commerce, the military, government, religion; eventually we see entertainment. Taking commerce as a starting point, you must understand the basics of numbers and numeric representation. This motivates an algebraic understanding of our positional number system and its variants in different bases. Depending on the depth possible in the course, negative numbers and real numbers can be covered in binary notation. Even more interesting is the digitization of real numbers and the inherently limited size of machine memory: no physical memory is unlimited, and each therefore has inherent limits on the size of number it can represent. An interesting philosophical point is the innovation of zero and the difficulty of representing nothing. This comes up again with null sets, representing a space, and other places in the curriculum.
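A small Java sketch of positional representation and of the hard limit a fixed-width memory imposes:

    public class Representation {
        public static void main(String[] args) {
            int n = 42;
            // The same value written in several positional bases.
            System.out.println(Integer.toString(n, 2));   // 101010
            System.out.println(Integer.toString(n, 8));   // 52
            System.out.println(Integer.toString(n, 16));  // 2a

            // Memory is finite: a 32-bit int silently wraps past its maximum.
            System.out.println(Integer.MAX_VALUE);        // 2147483647
            System.out.println(Integer.MAX_VALUE + 1);    // -2147483648
        }
    }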

Cover the basic arithmetic algorithms for converting one base to another versus the conceptual understanding of those representations. The algorithm for long division. The concept of an algorithm as a series of steps executed to achieve some result. Summarize the rules of binary addition in a table, as below.
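For reference, the rules of single-bit binary addition:

    a  b | sum  carry
    0  0 |  0     0
    0  1 |  1     0
    1  0 |  1     0
    1  1 |  0     1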

Introduce the concept of mathematical series to calculate irrational numbers. Present enough to convince students that this work becomes tedious. Conclude with the method of calculating polynomial expressions using a series of differences, as sketched below.
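A sketch of the method of differences in Java, for p(x) = x^2 (my choice of polynomial): after seeding the initial differences, every new value needs only additions.

    public class Differences {
        public static void main(String[] args) {
            // For p(x) = x^2: p(0) = 0, first difference p(1) - p(0) = 1,
            // and a constant second difference of 2.
            long value = 0, diff1 = 1;
            final long diff2 = 2;
            for (int x = 0; x <= 10; x++) {
                System.out.println("p(" + x + ") = " + value);
                value += diff1;  // next value, by addition alone
                diff1 += diff2;  // next first difference
            }
        }
    }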

Jump to text and some of its innovations. The transition from oral society to literate society. The first representation of spoken words through writing. The Code of Hammurabi. Pictographic representation of thoughts. Development of the phonetic alphabet. The limited growth of literacy; look at literacy rates of 18th-century France for grounding.

Use the Greeks to discuss discourse and rhetoric. The codification of the laws of argumentation and thinking. The introduction of deductive logic. Cover the basic forms of AND, OR (both inclusive and exclusive), and NOT. Introduce truth tables. Nesting of expressions.

I like to cover the reality of the Medieval period in Europe and the growth of the Islamic world, since major innovations occurred outside Europe and were introduced in the Renaissance: Arabic numerals, zero, algebra, and double-entry bookkeeping. It was the growth of trade that made this bookkeeping necessary and that also provided the explosive growth that led to the middle class, the growth of literacy, the Reformation, and the importance of the moveable-type press. Lower case, typesetting, italics, serifs: these are all needed to understand web technologies and word processors.

Cover the concept of the variable in algebra if needed: the unknown which becomes a container for a value to be determined. If possible and necessary, cover rules of algebra like the commutative, associative, and distributive laws.

Discuss early attempts at mechanical computation, like the abacus, used for commerce and still popular, and the Pascal calculator. Show how innovations in timekeeping and metallurgy made this possible. Discuss the decade counter (carry) and the odometer. If desired, discuss how a negative number can be represented using an odometer (-1 = 9999) and tie this back to the concept of two's complement, as below.
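A quick Java check of the odometer analogy (the digit count is my choice for illustration):

    public class Odometer {
        public static void main(String[] args) {
            // In two's complement, -1 is "all nines" in binary: all ones.
            System.out.println(Integer.toBinaryString(-1));
            // 11111111111111111111111111111111

            // Rolling a 4-digit decimal odometer one step back from 0000
            // gives 9999, the ten's-complement reading of -1.
            int modulus = 10_000;
            System.out.printf("%04d%n", Math.floorMod(0 - 1, modulus)); // 9999
        }
    }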

Communication innovations: lanterns in towers, flags. All depended on some system of alphabetic or message encoding. Discuss the difference and give examples; this motivates the ASCII code later on. Military applications and encoding for secrecy. Discuss bandwidth, parallel versus serial transmission, representing null messages, secrecy, and ways of encrypting messages (a toy example below).
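A toy encryption example in Java, using a Caesar shift (my choice of cipher for illustration):

    public class Caesar {
        // Shift each capital letter k places around the alphabet.
        static String shift(String msg, int k) {
            StringBuilder out = new StringBuilder();
            for (char c : msg.toCharArray()) {
                if (Character.isUpperCase(c))
                    out.append((char) ('A' + Math.floorMod(c - 'A' + k, 26)));
                else
                    out.append(c);  // leave spaces and punctuation alone
            }
            return out.toString();
        }

        public static void main(String[] args) {
            String secret = shift("ATTACK AT DAWN", 3);
            System.out.println(secret);             // DWWDFN DW GDZQ
            System.out.println(shift(secret, -3));  // ATTACK AT DAWN
        }
    }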

Introduce the Enlightenment and Industrial Revolution. Discuss basic electromagnetism if needed. Discuss the need for communication to achieve coordination and the parallel growth of railroads and communication. Morse code; the mechanics of receiving and forwarding discrete messages throughout a telegraph network. Do this in a way that motivates the packet-switched network and introduces basic concepts like point-to-point, broadcast, and routing. The growth in the publishing of books of transcendental equations for engineers and scientists.

Cover more logic: Boolean algebra. Discuss how the rules of Boolean algebra can be realized using simple mechanical devices. Introduce the notion of binary addition as identical to simple Boolean expressions.

Talk about the tedium of weaving, especially complex Jacquard fabrics. Describe the method of weaving them prior to the mechanical loom. Discuss how punched cards were used to control the needles and could be looped together to repeat the pattern. This allowed one loom to create many different patterns by changing the cards; the master weaver could then create card decks for the various patterns.

Discuss Babbage's first creation, which mechanized the evaluation of polynomial expressions with a geared machine; input was via dials and levers. Discuss how he used the Jacquard innovation to make machine setup easier. Ada Lovelace describing cards that would solve a problem. Hollerith, the 1890 census, and the success of punched cards for holding information.

Introduce the typewriter as an innovation on the moveable-type press. Show how the combination of telegraph and typewriter led to the teletype. Discuss the difficulty of having national codes when communication goes global. Cover the Baudot code if time allows, then ASCII. Multiplexing to compress and decompress signals.

The telephone. Analog versus digital; waveforms. How to digitize? PCM. How do you represent digital signals using analog? The modem. Circuit switching can be covered if desired. Emphasize bandwidth. If desired, cover the carrier signal and its encoding, and explain this as the mechanism behind most DSL.

There is a great deal of history on non-contemporary computer architecture. I skip it and jump ahead to the concept of gates. I like to emphasize that a gate can be implemented with very simple switches. Progress through the innovations that sped up switching: relays, tubes, transistors.

Revisit logic, introduce NAND and NOR, and then describe the half-adder circuit (sketched below). Make the tie between logic and arithmetic explicit. Motivate the need for speed with military uses: munitions that could fire beyond the horizon, making visual correction impossible; the element of surprise that is lost if the first shell does not land on target; the widespread use of "computers" (people) to do many complex algebraic computations. This leads to the rush to create a machine to perform the theoretical calculations needed to create the A-bomb and later the H-bomb.
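A half-adder sketched in Java, making the logic-to-arithmetic tie concrete: the sum bit is XOR, the carry bit is AND.

    public class HalfAdder {
        // Returns { sum, carry } for single-bit addition.
        static boolean[] halfAdd(boolean a, boolean b) {
            return new boolean[] { a ^ b, a && b };
        }

        public static void main(String[] args) {
            boolean[] vals = { false, true };
            for (boolean a : vals)
                for (boolean b : vals) {
                    boolean[] r = halfAdd(a, b);
                    System.out.printf("%d + %d = sum %d, carry %d%n",
                            a ? 1 : 0, b ? 1 : 0, r[0] ? 1 : 0, r[1] ? 1 : 0);
                }
        }
    }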

The red scare and the race for space spurred new innovations like the integrated circuit. Moore's law.




privacy (anonymity) versus secrecy

**********************************

Addendum. I am reconsidering this approach in that more math must be introduced earlier. While I still believe it is better to introduce language and its various representations before math, numeric representation should still appear almost immediately. I think the leap from addition and subtraction to multiplication and division would be good, particularly at the community college level, since long division offers an introductory algorithm that is not beyond the students' mathematical maturity. This ties in nicely with the transition from clocks to calculators as early builders grappled with these operations using gears.

Structuring Software Engineering Case Studies to Cover Multiple Perspectives

This post is my riff on a paper titled Structuring Software Engineering Case Studies to Cover Multiple Perspectives by Emil Boerjesson and Robert Feldt of Chalmers University. The paper offers suggestions on how multiple perspectives can be ensured by using a six-step process. In their case study, they wanted to ensure they looked at the V&V process from four different perspectives, Business, Architecture, Process, and Organization (BAPO), as well as from three temporal perspectives: past, current, and future (PCF).

The paper does not make any deep contribution to the case-study approach to software engineering research; however, it provides an accessible entry point for understanding how case studies can be used in software engineering research.

They characterize a case study as:

  • an observational research method,
  • a way to investigate a phenomenon in its context,
  • applicable when there is no clear distinction between the phenomenon and its context,
  • covered by guidelines recently published by Runeson and Hoest [1].

Their six-step methodology is:
  1. Get knowledge about the domain
  2. Develop focus questions/areas
  3. Choice of detailed research methods
  4. Data collection
  5. Data analysis and alignment
  6. Valuation discussion














[1] P. Runeson and M. Hoest, "Guidelines for conducting and reporting case study research in software engineering," Empirical Software Engineering, vol. 14, no. 2, pp. 131-164, 2009.