Intelligent Process Flow Takes Care of Itself

Triangle of Knowledge

Product development describes the path an object takes from idea to saleable result. This phase encompasses diverse processes, depending on how complex it is to transform an idea into a product that is ready for production. The people who participate in these processes occupy different positions within the company.

Man as agent
The relevant know-how is distributed among employees in different departments who work with different applications. These employees have quite different perspectives on the end product, which becomes evident in the analysis and design phases as well as in development and in the production of internal and external documentation.

In such processes, each person is called upon to communicate what they know. The exchange often takes place via documents:

* For knowledge carriers, the challenge is to capture their knowledge (of the product, of specific processes, or of project requirements) in the form of suitable data in the documentation.
* For knowledge seekers, the task is to extract the relevant information from the documentation.
In the philosophical sense
The word “ontology” has its origins as a synonym for “metaphysics” or “first philosophy” as defined by Aristotle. In this sense, ontology is fundamentally concerned with the study of existence and being. Ontology is thus the part of philosophy that seeks to determine which things and phenomena in the world exist in their own right and which exist only in people’s thoughts or under specific conditions. A typical ontological question is: when a tree falls in the forest and nobody is there to hear it, does it make a sound at all?

In the view of Information Technology
The 20th century was characterized by linguistic movements and by advances in science and technology based on the methods and insights of formal logic. The meaning of the term ontology has changed continually and reflects a modified understanding of the interplay between real objects in the world, symbols, and thoughts. The fundamental discussion of these connections is not new; it can be traced back to Plato and Aristotle. As Aristotle observes in the Peri hermeneias [1]: “Spoken words are the symbols of mental experience and written words are the symbols of spoken words. Just as all men have not the same writing, so all men have not the same speech sounds, but the mental experiences, which these directly symbolize, are the same for all, as also are those things of which our experiences are the images.”

The semiotic triangle as ontological basis
Aristotle’s approach was developed further several times in the history of philosophy and was first presented schematically by Ogden and Richards in 1923 as the semiotic triangle.

Semiotic Triangle

The semiotic triangle exists in several variants. The relationships and characteristics of its vertices were discussed extensively by Charles Sanders Peirce, for instance. The essential point is that, according to Peirce, a word neither symbolizes a unique object in the world nor can it be assigned to a single concept. For example, the word “Jaguar” can bring to mind either “animal” or “car”, and either object can be referred to by the word.
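To make this ambiguity concrete, here is a minimal sketch in Python; the mapping and all names are invented for illustration and are not taken from the article:

```python
# Minimal sketch: a single symbol can stand for more than one concept.
# The mapping and all names are invented for illustration.
symbol_to_concepts = {
    "Jaguar": [
        {"concept": "big cat", "kind_of_referent": "animal"},
        {"concept": "car brand", "kind_of_referent": "car"},
    ],
}

def possible_readings(symbol: str) -> list[str]:
    """Return every concept the symbol could stand for."""
    return [entry["concept"] for entry in symbol_to_concepts.get(symbol, [])]

print(possible_readings("Jaguar"))  # ['big cat', 'car brand'] -> the word alone is ambiguous
```

The point of the sketch is simply that the symbol on its own does not determine the concept; additional context or agreement is needed to resolve it.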

If we assume that each agent has access to a kind of intra-personal semantic network of concepts that is drawn upon when knowledge is converted into data, then the data exchange between two agents can be represented as in Diagram 3. The conversion into data follows the semantic data model of Sieber/Kammerer [2], which reflects Peirce’s understanding of symbols.

Communicating knowledge – schematic diagram

The basic idea of an ontology is that a specific group of agents agrees on the concepts that are important in their application area and on how these concepts are linked with one another. In information technology, the term ontology thus denotes a formally defined system of terms (concepts) and their relationships to one another.
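As a rough illustration only (the concept and relation names below are assumed, not taken from the article), such a formally defined system can be written down as a small set of concepts and typed relations between them:

```python
# Illustrative sketch of an ontology as concepts plus typed relations (triples).
# All concept and relation names are assumptions made for this example.
CONCEPTS = {"Product", "Specification", "Requirement", "TestReport"}

RELATIONS = {
    ("Specification", "describes", "Product"),
    ("Specification", "contains", "Requirement"),
    ("TestReport", "checks", "Requirement"),
}

def related_to(concept: str, relation: str) -> set[str]:
    """Return all concepts linked to `concept` via `relation`."""
    return {obj for subj, rel, obj in RELATIONS if subj == concept and rel == relation}

print(related_to("Specification", "contains"))  # {'Requirement'}
```

Once a group of agents has agreed on such a vocabulary, every document and data exchange can refer to the same explicitly defined concepts and relations.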

Preventing misinterpretation
An ontology therefore minimizes the possibility of misinterpreting the words used and their relationships to one another. Ontology thus provides a conceptual basis for communicating knowledge (compare [3]).

What can be called an ontology ranges from

* very simple glossaries, through
* taxonomies, which contain concepts and their relationships, to
* very complex ontologies that contain rules in addition to concepts and their relationships.

The latter allow logical inferences to be drawn over a body of data described with the ontology. One such complex application can, for instance, be used to validate a product against its technical specifications and to determine early on whether the product really does what it is supposed to do. Its successful application in projects is illustrated below by the example of an implemented ontology-based validation of control units from the automotive industry.

Machine-supported validation of control units
The complexity and the number of control units in a vehicle are steadily increasing. High quality requirements and ever-shorter development cycles call for intensive testing of the control units during the development process. Today, numerous control units are produced by suppliers and delivered to car makers, who must test whether the delivered units meet the requirements and the product specifications. Different scenarios are simulated with the control units on special test benches and the results are recorded. The measured data must then be checked for correctness using suitable methods. Given the enormous quantity of measured data, the amount of manual work required is very high.

The basis for further automating this test process is a formal, machine-processable description of the required behaviour of the individual control units. With such information, the results obtained for a control unit on the test bench can be validated automatically, and deviations from the required result, and thus errors, can be identified early. Ontologies and rules form a promising technological basis for implementing this application.
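A rough sketch of this idea in Python follows; the signal names, limits, and the rule format are hypothetical, since the article does not specify them:

```python
# Hypothetical sketch: validate measured test-bench data against a
# machine-readable description of the required result.
# Signal names, limits, and the rule format are assumed for illustration.
REQUIRED = {
    "idle_rpm":       {"min": 700.0, "max": 900.0},
    "coolant_temp_c": {"min": -40.0, "max": 120.0},
}

def validate(measurements: dict[str, float]) -> list[str]:
    """Return a list of deviations from the required result."""
    deviations = []
    for signal, limits in REQUIRED.items():
        value = measurements.get(signal)
        if value is None:
            deviations.append(f"{signal}: no measurement recorded")
        elif not (limits["min"] <= value <= limits["max"]):
            deviations.append(f"{signal}: {value} outside [{limits['min']}, {limits['max']}]")
    return deviations

print(validate({"idle_rpm": 1150.0, "coolant_temp_c": 85.0}))
# ['idle_rpm: 1150.0 outside [700.0, 900.0]']
```

In practice the required results would come from the control unit specification rather than being hard-coded, but the principle is the same: once the specification is machine-readable, every recorded measurement can be checked automatically.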

Representation of know-how
Using an ontology, the terms of an application area are formulated and set in relation to one another. Diagram 4 shows a schematic excerpt from such an ontology. It defines, for instance, that a vehicle is composed of several parts. The parts can be classified into different classes such as motor, electronics and motor control. In the same way, it is defined that a motor control unit supports only certain motors. Terminological variants such as synonyms (vehicle or car) can likewise be recorded in the ontology.

ontology of a vehicle

Finally, rules use the terms already defined in the ontology to create additional, more complex relations. In this way, the complete functionality of a control unit can be described, including specific error configurations. As Diagram 4 shows, it can then be checked whether the correct motor control unit is installed in the vehicle.
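A compact sketch of this check (the identifiers and the “supports” relation are modelled freely here, not taken from Diagram 4 verbatim):

```python
# Free modelling of the ontology sketched in Diagram 4; all names are illustrative.
# Facts: which motors each motor control unit supports.
SUPPORTS = {
    "MCU-A": {"Motor-1.6", "Motor-2.0"},
    "MCU-B": {"Motor-3.0"},
}

# A concrete vehicle described with the ontology's terms.
vehicle = {
    "motor": "Motor-2.0",
    "motor_control_unit": "MCU-B",
    "synonyms": {"car", "automobile"},  # terminological variants recorded alongside
}

def correct_mcu_installed(v: dict) -> bool:
    """Rule: the installed motor control unit must support the installed motor."""
    return v["motor"] in SUPPORTS.get(v["motor_control_unit"], set())

print(correct_mcu_installed(vehicle))  # False -> MCU-B does not support Motor-2.0
```

The rule itself only uses terms that the ontology already defines (vehicle, motor, motor control unit, supports), which is exactly what makes such checks machine-processable.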

Integrated application in product development
The central task of the presented approach to test data analysis is the transformation of know-how from technical documentation, for instance control unit specifications, into ontologies and rules. Today this step must be carried out manually and is therefore error-prone and costly. Through the integrated use of ontologies in product development, this preliminary work can be carried out in a preceding process step. Moreover, all the participating processes benefit from such a formal model:

Ontologies in product development

* While recording specifications, ontologies and rules can be used to create a uniform knowledge base for the product to be developed.
* Later process steps can use the available know-how and extend it.
* During development, the ontology can be used to communicate error-free with external developers.
* During the test phase, the formal model can be used to validate the products.

Bridging knowledge management
Overall, the use of ontologies can produce an integrated and uniform documentation of a product and of its development. Such a knowledge base need not be limited to technical documents; other data relevant to the creation of the product, for instance regulations, can also be part of it. At the heart of ontology-based documents and their machine-processability are the people involved in the process steps, who must be able to translate their knowledge into suitably structured data. Ontologies can thus act as a bridge between knowledge management, documentation and data integration at all levels.

Links and Literature
[1] Aristotle (2007): On Interpretation.
http://etext.library.adelaide.edu.au/a/aristotle/interpretation/
[2] Sieber, T.; Kammerer, M. (2006): Sind Metadaten bessere Daten? In: technische Kommunikation. H. 5, S. 56–58.
[3] Sieber, T. (2005): Ontologies and their representation. Doktoranduszok Fóruma.

Source: http://www.tekom.de/index_neu.jsp?url=/servlet/ControllerGUI?action=voll&id=2268


Philosophy of information

The philosophy of information (PI) is the area of research that studies conceptual issues arising at the intersection of computer science, information technology, and philosophy.
It includes:
1. the critical investigation of the conceptual nature and basic principles of information, including its dynamics, utilisation and sciences
2. the elaboration and application of information-theoretic and computational methodologies to philosophical problems.

Semiotics

Semiotics, also called semiotic studies or (in the Saussurean tradition) semiology, is the study of signs and sign processes (semiosis), indication, designation, likeness, analogy, metaphor, symbolism, signification, and communication. Semiotics is closely related to the field of linguistics, which, for its part, studies the structure and meaning of language more specifically. Semiotics is often divided into three branches:
* Semantics: the relation between signs and the things to which they refer (their denotata, or meaning)
* Syntactics: relations among signs in formal structures
* Pragmatics: the relation between signs and the effects they have on the people who use them
Semiotics is frequently seen as having important anthropological dimensions; for example, Umberto Eco proposes that every cultural phenomenon can be studied as communication.[citation needed] However, some semioticians focus on the logical dimensions of the science. They examine areas belonging also to the natural sciences – such as how organisms make predictions about, and adapt to, their semiotic niche in the world (see semiosis). In general, semiotic theories take signs or sign systems as their object of study: the communication of information in living organisms is covered in biosemiotics or zoosemiosis.
Syntactics is the branch of semiotics that deals with the formal properties of signs and symbols.[1] More precisely, syntactics deals with the “rules that govern how words are combined to form phrases and sentences.”[2] Charles Morris adds that semantics deals with the relation of signs to their designata and the objects which they may or do denote; and, pragmatics deals with the biotic aspects of semiosis, that is, with all the psychological, biological, and sociological phenomena which occur in the functioning of signs.

Cybernetics

Cybernetics is the interdisciplinary study of the structure of regulatory systems. Cybernetics is closely related to control theory and systems theory, at least in its first-order form. (Second-order cybernetics has crucial methodological and epistemological implications that are fundamental to the field as a whole.) Both in its origins and in its evolution in the second half of the 20th century, cybernetics is equally applicable to physical and social (that is, language-based) systems.

Code
A code is a rule for converting a piece of information (for example, a letter, word, phrase, or gesture) into another form or representation (one sign into another sign), not necessarily of the same type.
In communications and information processing, encoding is the process by which information from a source is converted into symbols to be communicated. Decoding is the reverse process, converting these code symbols back into information understandable by a receiver.
One reason for coding is to enable communication in places where ordinary spoken or written language is difficult or impossible. An example is semaphore, where the configuration of flags held by a signaller or the arms of a semaphore tower encodes parts of the message, typically individual letters and numbers. Another person standing a great distance away can interpret the flags and reproduce the words sent.
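A small self-contained illustration of encoding and decoding; the code table below is invented for the example and does not reproduce real semaphore positions:

```python
# Toy encoder/decoder: a made-up code table mapping letters to flag positions.
CODE_TABLE = {"A": "up-left", "B": "up-right", "C": "left-right"}
REVERSE = {signal: letter for letter, signal in CODE_TABLE.items()}

def encode(message: str) -> list[str]:
    """Convert each letter of the source message into its code symbol."""
    return [CODE_TABLE[letter] for letter in message]

def decode(signals: list[str]) -> str:
    """Convert received code symbols back into the original letters."""
    return "".join(REVERSE[signal] for signal in signals)

sent = encode("CAB")
print(sent)          # ['left-right', 'up-left', 'up-right']
print(decode(sent))  # 'CAB'
```

Encoding and decoding are inverse operations: as long as sender and receiver share the same code table, the original information can be recovered from the transmitted symbols.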
Floating signifier
Floating signifier, also called empty signifier, is a term used in semiotics for a signifier without a referent, such as a word that does not point to any actual object or agreed-upon meaning.

Hermeneutics

In religious studies and social philosophy, hermeneutics (English pronunciation: /hɜrməˈn(j)uːtɨks/) is the study of the theory and practice of interpretation. Traditional hermeneutics—which includes Biblical hermeneutics—refers to the study of the interpretation of written texts, especially texts in the areas of literature, religion and law. Contemporary, or modern, hermeneutics encompasses not only issues involving the written text, but everything in the interpretative process. This includes verbal and nonverbal forms of communication as well as prior aspects that affect communication, such as presuppositions, preunderstandings, the meaning and philosophy of language, and semiotics.[1] Philosophical hermeneutics refers primarily to Hans-Georg Gadamer’s theory of knowledge as developed in Truth and Method, and sometimes to Paul Ricoeur.[2] Hermeneutic consistency refers to analysis of texts for coherent explanation. A hermeneutic (singular) refers to one particular method or strand of interpretation. See also double hermeneutic.
The terms exegesis and hermeneutics are sometimes used interchangeably because exegesis focuses primarily on the written text. Hermeneutics, however, is a more broadly defined discipline of interpretation theory that includes the entire framework of the interpretive process and encompasses all forms of communication and expression: written, verbal, artistic, geo-political, physiological, sociological, etc.
Information theory
Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography generally, networks other than communication networks — as in neurobiology,[1] the evolution[2] and function[3] of molecular codes, model selection[4] in ecology, thermal physics,[5] quantum computing, plagiarism detection[6] and other forms of data analysis.[7]
A key measure of information is known as entropy, which is usually expressed by the average number of bits needed for storage or communication. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome from a roll of a die (six equally likely outcomes).
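The coin and die figures can be checked directly with Shannon’s entropy formula H = -Σ p_i log2(p_i); a short Python computation:

```python
import math

def entropy_bits(probabilities: list[float]) -> float:
    """Shannon entropy in bits: H = -sum(p * log2(p)) over all outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

fair_coin = [1 / 2] * 2  # two equally likely outcomes
fair_die = [1 / 6] * 6   # six equally likely outcomes

print(round(entropy_bits(fair_coin), 3))  # 1.0 bit
print(round(entropy_bits(fair_die), 3))   # 2.585 bits -> more uncertainty, more information
```

The die roll carries about 2.585 bits because log2(6) ≈ 2.585, compared with exactly 1 bit for the coin flip.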
Applications of fundamental topics of information theory include lossless data compression (e.g. ZIP files), lossy data compression (e.g. MP3s and JPGs), and channel coding (e.g. for DSL lines). The field is at the intersection of mathematics, statistics, computer science, physics, neurobiology, and electrical engineering. Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones, the development of the Internet, the study of linguistics and of human perception, the understanding of black holes, and numerous other fields[citation needed]. Important sub-fields of information theory are source coding, channel coding, algorithmic complexity theory, algorithmic information theory, information-theoretic security, and measures of information.
International Association for Semiotic Studies
International Association for Semiotic Studies (Association Internationale de Sémiotique, IASS-AIS) is the major world organisation of semioticians, established in 1969.
The founding members of the Association include Algirdas Julien Greimas, Roman Jakobson, Julia Kristeva, Emile Benveniste, André Martinet, Roland Barthes, Umberto Eco, Thomas A. Sebeok, and Juri Lotman.
The official journal of the Association is Semiotica, published by Mouton de Gruyter. The working languages of the association are English and French.
The Executive Committee of the IASS (le Comité Directeur de l’AIS) consists of the representatives from semiotic societies of member countries (two from each).


Logic of information

The logic of information, or the logical theory of information, considers the information content of logical signs and expressions along the lines initially developed by Charles Sanders Peirce. In this line of work, the concept of information serves to integrate the aspects of signs and expressions that are separately covered, on the one hand, by the concepts of denotation and extension, and on the other hand, by the concepts of connotation and comprehension.
Peirce began to develop these ideas in his lectures “On the Logic of Science” at Harvard University (1865) and the Lowell Institute (1866).
Pragmatic theory of truth
Pragmatic theory of truth refers to those accounts, definitions, and theories of the concept truth that distinguish the philosophies of pragmatism and pragmaticism. The conception of truth in question varies along lines that reflect the influence of several thinkers, initially and notably, Charles Sanders Peirce, William James, and John Dewey, but a number of common features can be identified. The most characteristic features are (1) a reliance on the pragmatic maxim as a means of clarifying the meanings of difficult concepts, truth in particular, and (2) an emphasis on the fact that the product variously branded as belief, certainty, knowledge, or truth is the result of a process, namely, inquiry.

Pragmatic maxim

The pragmatic maxim, also known as the maxim of pragmatism or the maxim of pragmaticism, is a maxim of logic formulated by Charles Sanders Peirce. Serving as a normative recommendation or a regulative principle in the normative science of logic, its function is to guide the conduct of thought toward the achievement of its purpose, advising on an optimal way of “attaining clearness of apprehension”. Here is its original 1878 statement in English[1] when it was not yet named:
It appears, then, that the rule for attaining the third grade of clearness of apprehension is as follows: Consider what effects, that might conceivably have practical bearings, we conceive the object of our conception to have. Then, our conception of these effects is the whole of our conception of the object.
(Peirce on p. 293 of “How to Make Our Ideas Clear”, Popular Science Monthly, v. 12, pp. 286–302. Reprinted widely, including Collected Papers of Charles Sanders Peirce (CP) v. 5, paragraphs 388–410.)
Charles Sanders Peirce bibliography
This Charles Sanders Peirce bibliography consolidates numerous references to Charles Sanders Peirce’s writings, including letters, manuscripts, publications, and Nachlass. For an extensive chronological list of Peirce’s works (titled in English), see the Chronologische Übersicht (Chronological Overview) on the Schriften (Writings) page for Charles Sanders Peirce.