45 Cards in this Set

Humberto Maturana
Maturana's work extends to philosophy and cognitive science and even to family therapy. He was inspired early on by the work of the biologist Jakob von Uexküll.

Constructivist epistemology

Maturana and his student Francisco Varela were the first to define and employ the concept of autopoiesis. Aside from making important contributions to the field of evolution, Maturana is also a founder of constructivist epistemology or radical constructivism, an epistemology built upon empirical findings of neurobiology.

Maturana and Varela wrote in their Santiago Theory of Cognition: "Living systems are cognitive systems, and living as a process is a process of cognition. This statement is valid for all organisms, with or without a nervous system."

Biology of Knowledge

At the Instituto de Formación Matriztica Maturana has contributed extensively to the Biology of Knowledge and Biology of Loving. At this point, Maturana has formulated, with Ximena Davila, new perspectives about human life and continues to do so; psychology, the use of language, experience, and the general impulse to understand serve as explanatory bases for the depiction of human ways of life.
Jakob von Uexküll
Jakob Johann von Uexküll (September 8, 1864 - July 25, 1944) was a Baltic German biologist who had important achievements in the fields of muscular physiology, animal behaviour studies, and the cybernetics of life.

Works by scholars such as Kalevi Kull connect Uexküll's studies with some areas of philosophy such as phenomenology and hermeneutics. Jakob von Uexküll is also considered a pioneer of semiotic biology, or biosemiotics. However, despite his influence (on the work of philosophers Martin Heidegger, Maurice Merleau-Ponty, and Gilles Deleuze and Félix Guattari (in their A Thousand Plateaus), for example), he is still not widely known, and his books are mostly out of print.
Roger Barker
Roger Garlock Barker (1903, Macksburg, Iowa - 1990) was a social scientist, a founder of environmental psychology and a leading figure in the field for decades, perhaps best known for his development of the concept of behavior settings.

He helped set up the Midwest Psychological Field Station in the town of Oskaloosa, Kansas, a town of fewer than 2,000 people. Barker's team gathered empirical data in Oskaloosa from 1947 through 1972, consistently disguising the town as 'Midwest, Kansas' in publications such as "One Boy's Day" (1952) and "Midwest and Its Children" (1955). Based on this data, Barker first developed the concept of the behavior setting to help explain the interplay between the individual and the immediate environment.
Virginia Satir
Virginia Satir (26 June 1916 - 10 September 1988) was a noted American author and psychotherapist, known especially for her approach to family therapy. Her most well-known books are Conjoint Family Therapy, 1964, Peoplemaking, 1972, and The New Peoplemaking, 1988.

She is also known for creating the "Virginia Satir Change Process Model", which was developed through clinical studies. Change-management and organizational gurus of the 1990s and 2000s embraced this model to describe how change impacts organizations.
Max Wertheimer
Max Wertheimer (April 15, 1880 – October 12, 1943) was a Czech-born Jewish psychologist who was one of the three founders of Gestalt psychology, along with Kurt Koffka and Wolfgang Köhler.
Karl Ludwig von Bertalanffy
Karl Ludwig von Bertalanffy (September 19, 1901, Vienna, Austria – June 12, 1972, New York, USA) was an Austrian-born biologist known as one of the founders of general systems theory. Von Bertalanffy grew up in Austria and subsequently worked in Vienna, London, Canada, and the USA.

Bertalanffy occupies an important position in the intellectual history of the twentieth century. His contributions went beyond biology, and extended to cybernetics, education, history, philosophy, psychiatry, psychology and sociology. Some of his admirers even believe that von Bertalanffy's general systems theory could provide a conceptual framework for all these disciplines.[1] He is seen as founder of the interdisciplinary school of thought known as general systems theory. Yet he spent his life in semi-obscurity, and he survives mostly in footnotes. Ludwig von Bertalanffy may well be the least known intellectual titan of the twentieth century.[4]

The individual growth model published by von Bertalanffy in 1934 is widely used in biological models and exists in a number of permutations.

In its simplest version the so-called von Bertalanffy growth equation is expressed as a differential equation of length (L) over time (t): dL/dt = r_B(L∞ − L), where r_B is a growth-rate constant and L∞ is the asymptotic length.
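As a sketch (the function names and parameter values here are illustrative, not from the source), the standard growth equation dL/dt = r_B(L∞ − L) has the closed-form solution L(t) = L∞ − (L∞ − L0)e^(−r_B·t), which a step-by-step numerical integration should reproduce:

```python
import math

def vb_length(t, L_inf, r_B, L0=0.0):
    """Closed-form solution of dL/dt = r_B * (L_inf - L)."""
    return L_inf - (L_inf - L0) * math.exp(-r_B * t)

def vb_euler(t_end, L_inf, r_B, L0=0.0, dt=1e-4):
    """Integrate dL/dt = r_B * (L_inf - L) with explicit Euler steps."""
    L = L0
    for _ in range(int(round(t_end / dt))):
        L += r_B * (L_inf - L) * dt
    return L

# The numerical orbit tracks the closed form and saturates toward L_inf.
print(abs(vb_euler(5.0, 100.0, 0.5) - vb_length(5.0, 100.0, 0.5)) < 0.1)  # True
```

The saturating shape (rapid early growth levelling off at L∞) is why the model is so widely reused in biology.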

General System Theory (GST)

The biologist is widely recognized for his contributions to science as a systems theorist; specifically, for the development of a theory known as General System Theory (GST). The theory attempted to provide alternatives to conventional models of organization. GST defined new foundations and developments as a generalized theory of systems with applications to numerous areas of study, emphasizing holism over reductionism, organism over mechanism.

Open systems

Bertalanffy's contribution to systems theory is best known for his theory of open systems. The system theorist argued that traditional closed system models based on classical science and the second law of thermodynamics were untenable. Bertalanffy maintained that “the conventional formulation of physics are, in principle, inapplicable to the living organism being open system having steady state. We may well suspect that many characteristics of living systems which are paradoxical in view of the laws of physics are a consequence of this fact.” [7] However, while closed physical systems were questioned, questions equally remained over whether or not open physical systems could justifiably lead to a definitive science for the application of an open systems view to a general theory of systems.

In Bertalanffy’s model, the theorist defined general principles of open systems and the limitations of conventional models. He ascribed applications to biology, information theory and cybernetics. Concerning biology, examples from the open systems view suggested they “may suffice to indicate briefly the large fields of application” that could be the “outlines of a wider generalization”;[8] from these he derived a hypothesis for cybernetics. Although potential applications exist in other areas, the theorist developed only the implications for biology and cybernetics. Bertalanffy also noted unsolved problems, which included continued questions over thermodynamics, the unsubstantiated claim that there are physical laws to support generalizations (particularly for information theory), and the need for further research into the problems and potential of applying the open-system view from physics.

Systems in the social sciences

In the social sciences, Bertalanffy did believe that general systems concepts were applicable, e.g. theories that had been introduced into the field of sociology from a modern systems approach that included “the concept of general system, of feedback, information, communication, etc.” [9] The theorist critiqued classical “atomistic” conceptions of social systems and ideation “such as ‘social physics’ as was often attempted in a reductionist spirit.” [10] Bertalanffy also recognized difficulties with the application of a new general theory to social science due to the complexity of the intersections between natural sciences and human social systems. However, the theory still encouraged new developments in sociology, anthropology, economics, political science, and psychology, among other areas. Today, Bertalanffy's GST remains a bridge for interdisciplinary study of systems in the social sciences.
Second-order cybernetics
also known as the cybernetics of cybernetics, it investigates the construction of models of cybernetic systems. It studies cybernetics with the awareness that the investigators are part of the system, and with attention to self-referentiality, self-organization, the subject-object problem, etc.

The work of Heinz von Foerster, Humberto Maturana, Gordon Pask, Ranulph Glanville, and Paul Pangaro is strongly associated with second-order cybernetics.
Urie Bronfenbrenner
Urie Bronfenbrenner (April 29, 1917–September 25, 2005) was a renowned American psychologist, known for developing his Ecological Systems Theory, and as a co-founder of the Head Start program in the United States for disadvantaged pre-school children.

One of the leading scholars in the field of developmental psychology, Bronfenbrenner made his primary contribution with his Ecological Systems Theory, in which he delineated four types of nested systems. He called these the microsystem (such as the family or classroom); the mesosystem (which is two microsystems in interaction); the exosystem (external environments which indirectly influence development, e.g., parental workplace); and the macrosystem (the larger socio-cultural context). He later added a fifth system, called the chronosystem (the evolution of the external systems over time). Each system contains roles, norms and rules that can powerfully shape development.

It has been said that before Bronfenbrenner, child psychologists studied the child, sociologists examined the family, anthropologists the society, economists the economic framework of the times, and political scientists the political structure.

As a result of Bronfenbrenner's groundbreaking work in "human ecology", these environments, from the family to economic and political structures, have come to be viewed as part of the life course from childhood through adulthood. The "bioecological" approach to human development broke down barriers among the social sciences, and built bridges between the disciplines that have allowed findings to emerge about which key elements in the larger social structure, and across societies, are vital for optimal human development.
Psychology systems/scales
From Urie Bronfenbrenner

* Microsystem: Immediate environments (family, school, peer group, neighborhood, and childcare environments)
* Mesosystem: A system composed of connections between immediate environments (e.g., a child’s home and school)
* Exosystem: External environmental settings which only indirectly affect development (such as parent's workplace)
* Macrosystem: The larger cultural context (Eastern vs. Western culture, national economy, political culture, subculture)
* Chronosystem: The patterning of environmental events and transitions over the course of life.

The person's own biology may be considered part of the microsystem; thus the theory has recently sometimes been called "Bio-Ecological Systems Theory." Each system contains roles, norms, and rules that can powerfully shape development.
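As a toy illustration (the data structure and function below are my own, not Bronfenbrenner's, and "outward" is loose since the chronosystem is temporal rather than spatial), the five systems can be represented as an ordered taxonomy from innermost to outermost:

```python
# Illustrative representation of Bronfenbrenner's five systems,
# ordered from the most immediate environment outward.
NESTED_SYSTEMS = [
    ("microsystem",  "immediate environments: family, school, peer group"),
    ("mesosystem",   "connections between immediate environments"),
    ("exosystem",    "external settings that influence development indirectly"),
    ("macrosystem",  "the larger socio-cultural context"),
    ("chronosystem", "patterning of events and transitions over the life course"),
]

def systems_outward(name):
    """Return the systems listed outward from (and excluding) the named one."""
    names = [n for n, _ in NESTED_SYSTEMS]
    return names[names.index(name) + 1:]

print(systems_outward("mesosystem"))
# ['exosystem', 'macrosystem', 'chronosystem']
```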
Activities of man
Aristotle held that there were three basic activities of man: theoria, poiesis and praxis. To these kinds of activity corresponded three types of knowledge: theoretical, whose end goal was truth; poietical, whose end goal was production; and practical, whose end goal was action. Aristotle further divided practical knowledge into ethics, economics and politics. He also distinguished between eupraxia (good praxis) and dyspraxia (bad praxis, misfortune).
Gestalt
Started Pre WWII
Gestalt psychology (also Gestalt of the Berlin School) is a theory of mind and brain that proposes that the operational principle of the brain is holistic, parallel, and analog, with self-organizing tendencies; or, that the whole is different from the sum of its parts. The classic Gestalt example is a soap bubble, whose spherical shape is not defined by a rigid template or a mathematical formula, but rather emerges spontaneously by the parallel action of surface tension acting at all points in the surface simultaneously. This is in contrast to the "atomistic" principle of operation of the digital computer, where every computation is broken down into a sequence of simple steps, each of which is computed independently of the problem as a whole. The Gestalt effect refers to the form-forming capability of our senses, particularly with respect to the visual recognition of figures and whole forms instead of just a collection of simple lines and curves.
**
ORIGINS (Max Wertheimer, Kurt Koffka and Wolfgang Köhler??)
Although Max Wertheimer is credited as the founder of the movement, the concept of Gestalt was first introduced in contemporary philosophy and psychology by Christian von Ehrenfels (a member of the School of Brentano). The idea of Gestalt has its roots in theories by Johann Wolfgang von Goethe, Immanuel Kant, and Ernst Mach. Wertheimer's unique contribution was to insist that the Gestalt is perceptually primary, defining the parts of which it was composed, rather than being an "additional" element over and above the component parts, as von Ehrenfels's earlier Gestalt-qualität had been.

Both von Ehrenfels and Edmund Husserl seem to have been inspired by Mach's work Beiträge zur Analyse der Empfindungen (Contributions to the Analysis of the Sensations, 1886), in formulating their very similar concepts of Gestalt and Figural Moment, respectively.

Early 20th-century theorists, such as Kurt Koffka, Max Wertheimer, and Wolfgang Köhler (students of Carl Stumpf), saw objects as perceived within an environment according to all of their elements taken together as a global construct. This 'gestalt' or 'whole form' approach sought to define principles of perception: seemingly innate mental laws which determined the way in which objects were perceived.

These laws took several forms, such as the grouping of similar, or proximate, objects together, within this global process. Although Gestalt has been criticized for being merely descriptive, it has formed the basis of much further research into the perception of patterns and objects (ref: Carlson, Buskist & Martin, 2000), and of research into behavior, thinking, problem solving and psychopathology.
**
http://en.wikipedia.org/wiki/Max_Wertheimer

*** Die Gestalt is a German word for form or shape. It is used in English to refer to a concept of 'wholeness'

*** The Gestalt effect refers to the form-forming capability of our senses, particularly with respect to the visual recognition of figures and whole forms instead of just a collection of simple lines and curves

** the basic methodological approach (holistic, phenomenological, experimental) of Gestalt theory, and its system-theoretical approach


***
Gestalt psychology should not be confused with the Gestalt therapy of Fritz Perls, which is only peripherally linked to Gestalt psychology. Gestalt therapy was forged from various influences in the times and lives of the founders: physics, Eastern religion, existential phenomenology, Gestalt psychology, psychoanalysis, theatrical performance, systems and field theory (Mackewn, 1997).
Gestalt therapy theory rests atop essentially four "load-bearing walls": phenomenological method, dialogical relationship, field-theoretical strategies, and experimental freedom.
Critical realism
In the philosophy of perception, critical realism is the theory that some of our sense-data (for example, those of primary qualities) can and do accurately represent external objects, properties, and events, while other of our sense-data (for example, those of secondary qualities and perceptual illusions) do not accurately represent any external objects, properties, and events. In short, critical realism refers to any position that maintains that there exists an objectively knowable, mind-independent reality, whilst acknowledging the roles of perception and cognition.

Critical realism refers to several schools of thought. These include the American critical realists (Roy Wood Sellars, George Santayana, and Arthur Lovejoy) and a broader movement including Bertrand Russell and C. D. Broad.
Pragmatism
Pragmatism is a philosophic school generally considered to have originated in the late nineteenth century with Charles Peirce, who first stated the pragmatic maxim. It came to fruition in the early twentieth-century philosophies of William James and John Dewey. Pragmatists consider practical consequences or real effects to be vital components of both meaning and truth. Other important aspects of pragmatism include anti-Cartesianism, radical empiricism, instrumentalism, anti-realism, verificationism, conceptual relativity, a denial of the fact-value distinction, a high regard for science, and fallibilism.

Pragmatism began enjoying renewed attention from the 1950s on, because a new school of philosophy put forth a revised pragmatism which criticized the logical positivism that had dominated philosophy in the United States and Britain since the 1930s, notably in the work of analytic philosophers like W. V. O. Quine and Wilfrid Sellars. The concept of naturalized epistemology was further developed and widely publicized by Richard Rorty, whose later work grew closer to continental philosophy and is often considered relativistic. Contemporary pragmatism is still divided between work that is strictly within the analytic tradition, and a more relativistic strand in the wake of Rorty, and lastly neoclassical pragmatism (which includes philosophers such as Susan Haack) that stays closer to the work of Peirce, James, and Dewey.
Idealism
Idealism is any system or theory that maintains that the "real" is of the nature of thought or that the object of external perception consists of ideas. It can also be the tendency to represent things in an ideal form, or as they might or should be rather than as they are, with emphasis on values.

The approach to idealism by Western philosophers has been different from that of Eastern thinkers. In Western thought the ideal relates to direct knowledge of subjective mental ideas, or images. It is usually juxtaposed with realism in which the real is said to have absolute existence prior to and independent of our knowledge. Epistemological idealists (such as Kant) might insist that the only things which can be directly known for certain are ideas. In Eastern thought, as reflected in Hindu idealism, the concept of idealism takes on the meaning of higher consciousness, essentially the living consciousness of an all-pervading God, as the basis of all phenomena. A type of Asian idealism is Buddhist idealism.

Idealism is often contrasted with materialism, both belonging to the class of monist as opposed to dualist or pluralist ontologies.
Epistemology
Study of Knowledge

Epistemology (from Greek επιστήμη - episteme, "knowledge" + λόγος, "logos") or theory of knowledge is a branch of philosophy concerned with the nature and scope of knowledge.[1] The term was introduced into English by the Scottish philosopher James Frederick Ferrier (1808-1864).[2]

Much of the debate in this field has focused on analyzing the nature of knowledge and how it relates to similar notions such as truth, belief, and justification. It also deals with the means of production of knowledge, as well as skepticism about different knowledge claims. In other words, epistemology primarily addresses the following questions: "What is knowledge?", "How is knowledge acquired?", "What do people know?", "How do we know what we know?"
Metaphysics
Study of Existence

Metaphysics is the branch of philosophy investigating principles of reality transcending those of any particular science. Cosmology and ontology are traditional branches of metaphysics. It is concerned with explaining the ultimate nature of being and the world.[1]

The word derives from the Greek words μετά (metá) (meaning "beyond" or "after") and φυσικά (physiká) (meaning "physical"), "physical" referring to those works on matter by Aristotle in antiquity. The prefix meta- ("beyond") was attached to the chapters in Aristotle's work that physically followed after the chapters on "physics", in posthumously edited collections. Aristotle called some of the subjects treated there "first philosophy".
Ontology
In philosophy, ontology (from the Greek ὄν, genitive ὄντος: of being (part. of εἶναι: to be) and -λογία: science, study, theory) is a branch of metaphysics, often considered the most fundamental. It is the study of the nature of being, existence, or reality in general and of its basic categories and their relations, with particular emphasis on determining what entities exist or can be said to exist, and how these can be grouped and related within an ontology (typically, a hierarchy subdivided according to similarities and differences). Ontology is distinguished from epistemology, but since theories of knowledge typically involve some assumptions about existence and what exists, the two can be seen as complementary disciplines.

The study of the nature of Being and existence; includes the definition and classification of entities, physical or mental, the nature of their properties, and the nature of change.

A central branch of metaphysics is ontology, the investigation into what types of things there are in the world and what relations these things bear to one another. The metaphysician also attempts to clarify the notions by which people understand the world, including existence, objecthood, property, space, time, causality, and possibility.
Structural versus algorithmic information theory
Since the 1960s, SIT (in psychology) and AIT (in computer science) evolved independently as viable alternatives to Shannon's classical information theory, which had been developed in communication theory.[7] In Shannon's approach, things are assigned codes with lengths based on their probabilities in terms of frequencies of occurrence (as, e.g., in the Morse code). In many domains, including perception, however, such probabilities are hardly quantifiable, if at all. Both SIT and AIT circumvent this problem by turning to descriptive complexities of individual things.

Although SIT and AIT share many starting points and objectives, there are also several relevant differences:

* First, SIT makes the perceptually relevant distinction between structural and metrical information, whereas AIT does not;
* Second, SIT encodes for a restricted set of perceptually relevant kinds of regularities, whereas AIT encodes for any imaginable regularity;
* Third, in SIT, the relevant outcome of an encoding is a hierarchical organization, whereas in AIT, it is a complexity value.
Structural information theory versus connectionism and dynamic systems theory
On the one hand, a representational theory like SIT seems opposite to dynamic systems theory (DST). On the other hand, connectionism can be seen as something in between, that is, it flirts with DST when it comes to the usage of differential equations and it flirts with theories like SIT when it comes to the representation of information. In fact, the analyses provided by SIT, connectionism, and DST, correspond to what Marr called the computational, the algorithmic, and the implementational levels of description, respectively. According to Marr, such analyses are complementary rather than opposite.

What SIT, connectionism, and DST have in common is that they describe nonlinear system behavior, that is, a minor change in the input may yield a major change in the output. Their complementarity expresses itself in that they focus on different aspects:

* First, DST focuses primarily on how the state of a physical system as a whole (in this case, the brain) develops over time, whereas both SIT and connectionism focus primarily on what a system does in terms of information processing; according to both SIT and connectionism, this information processing (which, in this case, can be said to constitute cognition) thrives on interactions between bits of information.
* Second, regarding these interactions between bits of information, connectionism focuses primarily on the nature of concrete interaction mechanisms (assuming existing bits of information suited for any input), whereas SIT focuses primarily on the nature of the (assumed to be transient, i.e., input-dependent) bits of information involved and on the nature of the outcome of the interaction between them (modelling the interaction itself in a more abstract way).
formalization and representation by symbol strings
In SIT's formal coding model, this encoding is modelled by way of symbol manipulation. In psychology, this has led to critical statements of the sort of "SIT assumes that the brain performs symbol manipulation". Such statements, however, fall in the same category as statements such as "physics assumes that nature applies formulas such as Einstein's E = mc² or Newton's F = ma" and "DST models assume that dynamic systems apply differential equations". That is, these statements ignore that the very concept of formalization means that things are represented by symbols and that relationships between these things are captured by formulas or, in the case of SIT, by simplest codes.
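SIT's actual coding language covers iteration, symmetry, and alternation regularities and is considerably more involved; the following toy sketch (my own, purely illustrative) applies only an iteration rule, keeping a compressed code for a run of symbols exactly when it is shorter than the literal string:

```python
def iterate_code(s):
    """Toy simplest-code search using only an iteration regularity:
    a run of n identical symbols becomes 'n*(symbol)' when that code
    is shorter than the literal run. This illustrates choosing codes
    by descriptive complexity; it is not SIT's full coding language."""
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        run = j - i
        literal = s[i] * run
        coded = f"{run}*({s[i]})"
        out.append(coded if len(coded) < len(literal) else literal)
        i = j
    return "".join(out)

print(iterate_code("aaaaaabab"))  # "6*(a)bab"
```

The point of the sketch is that the "simplest code" is selected by comparing descriptive complexities, not probabilities of occurrence.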
Nonlinear system
In mathematics, a nonlinear system is a system which is not linear, i.e. a system which does not satisfy the superposition principle:

* Additivity: f(x+y) = f(x) + f(y)
* Homogeneity: f(ax) = af(x) for any scalar a (setting a = 0 forces f(0) = 0)
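A minimal sketch of what superposition demands (function and variable names are illustrative): spot-checking additivity and homogeneity on sample inputs can disprove linearity, though passing the check never proves it. Note how an affine map fails, precisely because homogeneity at a = 0 forces f(0) = 0:

```python
def is_linear_on_samples(f, samples):
    """Spot-check superposition on sample pairs: additivity
    f(x+y) == f(x)+f(y) and homogeneity f(a*x) == a*f(x).
    Failing disproves linearity; passing merely fails to disprove it."""
    tol = 1e-9
    for x, y in samples:
        if abs(f(x + y) - (f(x) + f(y))) > tol:
            return False
        for a in (-2.0, 0.0, 3.5):  # a = 0 catches affine maps: f(0) must be 0
            if abs(f(a * x) - a * f(x)) > tol:
                return False
    return True

samples = [(1.0, 2.0), (-3.0, 0.5)]
print(is_linear_on_samples(lambda x: 4.0 * x, samples))        # True
print(is_linear_on_samples(lambda x: x * x, samples))          # False
print(is_linear_on_samples(lambda x: 4.0 * x + 1.0, samples))  # False: affine, not linear
```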


Nonlinear problems are of interest to physicists and mathematicians because most physical systems are inherently nonlinear in nature. Nonlinear equations are difficult to solve and give rise to interesting phenomena such as chaos. The weather is famously nonlinear, where simple changes in one part of the system produce complex effects throughout.

A nonhomogeneous system, which is linear apart from the presence of a function of the independent variables, is nonlinear according to a strict definition, but such systems are usually studied alongside linear systems, because they can be transformed to a linear system as long as a particular solution is known.

Types
* Nonlinear algebraic equations: often solved and understood numerically
* Nonlinear recurrence relations: include the logistic map
* Nonlinear differential equations: diverse
--> 1st-order ODEs often exactly solvable by separation of variables
--> Higher-order forms rarely have closed-form solutions (solutions that can be expressed as an equation)
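As a worked illustration of separation of variables (the ODE dy/dt = −y² is my own example, not from the source): separating gives dy/y² = −dt, hence 1/y = t + 1/y0, i.e. y(t) = y0/(1 + y0·t), which a naive Euler integration should reproduce:

```python
def closed_form(t, y0):
    """Exact solution of dy/dt = -y**2 via separation of variables:
    dy/y**2 = -dt  =>  1/y = t + 1/y0  =>  y = y0 / (1 + y0*t)."""
    return y0 / (1.0 + y0 * t)

def euler(t_end, y0, dt=1e-4):
    """Naive Euler integration of dy/dt = -y**2 for comparison."""
    y = y0
    for _ in range(int(round(t_end / dt))):
        y += -y * y * dt
    return y

print(abs(euler(2.0, 1.0) - closed_form(2.0, 1.0)) < 1e-3)  # True
```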
Connectionism
Connectionism is an approach in the fields of artificial intelligence, cognitive psychology/cognitive science, neuroscience and philosophy of mind, that models mental or behavioral phenomena as the emergent processes of interconnected networks of simple units. There are many forms of connectionism, but the most common forms use neural network models.

In most connectionist models, networks change over time.

A closely related and very common aspect of connectionist models is activation. At any time, a unit in the network has an activation, which is a numerical value intended to represent some aspect of the unit.

Earlier work

PDP's direct roots were the perceptron theories of researchers such as Frank Rosenblatt from the 1950s and 1960s. But perceptron models were made very unpopular by the book Perceptrons by Marvin Minsky and Seymour Papert, published in 1969. It elegantly demonstrated the limits on the sorts of functions which perceptrons can calculate, showing that even simple functions like the exclusive disjunction could not be handled properly. The PDP books overcame this limitation by showing that multi-level, non-linear neural networks were far more robust and could be used for a vast array of functions.
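A small sketch of the limitation Minsky and Papert highlighted (the training details below are illustrative, not from their book): the classic perceptron learning rule masters the linearly separable AND function, but no single-layer perceptron can reach full accuracy on the exclusive disjunction:

```python
def train_perceptron(data, epochs=100, lr=0.1):
    """Classic perceptron rule on 2-input Boolean data; returns (w, b)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def accuracy(w, b, data):
    ok = sum((1 if w[0] * x1 + w[1] * x2 + b > 0 else 0) == t
             for (x1, x2), t in data)
    return ok / len(data)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(accuracy(*train_perceptron(AND), AND))  # 1.0: AND is linearly separable
print(accuracy(*train_perceptron(XOR), XOR))  # < 1.0: XOR is not
```

Multi-layer, nonlinear networks of the PDP kind escape this limit because a hidden layer can re-represent the inputs so that XOR becomes separable.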

Many earlier researchers advocated connectionist style models, for example in the 1940s and 1950s, Warren McCulloch, Walter Pitts, Donald Olding Hebb, and Karl Lashley. McCulloch and Pitts showed how neural systems could implement first-order logic: their classic paper "A Logical Calculus of Ideas Immanent in Nervous Activity" (1943) is important in this development. They were influenced by the important work of Nicolas Rashevsky in the 1930s. Hebb contributed greatly to speculations about neural functioning, and proposed a learning principle, Hebbian learning, that is still used today. Lashley argued for distributed representations as a result of his failure to find anything like a localized engram in years of lesion experiments.

Connectionism apart from PDP

Though PDP is the dominant form of connectionism, other theoretical work should also be classified as connectionist.

Many connectionist principles can be traced to early work in psychology, such as that of William James. Psychological theories based on knowledge about the human brain were fashionable in the late 19th century. As early as 1869, the neurologist John Hughlings Jackson argued for multi-level, distributed systems. Following from this lead, Herbert Spencer's Principles of Psychology, 3rd edition (1872), and Sigmund Freud's Project for a Scientific Psychology (composed 1895) propounded connectionist or proto-connectionist theories. These tended to be speculative theories. But by the early 20th century, Edward Thorndike was conducting experiments on learning that posited a connectionist-type network.

In the 1950s, Friedrich Hayek proposed that spontaneous order in the brain arose out of decentralized networks of simple units. Hayek's work was rarely cited in the PDP literature until recently.

Another form of connectionist model was the relational network framework developed by the linguist Sydney Lamb in the 1960s. Relational networks have been used only by linguists, and were never unified with the PDP approach. As a result, they are now used by very few researchers.
Nonlinear recurrence relations
A nonlinear recurrence relation defines successive terms of a sequence as a nonlinear function of preceding terms. Examples of nonlinear recurrence relations are the logistic map and the relations that define the various Hofstadter sequences.
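A minimal sketch of the logistic map x_{k+1} = r·x_k·(1 − x_k) (the parameter values are chosen for illustration): at r = 2.5 orbits settle on the stable fixed point 1 − 1/r = 0.6, while at r = 4 nearby starting points separate rapidly, the sensitivity characteristic of chaos:

```python
def logistic_orbit(r, x0, n):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k)."""
    x, orbit = x0, [x0]
    for _ in range(n):
        x = r * x * (1.0 - x)
        orbit.append(x)
    return orbit

# r = 2.5: the orbit settles on the stable fixed point 1 - 1/r = 0.6.
print(abs(logistic_orbit(2.5, 0.2, 200)[-1] - 0.6) < 1e-9)  # True

# r = 4.0: a 1e-9 perturbation of the starting point grows to order one.
a = logistic_orbit(4.0, 0.2, 50)
b = logistic_orbit(4.0, 0.2 + 1e-9, 50)
print(max(abs(p - q) for p, q in zip(a, b)) > 0.01)  # True
```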

Common methods for the qualitative analysis of nonlinear ordinary differential equations include:

* Examination of any conserved quantities, especially in Hamiltonian systems.
* Examination of dissipative quantities (see Lyapunov function) analogous to conserved quantities.
* Linearization via Taylor expansion.
* Change of variables into something easier to study.
* Bifurcation theory.
* Perturbation methods (can be applied to algebraic equations too).
Dynamical system
The dynamical system concept is a mathematical formalization for any fixed "rule" which describes the time dependence of a point's position in its ambient space. Examples include the mathematical models that describe the swinging of a clock pendulum, the flow of water in a pipe, and the number of fish each spring in a lake.

A dynamical system has a state determined by a collection of real numbers, or more generally by a set of points in an appropriate state space. Small changes in the state of the system correspond to small changes in the numbers. The numbers are also the coordinates of a geometrical space—a manifold. The evolution rule of the dynamical system is a fixed rule that describes what future states follow from the current state. The rule is deterministic: for a given time interval only one future state follows from the current state.


Overview

The concept of a dynamical system has its origins in Newtonian mechanics. There, as in other natural sciences and engineering disciplines, the evolution rule of dynamical systems is given implicitly by a relation that gives the state of the system only a short time into the future. (The relation is either a differential equation, a difference equation, or a relation on another time scale.) To determine the state for all future times requires iterating the relation many times—each advancing time a small step. The iteration procedure is referred to as solving the system or integrating the system. Once the system can be solved, given an initial point it is possible to determine all its future points, a collection known as a trajectory or orbit.
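The iteration procedure described above can be sketched for the pendulum, whose evolution rule is θ'' = −(g/L)·sin θ (a semi-implicit Euler step; the step size and parameters are illustrative choices of mine, not from the source):

```python
import math

def pendulum_orbit(theta0, omega0, dt=1e-3, steps=10000, g_over_l=9.81):
    """Produce a trajectory by repeatedly applying the short-time
    evolution rule of theta'' = -(g/L)*sin(theta).
    Uses a semi-implicit Euler step: velocity first, then position."""
    theta, omega = theta0, omega0
    trajectory = [(theta, omega)]
    for _ in range(steps):
        omega += -g_over_l * math.sin(theta) * dt  # advance velocity
        theta += omega * dt                        # then advance position
        trajectory.append((theta, omega))
    return trajectory

# The semi-implicit step approximately conserves energy, so the swing
# amplitude stays near its initial value over many periods.
traj = pendulum_orbit(0.3, 0.0)
amp = max(abs(th) for th, _ in traj)
print(abs(amp - 0.3) < 0.01)  # True
```

Each pass through the loop is one application of the evolution rule; the accumulated list is exactly the "trajectory or orbit" of the text.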

Before the advent of fast computing machines, solving a dynamical system required sophisticated mathematical techniques and could only be accomplished for a small class of dynamical systems. Numerical methods executed on computers have simplified the task of determining the orbits of a dynamical system.

For simple dynamical systems, knowing the trajectory is often sufficient, but most dynamical systems are too complicated to be understood in terms of individual trajectories. The difficulties arise because:

* The systems studied may only be known approximately—the parameters of the system may not be known precisely or terms may be missing from the equations. The approximations used bring into question the validity or relevance of numerical solutions. To address these questions several notions of stability have been introduced in the study of dynamical systems, such as Lyapunov stability or structural stability. The stability of the dynamical system implies that there is a class of models or initial conditions for which the trajectories would be equivalent. The operation for comparing orbits to establish their equivalence changes with the different notions of stability.
* The type of trajectory may be more important than one particular trajectory. Some trajectories may be periodic, whereas others may wander through many different states of the system. Applications often require enumerating these classes or maintaining the system within one class. Classifying all possible trajectories has led to the qualitative study of dynamical systems, that is, properties that do not change under coordinate changes. Linear dynamical systems and systems that have two numbers describing a state are examples of dynamical systems where the possible classes of orbits are understood.
* The behavior of trajectories as a function of a parameter may be what is needed for an application. As a parameter is varied, the dynamical systems may have bifurcation points where the qualitative behavior of the dynamical system changes. For example, it may go from having only periodic motions to apparently erratic behavior, as in the transition to turbulence of a fluid.
* The trajectories of the system may appear erratic, as if random. In these cases it may be necessary to compute averages using one very long trajectory or many different trajectories. The averages are well defined for ergodic systems and a more detailed understanding has been worked out for hyperbolic systems. Understanding the probabilistic aspects of dynamical systems has helped establish the foundations of statistical mechanics and of chaos.

It was in the work of Poincaré that these dynamical systems themes developed.

***
Basic definitions

Main article: Dynamical system (definition)

A dynamical system consists of a manifold M, called the phase (or state) space, and a smooth evolution function Φ^t that, for each time t ∈ T, maps a point of the phase space back into the phase space. The notion of smoothness changes with applications and the type of manifold. There are several choices for the set T. When T is taken to be the reals, the dynamical system is called a flow; if T is restricted to the non-negative reals, it is a semi-flow. When T is taken to be the integers, it is a cascade or a map; the restriction to the non-negative integers gives a semi-cascade.
Logistic map
The logistic map is a polynomial mapping of degree 2, often cited as an archetypal example of how complex, chaotic behaviour can arise from very simple non-linear dynamical equations. The map was popularized in a seminal 1976 paper by the biologist Robert May, in part as a discrete-time demographic model analogous to the logistic equation first created by Pierre François Verhulst.[1] Mathematically, the logistic map is written x_{n+1} = r x_n (1 − x_n), where x_n lies between zero and one and r is a positive growth-rate parameter.
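A minimal Python sketch of the map (the parameter values and starting points below are illustrative assumptions, not from the text):

```python
def logistic(r, x):
    # the logistic map: x_{n+1} = r * x_n * (1 - x_n)
    return r * x * (1 - x)

def orbit(r, x0, n):
    """Iterate the map n times from x0 and return the whole orbit."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic(r, xs[-1]))
    return xs

# For r = 2.5 the orbit settles onto the stable fixed point 1 - 1/r = 0.6 ...
print(orbit(2.5, 0.2, 100)[-1])
# ... while for r = 4.0 nearby starting points separate rapidly (chaos).
print(orbit(4.0, 0.200, 50)[-1], orbit(4.0, 0.201, 50)[-1])
```

The contrast between the two parameter values is exactly May's point: the same one-line rule yields both a tame equilibrium and erratic, initial-condition-sensitive behaviour.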
Bifurcation theory
Bifurcation theory is the mathematical study of changes in the qualitative or topological structure of a given family of curves, such as the integral curves of a family of vector fields or the solutions of a family of differential equations. Most commonly applied to the mathematical study of dynamical systems, a bifurcation occurs when a small smooth change made to the parameter values (the bifurcation parameters) of a system causes a sudden 'qualitative' or topological change in its behaviour. Bifurcations occur in both continuous systems (described by ODEs, DDEs or PDEs) and discrete systems (described by maps).

Types of bifurcation

It is useful to divide bifurcations into two principal classes:

* Local bifurcations, which can be analysed entirely through changes in the local stability properties of equilibria, periodic orbits or other invariant sets as parameters cross through critical thresholds; and
* Global bifurcations, which often occur when larger invariant sets of the system 'collide' with each other, or with equilibria of the system. They cannot be detected purely by a stability analysis of the equilibria (fixed points).

Local bifurcations
Phase portrait showing a saddle-node bifurcation.
Period-halving bifurcations (L) leading to order, followed by period-doubling bifurcations (R) leading to chaos.

A local bifurcation occurs when a parameter change causes the stability of an equilibrium (or fixed point) to change. In continuous systems, this corresponds to the real part of an eigenvalue of an equilibrium passing through zero. In discrete systems (those described by maps rather than ODEs), this corresponds to a fixed point having a Floquet multiplier with modulus equal to one. In both cases, the equilibrium is non-hyperbolic at the bifurcation point. The topological changes in the phase portrait of the system can be confined to arbitrarily small neighbourhoods of the bifurcating fixed points by moving the bifurcation parameter close to the bifurcation point (hence 'local').
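This eigenvalue-through-zero picture can be illustrated with the normal form of the saddle-node bifurcation, x' = r + x^2 (a minimal Python sketch; the parameter values are illustrative):

```python
import math

# Normal form of the saddle-node (fold) bifurcation: x' = r + x**2.
def equilibria(r):
    """Fixed points solve r + x^2 = 0."""
    if r > 0:
        return []        # no equilibria after the bifurcation
    s = math.sqrt(-r)
    return [-s, s]       # the two branches; they merge at r = 0

def eigenvalue(x_star):
    # linearization: d/dx (r + x^2) = 2x, zero exactly at the bifurcation
    return 2 * x_star

for r in (-1.0, 0.0, 1.0):
    print(r, [(x, eigenvalue(x)) for x in equilibria(r)])
```

For r < 0 there is a stable equilibrium (negative eigenvalue) and an unstable one (positive eigenvalue); at r = 0 they collide at a non-hyperbolic point, and for r > 0 both have vanished.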

Examples of local bifurcations include:

* Saddle-node (fold) bifurcation
* Transcritical bifurcation
* Pitchfork bifurcation
* Period-doubling (flip) bifurcation
* Hopf bifurcation
* Neimark (secondary Hopf) bifurcation

Global bifurcations

Global bifurcations occur when 'larger' invariant sets, such as periodic orbits, collide with equilibria. This causes changes in the topology of the trajectories in the phase space which cannot be confined to a small neighbourhood, as is the case with local bifurcations. In fact, the changes in topology extend out to an arbitrarily large distance (hence 'global').

Examples of global bifurcations include:

* Homoclinic bifurcation in which a limit cycle collides with a saddle point.
* Heteroclinic bifurcation in which a limit cycle collides with two or more saddle points.
* Infinite-period bifurcation in which a stable node and saddle point simultaneously occur on a limit cycle.
* Blue sky catastrophe in which a limit cycle collides with a nonhyperbolic cycle.

Global bifurcations can also involve more complicated sets such as chaotic attractors.
Catastrophe theory
In mathematics, catastrophe theory is a branch of bifurcation theory in the study of dynamical systems; it is also a special case of the more general singularity theory in geometry.

Bifurcation theory studies and classifies phenomena characterized by sudden shifts in behavior arising from small changes in circumstances, analysing how the qualitative nature of equation solutions depends on the parameters that appear in the equation. This may lead to sudden and dramatic changes, for example the unpredictable timing and magnitude of a landslide.

Catastrophe theory, which originated with the work of the French mathematician René Thom in the 1960s and became very popular through the efforts of Christopher Zeeman in the 1970s, considers the special case where the long-run stable equilibrium can be identified with the minimum of a smooth, well-defined potential function (Lyapunov function).

Small changes in certain parameters of a nonlinear system can cause equilibria to appear or disappear, or to change from attracting to repelling and vice versa, leading to large and sudden changes of the behaviour of the system. However, examined in a larger parameter space, catastrophe theory reveals that such bifurcation points tend to occur as part of well-defined qualitative geometrical structures.
Phenomenology (psychology)
In psychology, phenomenology is used to refer to subjective experiences or their study. The experiencing subject can be considered to be the person or self. Subjective experiences are those that are in principle not directly observable by any external observer. One aspect of this that is of great philosophical interest is qualia, whose archetypal exemplar is "redness". "Is my experience of redness the same as yours?" "How would we know?" Subjective experiences are not merely perceptual. They can include any emotional, cognitive, or conative experience reaching the consciousness of the subject.

** In philosophy: Phenomenology is the study of phenomena (from the Greek, meaning "that which appears") and how they appear to us from a first-person perspective. In modern times, it usually refers to the philosophy developed by Edmund Husserl, which is primarily concerned with consciousness and its structures (the ways in which phenomena appear to us). Because consciousness is supposed to be that to which everything shows itself, and phenomenology is the study of consciousness, Husserl considered it to be a proper first philosophy. Husserl also sought to develop a "philosophy as rigorous science".

Husserl's original account of phenomenology has through the years been criticised and developed, partly by Husserl himself, but also by Martin Heidegger, who was his student and assistant, and many of the later existentialist thinkers such as Maurice Merleau-Ponty, Jean-Paul Sartre, and Simone de Beauvoir.
"Qualia"
"Qualia" (pronounced /ˈkwɑːliə/) is "an unfamiliar term for something that could not be more familiar to each of us: the ways things seem to us"[1]. They can be defined as qualities or sensations, like redness or pain, as considered independently of their effects on behavior and from whatever physical circumstances give rise to them. In more philosophical terms, qualia are properties of sensory experiences.

The importance of qualia in philosophy of mind comes largely from the fact that they are often seen as posing a fundamental problem for physicalism. Much of the debate over their existence, however, hinges on the debate over the precise definition of the term, as various philosophers emphasize or deny the existence of certain properties.

The word "qualia" comes from Latin, meaning "what sort" or "what kind." The Latin and English singular is "quale" (pronounced /ˈkwɑːleɪ/, roughly KWAH-leh).[2]

Believers in qualia are known as qualophiles; non-believers as qualophobes.[3]
Control Theory - history
Control theory is

* a theory that deals with influencing the behavior of dynamical systems
* an interdisciplinary subfield of science, which originated in engineering and mathematics, and evolved into use by the social sciences, like psychology, sociology and criminology.

History
Although control systems of various types date back to antiquity, a more formal analysis of the field began with the physicist James Clerk Maxwell's 1868 dynamics analysis of the centrifugal governor, On Governors.[1] This paper described and analyzed the phenomenon of "hunting", in which lags in the system can lead to overcompensation and unstable behavior. It generated a flurry of interest in the topic, during which Maxwell's classmate Edward John Routh generalized Maxwell's results to the general class of linear systems.[2] Independently, Adolf Hurwitz analyzed system stability using differential equations in 1877; the combined result is now called the Routh-Hurwitz theorem.
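The question Routh and Hurwitz answered by hand (when do all roots of the characteristic polynomial lie in the left half-plane?) can be checked numerically today. A minimal Python sketch; the polynomials below are illustrative examples, not from the text:

```python
import numpy as np

def is_stable(coeffs):
    """A linear system is stable iff every root of its characteristic
    polynomial (coefficients given highest degree first) has negative
    real part, which is the condition the Routh-Hurwitz criterion tests."""
    return bool(np.all(np.roots(coeffs).real < 0))

print(is_stable([1, 3, 3, 1]))  # (s + 1)^3: all roots at -1 -> True
print(is_stable([1, 0, -1]))    # s^2 - 1: a root at +1      -> False
```

The symbolic Routh array decides the same question without computing any roots, which was the whole point before numerical eigenvalue methods existed.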

By World War II, control theory was an important part of fire-control systems, guidance systems and electronics. The Space Race also depended on accurate spacecraft control. However, control theory also saw increasing use in fields such as economics.
Norbert Wiener
Norbert Wiener (1894-1964) co-developed the Wiener-Kolmogorov filter and coined the term Cybernetics in the 1940s.
Max Weber
Maximilian Carl Emil Weber (pronounced [maks 'veːbɛɐ]) (21 April 1864 – 14 June 1920) was a German political economist and sociologist who was considered one of the founders of the modern study of sociology and public administration. He began his career at the University of Berlin, and later worked at the universities of Freiburg, Heidelberg, and Munich.

Weber's major works deal with rationalization in the sociology of religion and government.[1] His most famous work is his essay The Protestant Ethic and the Spirit of Capitalism, which began his work in the sociology of religion. In this work, Weber argued that religion was one of the non-exclusive reasons for the different ways the cultures of the Occident and the Orient have developed, and stressed the importance of particular characteristics of ascetic Protestantism which led to the development of capitalism, bureaucracy and the rational-legal state in the West. In another major work, Politics as a Vocation, Weber defined the state as an entity which claims a monopoly on the legitimate use of physical force, a definition that became pivotal to the study of modern Western political science. His analysis of bureaucracy in his Economy and Society is still central to the modern study of organizations. His best-known contributions are often referred to as the 'Weber Thesis'.
Béla H. Bánáthy
Béla H. Bánáthy (Gyula, Hungary, 1 December 1919 – Chico, California, 4 September 2003) was a systems scientist and a professor at San José State University and UC Berkeley. Bánáthy was the founder of the White Stag Leadership Development Program, whose leadership model was adopted across the United States; founder of the International Systems Institute[1] and its innovative "conversation"-oriented conference structure; co-founder of the General Evolutionary Research Group[2]; an influential professor of systems theory; and a widely read and respected author.

Bánáthy argued, along with the founders of the systems society, that "the benefit of humankind" is the purpose of science.
Émile Durkheim
Émile Durkheim (French pronunciation: [dyʁkɛm]; April 15, 1858 – November 15, 1917) was a French sociologist whose contributions were instrumental in the formation of sociology and anthropology. His work and editorship of the first journal of sociology, L'Année Sociologique, helped establish sociology within academia as an accepted social science. During his lifetime, Durkheim gave many lectures, and published numerous sociological studies on subjects such as education, crime, religion, suicide, and many other aspects of society. He is considered one of the founding fathers of sociology and an early proponent of solidarism.
Spinoza
Reverence for being

Baruch or Benedict de Spinoza (Hebrew: ברוך שפינוזה‎, Portuguese: Bento de Espinosa, Latin: Benedictus de Spinoza) (November 24, 1632 – February 21, 1677) was a Dutch philosopher of Portuguese Jewish origin. Although he revealed considerable scientific aptitude, the breadth and importance of Spinoza's work were not fully realized until years after his death. Today, he is considered one of the great rationalists of 17th-century philosophy, laying the groundwork for the 18th-century Enlightenment and modern biblical criticism. By virtue of his magnum opus, the posthumous Ethics, Spinoza is also considered one of Western philosophy's definitive ethicists.

***
Modern Relevance


Late 20th century Europe demonstrated a greater philosophical interest in Spinoza, often from a left-wing or Marxist perspective. Notable philosophers Gilles Deleuze, Antonio Negri, Étienne Balibar and the Brazilian philosopher Marilena Chauí have each written books on Spinoza. Deleuze's doctoral thesis, published in 1968, refers to him as "the prince of philosophers."[9] Other philosophers heavily influenced by Spinoza include Constantin Brunner and John David Garcia. Stuart Hampshire wrote a major English language study of Spinoza, though H. H. Joachim's work is equally valuable. Unlike most philosophers, Spinoza and his work were highly regarded by Nietzsche.

Philosopher Ludwig Wittgenstein evoked Spinoza with the title (suggested to him by G. E. Moore) of the English translation of his first definitive philosophical work, Tractatus Logico-Philosophicus, an allusion to Spinoza's Tractatus Theologico-Politicus. Elsewhere, Wittgenstein deliberately borrowed the expression sub specie aeternitatis from Spinoza (Notebooks, 1914-16, p. 83). The structure of his Tractatus Logico-Philosophicus does have certain structural affinities with Spinoza's Ethics (though, admittedly, not with the latter's own Tractatus) in erecting complex philosophical arguments upon basic logical assertions and principles. Furthermore, in propositions 6.4311 and 6.45 he alludes to a Spinozian understanding of eternity and interpretation of the religious concept of eternal life, stating that "If by eternity is understood not eternal temporal duration, but timelessness, then he lives eternally who lives in the present." (6.4311) "The contemplation of the world sub specie aeterni is its contemplation as a limited whole." (6.45) Furthermore, Wittgenstein's interpretation of religious language, in both his early and later career, may be said to bear a family resemblance to Spinoza's pantheism.

Spinoza has had influence beyond the confines of philosophy. The nineteenth-century novelist George Eliot produced her own translation of the Ethics, the first known English translation thereof. The twentieth-century novelist W. Somerset Maugham alluded to one of Spinoza's central concepts with the title of his novel, Of Human Bondage. Albert Einstein named Spinoza as the philosopher who exerted the most influence on his world view (Weltanschauung). Spinoza equated God (infinite substance) with Nature, consistent with Einstein's belief in an impersonal deity. In 1929, Einstein was asked in a telegram by Rabbi Herbert S. Goldstein whether he believed in God. Einstein responded by telegram: "I believe in Spinoza's God who reveals himself in the orderly harmony of what exists, not in a God who concerns himself with the fates and actions of human beings."[10] Spinoza's pantheism has also influenced environmental theory. Arne Næss, the father of the deep ecology movement, acknowledged Spinoza as an important inspiration. Moreover, the Argentinian writer Jorge Luis Borges was greatly influenced by Spinoza's world view. In many of his poems and short stories, Borges makes constant allusions to the philosopher's work, though not necessarily as a partisan of his doctrines, but merely in order to use these for aesthetic purposes--a common tactic in Borges's work.

Spinoza is an important historical figure in the Netherlands, where his portrait was featured prominently on the Dutch 1000-guilder banknote, legal tender until the euro was introduced in 2002. The highest and most prestigious scientific award of the Netherlands is named the Spinoza prijs (Spinoza prize). Spinoza's work is also mentioned as the favourite reading material for Bertie Wooster's valet Jeeves in the P. G. Wodehouse novels.
Robert May
His seminal 1976 paper popularized iterated maps as models of discrete population dynamics (and of chaos arising from simple equations).
Christian von Ehrenfels
Christian Freiherr von Ehrenfels (June 2, 1859 in Rodaun near Vienna - September 8, 1932 in Lichtenau) was an Austrian philosopher, and is known as one of the founders and precursors of Gestalt psychology.

Although Max Wertheimer is to be credited as the founder of the movement of Gestalt psychology, the concept of Gestalt itself was first introduced in contemporary philosophy and psychology by Ehrenfels in his famous work Über Gestaltqualitäten ("On the Qualities of Form", 1890). The idea of Gestalt has its roots in theories by Johann Wolfgang von Goethe and Ernst Mach. Both he and Edmund Husserl seem to have been inspired by Mach's work Beiträge zur Analyse der Empfindungen ("Contributions to the Analysis of the Sensations", 1886) to formulate their very similar concepts of Gestalt and Figural Moment respectively.

Ehrenfels studied philosophy at the University of Vienna with Franz Brentano and Alexius Meinong, and received his doctorate in 1885 under the supervision of Meinong, after the latter's move to the University of Graz, with the thesis Größenrelationen und Zahlen. Eine psychologische Studie ("Relations of magnitude and numbers. A psychological study"). He obtained his habilitation in 1888 in Vienna with the work Über Fühlen und Wollen ("On feeling and willing"). From 1896 to 1929 he was professor of philosophy at the German university of Prague.
Johann Wolfgang von Goethe
The idea of Gestalt has its roots in theories by Johann Wolfgang von Goethe and Ernst Mach.

Johann Wolfgang von Goethe (IPA: [ˈjoːhan ˈvɔlfgaŋ fɔn ˈgøːtə]; in English generally pronounced /ˈgɝːtə/;[1] 28 August 1749 – 22 March 1832) was a German writer. George Eliot called him "Germany's greatest man of letters… and the last true polymath to walk the earth."[2] Goethe's works span the fields of poetry, drama, literature, theology, humanism, and science. Goethe's magnum opus, lauded as one of the peaks of world literature, is the two-part drama Faust.[3] Goethe's other well-known literary works include his numerous poems, the Bildungsroman Wilhelm Meister's Apprenticeship and the epistolary novel The Sorrows of Young Werther.

Goethe was one of the key figures of German literature and the movement of Weimar Classicism in the late 18th and early 19th centuries; this movement coincides with Enlightenment, Sentimentality (Empfindsamkeit), Sturm und Drang, and Romanticism. The author of the scientific text Theory of Colours, he influenced Darwin with his focus on plant morphology.[4][5] He also served at length as the Privy Councilor ("Geheimrat") of the duchy of Weimar.

Goethe is the originator of the concept of Weltliteratur ("world literature"), having taken great interest in the literatures of England, France, Italy, classical Greece, Persia, and the Arab world, amongst others. His influence on German philosophy is virtually immeasurable, with a major effect especially on the generation of Hegel and Schelling, although Goethe himself expressly and decidedly refrained from practicing philosophy in the rarefied sense.

Goethe's influence spread across Europe, and for the next century his works were a major source of inspiration in music, drama, poetry and philosophy. Goethe is considered by many to be the most important writer in the German language and one of the most important thinkers in Western culture as well. Early in his career, however, he wondered whether painting might not be his true vocation; late in his life, he expressed the expectation that he would ultimately be remembered above all for his work in colour.
Ernst Mach
The idea of Gestalt has its roots in theories by Johann Wolfgang von Goethe and Ernst Mach.

Ernst Mach (pronounced [max]) (February 18, 1838 – February 19, 1916) was an Austrian physicist and philosopher and is the namesake for the "Mach number" (also known as Mach speed) and the optical illusion known as Mach bands.

Mach developed a philosophy of science which was influential in the 19th and 20th centuries. Mach held that scientific laws are summaries of experimental events, constructed for the purpose of human comprehension of complex data. Thus scientific laws have more to do with describing sensations than with reality as it exists beyond sensations. Some quotations from Mach's writings will illustrate his philosophy. These selections are taken from his essay The Economical Nature of Physical Inquiry, excerpted by Kockelmans (citation below).
Edmund Husserl
Edmund Gustav Albrecht Husserl (IPA: [ˈhʊsɛrl]; April 8, 1859 – April 26, 1938) was a philosopher, known as the father of phenomenology. His work was a break with the purely positivist orientation and understanding of the science and philosophy of his day, giving weight to the notion that experience is the source of all knowledge, while at the same time elaborating critiques of psychologism and historicism.

Husserl was a pupil of Franz Brentano and Carl Stumpf; his philosophical work influenced, among others, Hans Blumenberg, Ludwig Landgrebe, Eugen Fink, Max Scheler, Martin Heidegger, Jean-Paul Sartre, Emmanuel Levinas, Rudolf Carnap, Hermann Weyl, Maurice Merleau-Ponty, Alfred Schütz, Pierre Bourdieu, Paul Ricœur, Jacques Derrida, Jan Patočka, Roman Ingarden, Edith Stein (St. Teresa Benedicta of the Cross), and Karol Wojtyla. In 1887 Husserl converted to Christianity and joined the Lutheran Church. He taught philosophy at Halle as a tutor (Privatdozent) from 1887, then at Göttingen as professor from 1901, and at Freiburg im Breisgau from 1916 until he retired in 1928. After this, he continued his research and writing by using the library at Freiburg.
Talcott Parsons
Talcott Parsons (December 13, 1902 - May 8, 1979) was an American sociologist who served on the faculty of Harvard University from 1927 to 1973. He produced a general theoretical system for the analysis of society that came to be called structural functionalism. This system was created by Parsons to reflect his vision of an integrated social science.

For many years Parsons was the best-known sociologist in the United States, and indeed one of the best-known in the world. His work was very influential through the 1950s and well into the 1960s, particularly in the United States, but fell gradually out of favour afterward. The most prominent attempt to revive Parsonian thinking, under the rubric neofunctionalism, has been made by the sociologist Jeffrey Alexander, now at Yale University.
***

Work

Parsons was an advocate of "grand theory," an attempt to integrate all the social sciences (except anthropology) into an overarching theoretical framework.

In his early work, The Structure of Social Action, he reviewed the output of his great predecessors, especially Max Weber, Vilfredo Pareto, and Émile Durkheim, and attempted to derive from them a single "action theory" based on the assumptions that human action is voluntary, intentional, and symbolic.

Later, he became intrigued with, and involved in, an astonishing range of fields: from medical sociology (where he developed the concept of the sick role), to psychoanalysis (personally undergoing full training as a lay analyst), to anthropology, to small-group dynamics (working extensively with Robert Freed Bales), to race relations, and then to economics and education.

Systems theory and cybernetics

Parsons developed his ideas during a period when systems theory and cybernetics were very much on the front burner of social and behavioral science. In using systems thinking, he postulated that the relevant systems treated in social and behavioral science were "open," meaning that they were embedded in an environment consisting of other systems. For social and behavioral science, the largest system is "the action system," consisting of interrelated behaviors of human beings, embedded in a physical-organic environment.[2]

Parsons had a seminal influence on, and was an early mentor of, Niklas Luhmann, the pre-eminent German sociologist and originator of autopoietic systems theory.

AGIL paradigm

The procedure he adopted to analyze this system and its subsystems is called the "AGIL Paradigm" or "AGIL scheme". To survive or maintain equilibrium with respect to its environment, any system must to some degree adapt to that environment (Adaptation), attain its goals (Goal attainment), integrate its components (Integration), and maintain its latent pattern (Latency pattern maintenance), a cultural template of some sort. These are called the system's functional imperatives.

In the case of the analysis of a societal action system, the AGIL Paradigm, according to Parsons, yields four interrelated and interpenetrating subsystems: the behavioral systems of its members (A), the personality systems of those members (G), the society as a system of social organization (I) and the cultural system of that society (L). To analyze a society as a social system (the I subsystem of action), people are posited to enact roles associated with positions. These positions and roles become differentiated to some extent and in a modern society are associated with such things as occupational, political, judicial and educational roles.

Considering the interrelation of these specialized roles as well as functionally differentiated collectivities (e.g., firms, political parties), the society can be analyzed as a complex system of interrelated functional subsystems, namely:

* The economy -- societal adaptation to its action and non-action environmental systems
* The polity -- societal goal attainment
* The societal community -- the integration of its diverse social components
* The fiduciary system -- processes and units that function to reproduce societal culture

Parsons elaborated upon the idea that each of these systems also developed specialized symbolic mechanisms of interaction analogous to money in the economy, e.g., influence in the societal community. Various processes of "interchange" among the subsystems of the societal system were postulated.

The most elaborate use of Parsons's functional systems analysis with the AGIL scheme appears in two collaborative books, Economy and Society (with N. Smelser, 1956) and The American University (with G. Platt, 1973).

Social evolutionism

Parsons contributed to the field of social evolutionism and neoevolutionism. He divided evolution into four subprocesses:

1. differentiation, which creates functional subsystems of the main system, as discussed above;
2. adaptation, where those systems evolve into more efficient versions;
3. inclusion of elements previously excluded from the given systems; and
4. generalization of values, increasing the legitimization of the ever-more complex system.

Furthermore, Parsons explored these subprocesses within three stages of evolution:

1. primitive,
2. archaic and
3. modern (where archaic societies have the knowledge of writing, while modern have the knowledge of law).

Parsons viewed Western civilisation as the pinnacle of modern societies, and of all Western cultures he declared the United States the most dynamically developed. For this he was attacked as an ethnocentrist.

Parsons' late work focused on a new theoretical synthesis around four functions common (he claimed) to all systems of action—from the behavioral to the cultural, and a set of symbolic media that enable communication across them. His attempt to structure the world of action according to a mere four concepts was too much for many American sociologists, who were at that time retreating from the grand pretensions of the 1960s to a more empirical, grounded approach. Parsons' influence waned rapidly in the U.S. after 1970.

Pattern variables

Parsons asserted that there were two dimensions to societies: instrumental and expressive. By this he meant that there are qualitative differences between kinds of social interaction.

He observed that people can have personalized and formally detached relationships based on the roles that they play. The characteristics that were associated with each kind of interaction he called the pattern variables.

Some examples of expressive societies would include families, churches, clubs, crowds, and smaller social settings. Examples of instrumental societies would include bureaucracies, aggregates, and markets.
Episteme
Distinguished from techne, the word ἐπιστήμη is Greek for knowledge or science, coming from the verb ἐπίσταμαι "to know".

Michel Foucault used the term épistémè in his work The Order of Things to mean the historical a priori that grounds knowledge and its discourses and thus represents the condition of their possibility within a particular epoch. In subsequent writings he made it clear that several epistemes may co-exist and interact at the same time, being parts of various power-knowledge systems. He did not, however, disown the concept:

I would define the episteme retrospectively as the strategic apparatus which permits of separating out from among all the statements which are possible those that will be acceptable within, I won’t say a scientific theory, but a field of scientificity, and which it is possible to say are true or false. The episteme is the ‘apparatus’ which makes possible the separation, not of the true from the false, but of what may from what may not be characterised as scientific.[1]

Foucault's use of episteme has been asserted to be similar to Thomas Kuhn's notion of a paradigm, for example by Jean Piaget.[2] However, there are decisive differences. Whereas Kuhn's paradigm is an all-encompassing collection of beliefs and assumptions that organizes scientific worldviews and practices, Foucault's episteme is not confined to science but spans a wider range of discourse (all of science itself would fall under the episteme of the epoch). And while Kuhn's paradigm shifts result from a series of conscious decisions made by scientists to pursue a neglected set of questions, Foucault's epistemes are something like the 'epistemological unconscious' of an era; the configuration of knowledge in a particular episteme rests on a set of assumptions so fundamental to that episteme as to be invisible to people operating within it.

Moreover, Kuhn's concept seems to correspond to what Foucault calls the theme or theory of a science, whereas Foucault analysed how opposing theories and themes could co-exist within a science.[3] Kuhn does not search for the conditions of possibility of opposing discourses within a science, but simply for the (relatively) invariant dominant paradigm governing scientific research (supposing that one paradigm is always pervasive, except during a paradigmatic transition).

Like Althusser, who draws on the concept of ideology, Foucault goes deeper into discourses to demonstrate their constitutive limits and, in particular, the rules enabling their productivity. However, Foucault maintained that though ideology may infiltrate and form science, it need not do so: it must be demonstrated how ideology actually forms the science in question; contradictions and lack of objectivity are not an indicator of ideology.[4] Kuhn's and Foucault's notions are both influenced by the French philosopher of science Gaston Bachelard's notion of an "epistemological rupture", as indeed was Althusser.
Judith Butler used the concept of episteme in her book Excitable Speech. Following Foucault and Garth Fowden, Victoria Nelson uses episteme ("the state of knowing") in opposition to gnosis ("the process of knowing"; see also Hermeticism and Gnosticism) in her book The Secret Life of Puppets.
Aristotle's forms of knowledge
* Episteme / Epistemology
* Phronesis
* Techne
Phronesis
Phronesis (Greek: φρόνησις) in Aristotle's Nicomachean Ethics is the virtue of moral thought, usually translated "practical wisdom", sometimes as "prudence".

Aristotle distinguishes between two intellectual virtues: sophia and phronesis. Sophia (usually translated "wisdom") is the ability to think well about the nature of the world, to discern why the world is the way it is (this is sometimes equated with science); sophia involves deliberation concerning universal truths. Phronesis is the capability to consider the mode of action in order to deliver change, especially to enhance the quality of life. Aristotle says that phronesis is not simply a skill, however, as it involves not only the ability to decide how to achieve a certain end, but also the ability to reflect upon and determine that end (this latter point is denied by some commentators, who contend that Aristotle considers the desired end, eudaimonia, to be given, such that phronesis deliberates only about the means to it).
Techne
Techne, or techné, as distinguished from episteme, derives etymologically from the Greek word τέχνη (Ancient Greek: IPA: [tékʰ.nεː], Modern Greek: [ˈtex.ni]), often translated as craftsmanship, craft, or art. It is the rational method involved in producing an object or accomplishing a goal or objective; the means of this method is art. Techne resembles episteme in implying knowledge of principles, but differs in that its intent is making or doing, as opposed to "disinterested understanding."

In Ion, Plato wrote that techne (in the sense of an art or craft) represented a threat to the peace, order and good government for which Reason and Law "by common consent have ever been deemed best." Aristotle saw it as representative of the imperfection of human imitation of nature. For the ancient Greeks, it signified all the Mechanical Arts, including medicine and music. The English aphorism "gentlemen don't work with their hands" is said to have originated in ancient Greece, reflecting a cynical view of these arts: they were deemed fit only for the lower classes, while the upper class practised the Liberal Arts of "free" men (Dorter 1973).

Socrates, too, compliments techne only when it is used in the context of episteme. Episteme sometimes means knowing how to do something in a craft-like way; this craft-like knowledge is called a "technê." It is most useful when the knowledge is applied practically, rather than theoretically or aesthetically. For the ancient Greeks, techne appearing as art is most often viewed negatively, whereas techne used as a craft is viewed positively, because a craft is the practical application of an art rather than art as an end in itself. In The Republic, Plato writes that knowledge of the forms "is the indispensable basis for the philosophers' craft of ruling in the city" (Stanford 2003).

Techne is often used in philosophical discourse to distinguish it from art (or poiesis). This use of the word also occurs in the digital humanities to differentiate between linear narrative presentation of knowledge and dynamic presentation of knowledge, where techne represents the former and poiesis the latter.