\"Homoncule\"

Relativized Systemic & applications

MRC

An insight into the Method of Relativized Conceptualization

MRC culminates in a striking formula. Its extreme simplicity contrasts with the subtlety of the approach, conducted with uncompromising intellectual rigor.

D/G, oeG, V/ ⇔ [G,V]

According to MRC, the description D of a physical entity oeG results from the co-genesis of psycho-physical concepts which explicitly appear in the description formalism.

oeG is an operational concept which points to some physical substratum. It exists intersubjectively insofar as carrying out the same generation/qualification process [G,V] yields stable values according to the adopted convergence criteria (same value, neighboring values, …, possibly combined with other criteria such as the stability of emergence frequencies in the case of probability distributions). The physical probability law developed in RS formalizes the cognitive situation entailed by such repetitions, and thereby stresses the very nature of the difference between statistics and probabilities.
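
As a purely illustrative sketch (the function names, the tolerances and the splitting of the record into two halves are our own assumptions, not part of MRC or RS), such convergence criteria could be caricatured in Python as follows:

    from collections import Counter

    def values_converge(values, tol=1e-3):
        # Crude "neighboring values" criterion: the spread of the recorded
        # values stays below a freely chosen tolerance.
        return max(values) - min(values) <= tol

    def frequencies_stable(outcomes, tol=0.05):
        # Crude "stable emergence frequencies" criterion: the relative
        # frequencies computed on the two halves of the record differ by
        # less than a freely chosen tolerance.
        half = len(outcomes) // 2
        first, second = Counter(outcomes[:half]), Counter(outcomes[half:])
        return all(abs(first[k] / half - second[k] / (len(outcomes) - half)) <= tol
                   for k in set(first) | set(second))

Both tolerances are conventions of exactly the kind discussed further down: nothing in the record itself dictates them.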

The [G,V] process and the physical ground RG it acts upon exist and may be regarded as the same so long as their different realizations are identically conceptualized on the basis of knowledge built beforehand. We call this conceptualization, which underlies any realization, the epistemic referential (G,V). Without this referential it would not be possible to reach an agreement upon what goes on and what is to be noticed, there and now. The RS concept of primordial transferred description prevents an infinite regress. A physical entity oeG emerges as a rational concept from the stability of the correlation, in space and time, between the different realizations of the known process [G,V] and the “same” values they yield. These values are regarded as traces of the effect of the [G,V] process on this entity, thereby revealing its “properties”; the entity and its “properties” are conceptualized as the cause of this effect. The contextual meaning carried by the “stable” qualification of both the genetic process and the values it yields is formalized in the RS concept of assessment, used as the mathematized framework for neighboring concepts such as hypothesis, conformity assessment, anticipation based on experience, …

The genetic class [G,V]n is logically equivalent to the description of the physical entity. It symbolizes the finite number of experiments - from 1 to n - deemed necessary to decide upon the stability of the process relative to the adopted criteria, which leads to postulating the existence of a reproducible physical entity oeG. This postulate is formalized as a bijection G ↔ oeG between the generation/selection process G and the conceptualized entity oeG.
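
A minimal sketch of this decision, assuming the [G,V] process is handed to us as a Python callable run_gv and reusing the illustrative values_converge criterion above; the dictionary standing for the bijection G ↔ oeG is our own illustrative device:

    def postulate_entity(run_gv, n, converges):
        # Repeat the [G,V] process n times and keep the yielded values.
        values = [run_gv() for _ in range(n)]
        # If the record satisfies the adopted convergence criterion,
        # postulate the entity oeG and record the bijection G <-> oeG.
        if converges(values):
            return {"G": run_gv, "oeG": "entity generated by " + run_gv.__name__}
        return None  # no reproducible entity can be postulated from this record

Nothing here fixes n or the criterion: both enter as parameters, which is precisely the point of the next paragraph.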

Both the statistical stability criteria and the number of experiments deemed necessary to decide upon the reproducibility of the experiment are freely adopted conventions, not logical deductions. One can imagine that if a direct sensory perception makes some physical entity feel obvious to us (we can see it, touch it, smell it, …), we will tend to content ourselves with a limited number of experiments, perhaps a single one: one does not deem it necessary to measure the length of a table several times to be sure that it exists and to assess that it is x meters long under stable conditions (temperature, humidity, …). But we get puzzled if the repetition of the same process G, which led us to postulate the existence of a physical entity relative to some view V, yields an inexplicable instability of results relative to another view V'. This highlights our natural tendency, which we can hardly control, to absolutize our concepts.
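
To caricature this last remark, here is a toy sketch (the table, the two views and all numerical values are invented for illustration only) in which the same generation process G is reproducible relative to one view V but not relative to another view V':

    import random

    def generate_table():
        # Stands for the generation process G: "produce the table here and now".
        return {"length_m": 2.000, "surface_temp_C": 19 + 2 * random.random()}

    def view_length(table):       # view V: yields the same value at each realization
        return round(table["length_m"], 3)

    def view_temperature(table):  # view V': yields a fluctuating value
        return round(table["surface_temp_C"], 3)

    record_V  = [view_length(generate_table()) for _ in range(5)]       # e.g. [2.0, 2.0, 2.0, 2.0, 2.0]
    record_V2 = [view_temperature(generate_table()) for _ in range(5)]  # e.g. [19.41, 20.87, 19.03, ...]
    # The "same" G is stable relative to V and unstable relative to V':
    # the (in)stability belongs to the pair (process, view), not to an absolute entity.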