Papers by Michael S Jones
Online publication, 2011
The term ‘Radical Affinity’ stems from an investigation into the properties of the natural numbers – their tendencies to behave, according to characteristics of their radices (or ‘bases’), in ways previously unacknowledged in the analyses of quantitative systems. This inquiry begins from an empirical comparison of values in exponential series across a limited range of diverse number radices (base-2 to base-9), in terms of the logarithmic ratios of sequential values in each exponential series, relative to the ratios of corresponding values in the decimal series. While the logarithmic ratios of sequential values in the decimal series are naturally consistent, and would produce graphs consisting of horizontal straight lines, the distributions revealed for each of the radical series reproduced in this paper are mostly (with a few limited exceptions) irregular series of variegated peaks and troughs displaying proportional inconsistency. It is noted that these results critically undermine the consistency of the logarithmic function in expressing ‘common ratios of proportion’ between integer values when those values are expressed as their numerically ‘equal’ values across a range of alternative radices.
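A minimal sketch of the kind of comparison described above, in Python. The series (powers of 3) and the target radix (base 8) are arbitrary illustrative choices, and it is assumed here – since equal quantities would otherwise yield identical ratios – that each value's radix-8 digit string is read at face value before the logarithm of each sequential ratio is taken; this reading is an interpretive assumption, not code taken from the paper.

```python
import math

def to_radix(n: int, base: int) -> str:
    """Digit string of a non-negative integer n in a radix between 2 and 9."""
    digits = ""
    while n > 0:
        n, d = divmod(n, base)
        digits = str(d) + digits
    return digits or "0"

def log_ratios(series):
    """log10 of each sequential ratio x[k+1]/x[k]; constant for a geometric series."""
    return [round(math.log10(b / a), 3) for a, b in zip(series, series[1:])]

# A decimal exponential series (powers of 3, chosen arbitrarily for illustration).
decimal_series = [3 ** k for k in range(1, 9)]

# Each value re-expressed in base 8, with the digit string read at face value
# (an assumption made for illustration only).
octal_face_values = [int(to_radix(v, 8)) for v in decimal_series]

print(log_ratios(decimal_series))     # uniform: 0.477, 0.477, ...
print(log_ratios(octal_face_values))  # irregular peaks and troughs
```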
Online publication, 2021
This paper raises some concerns over conventional approaches to quantitative understanding, with respect to the definition of an ‘integer’ as a stable index of quantity, and to the principle of rational proportionality governing integers in the denotation of numeric value. It is noted that post-Renaissance mathematics, in its motivation towards abstraction and universality, promoted an axiomatic definition of proportion which was insensitive to the distinction between discrete and continuous forms of magnitude (that is, between arithmetic and geometric magnitudes). It is inferred that the legacy of Neoclassical mathematics is to engender a transcendental understanding of proportion (and hence of logic) which is applied with assumed independence, both from the ostensive forms of distribution of the objects under analysis and from the various methods of encoding their quantities. The effect of this received understanding upon contemporary applications of mathematics within technology is the assumption that the conditions of proportionality that apply to numerical quantities as expressed within the decimal rational schema may be seamlessly transposed into any alternative numerical radix (for our purposes, typically binary, octal, hexadecimal, base-64, etc.) while consistently maintaining those relations of proportion. With reference to empirical research, it is argued conversely that the relations of proportion that obtain within any individual numerical radix resist axiomatic definition, and are in fact determined uniquely according to the characterological requirements of the particular codebase employed (0–9 in decimal, 0–7 in octal, for instance). This means that, while discrete numerical values may appear to be seamlessly transferable into their ‘equal’ values across a range of diverse radices, the relations of proportion between those values will not be so preserved. The assumption widely held by mathematicians and information scientists that those relations are preserved is therefore a mathematical error-in-principle. Similarly, it is argued that digital algorithms are defined by unique, non-universal sets of rules, analogously in principle with the unique set of rules that defines the expression of natural numbers within decimal notation. By extension, therefore, the data that results from the processing of any particular digital algorithm, or set of algorithms, will have logical consistency only to the extent that it is qualified with respect to the particular set of rules that defines those algorithms, and under which the data derives its unique and potent existence. Any further use of the data independently of that qualification implies that it will be logically inconsistent with comparative data produced under different sets of rules, as well as with the real-world criteria it purports to represent.
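A short illustration of the transposition at issue, under the same face-value reading assumed in the sketch above (again an interpretive assumption, not the paper's own code). Standard positional conversion preserves the two quantities, and hence their 1 : 3 ratio, in every radix; the face-value ratios of the resulting digit strings, by contrast, diverge from radix to radix.

```python
def to_radix(n: int, base: int) -> str:
    """Digit string of a non-negative integer n in a radix between 2 and 9."""
    digits = ""
    while n > 0:
        n, d = divmod(n, base)
        digits = str(d) + digits
    return digits or "0"

pair = (9, 27)  # decimal values in the proportion 1 : 3, chosen for illustration

for base in (2, 5, 8):
    a, b = (to_radix(v, base) for v in pair)
    # The quantities (and their ratio) are unchanged by conversion; only the
    # face-value reading of the digit strings yields a different ratio per radix.
    print(f"base {base}: {a}, {b}  face-value ratio = {int(b) / int(a):.3f}")
```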
Online publication, 2015
A sceptical review of the cognitive sciences and of computational linguistics, with a view to a critique of computationalism for its ill-fated reduction to the empirical in its approach to a philosophy of mind. By juxtaposing empirical with non-empirical approaches to the subject, with particular reference to those of Locke and Kant, it is argued that computationalism's arbitrary reduction to the empirical is chiefly determined by instrumental technological ambitions that predispose its adherents to a mechanistic understanding of mind, and to a linear, functionalist understanding of natural language. That characteristic approach proceeds by eliminating reasonable reflection upon the rudiments of intuition and metaphysics, both with respect to the dominance of the former in the hierarchy of the structure of thought and to the precession of each in the epistemology of the empirical sciences themselves.