Talk:Monotonic function


Monotonicity in computer science


I'm not sure about the following statement in the paragraph 'computer science': "... In some heuristic algorithms, such as A*, the algorithm can be considered optimal if it is monotonic.[2]"

The A* algorithm can just as well be considered optimal if the heuristic function doesn't satisfy the monotonicity restriction. The only practical relevance is that redundant path deletion can be done more efficiently. A* without any extension related to this matter can't guarantee that the next selected path is indeed the best path to the last node on the respective path. If the chosen heuristic function 'is' monotonic, we can assume that the next selected path is indeed the best path to the last node on the selected path. It's a matter of efficiency, not optimality. — Preceding unsigned comment added by Gheylen (talkcontribs) 16:14, 12 April 2012 (UTC)Reply
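For reference, the usual conditions in A* terminology (my summary, not from the article or the comment above), with h the heuristic, h* the true cost to the goal, and c(n, n') the cost of the edge from n to n':

  \text{admissible: } h(n) \le h^*(n), \qquad \text{monotone (consistent): } h(n) \le c(n, n') + h(n'),\ h(\mathrm{goal}) = 0.

A consistent heuristic is automatically admissible, and admissibility is what the standard optimality proof needs; consistency additionally guarantees that a node's cost is already optimal when it is expanded, so closed nodes never need to be reopened, which is the efficiency point made above.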

[first thread]


I know I'm not the most mathematically knowledgeable person here, but I think I may have found an error in this article. It says that a monotonic function can only have countably many discontinuities in its domain. Isn't the devil's staircase (the one related to the Cantor set) a counterexample to this? Thanks for any clarifications you can give! -- Oliver Pereira 02:12 Nov 29, 2002 (UTC)

Sorry, that's not what I meant to say. The devil's staircase is continuous, of course! But if you turn it on its side, don't you get a function (okay, specifying the values on the straight line bits in some arbitrary way) from [0, 1] to [0, 1] which is monotonic and which has an uncountable number of discontinuities? -- Oliver Pereira 02:27 Nov 29, 2002 (UTC)

Nope, okay, the set of discontinuities is countable, of course. Why don't I think before posting these stupid questions??? -- Oliver Pereira 02:31 Nov 29, 2002 (UTC)

Derivative


Is there any name for this theorem? Tosha 20:21, 19 Feb 2005 (UTC)

if f is a monotonic function defined on an interval I, then f is differentiable almost everywhere on I, i.e. the set of numbers x in I such that f is not differentiable at x has Lebesgue measure zero.
This question is very old, but in case someone needs it: It is a theorem of Lebesgue. There might not be a unique name, but when it is referred to by something other than its statement it is called by a name formed by combining in some way the words 'Lebesgue', 'theorem', 'differentiability/differentiation', 'monotonic/monotone functions'. The attribution could be added to the article. Cactus0192837465 (talk) 13:39, 29 January 2019 (UTC)Reply

nondecreasing


If you are going to have nondecreasing redirect here, then you had better define nondecreasing. Inspired by the shape of the graph? Can you show the graph then? --Jaysscholar 23:21, 12 October 2005 (UTC)Reply

  • Non-decreasing is defined on the page. Is your issue with the spelling nondecreasing vs. non-decreasing? My personal preference is nondecreasing, but I'm sure many opinions differ. Ah, maybe I see your point. The definition relies on the definition of monotone above, which is a bit far away. I don't see an easy improvement, but I agree that the current definition is hard to read locally.

--Erik Demaine 00:55, 13 October 2005 (UTC)Reply

    • "Non-decreasing" is misleading since it is not the same as "not decreasing", e.g., (-1)^n is not decreasing, but not non-decreasing in the sense of (weakly) increasing. In all other instances I know (or: can think of in the moment), "non-" means "not", e.g. non-linear = not linear, etc. — MFH:Talk 16:51, 18 June 2010 (UTC)Reply

clarification


Does this definition mean that the function's slope is always positive or negative (no points of inflection)? The definition is unclear (similarly so in my calculus textbook, unfortunately...). --anon

There can be points of inflection. Positive slope is about the first derivative, and points of inflection are about the second derivative, so they are not mutually exclusive. Example:

f(x) = 2x^3 - 3x^2 + 2x

The first derivative, 6x^2 - 6x + 2, is always > 0, while the second derivative is 6(2x - 1), which is 0 at x = 1/2, so there is a point of inflection. Oleg Alexandrov (talk) 19:12, 21 February 2006 (UTC)Reply
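To spell out the check for the cubic above by completing the square:

  f'(x) = 6x^2 - 6x + 2 = 6\left(x - \tfrac{1}{2}\right)^2 + \tfrac{1}{2} > 0, \qquad f''(x) = 12x - 6 = 6(2x - 1),

so f is strictly increasing on all of R yet changes concavity at x = 1/2.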

Actually the answer is that under the definition of a monotonic function, i.e. a function f such that x ≤ y implies f(x) ≤ f(y) for all x and y, constant functions f(x) = c for some number c are also monotonic. For example, for the constant function f(x) = 2, if x ≤ y then f(x) ≤ f(y) because 2 ≤ 2. Thus f(x) = 2 is monotonic, and has a constant slope of 0 at all points in its domain.

A typical related definition is that a function f is called strictly monotonic if x < y implies f(x) < f(y) for all x and y. Such functions can still have points of inflection (e.g. f(x) = x^3 is strictly monotonic with an inflection point at x = 0). However, they can't be constant over an interval. Dugwiki 19:00, 6 November 2006 (UTC)Reply
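A short algebraic check that x^3 is strictly increasing even though its derivative vanishes at 0 (my addition): for x < y,

  y^3 - x^3 = (y - x)\left(x^2 + xy + y^2\right) = (y - x)\left[\left(x + \tfrac{y}{2}\right)^2 + \tfrac{3y^2}{4}\right] > 0,

since the first factor is positive and the bracketed term is positive unless x = y = 0, which cannot happen when x < y.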

Isotone


Should Isotone really redirect here? I'm looking at a different definition in Bhatia's "Matrix Analysis" - if x is majorized by y, then f(x) is submajorized by f(y). For instance, the entropy function (from Rn+ to R+) is isotone. A5 18:59, 6 March 2006 (UTC)Reply

Article needs references


I noticed by chance while reading the article that it doesn't have any citations. References will be needed. Especially handy would be references to proofs of some of the statements in the article, such as the section that implies that all monotone functions have both left and right limits at every point in their domain, and that only having jump discontinuities implies only having a countable number of discontinuities. (I'm sure the statements are true, but it would be informative to reference proofs to that effect.) Dugwiki 18:07, 31 October 2006 (UTC)Reply

Proof of only countable discontinuities?


Ok, for the heck of it I was trying to prove that a monotonic function can only have countably many discontinuities, but I'm stuck. :/ I'm sure it's true, but I haven't figured out how to formally prove it. Anybody have a nice little proof of it, just out of curiosity? Dugwiki 22:56, 1 November 2006 (UTC)Reply

To follow up, I think a possible pseudo proof might be something like this:

Let f be a monotonic function on the reals, and let D be the set of jump discontinuities in the domain of f. Assume that D is an uncountable set. Then it follows (I think) that there must exist a point d in D such that for some e > 0 the finite interval (d-e, d+e) contains an uncountable subset of D. (I suspect this part is true but am not sure of the proof.)

Then for each jump discontinuity within that interval there is an associated non-zero "distance" jumped, which is the difference between the right-hand and left-hand limits at the discontinuity. Consider the set of all such jump distances within the above finite interval (d-e, d+e). Since f is monotonic (say increasing), and this is a finite subinterval of the domain, the right-hand limit of f at d-e must be less than the left-hand limit of f at d+e. And, in fact, the right-hand limit at d+e must be at least as large as the left-hand limit at d-e plus the sum of all jump distances for all discontinuities between the two points. This total amount must be finite, because if the sum of jumps were unbounded then, since f is monotone, f could not take real values beyond the interval (i.e. if the sum grows without bound toward the point d+e, then f(y) could not be a real number for y > d+e, since f(y) must be greater than every function value within the interval (d-e, d+e)).

But a sum of uncountably many non-negative values can only be finite if at most countably many of the summands are non-zero. Since D is uncountable, the sum of its elements' jump distances can therefore only converge to a finite total if only countably many of those jump distances are non-zero. However, if the jump distance equals zero, then the left-hand and right-hand limits are equal, and there is by definition no jump discontinuity at that point. Thus there can be at most countably many discontinuities within (d-e, d+e), contradicting the assumption above that (d-e, d+e) contained an uncountable set of discontinuities. Therefore, by contradiction, there is no finite interval in the domain of f with uncountably many discontinuities, and thus the set of all discontinuities D cannot be uncountable.


That's about the best I can do at the moment. If anyone can clean up the above "proof" or has something neater, please feel free. Dugwiki 20:37, 2 November 2006 (UTC)Reply
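The key inequality behind the sketch above, written out for an increasing f (my paraphrase), with f(x^-) and f(x^+) denoting the one-sided limits:

  \sum_{x \in D \cap (d-e,\, d+e)} \bigl(f(x^+) - f(x^-)\bigr) \;\le\; f\bigl((d+e)^-\bigr) - f\bigl((d-e)^+\bigr) < \infty,

i.e. the jumps inside a bounded interval cannot add up to more than the total increase of f over that interval, and a family of strictly positive numbers with a finite sum can only be countable.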

If you have access to a copy of Riesz and Nagy's book 'Functional Analysis', it is worth reading - they discuss lots of ideas around this area, with some excellent proofs. Madmath789 20:51, 2 November 2006 (UTC)Reply
Thanks. I don't have access to that book, unfortunately, but I'll keep that in mind if I'm at Borders. Dugwiki 21:59, 2 November 2006 (UTC)Reply
Oh, also, if you can pinpoint the proofs in that book, you might want to include them in the article as footnotes for references. The article is currently uncited. Dugwiki 22:01, 2 November 2006 (UTC)Reply

As a follow-up, I thought about this some more Friday night and came up with what I think is a reasonable proof of the claim I mentioned regarding a point with uncountably many neighbors in an uncountable subset of the reals. Here's how this bit of original research goes:

Lemma: For all uncountable subsets S of the reals, there exists at least one element s of S such that for all k>0 there are an uncountable number of elements of S in the interval [s-k, s+k].

Proof: Let S be a given uncountable subset of the reals, and let k be any given strictly positive real number. Consider the covering of the reals by the intervals [nk, (n+2)k] for even integers n. In other words, a covering of the reals by intervals of width 2k, including the interval [0, 2k] and expanding outward in both directions.

This covering contains all elements of S among its intervals. The number of intervals is countable, though, so since S is uncountable there must exist at least one interval that contains uncountably many elements of S. The reason is that if every interval contained only countably many elements of S, then S would be a countable union of countable sets, which is itself countable, a contradiction.

Therefore, by the construction of the covering above, there exists at least one even integer n such that the interval [nk, (n+2)k] contains an uncountable subset of S. This interval can be rewritten as [(n+1)k - k, (n+1)k + k], and thus has the form of interval desired in the lemma. (qed)


With that lemma now proven, and recalling from the earlier argument that the total "jump distance" within a finite interval can only be finite if there are at most countably many strictly positive distances jumped, it follows that no finite interval in the domain of a monotonic function can contain uncountably many jump discontinuities, and therefore by the lemma above there can be at most countably many discontinuities on a real monotonic function's whole domain.

I'm not sure how this proof compares to ones in the textbook reference mentioned previously, but it was certainly fun to play around with last week. :) Dugwiki 18:20, 6 November 2006 (UTC)Reply
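The counting step of the lemma's proof in symbols (my restatement): since

  S = \bigcup_{n \text{ even}} \bigl(S \cap [nk, (n+2)k]\bigr)

is a countable union, S could only be countable if every piece S \cap [nk, (n+2)k] were countable; as S is uncountable, at least one piece, and hence one interval of width 2k, must contain uncountably many points of S.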


You are thinking too hard. It is simple. Let's assume we have f and it increases. Then for all x < y => f(x) < f(y). Therefore, f can only have one type of simple discontinuity, where the right and left limits do not equal each other (the other type is where the right and left limits at a point do equal each other but the value at that point doesn't equal the limit), and more precisely f(p-) < f(p+), where f(p-) indicates the limit from the left. Therefore, for each discontinuity p you can assign a rational number r such that f(p-) < r < f(p+). Thus, you are mapping the discontinuities to the rational numbers, which means that the discontinuities are at most countable. The only way this proof could fail is if you could map more than one discontinuity to a single rational number, but that is not possible. For example, assume we have two discontinuities at x1 and x2, and for the sake of definiteness x1 < x2; then f(x1-) < r1 < f(x1+) < f(x2-) < r2 < f(x2+) => r1 < r2 strictly, and thus you can only map one discontinuity to a single rational number; in other words the mapping is injective. —Preceding unsigned comment added by 68.13.252.98 (talk) 08:03, 20 July 2008 (UTC)Reply
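The same argument in compact form (my restatement): for each discontinuity p of an increasing f choose a rational q(p) with

  f(p^-) < q(p) < f(p^+);

if p_1 < p_2 are two discontinuities then f(p_1^+) \le f(p_2^-), so q(p_1) < q(p_2). The map q is therefore an injection from the set of discontinuities into \mathbb{Q}, which is countable.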


Here's how I'd put it. Assume f is monotone (say increasing) and has an uncountable set D of discontinuities in the interval [a, b]. Take any x in D; then there's some "jump distance" h = h(x) > 0 such that either f(y) >= f(x) + h for all y > x, or f(y) <= f(x) - h for all y < x. [Otherwise, we can pick a monotone increasing sequence yn approaching (but never equal to) x such that f(yn) gets arbitrarily close to f(x), and a similar monotone decreasing sequence zn. Since x is a discontinuity of f, by definition there's some e > 0 with every neighborhood around x containing a point c with f(c) outside of [f(x) - e, f(x) + e]. Since f(yn) and f(zn) both approach f(x), choose m such that f(x) - e < f(ym) < f(x) < f(zm) < f(x) + e. Now just choose an appropriate c in the interval [ym, zm], and the monotonicity of f is contradicted.] Let H1/n = {x in D : h(x) ≥ 1/n}. Some H1/k must be uncountable, since otherwise the uncountable set D would be the union of the countably many countable sets H1/n. The values of f at any two non-adjacent elements of H1/k differ by at least 1/k, and there are infinitely many such elements, which is impossible for a function bounded between f(a) and f(b) on a finite interval. So we have reached a contradiction. 216.80.52.166 (talk) 08:38, 15 January 2009 (UTC)Reply
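A common way to phrase the final counting step with two-sided jumps instead of the one-sided h above (my restatement): for increasing f, let j(x) = f(x^+) - f(x^-). The jumps at interior points of [a, b] satisfy

  \sum_{x \in D} j(x) \le f(b) - f(a), \qquad \text{so} \qquad \#\{x \in D : j(x) \ge 1/k\} \le k\,\bigl(f(b) - f(a)\bigr)

for every k; each of these sets is therefore finite, and D is their countable union, hence countable.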


I like the one with the rational numbers. It's how my book did it. I believe this generalises to any separable space? Maybe, and I don't like proofs by contradiction. Standard Oil (talk) 13:58, 31 May 2009 (UTC)Reply

Diagram


{{reqdiagram}}

This page should have some pictures of graphs. —Ben FrantzDale 15:46, 25 May 2007 (UTC)Reply

Graphs have been added. --pfctdayelise (talk) 14:16, 2 August 2008 (UTC)Reply

Another error


It is claimed in the current version of the article that, if f has limits from the right and from the left at every point of its domain, then f has a limit at infinity (either ∞ or −∞) of either a real number, ∞, or −∞. This is false, as the example f(x) = sin(x) shows. Correction and a general cleanup of all the claims made in this article is needed. Sullivan.t.j (talk) 00:14, 30 December 2007 (UTC)Reply

Good point. I don't know who added that, but I reworded the statement to say that those properties are true for monotonic functions rather than following one from another. That's enough for the purposes of the article anyway. Have you seen other mistakes? Oleg Alexandrov (talk) 00:46, 30 December 2007 (UTC)Reply

Inline citations


An IP editor has requested that additional inline citations be added to the article. I'm not sure which parts of the article the editor is concerned about, however. — Carl (CBM · talk) 17:59, 6 February 2008 (UTC)Reply

At the request of CBM, I am here. I had a disagreement with a colleague about the definition of this term. This article does not help our discussion because it is unclear where the term (as it is defined in the introduction) or any of the other material came from. It would be nice to see inline citations, at least to add some certainty as to where specific statements come from.
I am also worried about text being plagiarized from other sources on the web. It would be shameful if this webpage simply copied someone else's math notes without giving him or her credit. Inline citations would at least alleviate the concern that the information was stolen from someone else. 155.198.204.98 (talk) 18:09, 6 February 2008 (UTC)Reply
I have added two additional references, one to a popular analysis text and one to a popular lattice theory text. Both have "monotone map" or "function, monotone" in the index. Any other introductory analysis or lattice theory book will probably also include a definition of the term. — Carl (CBM · talk) 18:15, 6 February 2008 (UTC)Reply
I would suggest adding an inline citation to at least the first line in the article to show exactly where the definition comes from. The problem with these broad reference sections is that it is unclear as to which reference corresponds to each piece of information in the text. For all you or I know, someone could have slipped an invalid statement about monotonic functions into this article. (I have indeed seen this elsewhere on Wikipedia, and usually these statements get ferreted out when people start using inline citations.) 155.198.204.98 (talk) 19:03, 6 February 2008 (UTC)Reply
As I said, you can find the definition in analysis in the book by Bartle and the definition in lattice theory in the book by Grätzer. Simply pick up the book and look for "monotone" or "function" in the index. In fact, pretty much any introductory analysis or lattice theory book will have the definition; it doesn't really matter which book you look in.
It's just as easy for subtle errors to slip in with inline citations as without (in the process of paraphrasing and formatting). If you see anything that looks wrong, of course, you should feel free to fix it or ask about it on the talk page. — Carl (CBM · talk) 00:07, 7 February 2008 (UTC)Reply

More interesting stuff


Should the article mention that a monotonic function can be discontinuous on a countable dense set? It's very interesting because I can't visualize it :O —Preceding unsigned comment added by Standard Oil (talkcontribs) 14:01, 31 May 2009 (UTC)Reply


Possible error?


Apologies if this is an unusual route but I figured I could highlight a possible error: The terms "non-decreasing" and "non-increasing" should not be confused with the (much weaker) negative qualifications "not decreasing" and "not increasing". For example, the function of figure 3 first falls, then rises, then falls again. It is therefore not decreasing and not increasing, but it is neither non-decreasing nor non-increasing.

To me it makes a whole lot more sense if it says decreasing and increasing, but I am not a mathematician. I hope one of you could verify? Thanks! :) — Preceding unsigned comment added by Beyonix (talkcontribs) 21:07, 2 December 2011 (UTC)Reply

That is correct. They are different concepts with similar names. Zaslav (talk) 21:48, 6 March 2016 (UTC)Reply

Absolutely Monotonic = 1-1 function?


It seems obvious that an absolutely monotonic function is 1-1, but is a 1-1 function in R^1 an absolutely monotonic function? I.e. are the two synonyms? — Preceding unsigned comment added by 99.149.190.128 (talk) 20:59, 19 June 2012 (UTC)Reply

Strictly monotonic implies bijectivity?


I contest this statement. Imagine the following function: f:R → R with f(x) = x for x < 0 and f(x) = x+1 for x ≥ 0. This function is injective, but not surjective. -- David N. Jansen. — Preceding unsigned comment added by 131.174.142.225 (talk) 12:59, 11 September 2012 (UTC)Reply

Sloppy explanation


It needs to be clarified what is meant by a function "between" ordered sets. What is the domain and what is the codomain? Also, the article could use some inline citations. I find it surprising that "monotonic" doesn't refer equally to both monotonically increasing and monotonically decreasing. Isheden (talk) 08:25, 29 July 2012 (UTC)Reply

Here is an example of a better explanation: [1] Isheden (talk) 13:44, 25 April 2013 (UTC)Reply

A Function Can Be Both Increasing and Non-increasing?


Your definitions imply that a constant function is increasing, non-increasing, decreasing, and non-decreasing all at the same time. The definitions of increasing and decreasing conflict with calculus books I have consulted, e.g. Salas and Hille, as well as with common sense and English. — Preceding unsigned comment added by 98.213.32.21 (talk) 01:40, 25 April 2013 (UTC)Reply

Could you provide precise citations from calculus textbooks including page number? At present, there are no in-line citations that support the definitions in the article. Isheden (talk) 12:16, 25 April 2013 (UTC)Reply
From Salas and Hille, Calculus of One and Several Variables, 1982, Section 4.2 "Increasing and Decreasing Functions", Definition 4.2.1, p. 142: "A function f is said to (i) increase on the interval I iff for every two numbers x1, x2 in I, x1 < x2 implies f(x1) < f(x2). (ii) decrease on the interval I iff for every two numbers x1, x2 in I, x1 < x2 implies f(x1) > f(x2)." Also Section 12.1 "Sequences of Real Numbers", p. 491: "A sequence {an} is said to be (i) increasing iff an < an+1 for each positive integer n, (ii) nondecreasing iff an ≤ an+1 for each positive integer n, (iii) decreasing iff an > an+1 for each positive integer n, (iv) nonincreasing iff an ≥ an+1 for each positive integer n." See also James Stewart, Calculus Concepts and Contexts, 4th Edition, p. 560. There are certainly sufficient references to support these definitions, and Wikipedia has an opportunity to define these in a way that makes sense. If you want to talk about a function that increases or stays the same, we already have a perfectly good term in nondecreasing, so there is no reason to define increasing the same way. We all know what it means to increase. Constant functions don't increase. Non-increasing functions don't increase. To state otherwise is to perpetuate unnecessary confusion. — Preceding unsigned comment added by 98.213.32.21 (talk) 22:12, 1 May 2013 (UTC)Reply
Perhaps we should consider only the definitions of monotone functions at this stage and discuss sequences later (although the terminology is analogous). There are four terms for monotone functions with clear definitions: non-increasing, non-decreasing, strictly increasing, and strictly decreasing. I think the article should use this terminology to avoid confusion. After introducing these terms, it should be mentioned that some references define increasing to mean strictly increasing, whereas others define increasing to mean non-decreasing. Isheden (talk) 05:34, 2 May 2013 (UTC)Reply
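For reference, the four unambiguous conditions just listed, written out (my summary):

  \text{non-decreasing: } x \le y \Rightarrow f(x) \le f(y); \qquad \text{non-increasing: } x \le y \Rightarrow f(x) \ge f(y);
  \text{strictly increasing: } x < y \Rightarrow f(x) < f(y); \qquad \text{strictly decreasing: } x < y \Rightarrow f(x) > f(y).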

A constant function is increasing, decreasing, non-increasing, and non-decreasing all at once - that is just the way the terminology works out. A constant function is neither strictly increasing nor strictly decreasing. — Carl (CBM · talk) 12:32, 25 April 2013 (UTC)Reply

I checked a few calculus textbooks and it seems they often[2] [3] define "increasing function" to mean "strictly increasing function". Perhaps that should be mentioned in the article including a reference. In any case an in-line citation would be needed to support the terminology in the article. Isheden (talk) 12:38, 25 April 2013 (UTC)Reply
Your reference Bartle, Elements of Real Analysis, 1976, Section IV.20 "Local Properties of Continuous Functions", p. 145, problem 20.R contains your definition of increasing: "A function f defined on an interval I contained in R is said to be increasing on I if x ≤ x', x, x' in I imply that f(x) ≤ f(x'). It is said to be strictly increasing on I if x < x', x, x' in I imply that f(x) < f(x'). Similar definitions can be given for decreasing and strictly decreasing." Note though that a) this is an exercise, and b) he doesn't use the terms nondecreasing and nonincreasing. It would make sense that when these latter terms are used in the same reference as increasing and decreasing, all four terms are defined to have different meanings, as do Salas and Hille, whom I quoted above. — Preceding unsigned comment added by 98.213.32.21 (talk) 22:40, 1 May 2013 (UTC)Reply
Here are a few other references that define increasing function to mean either non-decreasing or strictly increasing: [4] [5] [6] [7] [8] [9] Isheden (talk) 05:53, 2 May 2013 (UTC)Reply

First sentence of the lead


Would it be possible to simplify the first sentence of the article to make it easier to understand for calculus students? E.g. "In mathematics, a real-valued function of a real variable is called monotonic (or monotone) if it is either increasing or decreasing." Or would it be better to change the present redirects Increasing function and Decreasing function to short articles that provide the basic definitions without mentioning ordered sets? Isheden (talk) 12:51, 25 April 2013 (UTC)Reply

By the way, I'd say the first sentence in the Section "Monotonicity in calculus and analysis" is confusing. That is the definition of an increasing (or non-decreasing) function. A monotone function may be either increasing or decreasing (as defined in the next sentence). At least that is standard terminology in calculus textbooks. Here are two references to support this statement: [10] [11] Isheden (talk) 13:05, 25 April 2013 (UTC)Reply

Too sophisticated


Like many science/technology Wiki pages, this one is way too complicated to be useful to non-specialists. Could someone do a Simple English version, getting across the basic ideas, e.g. that "monotonically increasing" means "always going up", or whatever it does mean, in simple layman's terms. — Preceding unsigned comment added by 86.159.201.166 (talk) 11:52, 17 September 2014 (UTC)Reply

Monotone, isotone, antitone


The definition stated in the introduction at this time is incorrect. "Isotone" means monotone increasing. "Antitone" means monotone decreasing. "Monotone" means either isotone or antitone. "Monotone" for a function of several variables can mean monotone in each variable, thus not necessarily monotone as a whole, since it could be isotone in one variable and antitone in another variable. It is true that "monotone" is too often used loosely (and incorrectly) to mean "monotone increasing", but that does not make "increasing" the correct standard definition. I will edit the article if I don't get good arguments not to do so. I await replies. Zaslav (talk) 21:53, 6 March 2016 (UTC)Reply
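An example of the several-variables point (my illustration, not Zaslav's): f(x, y) = x - y is isotone in x and antitone in y, but with the componentwise order on R^2 it is neither isotone nor antitone as a whole, since (0, 0) ≤ (1, 0) makes f rise from 0 to 1 while (0, 0) ≤ (0, 1) makes it fall from 0 to -1.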

Contribution


I contributed 50 to Wikipedia because it was made technically easy to do so. 178.38.132.48 (talk) 21:11, 5 December 2017 (UTC)Reply

Introduces unexplained notation, presumably from functional analysis


The section on monotonicity in functional analysis introduces this notation: (Tu - Tv, u - v) ≥ 0. Having almost no experience in functional analysis, what sense does it make to say that an (apparent) ordered pair (a, b) can be greater than or equal to 0? (a, b) >= 0? What does this mean? In this case, clearly it does not seem to mean that both a >= 0 and b >= 0. I can only guess that it means both a and b have the same sign, or are both zero. Or, perhaps, a*b >= 0. The main point is that this notation appears out of nowhere and is not explained. Also a little later on here: (w1 - w2, u1 - u2) ≥ 0. 24.57.106.253 (talk) 14:45, 12 May 2018 (UTC)Reply

In this context (w, u) is used to denote either the inner product of w and u, if they are elements of a Hilbert space, or the application of w to u, i.e. w(u), if u is an element of a topological vector space and w is an element of the dual of that space, i.e. a (usually continuous) linear functional on that space. Cactus0192837465 (talk) 13:58, 29 January 2019 (UTC)Reply
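A concrete instance, assuming the pairing is the inner product of a real Hilbert space (my illustration): on H = R^n, take T(u) = Au for a positive semidefinite matrix A; then

  (Tu - Tv, u - v) = (u - v)^T A (u - v) ≥ 0,

so T is a monotone operator in this sense. The scalar case n = 1 with A = a ≥ 0 is just a non-decreasing linear function.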

Order-reflecting


Order-reflecting, which is linked in Order_embedding, redirects to order-preserving. The term order-reflecting however does not appear in this article, and in Order_embedding it does not mean the same as order-preserving, but the converse: if f(x) < f(y) then x < y. - 17:01, 4 June 2021‎ 2001:9e8:e256:6900:468a:5bff:fe9a:57e7

I fixed the redirect Order-reflecting, it now points to Order theory#Functions between orders. - Jochen Burghardt (talk) 15:44, 4 June 2021 (UTC)Reply

Boolean functions


@129.125.75.136 and 2001:1C01:3DC9:F100:C559:8A4D:E9F3:397: I agree that the definition of monotonicity for Boolean functions restricts them to monotonic nondecreasing functions (i.e. without "negative slopes"). However, I still don't understand what "completely monotonic" should mean; the explanation "being a discrete analogue of the continous definition" doesn't help (the monotonicity definitions don't use any notion of continuity, do they?). Can you please elaborate on the latter issue? - Jochen Burghardt (talk) 12:45, 17 August 2022 (UTC)Reply

Loosely speaking, it is when all possible parallel partitions of a Boolean function are monotonic. For example, take F(A,B,C,D)=AB+CD and consider the parallel partitions p1(B,C)=F(1,B,C,0)=B and p2(B,C)=F(0,B,C,1)=C. Now, p1(B,C)-p2(B,C) can be both positive and negative and hence these partitions cannot be ordered, implying that the function is not completely monotonic. A more precise definition is given in the technical references I shared. Please undo your deletion, so that we can add more on the notion of complete monotonicity. By the way, there is no standard definition of monotonicity in multiple dimensions (real analysis - Monotonicity of function of two variables - Mathematics Stack Exchange). 2001:1C01:3DC9:F100:147:D8C7:698F:9194 (talk) 17:05, 17 August 2022 (UTC)Reply
I have no general objection against adding text about "complete monotonicity"; my reverts were just triggered by missing definitions (cf. my edit summaries). I would have looked them up myself, but the 3 reliable sources you gave are not accessible to me, while YouTube is hardly reliable. So I'd ask you to add the definitions yourself, before you use them.
From your example, I get a vague impression about complete monotonicity, but I still have a problem: both your "parallel partitions" p1 and p2 are monotonic, so they aren't a counter-example to the definition in your 1st sentence. That sentence doesn't require that they can be ordered.
As for monotonicity of multi-argument functions, this notion is covered by the order-theoretic definition (Monotonic_function#In_order_theory), which is the most general definition; all earlier definitions are special cases. I guess, for this reason, somebody has put the definition of coordinatewise ordering at the start of section Monotonic_function#In_Boolean_functions. - Jochen Burghardt (talk) 17:46, 18 August 2022 (UTC)Reply
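For the coordinatewise definition referred to above, a brute-force check can make the notion concrete. This is my own sketch (the function name is made up), and it only covers plain monotonicity, not the proposed "complete" variant:

from itertools import product

def is_monotone(f, n):
    # f: Boolean function of n arguments in {0, 1}; returns True iff
    # x <= y coordinatewise always implies f(x) <= f(y).
    for x in product((0, 1), repeat=n):
        for y in product((0, 1), repeat=n):
            if all(xi <= yi for xi, yi in zip(x, y)) and f(*x) > f(*y):
                return False
    return True

# F(A,B,C,D) = AB + CD from the example above is monotone in this sense:
print(is_monotone(lambda a, b, c, d: (a and b) or (c and d), 4))  # True
# A XOR B is not:
print(is_monotone(lambda a, b: a ^ b, 2))  # False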

Sequences


"Decreasing sequence" redirects here but this page has nothing to say about decreasing sequences. 159.196.132.1 (talk) 07:59, 2 January 2024 (UTC)Reply

No, Decreasing sequence redirects to sequence. - Jochen Burghardt (talk) 08:38, 2 January 2024 (UTC)Reply