I am trying to construct a general function/method based on two sets of minimum/maximum data-point constraints, which can take on new values in different situations. The only known data for this general function are the starting point (the y-intercept) and the x-range. The net rate of change over the range must equal zero, so any amount increased/decreased must be compensated by an equal change within the x-range. Ideally, the function should vary as little as possible while still meeting the constraints.
I would appreciate any advice or suggestions on how to approach this problem as I've had very little success so far. Thank you in advance.
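To make the question concrete, here is a minimal sketch of one way I imagine it could be set up, assuming the function is discretized on a grid over the x-range, that "net rate of change equal zero" means the curve ends where it starts, and that the two constraint sets are per-point lower/upper bounds (`lo`, `hi`). The function name `flattest_curve` and the use of SLSQP are my own assumptions, not a known method:

```python
import numpy as np
from scipy.optimize import minimize

def flattest_curve(y0, lo, hi):
    """Sketch: find a discretized curve that starts and ends at y0
    (zero net change over the x-range), stays within [lo, hi] at every
    grid point, and varies as little as possible (minimal sum of
    squared successive differences)."""
    lo = np.asarray(lo, dtype=float)
    hi = np.asarray(hi, dtype=float)
    n = len(lo)

    # Start from the constant curve clamped into the bounds.
    x0 = np.clip(np.full(n, y0), lo, hi)

    def roughness(y):
        # Penalize variation between neighboring points.
        return np.sum(np.diff(y) ** 2)

    cons = [
        {"type": "eq", "fun": lambda y: y[0] - y0},   # known y-intercept
        {"type": "eq", "fun": lambda y: y[-1] - y0},  # zero net change
    ]
    res = minimize(roughness, x0, method="SLSQP",
                   bounds=list(zip(lo, hi)), constraints=cons)
    return res.x

# Example: start at 1, but the bounds force the curve up to 2 in the
# middle; the solver brings it back down so the net change is zero.
y = flattest_curve(1.0, lo=[0, 2, 2, 0, 0], hi=[5, 5, 5, 5, 5])
```

Is a formulation along these lines (a constrained minimization of the variation) the right direction, or is there a more direct construction?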
Welcome to the forum. What does the data look like? What kind of regression model do you want?
For me to do anything with your problem, I am going to need more information.