Welcome to the forum.

Thanks for your method. This is a very old post, so the OP may no longer be interested, but it's useful to have your contribution anyway.

Bob

Here's an outline of the method:

Define your starting point, which is the y-axis intercept.

Determine the x-range over which your function will operate.

Divide the x-range into segments based on the constraints you have. Each segment will correspond to a different rate of change.

Calculate the slope for each segment, which represents the rate of change for that segment. The slope should be such that the segment stays within the given minimum and maximum constraints.

Create linear equations for each segment using the slope and the corresponding x-values.

Combine all the linear equations to form a piecewise function that satisfies the constraints and has zero overall rate of change.
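The steps above can be sketched in Python. This is only a sketch with hypothetical helper names (`build_piecewise`, `evaluate`), assuming the constraint points are interpolated in order and that a final segment returns the function to its starting y-value so the net change over the x-range is zero:

```python
def build_piecewise(start, x_end, constraints):
    """Build a continuous piecewise-linear function that starts at
    `start`, passes through each (x, y) constraint point in order of x,
    and returns to the starting y-value at x_end (zero net change)."""
    x0, y0 = start
    # Closing the curve at (x_end, y0) is what forces the overall
    # rate of change across the x-range to be zero.
    points = [start] + sorted(constraints) + [(x_end, y0)]
    segments = []
    for (xa, ya), (xb, yb) in zip(points, points[1:]):
        slope = (yb - ya) / (xb - xa)   # rate of change on [xa, xb]
        intercept = ya - slope * xa     # y = slope * x + intercept
        segments.append((xa, xb, slope, intercept))
    return segments

def evaluate(segments, x):
    """Evaluate the piecewise-linear function at x."""
    for xa, xb, slope, intercept in segments:
        if xa <= x <= xb:
            return slope * x + intercept
    raise ValueError("x outside the function's range")

# Using the data from the example in this thread:
# start (0, 10), x-range [0, 10], constraint points (2, 8) and (8, 12).
segs = build_piecewise((0, 10), 10, [(2, 8), (8, 12)])
print(evaluate(segs, 0))   # 10.0
print(evaluate(segs, 2))   # 8.0
print(evaluate(segs, 10))  # 10.0  (back at the start: zero net change)
```

Note that this sketch treats the constraints as points the function must pass through; if they are instead bounds to stay within, the segment endpoints would be chosen differently.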

Here's an example to illustrate the process:

Let's say we have the following constraints:

Starting point: (0, 10)

x-range: [0, 10]

Minimum constraint: (2, 8)

Maximum constraint: (8, 12)

To construct the function, we'll divide the x-range into three segments: [0, 2], [2, 8], and [8, 10]. The third segment brings the function back to its starting value, so the overall rate of change is zero.

For the first segment, from (0, 10) to (2, 8), the slope is:

slope1 = (8 - 10) / (2 - 0) = -1

The linear equation for the first segment becomes:

y1 = -x + 10

For the second segment, from (2, 8) to (8, 12), the slope is:

slope2 = (12 - 8) / (8 - 2) = 2/3

The linear equation for the second segment becomes:

y2 = (2/3)x + 20/3

(Check: at x = 2 this gives y = 8 and at x = 8 it gives y = 12, so the segment joins the constraint points continuously.)

For the third segment, from (8, 12) back to (10, 10), the slope is:

slope3 = (10 - 12) / (10 - 8) = -1

The linear equation for the third segment becomes:

y3 = -x + 20

Combining the segments, we get the piecewise function:

f(x) = {

-x + 10, 0 <= x <= 2

(2/3)x + 20/3, 2 < x <= 8

-x + 20, 8 < x <= 10

}

This function passes through both constraint points, is continuous, and returns to y = 10 at x = 10, so the overall rate of change across the x-range is zero.
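A quick way to verify the zero-net-change property for any candidate set of segments is to check that the slopes, weighted by their segment widths, sum to zero. Here that check is done with exact rational arithmetic for segments through the assumed points (0, 10), (2, 8), (8, 12), and (10, 10):

```python
from fractions import Fraction

# Segments as (x_start, x_end, slope), using exact arithmetic so
# the 2/3 slope does not introduce floating-point error.
segments = [(0, 2, Fraction(-1)), (2, 8, Fraction(2, 3)), (8, 10, Fraction(-1))]

# The total change over the x-range is the sum of slope * width.
net_change = sum(slope * (xb - xa) for xa, xb, slope in segments)
print(net_change)  # 0  (the increases and decreases cancel exactly)
```

This condition (sum of slope times width equals zero) is what any valid set of segments must satisfy, regardless of how many segments the constraints require.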

You can modify this approach to accommodate different sets of constraints by adjusting the number of segments and calculating the slopes accordingly.

Welcome to the forum. What does the data look like? What kind of regression model do you want?

For me to do anything with your problem I am going to need more information.

I am trying to construct a general function/method based on two sets of minimum/maximum data point constraints, which can take on new values in different situations. The only known data for this general function is the starting point (y-axis intercept) and the x-range. The rate of change over time must equal zero, so any increase or decrease must be compensated for within the x-range. Optimally, the function will vary as little as possible while still meeting the constraints.

I would appreciate any advice or suggestions on how to approach this problem as I've had very little success so far. Thank you in advance.
