Here is a funny function that I couldn’t wrap my mind around as a freshman:

f(x) = x + 2x²·sin(1/x) if x ≠ 0, and f(0) = 0.

It is both continuous and differentiable, and the derivative at zero is 1, because:

f′(0) = lim_{h→0} (f(h) − f(0))/h = lim_{h→0} (1 + 2h·sin(1/h)) = 1,
but at the same time – and that is what I found so hard to believe – it is not increasing on any neighborhood of zero, however small; it isn’t even monotonic there. When I think about it now, it seems clear that it has no reason to be, unless it is continuously differentiable, as opposed to merely differentiable. But back then I thought that if the derivative exists and is positive, then the function must be going up, and if it is going up, then there must be an interval on which it does so. This last ‘if-then’ is where I was wrong. No, there doesn’t have to be.
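This is easy to poke at numerically. A minimal sketch, using the standard example of this phenomenon, f(x) = x + 2x²·sin(1/x) with f(0) = 0 (an assumed instance – any function of this family behaves the same way):

```python
import math

def f(x):
    # Classic counterexample of this kind (an assumption here):
    # f(x) = x + 2x^2*sin(1/x) for x != 0, and f(0) = 0.
    return x + 2 * x**2 * math.sin(1 / x) if x != 0 else 0.0

def f_prime(x):
    # For x != 0, the usual rules give f'(x) = 1 + 4x*sin(1/x) - 2*cos(1/x).
    return 1 + 4 * x * math.sin(1 / x) - 2 * math.cos(1 / x)

# The difference quotient at zero tends to 1 ...
h = 1e-9
print((f(h) - f(0)) / h)  # very close to 1

# ... yet at x_k = 1/(2*pi*k) we get sin(1/x_k) = 0 and cos(1/x_k) = 1,
# so f'(x_k) is about 1 + 0 - 2 = -1, no matter how large k is,
# i.e. arbitrarily close to zero.
for k in (1, 1000, 10**6):
    print(f_prime(1 / (2 * math.pi * k)))  # approximately -1
```

So the derivative exists everywhere and is positive at zero, but it keeps dipping down to −1 arbitrarily close to zero – it is not continuous at zero, which is exactly the loophole.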
‘Alright’, you say, ‘but what does it look like?’
The picture makes it easy to see that it is indeed increasing at zero, in the sense that all the values to the right of zero are greater than zero, while all the values to the left are less than zero, but at the same time, no matter how close to zero you get, there will always be some tiny intervals where the function is decreasing. But there will also be equally tiny intervals where the function is increasing, and this is what does the trick.
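Those alternating tiny intervals can be pinned down explicitly (again assuming the standard example f(x) = x + 2x²·sin(1/x)): near points where 1/x = 2kπ the derivative is about −1, near points where 1/x = (2k+1)π it is about 3, and both kinds of points show up in every interval (0, ε), however small:

```python
import math

def f_prime(x):
    # Derivative of the assumed example f(x) = x + 2x^2*sin(1/x) for x != 0:
    # f'(x) = 1 + 4x*sin(1/x) - 2*cos(1/x).
    return 1 + 4 * x * math.sin(1 / x) - 2 * math.cos(1 / x)

for eps in (1e-2, 1e-5, 1e-8):
    # Pick k large enough that both sample points land inside (0, eps).
    k = math.ceil(1 / (2 * math.pi * eps)) + 1
    down = 1 / (2 * math.pi * k)      # cos(1/x) = 1 here, so f' is about -1
    up = 1 / ((2 * k + 1) * math.pi)  # cos(1/x) = -1 here, so f' is about 3
    assert down < eps and up < eps
    print(eps, f_prime(down), f_prime(up))
```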
Now, if the function were continuously differentiable, such an anomaly could not occur: from the fact that the derivative is positive at one point it would follow that there exists an interval on which it is positive, and on that interval our function would be monotonic, just as you would expect – but where would the fun be in that?
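That step can be written out in full:

```latex
% Continuity of f' at x_0 with f'(x_0) > 0 gives a \delta > 0 such that
%   |x - x_0| < \delta \implies f'(x) > \tfrac{1}{2} f'(x_0) > 0.
% Then for any a, b with x_0 - \delta < a < b < x_0 + \delta, the mean
% value theorem produces some c \in (a, b) with
%   f(b) - f(a) = f'(c)\,(b - a) > 0,
% so f is strictly increasing on (x_0 - \delta,\ x_0 + \delta).
```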