
Suppose I have $f(x) = 5x$.

I know that $\frac{d\ f(x)}{dx} = 5$.

But what is $\frac{d f(x)}{d 5}$ , the derivative of the function $f$ with respect to the constant 5?

The reason I ask is that I'm implementing software that computes auto-differentiation (a la TensorFlow). I want to know if I can treat a constant like a variable (as above) or if I have to do something else. This Stanford deep learning class webpage is what I'm referring to:

$$ f_c(x) = c+x \\ f_a(x) = ax $$
Where the functions $f_c$, $f_a$ translate the input by a constant of $c$ and scale the input by a constant of $a$, respectively. These are technically special cases of addition and multiplication, but we introduce them as (new) unary gates here since we do not need the gradients for the constants $c$, $a$.

That above statement implies that you could compute the derivative w.r.t. a constant, but they chose not to.

This post did not answer my question: derivative with respect to constant.

Thanks.


4 Answers


The derivative of a constant with respect to a variable is $0$, but the derivative of a function with respect to a constant, as Fra mentioned in the comments, is ill-defined.

EDIT

The question has been updated. The link provided in the question discusses functions

$$f_c(x) = c + x \quad\text{and}\quad f_a(x) = ax$$

The link also indicates their derivatives are:

$$\frac{df_c}{dx}=1 \quad\text{and}\quad \frac{df_a}{dx}=a,$$
respectively, as expected.

These derivatives are still with respect to $x$, not the constants $c$ or $a$. The confusion might have arisen because the letter $c$ in $f_c(x)$ gives the impression that this is a function of $c$, which is not the case. The same argument applies to $f_a(x)$.


It helps to know about the derivation operator $D_t$, which satisfies
$$ D_t[u+v] = D_t[u]+D_t[v],\quad D_t[u\,v] = D_t[u]\,v + u\,D_t[v].\tag{1} $$
For example, use equation $(1)$ to get
$$ D_t[m\,x+b] = D_t[m]\,x + m\,D_t[x] + D_t[b].\tag{2} $$
You asked

I want to know if I can treat a constant like a variable

In the context of symbolic differentiation the answer is yes with $\,D_t$: if you later want to assume that some symbol $\,c\,$ is a constant, you just set $\,D_t[c] = 0.\,$

Mathematica has a function Dt which implements the total differential.
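As a sketch of this approach in Python (using the sympy library, which is assumed to be available here, not part of the original answer): treat the "constant" as an ordinary symbol, differentiate with respect to it, and only afterwards fix its value.

```python
# Minimal sketch with sympy (assumed available): a "constant" is just
# another symbol until we decide otherwise by substituting a value.
import sympy as sp

c, x = sp.symbols('c x')
f = c * x                    # stands in for f(x) = 5x, with c playing the role of 5

df_dx = sp.diff(f, x)        # derivative w.r.t. the variable -> c
df_dc = sp.diff(f, c)        # derivative w.r.t. the "constant" -> x

print(df_dx)                 # c
print(df_dc)                 # x
print(df_dx.subs(c, 5))      # 5, once the constant is pinned down
```

This mirrors the $D_t$ picture: nothing in the algebra forces $c$ to be constant; that choice is applied at the end.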


A derivative is defined by a limit
\begin{equation} \frac{df}{dx}(x_0)=\lim_{x\rightarrow x_0}\frac{f(x)-f(x_0)}{x-x_0}, \end{equation}
which measures how $f$ changes when $x$ changes. If $x$ is a constant, then $x$ never changes, so $f$ "never changes" either, and the notion loses its meaning. Put another way (mathematically), the limit does not exist because the denominator is $0$ for all $x$.


In the context of computation graphs used for auto-differentiation (like in TensorFlow) for neural network back-propagation, constants (e.g. 5) can be treated like an input variable.

In the case of the original example,

$$ f(x) = 5x $$
Then
$$ \frac{d f(x)}{d 5} = x $$

Note that in practice, while $\frac{d f(x)}{d 5}$ is computable, you would never need the result, because the point of auto-differentiation for backpropagation is to update the weight variables, not the constants.

