How can I get a time series graph of the derivative of a data set using Grafana and InfluxDB

I have a process which loads RXBYTES and TXBYTES from a Linux server's interface info every 5 seconds. I would like to create a graph in Grafana which shows just the difference between each pair of data points, i.e. (target point - previous point) / time interval. It looks like the derivative() function in InfluxDB should do exactly this, but I cannot get it to work. The query I built in Grafana is like this:

    select derivative(value) from "stats.bandwidth.home.br0.rx.gauge" where time > now() - 1h group by time(10s) order asc

The results of that query …
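For reference, the per-point rate the question is after can be sketched in plain Python. This is a hypothetical helper for illustration only, not how InfluxDB's derivative() is implemented:

```python
# Sketch of the rate the question asks for: (current - previous) / interval.
# 'counters' is a hypothetical list of RXBYTES samples taken every 5 seconds.
def rate(counters, interval_s=5.0):
    return [(b - a) / interval_s for a, b in zip(counters, counters[1:])]

samples = [1000, 1500, 2100, 2100]   # byte counters, 5 s apart
print(rate(samples))                 # -> [100.0, 120.0, 0.0] bytes/second
```

Note that for a monotonically increasing byte counter, a counter reset would produce a negative spike with this naive formula.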

(Openmdao 2.4.0) difference between providing no derivatives / forcing FD on disciplines with derivatives

This question is in line with this one, but it is not the same. The objective is still educational! Still playing with the Sellar problem, I compared two different problems: problem 1 is the MDA of Sellar without derivative information on the Disciplines, with a Newton solver as the NonlinearSolver; problem 2 is the MDA of Sellar with derivative information on the Disciplines, with a Newton solver as the NonlinearSolver, but with declare_partials('*', '*', method='fd') on each discipline at the problem level. For both, the linear solver is the same and both c…
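The difference between the two setups comes down to where the partials come from: problem 1 gives the Newton solver no analytic partials at all, while problem 2 declares partials but asks for them to be filled in by finite differences. What method='fd' does conceptually can be sketched in plain Python (this is an illustration, not OpenMDAO code):

```python
# Forward-difference approximation of df/dx -- the kind of estimate a
# finite-difference option substitutes for an analytic partial.
def fd_partial(f, x, step=1e-6):
    return (f(x + step) - f(x)) / step

f = lambda v: v ** 2          # toy "discipline" output
analytic = 2.0 * 3.0          # exact df/dx at x = 3
approx = fd_partial(f, 3.0)
print(abs(approx - analytic) < 1e-4)   # -> True: FD agrees to ~step size
```

The FD estimate carries truncation error proportional to the step, which is one reason call counts and convergence can differ between the two setups.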

derivative - (Openmdao 2.4.0) 'compute_partials' function of a Component seems to be run even when forcing 'declare_partials' to FD for this component

I want to solve the Sellar MDA using a Newton nonlinear solver for the Group. I have defined Disciplines with derivatives (using compute_partials), but I want to check the number of calls to each discipline's compute and compute_partials when forcing, or not forcing, the disciplines to skip their analytical derivatives (using declare_partials in the Problem definition). The problem is that compute_partials seems to still be called even though I force it not to be used. Here is an example (Sellar). For Discipline 2, I add a counter an…
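One portable way to count such calls, independent of any framework's internals, is a counter incremented inside the method itself. A hypothetical minimal class (not the actual Sellar discipline) illustrates the pattern:

```python
class Discipline:
    """Toy stand-in for a component, counting calls to its two methods."""
    def __init__(self):
        self.n_compute = 0
        self.n_compute_partials = 0

    def compute(self, x):
        self.n_compute += 1
        return x ** 2

    def compute_partials(self, x):
        self.n_compute_partials += 1
        return 2.0 * x

d = Discipline()
d.compute(1.0)
d.compute(2.0)
d.compute_partials(2.0)
print(d.n_compute, d.n_compute_partials)   # -> 2 1
```

With counters like these on every method, any unexpected call to compute_partials shows up immediately in the totals.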

openmdao - Derivative check with scalers

I have a problem where I want to scale the design variables. I have added the scaler, but I want to check the derivative to make sure it is doing what I want it to do. Is there a way to check the scaled derivative? I have tried to use check_total_derivatives(), but the derivative is exactly the same regardless of what value I put for the scaler:

    from openmdao.api import Component, Group, Problem, IndepVarComp, ExecComp
    from openmdao.drivers.pyoptsparse_driver import pyOptSparseDriver

    class Scaling(Component):
        def __init__(self):
            super(Scaling, s…
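What the asker expects follows from the chain rule: under the convention that the driver works with x_scaled = scaler * x (an assumption about the scaling convention, stated here so the sketch is self-contained), the derivative seen in the scaled space is the unscaled one divided by the scaler. A plain-Python illustration:

```python
def df_dx(x):
    # unscaled derivative of the toy objective f(x) = x**2
    return 2.0 * x

scaler = 10.0              # driver works with x_scaled = scaler * x
x = 3.0
# chain rule: df/dx_scaled = df/dx * dx/dx_scaled = df/dx / scaler
df_dx_scaled = df_dx(x) / scaler
print(df_dx_scaled)        # -> 0.6, versus the unscaled value 6.0
```

So if a derivative check reports identical values for every scaler, it is reporting derivatives in the unscaled (physical) space.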

derivative - How to differentiate an angle with respect to time in MuPAD

So I have a pretty nasty function with sines and cosines that represents the position of some point in a certain system. Now that I know the location of the point as a function of the angle Beta, I wish to differentiate the function to find the speed. The problem is that MuPAD treats Beta as a constant when you try to differentiate with respect to time. Obviously the derivative of Beta is the angular velocity, but how do I tell this to MuPAD? This is the code I have so far:

    reset();
    eq := (a/cos(Beta))^2 = (a/cos(Alpha))^2 + d^2 - 2*a/cos(Alpha)*d*sin(Alpha);
    Ex := -a + Lb*cos(Beta);
    a := s…
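The underlying calculus here is just the chain rule: if the angle is a function of time, $\beta = \beta(t)$, then for any position expression $f$,

$$\frac{d}{dt}\,f\bigl(\beta(t)\bigr) \;=\; f'\bigl(\beta(t)\bigr)\,\frac{d\beta}{dt} \;=\; f'\bigl(\beta(t)\bigr)\,\omega(t),$$

where $\omega$ is the angular velocity. So the CAS has to be told that Beta depends on t (typically by writing it as a function of t, e.g. Beta(t)) before the time derivative produces the $\omega$ term instead of zero.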

derivative - Can I enhance one LGPL library based on the implementation of another?

I was wondering whether it is legal, or frowned upon, to base enhancements to one LGPL library on the functionality of another LGPL library. Note that because of the method of implementation, the source code could not be built upon directly; the general idea is essentially to implement similar functionality in one library based on the functionality of the original library, without copying the implementation or directly using the other library. An example of what I'm thinking of, where both libraries are covered by the LGPL: Library 1…

Bilinear Transform (Tustin's Method) applied to the Derivative

I hope that I have not misunderstood something terribly, but the continuous derivative $D = d/dt$ can be considered a transfer function in Laplace space, $D(s) = s$, right? So when I try to discretize it using the bilinear transform (Tustin's method) I trivially get

$$D(z) = \frac{2}{T} \frac{1 - z^{-1}}{1 + z^{-1}}.$$

When I apply this to a series containing one discrete impulse, the response oscillates at the Nyquist frequency. Even worse, the spectrum around $\omega = 0$ is quadratic and not $\sim i\omega$ as would be expected from the derivative…
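The Nyquist-rate oscillation can be reproduced directly from the difference equation implied by $D(z)$, namely $y[n] = \frac{2}{T}\,(x[n] - x[n-1]) - y[n-1]$ (cross-multiplying $Y(1 + z^{-1}) = \frac{2}{T} X (1 - z^{-1})$). A quick numeric sketch, assuming $T = 1$:

```python
# Impulse response of the Tustin differentiator
# y[n] = (2/T) * (x[n] - x[n-1]) - y[n-1], with T = 1.
T = 1.0
x = [1.0] + [0.0] * 7          # discrete impulse
y, y_prev, x_prev = [], 0.0, 0.0
for xn in x:
    yn = (2.0 / T) * (xn - x_prev) - y_prev
    y.append(yn)
    x_prev, y_prev = xn, yn
print(y)   # -> [2.0, -4.0, 4.0, -4.0, 4.0, -4.0, 4.0, -4.0]
```

The sign alternates on every sample, i.e. the response rings at the Nyquist frequency, because $D(z)$ has a pole at $z = -1$.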

Matrix Representation of Softmax Derivatives in Backpropagation

I have a simple multilayer fully connected neural network for classification. At the last layer I have used the softmax activation function, so I have to propagate the error through the softmax layer. Suppose I have 3 softmax units at the output layer. The input to these 3 logits can be described by the vector $z = \begin{pmatrix} z_1 \\ z_2 \\ z_3 \end{pmatrix}$, and let's say those 3 units output $y = \begin{pmatrix} y_1 \\ y_2 \\ y_3 \end{pmatrix}$. Now I want to calculate $\frac{\partial y}{\partial z}$, which is simply $\frac{\partial}{\…
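The Jacobian being asked for has the well-known closed form $\frac{\partial y_i}{\partial z_j} = y_i(\delta_{ij} - y_j)$, i.e. $\operatorname{diag}(y) - y y^{\top}$. A small pure-Python sketch of that matrix:

```python
import math

def softmax(z):
    m = max(z)                       # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def softmax_jacobian(z):
    y = softmax(z)
    # dy_i/dz_j = y_i * (delta_ij - y_j)
    return [[y[i] * ((1.0 if i == j else 0.0) - y[j])
             for j in range(len(y))] for i in range(len(y))]

J = softmax_jacobian([1.0, 2.0, 3.0])
# each row sums to ~0, since the outputs always sum to 1
print(all(abs(sum(row)) < 1e-12 for row in J))   # -> True
```

The matrix is symmetric ($y_i \delta_{ij} - y_i y_j$), and every row sums to zero because perturbing any logit cannot change the total probability mass.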