The short answer: Yes, if your Gaussian Process (GP) is differentiable, its derivative is again a GP. It can be handled like any other GP and you can calculate predictive distributions.
But since a GP G and its derivative G′ are closely related, you can infer properties of either one from the other.
1. Existence of G′
A zero-mean GP with covariance function K is differentiable (in mean square) if K′(x₁, x₂) = ∂²K/(∂x₁∂x₂)(x₁, x₂) exists. In that case the covariance function of G′ is equal to K′. If the process is not zero-mean, the mean function needs to be differentiable as well; the mean function of G′ is then the derivative of the mean function of G.
(For more details see, for example, Appendix 10A of A. Papoulis, "Probability, Random Variables and Stochastic Processes".)
Since the Gaussian (squared) exponential kernel is infinitely differentiable, this is no problem in your case.
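To make point 1 concrete, here is a small sketch for the squared exponential kernel K(x₁, x₂) = exp(−(x₁−x₂)²/(2ℓ²)): differentiating once in each argument gives K′(x₁, x₂) = (1/ℓ² − (x₁−x₂)²/ℓ⁴) K(x₁, x₂), which the code below checks against a finite-difference approximation (the lengthscale `ell` and the test points are arbitrary illustrative choices):

```python
import numpy as np

def k(x1, x2, ell=1.0):
    # Squared exponential (RBF) kernel.
    return np.exp(-(x1 - x2) ** 2 / (2 * ell ** 2))

def k_prime(x1, x2, ell=1.0):
    # Analytic d^2 K / (dx1 dx2): the covariance function of the derivative GP.
    r = x1 - x2
    return (1.0 / ell ** 2 - r ** 2 / ell ** 4) * k(x1, x2, ell)

# Central finite-difference approximation of the mixed partial d^2K/(dx1 dx2).
h = 1e-4
x1, x2 = 0.3, 1.1
fd = (k(x1 + h, x2 + h) - k(x1 + h, x2 - h)
      - k(x1 - h, x2 + h) + k(x1 - h, x2 - h)) / (4 * h ** 2)
print(abs(fd - k_prime(x1, x2)))  # difference is tiny (finite-difference error only)
```

The same recipe works for any twice-differentiable kernel; only the analytic expression for K′ changes.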
2. Predictive distribution for G′
This is straightforward if you just want to condition on observations of G′: if you can calculate the respective derivatives, you know the mean and covariance function, so you can do inference in the same way as with any other GP.
But you can also derive a predictive distribution for G′ based on observations of G. You do this by calculating the posterior of G given your observations in the standard way and then applying the result in point 1 to the mean and covariance function of the posterior process.
This works the same way in the other direction, i.e. you condition on observations of G′ to infer a posterior of G. In that case the covariance function of G is given by integrals of K′ and may be hard to calculate, but the logic is the same.
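A minimal sketch of the G → G′ direction for the squared exponential kernel: since (G(X), G′(x*)) is jointly Gaussian with cross-covariance ∂K(x*, x)/∂x*, the posterior mean and covariance of G′ follow from the standard conditioning formulas. The lengthscale `ell`, the noise level, and the toy sin data (whose true derivative is cos) are illustrative assumptions, not part of the answer above:

```python
import numpy as np

ell, noise = 1.0, 1e-2

def k(a, b):
    # Squared exponential kernel, evaluated pairwise between 1-d input arrays.
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * ell ** 2))

def dk_dx1(a, b):
    # d/da K(a, b): the cross-covariance cov(G'(a), G(b)).
    r = a[:, None] - b[None, :]
    return -(r / ell ** 2) * np.exp(-r ** 2 / (2 * ell ** 2))

def d2k(a, b):
    # d^2/(da db) K(a, b): the covariance function of the derivative process.
    r = a[:, None] - b[None, :]
    return (1 / ell ** 2 - r ** 2 / ell ** 4) * np.exp(-r ** 2 / (2 * ell ** 2))

# Toy data: noisy observations of G = sin, whose derivative is cos.
X = np.linspace(0, 2 * np.pi, 20)
y = np.sin(X) + noise * np.random.default_rng(0).standard_normal(20)

Xs = np.linspace(0, 2 * np.pi, 5)  # test inputs for G'
Ky = k(X, X) + noise ** 2 * np.eye(len(X))
mean_dG = dk_dx1(Xs, X) @ np.linalg.solve(Ky, y)               # posterior mean of G'
cov_dG = d2k(Xs, Xs) - dk_dx1(Xs, X) @ np.linalg.solve(Ky, dk_dx1(Xs, X).T)
print(mean_dG)   # should track cos(Xs)
```

The reverse direction (observing G′ and inferring G) swaps the roles of the cross-covariance blocks and, as noted above, involves antiderivatives of the kernel instead.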