Pioran is a Julia package to estimate the bending power-law power spectrum of time series. The method uses Gaussian process regression with the fast celerite algorithm of Foreman-Mackey et al. 2017. The bending power-law model is approximated using basis functions, as shown in the figure below:
The method is described in Lefkir et al. 2025, where it is used to model the random flux variability observed in active galaxies.
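For orientation, the two central objects of the package are the power-spectrum model and its basis-function approximation, which turns the PSD into a covariance function. A minimal sketch, with arbitrary parameter values chosen only for illustration:

```julia
using Pioran

# Single bending power-law PSD: low-frequency slope α₁, bend frequency f₁, high-frequency slope α₂
𝓟 = SingleBendingPowerLaw(0.3, 1e-2, 3.2)

# Approximate the PSD between f_min and f_max with 20 basis functions to obtain a covariance function
# (the fifth argument, 1.0 here, is the variance of the process)
𝓡 = approx(𝓟, 1e-3, 1e1, 20, 1.0, basis_function = "DRWCelerite")
```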
The package can be installed with:

```julia
using Pkg; Pkg.add("Pioran")
```

Read the documentation here: https://mlefkir.github.io/Pioran.jl/stable/.
Example scripts are provided in the examples directory. To infer the parameters of the power spectrum, I use either Turing.jl for Hamiltonian Monte Carlo or the Python library ultranest for nested sampling.
Here is a quick example of how to use it with ultranest. I assume you have installed PyCall and ultranest following the guide in the documentation.
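If they are not set up yet, one possible route is to install ultranest into the Conda environment used by PyCall; this is only a sketch, assuming PyCall is built against its default Conda.jl Python, and the guide in the documentation remains the reference:

```julia
using Pkg
Pkg.add(["PyCall", "Conda"])

using Conda
Conda.add("ultranest"; channel = "conda-forge")  # ultranest is packaged on conda-forge
```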
Assuming you have a Gaussian time series y at times t with error bars σ, the GP can be built using a function as follows:
```julia
using Pioran, Distributions

function GP_model(t, y, σ, params, n_components = 20, basis_function = "DRWCelerite")

    T = (t[end] - t[1]) # duration of the time series
    Δt = minimum(diff(t)) # min time separation
    f_min, f_max = 1 / T, 1 / Δt / 2

    # Get the parameters
    α₁, f₁, α₂, variance, ν, μ = params

    # Rescale the measurement variance
    σ² = ν .* σ .^ 2

    # Define the power spectral density function
    𝓟 = SingleBendingPowerLaw(α₁, f₁, α₂)

    # Approximate the PSD to form a covariance function
    𝓡 = approx(𝓟, f_min, f_max, n_components, variance, basis_function = basis_function)

    # Build the GP
    GP = ScalableGP(μ, 𝓡)

    # Return the GP conditioned on the observation times and rescaled measurement variances
    return GP(t, σ²)
end
```

The log-likelihood can be obtained using the logpdf function from Distributions.jl:
```julia
function loglikelihood(t, y, σ, params)
    GP = GP_model(t, y, σ, params)
    return logpdf(GP, y)
end

logl(pars) = loglikelihood(t, y, σ, pars)
```

We use distributions from Distributions.jl to define the priors for nested sampling. For this example, we can have:
```julia
x̄ = mean(y)  # sample mean of the time series, used in the prior on μ
va = var(y)  # sample variance of the time series

function prior_transform(cube)
    α₁ = quantile(Uniform(0.0, 1.5), cube[1])
    f₁ = quantile(LogUniform(1e-3, 1e3), cube[2])
    α₂ = quantile(Uniform(α₁, 4.0), cube[3])
    variance = quantile(LogNormal(0, 1), cube[4])
    ν = quantile(Gamma(2, 0.5), cube[5])
    μ = quantile(Normal(x̄, 5 * sqrt(va)), cube[6])
    return [α₁, f₁, α₂, variance, ν, μ]
end

paramnames = ["α₁", "f₁", "α₂", "variance", "ν", "μ"]
```

We can load ultranest using PyCall:
```julia
using PyCall
ultranest = pyimport("ultranest")
```

and start sampling the posterior:
```julia
sampler = ultranest.ReactiveNestedSampler(paramnames, logl, resume = true, transform = prior_transform, log_dir = "path/to/dir", vectorized = false)
results = sampler.run()
sampler.print_results()
sampler.plot()
```

If this method or code was useful to you, you can cite Lefkir et al. 2025 for the method and Foreman-Mackey et al. 2017 for the celerite algorithm.
If you have used ultranest or Turing.jl to sample the posterior, have a look at their documentation on how to cite them properly.
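As mentioned above, Turing.jl with Hamiltonian Monte Carlo can be used instead of nested sampling. Below is only a rough sketch of how the GP_model function from this example could be reused inside a Turing model (the priors mirror prior_transform, x̄ and va are the sample mean and variance defined earlier, and psd_model is just an illustrative name); see the examples directory for the actual scripts:

```julia
using Turing

@model function psd_model(t, y, σ)
    # Priors mirroring the nested-sampling prior_transform above
    α₁ ~ Uniform(0.0, 1.5)
    f₁ ~ LogUniform(1e-3, 1e3)
    α₂ ~ Uniform(α₁, 4.0)
    variance ~ LogNormal(0.0, 1.0)
    ν ~ Gamma(2, 0.5)
    μ ~ Normal(x̄, 5 * sqrt(va))

    # Add the GP log-likelihood of the data given the current parameters
    Turing.@addlogprob! logpdf(GP_model(t, y, σ, [α₁, f₁, α₂, variance, ν, μ]), y)
end

chain = sample(psd_model(t, y, σ), NUTS(), 1000)
```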