class LowRankGaussianProcess[D, Value] extends GaussianProcess[D, Value]
A Gaussian process represented in terms of a (small) finite set of basis functions. The basis functions are the orthonormal basis functions given by a Mercer decomposition.
- D
The dimensionality of the input space
- Value
The output type
- By Inheritance
- LowRankGaussianProcess
- GaussianProcess
- AnyRef
- Any
Instance Constructors
- new LowRankGaussianProcess(mean: Field[D, Value], klBasis: KLBasis[D, Value])(implicit arg0: NDSpace[D], vectorizer: Vectorizer[Value])
- mean
The mean function
- klBasis
A set of basis functions
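The Karhunen-Loève expansion underlying this class can be sketched in plain Scala. The basis functions, eigenvalues, and object names below are invented for illustration and simplified to a one-dimensional, scalar-valued setting; in the real class, klBasis comes from a Mercer decomposition of the covariance kernel:

```scala
object KLExpansionSketch {
  // Hypothetical KL basis: (eigenvalue, eigenfunction) pairs for a 1D GP.
  val klBasis: Seq[(Double, Double => Double)] = Seq(
    (2.0, x => math.sin(math.Pi * x)),
    (0.5, x => math.cos(math.Pi * x))
  )
  val mean: Double => Double = _ => 0.0

  // f_c(x) = mean(x) + sum_i c_i * sqrt(lambda_i) * phi_i(x)
  def instance(c: Seq[Double]): Double => Double = { x =>
    mean(x) + klBasis.zip(c).map { case ((l, phi), ci) =>
      ci * math.sqrt(l) * phi(x)
    }.sum
  }
}
```

Every realization of the process is thus a deterministic function of a finite coefficient vector c, which is what makes the low-rank representation computationally convenient.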
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- def coefficients(trainingData: IndexedSeq[(Point[D], Value)], sigma2: Double): DenseVector[Double]
Returns the coefficients of the sample that best explains the given training data. It is assumed that the training data (values) are subject to zero-mean Gaussian noise with variance sigma2.
- def coefficients(trainingData: IndexedSeq[(Point[D], Value, MultivariateNormalDistribution)]): DenseVector[Double]
Returns the coefficients of the sample that best explains the given training data. It is assumed that the training data (values) are subject to zero-mean Gaussian noise, whose distribution at each point is given by the associated MultivariateNormalDistribution.
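As a sketch of what this computation involves: for a rank-2, one-dimensional toy model, the regression coefficients solve the normal equations (QᵀQ + σ²I)c = Qᵀ(y − μ), where Q(j)(i) = √λᵢ·φᵢ(xⱼ). The basis and names below are invented for illustration; the real method works with arbitrary rank and vector-valued outputs:

```scala
object CoefficientsSketch {
  // Hypothetical rank-2 KL basis for a 1D GP with zero mean.
  val klBasis: Seq[(Double, Double => Double)] = Seq(
    (2.0, x => math.sin(math.Pi * x)),
    (0.5, x => math.cos(math.Pi * x))
  )
  val mean: Double => Double = _ => 0.0

  def coefficients(trainingData: Seq[(Double, Double)], sigma2: Double): Array[Double] = {
    // Q(j)(i) = sqrt(lambda_i) * phi_i(x_j), residual r = y - mean(x)
    val q = trainingData.map { case (x, _) =>
      klBasis.map { case (l, phi) => math.sqrt(l) * phi(x) }.toArray
    }
    val r = trainingData.map { case (x, y) => y - mean(x) }
    // Accumulate A = Q^T Q + sigma2 * I and b = Q^T r
    val a = Array.ofDim[Double](2, 2)
    val b = Array.ofDim[Double](2)
    for (row <- q.indices; i <- 0 until 2) {
      b(i) += q(row)(i) * r(row)
      for (j <- 0 until 2) a(i)(j) += q(row)(i) * q(row)(j)
    }
    for (i <- 0 until 2) a(i)(i) += sigma2
    // Solve the 2x2 system A c = b by Cramer's rule
    val det = a(0)(0) * a(1)(1) - a(0)(1) * a(1)(0)
    Array((b(0) * a(1)(1) - b(1) * a(0)(1)) / det,
          (a(0)(0) * b(1) - a(1)(0) * b(0)) / det)
  }
}
```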
- val cov: MatrixValuedPDKernel[D]
- Definition Classes
- GaussianProcess
- def discretize[DDomain[DD] <: DiscreteDomain[DD]](domain: DDomain[D]): DiscreteLowRankGaussianProcess[D, DDomain, Value]
Discretize the Gaussian process on the points of the given domain.
- Definition Classes
- LowRankGaussianProcess → GaussianProcess
- def domain: Domain[D]
- Definition Classes
- GaussianProcess
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def instance(c: DenseVector[Double]): Field[D, Value]
Returns an instance of the Gaussian process, formed as a linear combination of the KL basis using the given coefficients c.
- c
The coefficients that determine the linear combination. They are assumed to be N(0,1) distributed.
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- val klBasis: KLBasis[D, Value]
- def logpdf(coefficients: DenseVector[Double]): Double
Returns the log of the probability density of the instance produced by the given coefficients. If you are only interested in ordinal comparisons of densities, prefer this method over pdf, as it is numerically more stable.
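Because the coefficients are modeled as i.i.d. N(0,1), the (log) density of an instance reduces to the standard multivariate normal density of its coefficient vector. A minimal sketch, with hypothetical names:

```scala
object DensitySketch {
  // log N(c; 0, I) = -0.5 * (k * log(2*pi) + ||c||^2)
  def logpdf(c: Seq[Double]): Double =
    -0.5 * (c.length * math.log(2 * math.Pi) + c.map(ci => ci * ci).sum)

  // pdf underflows much sooner than logpdf for large ||c|| or high rank,
  // which is why logpdf is preferred for ordinal comparisons.
  def pdf(c: Seq[Double]): Double = math.exp(logpdf(c))
}
```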
- def marginal(points: IndexedSeq[Point[D]])(implicit domainCreator: Create[D]): DiscreteLowRankGaussianProcess[D, UnstructuredPointsDomain, Value]
Compute the marginal distribution for the given points. The result is again a Gaussian process, whose domain is an unstructured points domain.
- Definition Classes
- LowRankGaussianProcess → GaussianProcess
- def marginal(pt: Point[D]): MultivariateNormalDistribution
Compute the marginal distribution at a single point.
- Definition Classes
- GaussianProcess
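For a low-rank GP the marginal at a single point has a simple closed form: the mean is mean(x) and the variance is Σᵢ λᵢ·φᵢ(x)². A one-dimensional sketch with an invented basis:

```scala
object MarginalSketch {
  // Hypothetical KL basis: (eigenvalue, eigenfunction) pairs for a 1D GP.
  val klBasis: Seq[(Double, Double => Double)] = Seq(
    (2.0, x => math.sin(math.Pi * x)),
    (0.5, x => math.cos(math.Pi * x))
  )

  // var(x) = sum_i lambda_i * phi_i(x)^2
  def marginalVariance(x: Double): Double =
    klBasis.map { case (l, phi) => l * phi(x) * phi(x) }.sum
}
```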
- val mean: Field[D, Value]
- Definition Classes
- GaussianProcess
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def outputDim: Int
- Definition Classes
- GaussianProcess
- def pdf(coefficients: DenseVector[Double]): Double
Returns the probability density of the instance produced by the given coefficients.
- def posterior(trainingData: IndexedSeq[(Point[D], Value, MultivariateNormalDistribution)]): LowRankGaussianProcess[D, Value]
The posterior distribution of the Gaussian process with respect to the given trainingData, computed using Gaussian process regression.
- Definition Classes
- LowRankGaussianProcess → GaussianProcess
- def posterior(trainingData: IndexedSeq[(Point[D], Value)], sigma2: Double): LowRankGaussianProcess[D, Value]
The posterior distribution of the Gaussian process with respect to the given trainingData, computed using Gaussian process regression. The trainingData is assumed to be subject to isotropic Gaussian noise with variance sigma2.
- Definition Classes
- LowRankGaussianProcess → GaussianProcess
- def project(trainingData: IndexedSeq[(Point[D], Value, MultivariateNormalDistribution)]): Field[D, Value]
Returns the sample of the Gaussian process that best explains the given training data. It is assumed that the training data (values) are subject to zero-mean Gaussian noise.
- trainingData
Point/value pairs that the sample should approximate, together with the noise distribution at each point.
- def project(trainingData: IndexedSeq[(Point[D], Value)], sigma2: Double = 1e-6): Field[D, Value]
Returns the sample of the Gaussian process that best explains the given training data. It is assumed that the training data (values) are subject to zero-mean Gaussian noise with variance sigma2.
- trainingData
Point/value pairs that the sample should approximate.
- sigma2
The variance of the Gaussian noise assumed at every training point.
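Conceptually, project evaluates the instance given by the best-explaining coefficients. A rank-1, one-dimensional sketch (basis invented for illustration, zero mean assumed):

```scala
object ProjectSketch {
  // Hypothetical rank-1 KL basis for a 1D GP.
  val lambda = 2.0
  val phi: Double => Double = x => math.sin(math.Pi * x)

  def project(trainingData: Seq[(Double, Double)], sigma2: Double): Double => Double = {
    val q = trainingData.map { case (x, _) => math.sqrt(lambda) * phi(x) }
    val y = trainingData.map(_._2)
    // c = (q . y) / (q . q + sigma2): the regression solution for a rank-1 model
    val c = q.zip(y).map { case (a, b) => a * b }.sum /
            (q.map(a => a * a).sum + sigma2)
    // The projection is the instance determined by the fitted coefficient c
    x => c * math.sqrt(lambda) * phi(x)
  }
}
```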
- def rank: Int
The rank (i.e. the number of basis functions).
- def sample()(implicit rand: Random): Field[D, Value]
A random sample of the Gaussian process.
- def sampleAtPoints[DDomain[DD] <: DiscreteDomain[DD]](domain: DDomain[D])(implicit rand: Random): DiscreteField[D, DDomain, Value]
A random sample evaluated at the given points
- Definition Classes
- LowRankGaussianProcess → GaussianProcess
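Sampling reduces to drawing i.i.d. standard-normal coefficients and forming the corresponding instance. A sketch with an invented 1D basis:

```scala
object SampleSketch {
  // Hypothetical KL basis: (eigenvalue, eigenfunction) pairs for a 1D GP.
  val klBasis: Seq[(Double, Double => Double)] = Seq(
    (2.0, x => math.sin(math.Pi * x)),
    (0.5, x => math.cos(math.Pi * x))
  )

  // Draw c_i ~ N(0,1) and return the corresponding instance as a function.
  def sample(rand: scala.util.Random): Double => Double = {
    val c = klBasis.map(_ => rand.nextGaussian())
    x => klBasis.zip(c).map { case ((l, phi), ci) => ci * math.sqrt(l) * phi(x) }.sum
  }
}
```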
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- def truncate(newRank: Int): LowRankGaussianProcess[D, Value]
Returns a reduced-rank model, using only the leading basis functions of the Karhunen-Loève expansion.
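Conceptually, truncation keeps the newRank basis functions with the largest eigenvalues. A sketch over hypothetical (eigenvalue, eigenfunction) pairs:

```scala
object TruncateSketch {
  // Keep the newRank leading terms of the KL expansion (largest eigenvalues first).
  def truncate(klBasis: Seq[(Double, Double => Double)],
               newRank: Int): Seq[(Double, Double => Double)] =
    klBasis.sortBy { case (l, _) => -l }.take(newRank)
}
```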
- implicit val vectorizer: Vectorizer[Value]
- Definition Classes
- GaussianProcess
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()