object GaussianProcess
Factory methods for creating Gaussian processes
- By Inheritance
- GaussianProcess
- AnyRef
- Any
Value Members
- final def !=(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- final def ##(): Int
- Definition Classes
- AnyRef → Any
- final def ==(arg0: Any): Boolean
- Definition Classes
- AnyRef → Any
- def apply[D, Value](cov: MatrixValuedPDKernel[D])(implicit arg0: NDSpace[D], vectorizer: Vectorizer[Value]): GaussianProcess[D, Value]
Creates a new zero-mean Gaussian process with the given covariance function.
- def apply[D, Value](mean: Field[D, Value], cov: MatrixValuedPDKernel[D])(implicit arg0: NDSpace[D], vectorizer: Vectorizer[Value]): GaussianProcess[D, Value]
Creates a new Gaussian process with given mean and covariance, which is defined on the given domain.
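A minimal sketch of both factory methods, assuming a recent Scalismo version with its standard kernels and geometry types on the classpath; the kernel width and mean vector are illustrative values, not recommendations:

```scala
import scalismo.common.{EuclideanSpace, Field}
import scalismo.geometry.{_3D, EuclideanVector, Point}
import scalismo.kernels.{DiagonalKernel, GaussianKernel}
import scalismo.statisticalmodel.GaussianProcess

// Covariance: an isotropic Gaussian kernel, replicated over the 3 output dimensions.
val cov = DiagonalKernel(GaussianKernel[_3D](10.0), 3)

// First overload: zero-mean GP over 3D vector fields.
val zeroMeanGp = GaussianProcess[_3D, EuclideanVector[_3D]](cov)

// Second overload: GP with an explicit (here constant) mean function.
val mean = Field(EuclideanSpace[_3D], (_: Point[_3D]) => EuclideanVector(1.0, 0.0, 0.0))
val gp = GaussianProcess(mean, cov)
```

The zero-mean overload is simply a convenience for the common case where no prior mean deformation is assumed.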
- final def asInstanceOf[T0]: T0
- Definition Classes
- Any
- def clone(): AnyRef
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.CloneNotSupportedException]) @native()
- final def eq(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- def equals(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef → Any
- def finalize(): Unit
- Attributes
- protected[lang]
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.Throwable])
- final def getClass(): Class[_ <: AnyRef]
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- def hashCode(): Int
- Definition Classes
- AnyRef → Any
- Annotations
- @native()
- final def isInstanceOf[T0]: Boolean
- Definition Classes
- Any
- def marginalLikelihood[D, Value](gp: GaussianProcess[D, Value], trainingData: IndexedSeq[(Point[D], Value, MultivariateNormalDistribution)])(implicit arg0: NDSpace[D], vectorizer: Vectorizer[Value]): Double
Computes the marginal likelihood of the observed data, according to the given GP.
This can, for example, be used in a model selection setting, where the GP with the maximum marginal likelihood of the observed data would be selected.
- gp
The Gaussian process
- trainingData
Point/value pairs that the sample should approximate, together with an error model (the uncertainty) at each point.
- To do
The current implementation inverts the data covariance matrix directly, which becomes expensive for more than a few points. An implementation based on a Cholesky decomposition would be more efficient.
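A sketch of the model-selection use case described above, assuming Scalismo and Breeze on the classpath; the candidate kernel widths, observation points, and noise variance are hypothetical:

```scala
import scalismo.geometry.{_3D, EuclideanVector, Point}
import scalismo.kernels.{DiagonalKernel, GaussianKernel}
import scalismo.statisticalmodel.{GaussianProcess, MultivariateNormalDistribution}
import breeze.linalg.{DenseMatrix, DenseVector}

// Isotropic noise model (variance 0.1 per axis, hypothetical) shared by all observations.
val noise = MultivariateNormalDistribution(
  DenseVector.zeros[Double](3),
  DenseMatrix.eye[Double](3) * 0.1
)
val trainingData = IndexedSeq(
  (Point(0.0, 0.0, 0.0), EuclideanVector(1.0, 0.0, 0.0), noise),
  (Point(10.0, 0.0, 0.0), EuclideanVector(0.0, 1.0, 0.0), noise)
)

// Score each candidate kernel width by the marginal likelihood of the data,
// and keep the one that explains the observations best.
val candidates = Seq(5.0, 10.0, 20.0).map { sigma =>
  val gp = GaussianProcess[_3D, EuclideanVector[_3D]](
    DiagonalKernel(GaussianKernel[_3D](sigma), 3)
  )
  (sigma, GaussianProcess.marginalLikelihood(gp, trainingData))
}
val (bestSigma, _) = candidates.maxBy(_._2)
```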
- final def ne(arg0: AnyRef): Boolean
- Definition Classes
- AnyRef
- final def notify(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- final def notifyAll(): Unit
- Definition Classes
- AnyRef
- Annotations
- @native()
- def regression[D, Value](gp: GaussianProcess[D, Value], trainingData: IndexedSeq[(Point[D], Value, MultivariateNormalDistribution)])(implicit arg0: NDSpace[D], vectorizer: Vectorizer[Value]): GaussianProcess[D, Value]
Performs a Gaussian process regression, where we assume that each training point (vector) is subject to zero-mean noise with given variance.
- gp
The Gaussian process
- trainingData
Point/value pairs that the sample should approximate, together with an error model (the uncertainty) at each point.
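A self-contained sketch of the regression method, assuming Scalismo and Breeze on the classpath; the prior kernel, observation points, and per-point noise variance are illustrative assumptions:

```scala
import scalismo.geometry.{_3D, EuclideanVector, Point}
import scalismo.kernels.{DiagonalKernel, GaussianKernel}
import scalismo.statisticalmodel.{GaussianProcess, MultivariateNormalDistribution}
import breeze.linalg.{DenseMatrix, DenseVector}

// Prior: a zero-mean GP with an isotropic Gaussian covariance.
val gp = GaussianProcess[_3D, EuclideanVector[_3D]](
  DiagonalKernel(GaussianKernel[_3D](10.0), 3)
)

// Each observed value carries a zero-mean noise model (variance 0.1 per axis, hypothetical).
val noise = MultivariateNormalDistribution(
  DenseVector.zeros[Double](3),
  DenseMatrix.eye[Double](3) * 0.1
)
val trainingData = IndexedSeq(
  (Point(0.0, 0.0, 0.0), EuclideanVector(1.0, 0.0, 0.0), noise),
  (Point(10.0, 0.0, 0.0), EuclideanVector(0.0, 1.0, 0.0), noise)
)

// The result is again a GaussianProcess (the posterior), which can be
// evaluated at points that were never observed.
val posterior = GaussianProcess.regression(gp, trainingData)
val prediction = posterior.mean(Point(5.0, 0.0, 0.0))
```

Because the posterior is itself a `GaussianProcess[D, Value]`, it can be passed back into `regression` or `marginalLikelihood` like any other GP.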
- final def synchronized[T0](arg0: => T0): T0
- Definition Classes
- AnyRef
- def toString(): String
- Definition Classes
- AnyRef → Any
- final def wait(): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long, arg1: Int): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException])
- final def wait(arg0: Long): Unit
- Definition Classes
- AnyRef
- Annotations
- @throws(classOf[java.lang.InterruptedException]) @native()