These instances, when mapped to an N-dimensional space, represent a core set that can be used to construct an approximation to the minimum enclosing ball. Solving the SVM learning problem on these core sets yields a good approximate solution very quickly. For example, the core-vector machine [81] built on this idea can learn an SVM from millions of data points in seconds.
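As an illustration of the core-set idea (not the core-vector machine of [81] itself, which solves the minimum enclosing ball problem in a kernel-induced feature space), the sketch below implements the simple Badoiu-Clarkson iteration: the point farthest from the current center joins the core set, and the center moves a shrinking step toward it, giving a (1+eps)-approximate minimum enclosing ball after roughly 1/eps^2 iterations. Class and method names are illustrative, not from the source.

import java.util.Random;

// Illustrative core-set approximation of the minimum enclosing ball (MEB).
public class MebCoreSetSketch {

    // Returns an approximate MEB center after ~1/eps^2 core-set iterations.
    static double[] approxMebCenter(double[][] points, double eps) {
        int d = points[0].length;
        double[] c = points[0].clone();              // start from an arbitrary point
        int iterations = (int) Math.ceil(1.0 / (eps * eps));
        for (int i = 1; i <= iterations; i++) {
            double[] far = farthestFrom(c, points);  // farthest point joins the core set
            for (int k = 0; k < d; k++) {
                c[k] += (far[k] - c[k]) / (i + 1);   // move the center toward it by 1/(i+1)
            }
        }
        return c;
    }

    // Linear scan for the point farthest from the current center.
    static double[] farthestFrom(double[] c, double[][] points) {
        double best = -1.0;
        double[] arg = points[0];
        for (double[] p : points) {
            double dist2 = 0.0;
            for (int k = 0; k < c.length; k++) {
                double diff = p[k] - c[k];
                dist2 += diff * diff;
            }
            if (dist2 > best) { best = dist2; arg = p; }
        }
        return arg;
    }

    public static void main(String[] args) {
        Random rnd = new Random(0);
        double[][] pts = new double[100000][3];      // synthetic data for the demo
        for (double[] p : pts)
            for (int k = 0; k < 3; k++) p[k] = rnd.nextGaussian();
        double[] center = approxMebCenter(pts, 0.1);
        System.out.printf("approximate MEB center: (%.3f, %.3f, %.3f)%n",
                center[0], center[1], center[2]);
    }
}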
We describe and demonstrate an algorithm that takes as input an
unorganized set of points {x_1, ..., x_n} ⊂ IR^3 on or near an unknown
manifold M, and produces as output a simplicial surface that
approximates M. Neither the topology, the presence of boundaries,
nor the geometry of M are assumed to be known in advance — all
are inferred automatically from the data. This problem naturally
arises in a variety of practical situations such as range scanning
an object from multiple view points, recovery of biological shapes
from two-dimensional slices, and interactive surface sketching.
// Chebyshev outlier detection
// This function detects abnormal values within a set of data.
// input:
// delta: a set of data
// flag: indicates which data points are already known to be outliers
// p: restriction level
// output:
// double[] door: thresholds beyond which a value may be considered an outlier
// door[0]: the upper threshold
// door[1]: the lower threshold
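A minimal sketch of a function matching the header above is shown next. It assumes the thresholds come from Chebyshev's inequality P(|X - mean| >= k*sigma) <= 1/k^2 with k = 1/sqrt(p), and that the statistics are computed only over values not already flagged as outliers; the method name and this interpretation of p are assumptions, not part of the original header.

// Sketch completing the header above: Chebyshev-based outlier thresholds.
static double[] chebyshevDoor(double[] delta, boolean[] flag, double p) {
    // Mean and standard deviation over values not already flagged as outliers.
    double sum = 0.0, sumSq = 0.0;
    int n = 0;
    for (int i = 0; i < delta.length; i++) {
        if (flag[i]) continue;
        sum += delta[i];
        sumSq += delta[i] * delta[i];
        n++;
    }
    double mean = sum / n;
    double variance = Math.max(sumSq / n - mean * mean, 0.0);
    double sigma = Math.sqrt(variance);

    // Chebyshev's inequality: at most a fraction p of the data lies more than
    // k * sigma from the mean when k = 1 / sqrt(p)  (assumed meaning of p).
    double k = 1.0 / Math.sqrt(p);

    double[] door = new double[2];
    door[0] = mean + k * sigma;  // upper door
    door[1] = mean - k * sigma;  // lower door
    return door;
}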
% Batch version of the back-propagation algorithm.
%
% Given a set of corresponding input-output pairs and an initial network,
% [W1,W2,critvec,iter]=batbp(NetDef,W1,W2,PHI,Y,trparms) trains the
% network with backpropagation.
%
% The activation functions must be either linear or tanh. The network
% architecture is defined by the matrix NetDef consisting of two
% rows. The first row specifies the hidden layer while the second
% specifies the output layer.
%