
Showing posts from October, 2012

Convex Optimization

Convex optimization problems are problems of the form

minimize f(x)
subject to g_i(x) ≤ b_i, i = 1, …, n
h_i(x) = 0, i = 1, …, m

where both the objective function f and the inequality constraint functions g_i are convex, and the equality constraint functions h_i are affine.

A simple definition of a convex function is that the line segment connecting any two points on its graph lies on or above the graph. An alternate definition is that a tangent at any point on the function always lies below the function. In simple words, a convex function is a function with positive curvature.

Some example convex functions are:
- An affine function (actually it is both convex and concave).
- Powers of x on x > 0: x^α for α ≥ 1 or α ≤ 0 (note that for 0 ≤ α ≤ 1 the function is concave).
- The exponential function (the logarithm is concave).
- Norms.

How can we systematically determine whether a function is convex? Jensen's inequality: a function f is convex if f(θx + (1 − θ)y) ≤ θf(x) + (1 − θ)f(y) for 0 ≤ θ ≤ 1.
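As a quick illustration (not from the original post), here is a minimal numerical check of Jensen's inequality: sample random point pairs and values of θ, and test whether f(θx + (1 − θ)y) ≤ θf(x) + (1 − θ)f(y) holds. The function name, sampling range, and tolerance are my own choices for this sketch.

```python
import numpy as np

def violates_jensen(f, lo=0.1, hi=10.0, trials=10_000, seed=0):
    """Return True if some random (x, y, theta) triple violates Jensen's
    inequality f(theta*x + (1-theta)*y) <= theta*f(x) + (1-theta)*f(y)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, trials)
    y = rng.uniform(lo, hi, trials)
    theta = rng.uniform(0.0, 1.0, trials)
    lhs = f(theta * x + (1 - theta) * y)
    rhs = theta * f(x) + (1 - theta) * f(y)
    # small tolerance to absorb floating-point noise
    return bool(np.any(lhs > rhs + 1e-12))

print(violates_jensen(np.exp))     # False: exp is convex
print(violates_jensen(np.square))  # False: x^2 (alpha >= 1) is convex
print(violates_jensen(np.sqrt))    # True: x^0.5 (0 < alpha < 1) is concave
```

A check like this can only refute convexity (by finding a violating triple), never prove it, but it is a handy sanity check against the list of examples above.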

Boosting

Boosting is perhaps one of the most powerful ideas in machine learning. The idea is to combine several weak classifiers into a single strong classifier. A weak classifier is a classifier that performs only slightly better than random guessing. Boosting has proven to be a very powerful approach and is used in various real-world applications. The basic idea behind boosting is to concentrate on the examples that were incorrectly classified and to combine the classifiers by a weighted majority vote. One of the most popular boosting algorithms is AdaBoost. The setup for AdaBoost is as follows (a sketch of the algorithm appears below):

Suppose we have T weak classifiers such that the error of each is slightly less than 1/2, i.e., ε_t = 1/2 − γ_t for some γ_t > 0.

- Input: training points (x_1, y_1), (x_2, y_2), …, (x_n, y_n) where x_i ∈ X and y_i ∈ {−1, +1}.
- Let w_t(i) be the weight associated with the i-th training point, ∀ i ∈ 1..n, in the t-th iteration.
- Let α_t be the weight associated with the t-th classifier, ∀ t ∈ 1..T.
- Let h_t(x_i) denote the output of the t-th classifier for the i-th training point.
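To make the setup concrete, below is a minimal sketch of AdaBoost using decision stumps as the weak classifiers. This implementation, including the stump learner and names like fit_stump, is my own illustration rather than code from the post; it follows the standard updates α_t = ½ ln((1 − ε_t)/ε_t) and w_{t+1}(i) ∝ w_t(i) exp(−α_t y_i h_t(x_i)).

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # Decision stump: +1 on one side of the threshold, -1 on the other.
    return polarity * np.where(X[:, feature] <= threshold, 1, -1)

def fit_stump(X, y, w):
    # Exhaustively pick the (feature, threshold, polarity) with the
    # lowest weighted error under the current weights w_t.
    best, best_err = None, np.inf
    for feature in range(X.shape[1]):
        for threshold in np.unique(X[:, feature]):
            for polarity in (1, -1):
                pred = stump_predict(X, feature, threshold, polarity)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best = err, (feature, threshold, polarity)
    return best, best_err

def adaboost(X, y, T=20):
    n = len(y)
    w = np.full(n, 1.0 / n)                    # w_1(i) = 1/n
    ensemble = []
    for _ in range(T):
        stump, eps = fit_stump(X, y, w)        # weighted error eps_t
        eps = max(eps, 1e-10)                  # guard against a perfect stump
        alpha = 0.5 * np.log((1 - eps) / eps)  # classifier weight alpha_t
        pred = stump_predict(X, *stump)
        w *= np.exp(-alpha * y * pred)         # up-weight the mistakes
        w /= w.sum()                           # renormalize to a distribution
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    # Weighted majority vote: sign of the alpha-weighted sum of outputs.
    score = sum(alpha * stump_predict(X, *stump) for alpha, stump in ensemble)
    return np.sign(score)

# Tiny usage example on synthetic 1-D data.
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([1, 1, 1, -1, -1, -1])
model = adaboost(X, y, T=10)
print(predict(model, X))  # should reproduce y on this separable toy set
```

Decision stumps are the classic choice of weak learner here: each one is only slightly better than chance on a hard weighting of the data, which is exactly the ε_t = 1/2 − γ_t regime the setup above assumes.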