Support Vector Machine [DRAFT]
Created at 2016-03-27T01:25:01.000Z

Implementation

>>> import numpy as np
>>> # transposing a 1-D array is a no-op
>>> np.array([1, 2, 3]).transpose()
array([1, 2, 3])
>>> # a 1xN 2-D array transposes into a column vector
>>> np.array([[1, 2, 3]]).transpose()
array([[1],
       [2],
       [3]])
>>> # a column vector dotted with a row vector gives the outer product
>>> np.array([[4], [5], [6]]).dot(np.array([[1, 2, 3]]))
array([[ 4,  8, 12],
       [ 5, 10, 15],
       [ 6, 12, 18]])
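
These array mechanics are the building blocks of a simple linear-SVM implementation. Below is a minimal sketch with a made-up data matrix X, labels y, and dual coefficients alpha (the coefficients are assumed for illustration, not computed here), showing how the same dot/transpose pattern yields the linear-kernel Gram matrix and the weight vector:

>>> import numpy as np
>>> X = np.array([[1.0, 2.0],
...               [2.0, 3.0],
...               [3.0, 1.0]])           # 3 samples, 2 features
>>> y = np.array([1.0, -1.0, 1.0])       # class labels in {-1, +1}
>>> alpha = np.array([0.5, 0.5, 0.0])    # assumed dual coefficients
>>> K = X.dot(X.T)                       # Gram matrix, K[i, j] = x_i . x_j
>>> w = (alpha * y).dot(X)               # w = sum_i alpha_i * y_i * x_i
>>> w
array([-0.5, -0.5])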

Theoretical Interests

TODO: the discussions below may be stated the OPPOSITE way around (necessary vs. sufficient conditions)

Lagrange Multiplier

I wrote an (informal and simple) proof of the theorem about the one-way implication between the original problem and the "_Lagrange-multiplier_ed" problem. In simple terms, we can say two things (the conversion itself is written out right after this list):

  • if you find an answer to the converted problem, it must also be an answer to the original problem,
  • but there might be a problem where you cannot find an answer to the converted problem even though an answer exists in the original problem.
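
For reference, here are the original and converted problems written out in LaTeX (the textbook formulation, not the proof itself):

\text{original:}\quad \min_x f(x) \quad \text{subject to} \quad g(x) = 0

\text{converted:}\quad \nabla_x f(x) = \lambda\, \nabla_x g(x), \qquad g(x) = 0

The converted system is exactly the condition that the Lagrangian L(x, \lambda) = f(x) - \lambda\, g(x) has a stationary point in (x, \lambda).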

Karush–Kuhn–Tucker Conditions

This technique extends the application of the Lagrange multiplier method so that it also covers inequality constraints. But, of course, it requires some conditions on the objective function and the constraints.
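
For concreteness, here is the textbook statement of those conditions in LaTeX, for a minimization problem with differentiable functions, inequality constraints, and a suitable constraint qualification:

\min_x f(x) \quad \text{subject to} \quad g_i(x) \le 0 \quad (i = 1, \dots, m)

% KKT conditions at a point x with multipliers \mu_i:
\nabla f(x) + \sum_i \mu_i \nabla g_i(x) = 0    % stationarity
g_i(x) \le 0, \qquad \mu_i \ge 0                % primal and dual feasibility
\mu_i\, g_i(x) = 0                              % complementary slackness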

The optimization problem that appears when finding an SVM can be written as a very simple equation and inequality, which satisfy those conditions, so it can be solved by this technique.
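
For reference, the "very simple" problem meant here is, in the standard hard-margin form for training data (x_i, y_i) with y_i \in \{-1, +1\}:

\min_{w,\,b} \ \tfrac{1}{2}\lVert w \rVert^2
\quad \text{subject to} \quad y_i\,(w^\top x_i + b) \ge 1 \quad (i = 1, \dots, n)

The objective is a convex quadratic and each constraint is affine in (w, b); rewriting a constraint as 1 - y_i(w^\top x_i + b) \le 0 puts it in the \le 0 form used by the KKT conditions above.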

Intuitively, an inequality constraint can be dealt with in the same way as the equality case: when the constraint actually matters, the extreme point lies on the boundary of the region defined by the inequality, and when it does not, its multiplier is simply zero. (Maybe this is too rough.)

TODO: write down the problem definition for the SVM case (minimization)


Future Work

  • follow the proof of the necessity implication for the Lagrange Multiplier and the KKT conditions

References