If \(\mathbf{Q} \in \mathbb{R}^{n \times n}\) is orthogonal, then for all \(\mathbf{x}\), \(\mathbf{y}\) \(\in \mathbb{R}^n\),
\begin{equation*} \begin{aligned} \langle \mathbf{Q} \mathbf{x}, \mathbf{Q} \mathbf{y} \rangle &= \langle \mathbf{x}, \mathbf{y} \rangle \\ \Vert \mathbf{Q} \mathbf{x} \Vert_2 &= \Vert \mathbf{x} \Vert_2 \end{aligned} \end{equation*}
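As a quick numerical check (a sketch using NumPy, not part of the original post), one can construct an orthogonal \(\mathbf{Q}\) from a QR factorization of a random matrix and verify both invariances:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# QR factorization of a random matrix yields an orthogonal factor Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))

x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Inner products are preserved: <Qx, Qy> = <x, y>
assert np.isclose(np.dot(Q @ x, Q @ y), np.dot(x, y))

# Euclidean norms are preserved: ||Qx||_2 = ||x||_2
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))

# Both follow from the defining property Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(n))
```

Both invariances are immediate consequences of \(\mathbf{Q}^\top \mathbf{Q} = \mathbf{I}\), which the last check verifies directly.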
Other articles
Conservation laws - II (Weak solutions)
In this article, we will mainly discuss some standard numerical techniques for analysing two types of hyperbolic partial differential equations:
Advection equation
\begin{equation} u_t + a(x,t)~u_x = 0 \label{advection} \end{equation}
Scalar conservation laws
\begin{equation} u_t + (f(x,t,u))_x = 0 \label{conservation} \end{equation} …
Conservation laws - I (Continuum hypothesis)
In ancient Greek philosophy, there were two opposing views about the nature of physical phenomena: the discrete and the continuous. Aristotle was the representative of the continuous theory, and Democritus of the atomistic one. Despite the quantitative challenge to the continuous nature of reality from Robert Boyle (1666), John Dalton …
Least square residual classifier Part - II
In Least square residual Part - I, a classification algorithm called the least square residual classifier, based on singular value decomposition, was presented and applied to the MNIST handwritten digit recognition task. Please read that post before proceeding.
In this post, we introduce a least square residual classifier based on autoencoders and …
Least square residual Classifier Part - I
Residual-based machine learning algorithms, such as deep residual neural networks and gradient boosted trees, are popular and have shown great promise in machine learning tasks like computer vision.
This is a multi-part series in which I’m planning to cover the following residual-based algorithms:
- Least square residual classifier based on …