DoG_sigma1_sigma2(I)
DoG_sigma1_sigma2(I) refers to the Difference of Gaussians operation applied to an image I, using two Gaussian blur kernels with standard deviations sigma1 and sigma2. It is commonly used in image processing to highlight features at a specific range of spatial scales and to serve as a simple, efficient edge and blob detector.
Mathematically, DoG_sigma1_sigma2(I) is defined as the difference between two Gaussian-blurred versions of I: DoG_sigma1_sigma2(I) = G_sigma1 * I - G_sigma2 * I, where * denotes convolution and G_sigma is a Gaussian kernel with standard deviation sigma.
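As a minimal sketch of the definition above, the operation is just two blurs and a subtraction. The snippet below uses SciPy's gaussian_filter; the function name difference_of_gaussians and the synthetic test image are illustrative, not part of the original text.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def difference_of_gaussians(image, sigma1, sigma2):
    """Compute DoG_sigma1_sigma2(I) = G_sigma1 * I - G_sigma2 * I.

    The blur with the larger sigma removes more fine detail, so the
    difference keeps structures between the two spatial scales.
    """
    image = np.asarray(image, dtype=np.float64)  # avoid uint8 wrap-around
    blurred_fine = gaussian_filter(image, sigma=sigma1)
    blurred_coarse = gaussian_filter(image, sigma=sigma2)
    return blurred_fine - blurred_coarse

# Example usage on a synthetic image: a bright square on a dark background.
if __name__ == "__main__":
    img = np.zeros((64, 64))
    img[24:40, 24:40] = 1.0
    dog = difference_of_gaussians(img, sigma1=1.0, sigma2=1.6)
    print(dog.shape, dog.min(), dog.max())
```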
DoG is an approximation to the Laplacian of Gaussian (LoG). By choosing sigma2 relative to sigma1 (commonly sigma2 = 1.6 * sigma1), the DoG closely matches the shape of the LoG while requiring only two Gaussian blurs, which are cheaper to compute than the LoG directly.
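The connection follows from the standard heat-equation argument, sketched below: the Gaussian satisfies a diffusion equation in sigma, and the DoG is a finite-difference approximation of that derivative.

```latex
% G_sigma satisfies the diffusion equation in sigma:
%   dG/dsigma = sigma * Laplacian(G).
% Approximating the derivative by a finite difference with sigma2 = k*sigma1:
\[
\sigma \nabla^2 G = \frac{\partial G}{\partial \sigma}
  \approx \frac{G_{k\sigma} - G_{\sigma}}{k\sigma - \sigma}
\quad\Longrightarrow\quad
G_{k\sigma} - G_{\sigma} \approx (k - 1)\,\sigma^{2}\,\nabla^{2} G .
\]
```

Since the factor (k - 1) is constant when the ratio k is fixed, the DoG is a scaled LoG up to an overall constant, which is why a fixed ratio like 1.6 works across scales.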
Applications include blob and edge detection, feature detection in computer vision, and use as a component in scale-space representations, such as the DoG pyramids used by SIFT-style feature detectors.
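As a rough illustration of blob detection with DoG (a sketch only: the function name detect_blobs, the single scale pair, and the threshold value are all assumptions for this example; real detectors such as SIFT search over many scales):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def detect_blobs(image, sigma1=2.0, sigma2=3.2, threshold=0.05):
    """Toy blob detector: local maxima of a single-scale DoG response.

    A bright blob whose size matches the scale pair produces a strong
    positive DoG peak at its center, since the fine blur preserves the
    blob while the coarse blur spreads it out.
    """
    img = np.asarray(image, dtype=np.float64)
    dog = gaussian_filter(img, sigma1) - gaussian_filter(img, sigma2)
    # A pixel is a candidate if it equals the maximum of its 3x3
    # neighborhood and its response exceeds the threshold.
    local_max = maximum_filter(dog, size=3)
    peaks = (dog == local_max) & (dog > threshold)
    return np.argwhere(peaks)  # (row, col) coordinates of blob centers

if __name__ == "__main__":
    img = np.zeros((64, 64))
    img[30:35, 30:35] = 1.0  # one small bright blob
    print(detect_blobs(img))
```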