TY - GEN
T1 - Foreground detection using spatiotemporal projection kernels
AU - Moshe, Yair
AU - Hel-Or, Hagit
AU - Hel-Or, Yacov
PY - 2012
Y1 - 2012
N2 - In this paper, we propose a novel video foreground detection method that exploits the statistics of 3D space-time patches. 3D space-time patches are characterized by means of the subspace they span. As real-time constraints prohibit performing this modeling directly on the raw pixel data, we propose a novel framework in which spatiotemporal data is sequentially reduced in two stages. The first stage reduces the data using a cascade of linear projections of 3D space-time patches onto a small set of 3D Walsh-Hadamard (WH) basis functions, which are known for their energy compaction of natural images and videos. This stage is efficiently implemented using the Gray-Code filtering scheme [2], requiring only 2 operations per projection. In the second stage, the data is further reduced by applying PCA directly to the WH coefficients, exploiting the local statistics in an adaptive manner. Unlike common techniques, this spatiotemporal adaptive projection exploits both the appearance of a window and its dynamic characteristics. Tests show that the proposed method outperforms recent foreground detection methods and is suitable for real-time implementation on streaming video.
AB - In this paper, we propose a novel video foreground detection method that exploits the statistics of 3D space-time patches. 3D space-time patches are characterized by means of the subspace they span. As real-time constraints prohibit performing this modeling directly on the raw pixel data, we propose a novel framework in which spatiotemporal data is sequentially reduced in two stages. The first stage reduces the data using a cascade of linear projections of 3D space-time patches onto a small set of 3D Walsh-Hadamard (WH) basis functions, which are known for their energy compaction of natural images and videos. This stage is efficiently implemented using the Gray-Code filtering scheme [2], requiring only 2 operations per projection. In the second stage, the data is further reduced by applying PCA directly to the WH coefficients, exploiting the local statistics in an adaptive manner. Unlike common techniques, this spatiotemporal adaptive projection exploits both the appearance of a window and its dynamic characteristics. Tests show that the proposed method outperforms recent foreground detection methods and is suitable for real-time implementation on streaming video.
UR - http://www.scopus.com/inward/record.url?scp=84866699148&partnerID=8YFLogxK
U2 - 10.1109/CVPR.2012.6248056
DO - 10.1109/CVPR.2012.6248056
M3 - Conference contribution
AN - SCOPUS:84866699148
SN - 9781467312264
T3 - Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
SP - 3210
EP - 3217
BT - 2012 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2012
T2 - 2012 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2012
Y2 - 16 June 2012 through 21 June 2012
ER -
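
Note: the abstract describes a two-stage reduction of 3D space-time patches (projection onto a few 3D Walsh-Hadamard kernels, then PCA on the WH coefficients). The following is a minimal conceptual sketch of that pipeline, not the authors' implementation: the patch size, number of kernels, Sylvester (natural) ordering of the WH basis, and the brute-force matrix projection used in place of the efficient Gray-Code filtering of [2] are all illustrative assumptions.

    # Conceptual sketch of the two-stage reduction (illustrative assumptions only).
    import numpy as np

    def walsh_hadamard_1d(n):
        """Return the n x n Walsh-Hadamard matrix (n must be a power of 2)."""
        h = np.array([[1.0]])
        while h.shape[0] < n:
            h = np.block([[h, h], [h, -h]])
        return h

    def wh_basis_3d(size):
        """3D WH kernels for a size x size x size patch, one flattened kernel per row."""
        h = walsh_hadamard_1d(size)
        basis = np.einsum('ai,bj,ck->abcijk', h, h, h)   # separable outer products over t, y, x
        return basis.reshape(size ** 3, size ** 3)

    def two_stage_reduction(patches, num_wh=16, num_pca=4):
        """patches: (N, size**3) array of flattened 3D space-time patches."""
        size = round(patches.shape[1] ** (1 / 3))
        # Stage 1: project onto a small set of WH kernels (first kernels in
        # Sylvester order here, as a stand-in for the energy-compacting set).
        wh = wh_basis_3d(size)[:num_wh]
        coeffs = patches @ wh.T                           # (N, num_wh) WH coefficients
        # Stage 2: PCA on the WH coefficients, adapted to the local statistics.
        centered = coeffs - coeffs.mean(axis=0)
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        return centered @ vt[:num_pca].T                  # (N, num_pca) reduced features

    # Example: 100 random 4x4x4 space-time patches reduced to 4 features each.
    patches = np.random.rand(100, 4 ** 3)
    print(two_stage_reduction(patches).shape)             # (100, 4)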