Last updated: April 17, 2024

ISBN 978-3-8439-4528-8

84.00 € incl. VAT, plus shipping


Reihe Informatik (Computer Science series)

Christian Bailer
New Data Based Matching Strategies for Visual Motion Estimation

171 pages, dissertation, Technische Universität Kaiserslautern (2019), hardcover, A5

Abstract

Motion estimation is likely the most crucial task in video-based computer vision, as it connects the information in consecutive images.

While priors are often required for good motion estimation, matching the available data remains the most essential part, as it depends on the actual image content. In this thesis we present new data-based matching strategies for visual motion estimation that utilize the available data as well as possible.

We demonstrate that our strategies lead to very promising motion estimation results.

We present new approaches in the fields of optical flow estimation and object tracking. First, we present a novel, highly robust matching approach tailored to optical flow estimation that is far less prone to outliers than conventional approaches. Its robustness is based on a new multi-scale matching strategy and requires no explicit regularization or smoothing (such as median filtering). To further improve the approach, we introduce new CNN-based features trained with our novel thresholded hinge loss, which outperforms other loss functions in our tests. We also present a new way to greatly speed up dense CNN-based feature computation in the presence of pooling and striding layers. We show that our approach achieves state-of-the-art results on the KITTI and MPI-Sintel datasets.
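The thresholded hinge loss is only named in the abstract, not defined. Purely as an illustration, the sketch below shows what a thresholded hinge loss for learned patch descriptors might look like: matching pairs are penalized only once their descriptor distance exceeds a threshold t, while non-matching pairs are pushed beyond a margin m. The function name and the values of t and m are assumptions for this sketch, not the thesis's exact formulation.

```python
import torch

def thresholded_hinge_loss(desc_a, desc_b, is_match, t=0.3, m=1.0):
    """Illustrative thresholded hinge loss for descriptor matching.

    desc_a, desc_b: (N, D) descriptor tensors for patch pairs.
    is_match:       (N,) bool tensor, True for corresponding patches.
    t, m:           assumed threshold/margin hyperparameters.
    """
    d = torch.norm(desc_a - desc_b, dim=1)   # per-pair L2 distance
    # Matching pairs: no penalty below the threshold t, so descriptors
    # of true matches are not forced to become identical.
    pos = torch.clamp(d - t, min=0.0)
    # Non-matching pairs: standard hinge, push the distance beyond m.
    neg = torch.clamp(m - d, min=0.0)
    return torch.where(is_match, pos, neg).mean()
```

Compared with a plain hinge embedding loss, the threshold keeps training from spending capacity on shrinking already-small distances of matching pairs.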

For object tracking, we introduce a new tracking-fusion approach that clearly outperforms the best individual trackers by fusing the results of several state-of-the-art trackers. By fusing only fast trackers, we can even outperform most trackers in both speed and tracking quality, and we can provide smoother trajectories than most trackers. For cases where tracking fusion is not robust enough, we present a user-supported tracking framework whose goal is to obtain nearly perfect tracking results while minimizing the necessary user input. Finally, we present an eye-tracking and calibration approach for parallax-barrier-based 3D screens.
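The abstract does not describe the fusion mechanism itself. As a naive illustration of the general idea, the sketch below combines the per-frame bounding boxes of several trackers with a coordinate-wise median, which already suppresses a single outlier tracker; the thesis's actual fusion strategy is more sophisticated than this baseline.

```python
import numpy as np

def fuse_boxes(boxes):
    """Naive per-frame fusion baseline: coordinate-wise median.

    boxes: (K, 4) array of [x, y, w, h] boxes from K trackers for
    the same frame. Illustration only, not the thesis's method.
    """
    return np.median(np.asarray(boxes, dtype=float), axis=0)

# Example: three trackers on one frame; the third has drifted.
fused = fuse_boxes([[10, 12, 50, 80],
                    [12, 11, 52, 78],
                    [30, 40, 60, 90]])
print(fused)  # -> [12. 12. 52. 80.], the outlier is suppressed
```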