Vehicle and pedestrian video-tracking with classification based on deep convolutional neural networks

Alejandro Forero, Francisco Calderon

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

11 Scopus citations

Abstract

In this article we propose an algorithm for the classification, tracking, and counting of vehicles and pedestrians in video sequences. The algorithm has two parts: a classification stage based on convolutional neural networks, implemented with the You Only Look Once (YOLO) method, and a proposed algorithm for tracking regions of interest based on a well-defined taxonomy. For the classification stage, we train and evaluate performance on a set of more than 50,000 labels, which we make publicly available. The tracking algorithm is evaluated against manual counts on video sequences from different scenarios captured at the management center of the Secretaría Distrital de Movilidad of Bogotá.
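The abstract describes a tracking-by-detection pipeline: per-frame detections (here, as a YOLO-style detector would produce) are associated across frames so that each object is counted once. The sketch below illustrates that general idea with a greedy IoU-based tracker; the association rule, thresholds, and function names are illustrative assumptions, not the taxonomy-based algorithm proposed in the paper.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def count_objects(frames, iou_thr=0.3):
    """Greedy IoU association across frames.

    `frames` is a list of per-frame detection lists (bounding boxes).
    Each detection is matched to the best-overlapping live track; unmatched
    detections open new tracks. Returns the number of distinct tracks,
    i.e. the object count.
    """
    tracks = {}   # track id -> last seen box
    next_id = 0
    for dets in frames:
        assigned = set()
        new_tracks = {}
        for box in dets:
            # find the best unassigned existing track above the threshold
            best, best_iou = None, iou_thr
            for tid, prev in tracks.items():
                if tid in assigned:
                    continue
                o = iou(box, prev)
                if o > best_iou:
                    best, best_iou = tid, o
            if best is None:          # no match: start a new track
                best = next_id
                next_id += 1
            assigned.add(best)
            new_tracks[best] = box
        tracks = new_tracks           # tracks not re-detected are dropped
    return next_id
```

For example, one box drifting rightward over three frames plus a second box appearing in the last frame yields a count of 2: the drifting detections overlap enough to stay on one track, while the new box opens a second.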

Original language: English
Title of host publication: 2019 22nd Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2019 - Conference Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728114910
DOIs
State: Published - Apr 2019
Event: 22nd Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2019 - Bucaramanga, Colombia
Duration: 24 Apr 2019 – 26 Apr 2019

Publication series

Name: 2019 22nd Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2019 - Conference Proceedings

Conference

Conference: 22nd Symposium on Image, Signal Processing and Artificial Vision, STSIVA 2019
Country/Territory: Colombia
City: Bucaramanga
Period: 24/04/19 – 26/04/19

Keywords

  • Object detection
  • image processing
  • vehicle counting
  • video object tracking
  • video-tracking

