Please use this identifier to cite or link to this item: http://hdl.handle.net/2440/120659
Type: Thesis
Title: [EMBARGOED] Efficient Deep Learning Models with Autoencoder Regularization and Information Bottleneck Compression
Author: Williams, Jerome Oskar
Issue Date: 2019
School/Discipline: School of Computer Science
Abstract: Improving efficiency in deep learning models means achieving a more accurate model for a given computational budget, or conversely a faster, leaner model without losing accuracy. To improve efficiency, we can use regularization to improve generalization to the real world, and compression to improve speed. Because regularization also restricts the information a model retains, these two methods are related. First, we present a novel autoencoder architecture as a method of regularization for pedestrian detection. Second, we present a hyperparameter-free, iterative compression method based on measuring the information content of the model with the Information Bottleneck principle.
Advisor: Carneiro, Gustavo
Suter, David
Sasdelli, Michele
Dissertation Note: Thesis (MPhil) -- University of Adelaide, School of Computer Science, 2019
Keywords: Machine learning
neural network
deep learning
computer vision
regularization
compression
information bottleneck
autoencoder
pedestrian detection
region of interest
convolutional
statistics
efficiency
Provenance: This thesis is currently under Embargo and not available.
Appears in Collections: Research Theses

Files in This Item:
File: Williams2019_MPhil.pdf
Description: Library staff access only
Size: 1.98 MB
Format: Adobe PDF


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.