Hassan Hamad

I am a PhD candidate studying ECE at the University of Southern California (USC), with a B.E. in Computer & Communications Engineering from Notre Dame University-Louaize (NDU) and an M.S. in Communications Engineering from the Technical University of Munich (TUM). I am advised by Professor Keith Chugg. My research expertise is in Machine Learning in general and efficient Deep Learning in particular. Specifically, I study how Deep Learning can be made more efficient from both the algorithm side and the hardware side. I am also part of the Hardware Accelerated Learning (HAL) group at USC.


HAL Group

I am part of the Hardware Accelerated Learning (HAL) group at USC. We focus on co-designing algorithms and hardware for reduced complexity training of Neural Networks.


Current Research

Training Convolutional Neural Networks using Logarithmic Computations

Floating-point numbers are the go-to format for most numerical computations: they are highly accurate and support a large dynamic range. This comes at the cost of increased memory footprint and computational complexity compared to fixed-point arithmetic, which offers much lower precision and range and therefore performs approximate computations. Exploiting the fact that deep learning is inherently noise-tolerant and does not require exact computations, we show that training neural networks in a fixed-point, logarithmic-domain format is possible with negligible performance degradation and significant area savings.
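To make the idea concrete, here is a minimal, purely illustrative sketch of log-domain arithmetic: values are mapped to a fixed-point approximation of their base-2 logarithm, so a multiplication reduces to an addition of log magnitudes. The quantization scheme and bit width below are placeholders, not the exact format used in our work.

```python
import numpy as np

LOG_FRAC_BITS = 4  # hypothetical number of fractional bits in the log domain

def to_log_domain(x):
    """Quantize |x| to a fixed-point log2 representation, keeping the sign."""
    sign = np.sign(x)
    log_mag = np.log2(np.abs(x) + 1e-30)
    log_fixed = np.round(log_mag * (1 << LOG_FRAC_BITS)) / (1 << LOG_FRAC_BITS)
    return sign, log_fixed

def log_domain_multiply(a, b):
    """Multiply elementwise: the multiply itself is an addition of log magnitudes;
    the result is converted back to the linear domain for accumulation."""
    sa, la = to_log_domain(a)
    sb, lb = to_log_domain(b)
    return sa * sb * np.exp2(la + lb)

# Example: an approximate dot product, as in a neuron's forward pass
x = np.random.randn(8).astype(np.float32)
w = np.random.randn(8).astype(np.float32)
approx = np.sum(log_domain_multiply(x, w))
exact = np.dot(x, w)
print(f"exact={exact:.4f}  log-domain approx={approx:.4f}")
```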

FIRE: FInancial Relation Extraction Dataset

We publish a new dataset called FIRE: FInancial Relation Extraction. The dataset comprises named entity and relation labels extracted from financial documents (e.g., 10-K and 10-Q filings) and financial news articles (e.g., Yahoo Finance news). The intended task is supervised relation extraction, a multi-label, multi-class classification problem. In addition to defining the class labels, we developed detailed annotation guidelines and supervised the development of a labeling tool to speed up the annotation process. The dataset is expected to be released soon.
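For illustration only, the snippet below shows one way a labeled instance for this kind of task could be structured; the field names, entity types, and relation label are hypothetical and do not reflect the released schema.

```python
# Hypothetical relation-extraction instance (illustrative, not the FIRE schema).
sample = {
    "text": "Acme Corp reported revenue of $1.2 billion in its 10-K filing.",
    "entities": [
        {"span": [0, 9],   "type": "Company"},        # "Acme Corp"
        {"span": [30, 42], "type": "MonetaryValue"},   # "$1.2 billion"
    ],
    "relations": [
        # multi-label, multi-class: each entity pair may carry several labels
        {"head": 0, "tail": 1, "labels": ["has_revenue"]},
    ],
}
```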

A combined SSL-ACL Paradigm to train Relation Extraction Models on a Low Data Budget

We investigate the problem of training RE models in the low-data regime. To achieve higher performance, the model should be very selective about which data samples to train on and in what order. To this end, we develop a new paradigm that combines Semi-Supervised Learning (SSL) and Active Learning (ACL) with Curriculum Learning. We propose two RE-specific metrics: one to judge a sample's difficulty and another to judge the model's confidence in its prediction. We conduct experiments with state-of-the-art RE models across several RE datasets from the literature. A rough sketch of how such metrics could drive sample selection is shown below.
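The snippet below is a self-contained sketch of one selection round under this kind of paradigm; the confidence and difficulty scores are random placeholders standing in for the two proposed metrics, and the threshold and budget values are assumptions rather than values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_unlabeled = 1000
difficulty = rng.random(n_unlabeled)   # higher = harder sample (assumed metric)
confidence = rng.random(n_unlabeled)   # higher = more confident prediction

CONF_THRESHOLD = 0.9   # placeholder pseudo-labeling threshold
LABEL_BUDGET = 50      # placeholder annotation budget per round

# SSL: pseudo-label samples the current model is already confident about.
pseudo_label_ids = np.flatnonzero(confidence >= CONF_THRESHOLD)

# Active learning: spend the annotation budget on the least confident samples.
query_ids = np.argsort(confidence)[:LABEL_BUDGET]

# Curriculum: within the selected pool, present easier samples first.
selected = np.concatenate([pseudo_label_ids, query_ids])
curriculum_order = selected[np.argsort(difficulty[selected])]

print(f"{len(pseudo_label_ids)} pseudo-labeled, {len(query_ids)} queried for labels")
```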


Selected Publications

William Chang, Hassan Hamad, Keith M. Chugg. 2022 Asilomar Conference on Signals, Systems, and Computers, 2022.

Mari Kobayashi, Hassan Hamad, Gerhard Kramer, Giuseppe Caire. 2019 IEEE International Symposium on Information Theory (ISIT), 270-274, 2019.

Ghassan M Kraidy, Hassan Hamad. 2019 16th Canadian Workshop on Information Theory (CWIT), 2019.

Wissam Hamad, Marwan Bou Sanayeh, Tobias Siepelmeyer, Hassan Hamad, Werner HE Hofmann. IEEE Photonics Journal, 2019.

Hassan Hamad, Ghassan M Kraidy. 2017 15th Canadian Workshop on Information Theory (CWIT), 2017.