Training Convolutional Neural Networks using Logarithmic Computations
Floating-point numbers are the go-to format for numerical computation: they deliver accurate results over a large dynamic range. This comes at the cost of
a larger memory footprint and higher computational complexity than fixed-point arithmetic, which in turn offers lower precision and a narrower range and therefore performs only approximate computations.
Exploiting the fact that deep learning is inherently noise-tolerant and does not require exact computations, we show that training neural networks in a fixed-point, logarithmic-domain number format is possible
with negligible accuracy degradation and a significant gain in silicon area.
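As a rough illustration of the idea, the NumPy sketch below quantizes values to signed powers of two so that every multiplication reduces to an integer addition of exponents. The bit width, rounding scheme, and function names are assumptions made for exposition, not the actual design:

```python
# A minimal sketch of log-domain fixed-point arithmetic (assumed bit width
# and rounding; the summary above does not specify the actual design).
import numpy as np

def log_quantize(x, exp_bits=5):
    """Quantize x to sign * 2^e, with e a small signed-integer exponent."""
    sign = np.sign(x)
    # Round log2(|x|) to the nearest integer exponent; clip to the range
    # representable by an `exp_bits`-bit signed exponent.
    e = np.round(np.log2(np.abs(x) + 1e-30))
    e = np.clip(e, -(2 ** (exp_bits - 1)), 2 ** (exp_bits - 1) - 1)
    return sign, e.astype(np.int32)

def log_multiply(sign_a, exp_a, sign_b, exp_b):
    """In the log domain, a multiply is just an integer addition of exponents."""
    return sign_a * sign_b, exp_a + exp_b

# Example: a dot product in which every multiplication became an addition.
w = np.array([0.24, -1.9, 0.06])
x = np.array([0.5, 0.13, -3.1])
sw, ew = log_quantize(w)
sx, ex = log_quantize(x)
s, e = log_multiply(sw, ew, sx, ex)
print(np.sum(s * np.exp2(e.astype(np.float64))), "vs exact", np.dot(w, x))
```

In hardware, replacing each multiplier with a small integer adder is exactly what produces the area savings claimed above.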
FIRE: A FInancial Relation Extraction Dataset
We publish a new dataset called FIRE (FInancial Relation Extraction). The dataset comprises named-entity and relation labels extracted from financial documents (e.g., 10-K and 10-Q filings) and
financial news articles (e.g., Yahoo Finance news). The intended task is supervised relation extraction, a multi-label, multi-class classification problem.
In addition to defining the class labels, we developed detailed annotation guidelines and supervised the development of a labeling tool to speed up the labeling process. The dataset is expected to be released soon.
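For illustration only, a sample in such a dataset might be laid out as follows; since the dataset is not yet released, every field name, entity, and label here is hypothetical:

```python
# A purely hypothetical illustration of what a FIRE-style sample might look
# like; the dataset schema has not been released, so every field name and
# label below is an assumption made for exposition.
sample = {
    "tokens": ["AcmeCorp", "acquired", "BetaSoft", "for", "$3", "billion", "."],
    "entities": [
        {"id": "e1", "span": [0, 1], "type": "ORG"},    # "AcmeCorp"
        {"id": "e2", "span": [2, 3], "type": "ORG"},    # "BetaSoft"
        {"id": "e3", "span": [4, 6], "type": "MONEY"},  # "$3 billion"
    ],
    # Multi-label, multi-class: each ordered entity pair can carry
    # zero or more relation labels.
    "relations": [
        {"head": "e1", "tail": "e2", "label": "acquired"},
        {"head": "e2", "tail": "e3", "label": "valued_at"},
    ],
}
```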
A Combined SSL-ACL Paradigm to Train Relation Extraction Models on a Low Data Budget
We investigate the problem of training relation extraction (RE) models in the low-data regime. To achieve higher performance, the model must be highly selective about which data samples it trains on and in what order. To this end,
we develop a new paradigm that combines Semi-Supervised Learning (SSL) and Active Learning (ACL) with Curriculum Learning. We propose two RE-specific metrics: one to judge a sample's difficulty and another to judge
the model's confidence in its own prediction. We conduct experiments using state-of-the-art RE models across several RE datasets from the literature.
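A minimal sketch of how such a training loop could fit together is shown below, assuming placeholder scoring functions difficulty(sample) and confidence(model, sample); the paper's actual RE-specific metrics, thresholds, and scheduling are not reproduced here:

```python
# A minimal sketch of the combined SSL + active-learning + curriculum loop.
# `difficulty` and `confidence` stand in for the two RE-specific metrics;
# `oracle` is a human annotator. All names here are placeholders.

def train_low_budget(model, labeled, unlabeled, oracle,
                     difficulty, confidence,
                     rounds=10, query_size=16, conf_threshold=0.95):
    """labeled: list of (sample, label) pairs; unlabeled: list of samples."""
    for _ in range(rounds):
        # Curriculum learning: (re)train on labeled pairs ordered easy-to-hard.
        model.fit(sorted(labeled, key=lambda pair: difficulty(pair[0])))

        scored = [(s, confidence(model, s)) for s in unlabeled]

        # SSL: adopt the model's own predictions on high-confidence samples.
        confident = [s for s, c in scored if c >= conf_threshold]
        labeled += [(s, model.predict(s)) for s in confident]

        # Active learning: spend the labeling budget on the least
        # confident samples by querying the human oracle.
        uncertain = [s for s, _ in sorted(scored, key=lambda sc: sc[1])[:query_size]]
        labeled += [(s, oracle(s)) for s in uncertain]

        # Remove everything that was labeled this round from the pool.
        taken = {id(s) for s in confident} | {id(s) for s in uncertain}
        unlabeled = [s for s in unlabeled if id(s) not in taken]
    return model
```

Each round, the curriculum orders the labeled pool easy-to-hard, SSL pseudo-labels the samples the model is most sure of, and active learning spends the annotation budget on the samples it is least sure of.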