Feature enrichment through multi-gram models

File: paper0099.pdf
Size: 1.14 MB
Format: Adobe PDF

Item Summary

Title: Feature enrichment through multi-gram models
Authors: Forss, Thomas
Keywords: Big Data and Analytics: Pathways to Maturity; feature enrichment; feature selection; multi-gram; n-gram; text classification
Date Issued: 03 Jan 2018
Abstract: We introduce a feature enrichment approach based on multi-gram cosine similarity classification models. Our approach combines cosine similarity features from N-gram word models of different orders with unsupervised sentiment features, yielding a richer feature set than any of these approaches provides alone. We test the classification models with different machine learning algorithms on categories of hateful and violent web content, and show that our multi-gram models give across-the-board performance improvements for all tested categories compared with combinations of baseline unigram, N-gram, and sentiment classification models. Our multi-gram models also perform significantly better on highly imbalanced sets than the comparison methods, and the enrichment approach leaves room for further improvement, since it adds optimization options rather than exhausting them.
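The enrichment idea described in the abstract, concatenating cosine-similarity scores from N-gram models of several orders with a sentiment score into one feature vector, can be sketched as follows. This is a minimal illustration under assumed details, not the paper's implementation: the tokenization, the per-category reference text, and all function and parameter names here are assumptions.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    # Produce the list of word n-grams from a token list.
    return [" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def cosine(a, b):
    # Cosine similarity between two Counter term-frequency vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def multigram_features(doc, reference, max_n=3, sentiment_fn=None):
    # One cosine-similarity feature per n-gram order (unigram, bigram, ...)
    # against a category reference text, optionally concatenated with an
    # unsupervised sentiment score -- the "enriched" feature vector.
    d_tokens = doc.lower().split()
    r_tokens = reference.lower().split()
    feats = [
        cosine(Counter(ngrams(d_tokens, n)), Counter(ngrams(r_tokens, n)))
        for n in range(1, max_n + 1)
    ]
    if sentiment_fn is not None:
        feats.append(sentiment_fn(doc))  # hypothetical sentiment scorer
    return feats
```

The resulting vector would then be fed to whatever downstream classifier is being evaluated; in this sketch, higher-order n-gram similarities capture phrase-level overlap that a unigram model alone misses.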
Pages/Duration: 9 pages
Rights: Attribution-NonCommercial-NoDerivatives 4.0 International
Appears in Collections: Big Data and Analytics: Pathways to Maturity


This item is licensed under a Creative Commons License.