Feature enrichment through multi-gram models

dc.contributor.author Forss, Thomas
dc.date.accessioned 2017-12-28T00:40:47Z
dc.date.available 2017-12-28T00:40:47Z
dc.date.issued 2018-01-03
dc.description.abstract We introduce a feature enrichment approach by developing multi-gram cosine similarity classification models. Our approach combines the cosine similarity features of different N-gram word models with unsupervised sentiment features, yielding models with a richer feature set than either approach can provide alone. We test the classification models with different machine learning algorithms on categories of hateful and violent web content, and show that our multi-gram models give across-the-board performance improvements, for all categories tested, compared with combinations of baseline unigram, N-gram, and sentiment classification models. Our multi-gram models perform significantly better on highly imbalanced sets than the comparison methods, and the enrichment approach leaves room for further improvement, since it adds optimization options rather than exhausting them.
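The core idea in the abstract, one cosine similarity feature per N-gram order, comparing a document against a category's text, can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: the whitespace tokenizer, raw term counts (rather than, e.g., TF-IDF weights), and the function names are all assumptions, and the sentiment features mentioned in the abstract are omitted.

```python
from collections import Counter
from math import sqrt

def ngrams(tokens, n):
    """Word n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def cosine(a, b):
    """Cosine similarity between two sparse count vectors (Counters)."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def multigram_features(doc, category_docs, max_n=3):
    """One cosine-similarity feature per n-gram order 1..max_n,
    comparing the document against the category's pooled text.
    The resulting vector would then be enriched with sentiment
    features and fed to a downstream classifier."""
    doc_toks = doc.lower().split()
    cat_toks = " ".join(category_docs).lower().split()
    return [cosine(Counter(ngrams(doc_toks, n)),
                   Counter(ngrams(cat_toks, n)))
            for n in range(1, max_n + 1)]
```

For example, `multigram_features(page_text, violent_category_pages)` would yield a three-element feature vector (unigram, bigram, and trigram similarity), each value in [0, 1].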
dc.format.extent 9 pages
dc.identifier.doi 10.24251/HICSS.2018.099
dc.identifier.isbn 978-0-9981331-1-9
dc.identifier.uri http://hdl.handle.net/10125/49986
dc.language.iso eng
dc.relation.ispartof Proceedings of the 51st Hawaii International Conference on System Sciences
dc.rights Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri https://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject Big Data and Analytics: Pathways to Maturity
dc.subject feature enrichment, feature selection, multi gram, n-gram, text classifications
dc.title Feature enrichment through multi-gram models
dc.type Conference Paper
dc.type.dcmi Text