Show simple item record

dc.contributor.author: Kumar, A.
dc.contributor.author: Singh, J.P.
dc.contributor.author: Dwivedi, Y.K.
dc.contributor.author: Rana, Nripendra P.
dc.date.accessioned: 2020-01-03T17:11:01Z
dc.date.accessioned: 2020-01-08T10:19:52Z
dc.date.available: 2020-01-03T17:11:01Z
dc.date.available: 2020-01-08T10:19:52Z
dc.date.issued: 2022
dc.identifier.citation: Kumar A, Singh JP, Dwivedi YK et al (2022) A deep multi-modal neural network for informative Twitter content classification during emergencies. Annals of Operations Research. 319: 791-822. [en_US]
dc.identifier.uri: http://hdl.handle.net/10454/17558
dc.description: Yes
dc.description.abstract: People start posting tweets containing texts, images, and videos as soon as a disaster hits an area. The analysis of these disaster-related tweet texts, images, and videos can help humanitarian response organizations in better decision-making and prioritizing their tasks. Finding informative content that can aid decision-making amid the massive volume of Twitter content is a difficult task and requires a system to filter out the informative content. In this paper, we present a multi-modal approach to identify disaster-related informative content from Twitter streams using text and images together. Our approach is based on long short-term memory (LSTM) and VGG-16 networks and shows a significant improvement in performance, as evident from the validation results on seven different disaster-related datasets. The F1-score ranged from 0.74 to 0.93 when tweet texts and images were used together, whereas with tweet text alone it ranged from 0.61 to 0.92. From these results, it is evident that the proposed multi-modal system performs significantly well in identifying disaster-related informative social media content. [en_US]
dc.language.iso: en [en_US]
dc.rights: © Springer Science+Business Media, LLC, part of Springer Nature 2020. Reproduced in accordance with the publisher's self-archiving policy. The final publication is available at Springer via https://doi.org/10.1007/s10479-020-03514-x
dc.subject: Disaster
dc.subject: Twitter
dc.subject: LSTM
dc.subject: VGG-16
dc.subject: Social media
dc.subject: Tweets
dc.title: A deep multi-modal neural network for informative Twitter content classification during emergencies [en_US]
dc.status.refereed: Yes
dc.date.application: 16/01/2020
dc.type: Article
dc.type.version: Accepted manuscript
dc.identifier.doi: https://doi.org/10.1007/s10479-020-03514-x
dc.date.updated: 2020-01-03T17:11:09Z
refterms.dateFOA: 2020-01-08T10:20:55Z
dc.openaccess.status: openAccess
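The abstract describes a multi-modal architecture that pairs an LSTM text encoder with a VGG-16 image encoder and fuses the two feature vectors before classification. The sketch below is a minimal, schematic illustration of that fusion step only: random vectors stand in for the trained encoder outputs, and all names and dimensions (`TEXT_DIM`, `IMG_DIM`, `fuse_and_classify`) are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative feature sizes -- assumptions, not the paper's settings.
TEXT_DIM = 128   # stand-in for the final LSTM hidden state of the tweet text
IMG_DIM = 4096   # stand-in for a VGG-16 fully-connected feature of the image
N_CLASSES = 2    # informative vs. not informative

def fuse_and_classify(text_feat, img_feat, W, b):
    """Concatenate the two modality features and apply a softmax layer."""
    fused = np.concatenate([text_feat, img_feat])   # early (feature-level) fusion
    logits = W @ fused + b
    exp = np.exp(logits - logits.max())             # numerically stable softmax
    return exp / exp.sum()

# Random stand-ins for trained encoder outputs and classifier weights.
text_feat = rng.standard_normal(TEXT_DIM)
img_feat = rng.standard_normal(IMG_DIM)
W = rng.standard_normal((N_CLASSES, TEXT_DIM + IMG_DIM)) * 0.01
b = np.zeros(N_CLASSES)

probs = fuse_and_classify(text_feat, img_feat, W, b)
print(probs.shape, float(probs.sum()))
```

In a real implementation the two branches would be trained jointly, so the classifier learns how much to weight each modality; the concatenation shown here is the simplest common fusion choice.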


Item file(s)

Name: Rana_Annals_of_Operations_Rese ...
Size: 877.0Kb
Format: PDF
Name: KumarEtAl2020.docx
Size: 1.659Mb
Format: Microsoft Word 2007
Description: To keep suppressed

This item appears in the following Collection(s)
