First, we scrape tweets, images, and voice messages from social media platforms and
store them in a database.
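The storage step might look like the following minimal sketch, which keeps scraped posts in SQLite; the table layout and field names here are assumptions for illustration, not the project's actual schema.

```python
# Minimal sketch: storing scraped social-media posts in SQLite.
# The table layout and field names are illustrative assumptions.
import sqlite3

def store_posts(conn, posts):
    """Insert scraped posts (dicts with platform, kind, content) into the DB."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS posts "
        "(id INTEGER PRIMARY KEY, platform TEXT, kind TEXT, content TEXT)"
    )
    conn.executemany(
        "INSERT INTO posts (platform, kind, content) VALUES (?, ?, ?)",
        [(p["platform"], p["kind"], p["content"]) for p in posts],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
store_posts(conn, [
    {"platform": "twitter", "kind": "text", "content": "#flood water rising"},
    {"platform": "instagram", "kind": "image", "content": "img_001.jpg"},
])
print(conn.execute("SELECT COUNT(*) FROM posts").fetchone()[0])  # → 2
```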
Then, using the scraped tweets, we perform sentiment analysis on their hashtags and
surface the risk of disaster at a particular place.
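A simple version of this hashtag-based risk scoring could look like the sketch below; the hashtag lexicon and its weights are illustrative assumptions, not the tuned values the project uses.

```python
# Sketch of hashtag-based risk scoring over a batch of scraped tweets.
# The hashtag lexicon and weights are illustrative assumptions.
RISK_HASHTAGS = {"#flood": 3, "#earthquake": 3, "#rescue": 2, "#help": 1}

def risk_score(tweets):
    """Sum the risk weights of known hashtags across a batch of tweets."""
    score = 0
    for tweet in tweets:
        for token in tweet.lower().split():
            score += RISK_HASHTAGS.get(token, 0)
    return score

tweets = ["#flood in our street #help", "roads blocked #rescue"]
print(risk_score(tweets))  # 3 + 1 + 2 = 6
```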
Next, from the extracted live voice messages, we estimate the seriousness of the
disaster using a lexicon-based method.
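The lexicon method can be sketched as follows, assuming the voice messages have already been transcribed to text; the severity lexicon here is a made-up example, not the project's actual word list.

```python
# Lexicon-based seriousness scoring of transcribed voice messages.
# Assumes speech-to-text has already run; the lexicon is illustrative.
SEVERITY_LEXICON = {"trapped": 3, "injured": 3, "flooded": 2, "stranded": 2, "safe": -2}

def seriousness(transcript):
    """Score a transcript by summing severity weights of matching words."""
    return sum(SEVERITY_LEXICON.get(w, 0) for w in transcript.lower().split())

print(seriousness("we are trapped and the house is flooded"))  # 3 + 2 = 5
```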
The images we scrape from social media platforms may not all be genuine. So we use
a deep learning algorithm to detect fake images, i.e., to determine whether a
picture posted by people from disaster-prone areas is edited or real.
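The actual check is a deep learning classifier; as a stand-in to show where it slots into the pipeline, the sketch below uses a trivial EXIF-metadata heuristic instead. The field names and editor list are assumptions, and this heuristic is explicitly not the CNN the project describes.

```python
# Stand-in for the fake-image check. The project uses a deep learning
# classifier; this trivial EXIF heuristic only illustrates the pipeline
# shape. Field names and the editor list are assumptions.
EDITING_SOFTWARE = {"adobe photoshop", "gimp", "pixlr"}

def looks_edited(exif):
    """Flag an image whose EXIF 'software' tag names a known photo editor."""
    software = exif.get("software", "").lower()
    return any(editor in software for editor in EDITING_SOFTWARE)

print(looks_edited({"software": "Adobe Photoshop 2024"}))  # True
print(looks_edited({"software": "Apple iOS 17"}))          # False
```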
We combine all these features and algorithms to build our outcome: a finely tuned
custom model.
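One way the per-signal scores could be blended into a single risk estimate is sketched below; the weights and the discounting by the share of real images are illustrative assumptions, not the tuned parameters of the actual model.

```python
# Sketch of blending the per-signal scores into one disaster-risk estimate.
# The weights and the real-image discount are illustrative assumptions.
def combined_risk(hashtag_score, voice_score, real_image_ratio,
                  w_text=0.4, w_voice=0.4, w_image=0.2):
    """Weighted blend; real_image_ratio discounts places dominated by fakes."""
    return (w_text * hashtag_score
            + w_voice * voice_score
            + w_image * real_image_ratio * 10)

print(round(combined_risk(6, 5, 0.8), 2))  # 0.4*6 + 0.4*5 + 0.2*8 = 6.0
```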
The real-time live data (tweets posted on Twitter, and images and voice messages
shared on Instagram and Facebook) that is regularly posted is fed to the trained
model and classified according to the algorithms we used to build it.
The final step is to publish these details on our web portal so that they can be
accessed by users across the world and provide help for people at risk.
To interface our model's output with the website, we use a Python REST API.
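A minimal sketch of such an endpoint, using only the Python standard library, is shown below; the route, payload fields, and sample values are assumptions for illustration, and a real deployment would likely use a framework such as Flask or FastAPI instead.

```python
# Minimal REST-endpoint sketch exposing a model's risk report as JSON,
# stdlib only. Route, payload fields, and values are assumptions.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

RISK_REPORT = {"place": "Chennai", "risk": "high", "score": 6.0}

class RiskHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/risk":
            body = json.dumps(RISK_REPORT).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RiskHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

with urlopen(f"http://127.0.0.1:{server.server_port}/api/risk") as resp:
    payload = json.loads(resp.read())
print(payload)
server.shutdown()
```

The website's Node.js front end would then consume this JSON over HTTP in the usual way.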
The website is built using Node.js, which gives users a clear interface for
understanding the seriousness of the situation of people at risk and helping them
efficiently, for example by helping NGOs allocate optimal funds to rescue people
from those disaster-prone areas.
Hence, our product helps meet the social needs of people during sudden disasters.