Jack Geraghty
Title: Multimodal Data Wrangling for Real-Time IoT-Based Urban Emergency Response Systems
Supervision Team: Fatemeh Golpayegani, UCD / Andrew Hines, UCD / Rob Brennan, UCD
Description: Emergency Response Systems (ERS) enable the rapid location of emergencies and the deployment of resources by emergency response teams. Historically, this has relied on an emergency call from a person at the scene. Technology advancements in urban areas and so-called smart cities mean that Internet of Things-enabled infrastructure can offer a “single strike” data dump of multimodal information to the ERS. For example, in a vehicle collision, information regarding crash severity, number of passengers, fuel type, etc. can be gathered from in-place cross-platform sensors, including vehicle or smartphone audio and accelerometer sensors, traffic cameras, etc. This information may be valuable to fire crews, ER staff and other members of the response team. The technical challenges to be addressed by this project will focus on audio and video processing, data collection and curation, and applying data-driven learning (e.g. deep learning and knowledge graphs) to cross-platform knowledge models. The student will identify and prioritise data sources, build a framework to integrate and generalise multimodal data, and demonstrate how multiple platforms can assist in real-time ERS decision making.
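
The sketch below is a purely illustrative Python fragment, not part of the project specification, showing the kind of multimodal integration the description envisages: hypothetical record types for audio, accelerometer, and camera-derived observations are combined by a toy late-fusion rule into a single incident summary. All type names, field names, and thresholds are assumptions chosen for illustration.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical, simplified observation types; real sensor payloads would be
# richer and platform-specific (vehicle bus data, smartphone APIs, camera feeds).

@dataclass
class AudioObservation:
    source: str                   # e.g. "smartphone", "vehicle"
    timestamp: float              # seconds since epoch
    crash_acoustic_score: float   # assumed output of an audio event classifier, 0..1

@dataclass
class AccelerometerObservation:
    source: str
    timestamp: float
    peak_g: float                 # peak deceleration in g

@dataclass
class CameraObservation:
    source: str
    timestamp: float
    vehicles_detected: int        # assumed output of an object detector

@dataclass
class IncidentReport:
    location: tuple               # (latitude, longitude)
    severity_estimate: str        # "minor" | "moderate" | "severe"
    evidence: List[str] = field(default_factory=list)

SEVERITY_ORDER = ["minor", "moderate", "severe"]

def fuse(location, audio=None, accel=None, camera=None) -> IncidentReport:
    """Toy late-fusion rule: combine per-modality cues into one incident report.
    Thresholds are placeholders, not validated values."""
    evidence = []
    severity = "minor"
    if accel and accel.peak_g > 8.0:  # assumed high-impact threshold
        severity = "severe"
        evidence.append(f"accelerometer peak {accel.peak_g:.1f} g from {accel.source}")
    if audio and audio.crash_acoustic_score > 0.7:
        severity = max(severity, "moderate", key=SEVERITY_ORDER.index)
        evidence.append(f"crash-like audio (score {audio.crash_acoustic_score:.2f}) from {audio.source}")
    if camera and camera.vehicles_detected > 1:
        evidence.append(f"{camera.vehicles_detected} vehicles in traffic-camera view from {camera.source}")
    return IncidentReport(location=location, severity_estimate=severity, evidence=evidence)

if __name__ == "__main__":
    report = fuse(
        location=(53.3498, -6.2603),
        audio=AudioObservation("smartphone", 1700000000.0, 0.82),
        accel=AccelerometerObservation("vehicle", 1700000000.2, 9.4),
        camera=CameraObservation("traffic-camera-17", 1700000001.0, 2),
    )
    print(report)
```

In practice the project would replace these hand-written thresholds with learned models and link the fused records into a shared knowledge model, but the example illustrates why a common schema across platforms is the first step.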