Pascal Schlaak | Project 11

XAI for TSC based on DL of time-frequency domain

The last module of my master's degree was a practical project in the research area of my supervising professor. Explainable AI (XAI) is a very current topic, especially for the classification of time series. Unlike images, which depict for example animal species, time series are much harder for humans to interpret. In this project I worked on XAI for time series classification based on Deep Learning. In contrast to other research, I investigated a naive approach that uses images of the time-frequency domain as the input to different DL models.

In my experiments I used four different DL approaches: an MLP, a flat CNN of my own design, ResNet50V2, and InceptionV3. As a basis I used 30 data sets from the UCR Time Series Classification Archive and generated spectrograms and scalograms for all entries. In total, I trained and evaluated 1080 models. I also proposed and evaluated an XAI method that is better suited to explaining DL approaches based on spectrograms and scalograms. To make the results of the project accessible, I created a web application that illustrates DL on spectrograms/scalograms and lets you browse all results. As part of the project requirements I also wrote a short scientific paper, which you can find here. I also presented my results at work; you can find a presentation about the whole project right here.
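As a rough illustration of the preprocessing step, the sketch below shows how a single time series could be turned into a spectrogram and a scalogram with SciPy. This is not the original pipeline; the signal, sampling rate, and transform parameters are illustrative assumptions.

```python
# Minimal sketch: time series -> spectrogram and scalogram with SciPy.
# Signal, sampling rate, and parameters are assumptions, not the project's setup.
import numpy as np
from scipy import signal

fs = 100.0                                    # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.random.randn(t.size)  # toy series

# Spectrogram: short-time Fourier transform magnitudes over time/frequency bins
f, seg_t, Sxx = signal.spectrogram(x, fs=fs, nperseg=64, noverlap=32)

# Scalogram: continuous wavelet transform coefficients (Ricker wavelet);
# note that scipy.signal.cwt is deprecated in recent SciPy releases
widths = np.arange(1, 31)
cwt_coeffs = signal.cwt(x, signal.ricker, widths)

print(Sxx.shape, cwt_coeffs.shape)            # (freq bins, time bins), (widths, samples)
```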

Overall, I could show that my approaches cannot compete with existing ones, although there are two data sets where they achieve better results. Acceptance of spectrograms and scalograms is also lower than acceptance of the time domain because of their complexity. Nevertheless, my proposed XAI method is more suitable than the underlying method for this use case.

# Technologies

For this project I also used Python. For building the classifiers, reading and preprocessing the data, and training, I used TensorFlow and SciPy. One difficulty was loading all the data in Colab, where the ImageDataGenerator class and its flow_from_dataframe() method helped a lot. For the web app I used Dash, which I'm currently still hosting on Heroku. For the statistical evaluation of the results I used SciPy.
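To give an idea of how the image loading can work, here is a minimal sketch of flow_from_dataframe(), assuming a DataFrame with hypothetical "filepath" and "label" columns that point to the generated spectrogram/scalogram images; paths and parameters are illustrative, not the original training setup.

```python
# Minimal sketch: feeding spectrogram/scalogram images from a DataFrame
# into Keras. File paths below are hypothetical placeholders.
import pandas as pd
from tensorflow.keras.preprocessing.image import ImageDataGenerator

df = pd.DataFrame({
    "filepath": ["images/sample_0.png", "images/sample_1.png"],  # hypothetical paths
    "label": ["class_a", "class_b"],
})

datagen = ImageDataGenerator(rescale=1.0 / 255, validation_split=0.2)

train_gen = datagen.flow_from_dataframe(
    dataframe=df,
    x_col="filepath",
    y_col="label",
    target_size=(224, 224),     # input size expected by e.g. ResNet50V2
    class_mode="categorical",
    batch_size=32,
    subset="training",
)
# train_gen can then be passed directly to model.fit(train_gen, ...)
```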

# Usage

My complete code is private, but you can get more insight into the project through the web app. It visualizes the general results of my work and is divided into three sections. The first section shows all data sets used in their different representations (time series, spectrograms, scalograms), with one sample visualized per class. The second section displays the heatmaps of my proposed method. The third section shows the accuracies of the individual models and representations per class as well as the most important areas.
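For orientation, the sketch below shows how a three-section layout like this could be structured in Dash; the tab labels mirror the sections described above, but the contents are placeholders and not the actual app.

```python
# Minimal sketch of a three-tab Dash layout; placeholder contents only.
from dash import Dash, dcc, html

app = Dash(__name__)

app.layout = html.Div([
    html.H1("XAI for Time Series Classification"),
    dcc.Tabs([
        dcc.Tab(label="Data sets", children=[
            html.P("One sample per class as time series, spectrogram, scalogram."),
        ]),
        dcc.Tab(label="Heatmaps", children=[
            html.P("Heatmaps produced by the proposed XAI method."),
        ]),
        dcc.Tab(label="Results", children=[
            html.P("Accuracies per model and representation, most important areas."),
        ]),
    ]),
])

if __name__ == "__main__":
    app.run_server(debug=True)
```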

Current status: Online

https://xai-dl.herokuapp.com/

Since I'm using Heroku's free plan, computing resources are very limited, but loading times should be fine. On your first visit, Heroku has to spin up the app, which takes about half a minute.

Context: Lecture

Tools: Python, Dash, Pandas, TensorFlow, SciPy

Contributors: None

Published: Aug. 15, 2021

Source: GitHub