This project integrates a novel Sparse CNN (SCNN) accelerator architecture, aimed at improving the performance and energy efficiency of Convolutional Neural Networks (CNNs), into SCALE-Sim. SCALE-Sim is a systolic array simulator that helps designers tune accelerator parameters for diverse models and perform Design Space Exploration (DSE). However, its reported performance metrics do not account for sparsity: its compute-cycle counts include multiplications in which one or both operands are zero. Eliminating these redundant computations reduces the overall compute cycles. We investigate how the sparsity percentages of the IFMAP and filter matrices affect both compute cycles and mapping efficiency.
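As a first-order illustration of why this matters (a sketch, not SCALE-Sim's internal accounting), the number of useful multiplications in a convolution layer can be estimated by scaling its dense MAC count by the probability that neither operand is zero. The function and parameter names below are ours, and the model assumes zeros are distributed independently and uniformly.

    def dense_macs(ifmap_h, ifmap_w, filt_h, filt_w, channels=1, num_filters=1, stride=1):
        # Dense MAC count of a convolution layer (no padding assumed).
        ofmap_h = (ifmap_h - filt_h) // stride + 1
        ofmap_w = (ifmap_w - filt_w) // stride + 1
        return ofmap_h * ofmap_w * filt_h * filt_w * channels * num_filters

    def effective_macs(dense, ifmap_sparsity_pct, filter_sparsity_pct):
        # Expected multiplications with both operands non-zero, assuming
        # zeros fall independently and uniformly in the IFMAP and filter.
        p_nonzero = (1 - ifmap_sparsity_pct / 100.0) * (1 - filter_sparsity_pct / 100.0)
        return dense * p_nonzero

For example, at 50% sparsity in both the IFMAP and the filter, only about a quarter of the dense multiplications involve two non-zero operands, which bounds the compute-cycle savings an SCNN-style datapath can exploit.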
SCALE-Sim can be run using the scale.py script from the scalesim repository, providing the paths to the architecture configuration file, the topology descriptor CSV file, and the output log directory.
$ python3 scale.py -c <path_to_config_file> -t <path_to_topology_file> -p <path_to_output_log_dir>
For our tests, we have used scale.cfg from the configs directory, and the topology file is test.csv in topology/conv_nets.
Since this project is an extension of SCALE-Sim, in addition to the modifications in the cfg file, we have added a new parameter to the topology file test.csv that represents the sparsity of the IFMAP and filter matrices as a percentage.
The modifications made in the topology file are:
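The exact format of the added field is specific to our modified parser; as an illustration, assuming a single trailing Sparsity column appended to the standard SCALE-Sim topology header, a test.csv row could look like:

    Layer name, IFMAP Height, IFMAP Width, Filter Height, Filter Width, Channels, Num Filter, Strides, Sparsity,
    Conv1, 3, 3, 2, 2, 1, 1, 1, 0,

Here the layer name, channel count, filter count, and stride are placeholder values chosen to match the example below; only the Sparsity column is new.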

Here is an example of the output dumped to stdout when running test.csv with the following configuration: IFMAP dimensions 3x3, filter dimensions 2x2, array dimensions 3x3, and sparsity of 0 for both the filter and the input.
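For reference, with a 3x3 IFMAP, a 2x2 filter, unit stride, one channel, and one filter (the last three are assumptions for this illustration, not values read from test.csv), the layer produces a 2x2 OFMAP and 16 dense MACs; with sparsity set to 0, no multiplications can be skipped, so the SCNN compute cycles should match the dense baseline.

    # Dense-MAC check for the configuration above (stride, channel, and
    # filter counts are illustrative assumptions).
    ifmap_h = ifmap_w = 3
    filt_h = filt_w = 2
    stride = 1
    ofmap_h = (ifmap_h - filt_h) // stride + 1   # 2
    ofmap_w = (ifmap_w - filt_w) // stride + 1   # 2
    print(ofmap_h * ofmap_w * filt_h * filt_w)   # 16 dense MACs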

We have used the Total Compute Cycles (SCNN) variable to log the cycle count and the mapping efficiency. To measure the time consumed by the process, the script was run as "time python scale.py".
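Mapping efficiency is not defined identically across tools; a minimal sketch of the definition we assume here is the fraction of PE slots that carry useful work, averaged over the folds needed to run the layer on the array.

    def mapping_efficiency(useful_pe_mappings, rows, cols, num_folds):
        # Fraction of PE slots doing useful work across all folds.
        # useful_pe_mappings is an assumed input to this sketch, not a
        # value exposed by a SCALE-Sim API.
        return useful_pe_mappings / (rows * cols * num_folds)

Skipping zero-operand work can shrink either the useful mappings or the number of folds, which is why sparsity shows up in both the compute-cycle count and the mapping efficiency.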