
Efficient Sparse CNN Exploration with SCALE-Sim


This project integrates a novel Sparse CNN (SCNN) accelerator architecture into SCALE-Sim, with the aim of improving the performance and energy efficiency of Convolutional Neural Networks (CNNs). SCALE-Sim is a systolic array simulator that helps designers fine-tune accelerator parameters for executing diverse models and conducting Design Space Exploration (DSE). However, its ability to report optimal performance metrics for sparse CNNs is limited, because it counts compute cycles spent on multiplications in which one or both operands are zero. Eliminating these redundant computations reduces the overall compute cycle count. We investigate how the sparsity percentages of the input and filter matrices affect both compute cycles and mapping efficiency.
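As a first-order illustration of why skipping zero-operand products matters, the sketch below estimates the expected number of surviving multiplications when zeros are assumed to be independently and uniformly distributed in the IFMAP and filter. This is our own simplification, not SCALE-Sim's or the SCNN integration's actual cycle model, which also depends on the dataflow and array mapping.

# Minimal sketch (not the simulator's cycle model): expected number of MACs
# whose operands are both nonzero, assuming zeros are randomly distributed.
def effective_macs(total_macs, ifmap_sparsity, filter_sparsity):
    # Sparsities are fractions of zero-valued elements, in [0.0, 1.0].
    nonzero_fraction = (1.0 - ifmap_sparsity) * (1.0 - filter_sparsity)
    return round(total_macs * nonzero_fraction)

# Example: 16 dense MACs, 50% zeros in the IFMAP, 25% zeros in the filter.
print(effective_macs(16, 0.50, 0.25))  # -> 6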

(Figure: SCNN accelerator design.)

Launching a run

SCALE-Sim can be run using the scale.py script from the scalesim repository, providing the paths to the architecture configuration file and the topology descriptor CSV file.

$ python3 scale.py -c <path_to_config_file> -t <path_to_topology_file> -p <path_to_output_log_dir>

For our testing we used scale.cfg in the configs directory, and the topology file test.csv in topology/conv_nets. Since this project is an extension of SCALE-Sim, in addition to the modifications in the cfg file we added a new parameter to the topology file test.csv that represents the sparsity of the IFMAP and filter matrices as a percentage; a sketch of the modified topology format is shown below.
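For illustration, a row of the modified topology file might look like the line below. The first eight columns are SCALE-Sim's standard topology fields; the trailing Sparsity column (its name and position) is our assumption based on the description above and may differ from the actual file in the repository.

Layer name, IFMAP Height, IFMAP Width, Filter Height, Filter Width, Channels, Num Filter, Strides, Sparsity,
Conv1, 3, 3, 2, 2, 1, 1, 1, 0,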

Output

Here is an example of the output dumped to stdout when running test.csv with the following configuration: IFMAP dimensions 3x3, filter dimensions 2x2, array dimensions 3x3, and sparsity for both the filter and the input set to 0.
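For intuition, the dense work implied by this configuration can be worked out by hand; the snippet below assumes stride 1, no padding, a single input channel, and a single filter (assumptions of ours, not stated by the run).

# Worked example for the configuration above (assumed: stride 1, no padding,
# one channel, one filter). With zero sparsity, every MAC is a useful one.
ifmap_h = ifmap_w = 3
filt_h = filt_w = 2
ofmap_h = ifmap_h - filt_h + 1                    # 2
ofmap_w = ifmap_w - filt_w + 1                    # 2
dense_macs = ofmap_h * ofmap_w * filt_h * filt_w  # 4 outputs x 4 taps = 16
print(ofmap_h, ofmap_w, dense_macs)               # 2 2 16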

Results

We use the Total Compute Cycles (SCNN) variable to log the number of cycles and the mapping efficiency. To measure the wall-clock time consumed by the process, the script was run under time, i.e. "time python scale.py"; a full invocation is shown below.
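A full timed invocation using the file locations described above would look like the following (the exact paths depend on the layout of your checkout):

$ time python3 scale.py -c configs/scale.cfg -t topology/conv_nets/test.csv -p <path_to_output_log_dir>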

(Result figures: compute cycles and mapping efficiency across IFMAP and filter sparsity levels.)
