luca-parisi/hyper_sinh
hyper-sinh in TensorFlow, Keras, and PyTorch

An Accurate and Reliable Function from Shallow to Deep Learning

'hyper-sinh' is a custom activation function, implemented in Python for TensorFlow, Keras, and PyTorch, for use in both shallow and deep neural networks in Machine Learning- and Deep Learning-based classification. It is distributed under the CC BY 4.0 license.

Details on this function, its implementation, and its validation against gold-standard activation functions for both shallow and deep neural networks are available in the following paper: Parisi et al., 2021a.
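As a standalone sketch of the underlying formula (an assumption based on the paper's piecewise definition, not code extracted from this repository), hyper-sinh can be written as sinh(x)/3 for positive inputs and x³/4 otherwise:

```python
import math

def hyper_sinh(x: float) -> float:
    """hyper-sinh activation (assumed formulation:
    sinh(x) / 3 for x > 0, x ** 3 / 4 otherwise)."""
    if x > 0:
        return math.sinh(x) / 3
    return (x ** 3) / 4
```

The cubic branch keeps a non-zero gradient for negative inputs, while the scaled sinh branch grows faster than linearly for positive inputs.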

Dependencies

The dependencies are included in the environment.yml file. Run the following command to install the required version of Python (v3.9.16) and all dependencies in a conda virtual environment (replace <env_name> with your environment name):

  • conda env create --name <env_name> --file environment.yml

Usage

You can use the custom HyperSinh activation function in Keras or PyTorch as a layer:

Example of usage in a sequential model in Keras with a HyperSinh layer between a convolutional layer and a pooling layer

Either

model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), input_shape=(32, 32, 3)))
model.add(HyperSinh()) 
model.add(layers.MaxPooling2D((2, 2)))

or

model = keras.Sequential(
    [
        keras.Input(shape=(32, 32, 3)),

        layers.Conv2D(32, kernel_size=(3, 3)),
        HyperSinh(),

        layers.MaxPooling2D(pool_size=(2, 2)),
    ]
)
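For reference, a Keras layer like the `HyperSinh()` used above could be sketched by subclassing `layers.Layer`; the piecewise formulation below (sinh(x)/3 for positive inputs, x³/4 otherwise) is an assumption based on the paper, not the repository's exact implementation:

```python
import tensorflow as tf
from tensorflow.keras import layers

class HyperSinh(layers.Layer):
    """hyper-sinh as a Keras layer (assumed formulation:
    sinh(x) / 3 for x > 0, x ** 3 / 4 otherwise)."""

    def call(self, inputs):
        # tf.where applies the positive branch element-wise
        return tf.where(inputs > 0,
                        tf.math.sinh(inputs) / 3,
                        tf.pow(inputs, 3) / 4)
```

Because the layer has no trainable weights, it can be dropped between any two layers of a model, as in the examples above.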

Example of usage in a sequential model in PyTorch with a HyperSinh layer between a convolutional layer and a pooling layer

self.conv1 = nn.Conv2d(1, OUT_CHANNEL_CONV1, kernel_size=KERNEL_SIZE_CONV)
self.hyper_sinh1 = HyperSinh()
self.pool1 = nn.MaxPool2d(kernel_size=KERNEL_SIZE_MAX_POOL)
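Putting the snippet above into a complete module, a minimal runnable sketch might look as follows; the constants stand in for the README's `OUT_CHANNEL_CONV1`, `KERNEL_SIZE_CONV`, and `KERNEL_SIZE_MAX_POOL` placeholders, and the `HyperSinh` body assumes the piecewise formulation from the paper rather than the repository's exact code:

```python
import torch
import torch.nn as nn

class HyperSinh(nn.Module):
    """hyper-sinh as a PyTorch layer (assumed formulation:
    sinh(x) / 3 for x > 0, x ** 3 / 4 otherwise)."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x > 0, torch.sinh(x) / 3, x ** 3 / 4)

class SmallNet(nn.Module):
    """Conv -> hyper-sinh -> max-pool, with illustrative constants."""

    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 32, kernel_size=3)
        self.hyper_sinh1 = HyperSinh()
        self.pool1 = nn.MaxPool2d(kernel_size=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pool1(self.hyper_sinh1(self.conv1(x)))
```

For a 28×28 single-channel input, `SmallNet()(torch.randn(1, 1, 28, 28))` yields a tensor of shape (1, 32, 13, 13): the 3×3 convolution reduces 28 to 26, and the 2×2 pooling halves it to 13.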

Linting

isort is used to ensure a consistent order of imports, whilst autopep8 ensures the code adheres to PEP 8, via the following two commands respectively:

  • isort <folder_name>
  • autopep8 --in-place --recursive .

Citation request

If you are using this function, please cite the papers by Parisi et al.
