
Approximation of Activation Functions for Vector Equalization based on Recurrent Neural Networks

Mostafa, Mohamad and Teich, Werner and Lindner, Jürgen (2014) Approximation of Activation Functions for Vector Equalization based on Recurrent Neural Networks. In: International Symposium on Turbo Codes and Iterative Information Processing (ISTC'14), 18.-22. Aug. 2014, Bremen, Germany.


Abstract

Activation functions are an essential element of all neural network structures. Because of their nonlinear characteristic, they decisively influence the overall behavior of a neural network. Discrete-time and continuous-time recurrent neural networks are a special class of neural networks. They have been shown to be able to perform vector equalization without the need for a training phase, because they are Lyapunov stable under specific conditions. The activation function in this case depends on the symbol alphabet and is computationally complex to evaluate. In addition, numerical instability can occur during the evaluation. Thus, there is a need for a computationally less complex and numerically stable evaluation. Especially for the continuous-time recurrent neural network, the evaluation must be suitable for an analog implementation. In this paper, we introduce an approximation of the activation function for vector equalization with recurrent neural networks. The activation function is approximated as a sum of shifted hyperbolic tangent functions, which can easily be realized in analog form by a differential amplifier. Based on our ongoing research in this field, the analog implementation of vector equalization with recurrent neural networks is expected to improve the power/speed ratio by several orders of magnitude compared with a digital one.
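To illustrate the idea described in the abstract, the following sketch approximates a multilevel equalizer activation as a sum of shifted hyperbolic tangents. The symbol alphabet (4-ASK, {-3, -1, +1, +3}), the shift centers, and the slope parameter are illustrative assumptions for this sketch, not the paper's actual parameters.

```python
import math

# Assumed 4-ASK alphabet {-3, -1, +1, +3}: place one shifted tanh at each
# midpoint between adjacent symbols, so the sum forms a smooth staircase
# that saturates at the outer symbol values.
CENTERS = (-2.0, 0.0, 2.0)  # midpoints between adjacent symbols (assumed)
SLOPE = 0.5                 # steepness parameter (assumed)

def activation(x: float) -> float:
    """Sum of shifted tanh functions approximating a staircase activation."""
    return sum(math.tanh((x - c) / SLOPE) for c in CENTERS)

if __name__ == "__main__":
    for x in (-4.0, -1.0, 0.0, 1.0, 4.0):
        print(f"phi({x:+.1f}) = {activation(x):+.3f}")
```

Far from the decision boundaries the sum saturates near the symbol values (e.g. phi(4.0) is close to +3, phi(1.0) close to +1), which is what makes a bank of differential amplifiers, each realizing one tanh, a natural analog implementation.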

Item URL in elib: https://elib.dlr.de/89555/
Document Type: Conference or Workshop Item (Speech)
Title: Approximation of Activation Functions for Vector Equalization based on Recurrent Neural Networks
Authors:
Author | Institution or Email | ORCID iD
Mostafa, Mohamad | mohamad.mostafa (at) dlr.de | UNSPECIFIED
Teich, Werner | werner.teich (at) uni-ulm.de | UNSPECIFIED
Lindner, Jürgen | juergen.lindner (at) uni-ulm.de | UNSPECIFIED
Date: 2014
Journal or Publication Title: International Symposium on Turbo Codes and Iterative Information Processing (ISTC'14)
Refereed publication: Yes
Open Access: No
Gold Open Access: No
In SCOPUS: No
In ISI Web of Science: No
Status: Published
Keywords: vector equalization, recurrent neural network, function approximation
Event Title: International Symposium on Turbo Codes and Iterative Information Processing (ISTC'14)
Event Location: Bremen, Germany
Event Type: International Conference
Event Dates: 18.-22. Aug. 2014
Organizer: Jacobs University Bremen
HGF - Research field: Aeronautics, Space and Transport
HGF - Program: Space
HGF - Program Themes: Communication and Navigation
DLR - Research area: Space
DLR - Program: R KN - Kommunikation und Navigation
DLR - Research theme (Project): R - Vorhaben GNSS2/Neue Dienste und Produkte
Location: Oberpfaffenhofen
Institutes and Institutions: Institute of Communication and Navigation > Communications Systems
Deposited By: Mostafa, Mohamad
Deposited On: 07 Oct 2014 10:26
Last Modified: 13 Mar 2015 09:26

