
Artificial Neural Network Implementation in Microchip PIC 18F45J10 8-Bit Microcontroller
Jnana Ranjan Tripathy1, Hrudaya Kumar Tripathy2, S.S.Nayak3
1Er. Jnana Ranjan Tripathy, Department of Computer Science & Engineering, Biju Pattnaik University of Technology, Orissa Engineering College Bhubaneswar, Odisha, India.
2Dr. Hrudaya Kumar Tripathy, Department of Computer Science & Engineering, KIIT University, Bhubaneswar, Odisha, India.
3Dr. S.S. Nayak, Centurion University of Technology & Management, Paralakhemundi, Odisha, India.
Manuscript received on May 20, 2014. | Revised Manuscript received on June 13, 2014. | Manuscript published on June 30, 2014. | PP: 131-135 | Volume-3, Issue-5, June 2014. | Retrieval Number: E3123063514/2013©BEIESP

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Implementing neural networks on an 8-bit microcontroller with limited computing power presents several programming challenges. To make the network run as quickly as possible, the software was written at the assembly level, which allows a degree of customization that cannot be achieved in C. However, hardware portability was also a motivating factor, so a more generic C implementation was created as well. Careful manual management of the very limited data memory was equally important, and several assembly routines were written for this purpose. A pseudo floating-point arithmetic protocol was developed specifically for neural network calculations, together with a multiplication routine for handling large operands, and a tanh-compatible activation function was also required. The resulting procedure can implement any neural network architecture on a single operating platform.
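The abstract does not reproduce the pseudo floating-point format or the assembly routines themselves, so the sketch below is only a rough illustration in portable C of the kind of arithmetic involved: it assumes an ordinary Q8.8 fixed-point representation (an assumption, not the paper's format), a wide-multiply helper that stands in for the large-number multiplication routine, a Padé-style tanh-compatible activation, and a single neuron evaluation.

```c
#include <stdint.h>

/* Q8.8 fixed point: 1.0 == 256. Illustrative stand-in only; the paper's
 * own "pseudo floating point" protocol is not specified in the abstract. */
typedef int16_t q8_8_t;

#define Q8_8_ONE 256
#define Q8_8(x)  ((q8_8_t)((x) * 256))

/* Multiply two Q8.8 numbers via a 32-bit intermediate, mirroring the idea
 * of a wide-multiply routine on an 8-bit core (saturation omitted here). */
static q8_8_t q8_8_mul(q8_8_t a, q8_8_t b)
{
    return (q8_8_t)(((int32_t)a * (int32_t)b) >> 8);
}

/* tanh-compatible activation using the Pade approximant
 * tanh(x) ~= x*(27 + x^2) / (27 + 9*x^2), clamped for |x| > 3. */
static q8_8_t q8_8_tanh(q8_8_t x)
{
    if (x >  Q8_8(3.0)) return  Q8_8_ONE;
    if (x < -Q8_8(3.0)) return -Q8_8_ONE;

    int32_t x2  = ((int32_t)x * x) >> 8;                  /* x^2 in Q8.8 */
    int32_t num = ((int32_t)x * (27 * Q8_8_ONE + x2)) >> 8;
    int32_t den = 27 * Q8_8_ONE + 9 * x2;
    return (q8_8_t)((num << 8) / den);                    /* back to Q8.8 */
}

/* One neuron: weighted sum of inputs followed by the activation. */
static q8_8_t neuron(const q8_8_t *w, const q8_8_t *in, uint8_t n, q8_8_t bias)
{
    int32_t acc = bias;
    for (uint8_t i = 0; i < n; i++)
        acc += q8_8_mul(w[i], in[i]);

    /* Clamp the accumulator into the activation's useful range
     * before narrowing, to avoid 16-bit wrap-around. */
    if (acc >  Q8_8(3.0)) acc =  Q8_8(3.0);
    if (acc < -Q8_8(3.0)) acc = -Q8_8(3.0);
    return q8_8_tanh((q8_8_t)acc);
}
```

A fixed-point scheme like this trades the dynamic range of the paper's pseudo floating-point protocol for simpler, faster multiplies on a core without a hardware FPU; the actual implementation described in the paper keeps these routines in assembly for speed, with the C version serving portability.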
Keywords: Neural Architecture (NA), Microcontroller, Embedded C, Pseudo Floating Point, Activation Function.