Need for an Artificial Brain

The human brain is the central organ of our nervous system, connected to the spinal cord. It controls almost all of our body's activities, processes enormous amounts of information, learns, and corrects itself from its mistakes. It is made up of roughly 100 billion neurons interconnected into an immensely complex network, capable of learning complex tasks just by looking and listening. Wouldn't it be wonderful if we could build an artificial form of the human brain, with layers of neurons stacked and interconnected, and feed it data to learn and solve complex problems? With today's approaches we can, but building such a complex network consumes a great deal of memory and computing power. Even with a supercomputer, we are not yet able to recreate the brain of a bee, which has close to 1 million neurons. That is something we will have to wait on research for; with more powerful computing technologies such as quantum computers and far more optimized algorithms, developing a replica of the human brain may eventually become possible.

Now the question arises: if we can at least design and architect a neural network with a bunch of artificial neurons, within the limits of our computing power, what can we achieve with it? And how do we achieve it?

What can we achieve with this?

  • Computer vision problems for image recognition, such as face recognition and object detection in images
  • Speech and speaker recognition
  • Natural language processing
  • Signal processing
  • Medical diagnosis
  • Prescriptive modeling

In the upcoming blogs, I will take up these topics in detail.

How are we able to achieve it?

To understand a neural network better, let's look at its basic building block: the neuron. Below is a diagram of a biological neuron.

Image courtesy of Wikipedia

A short explanation from Wikipedia: "A neuron, also known as a neurone (British spelling) and nerve cell, is an electrically excitable cell that receives, processes, and transmits information through electrical and chemical signals. These signals between neurons occur via specialized connections called synapses."

In simple terms, it looks something like this:


The dendrites of the biological neuron become the inputs; the nucleus becomes the transfer function, which takes the inputs X and performs some mathematical computation on them; and the axon becomes the output Y, which connects to the dendrites of another neuron. This is a simplified representation, but if you look at the dendrites you will notice multiple terminals, so a neuron is capable of taking multiple inputs, as we see in the diagram:
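To make this concrete, here is a minimal sketch of a single artificial neuron in Python. The inputs, weights, and bias values are made up for illustration, and the sigmoid is just one common choice of transfer function; this is not the only way to model a neuron.

```python
import numpy as np

def neuron(x, w, b):
    """A single artificial neuron: a weighted sum of the inputs plus a bias,
    passed through a transfer (activation) function -- here a sigmoid."""
    z = np.dot(w, x) + b             # the "nucleus": mathematical computation on the inputs X
    return 1.0 / (1.0 + np.exp(-z))  # the "axon": output Y, passed on to the next neuron

# Three "dendrite" inputs with illustrative weights and bias (hypothetical values)
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.8, 0.1, -0.4])
b = 0.2
print(neuron(x, w, b))  # a single number Y between 0 and 1
```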

And if we stack these neurons, we create a dense network, also called a multi-layer perceptron, which is the foundation of what is now known as deep learning. It consists of a stack of interconnected neurons with one input layer, one or more hidden layers, and one output layer.
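Building on the single-neuron sketch above, the snippet below stacks neurons into a tiny multi-layer perceptron with one input layer, one hidden layer, and one output layer. The layer sizes and random weights are purely illustrative assumptions, not a trained network.

```python
import numpy as np

def layer(x, W, b):
    """One dense layer: every neuron receives all outputs of the previous layer."""
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))  # same sigmoid transfer function as before

rng = np.random.default_rng(0)

# Illustrative sizes: 3 inputs -> 4 hidden neurons -> 1 output neuron
W_hidden, b_hidden = rng.normal(size=(4, 3)), np.zeros(4)
W_out,    b_out    = rng.normal(size=(1, 4)), np.zeros(1)

x = np.array([0.5, -1.2, 3.0])     # input layer
h = layer(x, W_hidden, b_hidden)   # hidden layer
y = layer(h, W_out, b_out)         # output layer
print(y)
```

A real network would of course learn the weights from data instead of drawing them at random; that training step is what the transfer-function discussion in the next post builds toward.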

In my next blog, I will go deep into the transfer function and try to simulate an example visually and with code.
