# Linear separability

Linear separability is an important concept in neural networks. The idea is to check whether you can separate the points of two classes in an n-dimensional space using an (n-1)-dimensional hyperplane. Confused? Here's a simpler explanation. One Dimension: Let's say you're on a number line. You take any two numbers. Now, there are two possibilities: You choose two […]
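In one dimension the "hyperplane" is just a single point (a threshold), so two sets of numbers are linearly separable exactly when their ranges don't overlap. A minimal sketch of that check (function name is illustrative, not from the post):

```python
def separable_1d(class_a, class_b):
    """Check whether two sets of points on a number line can be
    split by a single threshold point (a 0-dimensional 'hyperplane').
    This is true exactly when one set lies entirely to the left
    of the other."""
    return max(class_a) < min(class_b) or max(class_b) < min(class_a)

print(separable_1d([1, 2, 3], [5, 6, 7]))  # True: a threshold like 4 works
print(separable_1d([1, 4, 6], [3, 5, 7]))  # False: the ranges interleave
```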

# 7 unique neural network architectures

The way you interconnect neurons plays an extremely important role. In fact, researchers are constantly looking for new architectures that suit different purposes. I'll talk about a few architectures here. The dichotomizer: You've already seen this one. It consists of a single neuron, and can classify inputs into two different sets. Nothing fancy […]

# Dichotomizer examples: AND/OR gates

Now that you know how a dichotomizer works, we'll work through some more examples. I'll leave the mathematical calculations to you. Make sure you do them! AND gate: We'll create a 2-input AND gate. Since there are two inputs, we can represent them on a 2-dimensional Cartesian plane. Something like this: The […]
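The post's worked calculations aren't reproduced here, but the idea can be sketched with a single threshold neuron. The weights and thresholds below are one workable choice, not necessarily the post's values:

```python
def threshold_neuron(x1, x2, w1, w2, threshold):
    """A single neuron: fire (output 1) when the weighted sum of
    the two binary inputs reaches the threshold."""
    return 1 if w1 * x1 + w2 * x2 >= threshold else 0

def and_gate(x1, x2):
    # Both inputs must be 1 for the sum to reach 2.
    return threshold_neuron(x1, x2, 1, 1, threshold=2)

def or_gate(x1, x2):
    # A single active input is enough to reach 1.
    return threshold_neuron(x1, x2, 1, 1, threshold=1)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "AND:", and_gate(x1, x2), "OR:", or_gate(x1, x2))
```

Geometrically, each threshold corresponds to a line on the Cartesian plane separating the points that fire from the points that don't.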

# A single-neuron dichotomizer

Okay, so we’ll get into actually using a neural network right now! The idea is to create what is called a “dichotomizer”. A dichotomizer can differentiate between two “classes” of things. For example, checking whether something is “human” or “not human”. Or checking whether a book is “printed” or “handwritten”. The examples I […]
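A dichotomizer boils down to asking which side of a linear boundary a point falls on. A minimal sketch, with a made-up boundary (the weights and bias below are illustrative, not taken from the post):

```python
def dichotomize(point, weights, bias):
    """Assign a point to class 1 or class 0 depending on which side
    of the hyperplane w . x + b = 0 it falls."""
    s = sum(w * x for w, x in zip(weights, point)) + bias
    return 1 if s >= 0 else 0

# Hypothetical boundary x + y = 3 in the plane:
print(dichotomize([2, 2], [1, 1], -3))  # 1: 2 + 2 - 3 >= 0
print(dichotomize([0, 1], [1, 1], -3))  # 0: 0 + 1 - 3 < 0
```

Labels like "human" / "not human" are just names attached to the two output values.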

# The modern artificial neuron

Modern artificial neurons are derived from the McCulloch-Pitts model, but with some changes. These changes make it relatively easy to build physical neural networks (using resistors, op-amps, etc.). They also make the neurons much more versatile, able to adapt to a wider range of applications. The changes: I assume you've gone through the post […]
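The post's list of changes is truncated, but one widely used modern formulation is: real-valued inputs, learnable weights and a bias, and a smooth activation function in place of a hard threshold. A sketch assuming a sigmoid activation (the specific numbers are illustrative):

```python
import math

def neuron(inputs, weights, bias):
    """Modern artificial neuron: weighted sum of real-valued inputs
    plus a bias, passed through a smooth activation (sigmoid here)."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid squashes output into (0, 1)

out = neuron([0.5, -1.2], [0.8, 0.3], 0.1)
print(out)  # a value strictly between 0 and 1
```

The smooth output is what makes these neurons easy to realize with analog components and, later, to train with gradient-based methods.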

# First artificial neurons: The McCulloch-Pitts model

The McCulloch-Pitts model was an extremely simple artificial neuron. Each input could be either a zero or a one, and so could the output. In addition, each input could be either excitatory or inhibitory. The whole point was to sum the inputs: if an input is one, and is excitatory in […]
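The excerpt cuts off mid-rule, but one common formulation of the model is: any active inhibitory input suppresses the neuron entirely; otherwise the neuron fires when the sum of active excitatory inputs reaches a fixed threshold. A sketch of that formulation:

```python
def mcculloch_pitts(excitatory, inhibitory, threshold):
    """McCulloch-Pitts neuron: binary inputs, binary output.
    Any active inhibitory input forces the output to 0; otherwise
    the neuron fires when the excitatory sum reaches the threshold."""
    if any(inhibitory):
        return 0  # absolute inhibition
    return 1 if sum(excitatory) >= threshold else 0

print(mcculloch_pitts([1, 1], [0], threshold=2))  # 1: fires
print(mcculloch_pitts([1, 1], [1], threshold=2))  # 0: inhibited
print(mcculloch_pitts([1, 0], [0], threshold=2))  # 0: below threshold
```

Note there are no weights here; that is one of the limitations later models address.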