A gentle introduction to algorithms and Big O

Algorithms are an integral part of computer science, and in fact of our day-to-day lives. We use algorithms all the time without thinking about it; in a way, our brains are wired to figure out solutions to problems. As computer programmers and application developers, we need to know how to translate these solutions into code, and ideally into effective and efficient code.

In this tutorial series I am going to break a bit from my normal C# Google tutorials. Over the last several months I have seen a lot of people talk about algorithm performance and Big O. This is something I learned a very long time ago in college. After more than 20 years, a lot of this has become instinct to me rather than something I do with direct intent. When designing an algorithm it is best to know exactly what you are doing and why you are doing it, but for someone who has been an application developer for more than twenty years, some things become more instinctive and less purposeful. I have always enjoyed writing tutorials, so I thought I would write a tutorial series on this in order to get it straight in my own head. I intend to put my normal spin on the series: it will be as simple as possible, and the examples will just work.


So what is an algorithm?

algorithm ˈalɡərɪð(ə)m/ noun
noun: algorithm; plural noun: algorithms
a process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer.

An algorithm is just a set of instructions needed to solve a problem.

A real-world example

Imagine I place a book on the desk in front of you. Now I would like you to find page 42. How would you begin? One way to start would be to open the book and flip through the pages one at a time. This is called a linear search.
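The page-flipping idea above can be sketched in a few lines of code. My tutorials are normally in C#, but the idea is language-agnostic, so here is a minimal Python sketch; the "book" is just a list of page numbers, which is an assumption for illustration.

```python
def linear_search(pages, target):
    """Flip through the pages one at a time; return how many we checked."""
    for checks, page in enumerate(pages, start=1):
        if page == target:
            return checks
    return -1  # the page is not in the book

book = list(range(1, 601))       # a 600-page book
print(linear_search(book, 42))   # page 42 is found on the 42nd check
```

Notice that the search simply stops as soon as it finds the page; how long that takes depends entirely on where the page happens to be.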

When we work with algorithms in computer programming, we need to be mindful of a few things:

  • How much data is there?
  • Is the data already sorted?
  • How much work do we need to do to solve this problem?
  • How much power does the machine we will be performing this on have?

Why does the amount of data matter?

Let's go back to our book example. Let's say there are 600 pages in your book and I want you to find page number 600. If we stick with our linear search, we are going to have to flip through 599 pages before we find the page we are looking for. This is called an edge case, or the worst case. Now, if we are looking for page one, we will find it on the first page we check; this is also an edge case, called the best case. So in the worst case we will have to look through all 600 pages in our book to find the right page, and in the best case it will be the first page.
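Counting the checks makes the best and worst case concrete. This is a hedged Python sketch of the same linear scan, again modelling the book as a simple list of page numbers:

```python
def checks_to_find(pages, target):
    """Scan the pages in order and count how many we look at."""
    for checks, page in enumerate(pages, start=1):
        if page == target:
            return checks
    return -1

book = list(range(1, 601))
print(checks_to_find(book, 600))  # worst case: page 600 is the 600th check
print(checks_to_find(book, 1))    # best case: page 1 is the very first check
```

The same algorithm costs 600 checks in one case and a single check in the other; nothing about the code changed, only where the data happened to be.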


Why does it matter if the data is sorted?

Back to our book again; it still has 600 pages. What would happen if all the pages had fallen out of the book and they were no longer in order? Now find page ten. Can you imagine how much time it would take now? Assuming we are smart enough to set aside the pages we have already checked, in the worst case we are going to have to check 599 pages before we find page 10.
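We can model the pages falling out of the book by shuffling the list before scanning it. This is another illustrative Python sketch; the fixed random seed is only there so the example is repeatable, and is an assumption of mine, not part of the story:

```python
import random

def checks_to_find(pages, target):
    """Scan the loose pages in whatever order they are in."""
    for checks, page in enumerate(pages, start=1):
        if page == target:
            return checks
    return -1

random.seed(1)
loose_pages = list(range(1, 601))
random.shuffle(loose_pages)   # the pages are no longer in order
print(checks_to_find(loose_pages, 10))
```

Page 10 could now be anywhere, so the number printed could be anything from 1 (best case) to 600 (worst case); being sorted is what made page 10 cheap to find before.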

That is an interesting concept with algorithms: best case and worst case. The best case above would be if you picked up page 10 on the first try; the worst case would be if you found it last. I always try to think about what the best and worst case would be for my algorithms.

Amount of data and the size of the machine.

If we knew for a fact that there would only ever be 50 pages in our book, it wouldn't matter much if you had to go through all 50 pages to find the one you were looking for. However, if our book could potentially have 10,000,000 pages and we needed to find the last one, the power of the machine we will be running on begins to matter a great deal. A faster machine can run through the pages a lot faster than a slower one.


Before you even begin to code a new algorithm, you must first consider a few things about the data itself. Determine what your best and worst case will be, look at the amount of data you have, and consider the specs of the machine that will be running the algorithm.

Join me in the next tutorial, Algorithms Constant Time O(1).

About Linda Lawton

My name is Linda Lawton. I have more than 20 years of experience working as an application developer and a database expert. I have also been working with Google APIs since 2012, and I have been contributing to the Google .Net client library since 2013. In 2013 I became a Google Developer Expert for Google Analytics.
