Algorithms and Data Structures (Part 1)

In this post we will be talking about how important the choice of data structures and algorithms is to the overall performance of any software.

With highly transactional software, performance optimisation is an integral part of the development process. These optimisations come in many different forms: using certain design patterns, improving your network configuration, and even the choice of algorithms and data structures used in your code. Any bottleneck in the transaction flow can cause delays, and these can lead to further issues.

Algorithms (Time complexity)

In this article we will talk about an algorithm's time complexity, which is measured with Big O notation. We will cover space complexity alongside data structures in a future article.

So… in the simplest terms, Big O notation is used to classify algorithms according to how their running time or space requirements grow as the input size grows. For example:

A simple algorithm (like the one shown in the image below) has a complexity of O(N), which is linear time, meaning the processing time grows with the argument N. If N is 1 it takes one iteration, but we always consider the worst-case scenario: if, for example, N = 1000000000, it takes 1000000000 iterations, so the complexity is O(N):

1.png
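The screenshot above isn't reproduced here, but the idea it illustrates can be sketched as follows. This is a Java sketch (the original screenshots appear to use C#), and the method name `sumUpTo` is my own illustrative choice, not taken from the post:

```java
// Linear-time example: the loop body runs exactly N times, so the
// running time grows in direct proportion to N, i.e. O(N).
public class LinearExample {
    static long sumUpTo(long n) {
        long total = 0;
        for (long i = 1; i <= n; i++) { // one iteration per value of i
            total += i;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumUpTo(10)); // prints 55
    }
}
```

Doubling N doubles the number of iterations, which is exactly what "linear" means here.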

To get all possible ordered pairs, the example below has a complexity of O(N^2). You can see why: we are looping over N twice, while the (Console.WriteLine) call inside the loop is considered a constant and is not counted.
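A sketch of that nested loop, again in Java rather than the C# of the original screenshots, with a counter added so the O(N^2) growth is visible (the `countOrderedPairs` name is my own illustrative choice):

```java
// Quadratic-time example: two nested loops over N visit all N*N
// ordered pairs, so the work grows as O(N^2). The print statement
// inside the loop is constant-time work per pair and does not change
// the complexity class.
public class PairsExample {
    static int countOrderedPairs(int n) {
        int pairs = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                System.out.println("(" + i + ", " + j + ")"); // constant-time per pair
                pairs++;
            }
        }
        return pairs;
    }

    public static void main(String[] args) {
        System.out.println(countOrderedPairs(4)); // prints 16 pairs, then 16
    }
}
```

Going from N = 10 to N = 100 multiplies the work by 100, not by 10, which is why quadratic algorithms degrade so quickly on large inputs.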

Let's now take this a step further and show how the complexity of an algorithm can make or break a system, and how we can use memoisation, a dynamic-programming technique, to improve it.

Fibonacci sequence optimisation

Next, we will show how two different implementations of an algorithm to get the nth value of the Fibonacci sequence can perform completely differently; the difference is like night and day. You can see that the implementation below is a recursive algorithm, and its complexity is O(2^n), which is exponential.

Algorithm 1

3.png
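The recursive implementation in the screenshot can be sketched like this (a Java rendering of the idea, not the author's exact C# code), with basic timing added to match the measurement described below:

```java
// Naive recursive Fibonacci: each call spawns two more calls, so the
// call tree roughly doubles at every level -- exponential O(2^n) time.
public class NaiveFib {
    static long fib(int n) {
        if (n <= 1) return n;           // base cases: fib(0) = 0, fib(1) = 1
        return fib(n - 1) + fib(n - 2); // the same subproblems are recomputed many times
    }

    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        System.out.println(fib(40)); // prints 102334155, noticeably slowly
        System.out.println("took " + (System.currentTimeMillis() - start) + " ms");
    }
}
```

The waste is easy to see: `fib(40)` calls `fib(38)` twice, `fib(37)` three times, and so on, recomputing the same values an exponential number of times.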

With some very basic timestamps we can see that for the value of 40 it takes about 5 seconds to get the result back. That is a very long time, and if you were to keep increasing the value it would reach a point where it takes a day to return a result, if it didn't throw a stack overflow exception first.

4.png

To tackle this problem, we need to improve the time, which we can do with a simple change.

As you can see below, we created a dictionary to short-circuit the recursion: we check whether the value is already available and, if it is, return it immediately. This is called memoisation, and it brings the time complexity down to O(N).

Algorithm 2

5.png
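The memoised version in the screenshot can be sketched as follows, again as a Java rendering of the C# shown in the post, using a HashMap as the dictionary:

```java
import java.util.HashMap;
import java.util.Map;

// Memoised Fibonacci: a dictionary caches each fib(n) the first time it
// is computed, so every value is calculated exactly once -- O(N) time.
public class MemoFib {
    private static final Map<Integer, Long> cache = new HashMap<>();

    static long fib(int n) {
        if (n <= 1) return n;               // base cases: fib(0) = 0, fib(1) = 1
        Long cached = cache.get(n);
        if (cached != null) return cached;  // short-circuit the recursion
        long result = fib(n - 1) + fib(n - 2);
        cache.put(n, result);               // remember the answer for next time
        return result;
    }

    public static void main(String[] args) {
        System.out.println(fib(40)); // prints 102334155, almost instantly
    }
}
```

The change is small, but the effect is dramatic: each subproblem is solved once and then looked up in constant time, so `fib(40)` goes from billions of calls to a few dozen.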

And if we run the program again, the result comes back almost instantly:

6.png

There are different ways of implementing any algorithm, but if complexity is considered, calculated and measured, it can significantly improve the overall performance of a highly transactional system.

In part 2 we will talk about choosing data structures, as well as the space complexity of algorithms.

Useful Links:

https://www.geeksforgeeks.org/fundamentals-of-algorithms/#AnalysisofAlgorithms