APIs, Re-platforming, 3rd-Party Integrations and Postman

In this modern, ever-connected and ever-integrated digital world, wouldn’t it be great if there was a tidy, cohesive and easy-to-understand way of getting systems to work with each other?

Since the early days of computing in the 1940s, there have been ways of defining how a programmer should integrate with a module of code, thereby providing a tidy, cohesive and easy-to-understand way of working with a system. At the time these were called Application Programmer Interfaces, but the name has, over time, come to mean Application Programming Interfaces, or APIs.

The wheel: re-invented, but for good reasons…

Domino’s UK & ROI (DPG) have had a highly successful, robust, flexible and relatively well-understood eComm system for some time. It has worked well, it has had many changes made to it over the years and, whilst it may not be all that shiny and (from a technological perspective only) sexy, it has served Domino’s well and enabled some really important, valuable and impressive features to be added for the benefit of our customers.

That being said, all good things must come to an end.

Now, it is at this point that one could berate the legacy systems for being clunky, slow, technological antiques and many other things… but that would be neither fair nor appropriate.

What would be better is to learn from the legacy system to ensure its successor is better, and better in appropriate ways; not just using cool tech because it is what “the cool kids” are using.

At the start of 2021 a major program of work began to replace the bulk of the legacy eComm systems with something fresh, built with new tools and technologies, to allow new requirements to be added more quickly, safely and reliably, and to make the technology accessible to anyone (the definition of “anyone” here is not necessarily what you are thinking: it is anyone who needs access from outside the codebase itself, so mostly internal systems rather than the world at large).

There’s a well-known (and well-used) description of this kind of work, one that has been muttered by many companies around the world undertaking a major platform transformation project…

Replacing the wings of the plane whilst it’s still in flight

Where we were, where we’re going to be

A blog post about technology, transformation and re-platforming just wouldn’t be right without a few diagrams and pictures - and this one is no different.

Where we were

The diagram below represents a very simplified view of the eComm estate before the transformation project started.

If you’re familiar with this kind of architecture, it’ll be nothing new: a monolithic system designed and evolved over many years. As with all such designs, it has limits, and there comes a point where the cost (in time and effort) of adding new capabilities versus maintaining the code makes it unviable.

This challenge is not specific to Domino’s; it’s the same problem that many, many companies face when working with a legacy monolithic system.

Along comes the new

From a high-level architectural perspective, it is nothing more than introducing a scalable microservice architecture: using the power of the cloud, BFFs (backends-for-frontends), containers, container orchestration, service mesh, eventing and modern coding techniques (among many others).

From a Domino's perspective however this is huge! This transformation will allow a great deal of future innovation to be possible.

Let’s talk…

A new, microservice-driven platform needs to work in a particular way; there’s no value in all of the code living within its respective service boundary if there’s no way for any other system to understand how to interact with it, and this is where APIs come into the picture.

Each microservice is built up of one or more “endpoints”: an endpoint is nothing more than a place in the microservice that can be talked to. (Clearly “talked to” is the wrong language for system-to-system interaction, but it gets the message across.)

Consider the following example:

  • A customer visits the Domino’s website and wants to find their nearest store to see when it opens.

  • The microservice would be “store”, as it pertains to activities and actions related to a store

  • The endpoint would be “store search” as it allows the customer to talk to it in order to answer a question.
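To make this example concrete, here’s a minimal sketch of what such an endpoint could look like in ASP.NET Core (the route, parameters and response shape are illustrative assumptions, not the actual Domino’s API):

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Hypothetical "store search" endpoint: find a store near a postcode.
app.MapGet("/stores/search", (string postcode) =>
{
    // In reality this would query the store service's own data source.
    var store = new StoreSearchResult("Example Store", "10:00", "23:00");
    return Results.Ok(store);
});

app.Run();

// The response contract: what callers can expect to get back.
record StoreSearchResult(string Name, string OpensAt, string ClosesAt);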

Having the endpoint is great (essential, really), but there needs to be a consistent, common and easy-to-understand suite of documentation to support it; otherwise, how would anyone ever know how to talk to it?

Walking the new path with a degree of swagger

Every developer’s least favourite task is authoring or maintaining documentation; it’s often “no fun” and it gets in the way of creating beautifully written and well-tested code.

There are two important words in the previous sentence that need to be clarified and expanded on, and they are highly relevant to this post too…

“written” and “tested”

How do the developers know what to write? How do they know what to test? How does the test team know what their test cases should execute to ensure that the new code “does what it is supposed to do”?

Documentation!

Before code is written, there needs to be an understanding of what the new endpoint accepts and what it returns: these are the contracts for the API.

With this simple set of documentation the whole development process can commence: developers can write their code and testers can write their tests, with a common and consistent understanding of what to expect.

There’s a bonus here too - self-generating documentation!

Okay, so it’s not truly self-generating; the code has to be written first so the documentation can be generated from it.

Standard API specifications exist, such as the OpenAPI specification, and the tooling built around them can interrogate the code (at build time) and present standard documentation that can be used to “understand” how to talk to the API.
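In the .NET world, for example, this is commonly done with the Swashbuckle.AspNetCore package (an assumption here; any OpenAPI generator works along the same lines). A minimal sketch:

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddEndpointsApiExplorer(); // discovers the API endpoints
builder.Services.AddSwaggerGen();           // generates the OpenAPI document

var app = builder.Build();

// Expose the generated spec and interactive docs outside production only.
if (!app.Environment.IsProduction())
{
    app.UseSwagger();    // serves the raw OpenAPI JSON
    app.UseSwaggerUI();  // serves a human-friendly documentation page
}

app.Run();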

It’s all well and good having this documentation - indeed, it is invaluable - but there is a downside; one that could lead to problems.

Suppose your well-written, full documentation is available on the public internet. This could lead to major data problems, with potential bad actors exploiting the APIs; obviously not a position any company wants to be in. Because of this, more often than not the complete documentation is disabled in production and only accessible to internal systems running pre-release code.

Great! So close!

All is not lost - there just needs to be another set of documentation, one that can be shared publicly.

What, more documentation?

Yep, that’s right: another set of documentation is needed, but this one needs to be crafted for a different type of audience - an audience that could be anyone, not just people from the development sphere.

This documentation needs to have textual descriptions, samples and generated sample code; it needs to be easily shared, executable in isolation, secure and up to date.

Making sure it’s baked to perfection

To address the wider documentation needs and associated points defined above, Domino’s looked at what tools were already out there to help; it didn’t take much looking to find the ideal candidate. Not only is this tool well known and highly regarded, but it’s already in use (to some degree) within the Domino’s development teams.

A letter, delivered by a specialist

The tool of choice was Postman (take a look at their website for full details of what it can do, as it’s extensive). It met all of the immediate needs of the program, offered some easy wins and would allow the teams to become better organised and more efficient in their delivery.

Getting the ingredients together, the right way and getting them cooking

An initial proof of concept was completed to ensure that not only was it the right tool, but that it could work in a way that is right for Domino’s (it’s all well and good picking the best tool out there, but it has to work with the internal teams and processes rather than against them). This was a resounding success and demonstrated not only how simple the tool is to use, but also how it can form a key part of the SDLC and deliver quality improvements across the whole re-platforming effort.

Some of the obvious and immediate benefits included:

  • Defining API endpoints in Postman, with human-readable documentation and sample requests and responses

  • Immediate publishing of externally visible documentation for 3rd parties to consume

  • Test cases and test executions on the APIs deemed externally visible

  • Compliance analysis automatically executed on each change

  • A big win on collaboration: every member of the development, quality and architecture teams had immediate access to the same suite of Postman specs, without needing to share anything over email/Teams/Slack etc.

Letting others come to the party

The newly created APIs, the internal documentation and the externally visible documentation have allowed Domino’s to complete, with comparative ease and swiftness, an ambitious integration with a very well-known online food ordering service: Just Eat. The elegance and completeness of the external Postman documentation removed so many traditional pain points: sharing PDF documentation and sample request and response payloads for myriad use-cases and, whenever a change was needed, re-sending them all to many people whilst trying to ensure everyone had the same latest version.

The integration was so successful that the exact same process is being used to bring on another very well-known food ordering service, Uber Eats, and the Postman documentation is proving invaluable for this engagement too.

The new APIs, the public and private documentation, the constant API-level testing and the enhanced collaboration are huge wins for Domino’s, leading the way for future innovation within the company and with other 3rd parties.

Putting it in the box and sending it to our fantastic customers

It’s an exciting time to be working at, with and for Domino’s; the re-platforming, the engagement with 3rd parties and the technological opportunities made available as a result of these are going to be compelling - for the teams working here, for the franchisees and - of course - for our fantastic customers.

For the time being then, this (rather long) post can come to an end but the journey is only just beginning.

See you on the API side!

Peace!

From Curiosity to Solutions Architect: My Journey (so far) at Domino's


Wisdom is not a product of schooling but of the lifelong attempt to acquire it.
— Albert Einstein

I've always been fascinated by why things are the way they are, and in turn, how everything works. My interest began with something seemingly mundane: the weather. As a child, I would become enthralled by thunderstorms, these magnificent but terrifying processes that occurred on a scale far larger than I could fully comprehend.

I'd often feel the pull of intrigue as a distant storm rolled towards me, its booming thunder heralding its arrival. Whilst others fled to find shelter, hoping to avoid getting wet, I would instead seek the nearest open space to provide me with the best seat possible: at its heart.

Experiencing these events sparked something in me, an intense desire to understand, to find the answers to how the storm swirling around me was capable of even existing.

My curiosity continued, expanding with a thirst to comprehend other weather events, and in particular, extreme weather. Tornadoes, hurricanes, volcanoes, earthquakes, and monsoons, all treasure troves waiting to be understood.

Unsurprisingly, this quest to attain a greater understanding of the world around me led straight to Physics, and in particular, computers and space. To me, space was a natural continuation of my obsession with extreme weather; the universe is filled with wondrous celestial bodies and processes that we can only observe from a distance (a mind-bogglingly great distance!). Stars, black holes, red dwarfs, gamma-ray bursts, and supernovae, all spectacular and mysterious occurrences waiting for their secrets to be unlocked to enhance our understanding of the universe.

Whilst space energised my imagination, computers, on the other hand, were something much more tangible. I'll always remember the first time my older brother sat down alongside me to help me write my first line of code on an Amiga 500: making the screen change colour. Of course, it's incredibly simple by today's standards, but that excitement I felt as the screen cycled through colours was something very special indeed. I not only understood something, but I could create something using that newfound comprehension.

A side note: I highly recommend watching The Royal Institution Christmas Lectures that take place every year after Christmas. Whilst they're aimed at children, they're delivered in a way that remains engaging for children and adults alike. They cover all sorts of topics and have been held since 1825, only stopping for World War II!

CHRISTMAS LECTURES | Royal Institution (rigb.org)

Interestingly, I did not go on to study Computing at university. For whatever reason, the open days I attended never resonated with my instinctual need for understanding. This led me to seek it elsewhere, and naturally, my thoughts returned to Physics. After attending a number of demonstrations, I settled on my degree: Physics with Cosmology.

Fast-forward to the end of my degree, and I was keen to get out into the world and apply the skills I'd developed during my studies. However, I yearned for something tangible to work on, something I could learn and use to help create something new. Of course, my thoughts drifted back to computers, specifically software development.

Alongside my degree, I'd continued to learn about programming in my own time, playing around with various technologies that piqued my interest. However, when it came to applying for jobs, I hit a barrier. In every interview I attended, the same question was asked:

"Please give examples of work experience in software development outside of education and study". I felt like a joke had been played. In the fortunate position I was in, I had not been forced to split my time in order to pay for my studies. Yes, I'd had a couple of jobs growing up, but nothing relevant to software development. I'd dedicated years of study, and yet at that moment, all of it felt useless.

Knowledge, wisdom and understanding were not things I could acquire through study alone.


This is Just the Beginning
— Count Dooku

Whilst I wasn't able to attain a software development role directly, I was able to get my foot in the door by becoming a Business Systems Analyst at a small, start-up-like company and taking every opportunity possible to apply my self-acquired software development skills to the role.

Four years later, I sat across a table from a Senior Developer and a Development Lead. Intensely nervous, I shifted in my seat as they began assessing my capabilities to determine whether I was a suitable fit for their company.

Undoubtedly, any minute they were going to discover I was underqualified for the role and that I was a fool for even believing I might be up to the challenge in the first place. Up to this point, my coding knowledge had been crafted through self-learning only, with a few opportunities to apply it in my previous job. However, unlike before, those times I'd applied my skills gave me examples to demonstrate my knowledge and understanding.

Two and a half hours into the interview, now sitting with the Head of Development across the table, the question came that always causes a moment of doubt: Give an example of when you've made a mistake. In this moment, all those feelings of self-doubt come to the fore: See? They're about to see how I'm not fit for the role, that I'm not good enough.

Taking a deep metaphorical breath, I began explaining how shortly after starting at my previous company, I'd pushed something into production only for it to bring the website down the moment it deployed. Naturally, being just out of university, my stress levels skyrocketed; who wants to be the one to break everything? To top it off, this was a manual deployment via a CMS (we had no CI/CD!). The built-in rollback didn't work, meaning the website remained offline with no means for me to fix it.

Of course, my boss was in a meeting with the CEO at that moment, meaning I had to make an embarrassing interruption to inform him of what had happened. He quickly resolved the issue by manually updating the files; the problem had been caused by a lack of permissions.

Taking me into a side room, my boss proceeded to dress me down. Initially, I was a little confused; he'd told me to release this morning... hadn't he? And there it was, at that moment, as he explained he hadn't meant for me to trigger the release, I realised the most crucial thing: communication, and how easily it can go awry.


Many of the truths that we cling to depend on our point of view.
— Yoda

Fortunately, I hadn't sabotaged my chances of getting the Software Developer role at Domino's, and despite my self-doubt arguing otherwise, they thought I was the right choice for the job.

Elation and relief are often the first feelings to surface when receiving good news, but self-doubt can quickly reassert itself, sapping any much-needed confidence before you even begin. And guess what? Confidence is a critical component of communicating effectively.

Expressing your ideas is a vital part of any role, but how can you expect someone to understand and subsequently engage with your ideas if you portray little confidence in them yourself? The interesting thing here is that it's less about how you feel inside and more about how people perceive you from the outside.

I was having a conversation with a colleague some time ago, and the subject of confidence came up in the context of them giving a presentation. In response and hopeful support, I expressed how I can get incredibly anxious when presenting or talking, sometimes to the point that my heart is beating so fast I can hear it in my ears, and I struggle to breathe normally. What did they say in response?

"You? But you've always been so confident!". This hit me like a ton of bricks; how did this person get that impression? That's not me; I'm not that confident.

Of course, everyone has their own experience of the world around them, with each being unique. If people all have their distinctive perceptions, then it follows that everyone you meet has their very own version of you. The logical question that follows is, which version is the real me? Well, both.

It might sound strange initially, but no one can know your mind. It follows that nobody has a complete picture of you with only your words and actions to make judgements upon, and someone's judgement is shaped by their own biases.

OK, but someone else's view of me is incomplete, so surely my view of myself must be the real me? I'm the one in my head, after all! Well, yes and no.

You, too, are affected by biases, and it's unlikely you're aware of them all. This means your perception of how you're coming across to others is also affected. Just as others can't truly know your mind, you can't know theirs, and so never really know how you're being perceived.

Neither party has the whole picture, but neither is wrong. They're both true; it just depends on your point of view.

This is another area of interest that perhaps should be a topic for a future post, but for now, I'll link to an article on the Johari window and how it can help build better awareness and confidence:

The Johari Window - Building Self-Awareness and Trust (mindtools.com)

Biases also deserve a dedicated post, significantly impacting how people treat one another. In recent decades, women have had difficulty breaking back into IT and software development. With the stereotype being a man with poor hygiene, a lack of social skills and who rarely sees the light of day, it's no wonder young girls have drifted away from the discipline as a future career.

Things are changing, but old biases, such as men being more naturally suited to technology, being more logical and level-headed, and women being more emotional and less rational, still impede progress towards a future of fairer opportunities. This, and the possible solutions, will be explored in a future post.


Do, or do not. There is no try.
— Yoda

For much of my first year at Domino's, I struggled with imposter syndrome, something that many people reading this will be all too familiar with. However, this is also heavily related to confidence stemming from how you believe others perceive (or even judge) your abilities and skills. To emphasise: this is based on what you believe, not on what they actually believe.

To put it simply, you have to stop trying to be a mind reader, believing you have some magical ability to know other people's thoughts. People are often their own worst critics, and learning to recognise that these thoughts and feelings are, in fact, not a statement of reality can go a long way to improving your self-confidence.

When working on your confidence, it's essential to keep this in mind. It won't suddenly appear out of thin air; it takes practice, meaning you need to fake it till you make it. And this is where the magic lies; because even though you aren't confident within, you know people can still perceive you as such. This leads to better engagement with your ideas, which in turn leads to greater understanding and you feeling positive about yourself. Positivity drives confidence and greater exploration of new ideas, which yields new solutions for you to convey. And so the feedback loop begins.

So, how does this relate to my desire for greater knowledge and understanding, and wielding them to create new things? Well, if perceived confidence drives greater engagement, and engagement drives further development of your ideas, then the natural conclusion is that confidence increases the likelihood of discovering new perspectives on the problems we face, leading to a greater variety of potential solutions.

Of course, it's at this point that you might realise that for this to work, the way these different perspectives are communicated is vital, reinforcing the need for a sustained effort towards better communication.


Patience you must have
— Yoda

Being a software developer was incredibly rewarding. Initially working to find and fix bugs, the satisfaction I felt when I successfully tracked down an issue to its source was addictive, increasing again once I'd formed a solution. After deployment, I'd be content knowing that the next time a customer used that part of the system, it would work as expected for them to continue happily towards their desire (pizza!).

Over time, I became increasingly interested in getting involved with developing new features, building something new and hopefully grappling with new technologies. Joining my first sprint team was incredibly exciting, and I was fortunate enough to jump straight into a major project. On a whole new scale, the project pushed me to learn rapidly. I was fortunate enough to have some brilliant colleagues to learn from, and I learnt the immense value of pair programming.

Having completed the project, I realised there was an element that had particularly resonated with me: not just working on a single component, but thinking about how different components worked in tandem. Designing numerous interconnecting components to fulfil a requirement from the business would mean attaining an understanding of the ask/problem, gathering knowledge of any existing systems and technologies that could help, and finally presenting solutions back to the business for one to be selected and subsequently implemented.

A thread that runs throughout is, once again, communication. A skill I'd need to renew my focus on to help me reach the goal of becoming a Solutions Architect. While not something entirely tangible in and of itself, over time and with lots of self-reflection, evidence of my efforts became apparent in my interactions with those around me.

Perfecting communication is a never-ending endeavour, but one that, given time, is well worth the effort. The more aware you are of how you and others communicate, and the differences between them, the more you'll improve your ability to communicate. In turn, this leads to more effective conveying of your ideas, increasing productivity and understanding. People go away happier, with more confidence to move forwards. People gain clarity, which is often elusive, its absence causing no end of meetings and repeated conversations (and we all love meetings).


So long, and thanks for all the fish
— Douglas Adams

Sitting here now as a Solutions Architect, maintaining clarity when conveying my ideas is paramount. How can I expect people to assess and choose from the solutions I provide if they have yet to grasp what these solutions actually offer?

How people digest information is a whole area of study that I won't go into here (a future post!), but it significantly impacts how successful you are at sharing your ideas. Whilst you can't control how your audience digests the information you share, you can control how you express your ideas, meaning knowing your audience is critical to success.

Having almost completed my first six months as a Solutions Architect, it's the ideal time for me to take a step back and reflect upon the successes and challenges I've encountered. Analysing these will help me assess how successful my transmission of ideas has been and whether or not they've subsequently flourished. I can step outside myself and look at how I might tailor my approaches (in both communication and solutions) to better serve the various audiences and situations I will encounter in the future.

The result? Becoming a better communicator, and in turn, a more useful Architect. All whilst satisfying that deep-seated need to understand the why and how things work. Whilst I don't yet feel qualified to write about it here (I see you imposter syndrome!), I plan to write about my experiences in a future post.


Time to First Byte (TTFB)

Elon Musk has recently been talking publicly about the performance of Twitter.

This article will talk about one important aspect of web performance, Time to First Byte (TTFB), and how changes have been made in this area to provide customers with a better experience.

TTFB is a metric that measures the time between a client’s request and the arrival of the first byte of the response. By itself this metric won’t necessarily mean a website is performing well, but it is an important base metric to know, as it plays a part in the perceived load speed; for example, TTFB affects the Largest Contentful Paint (LCP) metric.

A good TTFB is under 800 ms, and anything over 1800 ms is considered poor.

Working on TTFB

Typical TTFB problems include slow infrastructure, databases and the like, which reduce the origin servers’ ability to respond to requests quickly enough. At Domino’s we have tried to remove any bottlenecks on our servers by taking advantage of the following techniques:

1. Caching and reusing content for multiple users can save processing the same code over and over. Some webpages have a mixture of content that is viewed by everyone and content that is viewed by a single customer only. This data can be split into different requests, and the shared data can be cached. E.g. the basket is unique to a customer, so it isn’t cached or sent down with the original document request, but potentially the rest of the page can be (see the sketch after this list).

2. The above splitting of content then helps with rendering the cacheable content and storing it on a content delivery network’s (CDN) edge servers. There are usually more edge servers on a CDN than your own origin servers, and they are also likely to be closer to the customers’ devices.

This is beneficial for a couple of reasons: first, being closer to the customer means the response travels faster, as there are fewer network hops and less physical distance; secondly, it takes load away from the origin servers, which helps against slow infrastructure and databases.

3. A large download size can reduce the effectiveness of the above two points: a slow mobile connection can be very bandwidth-limited, so you don’t want an initial document request of multiple megabytes, even if it performs fine on your local development machine.

The Domino’s homepage document is currently under 70 kB, yet it still includes the styles and JavaScript that allow the browser to show relevant content quickly without any major flash of unstyled content (FOUC). This was achieved using methods like compression (gzip), minification and tree shaking.
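As a rough illustration of points 1 and 3 combined, here is how cache headers and gzip compression might be wired up in ASP.NET Core (a minimal sketch, not our production configuration; the route and content are made up):

using Microsoft.AspNetCore.ResponseCompression;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddResponseCompression(options =>
{
    options.Providers.Add<GzipCompressionProvider>();
});

var app = builder.Build();

app.UseResponseCompression();

// Shared, non-personalised content can be marked as cacheable for everyone,
// so a CDN edge server can serve it without touching the origin.
app.MapGet("/menu", (HttpContext context) =>
{
    context.Response.Headers["Cache-Control"] = "public, max-age=300";
    return Results.Ok(new[] { "Margherita", "Pepperoni" });
});

app.Run();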

Results

All of the above techniques have had a positive impact on TTFB, and on the metrics it directly influences, like LCP. Ultimately these metrics improve a customer’s experience by allowing them to use and interact with the site faster.

Useful links

https://web.dev/ttfb/

Algorithms and Data Structures (Part 1)

In this post we will talk about how important the choice of data structures and algorithms is to the overall performance of any software.

With highly transactional software, performance optimisation is an integral part of the development process. These optimisations may come in many different forms, like using certain design patterns, improving your network configuration, and even down to the algorithms and data structures used in your code. Any bottleneck in the transaction flow can cause delay, and that can lead to further issues.

Algorithms (Time complexity)

In this article we will talk about an algorithm’s time complexity, which is measured with Big O notation. We will cover space complexity along with data structures in a future article.

So… in simplest terms, Big O notation is used to classify algorithms according to how their running time or space requirements grow as the input size grows. For example:

A simple algorithm (like the one sketched below) has a complexity of O(N), which is linear time, meaning that the processing time increases with the argument N. If N is 1 it takes one iteration, but we always take the worst-case scenario: if, for example, N = 1000000000, it takes 1000000000 iterations, so the complexity is O(N).

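The original post showed this as a screenshot; a minimal C# sketch of the same idea:

using System;

class Program
{
    static void Main()
    {
        const int n = 10;

        // O(N): the loop body runs once per value of i,
        // so the running time grows linearly with n.
        for (var i = 0; i < n; i++)
        {
            Console.WriteLine(i);
        }
    }
}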

To get all possible ordered pairs, the example below has a complexity of O(N^2). You can see why: we are looping over N twice, while the Console.WriteLine is considered constant time and is not counted.
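A sketch of that ordered-pairs example (reconstructed, as the original was shown as an image):

using System;

class Program
{
    static void Main()
    {
        const int n = 10;

        // O(N^2): the inner loop runs n times for each of the n outer iterations.
        for (var i = 0; i < n; i++)
        {
            for (var j = 0; j < n; j++)
            {
                // The Console.WriteLine itself is treated as constant time.
                Console.WriteLine($"({i}, {j})");
            }
        }
    }
}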

Let’s now take this a step further and show how the complexity of an algorithm can make or break a system, and how we can use memoisation or dynamic programming to improve things and solve the problem.

Fibonacci sequence optimisation

Next, we will show how two different implementations of an algorithm to get the nth value of the Fibonacci sequence can perform completely differently; the difference is like night and day. The implementation below is a recursive algorithm, and its complexity is O(2^n), which is exponential.

Algorithm 1

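The original code was shown as a screenshot; this is a sketch of the recursive approach described, with some basic timing added:

using System;
using System.Diagnostics;

class Program
{
    // O(2^n): each call spawns two more calls, so the call tree grows exponentially.
    static long Fib(int n) => n <= 1 ? n : Fib(n - 1) + Fib(n - 2);

    static void Main()
    {
        var stopwatch = Stopwatch.StartNew();
        Console.WriteLine(Fib(40));
        stopwatch.Stop();
        Console.WriteLine($"Elapsed: {stopwatch.ElapsedMilliseconds} ms");
    }
}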

With some very basic timestamps we can see that, for the value 40, it takes about 5 seconds to get the result back. That is a very long time, and if you were to increase the value it would eventually take days to return a result, if it didn’t throw a stack overflow exception first.


To tackle this problem, we need to improve the time, which we can do with a simple change.

As you can see below, we create a dictionary to short-circuit the recursion: we check whether the value is already available and, if it is, just return it. This is called memoisation, and it leads to O(N) time complexity.

Algorithm 2

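A sketch of the memoised version (again reconstructed from the description above):

using System;
using System.Collections.Generic;

class Program
{
    // Previously computed values short-circuit the recursion, giving O(N).
    static readonly Dictionary<int, long> Cache = new();

    static long Fib(int n)
    {
        if (n <= 1) return n;
        if (Cache.TryGetValue(n, out var cached)) return cached;

        var result = Fib(n - 1) + Fib(n - 2);
        Cache[n] = result;
        return result;
    }

    static void Main()
    {
        Console.WriteLine(Fib(40));
    }
}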

And if we run the program again, the result comes back almost instantly.


There are different ways of implementing algorithms, but if complexity is considered, calculated and measured, it can significantly improve the overall performance of a highly transactional system.

In part 2 we will talk about the choice of data structures and also the space complexity of algorithms.

Useful Links:

https://www.geeksforgeeks.org/fundamentals-of-algorithms/#AnalysisofAlgorithms

The journey from a manual tester to an aspiring developer

Sweat, Tears & Code

I’ve been a manual tester for just over 5 years (3 of which have been at Domino’s), and since late last year I’ve been on a journey to learn test automation tools such as SpecFlow on large projects. To understand how to use automation tools such as SpecFlow effectively, I’d need to learn development from the beginning – in the sense of ‘learn to walk before you can run’.

What I’ve learnt over the past year is that this is a long journey where the learning process never stops. There might be several mountains to climb – but it’s worth it.

Not only is learning development worth it to advance your career (especially considering test automation is rapidly becoming the norm), but it also benefits your personal development. In fact, it’s inspired me to learn more about development in general!

I thought I’d write a post for manual testers who are on a similar journey, or for anyone who wants to learn development in general. I’m hoping this post has something for everyone.

Find a learning strategy that works for you

There are various ways to learn to code: bootcamps, websites, videos, books and mobile applications. With so many different options to choose from, it can be quite intimidating to find one that works for you. Honestly, take your time and look around at what’s available. I believe it’s important to find a resource that you can understand and that fits into your schedule. I’ve compiled a list below of resources I have tried, with my thoughts on them:

Codecademy – This is a very good resource in my view; it offers a web and mobile application where you learn by doing. It offers many tutorials for free. There is a paid version where you can see the full catalogue of courses – but the free version still offers an excellent range of courses.

Pluralsight – Offers videos on a wide range of topics, featuring instructors such as Troy Hunt (famous for security). Pluralsight offers a web and mobile application.

Mimo – This uses multiple-choice questions in a bite-sized format. You do need to pay to use it. Web and mobile applications are available (it was originally a mobile-only application).

Learn Code the Hard Way – Honestly an amazing resource to learn Python and Ruby. The author, Zed A. Shaw, offers courses on other languages such as SQL and JavaScript. The Ruby and Python courses are free but the others are paid (one payment per course, no subscription).

Eloquent JavaScript – An excellent resource for learning JavaScript! Free to read online but there is a paperback version available as well. The website does have a code sandbox for you to practice your JavaScript in which is great. Definitely worth a read!

C# Programming in Easy Steps – I found this book useful; it gives you good examples of key areas in C#.

Remember to pace yourself

In my experience, it’s best not to rush! It’s perfectly okay to learn at a slow pace if you want to. If something doesn’t make sense, go back and read it again or research it. You may find learning at a slower pace suits you! There will be times when you’re working on something and you can’t get it to work – don’t panic! Take a break and come back to it; taking a step back might help you solve the problem. Learning to code doesn’t happen overnight – take your time!

Don’t be afraid to ask for help

I asked my colleagues for advice on how to solve problems and on what I should do next (in fact, that’s where I got the inspiration to create a C# program that performs a GET request on an API to get currency rates). You can also go to online communities such as Stack Overflow for help and advice on any programming problems you are facing.
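For anyone curious, a small program like that can be surprisingly short. Here’s a sketch using a placeholder URL (the real API I used isn’t shown here):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // Placeholder endpoint: swap in a real currency-rates API.
        var json = await client.GetStringAsync("https://api.example.com/rates?base=GBP");

        Console.WriteLine(json);
    }
}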

Work together

This goes hand in hand with the advice above: if you know anyone else who is learning development, ask them if they want to collaborate on a project or share advice on how to tackle problems. It’s perfectly okay to work by yourself if you want, however. There are online communities whose members want to help each other, such as the Codecademy forums.

Put what you know into practice – build!

If there’s one piece of advice I want to highlight the most, it’s this: build! You won’t improve or learn unless you build! Honestly, build anything, even if it’s something simple like printing a list or an array. I created an account on GitLab and started to upload everything I built, to create a portfolio of sorts. It’s important to practice whenever you can – what you’ve learnt won’t stick otherwise!

Impact of what I’ve learnt

Since learning more about development, I can say I’ve become more confident in working on more technical problems. I’ve even gone further afield in learning other automation tools such as Cypress. In terms of languages, I’ve had experience with C#, JavaScript (and TypeScript), Ruby (and Ruby on Rails), HTML and CSS. It’s been an interesting journey learning software development, and I’m interested to see where it takes me.

Thanks for reading!

Page load speed and customer experience

Over time, more and more research has shown how important the speed of a site is. Faster sites generally mean users have a better experience and are more engaged with your site. Whatever the aim of your site, speed can improve vital statistics like average session length, page views and conversion rate, all of which can increase the revenue the site generates.

Research carried out by Akamai found that a 1-second delay in page load can result in up to a 7% reduction in conversions. A 1-second delay may not seem like much whilst you are developing a new feature, and will probably be ignored, but percentages like this can result in millions of pounds of revenue being lost over the course of a year.

Page speed doesn’t just affect the user’s experience on your site; it can also affect how easily users can find your site. Google now uses page speed as a factor in how it ranks its results, so a poorly performing page could make the difference between you or your competitor being the first result returned by Google.

So, if you are not thinking about the speed of your website, it might be worth looking into the various techniques to improve performance.

Areas and Techniques

There are various areas to look at when working on the performance of your site. I’ll break these down into three areas and talk about each briefly:

1. Server-side

2. Client-side

3. Third-party

[Diagram showing Google Chrome DevTools (F12)]

Away from the client, on the server infrastructure, there are components such as databases and server-side code that need to execute to give the client a response. On the DevTools waterfall above, this is the time before the first response is received over the network (the Time to First Byte, or TTFB). It is important for this time to be as small as possible, because until then the user sees nothing. Techniques such as improving code efficiency, improving database queries and caching commonly used content can help reduce this time.
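As a small illustration of caching commonly used content, here is what an in-memory cache around an expensive lookup might look like in ASP.NET Core (a sketch only; the class and method names are made up for the example):

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class MenuService
{
    private readonly IMemoryCache _cache;

    public MenuService(IMemoryCache cache) => _cache = cache;

    public async Task<string> GetMenuHtmlAsync()
    {
        // Build the shared content once, then serve it from memory for 5 minutes.
        var html = await _cache.GetOrCreateAsync("menu-html", entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
            return LoadMenuFromDatabaseAsync();
        });

        return html ?? string.Empty;
    }

    // Stand-in for the slow database/render work being avoided.
    private Task<string> LoadMenuFromDatabaseAsync() =>
        Task.FromResult("<ul><li>Margherita</li><li>Pepperoni</li></ul>");
}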

Client-side performance is an area where you will see more variance, because a multitude of different devices can access your site. This variance can be due to differences in network speed (e.g. 3G vs Wi-Fi) and device performance (mobile phone vs desktop PC). You can analyse client-side performance using your browser (e.g. Google Chrome DevTools) or with online tools like WebPageTest. These tools give a waterfall chart showing the length and order of resources downloaded to the client, which can then be used to optimise performance with techniques like CSS & JavaScript minification and compression, using CDNs and leveraging client-side caching.

Examples of commonly used third parties in e-commerce are Google Analytics and payment providers. Third-party code can be called from the server side and the client side but, unlike the others, it can be much harder to optimise as you are not in control of the code. The scope for improving the performance of third-party code is therefore significantly reduced, so it is important to choose a good provider with good support. Also, if possible, try to reduce the number of third parties used and reduce the dependencies on them, either by removing them completely or by having a good fallback.

Next steps

Page load speed can significantly change the behaviour of your customers, which in turn can result in large changes to revenue. Even if you’ve never investigated the performance of your site, or you think it is fine, it’s worth analysing how well it’s doing and whether it can be improved. If you have Google Analytics on your site, the data it collects includes information on the speeds your customers are experiencing, and it will give you some simple tips to get started.

Useful links

WebPagetest: https://www.webpagetest.org/

Google Analytics: https://www.google.com/analytics/web/

Progressive Web Apps Basics

Progressive Web Apps bring your typical website experience closer to that of a mobile app. Unlike normal websites, they can cater for bad network conditions (or even being completely offline), add an icon to the home screen and send push notifications. Two important aspects of Progressive Web Apps that we’ll talk about in this article are Service Workers and Manifest Files.

Service Workers

Service Workers are one of the main components of Progressive Web Apps. A Service Worker runs separately to the main browser thread.

With a few simple lines of JavaScript you can register your Service Worker:

if ('serviceWorker' in navigator) {
  window.addEventListener('load', function() {
    navigator.serviceWorker.register('/sw.js').then(function(registration) {
      // Registration was successful
      console.log('ServiceWorker registration successful with scope: ', registration.scope);
    }, function(err) {
      // registration failed :(
      console.log('ServiceWorker registration failed: ', err);
    });
  });
}

Once the Service Worker is registered, you can add code to intercept requests, cache resources and send push notifications. For example, if you want your Progressive Web App to work in times of bad or no internet connectivity, you can have the Service Worker intercept network calls and cache the resources; if the network is unavailable in the future, the cached resource can be used as a fallback.
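For example, a simple network-first strategy with a cache fallback can be added to sw.js along these lines (a sketch; the cache name and exact strategy will vary per app):

self.addEventListener('fetch', function(event) {
  // Only GET requests are safe to cache.
  if (event.request.method !== 'GET') return;

  event.respondWith(
    fetch(event.request)
      .then(function(response) {
        // Keep a copy of the successful response for later use.
        var copy = response.clone();
        caches.open('v1').then(function(cache) {
          cache.put(event.request, copy);
        });
        return response;
      })
      .catch(function() {
        // The network failed, so fall back to the cache.
        return caches.match(event.request);
      })
  );
});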

Service Workers are compatible with most modern web browsers, like newer versions of Chrome (45+), Firefox (44+), Edge (17+) and Opera (32+).

Manifest Files

A manifest file is used to provide information about how the web application should behave when installed onto a device. The information is provided in JSON format and contains properties like the name of the app, its icons, theme colour, etc.

This JSON file is then simply added to the HTML of the webpage like:

<link rel="manifest" href="/manifest.json">
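For reference, a minimal manifest.json could look something like this (the name, colours and icon path are purely illustrative):

{
  "name": "Example Pizza App",
  "short_name": "Pizza",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#006491",
  "icons": [
    {
      "src": "/icons/icon-192.png",
      "sizes": "192x192",
      "type": "image/png"
    }
  ]
}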

Useful links

Building an offline progressive web app: https://developers.google.com/web/fundamentals/codelabs/offline/

Web app manifest files: https://developers.google.com/web/fundamentals/web-app-manifest/

Service Worker browser compatibility: https://caniuse.com/#search=serviceworker