Building an LMS on the Cloud
There are many ways to host education content within a company, from large commercial learning management systems like Blackboard to open-source options like Moodle. We decided to build our own Learning Management System; this is how we did it, why, and how it has worked out for us.
What sort of LMS do we need?
What is the best Learning Management System to use for our eLearning content? This was the question I was asking a couple of years ago.
Up to this point we had been launching our content from two separate web portals: one an external portal for partners, the other a SharePoint site for internal staff, with the assets themselves hosted on an ISP’s web server. Initially our only need was to make content available; there were no demands on us to track individual users’ progress, so an LMS was overkill when a web server would do the job at a fraction of the cost.
However, once we needed to track user progress through our eLearning, we had to look for an LMS that would meet our needs. I looked through the specifications of the leading contenders and sat through demonstrations of many feature-rich ‘Swiss Army knives’, only to witness horrible web interfaces, each more difficult to use than the last! What I actually wanted was something simple to use, with the flexibility to add new functionality and features as required, and as we dreamt up new ways to engage our end users.
Eventually it looked like the only way forward was to build something that met our needs in-house. This obviously isn’t a typical approach or an easy step to take. The decision point came when I realised it was more cost effective to hire a developer full time than to pay the licence fees for an existing LMS. Building our own might not get a system up and running within a week, but it gave us the ability to influence the development of the system and to design how it would work down to the smallest detail.
Rich media content development
We had for some time been developing rich media content that included video, such as interviews with subject matter experts, green screen presentations, 3D animation and interactive content. For a telecommunications audience, rich media education was a significant improvement over what they were usually subjected to: the ‘eLearning’ equivalent of voice over PowerPoint. This was everywhere at the time, and we found it lacked the ability to engage audiences or raise anything other than a pained groan, particularly when used to present content that is often technical and usually lends itself to a highly visual, interactive delivery.
We had chosen Adobe Flash as our development platform, which wasn’t too controversial in 2009/2010. In our rebellion against bullet-point text we found we could deliver cost effective, engaging education without resorting to rapid development tools.
To organise our content we wanted a modular approach where we could address each asset individually. Each asset was a SWF file that presented a stand-alone education concept. This approach allowed us to recombine assets into new courses based on the target user group’s needs; the individual courses were then constructed using a simple XML configuration file.
To enable this distributed modular approach we used a tiny Flash container on the launch portal, only a few kilobytes in size. This container in turn downloaded the learning interface, the course XML, and the course assets.
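The course XML idea can be sketched as follows. The element and attribute names below are illustrative only, assuming a course defined as an ordered list of SWF assets; the original schema isn’t shown in full in this article.

```python
# A minimal sketch of assembling a modular course from an XML
# configuration file. Element and attribute names are assumptions,
# not the actual schema used by the LMS.
import xml.etree.ElementTree as ET

COURSE_XML = """
<course title="Intro to Optical Networking">
    <asset id="welcome" src="assets/welcome.swf"/>
    <asset id="sme-interview" src="assets/interview.swf"/>
    <asset id="quiz" src="assets/quiz.swf"/>
</course>
"""

def load_course(xml_text):
    """Parse a course definition into a title and an ordered asset list."""
    root = ET.fromstring(xml_text)
    return root.get("title"), [a.get("src") for a in root.findall("asset")]

title, assets = load_course(COURSE_XML)
print(title)   # Intro to Optical Networking
print(assets)  # ['assets/welcome.swf', 'assets/interview.swf', 'assets/quiz.swf']
```

Because each asset stands alone, a new course for a different audience is just a new XML file recombining existing `src` entries.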
This approach allowed us to provide access to our eLearning courseware on any intranet/extranet/web server. In addition, we could deploy content to customer intranets or externally managed partner portals whilst maintaining complete control over our content, including the ability to update material without having to work through external system administrators.
A note on content security: educational content could only be accessed through the eLearning interface, as asset URLs are hidden within the course XML.
The LMS UI
As Adobe Flash was our development platform of choice, Flex was a natural choice for our application framework, allowing easy collaboration between design and development. Implementing a rich media UI gave us the ability to design a unique user experience, very different from traditional eLearning templates.
It’s all grinding to a halt!
With the success of our rich media approach we found ourselves with rapidly increasing asset sizes - especially for long interviews or product demonstrations - and this was starting to stretch the ability of our web hosting service to deliver the material reliably. With a single server hosting our content, download times grew, and students faced longer and longer waits between each page of a course. To add to this problem, as our content became more popular, multiple students hitting the same server caused further delays. One factor guaranteed to drive a high drop-out rate is students having to wait more than a few seconds for each page.
With growing delegate numbers we were also approaching our provider’s monthly download cap, which, once exceeded, would dramatically increase our costs. This would have been an issue for some of the larger education programmes we intended to deliver.
It’s always good to get a call from the boss who tells you that he is about to include our material in a programme for 10,000 additional delegates globally! The power of eLearning, and the cost of an unplanned expansion in traffic!
Migrating to the cloud
By April 2011, in response to requests for increased delivery capacity, we had moved all of our content onto Amazon Web Services’ Simple Storage Service (S3). S3 provides a pay-as-you-use service priced on the amount of storage space used plus the amount of data transferred. With a few simple calculations I could see we would immediately reduce our storage costs to a few dollars a month. Our original Virtual Private Server solution by its very nature had a fixed price, but with storage size limits and a bandwidth cap; VPS worked out more expensive for storage than S3 in every one of our use cases. I remember our first month’s AWS bill was less than $3 USD!
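The “few simple calculations” amount to storage plus transfer. The rates and volumes below are assumptions for illustration, not AWS’s actual price list then or now:

```python
# Back-of-envelope S3 cost check under the pay-as-you-use model:
# storage plus data transferred out. Both rates are assumed figures
# for the sketch, not real AWS pricing.
STORAGE_RATE_GB_MONTH = 0.14   # $/GB-month (assumed)
TRANSFER_RATE_GB = 0.12        # $/GB transferred out (assumed)

def monthly_cost(storage_gb, transfer_gb):
    return storage_gb * STORAGE_RATE_GB_MONTH + transfer_gb * TRANSFER_RATE_GB

# e.g. 5 GB of courseware stored and 15 GB downloaded in a month:
cost = monthly_cost(5, 15)
print(f"${cost:.2f}")  # $2.50
```

At those sorts of volumes the bill stays in single digits, which is consistent with the sub-$3 first month mentioned above.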
One new benefit I was looking forward to trying out was CloudFront, Amazon’s content distribution network, which provides a high-bandwidth backbone and caching in multiple locations globally. This meant we were no longer limited to a single server. In CloudFront, then as now, requests for content are routed to the nearest point of presence; the first request pulls the content down from the S3 bucket and caches it local to the user, so every subsequent user is served from their local point of presence, not the central bucket.
From a global perspective this meant we now had 40 servers to download from, all with effectively unlimited bandwidth. We knew that Netflix used AWS to deliver high definition TV and movie content globally, and if it was good enough for them, it should work just fine for us.
CloudFront distributions also had the optional extra of video streaming, which meant we could stream video content instead of using progressive download. Streaming means a user can jump forwards or backwards in a movie file without waiting for the preceding content to download first - playback starts at the point in the video where the user wants to begin. This is particularly valuable in interviews, and puts control in the hands of the user.
Once we had implemented CloudFront we saw a massive improvement in response time and speed of download. Every test from every location showed a more responsive delivery mechanism. Our first test used a 17 MB file which took, on average, 24 seconds to download from our web server; using AWS we got this down to 6 seconds - an immediate 4-fold increase in speed! We have subsequently tested CloudFront from multiple locations around the globe, and all tests have shown significant speed improvements. The only limiting factor in any test was the user’s local loop connection to the internet.
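Worked through, the figures above translate directly into throughput and speed-up:

```python
# The test quoted above: a 17 MB file downloaded in 24 s from the old
# web server versus 6 s via CloudFront.
FILE_MB = 17
old_s, new_s = 24, 6

old_rate = FILE_MB / old_s   # ~0.71 MB/s from the single server
new_rate = FILE_MB / new_s   # ~2.83 MB/s via CloudFront
speedup = old_s / new_s
print(speedup)  # 4.0
```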
CloudFront’s 40 points of presence, connected to a global backbone, mean there are no bandwidth limitations at a regional level, and response times are excellent because files are cached locally.
Security and tracking
The storage and distribution of content is only one part of the function of an LMS; the next ingredients we needed to add were access control and tracking.
We needed tracking for certification purposes and access control to ensure we kept customer, internal, and partner information segregated. We decided against SCORM for tracking as it limited what we could do with our content; I’ve written more about that here. We wanted a system that would show progress through a course while giving delegates the ability to restart courses from where they had previously left off. We had the eventual aim of making this work across different platforms and devices, so students can start a course on their laptop and continue it on their tablet or phone.
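The resume-where-you-left-off idea boils down to keying the bookmark on the user rather than the device. A minimal sketch, assuming a simple (user, course) → last-page store; the names and in-memory storage are illustrative, not the actual schema:

```python
# Sketch of per-delegate progress tracking: record the furthest page
# reached in each course against the user, so progress follows them
# across laptop, tablet, and phone. Illustrative names throughout.
class ProgressTracker:
    def __init__(self):
        self._bookmarks = {}  # (user_id, course_id) -> furthest page index

    def record_page(self, user_id, course_id, page):
        # Only ever move the bookmark forward, never backwards.
        key = (user_id, course_id)
        self._bookmarks[key] = max(page, self._bookmarks.get(key, 0))

    def resume_point(self, user_id, course_id):
        # Page 0 for delegates who haven't started the course yet.
        return self._bookmarks.get((user_id, course_id), 0)

tracker = ProgressTracker()
tracker.record_page("alice", "ngn-101", 4)   # viewed on her laptop
tracker.record_page("alice", "ngn-101", 7)   # continued on her tablet
print(tracker.resume_point("alice", "ngn-101"))  # 7
```

Because the key is the user, not the session or device, the same resume point is served wherever the delegate next logs in.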
In terms of security, we started from a simple access control script and expanded it to include the parameters needed to build delegate profiles. We added security groups to ensure users only got access to the content we wanted each group to see. A traditional model of user/group and privilege/group mappings proved very flexible, especially as users can be assigned to multiple groups.
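The user/group and privilege/group model can be sketched like this: a user may belong to several groups, and sees the union of the content those groups are allowed. Group and course names below are made up for the example:

```python
# Sketch of group-based access control: users map to groups, groups
# map to content, and a user's visible content is the union across
# their groups. All names here are illustrative.
USER_GROUPS = {
    "alice": {"internal", "partners"},
    "bob": {"customers"},
}
GROUP_CONTENT = {
    "internal": {"roadmap-2012", "ngn-101"},
    "partners": {"ngn-101", "sales-play"},
    "customers": {"ngn-101"},
}

def allowed_courses(user):
    """Union of the content every one of the user's groups may see."""
    courses = set()
    for group in USER_GROUPS.get(user, set()):
        courses |= GROUP_CONTENT.get(group, set())
    return courses

print(sorted(allowed_courses("alice")))  # ['ngn-101', 'roadmap-2012', 'sales-play']
print(sorted(allowed_courses("bob")))    # ['ngn-101']
```

The flexibility mentioned above falls out naturally: adding a user to a second group simply widens the union, with no per-user permissions to maintain.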
Security is provided firstly through user registration control when you access our LMS, and secondly through Amazon’s IAM access control, which ensures our content is only served to authenticated users.
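To show the general idea of serving content only to authenticated users for a limited time: CloudFront’s real signed URLs use an RSA-signed policy, so the HMAC sketch below is a deliberately simplified stand-in, not the actual AWS mechanism.

```python
# Simplified illustration of an expiring signed URL. CloudFront
# actually signs a policy with an RSA private key; this HMAC version
# only demonstrates the concept. The secret and path are assumptions.
import hashlib
import hmac
import time

SECRET = b"demo-secret"  # illustration only; never hard-code real secrets

def sign_url(path, expires_at, secret=SECRET):
    """Append an expiry timestamp and a signature over path + expiry."""
    msg = f"{path}|{expires_at}".encode()
    sig = hmac.new(secret, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires_at}&sig={sig}"

def verify_url(path, expires_at, sig, now=None, secret=SECRET):
    """Reject expired links, then check the signature in constant time."""
    now = time.time() if now is None else now
    if now > expires_at:
        return False
    expected = hmac.new(secret, f"{path}|{expires_at}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

url = sign_url("/assets/interview.swf", expires_at=1_700_000_000)
```

A link generated this way stops working once the expiry passes, so even a shared URL cannot bypass the registration step for long.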
EC2 Application Server
To build the application server we started with a LAMP stack (Linux, Apache, MySQL, PHP); once we had a system working locally we rebuilt it using an Amazon EC2 image as a web server connected to an Amazon RDS database server. Most of our development work was done on a micro instance, which was free for the first year. After that, we added load balancing and auto-scaling to increase capacity based on load. Amazon’s Route 53 provided the DNS service, although we kept domain registration with our old ISP.
The development of the LMS has depended on the skills of our team, which consisted of three very capable individuals (we’ve grown a little since then) who worked hard to create engaging content and develop the systems from which it is served. If I were to unfairly pigeonhole their roles, I would say:
- Architect & AWS Admin (me)
- Implementation (Gary)
- Design (Derrin)
- Programming (Dhana)
The division of labour in this small ‘2-pizza team’, together with the skills and spirit of innovation each individual brought to this project, has enabled us to develop a great system that integrates well with our content, has a nice look and feel, has the flexibility to support the features we need when we need them, and offers something that works well for the end users. In addition we get all of this with no license fees or subscription costs.
Integrating the development of the eLearning content with the development of the system on which it is hosted has allowed tighter collaboration between these two elements. It has also given us the ability to build new features into our content and have them supported by the back-end system that powers it. Developing the LMS ourselves has allowed a lean development model with complete control over direction; we can easily integrate the platform with other training tools we create, such as event management, curriculum selection, training development management, or training scheduling tools.
I’m looking forward to where we are going with this development: with a small team we can add individual features quickly, respond to users’ needs, and develop a system that exactly meets the needs of the business. I’m keeping the next feature set under my hat, but I’m looking forward to it.