Recently we have been involved in two projects with two separate organisations where the same solution was implemented. The two organisations took vastly different approaches, and as a result their outcomes on Go Live day, and their exposure to cost overruns, were significantly different.
For Organisation 1, the initial Scoping session didn't involve any of the end users, and neither did the more detailed Design session. Even though we requested that some senior end users be present in the Design sessions, this didn't happen. The output documents were therefore missing real coal-face operational requirements and design decisions.
For Organisation 2, senior users were present in both the Scoping and Design sessions. As part of the process, we were also asked to set up a simple proof-of-concept software image loaded with some of the client's data. The organisation then brought in different users to try the software out before finalising the Design session documents.
Organisation 1 missed an opportunity here to involve its senior end users, and this came back to bite the project later in the form of significant Change Requests.
The real test of the two approaches, however, came later: User Acceptance Testing (UAT).
Organisation 2 invested significant time in the UAT process, testing not only the solution itself but the many integrations to and from other systems. They tested and retested to ensure no stone was left unturned. Organisation 1, by contrast, did almost no UAT, pushing much of the responsibility for it back to us. The results of these different approaches were evident on each organisation's Go Live date.
Organisation 1 had more than 100 Go Live issues in the first two weeks, which demanded a huge effort from everyone involved and drove a significant cost increase.
Organisation 2 had one issue on Go Live day: an incorrectly set up email account, which prevented one person from logging in.
We cannot stress enough the importance of good planning and user acceptance testing in ensuring a successful software implementation. The client must ensure that adequate resources (suitable staff) are provided at all the critical phases of the project.
Olympic offers direct engagement with our clients: we provide an Implementation Plan which is reviewed in a kick-off meeting and used as a working document throughout the engagement. On-site or remote training is provided, and a User Guide is always available to users. Olympic will set up Live and Sandbox environments and the required services on the client's chosen server(s). The Sandbox environment gives the client an area where they can train and test without affecting live data.
We have an established, quality deployment capability and 'out of the box' best-practice processes. Our current GO onboarding process has been built with quality and thorough deployment as the central focus. The first phase of the process is the Business Requirements definition, which allows detailed business planning to be conducted and ensures a higher degree of success. Typical implementation delivery times for a GO Timesheets, Expenses and Leave deployment, including training and support, range from 8 to 12 weeks or more, depending on the number of integrations and client staff availability. We follow a scripted process in which both Olympic and client responsibilities are clearly defined.
Our strong recommendation is that our clients follow the implementation plan we have set up, because we know it works.
If you're interested in talking to us about a GO implementation, please contact us today.