AECOM Global Unite Cost Benchmarking

AECOM is a global provider of professional technical and management support services to a broad range of markets, including transportation, facilities, environmental, energy, water and government. With approximately 45,000 employees around the world, AECOM is a leader in all of the key markets that it serves. AECOM provides a blend of global reach, local knowledge, innovation and technical excellence in delivering solutions that create, enhance and sustain the world's built, natural and social environments. A Fortune 500 company, AECOM serves clients in more than 140 countries and had revenue of $8.2 billion during the 12 months ended March 31, 2013.

 
 
wooloware_shores_3.jpg
 

The Problem

The organisation operates on every continent bar Antarctica and has over 45,000 staff. The combined intellectual property is enormous; however, accessing this collective knowledge and sharing its detail has proven cumbersome. Collecting project information for benchmarking and lessons learnt is a continual challenge. Benchmarking occurs, but it is a manual process, either collated by a research specialist or compiled on the fly by emailing a distribution list to ask for sample plans for a given sector and building type. At the root of the problem is the number of disparate, thick-client applications installed on client/server infrastructure, alongside unstructured documents hidden in folders on individual PCs.

A particular area of interest for AECOM was the benchmarking of cost estimates. Cost plans are generally created in a thick-client application installed on the desktop of the cost manager's computer. The range of tools in the marketplace to do this is considerable; however, few, if any, allow plans to be consolidated across a portfolio and averaged to produce benchmark pricing.

The opportunity to leverage this massive pool of cost information, creating benchmarks of cost and quantity information to better service AECOM clients, was identified by senior management in Australia and linked to the portfolio capabilities of UniPhi's project portfolio software. Two key issues had to be addressed: how to get the disparate cost plans, captured in a multitude of systems, into the application without burdening the cost manager; and how to then aggregate this information and present it back to the cost manager in a dynamic and useful way.

 

The Business Case

Typically in corporate finance there is one key ingredient to getting approval to invest: political momentum. Without political momentum, you have to rely on the more academically taught methods of positive net present value, and you will be hoping there is excess capital budget when you make the pitch. In information technology, the key political ingredient is called the killer app. The killer app is the piece of functionality that makes people go "wow". People purchased the first iPod over cheaper and more functional MP3 players because, when it was launched, people went "wow". This flowed through to the iPhone and iPad, as each device came with the signature Apple design quality and "wow" factor.

For this project, the key to getting the business case over the line was to create a link between the main cost planning tool used by the Project Cost Consultancy division of AECOM (Cato, built by Causeway) and UniPhi's enterprise software. This link was developed by UniPhi's software development team in collaboration with AECOM Technical Director Barry Laycock and his team of cost managers. UniPhi utilised the chart of accounts functionality already developed in the enterprise application to match, code for code, the elemental structure that existed in Cato. Cato cost plans are built off a master template of elements that are then drilled into to estimate, bottom up, the cost of a building. By matching the codes in the template with the structure of a UniPhi chart of accounts, each element's quantity, rate and total could be imported dynamically into the UniPhi database from the Cato database. Once the data was captured, aggregating and sharing the information was simple due to UniPhi's portfolio functionality and its web-based architecture.
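
A minimal C# sketch of this code-matching import is below. The table and column names are illustrative assumptions, not the actual Cato or UniPhi schemas.

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    // Illustrative sketch only: table and column names are assumptions,
    // not the actual Cato or UniPhi schemas.
    class CatoImportSketch
    {
        static void Main()
        {
            // Chart of accounts codes already defined in UniPhi, keyed by
            // the same elemental codes used in the Cato master template.
            var uniphiAccounts = new HashSet<string> { "01.01", "01.02", "02.01" };

            using (var cato = new SqlConnection("Server=...;Database=Cato;..."))
            using (var uniphi = new SqlConnection("Server=...;Database=UniPhi;..."))
            {
                cato.Open();
                uniphi.Open();

                var read = new SqlCommand(
                    "SELECT ElementCode, Quantity, Rate, Total FROM CostPlanElements", cato);

                using (var rows = read.ExecuteReader())
                {
                    while (rows.Read())
                    {
                        var code = rows.GetString(0);
                        if (!uniphiAccounts.Contains(code))
                            continue; // no matching chart of accounts code: skip

                        // Write the matched element into the UniPhi cost plan.
                        var write = new SqlCommand(
                            "INSERT INTO CostPlanLines (AccountCode, Quantity, Rate, Total) " +
                            "VALUES (@code, @qty, @rate, @total)", uniphi);
                        write.Parameters.AddWithValue("@code", code);
                        write.Parameters.AddWithValue("@qty", rows.GetDecimal(1));
                        write.Parameters.AddWithValue("@rate", rows.GetDecimal(2));
                        write.Parameters.AddWithValue("@total", rows.GetDecimal(3));
                        write.ExecuteNonQuery();
                    }
                }
            }
        }
    }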

The end result was the "Cato import" button. Clicking this button brings up two drop-down lists populated with data from the underlying Cato database. For cost managers, seeing data that they had created in one application displayed in the completely different interface of a web application was amazing. From there, the rest was just a matter of time.

 

The Systems Architecture

UniPhi's project management software is built on the philosophy of distributed data capture: the system is designed to assist not only project managers but everyone working in project teams. This philosophy was maintained when designing a solution to the issues described above, which meant incorporating the data capture into the standard processes of a cost manager. To do this, UniPhi had to integrate with all the cost planning tools used by Davis Langdon. Once this technical aspect was achieved, UniPhi's enterprise version could act as the centralising interface to thousands of cost plans captured daily in the normal course of business.

However, to maintain speed of performance, allow for regional nuances and therefore gain buy-in from cost managers, it was decided to deploy geography-specific UniPhi databases. This meant deploying the web application and database in six different locations: the United Kingdom and Europe, North America, South Africa, the Middle East, Australia and New Zealand, and finally Asia Pacific. Each geography had a subtly different elemental breakdown structure and captured cost and quantity information in subtly different ways. While it would have been possible to customise the UniPhi application to handle this within one database, distributing six databases around the world ensured high performance and limited the level of change to the core software code base.

The UniPhi software development team has learnt the hard way over the years the importance of not deviating too far from the core application when customising it for a client. The system has a broad range of configuration options that allow it to look considerably different from one deployment to the next without requiring code changes. However, the specificity of the requirements for Global Unite meant it was going to be difficult to create configurable options for all the nuanced differences from one country to the next. As the UniPhi application was an intermediary to the final product, known as the global data warehouse and its associated Analysis Services cube, distributing the applications across six geographies did not limit the globalisation of the information. A graphic of this design is below:

unite_architecture.png

UniPhi's software development team builds on Microsoft's technology stack. UniPhi's web application is written in C#, and the team works in Microsoft's Visual Studio .NET platform. Source code is committed to a Subversion repository and accessed using TortoiseSVN. The database for the UniPhi web application is Microsoft SQL Server 2008 Standard. Using this architecture as the basis, Microsoft SQL Server Integration Services (SSIS) was used to write extract, transform and load (ETL) scripts that take the relevant information from each UniPhi database and import it into a new database structure stored in a Microsoft SQL Server 2008 Enterprise database. These SSIS packages are scheduled to run at 11pm local time in each geography, meaning the data warehouse is almost constantly being updated.
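
In production this nightly run is an SSIS package; the C# sketch below shows the equivalent extract-and-load step for one geography, with assumed table names and connection strings.

    using System.Data.SqlClient;

    // Sketch of the nightly extract-and-load step. In production this runs
    // as a scheduled SSIS package; all names here are assumptions.
    class NightlyEtlSketch
    {
        static void Main()
        {
            using (var regional = new SqlConnection("Server=...;Database=UniPhi_ANZ;..."))
            using (var warehouse = new SqlConnection("Server=...;Database=GlobalUnite;..."))
            {
                regional.Open();
                warehouse.Open();

                // Extract: pull the cost plan lines changed since the last run.
                var extract = new SqlCommand(
                    "SELECT AccountCode, Quantity, Rate, Total, ProjectId " +
                    "FROM CostPlanLines WHERE ModifiedDate >= DATEADD(day, -1, GETDATE())",
                    regional);

                using (var rows = extract.ExecuteReader())
                using (var load = new SqlBulkCopy(warehouse))
                {
                    // Load: bulk copy into a warehouse staging table, where the
                    // transform step maps regional codes onto the global standard.
                    load.DestinationTableName = "staging.CostPlanLines";
                    load.WriteToServer(rows);
                }
            }
        }
    }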

The purpose of Global Unite is to provide cost benchmark information. As this means aggregating data stored at the cost plan level into averages for particular sectors, project types and so on, it was decided to utilise Microsoft Analysis Services. The Analysis Services cubes take the data warehouse data and provide a variety of dimensions and facts for both software developers and end users to access. Software developers use the cube to write MDX queries. These queries are distributed through a SOAP web service and displayed within the UniPhi application next to the imported cost plan, so that straight after an import end users can see how different their plan is from the average for that type of project. Displaying Global Unite data within the UniPhi application has been key to building momentum in the use of this data.
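
A sketch of such a query, using the ADOMD.NET client that ships with Analysis Services, is below; the cube, dimension and measure names are illustrative assumptions.

    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    // Sketch of querying the cube for benchmark averages. The cube,
    // dimension and measure names are assumptions, not the real schema.
    class BenchmarkQuerySketch
    {
        static void Main()
        {
            using (var conn = new AdomdConnection("Data Source=...;Catalog=GlobalUnite"))
            {
                conn.Open();

                // Average rate per element for one sector and project type.
                var mdx = new AdomdCommand(
                    "SELECT [Measures].[Average Rate] ON COLUMNS, " +
                    "[Element].[Code].Members ON ROWS " +
                    "FROM [Benchmarks] " +
                    "WHERE ([Sector].[Sector Name].[Education], " +
                    "       [Project Type].[Type Name].[New Build])",
                    conn);

                using (var reader = mdx.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0}: {1}", reader[0], reader[1]);
                }
            }
        }
    }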

The same web service (reducing maintenance) is utilised by an iPad application that downloads abridged data sets, filtered by the end user, to display specific sectors and types offline. This enables cost managers to present cost information dynamically to clients on client premises, in cafes, on site or anywhere else they find it useful to do so.
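
The shared contract might look something like the WCF sketch below; the operation and type names are assumptions, as the source does not describe the actual service interface.

    using System.Collections.Generic;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // Sketch of the shared SOAP contract consumed by both the UniPhi web
    // application and the iPad app. All names are illustrative assumptions.
    [ServiceContract]
    public interface IBenchmarkService
    {
        // Returns an abridged benchmark set filtered to the sectors and
        // project types the user has selected, e.g. for offline use.
        [OperationContract]
        List<BenchmarkRow> GetBenchmarks(string sector, string projectType);
    }

    [DataContract]
    public class BenchmarkRow
    {
        [DataMember] public string ElementCode { get; set; }
        [DataMember] public decimal AverageRate { get; set; }
        [DataMember] public decimal AverageQuantity { get; set; }
    }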

The same MDX queries are also used in Microsoft Reporting Services reports (accessed through a reports tab within the UniPhi web application). End users can also query the data directly and create their own bespoke analysis by adding the cube to Microsoft Excel through the Analysis Services data connection function. This gives end users complete flexibility to analyse cost information in any way they can imagine.

 

Data Required for Valid Benchmarking

Many obstacles had to be overcome to achieve a valid data warehouse. The range of data required to use a cost plan for benchmarking purposes into the future and across geographic areas includes:

  • Time factors to allow for inflation

  • Location factors to allow for differences in labour and material prices

  • Currency exchange

  • Measurement differences including metric to imperial conversion and different base floor metrics (e.g. Fully Enclosed Covered Area versus Gross Floor Area versus Gross Area)

  • Cost breakdown differences, for example whether windows and doors are captured together or as separate line items

Interfaces were built within the UniPhi for the Enterprise application to capture all of the elements above. Mapping cost code structures, both within a geographic region and across geographies, allowed different cost breakdowns to be merged into the regional and global standards essential for quantity-based benchmarks.
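
A minimal C# sketch of this mapping, with made-up regional and global codes: two regions that break windows and doors down differently are merged into the same global element.

    using System.Collections.Generic;
    using System.Linq;

    // Sketch of merging different regional cost breakdowns into one global
    // standard. All codes and mappings are illustrative assumptions.
    class CodeMappingSketch
    {
        // The UK breakdown splits windows and doors; the ANZ breakdown
        // captures them together. Both map to the same global element.
        static readonly Dictionary<string, string> RegionalToGlobal =
            new Dictionary<string, string>
        {
            { "ANZ:WD", "GLB.10" },   // windows and doors combined
            { "UK:WIN", "GLB.10" },   // windows only
            { "UK:DRS", "GLB.10" },   // doors only
        };

        // Merge regional lines into global totals so they can be benchmarked.
        static Dictionary<string, decimal> MergeToGlobal(
            IEnumerable<(string regionalCode, decimal total)> lines)
        {
            return lines
                .Where(l => RegionalToGlobal.ContainsKey(l.regionalCode))
                .GroupBy(l => RegionalToGlobal[l.regionalCode])
                .ToDictionary(g => g.Key, g => g.Sum(l => l.total));
        }
    }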

Similar factors are applied when pricing a job for a specific location. For example, all new school buildings within a certain floor plate range built in the United States can be converted to be priced as if they were going to be built in San Francisco in 2014. Differences can then be analysed using key cost drivers, such as wall-to-floor ratios, and specifics of the plan, such as timber versus steel frames.
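
Once the factors above are captured, the normalisation itself is simple arithmetic. A minimal C# sketch with made-up index values:

    using System;

    // Sketch of normalising a historical rate to a target time and location.
    // The index values below are made up for illustration.
    class NormalisationSketch
    {
        static void Main()
        {
            decimal rate = 2400m;           // cost per m2, Chicago, 2011
            decimal timeFactor = 1.08m;     // inflation index, 2011 -> 2014
            decimal locationFactor = 1.15m; // Chicago -> San Francisco index
            decimal areaFactor = 1.0m;      // both plans measured as Gross Floor Area
            decimal fxRate = 1.0m;          // both already in USD

            decimal normalised = rate * timeFactor * locationFactor * areaFactor * fxRate;

            // 2400 * 1.08 * 1.15 = 2980.80 per m2, as if built in San Francisco in 2014
            Console.WriteLine("Normalised rate: {0:C} per m2", normalised);
        }
    }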

 

UniPhi for Major Projects Used to Roll Out UniPhi for the Enterprise

As is usually the case, not all of the items described above were identified in the first prototype of the project. The first step was to solve the technical issues of connecting to Cato. UniPhi for Major Projects was deployed to resolve technical issues, collaborate with end users and allow the entire global project team to keep abreast of issues in one central place. The UniPhi software platform is built on the emerging research field dedicated to managing complex projects, drawing on methodologies such as semi-structured time series (or JAZZ), iterative development, collaborative procurement, risk mind mapping, virtual teams and values-based management.

The Global Unite project was an excellent case study in adopting complex adaptive tools and processes to manage complex projects, rather than the traditional PM tools of Gantt charts and spreadsheets that follow a controlled project management plan and requirements document. The project involved a team working across multiple time zones on every continent bar Antarctica. Many of the team members have never met, yet they engaged in robust debates about how to resolve key issues, such as displaying benchmark values to the end user as they sense-check their current cost plan. This particular issue was resolved through three iterations of development. Each new iteration was communicated via a video of the functionality uploaded to the specific issue, with comments from the entire project team following its submission. This means each iteration can be seen in one long conversation thread, which resulted in arguably the best piece of functionality possible, utilising all the development that had occurred not just for this issue but for the entire project.

uniphi_comments.png

The success of the project (see above) demonstrates that this methodology can produce results far superior to those of traditional project management methods. Wider uptake of complex adaptive processes is essential if the major technical, directional, temporal or structural projects being invested in around the world are to improve the effectiveness and efficiency of their implementation.