Digital infrastructure – the future of engineering

We can radically improve infrastructure performance by bringing BIM to maturity, embracing the ‘internet of things’ and making sense of big data – but can our industry rise to the challenge? asks Ian Galbraith, launching the Infrastructure Intelligence/Mott MacDonald Digital Infrastructure Hub. Plus a video welcome from Mark Enzer.

Allying traditional infrastructure with new technology could unleash unprecedented efficiencies across the built environment. GE Energy has calculated that the world economy would save up to $276bn over 15 years from an efficiency gain of just 1% in five key industries: oil & gas, power, healthcare, aviation, and rail. However, a 1% efficiency gain is a very modest ambition considering the massive potential of digital systems to improve the design, delivery, management and use of infrastructure.

Plunging IT hardware costs, the installation of a national broadband nervous system and leaping advances in computing power mean it is now practical to envisage a world in which assets are interconnected, with sensors providing real-time awareness of the condition and performance of the built environment. This will allow us to optimise the way buildings and infrastructure are managed, leading to a number of direct and indirect benefits.

In transport, for example, sensors are already being installed on our road networks to detect traffic density, control lane use and driver speeds and to reroute vehicles to avoid congested areas. Developed further, monitoring and control could make traffic jams a thing of the past, with several related benefits. The economy would no longer lose millions each year to lost working hours; more efficient road use would mean a reduction in carbon emissions; road safety would be improved; and commuters would find travelling much easier and less stressful, with the associated benefits this brings to health.

Linking this system to adjacent digital systems would multiply the benefits further. For example, by bringing the weather forecasting system into play, we could predict where and when rainfall may affect the road network, while congestion hotspots could be predicted and pre-empted by utilising data on ticket sales for tourist attractions across the city.

In Cleveland, Ohio, sensors detect when domestic waste bins are full and refuse collection routes are adjusted accordingly. And in Auckland, New Zealand, sensors across the city’s water system detect pipe pressure and flow, with sudden reductions indicating leaks.
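The Auckland approach – inferring a leak from a sudden pressure drop – can be sketched in a few lines. This is an illustrative example only: the sensor names, readings and threshold are invented, not taken from any real network.

```python
# Hypothetical sketch: flag possible leaks when a pipe sensor's pressure
# drops sharply between consecutive readings. All values are invented.

def find_leaks(readings, drop_threshold=0.15):
    """Return sensor IDs whose pressure fell by more than drop_threshold
    (as a fraction) between the previous and latest reading."""
    suspects = []
    for sensor_id, (previous, latest) in readings.items():
        if previous > 0 and (previous - latest) / previous > drop_threshold:
            suspects.append(sensor_id)
    return suspects

readings = {
    "main-01": (5.2, 5.1),   # stable pressure (bar)
    "main-02": (5.0, 3.8),   # sudden drop - possible leak
    "spur-07": (3.1, 3.0),
}
print(find_leaks(readings))  # -> ['main-02']
```

A real system would also account for normal diurnal pressure variation before raising an alarm; the point here is simply that a stream of sensor readings plus a rule turns raw data into an actionable signal.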

A fully interconnected built environment is just visible on the horizon, but asset information management (AIM) is here already, with building information modelling (BIM) and geographic information systems (GIS) data informing design, construction and operation.

Building capability with BIM

Data-rich information models are now being regularly taken to 5D capability (3D design together with cost and schedule details). But multi-dimensional BIM is also being championed – containing data on manufacture and construction, embodied and operational energy and carbon, and maintenance regimes, for example.

A project information model becomes an asset information model, and is increasingly seen as an organised database bringing together graphical and non-graphical information plus core documentation to form a comprehensive and accurate digital representation of the asset, to be valued and maintained with the same care as the physical asset itself.

This is an important component of asset information management, contributing to improved whole life asset management, including future extensions or upgrades. BIM models will in future interact with the digital systems needed to realise AIM.

GIS – seeing the bigger picture

Geographic information systems (GIS) typically provide data about the physical and spatial aspects of a geographical area, but can also record and represent social and economic factors. When integrated with BIM within a common data environment, GIS provides context and describes the interface between an asset and its external environment.

Gaining insight with sensors

Sensors can be installed across an asset system, such as a network utility, to provide real-time intelligence about its condition and performance. Sensors monitoring an asset’s condition can detect wear, breaks, anomalies and external stresses – factors which could adversely affect functionality.

Sensors monitoring performance detect variables such as temperature, load, flow rates, pressures, chemical composition and internal stresses – the factors that determine operational efficiency and the quality of service delivered to end customers. Condition and operational data enable better decision making and proactive asset management – with many standard responses already being managed by automated operating systems under human supervision.
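One common way such monitoring works in practice is to flag any reading that deviates sharply from the recent norm. The sketch below, with invented flow figures, illustrates the idea using a simple rolling mean and standard deviation; operational systems use far more sophisticated statistics, but the principle is the same.

```python
# Hypothetical sketch: flag a sensor reading as anomalous if it lies more
# than k standard deviations from the mean of the preceding readings.
from statistics import mean, stdev

def anomalies(series, window=5, k=3.0):
    """Return indices of readings that deviate more than k standard
    deviations from the mean of the previous `window` readings."""
    flagged = []
    for i in range(window, len(series)):
        recent = series[i - window:i]
        m, s = mean(recent), stdev(recent)
        if s > 0 and abs(series[i] - m) > k * s:
            flagged.append(i)
    return flagged

# Invented flow-rate readings with one sudden spike.
flows = [10.1, 10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 14.5, 10.0]
print(anomalies(flows))  # -> [7], the index of the 14.5 spike
```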

Grappling with big data

Data is proliferating. Increasingly we’ll see information being pulled out of its native format – whether from BIM and GIS models, from performance and condition databases, or from customer surveys – for uses that were not originally envisaged. And we should be looking at other data streams, sourced from GPS systems, smart meters, utility bills and major event registrations, all of which provide information about the way people use infrastructure and the services it delivers.

Social media could also be very useful as people often register their views about a service via media such as Twitter or Facebook rather than with relevant authorities. Specialist analytics firms are already filtering social media and selling relevant information on to companies in the aviation and hospitality sectors among others, so that they can improve their customers’ experience.
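At its simplest, the filtering those analytics firms perform can be thought of as matching posts against service and complaint vocabularies. The sketch below is a deliberately naive, invented illustration – real providers use much richer language models – but it shows the basic shape of the task.

```python
# Hypothetical sketch: pick out posts that mention a monitored service
# and express a complaint. Keywords and posts are invented.

SERVICE_TERMS = {"train", "flight", "hotel"}
COMPLAINT_TERMS = {"delayed", "cancelled", "dirty", "broken"}

def relevant_complaints(posts):
    """Return posts mentioning both a service term and a complaint term."""
    hits = []
    for post in posts:
        words = set(post.lower().split())
        if words & SERVICE_TERMS and words & COMPLAINT_TERMS:
            hits.append(post)
    return hits

posts = [
    "My train was delayed again this morning",
    "Lovely weather today",
    "Hotel room was dirty on arrival",
]
print(relevant_complaints(posts))  # -> the train and hotel complaints
```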

Making sense of the information

This digital infrastructure needs its own engineering to ensure robustness, cybersecurity, resilience, adaptability and cost-efficiency. It also requires the development of ‘middleware’ to connect sources of data, enable effective data streaming and management, and make information intelligible.

Data is just data. It is what you do with it that matters. Data-crunching and analysis provide meaningful insights that can be used to improve asset performance. The real skill – possibly the art – of future infrastructure management will be in figuring out what data is needed to enable the analysis and indicators that owners and operators need to provide better outcomes.

Making better decisions

In some industries, simple decisions are already being handed over to machines, which in some cases can be relied on more than humans to carry out prescribed actions in response to predetermined triggers. In the case of a fault, this could involve automatically closing down localised parts of the asset, notifying maintenance teams or public authorities, or adjusting the operating intensity of mechanical equipment to varying loads.
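A trigger-driven response of this kind amounts to a table of predetermined conditions paired with prescribed actions. The sketch below is a hypothetical illustration – the thresholds, telemetry fields and action names are all invented – but it shows how such automated fault handling can be structured.

```python
# Hypothetical sketch: each rule pairs a predetermined trigger condition
# with a prescribed action. Thresholds and actions are invented.

RULES = [
    # (description, condition on a telemetry reading, action name)
    ("overpressure", lambda t: t["pressure"] > 6.0, "close_local_valve"),
    ("overheating",  lambda t: t["temperature"] > 85.0, "notify_maintenance"),
    ("overload",     lambda t: t["load"] > 0.95, "reduce_operating_intensity"),
]

def respond(telemetry):
    """Return the list of actions triggered by one telemetry reading."""
    return [action for _, condition, action in RULES if condition(telemetry)]

print(respond({"pressure": 6.4, "temperature": 70.0, "load": 0.97}))
# -> ['close_local_valve', 'reduce_operating_intensity']
```

In a deployed system these actions would feed an actuation layer under human supervision, as the article notes; the table of rules is the part that stays simple and auditable.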

While autonomous decision making will allow for simple self-optimisation, better data and analytics offer owners and operators much more powerful decision making over anomalous, first time and strategic issues.

Intelligent machines

It seems a small step from self-optimisation to the concept of artificial intelligence and decision support tools that can ‘learn’ from experience and modify their behaviour (output) accordingly.

In buildings, the experience of maintaining office temperatures as the seasons change could help to pre-empt heating and cooling requirements for the following year. Electricity grids could ‘learn’ how to deal with surges in demand, maximise capacity and, with the advent of new energy storage technologies, smooth grid frequencies – the building blocks of ‘smart grids’.
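A very simple form of ‘learning’ from last season is to smooth last year’s daily demand into a pre-emptive schedule for this year. The sketch below uses an exponential moving average with invented demand figures; genuine decision-support tools would use far richer models, but the feedback loop – past experience shaping future behaviour – is the same.

```python
# Hypothetical sketch: exponentially smooth last year's daily heating
# demand to pre-empt this year's requirement. Figures are invented.

def smoothed_forecast(last_year, alpha=0.3):
    """Return a day-by-day forecast: each value blends that day's demand
    last year (weight alpha) with the running forecast so far."""
    forecast = [last_year[0]]
    for demand in last_year[1:]:
        forecast.append(alpha * demand + (1 - alpha) * forecast[-1])
    return forecast

last_year = [42.0, 45.0, 44.0, 50.0, 48.0]  # kWh heating demand per day
schedule = smoothed_forecast(last_year)
print([round(x, 1) for x in schedule])  # -> [42.0, 42.9, 43.2, 45.3, 46.1]
```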

Eventually, we can expect to see autonomous self-learning taking a greater role across most sectors. But two key factors limit how soon AIM can be delivered as a matter of course. The first is skills – not just the shortage of engineers who ‘get’ the efficiencies that new technologies can deliver but the scarcity of software programmers, lateral thinkers and leaders who can force the pace of change.

The second factor is client demand. Clients typically pursue compliance with legislation, regulatory requirements and investors’ expectations and shy away from challenges associated with innovation and long-term change, even where these will make them more competitive. Nor are clients fully aware of the possibilities as many have lost in-house engineering capability over the last couple of decades.

This is where our industry has to up its game, providing the high level, long-term and imaginative solutions associated today with technology companies. While much depends on our ability to address the skills gap within our own industry, we are entering what will be a very exciting period in engineering.

Ian Galbraith is a Mott MacDonald development director