Opinion

The evolving form factors of computing

Keith Bentley

In the ’70s and early ’80s, organizations had large computer systems, costing hundreds of thousands of dollars. They put them in climate-controlled rooms, and usually behind glass – so they could show them off to visitors. These systems did a lot, but they also locked the organization’s information in a vault of sorts.

Around the time we started Bentley Systems, people began to talk about connecting these large systems to graphic terminals costing around $5,000. The idea was to create a pipe into the climate-controlled room that would enable you to look at your data more quickly. Creating software to use such a terminal became one of my first projects at DuPont. When I left DuPont, I negotiated an agreement that I would continue to enhance the system I had created for them, in exchange for the right to sell the software. And that’s how we started Bentley Systems.

A few years later the personal computer came along. This enabled you to get the data out from behind the glass wall and put it right on your own desktop. Migrating computing power to the desktop solved many scalability issues that the simple graphic terminal posed, but it also caused data management problems – with a lot of people asking “Who has the latest copy of the data?” 

"I don’t know if there’s ever been a time when we’ve had more opportunity to improve over the past than today. In fact, it’s clear that 30 years from now, or even less, people will look back in amusement at the computing world of 2014. Our state of the art will seem like a toy to them." Keith Bentley

Then the advent of the Internet allowed all of these computers to be connected virtually. This changed how teams, organizations, and their suppliers could work together.

More recently, when computers that fit in your pocket, like the iPhone, came along, some said we would soon be doing everything on these devices, while others dismissed them as a waste of time. Neither was right. The fact that these devices allow you to have your data with you at all times changes not only the design of software but also its use cases. Suddenly it matters that programs can run on an iPhone, for example, because these devices are now more powerful than the computers that used to live behind the glass wall. This is particularly relevant in AECO work, where information mobility – on a construction site, for instance – is crucial to collaborative workflows and project success.

Clearly, a lot has changed in 30 years in the form factor of the client end of computing. But the shared computers behind the glass didn’t go away; they’ve just evolved by orders of magnitude. They’re so fast and plentiful that you can start doing really cool things with them, like writing software to make one type of computer simulate a different type of computer. So you can have computers that are virtual computers, and that’s really the idea behind the cloud. In other words, you make a computer that is so scalable, so fast, and with so much memory that you can write software that says, “Make it look like 100 other computers, or even 1,000 other computers.”

Having these infinitely scalable computers at very reasonable prices allows software for solving business problems to take very different forms. We can do everything we’ve always done in the past on desktop computers and dedicated computers that organizations own. But we can also add capabilities that let user organizations run computations on someone else’s computers.

"The real premise of big data is that you can have more information than you could ever process on a single computer or even a finite collection of computers. Today we can think of computing power as being infinite or nearly infinite." Keith Bentley

By making the combination of all those things as seamless as possible – through the right software, data modeling, bandwidth, and networks – we can solve business problems that we previously would have said were too hard or required too much hardware to tackle.

So we’re at yet another breaking point, I think, in the way software is conceived, the types of problems we can address, and the scale of the problems we can address. We used to talk about thousands of something or tens of thousands of something. Over time those tens of thousands may have grown to hundreds of thousands or even millions. But when you have infinitely scalable resources available, you can start thinking about billions of things, and perhaps many more. As a result, the world has started to refer to “big data.”

The real premise of big data is that you can have more information than you could ever process on a single computer or even a finite collection of computers. Today we can think of computing power as being infinite, or nearly so, and what previously would have required a database with a million rows can now be addressed by a collection of database farms and servers working on billions or trillions of rows. Scale is not the issue anymore, and the problem to be solved is no longer about one person getting to his or her own data. Rather, it’s about one team, one organization, or even a collection of organizations working together on a project getting to the right data, at the right time, in a secure environment.

For example, with our new Bentley CONNECT cloud services, we can make a team more productive and enable it to focus on the most important problems first. In the field of designing, building, and operating infrastructure, things have gotten a lot more complicated in the past 30 years in terms of the amount of information required to get anything done. Why? Because the time scales are so much shorter, the projects are so much bigger, and the problems are more about how to optimize the whole rather than how to optimize the pieces.

"So we’re at yet another breaking point, I think, in the way software is conceived, the type of problems we can address, the scale of the problems that we can address." Keith Bentley

I don’t know if there has ever been a time when we’ve had more opportunity to improve on the past than we do today. In fact, it’s clear that 30 years from now, or even less, people will look back in amusement at the computing world of 2014. Our state of the art will seem like a toy to them.

When you’re optimizing the whole, there’s a tremendous opportunity to do things differently in software. Take, for example, the program that I wrote at DuPont. Its primary goal was to streamline the process of generating a P&ID (piping and instrumentation diagram) drawing by doing it on a computer. But the definition of the problem was to get the same piece of paper out – of course more quickly, and probably with fewer errors. In essence, it was about automation. And there was zero opportunity for me to say, “Suppose we did it differently by keeping this information in a database.”

Today, the goal is to end up with something you can build, maintain, and reuse effectively and efficiently, and the information generated can be as important as the physical asset. And when you can have a computer in your pocket, no one presumes, like we did back then, that the workflow is fixed, and that all you can do is optimize the steps in between. So we’ve gone from process automation to performance automation – that is, automation of the end result rather than of the steps to get there. This is so much more valuable in terms of the return on our users’ investments in their computing resources, and allows us to do a lot more for our users than ever before. And, from my perspective, that makes it a really great time to be in the business of developing software for advancing infrastructure!

Keith Bentley is co-founder and chief technology officer of software firm Bentley Systems. This article was reproduced from the Bentley Systems 2014 annual report, which is published this week.