It depends on which part of your question you are emphasizing: importance or usage rates.
Mainframes are still very important. They are still heavily relied upon by the financial industry and for many of the critical operations required by large enterprises. IBM has about 90% of that market. Most vendors have cut back on mainframe software development, but IBM has been increasing its R&D in mainframe hardware and has a very fast-growing mainframe software business.
In terms of overall usage rates, many of the traditional roles of mainframes can now be achieved by networked server farms and even desktop systems. For example:
Massively redundant operations: systems that have to run continuously for years can now be built from individually networked computers or servers with redundant switching.
Virtualization: a single mainframe could run multiple operating systems and multiple applications to serve the needs of an entire company via remote terminals. Today, desktop systems are more typically installed and customized for each user's tasks.
Multi-user operations: in pre-web days, if large groups of users needed access to a central database (and centralized applications), mainframes were the only option that could support such high utilization rates and massive throughput. Today, the Internet (and intranets) connect users and data through distributed systems (on-site and off-site) that can be sized according to the transaction levels actually needed.
Scalability: the huge up-front hardware cost (and massive software licensing cost) of a mainframe buys you vertical scalability (bigger hardware) to handle huge current and future demand. Distributed, smaller servers give you horizontal scalability, whose costs can be spread out as capacity needs grow.
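The vertical-vs-horizontal trade-off above can be sketched with a toy cost model. All prices and capacities below are invented purely for illustration, not real vendor figures:

```python
# Toy cost model: vertical scaling (one big machine bought up front)
# vs. horizontal scaling (commodity servers added as demand grows).
# Every number here is a made-up assumption for illustration only.

MAINFRAME_CAPACITY = 10_000   # hypothetical transactions/sec
MAINFRAME_PRICE = 2_000_000   # hypothetical price, paid on day one
SERVER_CAPACITY = 500         # hypothetical transactions/sec per server
SERVER_PRICE = 10_000         # hypothetical price per server

def vertical_cost(target_capacity):
    # One machine sized for peak future demand; full cost is up front.
    assert target_capacity <= MAINFRAME_CAPACITY
    return MAINFRAME_PRICE

def horizontal_cost(target_capacity):
    # Buy only as many servers as current demand requires (ceiling division).
    servers_needed = -(-target_capacity // SERVER_CAPACITY)
    return servers_needed * SERVER_PRICE

# At low utilization the server farm is far cheaper up front;
# the gap narrows as demand approaches the big machine's capacity.
print(vertical_cost(1_000))    # 2000000 regardless of current demand
print(horizontal_cost(1_000))  # 20000 -- pay only for capacity needed now
print(horizontal_cost(10_000)) # 200000
```

The model deliberately ignores the factors that often dominate real decisions (per-core software licensing, staffing, power, and the coordination overhead of distributed systems), so it only illustrates the cost-spreading argument, not a purchasing recommendation.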
So, some will say that steadily falling prices and the economies of large-scale databases will feed continuing demand for the largest systems. Others will argue that investing in mainframe technology is not cost-effective.
In banks, the answer is a resounding no. There is too much legacy code and too many business rules which are either poorly documented, or written so long ago that the option of moving off mainframe would be too costly.
Here in Sweden, the Swedish Social Insurance Agency recently switched from their old mainframes (Bull) to a UNIX-based system.
The biggest reason for this is that the employees who know anything about the old system have either retired or have delayed their retirement for several years to keep it running.
I suspect similar migrations will occur in bank systems, since they soon will (or already do) suffer from the same problem.
Could cloud computing bring back the mainframe? A rhetorical question, but one that is being asked in lots of places. You could provision a bundle of large VMs on one.
Honestly, if I ran a gigantic IT department, I would wholeheartedly consider IBM iron, likely on lease. With the virtualization and the smaller (though more expensive) staff, it could make a LOT of sense.
Would I commission COBOL apps on it? No. But surely DB2, Java, XML, and other Linux apps.
As an anecdote, I am just now responding to an RFP for brand-new Java apps on System z.
Yes. The people who still use them are consolidating them, and the brave new world of Linux on the mainframe is mostly myth.
I haven't heard so much as a rumor of anybody migrating applications to, or developing new applications on, mainframes in many years. I am aware of many projects where "impossible to replace" COBOL is in fact being replaced, mostly by J2EE applications.
The places still using mainframes, especially non-IBM mainframes, are dysfunctional orgs like government agencies and banks. As the people who know stuff about mainframes continue to retire and die, the nonsense arguments that you actually need crappy legacy code from 1975 will no longer be made, and someone will finally put a bullet in the mainframe.
Fake Steve Jobs (aka Dan Lyons) has a very interesting take on why Mainframes are still hanging around here: http://www.fakesteve.net/2009/10/why-ibm-is-in-trouble-with-antitrust.html