Mainframe Computing

June 21, 2015

The earliest mainframe computers were introduced into large private firms and some government organisations during the 1940s and 1950s. They were used to automate tasks involving numerical calculations (such as accounting, taxation or statistics). Data were entered into the computer system and then processed in batches.
Batch: A group of jobs, data, or software programs treated as a unit for computer processing.
‘Batch processing’ refers to processing data in groups, or batches, that have been accumulated in advance or over a period of time, without the user being able to make changes while the processing is under way. The output of batch processing was aggregated data that could be used in summaries, bills, accounts and other business documents, as well as in reports and in the analysis of scientific research.
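To make the idea concrete, the following is a minimal sketch of batch processing in modern Python; the account numbers, amounts and function names are invented purely for illustration and do not reflect any particular mainframe system. The point is simply that all the input is gathered first and then processed in a single run that produces aggregated output.

```python
from collections import defaultdict

# Hypothetical sketch of batch processing: records are accumulated in
# advance and then processed together in one run, with no user
# interaction during processing. All names and figures are illustrative.

transactions = [            # the accumulated "batch" of input records
    ("ACCT-001", 125.00),
    ("ACCT-002", 310.50),
    ("ACCT-001", -40.00),
    ("ACCT-003", 99.99),
]

def run_batch(batch):
    """Process the whole batch at once and return aggregated totals."""
    totals = defaultdict(float)
    for account, amount in batch:
        totals[account] += amount
    return dict(totals)

if __name__ == "__main__":
    # The output is aggregated data suitable for summaries, bills or reports.
    for account, total in sorted(run_batch(transactions).items()):
        print(f"{account}: {total:.2f}")
```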
Mainframe computers were expensive to acquire and operate. They required complex software that was developed for each new type of application. Most organisations set up separate computing departments and hired specialised systems analysts, programmers and computer operators to run and maintain operations. These specialists decided which hardware and software would be used, which applications lent themselves to automation and how the systems should be designed.
During the 1960s, computer manufacturers introduced the concept of ‘time-sharing’, allowing several users to access the computer simultaneously. Time-sharing gave rise to an early form of computer networking and remote access, and it stimulated the development of new kinds of software. New software, together with declining costs for processing and storage, made it possible for organisations to automate more complex tasks and applications (such as managing law enforcement information, natural resources information, regulatory licensing and so on). However, the design of the systems and the operation of the computers remained a specialised technical area distant from users.
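The core idea of time-sharing can be illustrated with a small, purely hypothetical simulation: the processor works on each user’s job for a short time slice in turn, so that several users appear to have simultaneous access to the machine. The user names, job sizes and slice length below are invented for illustration.

```python
from collections import deque

# Simplified sketch of time-sharing: each job gets a short time slice in
# turn (round-robin), then rejoins the queue until its work is finished.

jobs = deque([("alice", 3), ("bob", 2), ("carol", 4)])  # (user, work units left)
TIME_SLICE = 1

while jobs:
    user, remaining = jobs.popleft()
    done = min(TIME_SLICE, remaining)
    print(f"{user} runs for {done} unit(s)")
    remaining -= done
    if remaining > 0:
        jobs.append((user, remaining))  # job waits for its next turn
```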
Throughout the 1970s and 1980s, the impact of mainframe-based applications on records management was not immediately apparent. Most computer centres established ‘tape libraries’ and handled the storage, disposal and recycling of machine-readable media. For records managers, the most obvious impact of early automation was a rapid increase in printed output from computer systems, which added to the growing volume of paper records. The prevailing view of electronic records at this time was that they were special media records. They were valued primarily for their informational content, while records needed as evidence of actions and decisions were printed onto paper and stored in established filing systems.
During this period, the experience of archivists was restricted almost exclusively to the appraisal, acquisition and preservation of computer files containing the results of social science research (such as opinion polls and census data). Some large databases were also appraised and acquired, but archivists were primarily concerned with data files. The initial machine-readable archives programmes in the National Archives of the United States and Canada, the only two archival institutions in the world to support such programmes during this period, were modelled on data libraries.
By the 1990s, as personal computing and networking became more common, many information technologists were predicting the demise of the mainframe. However, mainframes continue to support important applications for organisations across a wide range of industries. Although they now represent a smaller share of the global information technology market, the bulk of the world’s data is still stored on mainframes.