Centralized computing

Centralized computing is computing done at a central location, using terminals attached to a central computer. The central computer may control all the peripherals directly (if they are physically connected to it), or they may be attached via a terminal server. Alternatively, if the terminals have the capability, they may connect to the central computer over a network. The terminals may be, for example, text terminals or thin clients.
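The terminal/central-computer split described above can be sketched as a toy client-server program. The following Python snippet is illustrative only (the server behaviour and function names are invented for this example): the "central computer" does all the processing, while the "terminal" merely sends input and displays the result.

```python
import socket
import threading

# Toy "central computer": all processing (here, just uppercasing the
# input) happens on the server side.
def central_server(host="127.0.0.1", port=0):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))          # port 0 lets the OS pick a free port
    srv.listen()

    def serve():
        conn, _ = srv.accept()      # handle a single terminal connection
        with conn:
            data = conn.recv(1024)
            conn.sendall(data.upper())   # processing done centrally

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()        # (host, port) the terminal connects to

# Toy "terminal": no local processing, just send keystrokes and display.
def terminal(addr, text):
    with socket.create_connection(addr) as c:
        c.sendall(text.encode())
        c.shutdown(socket.SHUT_WR)  # signal end of input
        return c.recv(1024).decode()

addr = central_server()
print(terminal(addr, "hello central computer"))  # HELLO CENTRAL COMPUTER
```

If the central server is stopped, every terminal loses service at once, which mirrors the single-point-of-failure disadvantage discussed below.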

Centralized computing offers greater security than decentralized systems because all of the processing is controlled in a central location. In addition, if one terminal breaks down, the user can simply go to another terminal and log in again, and all of their files will still be accessible. Depending on the system, they may even be able to resume their session from the point they left off, as if nothing had happened.

This type of arrangement does have some disadvantages. Because the central computer performs all the computing functions and controls the remote terminals, the whole system depends entirely on it: should the central computer crash, the entire system "goes down" (i.e. becomes unavailable).

History

The very first computers did not have separate terminals as such; their primitive input/output devices were built in. However, it was soon found extremely useful for multiple people to be able to use a computer at the same time, for reasons of cost: early computers were very expensive to produce and maintain, and occupied large amounts of floor space. The idea of centralized computing was born. Early text terminals used electro-mechanical teletypewriters, but these were replaced by cathode ray tube displays (as found in 20th-century televisions and computers). The text terminal model dominated computing from the 1960s until the rise of home computers and personal computers in the 1980s.

Contemporary status

As of 2007, centralized computing has been coming back into fashion, to a certain extent. Thin clients have been used for many years by businesses to reduce total cost of ownership, and web applications are becoming more popular because they can potentially be used on many types of computing device without any software installation. Already, however, there are signs that the pendulum is swinging back away from pure centralization, as thin client devices become more like diskless workstations due to increased computing power, and web applications start to do more processing on the client side, with technologies such as AJAX and rich clients.

In addition, mainframes are still used for some mission-critical applications, such as payroll, or for processing day-to-day account transactions in banks. These mainframes are typically accessed either through terminal emulators (real terminal devices are no longer in common use) or via modern front ends such as web applications, or, in the case of automated access, via protocols such as web services.

Hybrid client model

Some organisations use a hybrid client model partway between centralized computing and conventional desktop computing, in which some applications (such as web browsers) are run locally, while other applications (such as critical business systems) are run on the terminal server. One way to implement this is simply by running remote desktop software on a standard desktop computer.
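The local-versus-remote split of a hybrid client can be pictured as a simple routing table that decides where each application runs. The following Python sketch is purely illustrative (the application names and the routing logic are invented for this example):

```python
# Hypothetical routing table for a hybrid client: lightweight
# applications run on the local desktop, while critical business
# systems are launched on the terminal server.
LOCAL_APPS = {"web_browser", "media_player"}
REMOTE_APPS = {"payroll_system", "order_entry"}

def launch(app):
    """Return a description of where the hybrid client runs the app."""
    if app in LOCAL_APPS:
        return f"{app}: running on local desktop"
    if app in REMOTE_APPS:
        return f"{app}: running on terminal server via remote desktop"
    raise ValueError(f"unknown application: {app}")

print(launch("web_browser"))     # web_browser: running on local desktop
print(launch("payroll_system"))  # payroll_system: running on terminal server via remote desktop
```

In a real deployment, the "remote" branch would open a remote desktop or terminal session rather than return a string; the point is that the policy deciding where each application executes lives on the client.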

Hosted computing model

A relatively new method of centralized computing, hosted computing, solves many of the problems associated with traditional distributed computing systems. By centralizing processing and storage on powerful server hardware located in a data center, rather than in a local office, it relieves organisations of many of the responsibilities of owning and maintaining an information technology system. These services are typically delivered on a subscription basis by an application service provider (ASP). [http://www.coredesktop.com/purpose.htm Core Desktop Solutions, Inc.] Retrieved on 5 September 2007.

