In 1998, Ian Foster and Carl Kesselman, together with thirty distinguished experts in high-performance computing and networking, laid the foundations of a new computing model called GRID. Their vision was introduced in the book, The GRID: Blueprint for a New Computing Infrastructure (see review, SCPE Vol. 3, No. 3). The editors wrote: … A computational grid is a hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities… allowing new classes of applications to emerge… as yet, we have only a preliminary understanding of what these new applications will look like.
Over the next five years, Grid computing became the hottest R&D field. Academia and national labs built experimental infrastructures, new startups were founded to develop Grid technologies, and the major HPC vendors joined the party. Each pushed in a different direction, according to its own vision of what Grid computing should be. It was only a matter of time before people started to ask what the Grid is all about. A survey of ten Grid experts produced ten different answers. The question remains without a clear answer, leaving many people confused.
Ian Foster picked up the gloves. In his article, What is the GRID? A three point checklist (GRIDtoday, July 22, 2002: Vol. 1, No. 6), he redefines the Grid model to reflect the trends and changes in the field over the past five years. According to Foster, a Grid is a system that:
- Coordinates resources that are not subject to centralized control.
- Uses standard, open, general-purpose protocols and interfaces.
- Delivers a nontrivial quality of service.
But Foster's new definition could not encompass all approaches to the Grid. Moreover, Foster explained by example what the Grid is not: A cluster management system such as Sun's Sun Grid Engine, Platform's Load Sharing Facility, or Veridian's Portable Batch System can, when installed on a parallel computer or local area network, deliver quality of service guarantees and thus constitutes a powerful Grid resource. However, such a system is not a Grid itself, due to its centralized control of the hosts that it manages.
Within a week the responses started to come in. The first one was from Wolfgang Gentzsch, Director of Grid computing at Sun Microsystems (GRIDtoday, August 5, 2002: Vol. 1, No. 8). Gentzsch wrote: I very much like Ian's two earlier definitions. Combined, they may provide a very generic Grid definition. The new checklist, however, reduces the potential of a wide and rich variety of Grids, with different architectures, operational models, and applications, and may miss the chance to become widely accepted as a standard definition of the Grid.
Why do we need a definition for the Grid?
Can the Grid be defined?
Is a definition necessary for the Grid's development?
The Grid is an open vision and thus cannot be defined. An attempt to define the Grid can reach only the foreseeable technology horizon, and that horizon is only one to two years ahead. Grid computing is a paradigm in flux. New architectures and standards are changing the shape and direction of the Grid's development. In the past year, one prominent such architecture has been the Open Grid Services Architecture (OGSA).
The OGSA is the first attempt to bridge the gap between two cyberspace worlds: the Internet and the Grid. The Internet and the Grid are two different computing models. Some believe that the Grid is the next generation of the Internet and the World Wide Web, while conceding that the Great Global Grid is still many years away. Others state that the Grid will complement, not replace, the Internet as we know it.
In 1998, when the Grid blueprint was published, a new Internet language appeared called eXtensible Markup Language (XML). XML is a development tool for a new Internet computing model known as Web Services. At that time, nobody understood that there was a relationship between Web Services and the Grid. Three years passed before Steve Tuecke from Argonne National Laboratory started to write a spec for Grid Services. Later, people from IBM joined the mission and together they created OGSA. The OGSA specification outlines interfaces to grid computing software that comply with Web Services standards. If adopted, Grid services such as job scheduling, authentication, failure detection, staging of applications and data, and migration of applications and data will all be accessed through a standard Web Services architecture. OGSA is basically where Grid meets Web Services, and it may be the first step towards full integration of the Internet and the Grid.
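The flavor of this idea can be shown with a small sketch. The schema, element names, and job parameters below are invented for illustration and are not taken from the OGSA specification; the point is only that a Grid operation such as job submission becomes an ordinary XML message sent to a service endpoint, like any other web service call.

```python
# Illustrative only: a made-up XML message for submitting a job to a
# hypothetical Grid service exposed through a Web Services interface.
import xml.etree.ElementTree as ET

def build_job_request(executable, arguments, host):
    """Build an XML job-submission message in an invented schema."""
    root = ET.Element("JobSubmission")
    ET.SubElement(root, "Executable").text = executable
    args = ET.SubElement(root, "Arguments")
    for arg in arguments:
        ET.SubElement(args, "Argument").text = arg
    ET.SubElement(root, "TargetHost").text = host
    return ET.tostring(root, encoding="unicode")

# A client would POST this document to the service's endpoint URL.
request = build_job_request("/bin/simulate", ["--steps", "1000"],
                            "grid.example.org")
print(request)
```

Because the message is plain XML over standard protocols, the same client machinery that calls a stock-quote web service could, in principle, schedule a computation on a remote Grid resource.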
The Internet appeared a decade ago. In the beginning, the Net offered primitive applications: a mailing system, FTP services, and a web of home pages. Nobody asked, "What is the Internet?" The years passed, and new Internet technologies emerged. Today we can talk on the phone over the Net, chat with friends, and videoconference with colleagues on the other side of the world. New computing and business models were developed: e-commerce, e-learning, e-science, and e-you-name-it. And nobody asks what the Internet is.
The Grid is a super-model, a bag of networking models associated with many types of architectures, software packages, applications, and standards. Today, when people talk about the Grid they mention Data Grids, Science Grids, and Campus Grids. At the same time they mention Farm computing, Peer-to-peer computing, and Utility computing, among others. Ten years from now many of today's technologies will have disappeared and new ones will have been developed. I don't know what Grid computing will look like a decade from now, but I know that we will have g-commerce, g-learning, g-science, and g-chat. Nobody will ask what the Grid is. It will be obvious, just as the Internet is today.
The Computer Aided Design Laboratory
School of Engineering and Computer Science
Hebrew University of Jerusalem