Every complex system is built on small components which are, in most cases, autonomous. In time the system evolves, including more and more components, leading to a heterogeneous ensemble. Evolution leads indisputably to diversity. But these components cannot operate fully independently. They must be interconnected with one another in some way. The word floating around everything is communication. But in the system there are many objects, usually manufactured by different companies. They must "talk" using a standardized language known by both sides.
The history of the computer industry has shown that big companies tend to monopolize the market and to impose proprietary systems and architectures. This tendency has decreased in time because of users' demands. Most of the systems (both hardware and software) created this way are black boxes, where you provide an input and you get an output, without having any knowledge of what is happening inside. Most of the time, a black box cannot be configured by the user to change the function it performs. Imagine, for example, two SCSI hard disks, from two manufacturers, with different external connectors. Or imagine - and this is a real example - that if you want to add some RAM to an HP computer, you must use a RAM chip produced by HP. Still, there are far fewer strange situations in the hardware domain than in the software domain. At least in the computing industry, a monopoly is an aberration.
In opposition to black boxes stands the concept of the open system.
A definition of an open system requires two aspects:
1. the system itself;
2. the system's interface with the exterior (documentation, availability, etc.).
What is an open system?
"... a system that implements sufficient open specifications for interfaces, services, and supporting formats to enable properly engineered applications software to be ported across a wide range of systems with minimal changes, to interoperate with other applications on local and remote systems, and to interact with users in a style which facilitates user portability" (Guide to the POSIX Open Systems Environment, IEEE POSIX 1003.0).
This definition refers to software, of course. Here I propose a general definition of an open system: a system clearly described in freely available documentation, able to communicate with other systems using a standardized communication protocol that is itself described in freely available documentation; if the system includes several components, these components must be able to intercommunicate using standardized protocols. All the parameters (if there are any) of the function the system implements are tunable.
Some properties of open systems in addition to the definition:
At the beginning of computer hardware evolution, IBM was the single-source vendor for mainframe computers. After some time, its users started to wonder whether IBM was cheating them: the cost of the equipment was very high, but the production costs might be much lower than the price. Clients thought that the cost of computer equipment should be reduced, and that a second source would break the monopoly established by IBM. Even if no other company was capable of manufacturing whole computers, at least some components inside IBM mainframes could be produced by somebody else, reducing their cost. So the interest in open systems started with an interest in a "plug compatible" idea.
There was a basic set of beliefs about open systems:
These beliefs are still very strong today.
As we saw, the keyword was standardization, and it still is. In the early 1960s the major standardization bodies for information technology appeared: the International Organization for Standardization Technical Committee 97 (replaced by ISO/IEC JTC 1 in the late 1980s), the European Computer Manufacturers Association (ECMA), and Accredited Standards Committee X3. These organizations were very successful and strongly influenced the computing industry: they created standards ranging from the QWERTY keyboard to databases and objects. Many other groups later joined these organizations; all of them share one common thread, the belief that the standards they produce will lead to better products for users.
Then the increasing power and decreasing price of computing began to attract more and more users. IBM was no longer the only computer provider; minicomputers had appeared, supported mostly by DEC and Data General. The diversity of computers grew, and that led to communication needs. Users soon found out that computers did not talk the same language, being even incapable of talking to one another at all. Each vendor had created a unique solution based on proprietary technologies.
In the late 1970s, work on the Open Systems Interconnection (OSI) model began. OSI focused on standardizing the communication paths between computers. Vendors had no other choice and standardized the connecting hardware (the RS-232 connector and so on).
Two attributes were then added to those of the past:
JTC 1 began to create standards to satisfy the needs of the market. The OSI model required the use of "anticipatory standards", which means standardizing a technology before it becomes available as a commercial product. Eventually this led the model to grow, with a concomitant growth in the number and options of the standards supporting it, and this caused total confusion among users. Because of the number of options available in the standards, an OSI system provider could comply with the OSI standards and still be totally noninteroperable with another compliant system.
At this stage, the keyword was intercommunication between systems, and it still is today.
To solve this problem, the industry began to create consortia (a new form of standardization organization) that could test and validate the multiple profiles being created from the multiplicity of standards. The major vendor contributors to the OSI model created the Corporation for Open Systems (COS), which failed in its mission.
The next step in the evolution of open systems started with the success of UNIX, owned by AT&T at that time. The main idea was an operating system that would allow applications written for one computer to run successfully on another system with a minimum of supplemental work. This desired feature was only partially achieved by UNIX, since there were many variants - produced by different vendors - that did not allow true portability. The next idea was to create a set of standardized calls to an operating system. This was done by the IEEE Computer Society, which wrote a series of standards called POSIX. But application portability was not the only goal achieved by POSIX: it added the idea that the operating system interface would be the same on different platforms. The IEEE then created the POSIX 1003.0 committee, which produced a Guide to the POSIX Open Systems Environment.
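The idea of standardized calls to the operating system can be illustrated with a minimal sketch. The low-level calls below (open, write, read, close on file descriptors) mirror the POSIX system-call interface; a program written against this interface runs unchanged on any system that implements it. Python's os module is used here only as a convenient, portable way to exercise these POSIX-style calls; the file name is an arbitrary choice for the example.

```python
import os
import tempfile

# The same POSIX-style calls (open, write, read, close) work unchanged
# on any system implementing the standardized interface.
path = os.path.join(tempfile.gettempdir(), "posix_demo.txt")

# Create the file and write to it through a raw file descriptor.
fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
os.write(fd, b"portable across POSIX systems")
os.close(fd)

# Reopen and read the data back, then clean up.
fd = os.open(path, os.O_RDONLY)
data = os.read(fd, 1024)
os.close(fd)
os.remove(path)

print(data.decode())
```

Portability here comes from the standard, not from the vendor: nothing in the program depends on whose kernel actually services the calls.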
Fearing a repeat of the OSI mistakes, vendors established attendant consortia such as X/Open to test the implementations of standards. These consortia were to ensure the interoperability of standards-based applications.
While vendors are usually the only creators of standards, there was an organization called The User Alliance for Open Systems (UAOS) which tried to promote the idea of the utility of open systems. A description of the open system was given by Max Black:
"... the following facts, which hold for every known language: (1) there is no upper limit on the length or complexity of grammatical sentences; (2) from a finite number of words infinitely many grammatical sentences can be constructed; (3) a competent speaker of the language knows in advance how to understand indefinitely many sentences that he has never considered or met. To say that language has the 'synthetic resources' listed above is to claim that it is an 'open system'".
Manifestations of the concept of open systems in software:
These aspects will:
Following the open-system model, software can be built from components instead of creating software monoliths that are difficult to modify and extend. Different components could be supplied by different vendors. But interoperability leads to interdependence. We saw that companies tend to create proprietary systems. Will these companies agree to follow standards, and not to impose their own? When creating an object-oriented system, the whole must be based on a set of foundation classes, as in the case of Java. Yes, SUN Microsystems proposed the Java specifications to ISO/IEC as standards in March '97. But anyone who wants to base an application on Java must license the Java SDK provided by SUN. Isn't this proprietary behaviour?
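The component-based model described above can be sketched in a few lines. The key point is that the application is written against a published interface, so a component from any vendor that honours the interface can be plugged in. The `Storage` interface and the `MemoryStorage` component below are hypothetical names invented for this illustration, not part of any actual standard.

```python
from abc import ABC, abstractmethod

# A published, stable interface: any vendor may supply a component
# implementing it, and the system accepts that component unchanged.
class Storage(ABC):
    @abstractmethod
    def put(self, key: str, value: str) -> None: ...

    @abstractmethod
    def get(self, key: str) -> str: ...

# Hypothetical vendor implementation; only the interface is fixed,
# the internals are entirely the vendor's business.
class MemoryStorage(Storage):
    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def put(self, key: str, value: str) -> None:
        self._data[key] = value

    def get(self, key: str) -> str:
        return self._data[key]

def application(store: Storage) -> str:
    # The application depends on the open interface,
    # never on a vendor's internals.
    store.put("greeting", "hello")
    return store.get("greeting")

result = application(MemoryStorage())
print(result)
```

Replacing `MemoryStorage` with another vendor's implementation requires no change to `application` - which is exactly the interoperability an open system promises, and exactly what a monolith cannot offer.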
Speaking of operating systems, UNIX is almost always linked with open systems. Because of its interface (sources available for a small fee - freely available in the past - a lot of documentation today, availability on every platform, etc.) we can say it is an open system. But considering its architecture, which is monolithic, it is not an open system. I think that in the near future UNIX will migrate to an open-system model: a modular design, based on a client-server architecture, with support for running on a distributed network.
The big problem is that many big companies are still creating proprietary systems. And the main purpose of these companies is to increase the sales of their products, driven not by users' demands or by product quality but by a clever marketing strategy (the best example being Microsoft). Vendors tend to use "open" and "open systems" to describe whatever they are selling. Sometimes, even while claiming that their products are based on official standards, they are really talking about their own interpretation of those standards. Standards are not enough; there must be organizations to test products' compliance with these standards. An example is Microsoft's Windows NT, which they claim is POSIX compliant. But how do we test whether they are telling the truth?
But not every standard immediately means an open system. Compare the "standards" created by Microsoft (like DDE, MFC, etc.) with an open standard like the RFC (which means "Request For Comments"). The big difference between the two is that the first is imposed while the second is proposed. Or compare the MicroChannel architecture, created by IBM, with the SCSI standard. The policies that commercial companies follow do not mean that commercial systems cannot be open systems: a good example is NFS (Network File System), which was created by SUN.
I think that the Free Software Foundation, and free software in general, will make an important contribution to the evolution of open systems. The software and standards created in the free software world are not controlled by any single organization, and generally the free software world produces systems much more open than those of companies creating commercial software (mainly because the free software world responds very quickly to users' needs).
The interoperability of systems is precisely the key to open systems.
The question will not be just "Will vendor X put UNIX on all hardware platforms?" but rather "Will users be able to buy machines from multiple vendors and link them together, thereby minimizing their training and applications investments?".
The highest-level principle of an open system is to optimize the whole rather than the parts. As we saw, any complex system is built from many small components. The ability of these individual objects to operate as parts of a unit is the main purpose of a system with an open architecture. To optimize the performance of an open system, the whole must therefore be optimized, not each part in isolation.
This way, applications based on an open-system architecture will in the future achieve the highest levels of portability and interoperability.
Indisputably, the design of systems (hardware or software) is moving from the monolithic model to the open-system model, and this movement will accelerate in the future.