Visual Basic .NET 2010 Express - Programming Logic and System Architecture
We've talked about one system running on multiple processors and one processor running multiple programs. The next step is multiple computers cooperating to do a job together: Client-Server and Multi-tier Architectures.
An example of this that is familiar to everyone today is the web. Your PC runs a program called a web browser - IE, Firefox, Opera, Chrome or something else - but most of the data comes from another program called a web server.
Some popular server systems include Microsoft Internet Information Server (IIS), Apache and Sun ONE.
Getting all these different computers and programs to work together (mostly) flawlessly is truly one of the major information technology wonders of our time. It works because a standard exists for passing data back and forth: HTTP, the Hypertext Transfer Protocol. For years, getting computers to exchange information over a network was a technical trick that was difficult under any circumstances. Now, we all do it every day. In my view, this is one of the best demonstrations of the power of a simple idea: have good rules and require everyone to follow them. (This is like that rule in management: you can have the most powerful people on your team, but if they're pulling in different directions, you still don't go anywhere.)
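To make the idea concrete, here is a minimal sketch in VB.NET of a client making an HTTP request with the .NET Framework's WebClient class. The URL is just a placeholder for illustration.

Imports System.Net

Module HttpClientDemo
    Sub Main()
        ' The client sends an HTTP request; the web server answers with HTML text.
        Using client As New WebClient()
            Dim html As String = client.DownloadString("http://www.example.com/")
            Console.WriteLine("Received " & html.Length & " characters from the server.")
        End Using
    End Sub
End Module

Your browser does essentially the same thing; it just does a great deal more work afterward to turn the HTML it receives into a page on your screen.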
The web isn't the only example. An example that is almost as widespread, but not as well known, is when one computer is a database server. In this client-server relationship, the rule book that is usually followed is called SQL (Structured Query Language).
Although the SQL rules are reasonably consistent, there are minor differences in the way different companies implement them. Even those minor differences are enough to make the success of SQL only a shadow of the phenomenal achievement we've seen in the web. (For example, the SQL code that retrieves data from an Oracle database normally won't run against a Microsoft SQL Server database without at least some tweaking.)
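As a rough illustration, here is a sketch of a VB.NET client sending a query to a SQL Server database with ADO.NET. The connection string and table name are made up for this example, and the comment points out one of those dialect differences.

Imports System.Data.SqlClient

Module SqlClientDemo
    Sub Main()
        ' SQL Server dialect: "SELECT TOP 10". Oracle traditionally uses ROWNUM instead.
        Dim query As String = "SELECT TOP 10 CustomerName FROM Customers"
        Using conn As New SqlConnection("Server=myServer;Database=myDatabase;Integrated Security=True")
            Using cmd As New SqlCommand(query, conn)
                conn.Open()
                Using reader As SqlDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        Console.WriteLine(reader.GetString(0))
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module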
The "client" and "server" in this architecture are called "tiers" or "layers". It's common today to have many tiers, all passing data back and forth. An example of a "middle" tier that is often used is called the "business logic" tier. This tier integrates different data sources (For example, multiple ATM machines for a banking application.) and ensures that data is consistent and doesn't violate any company policies. (An example of a policy is a banking rule that states that transactions over a certain amount have to be reported to the government.)
Because the middle tier validates data before passing it along, the program that actually processes it can assume that it's all good data, and the computers submitting data get a much faster response when something is wrong.
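Here is a minimal sketch of what such a middle-tier check might look like in VB.NET. The threshold and the names are invented for this illustration, not taken from any real banking system.

Module BusinessLogicTier
    ' Hypothetical reporting threshold, chosen only for this example.
    Private Const ReportingThreshold As Decimal = 10000D

    ' Validate a transaction before it is handed to the back-end processing tier.
    ' Rejecting bad data here means the submitting computer hears about the
    ' problem immediately instead of during later processing.
    Public Function ValidateTransaction(ByVal amount As Decimal, ByRef message As String) As Boolean
        If amount <= 0D Then
            message = "Rejected: the amount must be greater than zero."
            Return False
        End If
        If amount > ReportingThreshold Then
            message = "Accepted: amount exceeds the threshold and is flagged for reporting."
        Else
            message = "Accepted."
        End If
        Return True
    End Function
End Module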
We've come a long way from the old days when you didn't discover errors until the next month's processing!
Multi-tiered systems written today are usually highly customized for a particular business (a bank or a factory, for example), a particular brand of technology (such as Microsoft or Oracle), and a particular application (accounting or process control). Businesses invest in multi-million-dollar software development projects with exacting requirements that often take years to complete. (If they finish at all. Some researchers claim that well over half of these mega-projects fail before being finished. Statistics on this are hard to get because businesses often cover up their failures.)