Previously, web functionality was static and limited, but using IIS, ASP and ADO, sophisticated, dynamic, data-driven web applications can be created. The web is the fastest-growing programming area, and a powerful new platform now exists for VB6 programmers to write web-based applications.
Web Foundations and the Windows DNA Superstructure

For a long time now, Microsoft has promoted its vision of "Information at Your Fingertips" so that a user can operate more efficiently. It was during his keynote speech at COMDEX in 1990 that Bill Gates (the Chairman and CEO of Microsoft) introduced this concept and discussed his expectations of the future. He painted a picture of the consequences that desktop computer technology would have in many areas of everyday life.
Four years later, Bill Gates gave another COMDEX speech updating his original theme. While he talked about the effects of the recent rapid changes in technology, there was still no reference to what has no doubt caused the biggest revolution in the IT industry since the PC: the Internet.
Microsoft's D-Day awakening to the Internet didn't come until December 1995, when they publicly acknowledged its significance and announced an overall Internet strategy. Later, in March 1996 at their Professional Developers Conference, Microsoft promised to use Internet standards to extend and embrace existing IT infrastructures and to deliver a more effective computing environment. This would be achieved by producing a comprehensive set of technologies, products, services and tools that would seamlessly integrate desktops, LANs, legacy systems, groupware applications and the public Internet.
While Microsoft joined the Internet game relatively late, they rapidly gained momentum and have since released an incredible range of innovative Internet products. These products have provided users with rich Internet experiences, and given organizations the mechanisms to develop high-impact, business-critical Internet solutions that are secure, robust, high-performance and scaleable.
The most recent vision of Microsoft's strategy for delivering their "Information at Your Fingertips" goal was unveiled at the Professional Developers Conference in September 1997. It was announced that Microsoft's future strategy would be based on delivering an architectural framework for creating modern, open, scaleable, multi-tier distributed Windows applications using both Internet and client / server technologies - this was to be known as the Windows Distributed interNet Applications Architecture, or Windows DNA.
The roadmap for the delivery of Windows DNA meant that its introduction would be evolutionary and not revolutionary. It would build on current investments in technologies, hardware, applications and skills. Many of the services detailed in Windows DNA had been around for a number of years - however Windows DNA was the formalization of the framework and a blueprint for the future. Fundamental to Windows DNA is Windows 2000 (previously known as Windows NT 5.0) which is not targeted for delivery until at least late 1999; however, some of the key Windows DNA components are available now and supplied in the Windows NT 4.0 Option Pack.
A crucial part of Microsoft's Windows DNA arsenal is their comprehensive portfolio of development products. Most notable is Visual Studio (current version 6.0), which encompasses all of the Microsoft programming languages, development tools and technical information. The aim of the latest releases of these development products is to address Windows DNA by reducing the differences between developing Internet applications and developing for more traditional client / server environments. Without doubt, the most popular programming language is Visual Basic and, as we shall see throughout this book, the latest version (Visual Basic 6.0) meets these objectives in full.
In this chapter, we'll preview the key concepts of Web technologies and Windows DNA, in order to provide a foundation for each of the topics discussed throughout the book. In particular we shall look at:
Why Web Technology?

Although the Internet has existed in various forms since the late 1960s, it is only over the last few years that it has become widely known and utilized. In such a very short period of time, it has caused major changes in information technology, business processes, communications and the way many people spend their leisure time. It is probable that no other significant invention or technology has been embraced by so many so quickly, or has had the potential to change our future lifestyle so dramatically. To put its growth in perspective, consider that the World Wide Web attained 50 million users in four years - it took radio 38 years and television 13 years to reach similar penetration.
To understand the main reasons for the rapid adoption of Web technologies, we need to comprehend the effects of two important and independent issues that are driving future Information Technology directions - one is business led and the other is technical. In the next couple of sections we shall discuss:
IT Strategy for the Millennium

As we approach the millennium, today's senior management is faced with the unique problem of ensuring that their organizations can meet the challenges caused by the rapidly changing marketplace.
To survive, it is vital that their organizations have an effective IT strategy that enables competitive advantage by providing productivity improvements, increased customer satisfaction, mechanisms for new and enhanced revenue streams, and timely access to key data for effective business decisions. In addition, the IT infrastructure must provide flexibility, allowing the business to react quickly to change and to act when potential opportunities are recognized.
Another term recently coined by Microsoft is the Digital Nervous System, and it provides a great analogy. In a human, it is the web of nerves and synapses that gives the body the information that it needs, when it needs it, to unconsciously perform its various complex tasks. Similarly, any competitive company must have a healthy IT infrastructure (i.e. a 'digital' nervous system) that provides good information flow, allows it to perform its tasks in an efficient manner, and allows the business to respond quickly to frequently changing market dynamics. For more information on these ideas, take a look at Bill Gates' latest book:
Business @ the Speed of Thought: Using a Digital Nervous System; Warner Books; ISBN 0-446-52568-5
Processing Islands in the Enterprise

For many years, organizations have accomplished their business goals by exploiting their technology base in the following three dimensions:
Integrating the Enterprise

Over the years, software vendors have offered various architectures in an attempt to integrate the various distributed environments within an organization. Unfortunately, such solutions have never provided the complete seamless integration and flexibility that the business demanded. At long last, there now seems to be light at the end of the tunnel, as it appears that a distributed computing framework based on Web technologies provides the key to fulfilling these requirements. The rapid adoption of Internet standards by every significant IT vendor has provided for the complete integration of the various distributed environments and different infrastructures within an organization.
The first serious use of Web technology within business was for the implementation of Intranets. Intranets take full advantage of the open industry standards and the familiar Web browser / server software originally devised for the Internet, to provide employees with access to corporate information and processes. However, whereas the Internet is global and publicly open to all, an Intranet is closed and has strict user access controls enforced. Many companies are now taking advantage of Intranets to make it more efficient for their staff to locate and process information, and to collaborate with each other. The Web browser approach enables a consistent user interface and provides a single view of all company information, irrespective of its native file format or the type of data source. For most organizations their business information is key, and many have huge investments in existing data systems and electronic documents - mechanisms that enable the organization to leverage such valuable knowledge can have a considerable impact on business performance.
Extending the Enterprise

A digital nervous system is not just about moving information around inside a company, but also out to customers, partners and suppliers. Recent advances in networking mean that both private and public networks (and in particular the Internet) can be exploited to extend the enterprise to include external parties. Many businesses are now finding that their infrastructures are becoming intelligently intertwined with each other, forming Extranets - large 'Virtual Enterprises' that frequently comprise many different organizations. As an example, consider the manufacturing supply chain; before the goods reach the end customer, numerous organizations are involved along the chain, from handling the raw materials, through the manufacturing process, and on to distribution and retail. Each of these organizations can achieve faster delivery times and reduce their inventory costs if they handle their business processes electronically.
Let us now take a look at some of the technical aspects of this change.
The Evolution of IT System Architectures

Over the next few sections we shall be looking at the evolution of IT architectures, and see how the pursuit of an optimal solution has led us to an infrastructure based on Web technologies. Hopefully, by the end of this discussion, you will start to agree that the compulsion to move to Web architectures is irresistible.
So we shall start with a short history lesson.
The Early Days - Centralized Computing

In the 1960s and 1970s, the computers adopted by businesses were expensive mainframes that used simple monochrome terminals with keyboards to access the applications and the databases. These resources were located centrally on the mainframe, and processing time was shared or 'time-sliced' between the various users.
Such terminals are now often referred to as 'dumb terminals', since they could not perform any of the processing logic - they were text based and only capable of displaying screen forms comprising information received from the central system or entered by the user. The keyboards had a designated 'submit' key, with which all of the information entered into the screen form would be sent to the central system for subsequent processing, such as validation and database access.
As time passed, the hardware technology evolved and many small businesses migrated towards the use of cheaper minicomputers and UNIX super-microcomputers. These systems relied on a new generation of dumb terminals, though they were typically no smarter than their predecessors. Such systems would intercept every user key-press and then instantaneously generate the appropriate screen interaction.
This early model of computing was called centralized - the business processing, data management and screen presentation were all controlled by the central systems, with just the user interaction handled by the terminal devices.
PC Networking: Moving Towards Decentralization

Over time, further advances in hardware, software and networking enabled computer systems to move from a centralized, shared-logic architecture to a network of workstations and servers.
The first personal computers and local area networks appeared in the early 1980s. Applications that exploited this new model were based on file sharing, and various XBase products such as dBase, FoxPro and Clipper popularized this approach. In such systems, the entire data set was downloaded from a central file server and operated upon by application logic that was totally resident on the workstation. File sharing systems worked well for small amounts of data and small populations of users. However, as the momentum for PC networking grew, the capacity of such applications became strained and an alternative approach was required - the solution came from an application architecture known as client / server computing.
Client / Server Computing

Client / server involves breaking up the system functionality into distinct, manageable layers that can be independently developed and deployed across multiple machines, and that use a communication mechanism to allow the different layers to co-operate.
This approach is regarded as an enabling technology that can implement systems across an organization in a modular and flexible manner. It allows for the distribution of applications away from single machines located in isolated departments to an implementation across the enterprise. For example, it is now possible for one person in customer services to access all corporate systems - gone are the old days of annoying transfers between the different representatives in each department.
It is common to find that most client / server solutions involve the following three layers: the presentation logic (the user interface), the business logic (the rules and processes of the application), and the data access logic (the interaction with the database).
As we shall now see, there are many variations to client / server architectures.
Two Tier Client / Server

The first generation of client / server systems was an evolution of the file sharing applications mentioned above. In these applications, the central file server is replaced with a specialized relational database management system (RDBMS). Such databases can offer high transaction rates at a fraction of the costs associated with mainframes. When the client (a workstation application, typically using a GUI) needs to act upon data, it makes a request via the network to the database server - the database then processes the request and returns just the data appropriate to the client's needs.
When compared to the file sharing application (which returned the complete file), this client / server architecture dramatically reduces network traffic. In addition, today's databases provide many features that enable the development of advanced multi-user applications - for example, allowing multiple users to access and update the same set of data.
Because the processing is split between two distinct layers - the workstation and the database server - such architectures are referred to as two-tier client / server architectures.
This approach became very popular, as it was found that sophisticated systems could be implemented quickly (and thus cheaply) using development tools like Visual Basic and rapid application development (RAD) techniques.
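To make the two-tier model concrete, here is a minimal sketch of a VB6 client talking directly to the database server via ADO. The DSN, the form control names and the use of the SQL Server sample Pubs database are purely illustrative:

```
' Two-tier sketch: presentation logic and data access both live on
' the client; only the database engine runs on the server.
Private Sub cmdLookup_Click()
    Dim conn As ADODB.Connection
    Dim rs As ADODB.Recordset

    Set conn = New ADODB.Connection
    conn.Open "DSN=Pubs"          ' connect straight to the database server

    ' The query travels over the network; unlike file sharing,
    ' only the matching rows come back - not the whole file.
    Set rs = conn.Execute( _
        "SELECT au_lname FROM authors WHERE state = 'CA'")

    Do While Not rs.EOF
        lstAuthors.AddItem rs("au_lname")   ' presentation logic on the client
        rs.MoveNext
    Loop

    rs.Close
    conn.Close
End Sub
```

Notice that the SQL, and hence the business rules, are embedded in the client executable - a point we return to when discussing the limitations of this architecture.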
However, as expectations increased it was found that two-tier had certain limitations and problems, including the following: 'fat' clients containing the business logic have to be manually installed and maintained on every workstation; changes to business rules require redeployment to every desktop; and scalability is limited, since each client typically holds its own connection to the database server.
Three Tier Client / Server

Drawing on the lessons learnt from two-tier systems, an increase in application performance and a notable reduction in network traffic can be achieved by inserting an additional middle tier between the workstation and the database server, creating three tiers. Furthermore, it is found that such architectures are more maintainable and supportable, and provide greater flexibility to address ever-changing business requirements.
This approach is known as a three-tier client / server architecture. The middle layer is called the application server, and is used to handle the business logic of the system. The workstation is now only responsible for handling the presentation logic. As before, the database server handles the data access logic.
It is possible for the application server and database server to physically reside on the same machine - and in many cases this can provide an optimum solution. However, for it to be recognized as three-tier, distinct boundaries or interfaces must exist between the two layers to ensure the advantages of the architecture are achieved.
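The separation can be sketched by moving the data access out of the form and behind a middle-tier class. The class and method names here are hypothetical, and in practice such a class would be compiled into an ActiveX component hosted on the application server:

```
' --- Middle tier (hypothetical class AuthorFinder): business and
'     data access logic, deployed once on the application server ---
Public Function AuthorsInState(ByVal State As String) As ADODB.Recordset
    Dim conn As ADODB.Connection
    Set conn = New ADODB.Connection
    conn.Open "DSN=Pubs"
    ' The SQL and business rules live here, not on every desktop
    Set AuthorsInState = conn.Execute( _
        "SELECT au_lname FROM authors WHERE state = '" & State & "'")
End Function

' --- Client tier: presentation logic only ---
Private Sub cmdLookup_Click()
    Dim finder As New AuthorFinder
    Dim rs As ADODB.Recordset
    Set rs = finder.AuthorsInState("CA")   ' no database details on the client
    Do While Not rs.EOF
        lstAuthors.AddItem rs("au_lname")
        rs.MoveNext
    Loop
End Sub
```

The client no longer knows anything about DSNs or SQL; a change to the business rules means updating one component on the application server, not every workstation.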
Multi Tier Client / Server

A further extension of the three-tier solution is the multi-tier or, as it is sometimes called, n-tier architecture. These solutions are the most flexible and scaleable, and build on all the advantages of the three-tier architecture.
In a multi-tier client / server solution, the business logic is partitioned and distributed over several machines. As requirements change during a system's lifetime, this partitioning and deployment can be reviewed and amended with minimal impact. Furthermore, additional tiers are included to support multiple databases and other services, such as message queues, legacy systems, data warehouses, communication middleware and so on.
By enabling the distribution of the workload over many CPUs (using either symmetric multiprocessing or massively parallel clustered technology), it is easy to see how scalable solutions can be delivered. Sometimes the distribution of the logic over separate geographical regions can be considered to achieve optimum performance; for example, locating processes at the sites where this limits the amount of slow network communication performed.
Unfortunately, three-tier and multi-tier client / server solutions are not trivial to implement. There are more tasks to undertake and complex issues to address than when building two-tier systems. A strong understanding of the multi-tier client / server development techniques and an appreciation of the potential pitfalls are vital.
Web Technologies: Centralized Computing but with a Twist

Just as we were beginning to get used to the issues of developing multi-tier architectures, the new paradigm of Web technologies arrived to divert interest away from the traditional client / server architectures.
The Web architecture is a flexible and scaleable implementation of multi-tier computing, and uses a Web browser (the client) to retrieve information from a Web server. The Web server can interface with application servers and databases to determine programmatically the information that is returned to the user.
In the simplest scenario, the three processing layers may reside on the same machine, as shown below.
Many people confuse the terms Internet and Web or consider the two as equivalent but in fact, the two are very distinct. It is important to recognize that the Web is not a network but an application that operates over networks using a communications protocol called HTTP (Hyper Text Transfer Protocol).
Most documents retrieved over the Web contain HTML (Hyper Text Markup Language), which is a page description language that defines both the data content of the document and information on how that content should be rendered by the Web browser. Modern Web browsers support the display of multimedia within the retrieved HTML documents - including text, graphics, sound and video - and hyperlinks, in which items on the document are linked to other Web resources. By clicking on a hyperlink, the Web browser automatically navigates to the target document.
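As a small taste of the technologies covered in this book, the fragment below is a minimal Active Server Page (the file name in the hyperlink is illustrative). The browser requests the page over HTTP, IIS executes the embedded VBScript on the server, and the browser receives and renders pure HTML:

```
<%@ Language=VBScript %>
<HTML>
<HEAD><TITLE>A Dynamic Page</TITLE></HEAD>
<BODY>
<H1>Welcome</H1>
<!-- The server evaluates this expression before the page is sent -->
<P>The time on the server is <% = Now() %></P>
<!-- Clicking this hyperlink makes the browser issue a new HTTP request -->
<A HREF="another.asp">Go to another page</A>
</BODY>
</HTML>
```

Because only HTML ever reaches the client, the same page works in any browser on any platform - the essence of the 'twist' discussed next.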
We can now see how the Web-based data access model has traveled full circle, back to an apparently centralized computing model. The twist, however, is that in this model the client is accessing geographically distributed applications resident on multiple servers interconnected by networking technologies.
The great thing about this Web architecture is that it solves several of the problems with traditional client / server. By restricting the client processing logic to HTML, it is possible to develop a single universal application that can be deployed across different types of platforms (e.g. Windows, Mac, various flavours of Unix, etc.).
In addition, all client logic is centrally administered and dynamically deployed - this means that any bug fixes or enhancements will automatically be applied the next time the user accesses the application. This avoids the process of having to manually deploy software on every desktop, which can be very costly on a large population of users.
So, recapping what we have seen so far, there are two main messages to take with us as we proceed. Firstly, the next hugely successful companies will be those that quickly enable the next generation of business systems that exploit the convergence of computers and networking for business advantage. Secondly, it will be Web technologies that are the key enabler in their search for prosperity.
In the next section we shall introduce some of the basic concepts of Web technologies that are prerequisite knowledge for understanding the topics discussed in this book.