Professional Visual Basic 6 Web Programming

Overview

With version 6, Visual Basic has gained the ability to develop Internet applications on Internet Information Server (IIS), the free web server that comes with Windows NT Server.

Previously, web functionality was static and limited, but using IIS, ASP and ADO, sophisticated, dynamic, data-driven web applications can be created. The web is the fastest-growing programming area, and VB6 programmers now have a powerful new platform for building web-based applications.


Product Details

  • ISBN-13: 9781861002228
  • Publisher: Wrox Press, Inc.
  • Publication date: 8/1/1999
  • Series: Professional Series
  • Pages: 1120
  • Product dimensions: 7.31 (w) x 9.03 (h) x 2.09 (d)

Read an Excerpt

Extract from Chapter 1 of 'Professional Visual Basic 6 Web Programming'

Chapter 1: Web Foundations and the Windows DNA Superstructure

For a long time now, Microsoft has promoted their vision of "Information at Your Fingertips", the idea that a user can operate more efficiently with ready access to information. It was during his keynote speech at COMDEX in 1990 that Bill Gates (the Chairman and CEO of Microsoft) introduced this concept and discussed his expectations of the future. He painted a picture of the consequences that desktop computer technology would have in many areas of everyday life.

Four years later, Bill Gates gave another COMDEX speech updating his original theme. While he talked about the effects of the recent rapid changes in technology, there was still no reference to what would, without doubt, cause the biggest revolution in the IT industry since the PC: the Internet.

Microsoft's D-Day awakening to the Internet didn't come until December 1995, when they publicly acknowledged its significance and announced an overall Internet strategy. Later, in March 1996 at their Professional Developers Conference, Microsoft delivered a promise to use Internet standards to extend and embrace existing IT infrastructures and to deliver a more effective computing environment. This would be achieved by producing a comprehensive set of technologies, products, services and tools that would seamlessly integrate desktops, LANs, legacy systems, GroupWare applications and the public Internet.

While Microsoft joined the Internet game relatively late, they rapidly gained momentum and have since released an incredible range of innovative Internet products. These products have given users rich Internet experiences, and given organizations the mechanisms to develop high-impact, business-critical Internet solutions that are secure, robust, high-performance and scalable.

The most recent vision of Microsoft's strategy for delivering their "Information at Your Fingertips" goal was unveiled at the Professional Developers Conference in September 1997. It was announced that Microsoft's future strategy would be based on delivering an architectural framework for creating modern, open, scalable, multi-tier distributed Windows applications using both Internet and client / server technologies - this was to be known as the Windows Distributed interNet Applications Architecture, or Windows DNA.

The roadmap for the delivery of Windows DNA meant that its introduction would be evolutionary, not revolutionary. It would build on current investments in technologies, hardware, applications and skills. Many of the services detailed in Windows DNA had been around for a number of years; however, Windows DNA was the formalization of the framework and a blueprint for the future. Fundamental to Windows DNA is Windows 2000 (previously known as Windows NT 5.0), which is not targeted for delivery until at least late 1999; however, some of the key Windows DNA components are available now, supplied in the Windows NT 4.0 Option Pack.

A crucial part of Microsoft's Windows DNA arsenal is their comprehensive portfolio of development products. Most notable is Visual Studio (current version 6.0), which encompasses all of the Microsoft programming languages, development tools and technical information. The aim of the latest releases of these development products is to address Windows DNA by reducing the differences between developing Internet applications and developing for more traditional client / server environments. Without doubt, the most popular programming language is Visual Basic and, as we shall see throughout this book, the latest version (Visual Basic 6.0) meets these objectives in full.

In this chapter, we'll preview the key concepts of Web technologies and Windows DNA, in order to provide a foundation for each of the topics discussed throughout the book. In particular we shall look at:

  • Why the Web and Internet technologies are so popular, and the reasons that many businesses are now basing their IT strategies upon them
  • The fundamentals of Web technologies, covering the key concepts that the remainder of the book assumes the reader understands
  • How Windows DNA provides an architectural framework that frees developers from the complexity of programming infrastructure and allows them to concentrate on delivering business solutions
  • How Visual Basic 6.0 fully addresses the new Web paradigms and provides the most productive tool for building Windows DNA applications
So let us start by discussing why Web technologies are so popular, and see why their wide acceptance is not just due to the frequent hype in the media.

Why Web Technology?

Although the Internet has existed in various forms since the late 1960s, it is only over the last few years that it has become widely known and utilized. In a very short period of time, it has caused major changes in information technology, business processes, communications and the way many people spend their leisure time. It is probable that no other significant invention or technology has been embraced by so many so quickly, or has had the potential to change our future lifestyle so dramatically. To put its growth in perspective, consider that the World Wide Web attained 50 million users in four years - it took radio 38 years and television 13 years to reach similar penetration.

To understand the main reasons for the rapid adoption of Web technologies, we need to comprehend the effects of two important and independent issues that are driving future Information Technology directions - one is business-led and the other is technical. In the next couple of sections we shall discuss:

  • The critical issues facing organizations as they adjust to changing market forces and prepare to do business in the 21st century
  • The convergence of computing and communications, and the evolution of IT system architectures

IT Strategy for the Millennium

As we approach the millennium, today's senior management is faced with the unique problem of ensuring that their organizations can meet the challenges of a rapidly changing marketplace.

To survive, it is vital that their organizations have an effective IT strategy that enables competitive advantage by providing productivity improvements, increased customer satisfaction, mechanisms for new and enhanced revenue streams, and timely access to key data for effective business decisions. In addition, the IT infrastructure must provide flexibility and allow the business to react quickly when change occurs and when potential opportunities are recognized.

Another term recently coined by Microsoft is the Digital Nervous System, and it provides a great analogy. In a human, it is the web of nerves and synapses that gives the body the information that it needs, when it needs it, to unconsciously perform its various complex tasks. Similarly, any competitive company must have a healthy IT infrastructure (i.e. a 'digital' nervous system) that provides good information flow, allows it to perform its tasks efficiently and allows the business to respond quickly to frequently changing market dynamics. For more information on these ideas, take a look at Bill Gates' latest book:

Business @ the Speed of Thought : Using a Digital Nervous System; Warner Books; ISBN: 0446525685

Processing Islands in the Enterprise

For many years, organizations have accomplished their business goals by exploiting their technology base in the following three dimensions:
  • Data Processing - the core IT systems that control the fundamental business processes in an organization; examples include Accounting, Stock Control, Order Processing, Job Tracking, etc. Many types of technologies, from large mainframes to client / server architectures, have been applied to these business-critical systems, and most of them still have a part to play in today's IT infrastructures.
  • Personal Productivity - the huge popularity of the PC and integrated office suites has forever changed the way individual employees work with information, and has often changed business practices and strategies. These tools have dramatically increased productivity, streamlined operations and made IT more cost-effective.
  • Collaboration - the use of communications and GroupWare software has enabled both organizations and individuals to work in partnerships and teams. Such systems can scale to thousands of users across the enterprise, enabling businesses to redefine their operations for further advantage and reduced costs.
However, many of these benefits do not come without cost. Each of these dimensions typically has its own infrastructure, and businesses have been faced with the complex problem of building 'information bridges' between the different systems and applications - building systems that span all three dimensions has historically been difficult. Furthermore, having multiple technology infrastructures always results in additional costs for software, hardware, support and training.

Integrating the Enterprise

Over the years, software vendors have offered various architectures in an attempt to integrate the various distributed environments within an organization. Unfortunately, such solutions have never provided the complete, seamless integration and flexibility that the business demanded. At long last, there now seems to be light at the end of the tunnel, as it appears that a distributed computing framework based on Web technologies provides the key to fulfilling these requirements. The rapid adoption of Internet standards by every significant IT vendor has provided for the complete integration of the various distributed environments and different infrastructures within an organization.

The first serious use of Web technology within business was for the implementation of Intranets. Intranets take full advantage of the open industry standards and the familiar Web browser / server software originally devised for the Internet, to provide employees with access to corporate information and processes. However, whereas the Internet is global and publicly open to all, an Intranet is closed and has strict user access controls enforced. Many companies are now taking advantage of Intranets to make it more efficient for their staff to locate and process information, and to collaborate with each other. The Web browser approach enables a consistent user interface and provides a single view of all company information, irrespective of its native file format or the type of data source. For most organizations their business information is key, and many have huge investments in existing data systems and electronic documents - mechanisms enabling the leverage of such valuable knowledge in the organization can have considerable impact on business performance.

Extending the Enterprise

A digital nervous system is not just about moving information around inside a company, but also out to customers, partners and suppliers. Recent advances in networking mean that both private and public networks (and in particular the Internet) can be exploited to extend the enterprise to include external parties. Many businesses are now finding that their infrastructures are becoming intelligently intertwined with one another to form Extranets - large 'Virtual Enterprises' that frequently comprise many different organizations. As an example, consider the manufacturing supply chain; before the goods reach the end customer, numerous organizations are involved along the chain, from handling the raw materials, through the manufacturing process, and on to distribution and retail. Each of these organizations can achieve faster delivery times and reduce their inventory costs if they handle their business processes electronically.

Let us now take a look at some of the technical aspects of this change.

The Evolution of IT System Architectures

Over the next few sections we shall look at the evolution of IT architectures and see how the pursuit of an optimal solution has led us to an infrastructure based on Web technologies. Hopefully, by the end of this discussion, you will start to agree that the compulsion to move to Web architectures is irresistible.

So we shall start with a short history lesson...

The Early Days - Centralized Computing

In the 1960s and 1970s, the computers adopted by businesses were expensive mainframes that used simple monochrome terminals with keyboards to access the applications and the databases. These resources were located centrally on the mainframe, and processing time was shared or 'time-sliced' between the various users.

Such terminals are now often referred to as 'dumb terminals' since they could not perform any of the processing logic - they were text-based and only capable of displaying screen forms comprising information received from the central system or entered by the user. The keyboards had a designated 'submit' key, which sent all of the information entered into the screen form to the central system for subsequent processing such as validation and database access.

As time passed, the hardware technology evolved and many small businesses migrated towards cheaper minicomputers and UNIX super-microcomputers. These systems relied on a new generation of dumb terminals, though they were typically no smarter than their predecessors. Such systems would intercept every user key-press and then instantaneously generate the appropriate screen interaction.

This early model of computing was called centralized - the business processing, data management and screen presentation were all controlled by the central systems, with just the user interaction handled by the terminal devices.

PC Networking: Moving Towards Decentralization

Over time, further advances in hardware, software and networking enabled computer systems to move from a centralized, shared-logic architecture to a network of workstations and servers.

The first personal computers and local area networks appeared in the early 1980s. Applications that exploited this new model were based on file sharing, and various XBase products such as dBase, FoxPro and Clipper popularized the approach. In such systems, entire data files are downloaded from a central file server and operated upon by application logic resident wholly on the workstation. File sharing systems worked well for small amounts of data and small populations of users. However, as the momentum for PC networking grew, the capacity of such applications became strained and an alternative approach was required - the solution came from an application architecture known as client / server computing.

Client / Server Computing

Client / server involves breaking up the system functionality into distinct, manageable layers that can be independently developed, deployed across multiple machines, and made to co-operate through a communication mechanism.

This approach is regarded as an enabling technology that can implement systems across an organization in a modular and flexible manner. It allows for the distribution of applications away from single machines located in isolated departments to an implementation across the enterprise. For example, it is now possible for one person in customer services to access all corporate systems - gone are the old days of annoying transfers between the different representatives in each department.

It is common to find that most client / server solutions involve the following three layers:

  • Presentation logic - this handles how the user interacts with the application; usually implemented by providing an easy to use graphical user interface (GUI)
  • Business logic - this handles the mechanics (or business rules) of the application
  • Data access logic - this handles the storage and retrieval of data; it is vital that the integrity of the data is maintained
The development of these separate layers needs careful design and an accurate definition of the distinct boundaries to ensure that logic within the different layers is not intertwined. Encapsulating the logic in this fashion ensures that future changes can be implemented without impacting the other layers and this enables both reusability and reliability.
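To make the layering concrete, here is a minimal sketch in Visual Basic of business logic encapsulated in its own class module; the COrderRules class and its discount rule are invented for this illustration, not taken from the book:

    ' COrderRules.cls - a hypothetical business logic class module.
    ' The presentation layer calls ApplyDiscount rather than embedding
    ' the rule itself, so the rule can change without touching the GUI
    ' or the data access code.
    Option Explicit

    ' Illustrative business rule: orders over 1,000 earn a 5% discount.
    Public Function ApplyDiscount(ByVal curOrderTotal As Currency) As Currency
        If curOrderTotal > 1000 Then
            ApplyDiscount = curOrderTotal * 0.95
        Else
            ApplyDiscount = curOrderTotal
        End If
    End Function

A form in the presentation layer would simply call ApplyDiscount and display the result, while the data access layer would store whatever the function returns - neither needs to know the rule.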

As we shall now see, there are many variations to client / server architectures.

Two Tier Client / Server

The first generation of client / server systems was an evolution of the file sharing applications mentioned above. In these applications, the central file server is replaced with a specialized relational database management system (RDBMS). Such databases can offer high transaction rates at a fraction of the cost associated with mainframes. When the client (a workstation application, typically with a GUI) needs to act upon data, it makes a request via the network to the database server - the database then processes the request and returns just the data appropriate to the client's needs.

When compared to the file sharing application (which returned the complete file), this client / server architecture dramatically reduces network traffic. In addition, today's databases provide many features that enable the development of advanced multi-user applications - for example, allowing multiple users to access and update the same set of data.

Because the processing is split between distinct layers - the workstation and the database server - such architectures are referred to as being two-tier client / server architecture.

This approach became very popular, as it was found that sophisticated systems could be implemented quickly (and thus cheaply) using development tools like Visual Basic and rapid application development (RAD) techniques.
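As an illustration of the two-tier model, the following sketch uses ADO to send a query to the database server from a VB form. It assumes a project reference to the Microsoft ActiveX Data Objects library, and the connection string, Customers table, and the cmdLoad / lstCustomers controls are all invented for the example:

    Option Explicit

    Private Sub cmdLoad_Click()
        Dim cn As ADODB.Connection
        Dim rs As ADODB.Recordset

        ' Connect to a hypothetical SQL Server database.
        Set cn = New ADODB.Connection
        cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                "Initial Catalog=Sales;Integrated Security=SSPI;"

        ' The server resolves the query and returns only the matching
        ' rows - unlike file sharing, the whole table never crosses
        ' the network.
        Set rs = cn.Execute("SELECT Name, City FROM Customers " & _
                            "WHERE City = 'London'")

        Do While Not rs.EOF
            lstCustomers.AddItem rs!Name & ", " & rs!City
            rs.MoveNext
        Loop

        rs.Close
        cn.Close
    End Sub

Note that the presentation, business and data access logic are all still on the workstation here - which is exactly where the problems listed below come from.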

However, as expectations increased, it was found to have certain limitations and problems, including the following:

  • Database requests can generate large result sets; the performance of two-tier architectures rapidly deteriorates as networking bottlenecks occur once a certain number of users is exceeded
  • The architecture imposes substantial processing on the client; this means workstations with powerful CPUs and large amounts of disk and memory may be required
  • Each workstation session requires a separate database connection; this can drain resources on the database server - for example, Microsoft SQL Server requires 37K of memory for each user connection (and this is much lower than many other vendors' RDBMSs)
  • Deploying the business rules on the client can lead to high costs of deployment and support; if the logic changes, the effort involved in updating software on numerous workstations can be excessive
So whilst two-tier client / server is justifiable for small workgroup applications, it is now generally agreed that this approach does not provide the flexibility or scalability needed for large-scale applications deployed across the corporate enterprise.

Three Tier Client / Server

Drawing on the lessons learnt from two-tier systems, an increase in application performance and a notable reduction in network traffic can be achieved by inserting an additional middle tier between the workstation and the database server, creating three tiers. Furthermore, such architectures are found to be more maintainable and supportable, and to provide greater flexibility to address ever-changing business requirements.

This approach is known as a three-tier client / server architecture. The middle layer is called the application server and is used to handle the business logic of the system. The workstation is now only responsible for handling the presentation logic. As before, the database server handles the data access logic.

It is possible for the application server and database server to physically reside on the same machine - and in many cases this can provide an optimum solution. However, for the result to be recognized as three-tier, distinct boundaries or interfaces must exist between the two layers to ensure that the advantages of the architecture are achieved.
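To see what the tier boundary looks like in code, here is a sketch of a presentation-tier routine invoking business logic hosted as a COM component on the application server. The ProgID SalesRules.COrderRules, its CanAcceptOrder method, and the form controls are invented for the example; under DCOM, CreateObject can also be given the remote server's name as a second argument:

    Option Explicit

    Private Sub cmdCheck_Click()
        Dim rules As Object

        ' Late-bound call across the tier boundary; the business rule
        ' itself runs on the application server, not the workstation.
        Set rules = CreateObject("SalesRules.COrderRules")

        If rules.CanAcceptOrder(CCur(txtTotal.Text), CCur(txtLimit.Text)) Then
            MsgBox "Order accepted"
        Else
            MsgBox "Order exceeds credit limit"
        End If
    End Sub

Because the workstation holds no business logic, a change to the credit rule means redeploying one server component rather than updating every desktop - the deployment problem noted for two-tier systems above.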

Multi Tier Client / Server

A further extension to three-tier solutions is the multi-tier or, as it is sometimes called, n-tier architecture. These solutions are the most flexible and scalable, and build on all the advantages of the three-tier architecture.

In a multi-tier client / server solution, the business logic is partitioned and distributed over several machines. As requirements change during a system's lifetime, this partitioning and deployment can be reviewed and amended with minimal impact. Furthermore, additional tiers can be included to support multiple databases and other services such as message queues, legacy systems, data warehouses, communication middleware and so on.

By enabling the distribution of the workload over many CPUs (using either symmetric multiprocessing or massively parallel clustered technology), it is easy to see how scalable solutions can be delivered. Sometimes the logic can be distributed over separate geographical regions to achieve optimum performance; for example, locating processes at the sites where doing so limits the amount of slow network communication performed.

Unfortunately, three-tier and multi-tier client / server solutions are not trivial to implement. There are more tasks to undertake and complex issues to address than when building two-tier systems. A strong understanding of the multi-tier client / server development techniques and an appreciation of the potential pitfalls are vital.

Web Technologies: Centralized Computing but with a Twist

Just as we were beginning to get used to the issues of developing multi-tier architectures, the new paradigm of Web technologies arrived to divert interest away from the traditional client / server architectures.

The Web architecture is a flexible and scalable implementation of multi-tier computing, and uses a Web browser (client) to retrieve information from a Web server. The Web server can interface with application servers and databases to determine programmatically the information that is returned to the user.

In the simplest scenario, the three processing layers may reside on the same machine, as shown below.

Many people confuse the terms Internet and Web, or consider the two equivalent, but in fact they are very distinct. It is important to recognize that the Web is not a network but an application that operates over networks using a communications protocol called HTTP (Hyper Text Transfer Protocol).

Most documents retrieved over the Web contain HTML (Hyper Text Markup Language), a page description language that defines both the data content of the document and information on how that content should be rendered by the Web browser. Modern Web browsers support the display of multimedia within retrieved HTML documents - including text, graphics, sound and video - as well as hyperlinks, in which items in the document are linked to other Web resources. By clicking on a hyperlink, the Web browser automatically navigates to the target document.
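To ground these terms, here is a minimal ASP page sketch (the file name hello.asp and its content are invented for this illustration; ASP itself is covered from Chapter 8 onwards) showing server-side VBScript generating an HTML document that contains a hyperlink:

    <%@ Language=VBScript %>
    <%
      ' The server executes this script and sends only the resulting
      ' HTML to the browser.
      Dim sVisitor
      sVisitor = Request.QueryString("name")
      If sVisitor = "" Then sVisitor = "world"
    %>
    <HTML>
      <HEAD><TITLE>Hello from ASP</TITLE></HEAD>
      <BODY>
        <P>Hello, <%= Server.HTMLEncode(sVisitor) %>!</P>
        <!-- Clicking a hyperlink makes the browser issue a new HTTP
             request for the target resource. -->
        <A HREF="http://www.wrox.com/">Visit the publisher</A>
      </BODY>
    </HTML>

Requesting hello.asp?name=Reader over HTTP returns plain HTML, which any browser on any platform can render - the essence of the "twist" described next.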

We can now see how the Web-based data access model has come full circle, back to an apparently centralized computing model. The twist, however, is that in this model the client is accessing geographically distributed applications resident on multiple servers interconnected by networking technologies.

The great thing about this Web architecture is that it solves several of the problems with traditional client / server. By restricting the client processing logic to HTML, it is possible to develop a single universal application that can be deployed across different types of platforms (e.g. Windows, Mac, various flavours of Unix, etc.).

In addition, all client logic is centrally administered and dynamically deployed - this means that any bug fixes or enhancements will automatically be applied the next time the user accesses the application. This avoids the process of having to manually deploy software on every desktop, which can be very costly on a large population of users.

--- --- --- --- ---

So, recapping what we have seen so far, there are two main messages to take with us as we proceed. Firstly, the next hugely successful companies will be those that quickly enable the next generation of business systems, exploiting the convergence of computers and networking for business advantage. Secondly, it will be Web technologies that act as the key enabler in their search for prosperity.

In the next section, we shall introduce some of the basic concepts of Web technologies that are prerequisite knowledge for understanding the topics discussed in this book.


Table of Contents

Chapter 1: Web Fundamentals
Chapter 2: Windows DNA - a framework for building applications
Chapter 3: An Overview of Building
Chapter 4: Client-Side Programming with Visual Basic
Chapter 5: Building Client-Side ActiveX Controls
Chapter 6: Dynamic HTML Applications
Chapter 7: Integrating Web Browsing Using the WebBrowser Objects
Chapter 8: Getting Started By Understanding ASP
Chapter 9: Building Server Components in Visual Basic
Chapter 10: Advanced Visual Basic Server Components
Chapter 11: Interfacing With Server Services in Visual Basic
Chapter 12: Introducing Webclasses and IIS Applications
Chapter 13: The Fundamentals of IIS Applications
Chapter 14: Advanced Webclass Techniques
Chapter 15: Meet RDS
Chapter 16: A Guide to XML
Case Study 1: Message Board, Part 1
Case Study 1: Message Board, Part 2
Case Study 2: Web Based Document Management
Case Study 3: DHTML Applications
Case Study 4: CGI Integration
