Coming to Terms with NT
Back when I mainly wrote programs, magazine columns, and technical
books for a living, Windows NT looked like the most hopeful development for PCs since the integrated circuit. Even in the earliest alpha test versions, Windows NT was so much more stable and powerful than DOS and so much easier to install than OS/2 that it made software development a joy. Of course, in those days, I used Windows NT in its Workstation incarnation, and gave little thought to Windows NT Advanced Server. Now that I am involved in building networks, setting up file and application servers, and helping to support hundreds of naive end-users on a daily basis, I am much more conscious of the unwieldy aspects of Windows NT.
Frankly, Windows NT is a royal pain to administer. Each installation of a server must be followed up by the application of the latest service pack, a host of hot-fixes, and scrupulous adherence to a checklist of tweaks that close various security loopholes. It's quite routine for service packs to break applications, and vice versa, which lends a certain flavor of mystery and suspense to the process. The system registry is a can of worms, support for multiple domains is hardware-expensive and labor-intensive, and the behavior of the security system in a complex environment is often difficult to predict. Add the various components
of BackOffice to the picture -- particularly Microsoft Exchange with its own hierarchy of organizations, sites, servers, connectors, directories, and service packs -- and you've got a real witch's brew.
Windows NT wouldn't be quite as difficult to cope with if it came
with a proper set of manuals. Unfortunately, as Microsoft's stranglehold on the software industry has increased, its commitment to comprehensive, accurate documentation -- which never went too deep to begin with -- has gone down the tubes along with most of Microsoft's rivals. When Microsoft does bother to throw together any non-trivial documentation, it is packaged as separate "Resource
Kits" so that the public can be bled for a few more dollars to prop up the Microsoft Press profit center. The Developer Network and TechNet CD-ROMs are other Microsoft ploys to squeeze additional revenue out of hapless customers for bug fixes and information that ought to be delivered with the base system.
Mother Nature and book publishers scorn an unpopulated ecological niche, and there are plenty of third-party Windows NT books out there to try and fill the needs created by Microsoft's software defects and slovenly manuals. But too many of the authors follow Microsoft's lead, rehashing the same material in much the same order, and, more importantly, omitting or glossing over the same problem areas. This is, I strongly suspect, because most of these authors are NT mavens
on the lecture circuit, hit-and-run NT consultants, or born-again OS/2 promoters, and they do not have the foundation of years of practical, real-world experience to understand what is really needed.
O'Reilly and Aeleen Frisch to the rescue! Essential Windows NT
System Administration breaks the mold and closes the information
gap for NT administrators. Ms. Frisch has been responsible for a variety of VMS, UNIX, and Windows NT systems for some 15 years, and she clearly has an unusually thorough understanding of what it takes in the way of skills, knowledge, and resources to keep an industrial-strength network of servers and clients running smoothly over a long period of time. How fortunate for us all that she appears
to have a generous allotment of writing and organizational talent as well.
By the time I had finished the first chapter, it was evident to me that Frisch approached Windows NT from a completely different perspective than most NT book authors. Instead of browsing Microsoft's sorry excuses for documentation and trying to figure out how she could cover the same ground using different words, she drew up a list of things she had to know and tasks she had to accomplish
based on her VMS and UNIX background, and then set out to find their Windows NT counterparts. A startlingly adult strategy!
The result is an eminently practical book that is light-years beyond its competitors in usability and credibility. The security chapter, which organizes its discussion from the standpoint of what is needed rather than what Windows NT most easily does, is an outstanding example of the strength of Frisch's approach. Throughout the book, Ms. Frisch's methods are eclectic and
ecumenical. Her goal is to get the job done quickly and reliably, and she will use whatever tool works best, whether it be graphically-based, command-line, custom script, or third-party utility.
Among biologists, there is discussion of an evolutionary theory called punctuated equilibrium. Epochs of apparent stability are interrupted by episodes of rapid change, with the emergence of new species and capabilities over a relatively short period. Perhaps the technical book market is similar. During an apparently stagnant interval, a certain critical mass of information and techniques accumulates, and then the right author and publisher come together at the right place and time to hatch a book that is unlike any of its predecessors. Essential Windows NT System Administration is such a book.--Dr. Dobb's Electronic Review of Computer Books
From Chapter 7: Backups
Any user of any computer system figures out sooner or later that files are occasionally lost. These losses have many causes: users may delete their own files accidentally, a power failure can corrupt an important database file, a hardware failure may ruin an entire disk, and so on. The damage resulting from these losses can range from minor to very expensive. To guard against them, one primary responsibility of a system administrator is planning and implementing a backup scheme that periodically saves all of the files on the systems for which she is responsible. It is also the administrator's responsibility to see that backups are performed in a timely manner and that backup tapes (or other media) are stored safely and securely. This chapter begins by discussing backup strategies and options and then turns to the tools that are available for making them.
Planning a Backup Schedule
Developing an effective backup strategy is an ongoing process. You usually inherit something when you take over existing systems, and start out doing the same thing you've always done when you become responsible for new systems. While this may work for a while, such an approach all too often ends up in chaos, with no viable policy ever replacing the outdated one. The time to develop a good backup strategy is right now, starting from however you are approaching things at the moment.
Ultimately, backups are insurance. They represent time expended in an effort to prevent future losses. The time required for any backup schedule must be weighed against the decrease in productivity, product schedule slippage, and so on if the files are needed but are not available. The overall requirement of any backup plan is that it be able to restore the entire computing core within an acceptable amount of time in the event of a large-scale failure, while at the same time not sacrificing too much in the way of convenience: either in what it takes to get the backups done, or in how easy it is to restore one or two files when a user deletes them accidentally. The approaches one might take in the abstract when considering just disaster recovery or just day-to-day convenience in isolation are often very different, and the final backup plan will need to take both of them into account (and will accordingly reflect the tension between them).
There are many factors to consider in developing a backup plan. The following questions are among the most important:
What files need to be backed up? The simplest answer is, of course, everything. But while everything but scratch files and directories needs to be saved somewhere, it doesn't all have to be saved as part of the formal backup procedures. For example, since the Windows NT operating system is delivered on CD-ROM, there may not be a pressing need to back up the system files (although you may choose to do so anyway for reasons of convenience). However, it is a good idea to be cautious and to err on the side of backing up too many files. If you overlook something important, chances are it will be the first file to be lost.
Where are these files? This question involves both considering where the important files are and identifying the systems that hold important data. The type of system on which the important data reside also needs to be taken into account (we'll consider developing a backup plan for a heterogeneous network in a bit).
Who will back up the files? The answer may depend on where the files are. For example, many sites assign the backup responsibility for server systems to the system administrator, but make users responsible for files that they keep on their workstation's local disks. This may or may not be a good idea, depending on whether all of the important files really get backed up under such a scheme.
What resources are available for performing backups? Both the number and characteristics of available output devices, such as their media capacity and write speed, and the specific software packages that are present on the systems in question are important factors in developing an effective backup plan.
Where, when, and under what conditions should backups be performed? This refers to the computer system on which the backup will be performed, which obviously need not be the same as the system where the files are physically located. Similarly, in an ideal world, all backups would be performed after hours on idle filesystems. That's not always practical in the real world, however.
How often do these files change? This information will help you decide both when and how often to perform backups and the type of schedule to implement. For example, if your system supports a large ongoing development project, then the files on it are likely to change frequently and will need to be backed up at least daily and probably after hours. On the other hand, if the only volatile file on a system is a large database, its filesystem might need to be backed up several times every day, while the other filesystems on the same system would be backed up only once a week.
Backup facilities designed for the Windows NT environment generally provide these different types of backups:
Normal (full): All of the files within some predefined set are copied to tape (or other media), and they are all marked as backed up. (Being marked as backed up means the file's archive attribute is cleared; modifying the file sets the attribute again, flagging it for the next backup.)
Copy: All of the files within the set are copied, but none of them are marked as backed up.
Differential: All files in the set that have been modified since the most recent full backup are copied, but they are not marked as backed up.
Incremental: All files in the set that have been modified since the most recent backup of any type are copied and marked as backed up.
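The relationship between the four backup types and the archive attribute can be sketched in a few lines of code. This is an illustrative model only, not NT's actual backup API: a per-file flag stands in for the archive attribute, and the sketch assumes a schedule that does not mix differential and incremental backups (so "modified since the last full backup" and "archive attribute set" coincide for a differential).

```python
class File:
    """Toy stand-in for a file with an NT-style archive attribute."""
    def __init__(self, name):
        self.name = name
        self.archive = True   # set when a file is created or modified

def backup(files, kind):
    """Return the names of files a backup of the given kind would copy."""
    if kind in ("full", "copy"):
        selected = list(files)                       # everything in the set
    else:                                            # differential/incremental
        selected = [f for f in files if f.archive]   # only modified files
    if kind in ("full", "incremental"):              # these mark files as backed up
        for f in selected:
            f.archive = False
    return [f.name for f in selected]
```

A copy backup behaves like a full backup that leaves the archive attributes alone, which is why it can be run mid-schedule without disturbing an existing full-plus-incremental rotation.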
How do differential and incremental backups differ? Suppose you perform a full backup on Monday. On Tuesday, both types of backup will include any files that have changed since Monday. On Wednesday, however, a second differential backup will include all files changed since Monday while a second incremental backup will include only those changed since Tuesday.
Incremental backups mark the saved files as backed up while differential backups do not. Thus, an incremental backup and a differential backup are identical when run as the first backup following a full backup. Differential backups inevitably increase in size over time while the size of successive incremental backups depends solely on the volatility of the filesystem...
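The Monday/Tuesday/Wednesday example can be made concrete with a small simulation. The file names and modification days here are invented purely for illustration; the point is that a differential backup always measures from the most recent full backup, while an incremental measures from the most recent backup of any type.

```python
# Day on which each (hypothetical) file was last modified; the full
# backup runs on Monday.
changes = {"a.doc": "Mon", "b.doc": "Tue", "c.doc": "Wed"}
order = ["Mon", "Tue", "Wed"]

def changed_since(day):
    """Files modified strictly after the given day."""
    return sorted(f for f, d in changes.items()
                  if order.index(d) > order.index(day))

# Wednesday's differential: everything changed since Monday's full backup.
wed_differential = changed_since("Mon")   # b.doc and c.doc
# Wednesday's incremental: only what changed since Tuesday's incremental.
wed_incremental = changed_since("Tue")    # c.doc only
```

This is also why restores differ: a full-plus-differential scheme needs at most two backup sets to restore a disk, while a full-plus-incremental scheme needs the full backup plus every incremental since.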