Mastering Windows 2000 Server





Mark Minasi, the world's #1 Windows NT authority, updates his #1 bestselling book for Windows 2000. Every system administrator needs this book! This high-level, irreverent yet readable explanation of Windows 2000 Server provides the best discussion of the architecture, features, and new utilities of the new OS in print. You'll learn from the expert: the differences between NT and 2000; the advantages (and potential pitfalls) of the new OS; scores of undocumented secrets, tips, tricks, and workarounds; practical, hands-on advice for installing and using Windows 2000 in an enterprise environment; and hundreds of useful and essential nuggets of information that will save you hundreds or thousands of dollars in calls to Microsoft's help desk.

Editorial Reviews

From Barnes & Noble
The Barnes & Noble Review
When it comes to Mark Minasi's Windows 2000 books, stories of near-mythical quality abound: Fast-food deliverers read them and are suddenly transformed into savvy enterprise network administrators; newbies crack the covers on Friday night, and by Monday morning they're setting up Active Directory and DNS without breaking a sweat. We can't vouch for those stories, but we can tell you this. With the latest edition of his Windows 2000 book, Minasi's outdone himself.

Mastering Windows 2000 Server, Third Edition has matured to reflect the best of the experience that can only be gained through hundreds of "live" Windows 2000 Server deployments. Minasi, who teaches Win2K nationwide, has gathered wisdom wherever he's traveled, and integrated it into this book, which was exceptional even in its first edition.

For example, this new edition includes extensive new coverage of Microsoft's poorly documented tools for streamlining Windows 2000 rollouts, RIS and Sysprep. Since you really need an in-depth understanding of DNS to make the most of Windows 2000 Server, Minasi has substantially expanded the book's DNS coverage.

Once you really understand Windows 2000 DNS, you're likely to be more willing to consider Active Directory -- and you'll appreciate this edition's improvements here, too: new coverage of sites, replication, operations masters, and above all, deeper coverage of migration. When should you implement multiple domains? When should a single domain be divided into organizational units? How come AD was working fine on Friday night, and on Monday morning it's dead as a doornail?

From user accounts to storage, Terminal Services to Macintosh clients, Minasi's coverage is practical, realistic -- and a pleasure to read. Much of what's in here is utterly indispensable -- for example, the 90-page chapter on preparing for and recovering from server failures, and the shorter chapter on scaling Windows 2000 for the enterprise. Minasi's been there, done that, and boy has he ever written about it: 1,800 pages, none of them wasted.

--Bill Camarda is a consultant and writer with nearly 20 years' experience in helping technology companies deploy and market advanced software, computing, and networking products and services. His 15 books include Special Edition Using Word 2000 and Upgrading & Fixing Networks for Dummies, Second Edition.


Product Details

  • ISBN-13: 9780782127744
  • Publisher: John Wiley & Sons, Incorporated
  • Publication date: 2/28/2000
  • Series: Mastering Series
  • Edition description: 2ND
  • Edition number: 2
  • Pages: 1632
  • Product dimensions: 7.82 (w) x 9.36 (h) x 3.03 (d)

Meet the Author


How did you get started in the computer industry?

Well, I can't dance. (Sorry, couldn't resist.) When I was a teen in the late '60s and early '70s, I was pretty math-smart but resented the drudgery of calculating things; I lusted after time on a computer, but of course only the Big Rich Schools had computer time. Also, the idea of not having to type papers in college, and having the computer do it with no typos, sounded pretty neat - "the backspace key tells no tales," as it were. (I'm a fast typist but not a very accurate one.) Finally, let's tell the truth, there were some great games on the mainframe - Star Trek, Adventure, that sort of thing - that sounded like a lot of fun, and I wasn't getting any of it!

In my senior year in high school, I got to go to a brief class on computers at the local university, the State University of New York at Stony Brook. Dr. Lud Braun showed a couple dozen of us how a PDP-8 worked or, rather, how a 110 bps teletype ATTACHED to a PDP-8 worked. He taught us a little BASIC and asked for a volunteer to write a short program. I volunteered and wrote my first program - with an audience! - to calculate Olympic averages - you know, take a bunch of numbers, drop the top and bottom value, and average the remaining ones. It worked the first time, a feat I have seldom repeated, and I was hooked. That was 1973. In the following year, I entered college and scientific calculators came onto the market; soon after, they became programmable. Word got around about a single-board computer, a "micro" computer that you could solder together to do your own computing without a mainframe. I got my first - an RCA 1802 COSMAC-based single board - working by 1976.

In graduate school, I pursued degrees in economics, policy analysis and management despite an aptitude in computers because I reasoned that computers were merely a tool, and how could one make a living simply knowing computers? (Remember that the next time I predict the future.) Out of grad school, however, I found that knowing computers was indeed of value, and in 1980 I moved to Washington, DC, and worked in a job with the official title of "economist" but mainly I wrote FORTRAN programs to simulate economic models. In 1982 I started a four-year stint at Brookhaven National Laboratory working for Andy Kydes in DC. BNL was a great experience for a number of reasons, not the least of which was that I became the unofficial "PC support" guy, and got the chance to learn a lot about the then-new IBM entrant into the microcomputer market. Andy shared my enthusiasm for the power of small desktop computers, and without his support I probably couldn't have learned as much as I did in those days.

How did you get started as a computer book writer?

Around 1983 I was asked to give a lecture to our local PC group about how computer graphics works. Over 300 people showed up and they loved it. Somebody came up to me afterward and said "Hey, you're great at this, do you do this for a living?" I thought about it for a second and replied, "No, but I could." I started taking jobs on the side teaching PC classes and found that I enjoyed it tremendously. Eventually I started writing course manuals when I couldn't find books that covered what I needed. Those course manuals got a lot of "reviewing," so to speak - thousands of students read them and offered feedback - and one day I realized that I had a pretty good book on my hands: Practical Techniques of Artificial Intelligence Programming Using Turbo Pascal. Sadly, no one else thought so, but it was a start and not a complete loss at all, as Miller-Freeman heard about the text and let me cut it up into about 60 columns, which became a regular feature called "AI Apprentice" in their AI Expert magazine.

In 1985, I came up with a new course, a course on PC repair for non-technical people. It "shipped platinum," filling hotel meeting rooms from coast to coast. That gave me the chance to write and rewrite it, leading to another fairly good text. For me, that seems to be the key to good writing - write and review and revise, then do it again several times. I'm not a very fast writer as a result. So, again, a great book (in my opinion, at least) -- but could I get a publisher interested?

In 1988, I was lecturing in Boston about OS/2 and, at the end of the day, a participant came up to me and handed me his card - Fred Langa, editor in chief of BYTE magazine. Fred asked me to do a monthly column for BYTE on OS/2, for which I'm very grateful. That led to a contact at Que, who asked me to contribute to a book on OS/2 1.1. I can't say it was a socko smash success, but it was nice to have my name on a book spine. Another BYTE contact led me to Stephen Levy, the publisher at Compute Press. (Not the "Hackers" Stephen Levy, different guy - the Compute Levy returns e-mail; the "Hackers" Levy doesn't. Can't wait until I'm important enough to be able to ignore readers.) Steve bought the PC troubleshooting book.

I met Sybex's Dianne King in 1989 through my agent at the time, Jeff Herman. Dianne bought my hard disk repair book and wondered if there was any way we could do a PC repair book. I got her in touch with Steve, who'd told me after lackluster sales of the Compute PC repair book that it really wasn't their market, and Steve graciously sold the book to Sybex. Since then - we're now working on the 11th edition - it's been a perennial best-seller.

At the same time, I was teaching networking classes related to my OS/2 and Windows interests. In 1993, Microsoft released a competitor to Novell's NetWare called "NT Advanced Server 3.1." I put it on a system with the full intention of watching it crash in interesting and funny ways. But it didn't. In fact, it was pretty good - not as good as Novell, but in the ballpark - and I suspected that if it came down to a battle of marketing power rather than simply a battle of technological quality, Microsoft would win, hands down. So I put together a class on it, developing a large course book in the process. After teaching from it for a few months, I approached Gary Masters at Sybex about printing it. The book has sold wonderfully well and readers have been very kind in telling me that it's been quite useful to them. Many have written to tell me that they've used Mastering Windows NT Server 4.0 so much that the book's fallen apart and they need to buy another. There's no higher praise for an author.

Is writing a full-time job for you? If not, what's your "real" job?

Writing is part of my job. I try to take complex topics and make them understandable to normal humans, and different people need to learn in different ways. Some folks like multi-day classes, others short lectures. Still others prefer videotapes, and many see books and magazines as the best vehicle. It's a tragedy that so many of us get jobs that force us to rely on computers and yet the programs that we run can be so frustrating. Everyone has the right to be able to feel like an expert at their job; I hope with my books to help people get to that expertise.

What do you think makes a good computer book?

Several things. First, good structure - a table of contents should have a "story," a flow. Second, examples that someone has actually TRIED out, rather than the sadly common case where someone on deadline just makes something up and hopes that it'll actually work. Third, a mix of theory and practice. I don't like cookbooks that just say "to solve THIS specific problem, do THIS." I'd rather first understand the overall idea, and then see a specific example. Fourth, to paraphrase Einstein, books should be as short as they absolutely need to be - but no shorter. We have a tendency to create huge books just to create huge books. (On the other hand, I realize that now that the NT Server book is over 1700 pages I may be sounding hypocritical, but I'll plead in my defense that the book's in its seventh edition, and the topic of NT networking is big, really big.)

In your opinion, how does your book Mastering Windows NT Server 4.0 differ from other books on the same technologies?

I honestly haven't read the other books, so I can't say. I learn NT by consulting, from student and reader feedback, and from painful experience.

What did you find to be the most challenging aspect of writing your books Mastering Windows NT Server 4.0 and Mastering Windows 2000 Server?

Not spending my whole life twiddling with them. I'm constantly learning new things about NT or, now, Windows 2000, and I have the constant urge to go back and rewrite some section of the book. The Windows 2000 book was very, very challenging because my team of writers and I were trying to hit a moving target: will it ship in 1998? Early 1999? Late 1999? If we'd known in 1997 that this wouldn't ship until 2000, I think we could have gotten the book done with less effort and late nights. But I'm happy it's finally out and look forward to hearing more reader feedback - all of the letters so far have been very positive.

What impact do you expect Windows 2000 Server to have on the networking market?

Well, now Microsoft has come out with a networking OS that's almost enterprise-quality. Just as Netware 4.0 wasn't yet ready for prime time, I think people will find that Windows 2000 has its quirks. But in time it'll be a significant part of the market. My guess, however, is that most folks will be a bit leery about moving their enterprise over to a "1.0" product too quickly. People didn't really trust NT 4.0 until Service Pack 3 arrived. I wouldn't be surprised if Windows 2000 didn't see wide acceptance until late 2001.

Having said that, however, I think that nearly every firm will be interested in Windows 2000 and will have a pilot project set up to check it out. And many people will be flocking to training for Windows 2000 and buying books; I hope some of them will buy mine.

What do you enjoy doing in your spare time?

Lots of things. I live on a big old farm in the middle of nowhere in southeastern Virginia, a village named Pungo distinguished mainly by its sole traffic light. It's great to watch the wildlife around here - we get plagues of tree frogs in the summer, beautiful snakes and lizards, bald eagles, foxes, herons, and more. My web site has some pictures of the things that I've come across - it's not great photography, but I get a lot of enjoyment out of it. I also do amateur astronomy. We don't have the darkest or clearest skies, but when it's clear, it's glorious.

If money weren't an issue, what job would you like to have?

I have spent much of my adult life creating the job that I have right now. I always wanted to be the author of a book or books that people truly found useful, and readers tell me that I've accomplished that. I love entertaining crowds and now I do keynote addresses to technical crowds of up to 2,000 people - if I can make them laugh and pass along a bit of knowledge or two, then I've succeeded.

I wish I had more time for my wildlife and astronomy interests, but devoting that time would take from noodling around with computers and I'd miss that greatly, so I guess all's in balance right now.

What are your thoughts about the computer industry and where it's heading?

In one word: reliability. It's time. Bob Muglia of Microsoft told me in an interview in 1992, "if you ever have to reboot NT after you've gotten it configured right, we've failed." That was eight years ago, and many firms have learned that "a reboot a week stops the ole' memory leak." Firms like Microsoft need to focus less on adding new features and focus more on fixing bugs and raising the bar on acceptable defect - yes, that's defect, let's stop saying "bug" - levels in commercial software. I talk more about this in my book The Software Conspiracy.

If Microsoft doesn't make quality a priority, they may start hearing "no thanks, we've switched to Linux" - more and more often.


Read an Excerpt

Chapter 1: Windows 2000 Server Overview

After years of talk about "Cairo" (the Microsoft code name for their "ultimate" server software) and even more years of work, Microsoft has finally shipped Windows 2000. After training us to expect roughly annual releases of new versions of NT-NT 3.1 shipped in 1993, 3.5 in 1994, 3.51 in 1995, and 4 in 1996-NT 5 finally arrived, but it was considerably later than a year after the release of NT 4. Furthermore, NT 5 arrived with a new name: Windows 2000. But the name's not all that's new.

So what took so long? Was it worth the wait? For many, the answer will be "yes." Much of NT's foundation-the internal kernel structure, how drivers are designed, how Windows 2000 multitasks-hasn't changed all that terribly much from NT 4, but network professionals really don't see that part of NT. Instead, we network types will notice that the above-ground structures, the tools built atop the foundation, are so different as to render Windows 2000 Server almost unrecognizable as a descendant of NT 3.x and 4.x. For comparison's sake, and to extend the structural metaphor, think of using Windows NT 3.1 Advanced Server as renting a room in someone's basement, using NT 4 as renting a 2-bedroom apartment, and using Windows 2000 Server as living in Bill Gates's new mansion on Lake Washington: more rooms than anyone can count all filled with new and wonderful electronic gadgets.

In the mansion, many of the things that you know from the basement room are unchanged-the electricity comes out of sockets in the wall, the pipes are copper or PVC, bathrooms have sinks and commodes in them-but there's so much more of it all, as well as so many new things, both useful ("Hey, cool, a garden, and automatic sprinklers for it!") and of debatable value ("What does this bidet thing do, anyway?"). That's not to say that NT's underpinnings will never change, not at all-the next (and still-unnamed) version of NT will go a step further, digging up NT's 32-bit foundation and replacing it with a 64-bit one.

The main point, however, is this: If you're an NT network administrator, be prepared for culture shock. The difference between NT 4 and Windows 2000 is at least 10 times as great as the difference between NT 3.1 and NT 4. And if you've never worked with NT in any flavor, be prepared to find Windows 2000 both delightful and frustrating-as is the case with most Microsoft software.

It would be somewhat shortsighted of me to simply say, "Here are the new features you'll find in Windows 2000," and then to just dump the features-it sort of misses the forest for the trees. So let me start off by briefly discussing the big picture and what Microsoft's trying to accomplish; then I'll move along to those new features and, finally, take a look at a few of Windows 2000's shortcomings.

Microsoft's Overall Goals for Windows 2000

The changes in Windows 2000 from NT 4 are quite significant, but they were long in coming. What was the wait all about?

Make NT an Enterprise OS

Microsoft wants your company to shut off its mainframes and do your firm's work on big servers running NT. That's why there is a version of Windows 2000 Server called Datacenter Server. Microsoft is also hoping that "enterprise" customers will exploit new Windows 2000 Server facilities such as Active Directory and Microsoft Application Server (nee MTS) and COM+ to write gobs of new and hardware-hungry distributed applications. Before they can accomplish that, however, they need to clear three hurdles: reliability, availability, and scalability.

NT Must Be More Reliable
Since their appearance in the late '70s, microcomputer-based network operating systems have been seen as fundamentally different from "big-system" OSes like IBM's MVS and OS/400, Compaq's OpenVMS, and the myriad flavors of Unix. PC-based network operating systems weren't exactly seen as toys, but neither were they seen as something that one would base one's business on, if one's business was truly critical. For example, it's hard to imagine the New York Stock Exchange announcing that they'd decided to get rid of their current trading system and to replace it with a NetWare 4.1 or NT 4-based client-server system. PC-based stuff just wasn't (and largely still isn't) seen as sufficiently reliable yet to take on the big guys.

Nor is that an unfair assessment. Most of us would be a bit uncomfortable about discovering in midflight that the state-of-the-art airliner taking us across the Pacific was run by NT, or that the Social Security Administration had decided to dump their old mainframe-based software in favor of a Lotus Notes-based system running atop NT. Years ago, many firms discovered that NT servers crashed far less often if rebooted weekly; it's hard to imagine running a heart-and-lung machine on something like that. But Microsoft wants to shed that image. They want very much to build an OS that is sufficiently industrial-strength in reliability so that one day it wouldn't be silly to suggest that AT&T's long distance network could run atop some future version of NT, Windows 2000-something. With Windows 2000, Microsoft believes that they've taken some steps in that direction.

NT Must Be More Available
A server being rebooted to change some parameters is just as down as one that is being rebooted after a Blue Screen Of Death, the symptom of a system crash that is all too familiar to NT 4 veterans. Many Windows 2000 parameters can be changed without a reboot where a change to the corresponding parameter in Windows NT 4 would require one. Unfortunately, as we will see, some of the most common parameter changes still require a reboot.

NT Must Be Able to "Scale" to Use Big Computers
Reliability's not the only big-network issue that Microsoft faces. The other one is the limit on the raw power that NT can use-to use a word that the PC industry created a few years ago, NT must be more scalable.

Being an "enterprise" operating system requires two different kinds of scalability which are somewhat at odds with each other: performance scalability and administrative scalability. The first asks, "If I need to do more work with NT, can I just run it on a bigger computer?" The second asks, "If I need to support more users/computers/gigabytes of hard disk/etc., can I do it without hiring more administrators?"

Performance Scalability CPUs are simply not getting all that much faster in terms of the things they can do. To create faster or higher-capacity computers, then, computer manufacturers have been putting more and more CPUs into a box. And while NT has in theory been designed to use up to 32 processors since its first incarnation, in reality, very few people have been able to get any use out of more than 4 processors. With Windows 2000, Microsoft claims to have improved the scalability of NT-although I've not yet heard anyone say with a straight face that Windows 2000 will "run like a top" on a 32-processor system.

Besides the ability to use a larger number of CPUs, there were internal restrictions within Windows NT, such as the number of users that a SAM database would allow, that simply had to go. With Active Directory, many restrictions, including this one, have been removed.

The three versions of Server support different numbers of CPUs: Windows 2000 Server supports 4 processors, Windows 2000 Advanced Server supports 8, and Windows 2000 Datacenter Server supports 32.

NOTE Oh, and if you're looking in your Webster's for a definition of scalability, don't bother; it's not a real word. Microsoft made it up a few years ago. Basically, scalable roughly means, "As the job's demands grow, you can meet them by throwing in more hardware-processors and memory-and the system will meet the needs." It's become an issue because, while NT has theoretically supported 32 processors since its inception, much of the basic NT operating system itself can't use many processors-for example, adding a ninth processor to an eight-processor domain controller won't produce any faster logins. That's also true of NT programs; depending on whom you ask, SQL Server maxes out at four or eight processors. Beyond that, adding more processors does nothing more than run up the electric bill.
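The diminishing return the note describes - a ninth processor buying you nothing - is captured by Amdahl's law, a standard result the book doesn't cite but which makes the arithmetic concrete: if only a fraction p of a workload can run in parallel, n processors can speed it up by at most 1/((1-p) + p/n). A quick sketch (my illustration, not the author's):

```python
def amdahl_speedup(parallel_fraction: float, n_cpus: int) -> float:
    """Amdahl's law: the best-case speedup when only part of a
    workload can be spread across extra processors."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cpus)

# A workload that is 80% parallelizable barely benefits past a few CPUs:
for n in (1, 4, 8, 32):
    print(f"{n:2d} CPUs -> {amdahl_speedup(0.8, n):.2f}x")
# 32 CPUs yield under a 4.5x speedup; the serial 20% dominates.
```

With an 80-percent-parallel workload, 4 CPUs give 2.5x but 32 CPUs still give less than 4.5x - which is exactly why adding processors to a domain controller whose login path is largely serial just runs up the electric bill.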

Administrative Scalability/Manageability Large enterprises do not like to add headcount in their core business areas, much less just to administer Windows NT. Windows 2000 Server contains a number of facilities such as IntelliMirror, designed to allow customers to support more users running with more complex desktop environments with fewer support personnel. Microsoft typically refers to this area as "Manageability," though I think "Administrative Scalability" better captures the flavor of the topic.

In this area, one of the most important additions to Windows 2000 is its support for both issuing and honoring digital certificates in place of userids and passwords for identification and authentication. The overall system needed to manage the life cycles of digital certificates and verify their authenticity and current validity is called Public Key Infrastructure (PKI). PKI-based security is both more secure and vastly more administratively scalable than userid+password-based security, but it is also much, much more technically complex.

Modernize NT

Three years can be an awfully long time in the computer business. The years since 1996 have seen the emergence of Universal Serial Bus, IEEE 1394, Fiber Channel, and 3-D video cards, just to name a few areas of technological growth, as well as the introduction of hundreds of new network cards, video boards, sound cards, SCSI host adapters, and so on. A new crop of network-aware PCs has appeared, PCs that understand networking right in their BIOSes and that are designed to be taken straight out of the box without anything on their hard drives, plugged into the network, and started up from the network rather than from any on-disk software. And on a more mundane note, nearly every PC sold in the past five years supports a hardware system called Plug and Play (PnP).

NT supports none of these things right out of the box. Some of these devices can be made to work, but some can't. Hardware support has always been something of an afterthought in NT, and it's amazing that Microsoft shipped NT 4 without any Plug-and-Play support, save an undocumented driver that could sometimes make a PnP ISA board work but that more commonly simply rendered a system unusable. NT 4's offhand support of PC Card laptops and its near-complete lack of support for CardBus slots forced many an NT-centric shop to put NT Server on their servers, NT Workstation on their corporate desktop, and Windows 95 on their laptops. One of Windows 2000's goals, then-and an essential one-is to support the new types of hardware and greatly improve the way that it works on laptops.

Make NT Easier to Support

The past 10 years have seen the rise of the graphical user interface (GUI), which brought a basically uniform "look and feel" to PC applications and made learning a PC application and PCs in general so much easier for users. We've seen programming tools go from some very simple development environments that crashed more often than they worked to today's very stable 32-bit suite of programming tools, making it possible for developers to create large and powerful 32-bit applications. Users and developers are better off-sounds good, doesn't it?

Well, it is, for them. But many of us fall into a third category: support staff. And while some things have gotten better-the graphical nature of many of NT's administrative tools helped get many new admins started on a networking career-the actual job of support hasn't gotten any easier. Consider this: Would you rather rebuild a CONFIG.SYS file to stitch back together a damaged DOS machine from memory, or would you prefer to pick through a broken Registry trying to figure out what's ailing it?

Microsoft's competition knew that support was the Achilles' heel of both Windows and NT, and so in the mid-'90s, Sun and others began extolling the importance of considering the Total Cost of Ownership (TCO) of any desktop system. It wasn't hard to make the argument that the biggest cost of putting Windows on a desktop isn't the hardware or the software-it's the staff hours required to get it up and keep it running. With Windows 2000, Microsoft starts to reduce desktop TCO. A group of Windows 2000 improvements called Change and Configuration Management tools makes life easier for support folks and network administrators in general.

Specific New Capabilities and Features

So much for the good intentions. What about the new goodies?

Microsoft lists pages and pages of enhancements to Windows 2000-the PR people have, after all, had over three years to cook up those lists. I'm sure they're all of value to someone, but here are the things that I find most valuable in Windows 2000, arranged according to my three earlier categories-making NT more enterprise ready, modernizing NT, and improving its administrative tools/lowering TCO.

Making Windows 2000/NT More "Enterprising"

Several functions help push NT's latest incarnation to a place in the big leagues. In particular, the most significant "big network" changes to NT include:
  • Active Directory
  • Improved TCP/IP-based networking infrastructure
  • More scalable security infrastructure options
  • More powerful file sharing with the Distributed File System and the File Replication Service
  • Freedom from drive letters with junction points and mountable drives
  • More flexible online storage via the Removable Storage Manager

Active Directory

The crown jewel of Windows 2000, Active Directory is also the single most pervasive piece of the OS. Many of the things you'll read about in this book, many of the compelling features of Windows 2000, simply cannot function without Active Directory. Group policies, domain trees and forests, centralized deployment of applications, and the best features of the Distributed File System (to name a few) will not operate until you've got a system acting as an Active Directory server.

NOTE The whys and wherefores of Active Directory are complex enough that they'll get a chapter all their own. In Chapter 2, you'll read about what Active Directory is trying to accomplish, how it does so, and how you can best design the Active Directory for your enterprise.

Network Infrastructure Improvements

Anyone building an NT-based network around the TCP/IP protocol needed three important infrastructure tools:

  • The Windows Internet Name Service (WINS), which helped Windows 2000-and NT-based servers and workstations locate domain controllers (which handled logins and authentication in general) as well as file and print servers.
  • The Dynamic Host Configuration Protocol (DHCP), which simplified and centralized the once-onerous task of configuring TCP/IP on workstations.
  • The Domain Name System (DNS), which did the same kind of job as WINS-it kept track of names and addresses-but instead of helping workstations locate domain controllers and file/print servers, DNS helped programs like Web browsers and e-mail clients find Web and mail servers.

Some firms have avoided moving their networks to TCP/IP, staying instead with IPX (a protocol that owes its popularity to Novell's networking products) or NetBEUI (the main protocol for Microsoft networking prior to 1995). But with Windows 2000, pretty much everyone should be using TCP/IP, making DHCP, WINS, and DNS essential parts of any Windows 2000-based network.


Why did NT have two services-WINS and DNS-that kept track of names? It was the result of a questionable choice that Microsoft made back in 1994. Of the two, WINS was the more troublesome and, for some networks, unfortunately the more vital. Thus, it was excellent news to many people when Microsoft announced that Windows 2000 would be the end of WINS.

Reports of its death, however, turned out to be greatly exaggerated. The actual story is that, if you have a network that is 100-percent Windows 2000, both on the workstation and server, then yes, you can stop using WINS. But most of us won't have that for years, so Windows 2000 still has a WINS service. Thankfully, it's greatly improved; one expert commented to me that it's ironic that Microsoft finally "fixed" WINS, just as they were about to kill it. Chapter 18 shows you how to set it up and make it work.


DNS was something of a sidelight under NT 4 as NT didn't really need DNS-DNS's main value was to assist Internet-oriented programs like Web, FTP, and POP3/SMTP mail clients in finding their corresponding servers. Under Windows 2000, however, DNS takes center stage. Without it, Active Directory won't work.

NT 4's DNS server was a pleasure to work with, although that's just my opinion: I've spoken with people who tell me that it couldn't handle high-volume loads. I never had any bad experiences with it myself, so I can't comment. NT 4's DNS wrapped a well-designed GUI around a standard DNS implementation, making basic DNS tasks simpler than they were for a Unix DNS implementation at the time. Windows 2000 takes that a step further with improved wizards. First-time DNS administrators will find that Windows 2000's DNS server does almost all the hand-holding you could need.

Additionally, Windows 2000's DNS supports dynamic updates, a process wherein adding information about new machines to a DNS database can be automated. Based on the Internet standard document RFC 2136 (the Internet's standards are described in documents called Request for Comments, or RFCs), it combines the best of NT 4's WINS and DNS servers. The DNS server also supports another Internet standard, RFC 2052, which greatly expands the kind of information that DNS servers can hold onto. For example, a pre-2052 DNS server could tell you what machines acted as mail servers for a given Internet domain, but not which machines were Web or FTP servers. 2052-compliant DNS servers can do that, and more: Active Directory now uses RFC 2052 to allow DNS to help workstations find domain controllers and other Active Directory-specific server types.
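The idea behind those RFC 2052-style service records can be sketched in a few lines of Python. This is only a toy model of the lookup, not how any real DNS server stores its data, and the record names and hosts below are hypothetical:

```python
# Sketch of the idea behind RFC 2052-style DNS service records: the database
# maps a service/protocol/domain name to one or more hosts, so a client can
# ask "who offers LDAP for example.com?" All names here are invented.

srv_records = {
    "_ldap._tcp.example.com": ["dc1.example.com", "dc2.example.com"],
    "_http._tcp.example.com": ["www.example.com"],
    "_ftp._tcp.example.com":  ["ftp.example.com"],
}

def find_servers(service, protocol, domain):
    """Return the hosts offering a given service in a domain, or []."""
    return srv_records.get(f"_{service}._{protocol}.{domain}", [])

print(find_servers("ldap", "tcp", "example.com"))
# → ['dc1.example.com', 'dc2.example.com']
```

A pre-2052 server could answer only a fixed set of such questions (mail servers, mostly); the service-record scheme makes the question itself extensible, which is what lets Active Directory publish its domain controllers through DNS.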

NOTE Chapter 18 covers how Active Directory uses RFC 2052 in more detail.

DHCP

DHCP frees network administrators from having to walk around and visit every single desktop in order to configure the TCP/IP protocol. The basic idea is that a workstation broadcasts over the network, seeking an IP address (every computer on an intranet must have a unique IP address); a DHCP server hears the plea and assigns that computer its own unique IP address.

The End of Rogue DHCP Servers This is in general great, but now and then some dodo would decide to "practice" with DHCP by setting up a DHCP server on some PC. The budding administrator's new DHCP server would then start handing out completely bogus addresses to unsuspecting workstations. Those workstations would then have IP addresses, but worthless ones, and as a result they would be unable to function on the company's network.

With Windows 2000, however, not just anyone can create a DHCP server. Now, DHCP servers must be authorized in the Active Directory before they're allowed to start handing out addresses. This is a great advance, the end of what we used to call "rogue" DHCP servers.
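Both ideas, address assignment and the new authorization check, can be sketched in a toy Python model. The server names, MAC addresses, and address pool below are invented, and real DHCP is of course a wire protocol between machines, not an in-process call:

```python
# Toy sketch of DHCP address assignment plus a Windows 2000-style
# authorization check. All names and addresses are hypothetical.
import ipaddress

# Stand-in for the list of DHCP servers authorized in Active Directory.
AUTHORIZED_SERVERS = {"dhcp1.example.com"}

class DhcpServer:
    def __init__(self, name, pool_start, pool_size):
        self.name = name
        start = ipaddress.IPv4Address(pool_start)
        self.free = [str(start + i) for i in range(pool_size)]
        self.leases = {}                     # client MAC -> leased IP address

    def handle_discover(self, mac):
        # An unauthorized ("rogue") server must refuse to hand out addresses.
        if self.name not in AUTHORIZED_SERVERS:
            return None
        if mac not in self.leases:           # reuse an existing lease if any
            self.leases[mac] = self.free.pop(0)
        return self.leases[mac]

good = DhcpServer("dhcp1.example.com", "10.0.0.10", 5)
rogue = DhcpServer("lab-pc.example.com", "192.168.9.1", 5)
print(good.handle_discover("00:11:22:33:44:55"))   # → 10.0.0.10
print(rogue.handle_discover("00:11:22:33:44:55"))  # → None: not authorized
```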

DHCP Works with DNS to Register Clients You read before that the new DNS supports dynamic updates, a process standardized in RFC 2136 whereby the DNS server will automatically collect address information about machines on the network. This is an improvement over NT 4's DNS server because that DNS server couldn't automatically collect DNS information about machines-you, the administrator, had to type the names and IP addresses of new machines into the DNS Manager administration tool.

Windows 2000's DNS server collects its information about machines on the network with the help of those machines themselves. When a machine starts up, one of the things it does while booting-one of the reasons that booting modern PCs takes so long-is contact the DNS server to tell it that the machine exists. In effect, each workstation and server on the network must know to register itself with the DNS server.

Unfortunately, as RFC 2136 is a fairly recent development in the DNS world, most existing operating systems-DOS, Windows for Workgroups, Windows 9x, NT 3.x, and 4.x-do not know to register themselves with a DNS server. That's where Windows 2000's DHCP server helps out. You can optionally tell the DHCP server to handle the DNS registrations for non-2136-aware workstations. This is a very useful new feature because, without it, dynamic updates wouldn't be worth much except for the rare firm that runs solely Windows 2000 on its desktops, laptops, and servers.

NOTE You can read more about DHCP in Chapter 18.
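The hand-off works like this: 2136-aware machines register themselves, and the DHCP server registers on behalf of the rest. Here is a toy sketch of that division of labor (all machine names and addresses are hypothetical, and real dynamic update is a DNS wire protocol, not a dictionary write):

```python
# Sketch of dynamic DNS registration: a machine that speaks RFC 2136-style
# dynamic update registers itself at boot; for older clients that don't,
# the DHCP server can do the registration for them. Names are invented.

dns_zone = {}                              # hostname -> IP address

class Client:
    def __init__(self, name, speaks_2136):
        self.name, self.speaks_2136 = name, speaks_2136

    def boot(self, ip):
        if self.speaks_2136:               # e.g. a Windows 2000 box
            dns_zone[self.name] = ip       # registers itself with DNS

def dhcp_assign(client, ip, register_legacy_clients=True):
    client.boot(ip)                        # client takes its address and boots
    if register_legacy_clients and not client.speaks_2136:
        dns_zone[client.name] = ip         # DHCP registers on the client's behalf

dhcp_assign(Client("w2kbox", True), "10.0.0.21")
dhcp_assign(Client("win95box", False), "10.0.0.22")
print(sorted(dns_zone))                    # → ['w2kbox', 'win95box']
```

With `register_legacy_clients` turned off (the optional behavior described above left disabled), the Windows 95 machine would simply never appear in the zone.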

Quality of Service
The Internet's underlying protocols, TCP/IP, have something of an egalitarian nature; when the Net's busy, it's first come, first served. But the protocols have always had a built-in capability that would theoretically allow an Internet operator to give greater priority to one user over another, to dial in a better response time for some than for others. That's called Quality of Service, or QoS. It was always there but rarely implemented, as it ran against the grain of the way the Net was run.

The growth of corporate intranets, however, changes that story. Network operators in corporate networks aren't serving a mass public; rather, they're serving a diverse and hierarchical organization whose leaders may well want to be able to say, "We direct that this individual get more bandwidth and faster access to network resources than this other individual." That's possible if you're using expensive Cisco routers-but now you can do it if you use Windows 2000 machines as your IP routers as well.
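The core of the QoS idea is a minimal departure from first come, first served: traffic carries a priority, and the higher-priority traffic goes out first. A toy sketch (traffic labels are invented; real QoS operates on packets inside routers, not Python objects):

```python
# Toy sketch of the QoS idea: rather than strict first-come-first-served,
# higher-priority traffic is transmitted first. Labels are hypothetical.
import heapq
import itertools

class QosQueue:
    def __init__(self):
        self._heap = []
        self._order = itertools.count()    # keeps FIFO order within a class

    def enqueue(self, priority, packet):
        # Lower number = higher priority.
        heapq.heappush(self._heap, (priority, next(self._order), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

q = QosQueue()
q.enqueue(2, "bulk-ftp-from-intern")
q.enqueue(0, "video-call-from-ceo")
q.enqueue(1, "mail-from-staff")
print([q.dequeue() for _ in range(3)])
# → ['video-call-from-ceo', 'mail-from-staff', 'bulk-ftp-from-intern']
```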

New Security Infrastructure

As one security expert once said to me, "We knew that NT had `made it' when hackers started targeting it." Hardly a month goes by without word of a new security hole in NT 4 and the hot fixes that are intended to plug that hole. Patch a plaster wall with Spackle enough and eventually you have to wonder if you've got a plaster wall or a Spackle wall-so Microsoft must have decided early on that one of the things that Windows 2000 couldn't live without was a new security system.

So they built two.

Originally, Windows 2000 was supposed to replace NT 4's authentication system, known as NTLM (for NT LAN Manager), with a system popular in the Unix world called Kerberos. Kerberos is well understood and works well in large-scale systems, assisting Microsoft in their "scalability" (there's that nonword again) goal. Partway through the Windows 2000 development process, Microsoft decided to supplement Kerberos with a third security system, a public key system based on the X.509 standard. They did that mainly because a public key system is considered far more scalable than either an NTLM or Kerberos system. Several companies offer hardware readers that allow users to log in by inserting credit card-sized devices called smart cards into the readers.

As a side effect, Kerberos and public key provide a feature that NT administrators have long asked for: transitive trust relationships.
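A transitive trust means that if domain A trusts domain B, and B trusts C, then A trusts C as well-something NT 4's one-way, non-transitive trusts never gave you. That chain can be computed as a closure over the direct trusts; the domain names below are hypothetical:

```python
# Sketch of transitive trust: compute every domain each domain can reach
# by chaining direct trust relationships. Domain names are invented.

def transitive_closure(trusts):
    """trusts: dict mapping a domain to the set of domains it directly trusts.
    Returns a dict mapping each domain to everything it trusts transitively."""
    closure = {d: set(t) for d, t in trusts.items()}
    changed = True
    while changed:                         # keep expanding until stable
        changed = False
        for d in closure:
            for t in list(closure[d]):
                new = closure.get(t, set()) - closure[d] - {d}
                if new:
                    closure[d] |= new
                    changed = True
    return closure

direct = {"europe": {"corp"}, "corp": {"asia"}, "asia": set()}
print(sorted(transitive_closure(direct)["europe"]))  # → ['asia', 'corp']
```

Under NT 4, an administrator had to create (and maintain) the europe-to-asia trust by hand; under a transitive scheme it simply follows from the two direct trusts.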

Distributed File System

NT's first and probably still most prevalent job is as a file server. And as time has gone on and versions have appeared, it's gotten better at it. Some benchmarks have rated it as fast or faster than NetWare, the guys to beat. And where NT 4's file server software was largely unable to deliver throughput faster than 90Mbps, Windows 2000 can transfer data almost 10 times faster.

Disconnecting Physical Locations from Names
But NT's file server system is hampered by the way it addresses shares on servers. A share named DATA on a server named WALLY would be accessed as \\WALLY\DATA.

Although that makes sense, it's limiting. Suppose the WALLY server goes up in a puff of smoke? We install a new server, perhaps named SALLY rather than WALLY, restore the data from WALLY, and re-create the DATA share. But now it's \\SALLY\DATA rather than \\WALLY\DATA, and configurations that are hardwired to look for and expect \\WALLY\DATA will fail. In other words, if a share's physical location changes, so must its "logical" location-its name. It'd be nice to be able to give a share a name that it could keep no matter what server it happened to be on.

Windows 2000 takes NT beyond that with the Distributed File System. In combination with Active Directory, Dfs-note the lowercase in the acronym; apparently someone already owned DFS when Microsoft started working on the Distributed File System-allows you to give all of your shares names like \\domainname\sharename rather than \\servername\sharename. You needn't know the name of the file server that the share is on.
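The essential trick is a level of indirection between a share's logical name and its physical home, which can be sketched in a few lines. The domain and server names are invented, and this ignores everything real Dfs does beyond the name mapping:

```python
# Sketch of the Dfs naming idea: clients ask for a logical \\domain\share
# name, and Dfs maps it onto whatever physical \\server\share currently
# holds the data. All names are hypothetical.

dfs_root = {r"\\BIGFIRM\DATA": r"\\WALLY\DATA"}

def resolve(logical):
    """Map a logical share name to its physical location."""
    return dfs_root.get(logical, logical)   # pass non-Dfs paths through

print(resolve(r"\\BIGFIRM\DATA"))           # → \\WALLY\DATA

# WALLY dies; repoint the logical name at its replacement.
# Clients keep using \\BIGFIRM\DATA and never notice.
dfs_root[r"\\BIGFIRM\DATA"] = r"\\SALLY\DATA"
print(resolve(r"\\BIGFIRM\DATA"))           # → \\SALLY\DATA
```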

Fault Tolerance
You probably know that Windows 2000 offers you many ways to add reliability to your network through RAID storage and two-system computer clusters. RAID boxes aren't cheap, and clusters require a lot of hardware (two identical machines, external SCSI storage, extra network cards, and either the Advanced or Datacenter edition of Windows 2000 Server). But there are some very inexpensive fault tolerance options for Windows 2000 networks as well; Dfs provides one.

If you have a file share that you want to be available despite network misfortune and failure, then one way to accomplish that is with a fault-tolerant Dfs share. To create one, just create two or more file shares that contain the same information, then tell Dfs to treat them like one share. So, for example, in a domain named ROCKS, you might have a share named STUFF on a server named S1 and a share named STUFF on a server named S2. To the outside world, however, only one share would be visible, as \\ROCKS\STUFF. Then, when someone tries to access \\ROCKS\STUFF, Dfs will basically flip a coin and send her either to \\S1\STUFF or \\S2\STUFF. It's not full-blown fault tolerance-if S1 goes down, nothing automatically transfers people from \\S1\STUFF to \\S2\STUFF-but it's a low-cost way to increase the chance that a given share will be available, even under network "fire...."
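The coin flip can be sketched like this, reusing the ROCKS/STUFF example from above. As described, this toy model only hands out referrals; it doesn't detect a downed server:

```python
# Sketch of a fault-tolerant Dfs share: one logical name backed by several
# identical physical shares, one chosen at random per referral.
import random

replicas = {r"\\ROCKS\STUFF": [r"\\S1\STUFF", r"\\S2\STUFF"]}

def referral(logical, rng=random):
    """Pick one physical share to send this client to."""
    return rng.choice(replicas[logical])

random.seed(42)                        # deterministic for the demo
picks = {referral(r"\\ROCKS\STUFF") for _ in range(20)}
print(sorted(picks))                   # over many requests, both replicas get used
```

Because the choice is per-referral rather than per-failure, the benefit is statistical: roughly half the clients happen to be pointed at the surviving server when one replica dies.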


Table of Contents

Ch. 1 Windows 2000 Server Overview 1
Ch. 2 History: The Story So Far 29
Ch. 3 The Windows 2000 Registry 57
Ch. 4 Setting Up and Rolling Out Windows 2000 77
Ch. 5 The Windows 2000 Server UI and MMC 189
Ch. 6 Understanding and Using TCP/IP in Windows 2000 Server 245
Ch. 7 Building a Windows 2000 TCP/IP Infrastructure: DHCP, WINS, and DNS 357
Ch. 8 Understanding and Using Active Directory 525
Ch. 9 Managing and Creating User Accounts 669
Ch. 10 Managing Windows 2000 Storage 845
Ch. 11 Creating and Managing Shared Folders 923
Ch. 12 Software Installation 995
Ch. 13 Configuring and Troubleshooting Network Print Services 1043
Ch. 14 Connecting Clients to Windows 2000 Server 1119
Ch. 15 Supporting Clients with Windows Terminal Services 1141
Ch. 16 Connecting Macintoshes to Windows 2000 1243
Ch. 17 Web, Mail, FTP, and Telnet Services in Windows 2000 Server 1281
Ch. 18 How Running a Big Windows 2000 Network Is Different 1411
Ch. 19 Integrating NetWare with Windows 2000 Server 1429
Ch. 20 Tuning and Monitoring Your Win2K Network 1461
Ch. 21 Preparing for and Recovering from Server Failures 1523
Ch. 22 Installing and Managing Remote Access Services in Windows 2000 Server 1615
Ch. 23 Installing Hardware in Windows 2000 1707
App. A Performance Objects in Windows 2000 1739
Index 1748

Customer Reviews

Average Rating: 5 (4 reviews)
Showing all 4 customer reviews
  • Anonymous

    Posted May 9, 2003

    Every Network Administrator should have this book.

    Mastering Windows 2000 Server has been a lifesaver to me. I have searched everywhere on how to do several things on a server, and this was the only book I found that had instructions on exactly how to do what I need to do.

  • Anonymous

    Posted January 28, 2002

    Speaks English

    As a self-taught W2K systems administrator for a small company, I find this book a godsend. It takes a down-to-earth approach to W2K. After getting this book, many of the issues I had been struggling with or trying to implement were resolved. This book gives a complete overall view of an issue and then walks you through the implementation of that issue. I would recommend this to anyone who is battling with W2K or wants to educate themselves on how to do more with it.

  • Anonymous

    Posted December 28, 2000

    Excellent as always

    Mark Minasi has a way of writing as though he's talking to you, which takes what could be dull factual information and transforms it into something that is easy to comprehend and understand. If you need to learn Windows 2000, buy this book: period.

  • Anonymous

    Posted June 22, 2000

    A Must Have Book For All IT Professionals

    I had purchased several books before finding Mastering Windows 2000 Server. All of them fell short of answering my questions. Mastering Windows 2000 Server is a must-have for all IT professionals who have NT installations and are either thinking about or have moved to Windows 2000! This book answers all the questions without having to spend money on support calls. Thanks, Mark and company, for producing a book worth more than its weight in gold!

