
Large-Scale Mac Deployment?

UncleRage writes "I've been asked to research and ultimately recommend a deployment procedure for Macs across a rather large network. I'm not a stranger to OS X; however, the last time I worked on deployment NetRestore was still king of the mountain. Considering the current options, what methodologies do admins adhere to? Given the current selection of tools available, what would you recommend when planning, prototyping, and rolling out a robust, modular deployment scenario? For the record, I'm not asking for a spoon-fed solution; I'm more interested in a discussion concerning the current tools and what may (or may not) have worked for you. There are a lot of options available for modular system deployment... what are your opinions?"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Monday September 21, 2009 @07:45PM (#29498183)

    that is a whole lot of gay to be rolling out

  • by Anonymous Coward on Monday September 21, 2009 @07:51PM (#29498225)
    Is there even such a thing in this world? Folks like to disparage Windows, but it really is the only OS built for very large enterprises. Linux solutions don't really compare to Windows solutions - there, I said it...
    • by norkakn ( 102380 ) on Monday September 21, 2009 @08:07PM (#29498387)

      radmind ftw

      • by Anonymous Coward on Monday September 21, 2009 @09:46PM (#29499237)

        Mod parent up. Radmind is the only way to deploy a managed Mac OS environment.

    • by Brian Gordon ( 987471 ) on Monday September 21, 2009 @08:08PM (#29498399)

      I preemptively beg mods not to bury this comment. We all know that Linux is great on hackers' workstations and on servers and in computing clusters, but not so great as a desktop system for average users.

      Large managed networks, though, are nowhere near the top of the list of things Linux is awesome at. Active Directory, Exchange, Terminal Services... Windows really does have a very impressive offering in this area, while Linux stays behind the scenes and rarely faces the user.

      • by amirulbahr ( 1216502 ) on Monday September 21, 2009 @08:46PM (#29498751)

        Active Directory

        You can't be serious on this one. LDAP + Kerberos can easily take on that role plus some.


        Email is easy enough to offer but shared address books and calendaring may give Exchange the edge. No harm in deploying Exchange on the back-end and using Evolution or Thunderbird or web based Exchange on the front-end.

        Terminal Services

        This is the most outrageous of your claims. Linux, Solaris, *BSD all come up trumps here. You've got X11, NX, VNC, and the most advanced thin client solution at the moment, Sun Ray.

        • Re: (Score:3, Informative)

          by anagama ( 611277 )

          Email is easy enough to offer but shared address books and calendaring may give Exchange the edge.

          Darwin Calendar Server. Open source, free, runs on Linux. I thought I read on the mailing list that address book sharing is coming, though I can't be positive about that. Still, it makes a great calendar server, and it works with Thunderbird, though Thunderbird is not an awesome calendar client. Howtos for installation are easy to find online.

        • by ilmdba ( 84076 ) on Monday September 21, 2009 @09:58PM (#29499339)

          please... X11, NX, VNC and Sun Ray all suck ass compared to RDP. i use them all on a daily basis, and RDP is far and away the best of them all. authentication, remote devices (USB, printing), sound, mapped drives, etc. etc. none of these other solutions even touch on any of those features. not to mention, the performance of RDP smokes all of those others completely out of the water.

      • Re: (Score:3, Interesting)

        by Frosty Piss ( 770223 )

        We all know that Linux is great on hackers' workstations and on servers and in computing clusters, but not so great as a desktop system for average users.

        We do? Well, we're not really talking about Linux here, we're talking about Apple, which is a whole different ball game. But as to your Linux comments, people repeat these anecdotes so many times that they are taken as fact even though there is really not much to back them up. Recent Ubuntu and Red Hat offerings (and to a lesser extent SuSE and Mandriva) prove this tired anecdote to be essentially no longer true. Just because the Über Geeks use Debian, *BSD, or roll their own doesn't mean that's a true rep

      • Re: (Score:3, Informative)

        by itzdandy ( 183397 )

        I would argue this.

        Linux may be less preferred as a standalone desktop, mainly because of the Windows apps consumers like to clutter their computers up with. Linux excels in large deployments of standardized desktops.

        Simply put, Linux workstations are easy to set up against LDAP with NFS home directories. You can tighten the desktops up to limit apps. Use Terminal Server and RDP for necessary Windows apps. You can run specific applications on centralized servers and access them via remote X sessions on
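        The setup described here (LDAP accounts plus NFS-mounted homes) usually comes down to a few config files on each client. A minimal sketch for a generic Linux workstation; the server name and export path are hypothetical:

```shell
# /etc/nsswitch.conf -- resolve accounts from LDAP as well as local files
#   passwd: files ldap
#   group:  files ldap
#   shadow: files ldap
#
# /etc/auto.master -- let autofs mount home directories on demand
#   /home  /etc/auto.home
#
# /etc/auto.home -- wildcard map: every user's home comes off the NFS server
#   *  -fstype=nfs,rw,soft  homeserver.example.edu:/export/home/&
```

        With this in place, any LDAP user can log in at any workstation and land in the same home directory.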

    • by thatkid_2002 ( 1529917 ) on Monday September 21, 2009 @08:24PM (#29498551)
      Wrong! Novell Zenworks is on Linux too - so why can't you have a heterogeneous large-scale Linux and Windows rollout? There is Zenworks for Mac, but none of our customers (though there are quite a few Macs) use it. If you are going to roll out Novell stuff, you may as well do Novell Groupwise while you are at it.

      Novell solutions pwn Microsoft, sorry to say.
      • by genner ( 694963 )

        Wrong! Novell Zenworks is on Linux too - so why can't you have a heterogeneous large-scale Linux and Windows rollout? There is Zenworks for Mac, but none of our customers (though there are quite a few Macs) use it. If you are going to roll out Novell stuff, you may as well do Novell Groupwise while you are at it. Novell solutions pwn Microsoft, sorry to say.

        This is the only real solution anyone has listed. The only downside is that both your microsoft and mac fanboy users will complain about having to use it.

      • Re: (Score:3, Interesting)

        by Z80xxc! ( 1111479 )

        Novell solutions pwn Microsoft, sorry to say.

        Actually, no they don't. Not by a longshot. The school district I attend (with over 100 schools) uses ZenWorks, NDS, GroupWise, etc. Yes, ZenWorks is extremely powerful, and Novell has good integration. Yes, you can do a lot of cool stuff with it. Novell also happens to make incredibly slow software. Our district can't afford new computers on a standard 5-year cycle (or chooses to blow their money on computers twice as expensive as they need to be yet still with crap specs, but I digress), so many of our ma

    • by DoofusOfDeath ( 636671 ) on Monday September 21, 2009 @08:25PM (#29498553)

      Is there even such a thing in this world? Folks like to disparage Windows, but it really is the only OS built for very large enterprises.

      Agreed. It's the only OS for seriously large botnets.

    • by Logic Bomb ( 122875 ) on Monday September 21, 2009 @08:28PM (#29498591)

      There are many huge Mac deployments: universities, school districts with 1-to-1 laptop programs, Google (thousands of Macs), the Fontainebleau hotel in Miami, and more. Apple gear isn't always used to manage everything: most of these sites are probably using Active Directory or some UNIX-based LDAP service for account management. But there are plenty of large Mac deployments out there.

    • Re: (Score:3, Insightful)

      Is there even such a thing in this world?

      Yes. Next question?

      Seriously, it's obvious from the story that there is, indeed, "such a thing in this world." Windows users love to accuse Mac and Linux users of fanaticism, but honestly, there's nothing more fanatical than a Windows drone who can say something like "[Windows] really is the only OS built for very large enterprises" and believe it.

      • Re: (Score:2, Informative)

        by Anonymous Coward
        Sorry, I am not a winblows fanboi, but if you actually believe Macs are built for very large enterprises, it is you taking fanaticism to a new level. Linux, sure, but Macs are a hodge-podge of half-arsed solutions that can be bound together with twine to work in an enterprise. As someone who supports them on a daily basis in an enterprise, I can say without any doubt that Macs are NOT built with the enterprise in mind.
    • Re: (Score:3, Insightful)

      OS X is a certified Unix platform. Why is it hard to believe it's capable of being used as a large enterprise OS?
      • by Magic5Ball ( 188725 ) on Monday September 21, 2009 @09:28PM (#29499115)

        Among my experiences (mostly historic):
        -Some shims/extensions installed to compensate for hardware issues were unconditionally loaded, even on hardware that didn't need/couldn't boot with them. That made reusing disk images on slightly different hardware revisions... fun.
        -Wake on LAN should do... stuff. Consistently.
        -I've autodiscovered a shared printer which I'll share with everybody. I've autodiscovered a shared printer which I'll share with everybody. I've autodiscovered a shared printer which I'll share with everybody...
        -What's that? The mounted ASIP resource disappeared for a few seconds and now everyone's trying to reconnect? At once? And their workstations are beachballed until the share comes back, even though they have no open resources on it?
        -Restoring resource forks from backup always works!
        -What do you mean by "the QuickTime update broke the AppleScript methods for a completely unrelated subsystem"?
        -I've autodiscovered the same printer share which I'll share with everybody...
        -ls -lr on a folder with a few hundred files in subfolders ... get coffee as much of the btree is traversed
        -I've connected to this resource before, so I'll make a new alias for it with a subtly different name
        -What do you mean you've deleted stuff to the network trash and now it's locked?
        -I've autodiscovered the same printer share which I'll share with everybody...

    • How on earth did this get modded up? It's obviously a troll.
  • by Anonymous Coward on Monday September 21, 2009 @07:52PM (#29498233)

    I have had great success with both DeployStudio and LanREV in K-12 schools with 200+ machines.

    • Re: (Score:3, Informative)

      DeployStudio looks fantastic with its multicasting capabilities, but the System Image Utility in Leopard Server is just so trusty that I have a hard time looking at anything else. You don't hear much about Leopard Server, but it is by far the most promising aspect of the platform. It is the key to any large-scale OS X network. I am a one-man shop for 400 users. I'm sure that with a staff of three it could scale way up.
      • by Architect_sasyr ( 938685 ) on Monday September 21, 2009 @09:50PM (#29499269)
        I have a DeployStudio installation that supports 1132 laptops, iMacs, and G5s, with only one IT member (who, to be fair, outsources any really difficult questions to me). Maintaining it is easy as hell - if a user complains too much about a problem, he tells them to netboot - they can choose which building they are in, etc., or he will VNC in for them. Either way, one person scales well with DeployStudio. Me, I'm an Apple Certified Systems Administrator with a strong focus on deployment, and I will push DeployStudio every time.
    • by inKubus ( 199753 )

      I second LanREV, and they will have a Linux agent component in the next 6 months and a Linux server after that. Make sure all your desktop machines have the same administrative password (or groups of them do). Also make sure the firewall is turned off for SSH from your LanREV server. Then it'll scan subnets, SSH in and remotely install the agent. Then you have a lot of capabilities.

      I do agree with the GP, this is really Microsoft's strength, AD+Kerberos+System Center/Forefront or whatever they call it n
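      The scan-and-push mechanism described above (SSH in with the shared admin password, then install the agent) can be sketched roughly like this; the installer path, account name, and host list are hypothetical, and LanREV's server automates all of this for you:

```shell
# Hypothetical sketch of pushing a management agent over SSH.
AGENT_PKG="/opt/lanrev/agent.pkg"   # assumed local path to the agent installer

push_agent() {
    host="$1"
    # copy the installer to the client, then run it as the shared admin account
    scp "$AGENT_PKG" "admin@$host:/tmp/agent.pkg" &&
    ssh "admin@$host" "sudo installer -pkg /tmp/agent.pkg -target /"
}

# e.g.: while read h; do push_agent "$h"; done < hosts.txt
```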

  • Planning (Score:2, Funny)

    by NoYob ( 1630681 )
    You really don't need to do anything. See, with Macs being so user friendly, you just have the truck back up with skids of computers, plop them on folks' desks, and BINGO! everything is ready to go!

    Man, I'd update your resume, because they won't need you anymore. Or insist that some MS products are still around because of ... well, that's your problem.

  • Suggested reading: (Score:5, Informative)

    by Anonymous Coward on Monday September 21, 2009 @07:55PM (#29498261)

    Check out the following:

  • Now that NetRestore is going the way of the dodo, is there anything out there better than Apple Software Restore? It is a pain in the butt because another boot disk is needed, and NetBoot sets are more work without NetRestore.

  • by mewsenews ( 251487 ) on Monday September 21, 2009 @07:55PM (#29498271) Homepage

    .. of OS X Server? It doesn't require client access licenses like Windows server versions do, and many of the services seem tailored to providing the best administration possible for an OS X network. I don't have any personal experience, but that's the first place I'd look if I had to admin an entirely OS X network.

    • Re: (Score:2, Informative)

      by Knara ( 9377 )
      If the prices are what I remember they were back in 2002-2003, though, he's gonna need a lot of lube to absorb the premium he's gonna pay for the hardware.
      • Re: (Score:3, Informative)

        by falcon5768 ( 629591 )
        Give me a break. 3-4 grand for a server is not at all that bad. It's actually middle of the road for a decent server of that type.
  • Options (Score:5, Informative)

    by schmidt349 ( 690948 ) on Monday September 21, 2009 @07:57PM (#29498289)

    You have two choices in general on the Mac side:

    -- UNIX-y utilities, usually on the command line and a bit crufty in places, but free and nicely configurable
    -- Mac-type utilities with marvelous interfaces that will probably set you back a nice chunk of change

    When I was in the business, we used Carbon Copy Cloner, but g4u, Remote Desktop 3, or just plain old rsync are all pretty good bets depending on what type of imaging you're planning to do. CCC actually has one foot in both of the camps I just described.

    Of course, I even remember the crusty old days of Assimilator.

    • by raddan ( 519638 ) * on Monday September 21, 2009 @08:29PM (#29498603)
      Apple Software Restore, which comes "in the box". We set up a base machine, populate /System/Library/User Templates/English.lproj/, and then save a disk image to our fileserver using ASR. Then we boot new machines in Target Disk Mode and deploy the image from a workstation.

      We could probably come up with something clever using a boot partition, but this works fine for us. If you want to get fine-grained, have a look at Radmind, but keep in mind that Adobe apps will thwart your every attempt to manage them at that level.

      All of the above are Free/free. We handle patching using Apple Remote Desktop (not free, but well worth the money). You can also configure your machines to authenticate against an Active Directory (like we do); if you're willing to modify your schema, you can even manage your installation from your MMC snap-ins like you can with Windows boxen.
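      The ASR step in the workflow above looks roughly like this on the command line; the image path and target volume names here are hypothetical:

```shell
# Hedged sketch of restoring a master image with Apple Software Restore.
IMAGE="/Volumes/FileServer/images/base.dmg"  # assumed image location
TARGET="/Volumes/NewMac"                     # new machine in Target Disk Mode

# Build the restore invocation: --erase wipes the target first,
# --noprompt makes it suitable for unattended runs.
build_restore_cmd() {
    printf 'asr restore --source %s --target %s --erase --noprompt' "$1" "$2"
}

echo "$(build_restore_cmd "$IMAGE" "$TARGET")"
# On a real admin workstation you would run the printed command with sudo.
```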
    • by GigsVT ( 208848 )

      Yeah except if you want rsync to preserve resource forks, you invoke the broken and shitty part of the code.

      rsync -E runs out of memory on anything approaching a large data set, and it also considers the resource forks "dirty" every time you sync, so it's slow as hell too.
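      For reference, the invocation under discussion; -E is the flag on Apple's bundled rsync that copies extended attributes/resource forks, and the paths here are hypothetical (the command string is built rather than run):

```shell
# Sketch only: assemble the rsync invocation instead of executing it here.
build_cmd() {
    printf 'rsync -aE --delete %s %s' "$1" "$2"
}
echo "$(build_cmd /Users/ /Volumes/Backup/Users/)"
```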

  • Waste of energy (Score:3, Informative)

    by MouseR ( 3264 ) on Monday September 21, 2009 @08:01PM (#29498335) Homepage

    If you post on slashdot a question on the best way to deploy lots of Macs, all you'll get is trollish comments from pre-pubs.

    Really. It's the car equivalent of asking how to adjust the stock Caliber SRT4 wastegates on a Honda Civic SI site.

    For real answers, check out System X. The hardware FAQ and history links will provide lots of useful info.

    • That's demonstrably untrue. At this point in this thread's life there are a couple of funny comments, a couple of 'don't do it' comments, and the rest are thoughtful and full of good information. There are inevitably trolls on every slashdot thread. So what? Thanks for the question!

  • Easy Steps (Score:3, Informative)

    by Anonymous Coward on Monday September 21, 2009 @08:01PM (#29498337)

    For initial deployment, use DeployStudio.

    For authentication and settings management, use OpenDirectory.

    For ongoing control and user support, use Remote Desktop (from Apple).

    For a more advanced option, try Radmind to keep the Macs in sync.

  • Virginia Tech (Score:2, Interesting)

    by TitusC3v5 ( 608284 )
    I don't know anything about their deployment procedure, but here at Virginia Tech the Math Emporium has over 500 Macs set up for student access. The courses I've had there have been boring, but the actual setup of the place is pretty neat.
  • JAMF Casper (Score:5, Informative)

    by cwgmpls ( 853876 ) on Monday September 21, 2009 @08:04PM (#29498367) Journal
    Check out the Mac management software from JAMF Software. It pretty much covers it all, from package management to image deployment to remote desktop to inventory. It is used in many Mac-based school districts and universities.
  • by Tibor the Hun ( 143056 ) on Monday September 21, 2009 @08:05PM (#29498377)

    First we build and test a good image on a machine for a couple of weeks.
    Then we either use that image, if it was correct the first time, or build a new one from it if it required touching up.
    We use Apple's Disk Utility, which comes free with all Macs.

    We then get about 10-15 FireWire drives and copy that image onto them. (You have to make sure the drives are bootable; you can actually deploy that same image onto the drive itself.)
    Then we line up 10-15 machines and again use Disk Utility to image them.
    Depending on the size of the image, just about the time you have the next 10-15 unboxed and set up (very easy to do since they're all all-in-ones), the first batch is ready.
    Works for us, but then again, our schedule is flexible and we can afford a couple of days of leisurely imaging.

    Oh yeah, and if you do have an image, you can also work with Apple; they'll preload it for you.
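    Preparing a master image like the one described above can also be done from the command line; a hedged sketch, with hypothetical volume and image paths:

```shell
# Capture the prepared base machine's volume into a compressed disk image,
# then scan it so asr/Disk Utility will accept it as a restore source.
make_master_image() {
    src_volume="$1"    # e.g. /Volumes/Master
    image_path="$2"    # e.g. /Images/base.dmg
    hdiutil create -srcfolder "$src_volume" -format UDZO "$image_path"
    asr imagescan --source "$image_path"
}
# e.g.: make_master_image /Volumes/Master /Images/base.dmg
```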

  • DeployStudio appears to be the anointed successor to the venerable and discontinued NetRestore from Mike Bombich. Mike personally recommended DeployStudio. The best thing about it is that it's cross-platform and will also image PXE-capable PCs with Linux or Windows or what have you.
  • Need more info.. (Score:4, Informative)

    by engele ( 1049374 ) on Monday September 21, 2009 @08:07PM (#29498391)
    Here is an excellent resource (at least the last time I checked, and it has been a while). As far as tools go, the built-in tools are very good. A third-party tool that can be very useful for bootable drive images is Carbon Copy Cloner.

    When you say large, do you mean hundreds, thousands, or less? It will definitely change things for you. I think you will be surprised by both the ease of the transition and by the things that should be easy but are not.

    Really, I don't know how we can help unless you have specific areas where you are interested in learning solutions (and I don't say that to be a jerk; I'll try to answer questions where I can). How many servers? Directory server? File sharing? Exchange Server/POP/IMAP? Calendaring? Centralized home directories? Budget per user? Of course there are cool things that cost money and are not really needed, and hard things that are cheap but work well once set up, etc. I would help more, but I don't know where to start... ask questions as you get a better idea of the scope.
  • I, like you, developed deployment for a Mac-based network (600 or so Macs) back when command-line ASR and NetRestore were the best options. However, we also upgraded our deployment methods as Apple incorporated some of the technologies we used (cloning and automatic install options) into their server software. Today that particular piece of software is very well polished and does the job extremely well.

    The last time we did an installation (a few years ago), we used custom netboot images with automatic install options for different types of computers (lab, classroom, etc.) based on MAC address. At the time we used a third-party Unix package manager for OS X called Radmind, but it proved to be more trouble than it was worth. However, Apple Remote Desktop's package management and monitoring work very well and let you do most of the upgrade and install tasks you need to. In the end, the only per-machine work was setting up the machine to boot from the network by default.

    Also, if you have the bandwidth, you can centralize your OS installs as server based images that are never installed on the thin clients. If you get it to work, it makes upgrades and deployment very easy.

    If you want to discuss some of the problems we faced and our solutions, please feel free to contact me.
  • radmind (Score:5, Informative)

    by norkakn ( 102380 ) on Monday September 21, 2009 @08:09PM (#29498407)

    I used to run a network with hundreds of apples with radmind. We installed the initial images with NetRestore (multicast for the larger influxes), and upon reboot, the computers would download their radmind certificate from LDAP and install all of the software that it needed.

    It takes more up front time to set up and configure radmind, but it works wonderfully for almost anything you want to do.

    • Re: (Score:2, Informative)

      by limako ( 45118 )

      A previous poster argued that you have to choose between unix-ey freeware and pricey, pointy-clicky commercial software, but radmind actually bridges that gap nicely. It is a free set of unix command-line utilities with several GUI applications that can bind it together on the client and server sides -- if you like that sort of thing. In my implementation, we use perl scripts to actually do most of the heavy lifting. Moreover, it's relatively easy to give end users more-or-less control over the rest of the syst

  • DeployStudio (Score:3, Informative)

    by howlatthemoon ( 718490 ) on Monday September 21, 2009 @08:10PM (#29498421)
    We use DeployStudio, a freeware project. Support for DS comes mostly from the community, or you can buy training; but if you want to go with a vendor product, the JAMF Casper suite is a great product that we did not think was outrageously expensive.
  • by bbk ( 33798 ) on Monday September 21, 2009 @08:14PM (#29498453) Homepage

    Apple has a robust remote installation suite with OS X Server, which is darn cheap compared to most other commercial offerings.

    10.6 includes a first-party version of NetRestore (full system image deployment, similar to Ghost or Flash Archive on Solaris), but most people deploying across a large number of systems should roll their own images with package-based tools like DeployStudio or InstaDMG.


  • try serverfault (Score:3, Insightful)

    by gbrandt ( 113294 ) on Monday September 21, 2009 @08:15PM (#29498465)

    Try asking this on Lots of advice can be found there.

  • Radmind (Score:5, Informative)

    by profplump ( 309017 ) on Monday September 21, 2009 @08:20PM (#29498513)

    It's been mentioned a couple of times, but mostly with -1 scores, so it's easy to miss: Radmind. It's a very powerful deployment tool with a totally transparent mechanism, so you can tweak it to do *exactly* what you want in terms of mucking with files on the disk. I've seen people complain about it being hard to use, but I thought it was pretty straightforward -- install an app, run the change detector, tweak as desired (if at all), build an app image, deploy.
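    The install/detect/build/deploy cycle described here maps onto a handful of radmind commands; a hedged sketch (the server name and transcript names are hypothetical):

```shell
RADMIND_SERVER="radmind.example.edu"   # assumed server hostname

# On the build machine: capture whatever changed (e.g. a new app install)
# as a transcript and upload it to the server as a loadset.
radmind_capture() {
    fsdiff -C -c sha1 -o "$1.T" /
    lcreate -h "$RADMIND_SERVER" -c sha1 "$1.T"
}

# On each client: refresh the command file, compute the difference from
# the desired state, and apply it.
radmind_update() {
    ktcheck -h "$RADMIND_SERVER" -c sha1
    fsdiff -A -c sha1 -o /tmp/apply.T /
    lapply -h "$RADMIND_SERVER" -c sha1 /tmp/apply.T
}
```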

  • by Logic Bomb ( 122875 ) on Monday September 21, 2009 @08:24PM (#29498547)

    Why on earth is this being asked on Slashdot? Head to the well-known Mac admin sites (particularly their mailing lists). You'll find info on InstaDMG, DeployStudio, even Radmind.

  • So... (Score:2, Funny)

    by Kyn ( 539206 )

    This is a Big Mac deployment? Sounds like a job for my tummy!

  • by lymond01 ( 314120 ) on Monday September 21, 2009 @08:29PM (#29498605)

    Open Directory
    By centralizing information about users and network resources, directory services provide the infrastructure required for managing users, groups, and computers on your network. Directory services can benefit organizations with as few as 10 people and are essential for enterprise networks that have thousands of users. Deploying a directory server helps reduce administrative costs, improve security, and provide users with a more productive computing experience.

    Remote Desktop
    Apple Remote Desktop is the best way to manage the Mac computers on your network. Distribute software, provide real-time online help to end users, create detailed software and hardware reports, and automate routine management tasks -- all without leaving your desk. Featuring Automator actions, Remote Spotlight search, and a new Dashboard widget*, Apple Remote Desktop 3 makes your job easier than ever.

    * You'll notice Open Directory has no Dashboard widget. It's because it isn't uniquely Apple and therefore isn't polished to a blinding shine.

  • from experience (Score:5, Informative)

    by v1 ( 525388 ) on Monday September 21, 2009 @08:30PM (#29498615) Homepage Journal

    You're likely to get some laptops in addition to desktops. Get yourself a large room, a dozen or more FireWire cables, and some power strips. Before the machines arrive, use a MacBook Pro or MacBook (a laptop) to develop your base image. Install all software on it that is going to be on most of the machines. Test thoroughly. Be sure all your remote access is tested (ARD/SSH).

    Use NetRestore to create the base image. When the computers arrive, copy the base image, along with the NetRestore app, to a group of laptops. The number varies depending on how many computers you are going to be imaging, the size of your base image, and how much help you have; 8-12 is typical if only one person is going to be restoring.

    The first thing you should do with machines out of the box is label them; have labels made up in advance. Then set them all up imaging over FireWire, and just get an assembly line going. You CAN do NetRestore over the network, but in my experience it's less reliable (machines randomly fail to restore, sometimes entire groups fail at an annoying 99%, etc.). FireWire is usually faster anyway, since your fileserver or switch is very unlikely to be able to keep up with imaging a dozen at once. FW800 imaging is an amazing thing.

    Once machines are imaged, there should be a folder of scripts sitting in each machine's local admin acct, one for each group of machines. The script will prompt for the computer name and run. When run, it will rename the computer and delete all the apps that should not be on that particular image. This can also be done by running the script remotely over Apple Remote Desktop. If you don't have ARD, *get it now*. It will save you incredible amounts of time. Using this removal-script method adds only a few minutes of time per image, but you're doing them in parallel so it's negligible, and it saves you the major headache of managing a half dozen different base images.
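    The per-machine script described above might look something like this; the scutil name records are real macOS settings, while the app list is a hypothetical example:

```shell
#!/bin/sh
# Hedged sketch of a post-imaging rename-and-cleanup script.

rename_machine() {
    # set the Mac's name records to whatever the tech types in
    scutil --set ComputerName "$1"
    scutil --set LocalHostName "$1"
}

prune_apps() {
    # remove apps that don't belong on this group's image (names are examples)
    for app in "Final Cut Express" "Logic Express"; do
        rm -rf "/Applications/$app.app"
    done
}

# interactive use: printf "Computer name: "; read n; rename_machine "$n"; prune_apps
```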

    As long as you made the image on a laptop, it should have full hardware support for the camera etc. Different images are required for PPC, but fortunately that's not a headache you have to worry about. (I did, PAIN)

    Boot camp adds a level of complexity, requiring you to partition the hard drives before restoring to them, and then using something like Ghost or Acronis. One person can image between 40-80 machines in 8 hrs depending on how things go. Helps to have grunts to do the minor things like unpacking and delivery to stations. Find some carts so you can move machines several at a time. Inform the cleaning staff that you're going to have a mountain of packing material to dispose of. Keep 1 box for every 20 machines in case you need to box them up to send to a repair shop down the road.

    If you insist on using netrestore over the network, be sure you have multicast enabled on the switches. It doesn't like crossing subnets but can be made to work.

  • you know... (Score:3, Informative)

    by buddyglass ( 925859 ) on Monday September 21, 2009 @08:47PM (#29498757)
    If your installation is big enough, you could probably get some good advice from your Apple technical sales rep.
  • Can you do virtualization with thin clients and Apple servers?
  • Bombich Software (Score:3, Informative)

    by SammyIAm ( 1348279 ) on Monday September 21, 2009 @09:00PM (#29498889)

    I worked at a school district for some time with a significant Mac deployment. We used Mike Bombich's software extensively, especially his NetBoot utility for deployment.

    It does take a little bit of configuration on the server side to start, but it looks like some other posters have already linked to tutorials for setting that up. MB has a utility to create a net-bootable image that can be used to image the machine with your choice of disk images (we had different images for different architectures and different software packages), or it can be automated to pick an image automatically.

    His NetBoot software also has the ability to run a shell script to complete configuration settings that may need to be done on a per-machine basis (setting the computer network name for example).

    For running updates and modifying settings after the initial imaging, Apple Remote Desktop is actually very useful. Although the feature set is limited, it DOES allow for the execution of shell commands from the Remote Desktop interface, which makes upgrading or changing settings on a large number of machines fairly easy.
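    The "run a shell command on every machine" pattern mentioned above is often just a short script pushed through ARD; a sketch, where the package path is a hypothetical example:

```shell
# Commands an admin might push to all selected machines at once via
# Apple Remote Desktop's Send UNIX Command (run as root).
run_maintenance() {
    softwareupdate -i -a                        # install all pending Apple updates
    installer -pkg /tmp/SiteApp.pkg -target /   # hypothetical local package
}
```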

  • Radmind (Score:4, Informative)

    by fitterhappier ( 468051 ) on Monday September 21, 2009 @09:14PM (#29498989)

    I managed a deployment of roughly 800 Macs across the campus of a large university using Radmind. I've also managed the campus Linux, Solaris, and OpenBSD Kerberos servers, web servers, and file servers with the same software. Radmind's learning curve is a little steeper at first, but it's one of the most flexible deployment options out there once you get the hang of it.

    Radmind's not really a competitor with tools like NetRestore. When used correctly, NetRestore is great for total reimaging of deployed hardware: nothing beats a block-copy installation for speed. Where NetRestore falls down is when dealing with deployment entropy. After imaging, the machine is in an unknown state ("post-image"), and the only way to be sure all machines are in the same state is to blow away the entire disk and reimage, usually at a cost of gigabytes of bandwidth per machine.

    This is where Radmind excels. It's basically a tripwire with software deployment and roll back, all based on the differences between what should be installed and what's actually on the disk. The core utility, fsdiff, looks at all files and directories designated as managed by the administrator and generates a list of differences. You can capture those changes as a loadset and upload them to the Radmind server for deployment to other machines, or you can undo any changes detected by fsdiff and restore the client to a known good state.

    The great thing about this method of management is that there's minimal bandwidth used. If fsdiff detects no changes on the filesystem, there's no reason to download anything: your system is in a known good state. On the other hand, it makes deploying Apple's system and security updates pretty damn easy. Grab the updater from Apple's website, install, and run the Radmind tools to capture the changes. Store the changes on the server, add the new loadset to your machines' profile (command file), and let your clients pull down the changes.
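    The update cycle just described maps onto radmind's core utilities roughly as follows. This is a dry-run sketch: ktcheck, fsdiff, lcreate and lapply are real radmind tools, but the flags and the server name are illustrative assumptions, and each command is echoed rather than executed:

```shell
#!/bin/sh
# Sketch of a radmind client update cycle (dry run).
RADMIND_SERVER=radmind.example.edu   # assumption: your server's name

run() { echo "would run: $*"; }      # drop this wrapper to run for real

# 1. Pull the latest command file and transcripts from the server.
run ktcheck -h "$RADMIND_SERVER" -c sha1

# 2. Compare the filesystem against the managed state; write an
#    applyable transcript of the differences.
run fsdiff -A -c sha1 -o updates.T /

# 3. Apply the differences, restoring the client to the known state.
run lapply -h "$RADMIND_SERVER" -c sha1 updates.T

# Conversely, to capture a freshly installed update as a new loadset:
#   fsdiff -C -c sha1 -o new-loadset.T /
#   lcreate -h "$RADMIND_SERVER" -c sha1 new-loadset.T
```

    If fsdiff finds no differences in step 2, steps 1-3 cost essentially nothing, which is where the bandwidth savings mentioned above come from.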

    The Radmind community is very helpful. Most questions to the mailing list (hosted on, Google groups mirror here []) are answered very quickly, and people are eager to share details about local setups and scripting solutions. A typical setup for a Radmind-managed Mac OS X client usually involves a few possible methods for initiating updates, most of which involve iHook [] as the UI:

    1. Check for updates on Radmind server during logout, update client if found.
    2. Run a nightly tripwire regardless of updates from server.
    3. Run a Radmind update during boot if a special flag file is found on the disk.
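    Method 3 above might be sketched like this; the flag path is an assumption, and the radmind update itself is just echoed as a dry run:

```shell
#!/bin/sh
# Run a radmind update during boot only if a flag file is present.
maybe_update() {
    FLAG="${1:-/var/radmind/.run_update_at_boot}"   # assumed flag path
    if [ -f "$FLAG" ]; then
        echo "would run: ktcheck, fsdiff -A, lapply"
        rm -f "$FLAG"    # one-shot: clear the flag so we don't loop
    fi
}

maybe_update   # with no flag present, this silently does nothing
```

    Dropping the flag file onto a client (over ssh, say) then queues an update for its next reboot.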

    Since we relied on students to help run our labs, we also deployed a special, unprivileged local user account that the students could log in as; logging in as that account also triggered a Radmind update. And of course you can trigger updates over ssh (which works well in combination with something like pdsh).

    We combined Radmind with NetBoot for rapid, consistent deployments. Once the hardware was in place and on the network, we netbooted, used ASR to install a minimum and relatively recent system, and let Radmind bring everything up to date, including per-host license files and location specific software.
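    The NetBoot + ASR step above, sketched. The image path and target volume are my assumptions, and the command is assembled and printed rather than executed (asr is Apple's block-copy restore tool):

```shell
#!/bin/sh
# Restore a minimal base system image onto the target volume, then let
# radmind bring it fully up to date on first boot.
IMAGE=/Volumes/NetBootStorage/base-minimal.dmg   # assumed image path
TARGET=/Volumes/MacintoshHD                      # assumed target volume

ASR_CMD="asr restore --source $IMAGE --target $TARGET --erase --noprompt"
echo "$ASR_CMD"
# After reboot, ktcheck/fsdiff/lapply pull down everything else,
# including per-host license files and location-specific software.
```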

    Radmind's not perfect. It manages at the file level. If you want something to manage, say, config files on a line-by-line basis, Radmind isn't going to fit the bill (yet). Generally speaking, though, Radmind manages Mac OS X with ease. Once you've got Radmind managing your Macs, you'll find you have a lot of extra time to do interesting things instead of troubleshooting problems brought on by stale deployments.

    The Radmind wiki [] is a decent place to start looking. Good luck.

  • by admiralex ( 1410225 ) on Tuesday September 22, 2009 @12:28AM (#29500369)

    I do this for the federal government, after coming from a university environment where I grew up with the Mac from the bad ol' days of the late 90s through Apple's phoenix-like rise from those ashes into the titan it is now. Truth be told, not much has changed.

    For mass deployments, I'm about to look into Casper, but NOTHING I've seen or heard about beats NetBoot/NetRestore -- and mind you, I live and breathe Mac. I use PCs to manage Remedy tickets, and that's it. The ability to create a master image, upload it to a server, restart a machine with the N key pressed, and have it image itself was and is nothing short of magical, and it's the deployment solution I'm moving toward for the portion of the Treasury Department network I control (if I die, money will cease to be printed). Unless Casper can top that, NetInstall + N is still my deployment solution of choice, and one the folks where I used to work are still trying to replicate three years later. There's nothing faster or more foolproof.

    Prototyping is just as easy. I deal with everything from banknote designers (pull out a bill. Isn't it pretty? My designers drew all that stuff on their Macs) to executive management, and though they use their machines differently, they all have the same baseline needs -- a rock-solid configuration that's hardened to IT Security's exacting (if evolving) standards, and Office to handle collaboration.

    My base image is a hardened installation of Leopard with fully patched Office. That's standard across all machines. This base image is what I run in regular user mode on my personal production machine, so I know firsthand exactly what the user experiences from day to day. I customize the default user environment on the standard image to suit _my_ tastes and allow the users to tweak and refine that environment as they see fit.

    I learned years ago that this is the best approach to standardizing a user's desktop: I know how to work around the various quirks of OS X that can become annoying after extended use, and my users usually haven't been on Macs long enough to have figured these things out. The more experienced of my newest users typically bristle at this, since to a person they think their way of configuring the Finder/desktop is THE way to have their machines work, but I usually don't hear a peep from them after a week or two of working in my environment. The biggest compliment is when I cease to get trouble tickets from my bitchiest users because they find I've already anticipated and addressed their most obvious complaints in the standard image.

    On top of the standard image, I install applications specific to the machine's role. The designers, for instance, get Adobe CS4 and additional design-focused applications such as Quark and a font manager. My video people get Final Cut Studio. My engravers get the same package as the designers.

    My method of choice for deploying to these disparate groups lately has been to install the specialized applications on the standard image and create secondary images applicable to specific groups. Banknote design machines, for example, have their own special image, and the video production machines have an image all their own. This simplifies things mightily, because all I have to know when I want to deploy a new workstation (or repair a broken one) is where it's going. Oh, this is a replacement banknote machine? Put the banknote image on it. Copy the _user folder_ -- and nothing else -- from the old machine, create an account on the new machine, point it at the old user folder, and voila: completely new hardware, and the user has no idea anything's changed. I've upgraded users from Tiger-running G5s to Leopard-running 8-core Mac Pros, and the only difference they noticed was that the machine was "a lot faster." And the Apple menu's a different color. That's the power of Mac OS X.
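    That migration step ("copy the user folder, create an account, point it at the old folder") can be sketched roughly as follows. The hostname, user name, UID, and exact dscl attributes are all illustrative assumptions, and everything is echoed as a dry run:

```shell
#!/bin/sh
# Sketch: migrate one user's home folder to a freshly imaged machine.
OLD_HOST=oldmac.example.gov      # assumed old machine
MIGRATE_USER=jdesigner           # assumed account name

run() { echo "would run: $*"; }  # drop this wrapper to run for real

# Copy only the user folder from the old machine (-E keeps extended
# attributes on Apple's rsync).
run rsync -aE "root@$OLD_HOST:/Users/$MIGRATE_USER/" "/Users/$MIGRATE_USER/"

# Create a matching local account and point it at the copied folder.
run dscl . -create "/Users/$MIGRATE_USER"
run dscl . -create "/Users/$MIGRATE_USER" UserShell /bin/bash
run dscl . -create "/Users/$MIGRATE_USER" UniqueID 1001       # assumed free UID
run dscl . -create "/Users/$MIGRATE_USER" PrimaryGroupID 20   # staff group
run dscl . -create "/Users/$MIGRATE_USER" NFSHomeDirectory "/Users/$MIGRATE_USER"
run chown -R "$MIGRATE_USER" "/Users/$MIGRATE_USER"
```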

    Security, as I'm sure you well know, is not an issue on the Mac, but given the sensitivity of what my users do, I
