
Large-Scale Mac Deployment?

UncleRage writes "I've been asked to research and ultimately recommend a deployment procedure for Macs across a rather large network. I'm not a stranger to OS X; however, the last time I worked on deployment NetRestore was still king of the mountain. Considering the current options, what methodologies do admins adhere to? Given the current selection of tools available, what would you recommend when planning, prototyping, and rolling out a robust, modular deployment scenario? For the record, I'm not asking for a spoon-fed solution; I'm more interested in a discussion concerning the current tools and what may (or may not) have worked for you. There are a lot of options available for modular system deployment... what are your opinions?"
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward on Monday September 21, 2009 @07:52PM (#29498233)

    I have had great success out of both DeployStudio (http://deploystudio.com/) and LanREV (http://www.lanrev.com) in K-12 schools with 200+ machines.

  • Suggested reading: (Score:5, Informative)

    by Anonymous Coward on Monday September 21, 2009 @07:55PM (#29498261)

    Check out the following:

    http://www.macenterprise.org/
    http://www.deploystudio.com/Home.html
    http://rsug.itd.umich.edu/software/radmind/

  • by mewsenews ( 251487 ) on Monday September 21, 2009 @07:55PM (#29498271) Homepage

    What about OS X Server [apple.com]? It doesn't require client access licenses the way Windows Server versions do, and many of its services seem tailored to providing the best administration possible for an OS X network. I don't have any personal experience, but that's the first place I'd look if I had to admin an entirely OS X network.

  • Options (Score:5, Informative)

    by schmidt349 ( 690948 ) on Monday September 21, 2009 @07:57PM (#29498289)

    You have two choices in general on the Mac side:

    -- UNIX-y utilities, usually on the command line and a bit crufty in places, but free and nicely configurable
    -- Mac-type utilities with marvelous interfaces that will probably set you back a nice chunk of change

    When I was in the business, we used Carbon Copy Cloner, but g4u, Remote Desktop 3, or just plain old rsync are all pretty good bets depending on what type of imaging you're planning to do. CCC actually has one foot in both of the two camps I just described.

    Of course, I even remember the crusty old days of Assimilator.

  • by Knara ( 9377 ) on Monday September 21, 2009 @08:01PM (#29498329)
    If the prices are what I remember they were back in 2002-2003, though, he's gonna need a lot of lube to absorb the premium he's gonna pay for the hardware.
  • Waste of energy (Score:3, Informative)

    by MouseR ( 3264 ) on Monday September 21, 2009 @08:01PM (#29498335) Homepage

    If you post a question on Slashdot about the best way to deploy lots of Macs, all you'll get is trollish comments from pre-pubs.

    Really. It's the car equivalent of asking how to adjust the stock Caliber SRT4 wastegate on a Honda Civic Si site.

    For real answers, check out System X [vt.edu]. The hardware FAQ and history links will provide lots of useful info.

  • Easy Steps (Score:3, Informative)

    by Anonymous Coward on Monday September 21, 2009 @08:01PM (#29498337)

    For initial deployment, Deploy Studio: http://www.deploystudio.com/

    For authentication and settings management, use OpenDirectory.

    For ongoing control and user support, use Remote Desktop (from Apple).

    For a more advanced option, try Radmind to keep the Macs in sync: http://rsug.itd.umich.edu/software/radmind/

  • JAMF Casper (Score:5, Informative)

    by cwgmpls ( 853876 ) on Monday September 21, 2009 @08:04PM (#29498367) Journal
    Check out the Mac management software from JAMF Software [jamfsoftware.com]. It pretty much covers it all, from package management to image deployment to remote desktop to inventory. It's used in many Mac-based school districts and universities.
  • Need more info.. (Score:4, Informative)

    by engele ( 1049374 ) on Monday September 21, 2009 @08:07PM (#29498391)
    Here is an excellent resource (at least as of the last time I checked, which has been a while; they used to be called macosxlabs.org): http://www.macenterprise.org/ [macenterprise.org]

    As far as tools, the built-in tools are very good. A third-party tool that can be very useful for bootable drive images is Carbon Copy Cloner.

    When you say large, do you mean hundreds, thousands, or fewer? It will definitely change things for you. I think you will be surprised both by the ease of the transition and by the things that should be easy but are not. Really, I don't know how we can help you unless you have specific areas where you are interested in learning solutions (and I don't say that to be a jerk; I'll try to answer questions where I can). How many servers? Directory server? File sharing? Exchange/POP/IMAP? Calendaring? Centralized home directories? Budget per user?

    Of course there are cool things that cost money and are not really needed, and hard things that are cheap but work well once set up, etc. I would help more, but I don't know where to start... take a look at the link above, and ask questions as you get a better idea of the scope.
  • I, like you, developed deployment for a Mac-based network (600 or so Macs) back when command-line ASR and NetRestore were the best options. However, we also upgraded our deployment methods as Apple incorporated some of the technologies we used (cloning and automatic install options) into their server software. Today that particular piece of software is very well polished and does the job extremely well. The last time we did an installation (a few years ago) we used custom NetBoot images with automatic install options for different types of computers (lab, classroom, etc.) based on MAC address. At the time we used a third-party Unix package manager for OS X called Radmind, but it proved to be more trouble than it was worth. However, Apple Remote Desktop's package management and monitoring work very well and let you do most of the upgrade and install tasks you need. In the end, the only per-machine work was setting up the machine to boot from the network by default.

    Also, if you have the bandwidth, you can centralize your OS installs as server based images that are never installed on the thin clients. If you get it to work, it makes upgrades and deployment very easy.

    If you want to discuss some of the problems we faced and our solutions, please feel free to contact me.
  • Resources (Score:1, Informative)

    by Anonymous Coward on Monday September 21, 2009 @08:08PM (#29498401)

    Check out Mike Bombich's stuff for some good tips: http://www.bombich.com/mactips/index.html

    I also found the Apple support discussions to be useful http://discussions.apple.com/category.jspa?categoryID=96 and also this site http://www.afp548.com/

    Good luck!

  • radmind (Score:5, Informative)

    by norkakn ( 102380 ) on Monday September 21, 2009 @08:09PM (#29498407)

    I used to run a network with hundreds of apples with radmind. We installed the initial images with NetRestore (multicast for the larger influxes), and upon reboot, the computers would download their radmind certificate from LDAP and install all of the software that it needed.

    It takes more up front time to set up and configure radmind, but it works wonderfully for almost anything you want to do.

  • DeployStudio (Score:3, Informative)

    by howlatthemoon ( 718490 ) on Monday September 21, 2009 @08:10PM (#29498421)
    We use DeployStudio, a freeware project http://www.deploystudio.com/ [deploystudio.com] . Support for DS comes mostly from the community, or you can buy training; but if you want to go with a vendor product, the JAMF Casper suite is a great product that we did not think was outrageously expensive.
  • by bbk ( 33798 ) on Monday September 21, 2009 @08:14PM (#29498453) Homepage

    Apple has a robust remote installation suite with OS X Server, which is darn cheap compared to most other commercial offerings.

    10.6 includes a first-party version of NetRestore (full system image deployment, similar to Ghost, or to Flash Archive on Solaris), but most people deploying across a large number of systems should roll their own images with package-based tools like DeployStudio or InstaDMG:

    http://www.deploystudio.com/ [deploystudio.com]
    http://code.google.com/p/instadmg/ [google.com]

    Some other good sites for finding info:
      http://www.afp548.com/ [afp548.com]
      http://www.macenterprise.org/ [macenterprise.org]

  • Radmind (Score:5, Informative)

    by profplump ( 309017 ) <zach-slashjunk@kotlarek.com> on Monday September 21, 2009 @08:20PM (#29498513)

    It's been mentioned a couple of times, but mostly with -1 scores, so it's easy to miss: Radmind. It's a very powerful deployment tool with a totally transparent mechanism so you can tweak it to do *exactly* what you want in terms of mucking with files on the disk. I've seen people complain about it being hard to use, but I thought it was pretty straightforward -- install an app, run the change detector, tweak as desired (if at all), build an app image, deploy.

    http://rsug.itd.umich.edu/software/radmind/ [umich.edu]

  • Re:Macs (Score:4, Informative)

    by azav ( 469988 ) on Monday September 21, 2009 @08:22PM (#29498523) Homepage Journal

    Stupid post.

    2 would never happen and would cost WAY more than 400 bucks in time alone.

    Get Applecare and it's covered for 3 years. Ship it back to Apple while they fix it. That's what we do.

  • by Logic Bomb ( 122875 ) on Monday September 21, 2009 @08:24PM (#29498547)

    Why on earth is this being asked on Slashdot? Head to afp548.com and macenterprise.org (particularly its mailing list). You'll find info on InstaDMG, DeployStudio, even Radmind.

  • by thatkid_2002 ( 1529917 ) on Monday September 21, 2009 @08:24PM (#29498551)
    Wrong! Novell ZENworks is on Linux too -- so why can't you have a heterogeneous large-scale Linux and Windows rollout? There is ZENworks for Mac, but none of our customers use it (though there are quite a few Macs). If you are going to roll out Novell stuff, you may as well do Novell GroupWise while you are at it.

    Novell solutions pwn Microsoft, sorry to say.
  • by Logic Bomb ( 122875 ) on Monday September 21, 2009 @08:28PM (#29498591)

    There are many huge Mac deployments: universities, school districts with 1-to-1 laptop programs where every student gets a laptop, Google (thousands of Macs), the Fontainebleau hotel in Miami, and more. Apple gear isn't always used to manage everything; most of these sites are probably using Active Directory or some UNIX-based LDAP service for account management. But there are plenty of large Mac deployments out there.

  • by raddan ( 519638 ) * on Monday September 21, 2009 @08:29PM (#29498603)
    Apple Software Restore, which comes "in the box". We set up a base machine, populate /System/Library/User Templates/English.lproj/, and then save a disk image to our file server using ASR. Then we boot new machines in Target Disk Mode and deploy the image from a workstation.

    We could probably come up with something clever using a boot partition, but this works fine for us. If you want to get fine-grained, have a look at Radmind [umich.edu] but keep in mind that Adobe apps will thwart your every attempt to manage them at that level.

    All of the above are Free/free. We handle patching using Apple Remote Desktop (not free, but well worth the money). You can also configure your machines to authenticate against an Active Directory (like we do); if you're willing to modify your schema, you can even manage your installation from your MMC snap-ins like you can with Windows boxen.
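    The ASR workflow above can be sketched as a short script. The volume names and the dry-run guard are assumptions for illustration; asr and hdiutil are the real Apple command-line tools.

```shell
#!/bin/sh
# Sketch of the ASR imaging workflow. SOURCE_VOL/TARGET_VOL are hypothetical;
# DRY_RUN=1 (the default here) prints the commands instead of executing them.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

IMAGE="base.dmg"
SOURCE_VOL="/Volumes/Base"     # the fully set-up base machine's disk
TARGET_VOL="/Volumes/Target"   # a new machine mounted via Target Disk Mode

# 1. Capture the base machine as a compressed disk image
run hdiutil create -srcfolder "$SOURCE_VOL" -format UDZO "$IMAGE"
# 2. Scan/checksum the image so asr can block-copy it efficiently
run asr imagescan --source "$IMAGE"
# 3. Erase the target and lay the image down over FireWire
run sudo asr restore --source "$IMAGE" --target "$TARGET_VOL" --erase --noprompt
```

    Run with DRY_RUN=0 on a real admin workstation; step 3 is destructive, so double-check the target volume first.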
  • by lymond01 ( 314120 ) on Monday September 21, 2009 @08:29PM (#29498605)

    Open Directory [apple.com]
    By centralizing information about users and network resources, directory services provide the infrastructure required for managing users, groups, and computers on your network. Directory services can benefit organizations with as few as 10 people and are essential for enterprise networks that have thousands of users. Deploying a directory server helps reduce administrative costs, improve security, and provide users with a more productive computing experience.

    Remote Desktop [apple.com]
    Apple Remote Desktop is the best way to manage the Mac computers on your network. Distribute software, provide real-time online help to end users, create detailed software and hardware reports, and automate routine management tasks -- all without leaving your desk. Featuring Automator actions, Remote Spotlight search, and a new Dashboard widget*, Apple Remote Desktop 3 makes your job easier than ever.

    * You'll notice Open Directory has no Dashboard widget. It's because it isn't uniquely Apple and therefore isn't polished to a blinding shine.

  • from experience (Score:5, Informative)

    by v1 ( 525388 ) on Monday September 21, 2009 @08:30PM (#29498615) Homepage Journal

    You're likely to get some laptops in addition to desktops. Get yourself a large room, a dozen or more FireWire cables, and some power strips. Before the machines arrive, use a MacBook Pro or MacBook (a laptop) to develop your base image. Install all the software that is going to be on most of the machines. Test thoroughly. Be sure all your remote access is tested (ARD/SSH).

    Use NetRestore to create the base image. When the computers arrive, copy the base image, along with the NetRestore app, to a group of laptops. The number varies depending on how many computers you are going to be imaging, the size of your base image, and how much help you have; 8-12 is typical if only one person is going to be restoring.

    The first thing you should do with machines out of the box is label them; have labels made up in advance. Then set them all up imaging over FireWire -- just get an assembly line going. You CAN do NetRestore over the network, but in my experience it's less reliable (machines randomly fail to restore; sometimes entire groups fail at an annoying 99%, etc.). FireWire is usually faster anyway, since your file server or switch is very unlikely to be able to keep up with imaging a dozen machines at once. FW800 imaging is an amazing thing.

    Once machines are imaged, there should be a folder of scripts sitting in each machine's local admin account, one for each group of machines. The script will prompt for a computer name and run; it renames the computer and deletes all the apps that should not be on that particular image. This can also be done by running the script remotely over Apple Remote Desktop. If you don't have ARD, *get it now* -- it will save you incredible amounts of time. This removal-script method adds only a few minutes per image, but you're doing them in parallel so it's negligible, and it saves you the major headache of managing a half dozen different base images.
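    A post-image script of the kind described here might look like the sketch below. The app list and default name are made up for illustration; scutil is the real macOS renaming tool, and the dry-run guard is an assumption so the sketch is safe to run.

```shell
#!/bin/sh
# Illustrative post-image rename-and-prune script. DRY_RUN=1 (default) prints.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

rename_mac() {
  # Set all three name records so the machine is consistent on the network
  run sudo scutil --set ComputerName "$1"
  run sudo scutil --set LocalHostName "$1"
  run sudo scutil --set HostName "$1"
}

prune_apps() {
  # Apps on the full base image that this group of machines shouldn't have
  for app in "/Applications/Final Cut Express.app" "/Applications/iWeb.app"; do
    run sudo rm -rf "$app"
  done
}

newname=${1:-lab-mac-01}   # in practice the script would prompt for this
rename_mac "$newname"
prune_apps
```

    One copy of this per machine group keeps you on a single base image instead of a half dozen.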

    As long as you made the image on a laptop, it should have full hardware support for the camera etc. Different images are required for PPC, but fortunately that's not a headache you have to worry about. (I did, PAIN)

    Boot camp adds a level of complexity, requiring you to partition the hard drives before restoring to them, and then using something like Ghost or Acronis. One person can image between 40-80 machines in 8 hrs depending on how things go. Helps to have grunts to do the minor things like unpacking and delivery to stations. Find some carts so you can move machines several at a time. Inform the cleaning staff that you're going to have a mountain of packing material to dispose of. Keep 1 box for every 20 machines in case you need to box them up to send to a repair shop down the road.

    If you insist on using netrestore over the network, be sure you have multicast enabled on the switches. It doesn't like crossing subnets but can be made to work.

  • by scottdmontreal ( 1003416 ) on Monday September 21, 2009 @08:32PM (#29498625)
    DeployStudio looks fantastic with its multicasting capabilities, but the System Image Utility in Leopard Server is just so trusty that I have a hard time looking at anything else. http://www.deploystudio.com/Home.html [deploystudio.com] You don't hear much about Leopard Server, but it is by far the most promising aspect of the platform and the key to any large-scale OS X network. I am a one-man shop for 400 users; I'm sure that with a staff of three it could scale way up.
  • Re:radmind (Score:2, Informative)

    by limako ( 45118 ) on Monday September 21, 2009 @08:36PM (#29498669) Homepage Journal

    A previous poster argued that you have to choose between unix-ey freeware and pricey, pointy-clicky commercial software, but radmind actually bridges that gap nicely. It is a free set of unix command-line utilities with several GUI applications that can tie it together on the client and server sides -- if you like that sort of thing. In my implementation, we use perl scripts to do most of the heavy lifting. Moreover, it's relatively easy to give end users more or less control over the rest of the system: you want a lab computer? Radmind can do that. You want a user's workstation? Radmind can do that too.

    Radmind is effectively a tripwire: it builds transcripts about what has changed on the system and can either capture those changes as a package or apply changes to restore (or setup) a system to a known state.

    The only downside of radmind is that to use it effectively, you really need in-depth knowledge about the MacOS. In order to build transcripts, you need to know which of the changed things are meaningful and which are noise. You need to understand how packages have the potential to create dependencies and conflict with one another -- and to make sure the packages get applied in the right order.
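    The capture-and-apply loop described above uses radmind's actual tool names (fsdiff, lcreate, ktcheck, lapply); the loadset name below is hypothetical, and the flags are a sketch to check against the man pages rather than a drop-in script.

```shell
#!/bin/sh
# Radmind capture/apply cycle, as a dry-run sketch (DRY_RUN=1 only prints).
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

LOADSET="firefox.T"   # hypothetical loadset/transcript name

# 1. Install the app by hand, then record what changed on disk
run fsdiff -C -c sha1 -o "$LOADSET" /
# 2. Review/trim the transcript, then upload it to the radmind server
run lcreate -c sha1 "$LOADSET"
# 3. On each client: fetch updated command files and apply the difference
run ktcheck -c sha1
run fsdiff -A -c sha1 -o apply.T /
run lapply -c sha1 apply.T
```

    The trimming step in 2 is where the OS knowledge comes in: you decide which changed files are the app and which are noise.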

  • you know... (Score:3, Informative)

    by buddyglass ( 925859 ) on Monday September 21, 2009 @08:47PM (#29498757)
    If your installation is big enough, you could probably get some good advice from...an Apple technical sales rep.
  • by Anonymous Coward on Monday September 21, 2009 @08:47PM (#29498767)
    Sorry, I am not a winblows fanboi, but if you actually believe Macs are built for very large enterprises, it is you taking fanaticism to a new level. Linux, sure, but Macs are a hodgepodge of half-arsed solutions that can be bound together with twine to work in an enterprise. As someone who supports them daily in an enterprise, I can say without any doubt that Macs are NOT built with the enterprise in mind.
  • by Toe, The ( 545098 ) on Monday September 21, 2009 @08:48PM (#29498777)

    The above are good resources, but also check out the OS X Server list [apple.com]. It is a good, geeky community of people actively building and working on OS X Server networks.

  • by itzdandy ( 183397 ) on Monday September 21, 2009 @08:57PM (#29498859) Homepage

    I would argue this.

    Linux may be less preferred for a standalone desktop, mainly because of the Windows apps consumers like to clutter their computers with. Linux excels in large deployments of standardized desktops.

    Simply put, Linux workstations are easy to set up against LDAP with NFS home directories. You can tighten the desktops up to limit apps, and use Terminal Server and RDP for necessary Windows apps. You can run specific applications on centralized servers and access them via remote X sessions on the local LAN, or over the internet through compressed ssh tunnels. Got a really heavy app that only 10 users need? Buy one high-end workstation instead of 10. LDAP carries usernames and permissions across the network, and DNS keeps every server easy to maintain, because a DNS change lets you quickly relocate services.

    Consider that Linux is easily installed over the network, can be installed in a reliable software RAID environment, and is very, very stable when users don't have root access to the box to install software and tweak the system.

    You can run your workstations off flash keys, or net-boot them if you like. With LTSP you can even use old hardware and net-boot it.

    You can load-balance remote apps easily; LDAP handles authentication and NFS handles preferences, so your users don't even care which server they are using. To them, blender.domain.local is all they know, even though that is just the load balancer.
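    The remote-X pattern reduces to a single ssh invocation; the user and host names below are the hypothetical ones from this comment.

```shell
#!/bin/sh
# -X forwards X11 back to the workstation; -C compresses the tunnel, which
# matters over the internet. The host name points at the load balancer.
REMOTE="user@blender.domain.local"
APP="blender"

CMD="ssh -XC $REMOTE $APP"
echo "$CMD"   # run from any LDAP-authenticated workstation on the LAN
```

    Because LDAP and NFS travel with the user, the same one-liner works from any workstation on the network.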

    The real shortcoming of open source is that making everything fit a Windows environment is difficult, because Windows is a moving target that actively evades OSS.

  • Bombich Software (Score:3, Informative)

    by SammyIAm ( 1348279 ) on Monday September 21, 2009 @09:00PM (#29498889)

    I worked at a school district for some time with a significant Mac deployment. We used Mike Bombich's software [bombich.com] extensively, and especially for deployment, his NetBoot utility.

    It does take a little bit of configuration on the server side to start, but it looks like some other posters have already linked to tutorials for setting that up. MB has a utility to create a net-bootable image that can be used to image a machine with your choice of disk images (we had different images for different architectures and different software packages), or it can be automated to pick an image automatically.

    His NetBoot software also has the ability to run a shell script to complete configuration settings that may need to be done on a per-machine basis (setting the computer network name for example).

    For running updates and modifying settings after the initial imaging, Apple's Remote Desktop is actually very useful. Although the feature set is limited, it DOES allow the execution of shell commands from the Remote Desktop interface, which makes upgrading or changing settings on a large number of machines fairly easy.
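    For sites without ARD, the same mass "send a shell command" idea can be approximated over plain ssh. The host list below is made up; softwareupdate is Apple's real updater CLI.

```shell
#!/bin/sh
# Fan a shell command out to a list of Macs (dry run: prints instead of ssh'ing).
HOSTS="lab-01 lab-02 lab-03"    # hypothetical machine names
CMD="softwareupdate -i -a"      # e.g. install all pending Apple updates

fanout() {
  for h in $HOSTS; do
    # Real version: ssh -o BatchMode=yes "root@$h" "$CMD"
    echo "would run on $h: $CMD"
  done
}

fanout
```

    Swapping the echo for the commented ssh line (with key-based auth set up) gives you the same effect as ARD's command window, minus the GUI.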

  • Re:Macs (Score:2, Informative)

    by PC and Sony Fanboy ( 1248258 ) on Monday September 21, 2009 @09:01PM (#29498897) Journal

    Ship it back to Apple while they deny that it's a manufacturing defect, but agree to repair it out of the goodness of their heart.

    That's what the rest of the world does.

  • Radmind (Score:4, Informative)

    by fitterhappier ( 468051 ) on Monday September 21, 2009 @09:14PM (#29498989)

    I managed a deployment of roughly 800 Macs across the campus of a large university using Radmind [radmind.org]. I've also managed the campus Linux, Solaris and OpenBSD kerberos servers, web servers and file servers with the same software. Radmind's learning curve is a little steeper at first, but it's one of the most flexible deployment options out there once you get the hang of it.

    Radmind's not really a competitor with tools like NetRestore. When used correctly, NetRestore is great for total reimaging of deployed hardware: nothing beats a block-copy installation for speed. Where NetRestore falls down is when dealing with deployment entropy. After imaging, the machine is in an unknown state ("post-image"), and the only way to be sure all machines are in the same state is to blow away the entire disk and reimage, usually at a cost of gigabytes of bandwidth per machine.

    This is where Radmind excels. It's basically a tripwire with software deployment and roll back, all based on the differences between what should be installed and what's actually on the disk. The core utility, fsdiff, looks at all files and directories designated as managed by the administrator and generates a list of differences. You can capture those changes as a loadset and upload them to the Radmind server for deployment to other machines, or you can undo any changes detected by fsdiff and restore the client to a known good state.

    The great thing about this method of management is that there's minimal bandwidth used. If fsdiff detects no changes on the filesystem, there's no reason to download anything: your system is in a known good state. On the other hand, it makes deploying Apple's system and security updates pretty damn easy. Grab the updater from Apple's website, install, and run the Radmind tools to capture the changes. Store the changes on the server, add the new loadset to your machines' profile (command file), and let your clients pull down the changes.

    The Radmind community is very helpful. Most questions to the mailing list (hosted on SF.net, Google groups mirror here [google.com]) are answered very quickly, and people are eager to share details about local setups and scripting solutions. A typical setup for a Radmind-managed Mac OS X client usually involves a few possible methods for initiating updates, most of which involve iHook [ihook.org] as the UI:

    1. Check for updates on Radmind server during logout, update client if found.
    2. Run a nightly tripwire regardless of updates from server.
    3. Run a Radmind update during boot if a special flag file is found on the disk.
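    Trigger 1 above can be wired up through the loginwindow logout hook. The update script's body and path are a sketch; `defaults write com.apple.loginwindow LogoutHook` is the real mechanism on Mac OS X of this era.

```shell
#!/bin/sh
# Write a minimal radmind update script, then register it as the logout hook.
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

HOOK=/tmp/radmind-update   # hypothetical path; /usr/local/bin in practice

cat > "$HOOK" <<'EOF'
#!/bin/sh
# Fetch new command files, diff the disk against them, apply the changes.
ktcheck -c sha1
fsdiff -A -c sha1 -o /tmp/apply.T / && lapply -c sha1 /tmp/apply.T
EOF
chmod +x "$HOOK"

# loginwindow runs the hook as root every time a user logs out
run sudo defaults write com.apple.loginwindow LogoutHook "$HOOK"
```

    Wrapping the hook in iHook, as described above, gives users a progress UI while the update runs.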

    Since we relied on students to help run our labs, we also deployed a special, unprivileged local user account, whom the students could log in as. This also triggered a Radmind update. And of course you can trigger updates over ssh (which works well in combination with something like pdsh).

    We combined Radmind with NetBoot for rapid, consistent deployments. Once the hardware was in place and on the network, we netbooted, used ASR to install a minimum and relatively recent system, and let Radmind bring everything up to date, including per-host license files and location specific software.

    Radmind's not perfect. It manages at the file level. If you want something to manage, say, config files on a line-by-line basis, Radmind isn't going to fit the bill (yet). Generally speaking, though, Radmind manages Mac OS X with ease. Once you've got Radmind managing your Macs, you'll find you have a lot of extra time to do interesting things instead of troubleshooting problems brought on by stale deployments.

    The Radmind wiki [umich.edu] is a decent place to start looking. Good luck.

  • by firstnevyn ( 97192 ) on Monday September 21, 2009 @09:19PM (#29499047)

    With puppet [reductivelabs.com] of course.

  • by anagama ( 611277 ) <obamaisaneocon@nothingchanged.org> on Monday September 21, 2009 @09:27PM (#29499103) Homepage

    Email is easy enough to offer but shared address books and calendaring may give Exchange the edge.

    Darwin Calendar Server. Open Source, free, runs on Linux. I thought I read in the mailing list that address book sharing is coming, though I can't be positive on that. Still, makes a great calendar server and it works with Thunderbird, though Thunderbird is not an awesome calendar client. Some howtos for installation: http://dcswiki.org/ [dcswiki.org]

  • Comment removed (Score:3, Informative)

    by account_deleted ( 4530225 ) on Monday September 21, 2009 @09:28PM (#29499119)
    Comment removed based on user account deletion
  • Re:Macs (Score:2, Informative)

    by TyIzaeL ( 1203354 ) on Monday September 21, 2009 @09:37PM (#29499175)

    Rule 35: If no porn is found at the moment, it will be made.

    Source [tinypic.com]

  • by Anonymous Coward on Monday September 21, 2009 @09:37PM (#29499183)

    There is a HUGE difference between a large-scale deployment of individual machines that may participate at some level in a domain environment and a large-scale deployment of machines that are COMPLETELY managed, scanned, updated, patched, installed, backed up, and configured from a single place. I'm talking about a hardware guy setting up a brand new machine and doing nothing but plugging it into the network and walking away. The next morning, the new secretary or student has a fully usable, fully installed machine with all of the apps at a very specific version, plus the customizations and functions they need to perform their work, up to and including the background, the startup music, the power settings, and the icons on the desktop, depending on what department they are in.

    Yes, anyone can preinstall a word processor on a machine, but can you have the correct custom toolbars, integration with your company's document management and purchasing systems, and the required tools for deploying to the company's portal system? Take that example 20 times over for all of your software. For large businesses, a SINGLE common interface that can be deployed and updated seamlessly in the background saves huge amounts of money in time, training, and productivity. If a tech has to make a trip to more than two people's computers to install or update a piece of software, you are not doing things in the most efficient manner.

    We have over 2000 desktops in 5 countries, completely managed by three people in a single office. The number of people maintaining them is based on the amount of updates and software we manage, not the number of desktops deployed. Those three people could manage 10 or 10,000 desktops with about the same amount of work.

  • by Anonymous Coward on Monday September 21, 2009 @09:46PM (#29499237)

    Mod parent up. Radmind [umich.edu] is the only way to deploy a managed Mac OS environment.

  • by thegreatemu ( 1457577 ) on Monday September 21, 2009 @10:08PM (#29499407)

    I second that one wholeheartedly. The GUI admin tool, which is billed as "any average Joe can run a network" (which is how I got stuck with it, with no training), is completely inadequate if you're doing anything non-trivial, but it thinks it knows better than you and clobbers any edits you make to the config files.

    Also, the DHCP and NAT fail tremendously. I told the server to serve DHCP and provide NAT to the subnet so that my cluster would have one forward-facing IP address. This worked great until someone unplugged the LAN cable, leaving the WAN as the only living connection. Since I had NAT on, OS X Server decided I must really want it and had just made a mistake about which side I wanted it on. So it happily started serving DHCP requests on the wider network, at least until OIT hunted it down and screamed at me.

    "It just works," my ass.

  • by Culture20 ( 968837 ) on Monday September 21, 2009 @10:46PM (#29499699)

    You don't; you use the many available tools to script what you want across all the machines. This is the same thing you do when you realize that Group Policy only covers a couple of things, and for everything else you are on your own.

    I admin both 'doze and 'nix, and although what you say about AD is true, you're not completely correct. AD is so handy to create GPOs with batch files to apply to machines automagically when they are thrown in an OU. Sure, you can always add computer names/IPs to a config file for automated scripts in cfengine, but AD is easy for subordinates to deal with.

  • by admiralex ( 1410225 ) on Tuesday September 22, 2009 @12:28AM (#29500369)

    I do this for the federal government, after coming from a university environment where I grew up with the Mac from the bad ol' days of the late '90s through Apple's phoenix-like rise from those ashes into the titan it is now. Truth be told, not much has changed.

    For mass deployments, I'm about to look into Casper, but NOTHING I've seen or heard about beats netboot/netrestore -- and mind you, I live and breathe Mac. I use PCs to manage Remedy tickets, and that's it. The ability to create a master image, upload it to a server, restart a machine with the n key pressed and have it image itself was and is nothing short of magical, and it's the deployment solution I'm moving toward for the portion of the Treasury Department network I control (if I die, money will cease to be printed). Unless Casper can top that, netinstall + n is still my deployment solution of choice, and one that the folks where I used to work are still trying to replicate three years later. There's nothing faster or more foolproof.
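    The "restart with the N key pressed" step can also be scripted for unattended reimaging. bless(8) with --netboot is the real firmware knob; the broadcast bsdp URL just means "find any NetBoot server on the subnet", and the dry-run guard is an assumption for safety.

```shell
#!/bin/sh
# Point the firmware at a NetBoot server and reboot (dry run prints only).
DRY_RUN=${DRY_RUN:-1}
run() { if [ "$DRY_RUN" = "1" ]; then echo "would run: $*"; else "$@"; fi; }

# bsdp://255.255.255.255 = discover whatever NetBoot/NetInstall server answers
run sudo bless --netboot --server bsdp://255.255.255.255
run sudo shutdown -r now
```

    Pushed over ARD or ssh, this reimages a whole lab without anyone touching a keyboard.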

    Prototyping is just as easy. I deal with everything from banknote designers (pull out a bill. Isn't it pretty? My designers drew all that stuff on their Macs) to executive management, and though they use their machines differently, they all have the same baseline needs -- a rock solid configuration that's hardened to IT Security's exacting (if evolving) standards, and Office to handle collaboration. My base image is a hardened installation of Leopard with fully-patched Office. That's standard across all machines.

    This base image is what I run in regular user mode on my personal production machine so I will know firsthand exactly what the user experiences from day to day. I customize the default user environment on the standard image to suit _my_ tastes and allow the users to tweak and refine that environment as they see fit. I learned years ago that this is the best approach for standardizing a user's desktop because I know how to work around the various quirks of OS X that can become annoying after using it for an extended period of time, and they usually haven't been on Macs long enough to have figured these things out.

    The more experienced of my newest users typically bristle at this since to a person they always think their approach/way of configuring the Finder/desktop is THE way to have their machines work, but I usually don't hear a peep from them after a week or two of working in my environment. The biggest compliment to me is when I cease to get trouble tickets from my bitchiest users because they find that I've already anticipated and addressed their most obvious complaints in the standard image.

    On top of the standard image, I install applications specific to the machine's role. The designers, for instance, get Adobe CS 4 and additional design-focused applications such as Quark and a font manager. My video people get Final Cut Studio. My engravers get the same package as the designers. My method of choice for deploying to these disparate groups lately has been to install the specialized applications on the standard image and create secondary images applicable to specific groups. Banknote design machines, for example, have their own special image and the video production machines have an image all their own. This simplifies things mightily because all I have to know when I want to deploy a new workstation (or repair a broken one) is where it's going. Oh, this is a replacement banknote machine? Put the banknote image on it. Copy the _user folder_ -- and nothing else -- from the old machine, create an account on the new machine, point it at the old user folder, and voila. Completely new hardware, and the user has no idea anything's changed. I've upgraded users from Tiger-running G5s to Leopard-running 8 core Mac Pros, and the only difference they noticed was the machine was "a lot faster." And the Apple menu's a different color. That's the power of Mac OS X.
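    As a rough sketch, that user-folder migration can be done from the shell. The account name, UID, and paths below are hypothetical, and on a directory-bound machine you would create the account in the directory service rather than locally with dscl; run as root:

```sh
# Copy only the user folder from the old machine's disk
# (-E on Apple's bundled rsync preserves extended attributes/resource forks).
rsync -aE /Volumes/OldMac/Users/jdoe/ /Users/jdoe/

# Create a matching local account and point it at the copied folder.
dscl . -create /Users/jdoe
dscl . -create /Users/jdoe UserShell /bin/bash
dscl . -create /Users/jdoe RealName "Jane Doe"
dscl . -create /Users/jdoe UniqueID 505
dscl . -create /Users/jdoe PrimaryGroupID 20
dscl . -create /Users/jdoe NFSHomeDirectory /Users/jdoe
dscl . -passwd /Users/jdoe 'changeme'

# Make sure ownership matches the new account.
chown -R jdoe:staff /Users/jdoe
```

    The UID should match whatever the site's numbering scheme dictates; if it differs from the old machine's UID, the chown at the end is what keeps the "user notices nothing" illusion intact.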

    Security, as I'm sure you well know, is not an issue on the Mac, but given the sensitivity of what my users do, I

  • by Kaedrin ( 709478 ) on Tuesday September 22, 2009 @12:36AM (#29500427)

    So here you go. Far too much conceptual information about a process I suspect almost no one here knows beyond the few that already mentioned it. Enjoy.

    So the best I can do is tell you how I do it for about 400 Macs, and the tools I use. I basically use two OS-X 10.6 servers that host NetBoot images and Radmind, and then Apple Remote Desktop (ARD) on a client to control events occurring on all the clients, whether they are booted locally or NetBooted.

    I'll also be up front: if you are not computer savvy, and don't want to be, do not touch Radmind with the idea of using it to deploy anything beyond software to an already existing deployment. Stick with an image-based package. If, however, you are computer savvy, can get around a command line, and need to support an unlimited number of *nix machines, especially in a lab, Radmind is an incredibly strong tool.

    I solely use Radmind for both OS deployment and software updates because it's a delta-based package and tripwire system which you don't need to rebuild over time unless an administrator makes horrible mistakes without a backup. If I really needed an image, I would have Radmind generate that build for me and then use 10.5/10.6's NetBoot/NetInstall creation tool on the results.

    I do not use NetRestore, NetInstall, or any other deployment tools for OS-X. It is a waste of time to constantly rebuild and maintain various images over time versus a delta-based deployment system, especially when I'm the only one supporting the image. It may take *slightly* longer to deploy than a sector-based image, but the long-term effort demanded of the administrator significantly decreases. Sure, learning Radmind might take a whole lot of time and effort, but the more randomly and variously configured the machines you need to support are, the more attractive it becomes to spend that time learning to use it beyond a software package deployment tool. Heck, the right people behind it could probably support thousands of *nix servers without much of any effort.

    You can also reverse the use of Radmind over time to maintain just software packages by making a negative transcript targeting just ".". If you do that, and make sure clients don't see the overall OS-level packages, you can update software without touching the OS at its core.
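    For reference, a radmind command file along those lines might look something like this. The transcript names are made up for illustration; `n` marks a negative transcript (contents ignored, only metadata enforced) and `p` a positive one, and in radmind later entries take precedence over earlier ones:

```
# command.K -- later transcripts take precedence over earlier ones
n negative.T          # negative transcript covering "."
p office2008.T
p adobe-cs4.T
p site-printers.T
```

    And negative.T itself can be a single directory entry negating the whole filesystem:

```
# negative.T
d .    0755    0    0
```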

    So radmind has a set of tools that come with it, and I'm only going to mention the most critical of them. One (fsdiff) scans a computer for changes. Two other apps (lcreate and lapply) take that scan and either use it to upload data to a server, or use the server's knowledge to 'cause' changes on the client. Another (ktcheck) downloads the command lists from the server, and those command lists have knowledge of all the "package" transcripts that actually define almost every file on the computer. Using them all in combination, in scripts, by someone who knows how to manipulate the results, is what makes Radmind powerful.
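    A typical client update cycle with those tools looks roughly like this. It must run as root on the client; the sha1 checksum and the transcript path are choices for this sketch, not requirements:

```sh
# 1. Fetch the latest command file and transcripts from the radmind server.
ktcheck -c sha1

# 2. Compare the local filesystem to the loaded transcripts; with -A the
#    output is an "applicable" transcript describing what must change.
fsdiff -A -c sha1 -o /tmp/applicable.T .

# 3. Apply those changes, downloading any needed files from the server.
lapply -c sha1 /tmp/applicable.T
```

    Capturing a new package is roughly the reverse: run fsdiff with -C to produce a "creatable" transcript of what an install changed, then upload it to the server with lcreate.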

    Up front there are negatives and positives about Radmind:
    Negatives:

    It can be very complicated.

    A lot of the documentation is poor, though it's better today than when I started using it.

    Simple mistakes in a transcript can suddenly prevent the client-side app from functioning. Discovering why can sometimes be very difficult. (especially if it's a nested command file level issue that only gives you "Input/Output error" when lapply crashes.)

    It only supports network compression, which frankly is worthless. No file-based compression during capture.

    Almost any error in a delta file will break the process of updating/deploying machines. It really requires that you have someone learn it inside and out.

    The default method of deploying images to massive numbers of machines that may need different builds is unwieldy. There are ways around some of this.

    The GUI console in OS-X once you have several hundred transcripts is annoying to use, and creating and using subfolders for transcripts or command files will seriously screw your deployment life up.

    It has no GUI on anything except OS

  • Re:from experience (Score:2, Informative)

    by Kaedrin ( 709478 ) on Tuesday September 22, 2009 @01:48AM (#29500829)

    For Mac deployment, I script the disk partitioning with the terminal version of diskutil, making the Windows partition the exact same size on all machines and having diskutil mark it as MS-DOS. I then use Bombich's OS-X compilation of NTFS-Progs v1 to capture and deploy both Windows 7 and Vista images to the Macs while OS-X is in use. Students using the computers at the time don't even realize it's happening. NTFS-Progs v2 requires Darwin Ports; I don't believe anyone has made a truly native build of v2.

    It doesn't have multicast, but you can re-deploy Windows while students are using OS-X during a class. For me, students can only screw up a Windows push if they reboot a machine while I'm doing it; then I start over. I can also do it all while the machines are NetBooted, sending the imaging commands over SSH/ARD. I never have to visit them directly.

    NTFS-Progs is also open source.

    Using my method, though, you do have to use "dd" to capture and deploy the Windows boot sector located on what is my /dev/disk0 while the computer is either NetBooted or booted from a FireWire drive. I also make my "MS-DOS" partition disk0s2 on a GPT disk while OS-X uses disk0s3. It's more important that the Windows partition be identical on all machines this way than the OS-X partition, so it's just easier to plan on it being the first available partition. The side effect is that if anyone launches Boot Camp in OS-X as an administrator and tells it to get rid of the Windows partition, it will actually, immediately, get rid of the OS-X partition, even if you're booted from it. That doesn't affect me, though, as I strip Boot Camp off my OS-X deployment image. Very few people could launch it even if I didn't.

    The terminal version of diskutil is, I believe, in 10.4.7 and above, though it may have been released with 10.4.8.
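    Pieced together, the partition/capture side of that approach looks roughly like the following. Disk identifiers, the 40G size, and file names are examples only; ntfsclone is the NTFS-Progs tool doing the imaging, and on a GPT disk slice 1 is the EFI partition, which is why Windows lands on disk0s2:

```sh
# Partition the disk: GPT, fixed-size MS-DOS partition first (disk0s2),
# then OS-X (disk0s3). WARNING: this destroys the disk's contents.
diskutil partitionDisk disk0 2 GPT \
    "MS-DOS FAT32" WINDOWS 40G \
    "Journaled HFS+" "Macintosh HD" R

# Save the boot sector while NetBooted or booted from FireWire.
dd if=/dev/disk0 of=bootsector.bin bs=512 count=1

# Capture the Windows partition with ntfsclone ...
ntfsclone --save-image --output win7.img /dev/disk0s2

# ... and push it back out later, even while OS-X is running.
ntfsclone --restore-image --overwrite /dev/disk0s2 win7.img
dd if=bootsector.bin of=/dev/disk0 bs=512 count=1
```

    The dd targets follow the parent comment's description of keeping the boot sector on /dev/disk0; the exact sectors you need to save depend on how your hybrid MBR is set up.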

  • by bertok ( 226922 ) on Tuesday September 22, 2009 @08:14AM (#29502371)

    Not even that. OpenLDAP supports user-defined schemas. Active Directory doesn't. You have to go out and buy something if you don't like the stock set. Kerberos and one or more LDAP servers come standard with all the major Linux distros.

    100% wrong: AD does allow schema customizations, using a simple command-line tool. Many applications do exactly this, not just Microsoft software. Developers steer clear of it because a forest-wide schema change terrifies most PHBs, but it's actually rather trivial if you need it. Microsoft does request that if you sell boxed software that makes schema extensions, you should register your schema OIDs with them to prevent conflicts, but that's not enforced or anything.

    Oh look.. it's even documented for you:
    LDIF Scripts
    http://msdn.microsoft.com/en-us/library/ms677268%28VS.85%29.aspx [microsoft.com]

    What I especially like about AD is that once you've extended your schema (say by adding a few attributes to the User class), you can then write a management console add-in that adds an extra tab to the User property dialog box. Nifty.
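    A minimal sketch of such an extension: an LDIF file adding one string attribute, imported with ldifde. The attribute name, OID, and domain here are placeholders; a real deployment needs its own registered OID and Schema Admins rights:

```
dn: CN=contoso-BadgeNumber,CN=Schema,CN=Configuration,DC=X
changetype: add
objectClass: attributeSchema
lDAPDisplayName: contosoBadgeNumber
attributeID: 1.2.840.113556.1.8000.2554.999.1
attributeSyntax: 2.5.5.12
oMSyntax: 64
isSingleValued: TRUE
```

```sh
# -c substitutes the placeholder "DC=X" with the real forest root DN.
ldifde -i -f badge.ldf -c "DC=X" "DC=contoso,DC=com"
```

    Attaching the new attribute to the User class is a second LDIF change adding it to the class's mayContain, followed by a schema cache refresh (schemaUpdateNow) -- and then the property-page add-in the parent describes can surface it in the console.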

  • by Anonymous Coward on Tuesday September 22, 2009 @08:36AM (#29502491)
    Please tell this to everyone else I know with a mac and a smug attitude for having spent way too much money for their computer.
  • by RulerOf ( 975607 ) on Wednesday September 23, 2009 @01:20PM (#29517999)
    Yes, yes, I know.

    I was referring to the functionality you see in RDP, where any client edition of the OS can connect to any box running Terminal Services (XP Pro and all Server Eds.) without licensing more crap.

    I may have misstated the licensing terms, but I firmly believe they're bullshit enough that such doesn't matter.

So you think that money is the root of all evil. Have you ever asked what is the root of money? -- Ayn Rand
