Ask Slashdot: Moving From *nix To Windows Automation? 427

Zubinix writes "I have a background in automation in a Unix/Linux environment using scripting languages such as Perl and Bash, as well as SSH for remote scripting. My next project will be in the Windows environment, so what approach and methodology is best for developing, say, the automation required for a test system? I don't want to use things like Cygwin, as I need to integrate with Windows applications such as Exchange and SharePoint. Is there a list of dos and don'ts when it comes to Windows automation?"
This discussion has been archived. No new comments can be posted.

  • Tons of options (Score:4, Interesting)

    by Skuld-Chan ( 302449 ) on Friday May 06, 2011 @07:04PM (#36052866)

    There are plenty of tools to automate things in Windows out of the box - the Windows Script Host (which supports JScript/VBScript), PowerShell, and WMI - or a combination of them. You can open a WMI interface from VBScript, for instance.

  • by Alanbly ( 1433229 ) <alanbly@gmail.com> on Friday May 06, 2011 @07:04PM (#36052872) Homepage
    When I was working in Software Quality Assurance we had a lot of luck with Mercury Quick Test Pro and Test Batch Runner. They have a solid recording interface that can also be coded manually (in VBScript, but what can you do?). It integrated fabulously with C#/.NET code for doing black-box and grey-box testing. I'd also suggest Symantec Ghost for setting up test systems.
  • by gbrandt ( 113294 ) on Friday May 06, 2011 @07:10PM (#36052932)

    http://stackoverflow.com/ [stackoverflow.com] has experts that go there just to help with questions like this.

  • Re:a VM... (Score:5, Interesting)

    by JWSmythe ( 446288 ) <jwsmythe@jwsmythe.com> on Friday May 06, 2011 @07:16PM (#36052990) Homepage Journal

        Amazingly enough, that's something I'm doing right now. (No, I'm not the original author.) I smoothed over the rough spots as much as possible, and now we're doing the Linux migration. It's a long, slow process to change a decade of deep-rooted Windows-isms, but it will happen soon enough. The only Windows left will be the workstations for those who choose not to switch over to Linux. In a web-based enterprise, there's no excuse for being locked into any platform.

  • by onyxruby ( 118189 ) <onyxruby@comcast.net> on Friday May 06, 2011 @07:26PM (#36053078)

    There's nothing fundamentally that different from a management standpoint. The more you dive in, the more you're going to find that the differences boil down to syntax. You're going to need to test your scripts, you're going to need peer review, you're going to need change control, maintenance windows, and so on.

    There is absolutely no reason for your process to be any different on the Windows side than the Unix side, and if it is, then your process needs to be rebuilt. Process should always be tool-agnostic, and your operating system is simply a tool. If you know your best practices, then the only thing you need to figure out is the syntax (tool, language, etc.) to do what you want to do.

    If you're simply asking what tools you should use for scripting or deployment, then you need to look at tools like Altiris, SCCM and so on. If you're looking for tools for packaging applications, then you would look at tools like Wise Package Studio. What you really haven't done is explain your need very well. Are you trying to manipulate certain things about Exchange with a script? Are you trying to export or import SQL database data with your script?

    If you want to run a script on just a few systems, then you can schedule the server to run scripts manually, and you just need to supply the scripts. If you're looking to deploy something on a consistent basis, then a combination of GPOs and Active Directory can work quite well. If you're looking to automate the distribution of a series of scripts based on certain characteristics of your systems, then it would be hard to beat Altiris. If you want to run custom queries or actions, then WMI-based scripting can do some pretty neat things.

    Feel free to message me offline; where I work, I'm responsible for managing both Windows and Mac based systems and do this for a living on a daily basis.

  • Re:Don't do it... (Score:5, Interesting)

    by lucm ( 889690 ) on Friday May 06, 2011 @07:43PM (#36053210)

    I don't agree. PowerShell is actually very powerful, as it can extend or be extended by the .NET framework. It is also very flexible, which is convenient for systems automation.

    Big enterprise schedulers, such as Tidal, have built-in support for Powershell and many enterprise storage solutions, such as Compellent, also have built-in support. Also VMWare has the very impressive PowerCLI, which is basically a series of extensions for Powershell that can automate almost everything in VirtualCenter.

  • Re:Don't do it... (Score:5, Interesting)

    by oakgrove ( 845019 ) on Friday May 06, 2011 @08:58PM (#36053656)
    Those cli utilities are 30 plus years old. Any sysadmin worth his salt will know them inside and out. How is the syntax any more arcane than what is spit out by ps? And, er, last I checked the ease with which completely unrelated utilities can be chained is the point!
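
    To make the parent's point concrete, that chaining is exactly what makes the old utilities useful: unrelated tools glued together by nothing but a text stream. A minimal sketch (the "log" lines here are made up for illustration):

    ```shell
    # printf emits some sample log lines, grep filters out the errors,
    # and wc counts them -- none of the three knows about the others.
    printf 'INFO ok\nERROR disk\nINFO ok\nERROR net\n' | grep ERROR | wc -l
    ```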
  • Re:Don't do it... (Score:5, Interesting)

    by homesnatch ( 1089609 ) on Friday May 06, 2011 @09:59PM (#36053976)
    I'm a *nix guy, and I do have to say Powershell is pretty sweet.

    Here's an example of something extending PowerShell: VMWare released a snap-in that allows control of a VMWare environment from PowerShell.

    #Load the VMware PowerCLI snap-in
    Add-PSSnapin VMware.VimAutomation.Core

    #Create a VM from a template and then start it
    $myNewVMName = "NewVM_01"
    $myTemplate = Get-Template "TemplateName"
    $strDestinationHost = "ESX01"
    $myNewVM = New-VM -Name $myNewVMName -Template $myTemplate -VMHost (Get-VMHost $strDestinationHost)
    Start-VM -VM $myNewVM


    VMWare created a really awesome extension to PowerShell that allows for all sorts of inheritance and piping. Microsoft created a rather poorly implemented Active Directory extension for PowerShell (it can't pipe or inherit in places where I'd expect to be able to)... MS could have followed the VMWare example to make a better AD extension.
  • Re:Don't do it... (Score:5, Interesting)

    by bertok ( 226922 ) on Saturday May 07, 2011 @04:59AM (#36055234)

    I'm a programmer-slash-sysadmin, and I've taken a stab at writing command-line tools in C/C++ and .NET, and these days in PowerShell, so I might be able to answer your question.

    When I write complex software, I usually provide some command-line tools for the admins to do things like import data, export logs, or whatever. This is a royal pain in the ass, but I do it because it's useful. I always end up spending something like 80% of the time on annoying stuff like handling command-line parameters, input validation, and error messages. Of course, because whatever I come up with is new and different, I have to write doco for it, train staff, and so on.

    Now, with PowerShell, instead of having to write a whole new program to implement a command, I can simply extend a class from a framework library. The PS framework does almost everything automagically: handle the pipeline, input parameters, parameter validation, parameter sets, optional parameters, help, tab-complete, wildcard matching, output formatting, etc...

    To give you an idea, it is possible to write a PowerShell command in C# (.NET) with several parameters and complex output that does a useful task in about 50 lines of rather trivial code. The equivalent C program would be thousands of lines.

    More importantly, the resulting PowerShell command will be orthogonal, consistent, discoverable, and embeddable.

    By orthogonal, I mean things like: every PS command that handles wildcards does it with the same shared library, which is trivial to use. Hence, I can do things like Get-VM "prod_win_[a-k]*", which is a VMware snapin, and the behavior will be exactly the same as if I had called a Citrix XenApp snapin to list farm servers with Get-XAServer "prod_win_[a-k]*". Similarly, the output of commands is structured data. You can say Get-VM | Export-Csv or Get-XAServer | Export-Csv and get similar results. Compare with legacy command-line tools, which often implement such formatting internally, and badly. I just had to work on some Novell systems, for example, which export invalid XML files because the utility doesn't escape ampersands! Of course, our example 50-line command will get all of that. Export to CSV? You have it! Export to XML? Done! Quickly, tell me how to export CSV formatted data from the following 3 Linux command-line tools: 'ls', 'apt-get', and 'ps'.
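
    For contrast, here's roughly what hand-rolling CSV from one of those three looks like on a Linux box -- bespoke awk glue that would have to be rewritten from scratch for ls and apt-get (GNU ps flags and the column choice are just illustrative):

    ```shell
    # Turn ps output into CSV by hand: print a header row, then have awk
    # re-delimit the two whitespace-separated columns, quoting the command
    # name. (A real version would also have to escape embedded quotes.)
    ps -eo pid,comm --no-headers | awk '
      BEGIN { print "pid,command" }
      { printf "%s,\"%s\"\n", $1, $2 }'
    ```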

    By consistent, I mean that parameters are always specified with "-param", never "/param", or "--param", or "-abcdgh" where each letter does something that you can only determine by reading a five page document. Similarly, Microsoft has established a strict naming system for developers, so that instead of "retrieve-foo", "ask-foo", "query-foo", "request-foo", "list-foo", "foo-list", "fooenum", or god knows what else, the only acceptable standard is "get-foo". VMware has "Get-VM", Citrix has "Get-XAServer", Microsoft has "Get-Process", etc... no guessing! There are standards for command names, parameter names, and coding conventions. E.g.: Verb Naming Rules [microsoft.com], and Cmdlet Development Guidelines [microsoft.com].

    By discoverable, I mean that if I write a little 50-line utility with a bunch of parameters, an administrator at the command line can simply press 'tab' and have both the command and the parameter names automatically completed. The help is automatically generated from my 50 lines of code. GUI script wizards can load a bunch of metadata about each parameter to enable drag & drop script development.

    By embeddable, I mean that even a 50-line utility can be called not just from the shell, but from within a hosting application in the same process. Instead of having to handle text-based streams, PS commands take .NET objects and return .NET objects. There's no guesswork.
