That does not mean that we all have to become command line and PowerShell experts (although if you are an IT Professional: LEARN POWERSHELL!), it just means we need to install the Remote Server Administration Tools (RSAT) on a desktop computer and use (mostly) the same MMC consoles to manage our computers remotely.
While you can install the RSAT tools on both Windows Server and Windows 10/11, the PowerShell cmdlet to do so is different. Why? Because what is called a Windows Optional Feature in Windows Server is called a Windows Capability in the desktop OS.
But Mitch, why don’t you just add the RSAT tools via the GUI? That is an excellent question, and I am glad you asked. I service customers in both English and French. Conversationally, I am pretty fluent in French. Unfortunately my reading is not quite as good, and because so many server environments are installed in English (even when the desktops are in French), it is not something I have gotten used to. Fortunately, the PowerShell cmdlets are always in English, and it is easier for me.
Administer It!
I am not an advocate of people defaulting to running anything as Administrator unless absolutely necessary. To install the RSAT Tools, it is. Make sure you run PowerShell as an Admin.
Server Side
Before we install a particular RSAT tool, we need to know its name. We should also check whether it is already installed before we go installing it again. Let’s run the following cmdlet:
Get-WindowsOptionalFeature -FeatureName *RSAT* -Online | Select-Object FeatureName, State
Excellent. We see what is there and what is not. Now let’s install one… for no reason at all, let’s pick the Hyper-V Manager. We’ll run this cmdlet:
Enable-WindowsOptionalFeature -FeatureName RSAT-Hyper-V-Tools-Feature -Online
We see that after a few seconds it returns successfully, and that no restart is required.
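If you script your server builds, the check and the install can be combined into one idempotent step. This is a sketch, not a definitive recipe, using the feature name we looked up above:

```powershell
# Sketch: enable an RSAT feature only if it is not already enabled.
# The feature name is the one we found with Get-WindowsOptionalFeature above.
$name    = 'RSAT-Hyper-V-Tools-Feature'
$feature = Get-WindowsOptionalFeature -Online -FeatureName $name
if ($feature.State -ne 'Enabled') {
    Enable-WindowsOptionalFeature -Online -FeatureName $name -NoRestart
}
```

Run it twice and the second pass does nothing, which is exactly what you want in a build script.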
Client Side
On the Windows client we are going to do the same thing, but there are more steps to it.
1) We not only need the list of the Windows Capabilities, we also need to know the name of the installer. We’ll use the following cmdlet:
Get-WindowsCapability -Name *RSAT* -Online | Select-Object -Property DisplayName, Name
**Note: The Hyper-V tools are not listed because Hyper-V is part of the Windows client OS, and as such is still a Windows Optional Feature.
Notice that the Name (which is what we need to install the tool) is not the same as the DisplayName (the friendly name). Notice also that some of the lines end with an ellipsis because there is not enough room on the line (thanks to the AD DS and AD LDS tools). So once we know which tool we want to install, we will run the same cmdlet again, narrowed to the specific RSAT tool we want. For this article we will use Server Manager. Run the following cmdlet:
Get-WindowsCapability -Name *Server* -Online | Select-Object -Property DisplayName, Name
Okay… we can now run the following cmdlet to install it:
Add-WindowsCapability -Name Rsat.ServerManager.Tools~~~~0.0.1.0 -Online
We are good to go… except note that RestartNeeded is True, so we will need to reboot the computer before this console will work.
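If you would rather not guess at the result, the same capability can be queried again after the install. A quick sketch:

```powershell
# Sketch: confirm the capability state after the install.
$cap = Get-WindowsCapability -Online -Name 'Rsat.ServerManager.Tools~~~~0.0.1.0'
$cap.State   # should report Installed (the reboot is still needed for the console)
```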
Conclusion
Server Core is a great way to reduce the attack surface and patch footprint on our servers. You will also discover that by removing the GUI we can save as much as 10 GB of storage space per server. In a virtualized environment where every server is stored on an iSCSI SAN device, that can add up to a lot of space.
Server Core is also a great way to discourage people who do not need to be logging on to your servers, which means you will save even more by not automatically creating user profiles for each user.
Administering our servers using remote consoles is an efficient way to work without compromising those advantages, and without having to learn the PowerShell cmdlets for every server feature we administer.
Before going any further, let’s define a few terms you will need to understand:
Active Directory Domain Services: This is the good old on-premises AD that we have been using since the advent of Windows 2000 Server. It was renamed AD DS with Windows Server 2008, but it is the same AD, only evolved. It leverages Kerberos authentication and is controlled by the domain controllers that run the AD services.
Azure Active Directory: The cloud authentication service may share part of its name with ADDS, but it is quite different. For one, it is not a Kerberos system, rather it leverages OAuth and other modern protocols.
Rather than investing in new hardware, I opted to build my new domain controller in a Hyper-V environment. The configuration of that infrastructure is out of scope for this article.
I opted to start with Microsoft’s latest server operating system, Windows Server 20H2. This iteration does not include the graphical user interface (GUI, or Desktop Experience) that its predecessors do. Because of that, we will be relying entirely on PowerShell to build and configure our DC.
The Preliminaries!
I have installed the server OS, and am logged in to my new (clean) server. This is a lab environment in my home office that is not segregated from my regular devices, so it has gotten a DHCP address from my home router. The first thing I want to do is to change that to a static IP address.
I verify my current IP configuration using the Get-NetIPConfiguration cmdlet in PowerShell… essentially the modern version of ipconfig. I checked my existing environment and decided my IP address would be 10.0.0.2. I know that my subnet mask is 255.255.255.0, so my prefix length is 24. What I needed from this cmdlet was the Interface Index.
So to set my IP address, I will use the following cmdlet:
New-NetIPAddress -InterfaceIndex 4 -IPAddress 10.0.0.2 -PrefixLength 24 -DefaultGateway 10.0.0.1
That sets the IP Address, but clears the DNS Server information. I’ll fix that with the following:
Set-DnsClientServerAddress -InterfaceIndex 4 -ServerAddresses "10.0.0.1"
With that, my virtual machine is connected to the Internet again.
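For repeat lab builds, that whole networking step can be wrapped in a short script. This is a sketch using my lab values; the interface index, addresses, and gateway are examples you would replace with your own:

```powershell
# Sketch: static IP configuration for a lab machine (example values).
$ifIndex = 4
New-NetIPAddress -InterfaceIndex $ifIndex -IPAddress 10.0.0.2 `
    -PrefixLength 24 -DefaultGateway 10.0.0.1
Set-DnsClientServerAddress -InterfaceIndex $ifIndex -ServerAddresses '10.0.0.1'
Get-NetIPConfiguration -InterfaceIndex $ifIndex   # verify the result
```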
I don’t like the idea of having my domain controller named WIN-GQ35FV9 (or whatever random name Windows selected), so I’ll do a quick computer rename:
Rename-Computer MDG-DC
This won’t take hold until I reboot my system, so let’s do that now.
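The rename and the reboot can also be combined into a single step, since Rename-Computer has a -Restart switch:

```powershell
# Rename and restart in one step.
Rename-Computer -NewName MDG-DC -Restart
```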
**Note: Lab environments can be tricky when they are on your production network. If I were building a completely segregated lab, or a lab that did not need the Internet, I would install a DHCP server on this machine. As I am not, I will have to configure static IP addresses on all lab machines.
Let’s Do It!
Now that our networking is configured, we can move ahead with the domain creation.
The first step is to install the Active Directory Domain Services role, along with its PowerShell module and management tools. That’s simple enough, although the feature name has changed a few times over the years, so I want to make sure I install the right one:
Install-WindowsFeature AD-Domain-Services -IncludeManagementTools
It won’t take but a couple of minutes to download and install them.
Now let’s build my AD Forest:
Install-ADDSForest -DomainName <domain.name>
This will run for a few minutes, and when completed, you will be informed the computer needs to reboot.
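Run interactively, the cmdlet will prompt for a Directory Services Restore Mode (DSRM) password. For an unattended build you can supply everything up front; this is a sketch, and the domain name here is an example:

```powershell
# Sketch: unattended forest creation (the domain name is an example).
$dsrm = Read-Host -AsSecureString -Prompt 'DSRM password'
Install-ADDSForest -DomainName 'corp.example.com' `
    -SafeModeAdministratorPassword $dsrm `
    -InstallDns -Force
```

The -Force switch suppresses the confirmation prompts, which is what you want in a script and exactly what you don’t want when typing by hand.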
When I am prompted to log in, I now need to know my username (it will be Administrator) as well as my password.
So let’s go back into PowerShell, and make sure that everything worked.
Get-ADDomain | Format-List Name, DomainMode
Get-ADForest | Format-List Name, ForestMode
Get-Service ADWS,KDC,Netlogon,DNS
This will show us that the domain is properly configured, and that the necessary services are running.
That’s It?
Well, not quite… but that’s the scope of the article. To manage it, I am going to create a virtual machine running Windows 10 with the necessary Remote Server Administration Tools to manage my AD. Yes, you can do everything in PowerShell… but there are some things I still prefer to do in MMC consoles!
It is also big. There was a time (prior to System Center 2012) when you could pick and choose the components you wanted to buy – if you only wanted monitoring, then all you bought was System Center Operations Manager (SCOM). If all you wanted was virtualization management, then all you bought was System Center Virtual Machine Manager (SCVMM).
When Microsoft announced in 2012 that all of the pieces would now be sold as a single package, I thought it was a good decision for Microsoft, but not necessarily a good one for the customer. Certainly it would increase their market share for components such as System Center Data Protection Manager (DPM) – probably from a 0.1% market share to something somewhat higher – but that was not what customers wanted. I wanted a reasonably simple monitoring tool that could be deployed (and purchased) independent of everything else; I could then use the backup tool that I want, the deployment tools that I want, the anti-malware tools that I want.
So when I got an e-mail from a representative of SolarWinds asking if I would try out their product (Server & Application Monitor, or SAM), I decided to give it a try. After all, I knew SolarWinds by reputation, and due to the non-invasive nature of the tool I could easily deploy it alongside my existing SCOM environment and monitor the same servers without risk.
The Good
The first thing I noticed about SolarWinds was the ease with which it installed. Compared to SCOM – which was a bit of an ordeal even to install (see article) – it was a simple install: it did not take long, and was pretty straightforward.
While the terminology was a little different than SCOM’s, it was easy to understand the differences, and I suspect it would be pretty easy for a junior sysadmin to pick up. At the top of the Main Settings & Administration page the first option is Discovery Central, which allows SAM to search your entire environment for servers.
The Alerts & Reports option helps you set up your mail account that sends alert & notification e-mails to the admins based on the current environment and issues. It is just as easy to send these e-mails to individuals as to groups, and configuring what is sent to whom is relatively simple.
Fortunately SAM is completely Active Directory integrated, so I can just authorize my Domain Admins and other groups to access what data they need in SAM, and to grant individuals and groups granular permissions to see and/or change what they are allowed to.
The dashboard is easy to read and understand, as well as customize. I want my graphs to be at the top, and I want to know anything critical up front. As with any good monitoring tool, Green=Good, Red=Bad. All of my alerts are hyper-linked so if I see something Red I can just click and go right to it.
Actions, not words… ‘If this happens, then do that’ is a requirement in this day and age. Of course, it is great if my monitoring tool can notify me that a service is down… but how much better if it can bring it back up for me at the same time. That can be as simple or complicated as you need, but the fact that certain conditions can trigger actions and not just alerts is key for me. This was a simple task in SAM.
Of course it is important to realize that some system admins will not be as comfortable learning a tool this powerful on their own, and the fact that SolarWinds offers scores of free training resources is key. The Customer Portal has more than just videos; they offer live classes and expert sessions with their engineers and experts which you can attend live or watch later. They have on-demand recordings of everything you might want to learn. Their Virtual Classroom is an amazing resource for customers who need help – whether that is learning a simple tidbit in a few minutes, or going from zero to hero over the course of a few days.
My initial impression of SolarWinds SAM was that it would be a great tool for smaller businesses; that impression changed drastically, and reasonably quickly. Yes, I installed SAM in one of my 100-server environments in Q3 2015, and it performed brilliantly. However, as I learned about it and got to know the product I was convinced it was definitely enterprise-class, and by the end of Q1 2016 I also had it installed at a client with 19,000 users and thousands of servers.
The Bad and the Ugly…
There is really only one aspect of SolarWinds that irked me, and that is the licensing model. With some monitoring tools if you have 200 servers you know you need 200 licenses. With SolarWinds a single server may require 100 licenses, depending on what you are monitoring. That is not to say that SAM will be more expensive than other tools… it is just a different way of looking at the calculations that I needed to wrap my head around. A small thing to be sure, but it is certainly an issue for me.
Conclusion
I was offered a trial period with SAM to try it out in my environment, and when that trial period ended I decided to renew. SolarWinds has a great tool here, but more important to me is the support that I have been able to get from the company, which has extended beyond simple ‘how do I…’ questions. Their engineers have gotten on-line with me to help solve a couple of custom issues that arose, and they were happy to do it.
The product offering is a home run for system admins who want a monitoring and reporting tool and do not want to break the bank… or change out all of their other management tools to drink Microsoft’s Kool-Aid.
Small environment or large, SolarWinds is worth it. Contact them at www.solarwinds.com for more information, and a demo of their offerings!
In a secure, well-managed IT environment we monitor to make sure that things are working the way they are supposed to. When we spin up a new server, for example, the proper agents are installed for anti-malware and monitoring without our lifting a finger. Tuesday evening a new server is spun up, Wednesday morning it is already letting us know how well it is running.
But what about the other environments? Many smaller environments do not have automated deployment infrastructures that make sure every new server is built to spec. What do we do for those?
The answer is simple… where automation is lacking we have to be more vigilant in our processes. When a new server (virtual or otherwise) is created, we not only install an operating system… we also make sure we add the monitoring agent and the anti-virus agent, and that we schedule proper backups – because if we don’t, it will all be for naught when everything goes down.
So the answer is to make my environment completely automated, right?
Well, yes of course it is… in an ideal world. In the real world there are plenty of reasons why we wouldn’t automate everything. The cost of such systems might outweigh the benefits, for example… or maybe we do not have an IT Pro managing it, just the office computer guy. Ideally we would get that guy trained and certified in all of the latest and greatest… but if you work in small business you know that might not always be the reality.
So what IS the answer?
Simple. I have a friend who has made a fortune telling people around the world how to make checklists. I am not the guru that Karl is, and you don’t have to be either. But if you do have a manual environment, spend the time to make a checklist for how you build out systems – make one for servers, one for desktops, and probably one for any specific type of server. You don’t have to do it from memory… the next time you build a machine write down (or type!) every step you take. 1) Create virtual machine. 2) Customize virtual machine. 3) Install operating system… and so on. When you are satisfied that your system is built the way you want it (every time) then you should try it again… but rather than using what you know, follow the checklist.
These checklists, I should mention, should not be written in stone. There are ten rules that were so written, and that’s enough. Thou shalt not murder is pretty unambiguous. Thou shalt install Windows 8.1 may change when you decide to upgrade to Windows 10. So make sure that every time you use the checklist you do so with a critical eye, trying to see if there is a way to improve upon the process. The Japanese word for this is Kaizen. They are pretty good at a lot of things, from what I have seen.
True story: I gave this advice to a colleague once who thought it was great. He started creating checklists, and had his employees and contractors follow them. One day he invited me for a drink and told me a funny story. His client had been using System Center Operations Manager (SCOM) to monitor all of their servers. He had a checklist that included installing the SCOM agent in all servers. One day the client decided to switch from SCOM to SolarWinds (a great product!) and after several weeks he decommissioned his SCOM infrastructure. Six months later the client (a pretty big small business) complained that since they switched from SCOM to SW all of their new servers kept reporting a weird error. It seems that the IT Pro who was following the checklists had continued installing the SCOM Agent into their servers, and since it could not find a SCOM server to report to, it was returning an error. As I said, these checklists should be living documents, and not set in stone.
Conclusion
There is no one right or wrong answer for every environment. What is a perfect inexpensive solution for one company might be cost prohibitive for another. The only thing you have to do is use your mind, keep learning, use common sense, and keep reading The World According to Mitch!
I posted on Friday that it was my last day working full time at Yakidoo. I really enjoyed my time there, and am glad that my next venture will allow me to stay on there on a limited basis.
This afternoon I am meeting a colleague at the airport in Seattle, and that will begin my first day at my new gig. I will talk more about it in a few weeks, even though today will be my first billable day. That is what’s Next.
However, the reason he and I will be in Seattle – Bellevue/Redmond actually – is the Airlift for Windows Server, System Center (WSSC), and Windows Azure vNext… the next generation of datacenter and cloud technologies that Microsoft is ‘showing off’ to select Enterprise customers several months prior to launching them. It will be a week of deep-dive learning, combined with the usual Microsoft Marketing machine. How do I know? It’s not my first kick at the can.
It is, of course, not my first such Airlift. The first one I attended was for System Center Configuration Manager (SCCM) 2007, back in November of that year. A consulting firm had sent me, in advance of my heading off to Asia to teach it. I have since been to a couple of others, each time either as a consultant, a Microsoft MVP, or a Virtual Technology Evangelist for Microsoft. I have not given this a lot of thought, but this will be my first Airlift / pre-launch event that I am attending as a customer. It will be interesting to see if and how they treat me differently.
I suspect that the versions of WSSC that I will learn about this week will be the first that I will not be involved in presenting or evangelizing in any way dating back to Windows Server 2003. I will not be creating content, I will not be working the Launch Events, and I will not be touring across Canada presenting the dog and pony show for Microsoft. I will not be invited by the MVP Program to tour the user groups presenting Hyper-V, System Center, or Small or Essential Business Servers. I will not be fronting for Microsoft showing off what is new, or glossing over what is wrong, or explaining business reasons behind technology decisions. It is, in its way, a liberating feeling. It is also a bit sad.
Don’t get me wrong… I will still be blogging about it. Just because Microsoft does not want me in their MVP program does not mean that I will be betraying my readers, or the communities that I have helped to support over the years. I will be writing about the technologies I learn about over the next week (I do not yet know if there will be an NDA or publication embargo) but at some point you will read about it here. I will also, if invited, be glad to present to user groups and other community organizations… even if it will not be on behalf of (or sponsored by) Microsoft. I was awarded the MVP because I was passionate about those things and helping communities… it was not the other way around.
What else can I say? I am at the airport in Toronto, and my next article will be from one of my favourite cities in North America… see you in Seattle!
When the server rebooted for no discernible reason last week, we were concerned. When it didn’t come up again, and did not present any hard drives… we realized we had a problem.
I was relieved to discover that it was still under warranty from Lenovo, with NBD on-site support. I called them, and after the regular questions they determined that there might be a problem with the system board. They dispatched one to me along with a technician for the next morning. Their on-site service is still done by IBM, and in my career I have never met an unprofessional IBM technician. These guys were no exception. They were very professional and very nice. Unfortunately they weren’t able to resolve the problem.
Okay, in their defense, here is what everyone (including me) expected to happen:
1) Replace the system board.
2) Plug in all of the devices (including the hard drives).
3) Boot it up, and during the POST get a message like ‘Foreign drive configuration detected. Would you like to import the configuration?’
4) We answer YES, the configuration rebuilds, and Windows boots up.
Needless to say, this is NOT what happened. Why? Let’s start with the fact that low-end on-board RAID controllers apparently suck. Is it possible that a procedure was not properly followed? I am not sure, and I am not judging. I know that I watched most of what they did, and did not see them do anything that I felt was overtly wrong.
The techs spent six hours on-site, a lot of that spent in consultation with the second level support engineer at Lenovo, who had the unenviable task of telling me, at the end of the effort, that all was lost, and I would have to restore everything from our backup.
I should mention at this point that we did have a backup… but because of maintenance we were doing to that system over the December holidays the most recent successful backup was twelve days old.
Crap.
Okay, we’ll go ahead and do it. In the meantime, the client and I went to rebuild the RAID configuration. We decided that although we were going to bolster the server – including a new RAID controller – we were going to try to rebuild the array configuration exactly as it had been, and see what happened.
Let me be clear… even the Lenovo Engineer agreed that this was a futile effort, that there was no way that this was going to work. Of course it would work as a new array, we just weren’t going to recover anything. I agreed… but we tried it anyways.
…and the server booted into Windows.
To say that we were relieved would be an understatement. We got it back up and running exactly as it had been, with zero data loss. We were not going to leave it this way of course… I spent the next day migrating data into new shares on redundant virtual servers. But nothing was lost, and we all learned something.
I want to thank Jeff from Lenovo, as well as Luke and Brett from IBM who did their best to help. Even though we ended up resolving it on our own (and that credit goes mostly to my client), they still did everything they could to make it right.
So my client has a new system board in their server, and hopefully with a new RAID controller, some more memory, and an extra CPU this server can enjoy a new and long, productive life as a vSphere host in the cluster.
…But I swear to you, I will never let a customer settle for on-board ‘LSI Software RAID Mega-RAID’ type devices again!
Happy week-end.
When I was a Microsoft MVP, and then when I was a Virtual Technical Evangelist with Microsoft Canada, you might remember my tweeting the countdown to #EndOfDaysXP. While we had some pushback from people who were not going to migrate, I think we were all thrilled by the positive response and the overwhelming success we had in getting people migrated onto either Windows 8, or at least Windows 7. We did this not only by tweeting, but also with blog articles and in-person events (including a number of national tours) helping people understand a) the benefits of the modern operating system, and b) how to plan for and implement a deployment solution that would facilitate the transition. All of us who were on the team during those days – Pierre, Anthony, Damir, Ruth, and I – were thrilled by your response.
Shortly after I left Microsoft Canada, I started hearing from people that I should begin a countdown to #EndOfDaysW2K3. Of course, Windows Server 2003 was over a decade old, and while it would outlast Windows XP, support for that hugely popular platform would end on July 14th, 2015 (I have long wondered if it was a coincidence that it would end on Bastille Day). Depending on when you read this article it might be different, but as of right now the countdown is around 197 days. You can keep track yourself by checking out the website here.
It should be said that with Windows 7 there was an #EndOfDaysXP Countdown Gadget for the desktop, and when I migrated to Windows 8 I used a third party app that sat in my Start Menu. One friend suggested I create a PowerShell script, but that was not necessary. I don’t remember exactly which countdown timer I used, but it would work just as well for Windows Server 2003 – just enter the date you are counting down to, and it tells you every day how much time is left.
The point is, while I think that migrating off of Server 2003 is important, it was not at that point (nor is it now) an endeavour that I wanted to take on. To put things in perspective, I was nearing the end of a 1,400 day countdown during which I tweeted almost every day. I was no longer an Evangelist, and I was burnt out.
Despite what you may have heard, I am still happy to help the Evangelism Team at Microsoft Canada (although I think they go by a different name now). So when I got an e-mail on the subject from Pierre Roman, I felt it important enough to share with you. As such, here is the gist of that e-mail:
1) On July 14, 2015 support for Windows Server 2003 will come to an end. It is vital that companies be aware of this, as there are serious dangers inherent in running unsupported platforms in the datacenter, especially in production. As of that date there will be no more support and no more security updates.
2) The CanITPro team has written (or re-posted) several articles that will help you understand how to migrate off your legacy servers onto a modern Server OS platform, including:
3) The Microsoft Virtual Academy (www.microsoftvirtualacademy.com) also has great educational resources to help you modernize your infrastructure and prepare for Windows Server 2003 End of Support, including:
4) Independent researchers have come to the same conclusion (IDC Whitepaper: Why You Should Get Current).
5) Even though time is running out, the Evangelism team is there to help you. You can e-mail them at [email protected] if you have any questions or concerns surrounding Windows Server 2003 End of Support.
Of course, these are all from them. If you want my help, just reach out to me and if I can, I will be glad to help!
(Of course, as I am no longer with Microsoft or a Microsoft MVP, there might be a cost associated with engaging me.)
Good luck, and all the best in 2015!
You know, I always thought that there were some things that were so blatantly obvious that you just didn’t have to say anything. I was reminded today that I was wrong about that. So: For those of you who may ever be asked to clean a Server Room: NO PRESSURE HOSES.
That’s all.
Server Core was a new way to deploy Windows Server. It was not a different license or a different SKU, or even different media. You simply had the option during the installation of clicking ‘Server Core’ which would install the Server OS without the GUI. It was simply a command prompt with, at the time, a few roles that could be installed in Core.
While Server Core would certainly save some resources, it was not really practical in Windows Server 2008, or at least not for a lot of applications. There was no .NET, no IIS, and a bunch of other really important services could not be installed on Server Core.
Fast forward to Windows Server 2012 (and R2) and it is a completely different story. Server Core is a fully capable server OS, and with regard to resources the savings are huge. So when chatting recently with the owner of a cloud services provider (with hundreds of physical and thousands of virtual servers), I asked what percentage of his servers were running Server Core, and he answered ‘Zero’. I could not believe my ears.
The cloud provider is a major Microsoft partner in his country, and is on the leading edge (if not the bleeding edge) on every Microsoft technology. They recently acquired another datacentre that was a VMware vCloud installation, and have embarked on a major project to convert all of those hosts to Hyper-V through System Center 2012. So why not Server Core?
The answer is simple… When Microsoft introduced Server Core in 2008 they tried it out, and recognizing its limitations decided that it would not be a viable solution for them. It had nothing to do with the command line… the company scripts and automates everything in ways that make them one of the most efficient datacentres I have ever seen. They simply had not had the cycles to re-test Server Core in Server 2012 R2 yet.
We sat down and did the math. The graphical user interface (GUI) in Windows Server 2012 takes about 300 MB of RAM – a piddling amount when you consider the power of today’s servers. However, in a cloud datacentre such as this one, in which every host contained 200-300 virtual machines running Windows Server, that 300 MB of RAM added up quickly – a host with two hundred virtual machines required nearly 60 GB of RAM just for GUIs. If we assume that the company was not going to go out and buy more RAM for its servers simply for the GUI, it meant that, on average, a host comfortably running 200 virtual machines with the GUI would easily run 230 virtual machines on Server Core.
In layman’s terms, the math in the previous paragraph means that the datacentre capacity could increase by fifteen percent by converting all of his VMs to Server Core. If the provider has 300 hosts running 200 VMs each (60,000 VMs), then an increased workload of 15% translates to 9,000 more VMs. With the full GUI that translates to forty-five more hosts (let’s conservatively say $10,000 each), or an investment of nearly half a million dollars. Of course that is before you consider all of the ancillary costs – real estate, electricity, cooling, licensing, etc… Server Core can save all of that.
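The arithmetic in the two paragraphs above is easy to work through for yourself; this sketch just restates the article’s own figures as variables, so you can plug in your own numbers:

```powershell
# The figures from this article, worked through.
$guiMB      = 300       # RAM consumed by the GUI in each VM
$vmsPerHost = 200
$hostCount  = 300
$hostCost   = 10000     # conservative cost per host

$gbPerHost    = ($guiMB * $vmsPerHost) / 1024    # ~58.6 GB reclaimed per host
$extraVMs     = $vmsPerHost * $hostCount * 0.15  # 9,000 additional VMs of capacity
$hostsAvoided = $extraVMs / $vmsPerHost          # 45 hosts not purchased
$capexAvoided = $hostsAvoided * $hostCost        # $450,000 before ancillary costs
```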
Now here’s the real kicker: had we seen this improvement in Windows Server 2008, converting servers from the GUI to Server Core would still have carried a very significant cost… a re-install was required. With Windows Server 2012, Server Core is a feature – or rather the GUI itself is a feature that can be added or removed from the OS – and only a single reboot is required. While the reboot may be disruptive, if managed properly the disruption will be minimal, with immense cost savings.
If you have a few servers to uninstall the GUI from then the Server Manager is the easy way to do it. However if you have thousands or tens of thousands of VMs to remove it from, then you want to script it. As usual PowerShell provides the easiest way to do this… the cmdlet would be:
Uninstall-WindowsFeature Server-Gui-Shell -Restart
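To push that same cmdlet out to many servers at once, you can wrap it in Invoke-Command. This is a sketch that assumes PowerShell remoting is enabled and that a file called servers.txt (a hypothetical name) holds one computer name per line:

```powershell
# Sketch: remove the GUI across many servers (assumes PS remoting is enabled;
# servers.txt is a hypothetical input file, one computer name per line).
$servers = Get-Content .\servers.txt
Invoke-Command -ComputerName $servers -ScriptBlock {
    Uninstall-WindowsFeature Server-Gui-Shell -Restart
}
```

Stagger the list if these are production hosts; every one of them will reboot.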
There is also a happy medium between the GUI and Server Core called MinShell… you can read about it here. However remember that in your virtualized environment you will be doing a lot more remote management of your servers, and there is a reason I call MinShell ‘the training wheels for Server Core.’
There’s a lot of money to be saved, and the effort is not significant. Go ahead and try it… you won’t be disappointed!
It has been an incredible start to the Windows Server 2012 R2 Launch Series. Here is a brief summary of what we have covered so far…
Keep plugged in to the series to continue learning about Windows Server 2012 R2.
On Saturday I spent the day with the Vancouver Technology Users Group (VANTug). We spent the morning talking Windows 8 and Office 365, and then in the afternoon we discussed System Center 2012 and Microsoft’s Private Cloud solutions. We had a great time at the Burnaby campus of BCIT. I always love coming out to Vancouver, and today was no different.
And yet I couldn’t get Calgary out of my mind. I know that a lot of people are scared, cold, wet, and hungry… and will have a very tough time rebuilding. I am sure that when the IT Pros of Southern Alberta do get back into their offices they will have discussions around disaster recovery, business continuity, and minimizing loss. Today, and through the middle of the week I expect most of them are with their families worrying about things much more important… their homes, their memories.
I showed up at BCIT with a Big Box o’ Swag full of prizes, and as is always the case at Install Fests I was asked early on if they were going to get licenses of Windows 8. They were not… but as luck would have it I had one license in my laptop case that I had received at an event a few weeks ago that I did not really need, so I told them I would raffle off that license at the end of the day.
When the raffle time came some fifteen people won mice, keyboards, and Xbox controllers. I then put all of the winning tickets back into the hat and was about to draw for the Windows 8 Pro license when I had a thought…
I had a one year subscription to Microsoft Office 365 Home Premium in my bag that I was supposed to give to a friend last week, but didn’t see them. As I stood at the front of the room I asked the group leader (Peter) if they support charities, and he said that they did. Normally they support the local children’s hospital, but for this I asked him to agree to support the Red Cross Alberta Floods Fund. I told the group that I would draw for a winner of the Windows 8 license, and if the winner was willing to donate $50 to the fund (through VanTug) then he or she would also receive the subscription for Office 365.
The winner agreed and is now the proud owner of two great products… but should be even prouder to be helping a very important cause that is near and dear to my heart, and one that should be important for all Canadians.
I received a comment on my blog that same morning in response to an article I wrote about the relationship between Quebec and the rest of Canada. The commenter said that we have nothing in common across this great land (obviously not his exact words). I disagree. I think we share a heart and a love of our fellow man that transcends the political views of one side or another of any political debate, most of which seem petty in the face of disasters that befall regions and peoples from time to time. I will respond to that comment in an article later this week, but in the meantime I hope my Quebec reader takes some food for thought from this one, and says a prayer or even donates a little to the people of Alberta… so distant, but so close to all of us.
I performed a Backup of all of my data. Nobody in their right mind would destroy an environment before they back up their data… especially if they are planning to actually delete the machines and start from scratch.
I performed a complete test-Restore of all of my data. Now that my Mail Server is completely cloud-based this was easier than it might have been – If I had Exchange, SQL, and SharePoint it would have been more complicated, but also more crucial. I always stress the importance of doing test-restores because the worst time to find out that your backup did not work is when you need to recover it. Make sure that it works before you are faced with real data loss.
Planning was actually relatively simple for me, because the main environment was going to look very similar to the lab environment I had recently built for my Private Cloud camp. I still had the planning documents for that, and I was able to follow them pretty closely for the first few machines. There was a time when I would have done the planning in my head, but now I make sure that I have all of my plans on paper before I go forward. As the old adage goes, measure twice, cut once. By having your thoughts on paper it is easier to stay on track… and if you do have to veer then you should document why you did.
Cleaning Up may not seem all that important, but destroying a cluster before destroying the domain is infinitely simpler than doing so afterwards. It is doable of course, but there are PowerShell commands such as Remove-ClusterResource -Force that you will get intimately familiar with if you do not think ahead.
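For a clean teardown, the rough order is to drain the cluster’s roles and then destroy the cluster itself while the domain is still intact. A hedged sketch – the cluster name here is a placeholder, and you should only ever run this against a lab you intend to destroy:

```powershell
Import-Module FailoverClusters

# Stop the clustered roles/resources before touching anything else
Get-ClusterGroup -Cluster "LabCluster" | Stop-ClusterGroup

# Destroy the cluster; -CleanupAD also removes the cluster's computer
# objects from Active Directory, which is why the domain must still exist
Remove-Cluster -Name "LabCluster" -Force -CleanupAD
```

Doing this in the reverse order (domain first) is what leaves you leaning on Remove-ClusterResource -Force to mop up orphaned resources.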
Make sure you have all of the installation Media at hand… either on physical DVD or in an ISO repository. This should not only include the obvious ones such as operating systems and applications, but also make sure you have the latest hardware drivers. By looking at my Plan I know that I will need the following media:
Additionally I would need several bits that I would simply download as one-offs… the Report Viewer, Silverlight, and things like that. However since my networking topology is already in place, I would be able to do that from within the virtual machines.
Now that I have everything ready to go, I am ready to move forward. Building an environment from scratch (green-field) would be simpler, but there are some aspects that prevented that. In your production environment (should you ever decide to start from near-scratch) you will have to run through the same sort of project plan. Make sure you think it out – do not simply sit down one morning and expect to implement in the afternoon; rather make sure you observe your environment for a few cycles and build your plan over time so that you don’t run into any surprises.
In my next piece I will go through the actual build architecture of how I decided to build my server infrastructure; I will also introduce some actual build videos of the System Center components. If there is something in particular that you would like to see please let me know by commenting! -M
NOTE: I have included all of the sub-roles and sub-features as well, except under the Remote Server Administration Tools, which would otherwise list a tool for every one of the roles and features.
Roles:
1. Active Directory Certificate Services
2. Active Directory Domain Services
3. Active Directory Federation Services
4. Active Directory Lightweight Directory Services
5. Active Directory Rights Management Services
6. Application Server
7. DHCP Server
8. DNS Server
9. Fax Server
10. File and Storage Services
a. File and iSCSI Services
i. File Server
ii. BranchCache for Network Files
iii. Data Deduplication
iv. DFS Namespaces
v. DFS Replication
vi. File Server Resource Manager
vii. File Server VSS Agent Services
viii. iSCSI Target Server
ix. iSCSI Target Storage Provider
x. Server for NFS
b. Storage Services
11. Hyper-V
12. Network Policy and Access Services
13. Print and Document Services
14. Remote Access
15. Remote Desktop Services
16. Volume Activation Services
17. Web Server (IIS)
Features:
1. .NET Framework 3.5 Features
a. .NET Framework 3.5 (includes .NET 2.0 and 3.0)
b. HTTP Activation
c. Non-HTTP Activation
2. .NET Framework 4.5 Features
a. .NET Framework 4.5
b. ASP.NET 4.5
c. WCF Services
i. HTTP Activation
ii. Message Queuing (MSMQ) Activation
iii. Named Pipe Activation
iv. TCP Activation
v. TCP Port Sharing
3. Background Intelligent Transfer Service (BITS)
a. IIS Server Extension
b. Compact Server
4. BitLocker Drive Encryption
5. BitLocker Network Unlock
6. BranchCache
7. Client for NFS
8. Data Center Bridging
9. Enhanced Storage
10. Failover Clustering
11. Group Policy Management
12. Ink and Handwriting Services
13. Internet Printing Client
14. IP Address Management (IPAM) Server
15. iSNS Server Service
16. LPR Port Monitor
17. Management OData IIS Extension
18. Media Foundation
19. Message Queuing
a. Message Queuing Services
b. Message Queuing DCOM Proxy
20. Multipath I/O
21. Network Load Balancing
22. Peer Name Resolution Protocol
23. Quality Windows Audio Video Experience
24. RAS Connection Manager Administration Kit (CMAK)
25. Remote Assistance
26. Remote Differential Compression
27. Remote Server Administration Tools
28. RPC over HTTP Proxy
29. Simple TCP/IP Services
30. SMTP Server
31. SNMP Server
a. SNMP WMI Provider
32. Subsystem for UNIX-based Applications (Deprecated)
33. Telnet Client
34. Telnet Server
35. TFTP Client
36. User Interfaces and Infrastructure
a. Graphical Management Tools and Infrastructure
b. Desktop Experience
c. Server Graphical Shell
37. Windows Biometric Framework
38. Windows Feedback Forwarder
39. Windows Identity Foundation 3.5
40. Windows Internal Database
41. Windows PowerShell
a. Windows PowerShell 3.0
b. Windows PowerShell 2.0
c. Windows PowerShell ISE
d. Windows PowerShell Web Access
42. Windows Process Activation Service
a. Process Model
b. .NET Environment 3.5
c. Configuration APIs
43. Windows Search Service
44. Windows Server Backup
45. Windows Server Migration Tools
46. Windows Standards-Based Storage Management
47. Windows System Resource Manager (Deprecated)
48. Windows TIFF IFilter
49. WinRM IIS Extension
50. WINS Server
51. Wireless LAN Service
52. WoW64 Support
53. XPS Viewer
Now: adding roles and features in Windows Server 2012 is easier than it was previously. You can either use the Add Roles and Features Wizard (see my article and video here), or you can use Windows PowerShell (which is the preferred way to do it) with the cmdlet Install-WindowsFeature. Even though there is a distinction between Roles and Features, the cmdlet to install them is the same for both.
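For example, to see what is already installed and then add a role or feature along with its management console, you might run something like the following (the feature names here are illustrative, taken from the list above):

```powershell
# List everything currently installed on the local server
Get-WindowsFeature | Where-Object { $_.Installed } | Select-Object Name, DisplayName

# Install a role (here, DHCP Server) together with its RSAT console
Install-WindowsFeature -Name DHCP -IncludeManagementTools

# Features use exactly the same cmdlet - e.g. Failover Clustering
Install-WindowsFeature -Name Failover-Clustering -IncludeManagementTools
```

Note that Get-WindowsFeature shows you the short Name you need to pass to Install-WindowsFeature, which rarely matches the DisplayName in the numbered list above word for word.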
Now go forth and serve, my fellow IT Pros!
While all of that is true, to say that virtualization is the only benefit to Windows Server 2012 is doing it a disservice. Don’t get me wrong, Hyper-V officially rocks; but if virtualization were the only benefit to the new Server, couldn’t companies simply deploy the new version on their host hardware, and leave their virtual machines running Windows Server 2008 R2?
Going forward when someone asks me what is new and exciting in Windows Server, I am going to start with the improvements to Hyper-V… but then we can go into the real meat of the product, and see where it takes us. Improvements such as:
Storage Spaces (or Storage Pools), which I have equated to software RAID after ten generations of improvement. With Storage Spaces you can build your volume from multiple disks of equal or disparate size, on similar or disparate architecture. Imagine having three SAS disks of 450GB, 146GB, and 72GB combined into a single volume of 668GB… or a 146GB SAS disk, a 500GB SATA disk, and a 2TB USB disk combined into a volume of roughly 2.6TB. Add to that the ability to hot-add drives on the fly (in a recent demo I added two disks in under 30 seconds), and to have your volume protected by Mirroring or Parity. All of this is built into Windows Server 2012, and we have written about it extensively. Try it for yourself by following my article here.
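A rough PowerShell sketch of building such a pool – the friendly names are placeholders, and the disks must be blank and unpartitioned to be poolable, so run this in a lab first:

```powershell
# Find disks that are eligible for pooling (blank, un-partitioned)
$disks = Get-PhysicalDisk -CanPool $true

# Create the pool from those disks
New-StoragePool -FriendlyName "DataPool" `
    -StorageSubSystemFriendlyName "Storage Spaces*" `
    -PhysicalDisks $disks

# Carve a mirrored virtual disk out of the pool, using all available space
New-VirtualDisk -StoragePoolFriendlyName "DataPool" -FriendlyName "Data" `
    -ResiliencySettingName Mirror -UseMaximumSize
```

From there the new virtual disk shows up like any other disk, ready to be initialized and formatted.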
Data Deduplication is built into the operating system. What was previously a tool that storage-conscious companies would pay third-party vendors thousands of dollars for is now a check box away when creating your volume. Once it is enabled on a volume you can use either the GUI tool or, if you are efficient, Windows PowerShell to schedule your dedup or to run the job immediately, on either your local or remote systems.
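The PowerShell route looks roughly like this – the drive letter is a placeholder, and deduplication only applies to data volumes, not the system volume:

```powershell
# Add the deduplication feature, then enable it on a data volume
Install-WindowsFeature -Name FS-Data-Deduplication

Enable-DedupVolume -Volume "E:"

# Run an optimization job now rather than waiting for the schedule
Start-DedupJob -Volume "E:" -Type Optimization

# Check the space savings once the job has run
Get-DedupStatus -Volume "E:"
```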
Software iSCSI Target was exclusively a feature of Microsoft Storage Server until April of 2010, when Microsoft released it as a fully supported free download. Now integrated into Server 2012, it gives you the ability to create a software SAN device on your server with much of the functionality of most hardware SANs, but at a fraction of the cost. While it will still not replace hardware SAN devices in large organizations, it brings that functionality to smaller businesses without the budget for the extra hardware. Couple this feature with Storage Spaces and Data Dedup and you have yourself a real ballgame! To get started check out our article here.
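A hedged sketch of standing up a software target – the target name, path, size, and initiator IQN are all placeholders, and parameter names vary slightly between Windows Server versions:

```powershell
# Add the iSCSI Target Server role service
Install-WindowsFeature -Name FS-iSCSITarget-Server

# Create a target and restrict it to one initiator (the IQN is illustrative)
New-IscsiServerTarget -TargetName "LabTarget" `
    -InitiatorIds "IQN:iqn.1991-05.com.microsoft:client1.contoso.com"

# Back the target with a 40GB virtual disk file, then map it to the target
New-IscsiVirtualDisk -Path "C:\iSCSIVirtualDisks\LUN1.vhd" -Size 40GB
Add-IscsiVirtualDiskTargetMapping -TargetName "LabTarget" -Path "C:\iSCSIVirtualDisks\LUN1.vhd"
```

The initiator then connects with the ordinary iSCSI Initiator tool, and the LUN appears as a local disk.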
MinShell is the new ‘compromise’ step between the full GUI Server installation and the Server Core installation. It allows you to have a sort of ‘safety net’ of the GUI management tools, without actually having the Windows GUI environment installed. You will save tons of resources across your virtualized environment because you no longer need the GUI on hundreds of virtual machines, as we wrote about here.
Server Manager was introduced in Windows Server 2003 R2 with all of the ho-hum yawning that it deserved. Okay, a lot of our tasks were brought into one app, but that was about it. That is why the modern Server Manager in Server 2012 blew me away: true multi-server management, Dashboard functionality that gives the administrator a bird’s-eye view of the health of all of his or her systems, and the ability to manage… well, everything from one console. Install roles and features on your local or remote servers with the same ease. Manage multiple servers from the same console – add them by simply right-clicking the All Servers context, and without any more work all of the services running on those remote servers are instantly added to your Dashboard. I recorded a video of some of the great functionality in Server Manager for our blog here.
PowerShell 3.0 is the breakout version of this already incredible scripting environment, with nearly ten times as many cmdlets available out of the box as before. Add to that the Integrated Scripting Environment (ISE) and you have a powerful scripting environment that is even easier to learn and use than before!
Active Directory Administration Center is a new all-encompassing tool for Active Directory management. No longer will admins have to open one of several different consoles depending on what they wanted to do, the ADAC is it… plain and simple!
Active Directory Recycle Bin was introduced in Windows Server 2008 R2, and is now even easier to use. Enable it in the ADAC (remember that once enabled it cannot be disabled). To learn how to enable it read our article here, and to use it to restore an object we have another article here.
Windows PowerShell History Viewer records the underlying Windows PowerShell commands when action is taken in the Active Directory Administrative Center so that the admin can copy and reuse the scripts. This is also a great way for admins to start learning PowerShell!
Cloning and Snapshotting Domain Controllers, along with DCs that are fully aware of virtualization, mean that we no longer need to maintain a physical domain controller in our fully virtualized (or cloud-based) organization. I can rapidly deploy new domain controllers (either in an existing or new domain), and quickly and easily restore business continuity during disaster recovery. I can rapidly provision test environments and quickly meet increased capacity needs in branch offices. Our virtualized domain controllers will detect snapshot restoration and non-authoritatively synchronize the delta of changes for Active Directory and the SYSVOL, making DC virtualization safer.
Fine-Grained Password Policies in Active Directory allow me to have better security for my infrastructure by making it easier to give users with no access to sensitive information more lenient password policies, while enforcing stricter policies for users with more access and for service accounts. While everyone will still have to have password awareness, this should see a marked decrease in Post-It Note Security Violations.
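Creating and scoping such a policy is a few lines of PowerShell. A sketch – the policy name, values, and group name are illustrative, and the policy applies to users and global security groups, not OUs:

```powershell
Import-Module ActiveDirectory

# A stricter policy for service accounts (all values here are illustrative)
New-ADFineGrainedPasswordPolicy -Name "ServiceAccountsPSO" -Precedence 10 `
    -MinPasswordLength 20 -ComplexityEnabled $true `
    -MaxPasswordAge "180.00:00:00" -LockoutThreshold 5

# Apply it to a group rather than to individual users
Add-ADFineGrainedPasswordPolicySubject -Identity "ServiceAccountsPSO" `
    -Subjects "Service Accounts"
```

Lower Precedence values win when a user falls under more than one policy, so leave gaps in your numbering for policies you add later.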
Dynamic Access Control is a new way of securing your information, whether on file shares, in SharePoint Document Libraries, or even in e-mail. It works with Rights Management Server using Central Access Policies to verify who is accessing what information from where (what device). The expression-based access policies determine before decrypting the content that both the user and the device are trusted. If you have highly sensitive information that should only be accessed on corporately managed devices this is going to be a great new security feature available to you!
DirectAccess was introduced in the 2008 era with a plethora of complex requirements and prerequisites. In 2009 Rodney Buike wrote a great explanation of DirectAccess on our blog, which can be read here. In Server 2012 it is so much simpler to plan for, deploy, and use. Anthony Bartolo recently wrote an article about what it is, what it needs, and what it does, and you can read that article here.
…and the list just keeps going and going. I urge you to download the evaluation software and try it out by clicking on the appropriate link:
In addition to downloading the software and reading our articles, you could have a chance at winning your lab computer by participating in the free Microsoft Virtual Academy. For a chance to win an HP EliteBook Revolve and two chances to win 400 Microsoft Points, enter here. Complete two TechNet evaluations, and take the selected Microsoft Virtual Academy courses for your chance at a $5,000 grand prize!
On July 5th I posted an article in this space called A Response to VMware’s ‘Get the Facts’ page comparing vSphere to Hyper-V and System Center. In the four weeks since the article was published it has become the fourth most-read article on my blog (I have 437 articles publicly posted, dating back several years… the statistics cited are since the re-launch of The World According to Mitch in November of 2010). It is certainly the most discussed and commented on.
The first comment, from a manager at VMware, says that we should compare what is in the market TODAY with what’s in the market TODAY, and since vSphere 5.1 was (and remains) in a private beta, we should not discuss Windows Server 2012.
Today Microsoft is releasing to manufacturing (RTM) Windows Server 2012, and while VMware is still the number one virtualization vendor in the market with regard to market share, it has a lot to worry about with today’s release.
I have been saying for several years that when Microsoft puts its mind to something (as well as its considerable financial and intellectual resources) you should never bet against them. In February of 2008 they released Hyper-V, and two years later they released Hyper-V 2008 R2. The former was decent, but (as VMware enthusiasts were quick to point out) lacked a lot of the features that enterprise IT departments needed. The second release did a good job of adding many of those features, and with Service Pack 1 came even more features.
I have been a Hyper-V evangelist for a little over two years now, and I have seen the writing on the wall. Even with Hyper-V 2.1 (2008 R2 SP1) Microsoft offered most of the features and functionality that businesses needed and wanted, but at a fraction of the cost.
Today, with the launch of Hyper-V 3.0, the circle is now complete. The technological advantages of VMware have evaporated in the momentum of progress that Microsoft has made to Windows Server, Hyper-V, and System Center.
Over the course of the coming weeks and months you will be reading a lot about Hyper-V, both from myself and others. If you are Canadian you might want to come out to an IT Pro Boot Camp offered by Microsoft Canada. Even if you are not, I encourage you to download the preview and try it. Play with it, and when you read about tricks in blog articles try them yourself. It will not take long for you to realize that it is not just hype surrounding Windows Server 2012, it is substance, it is momentum, and it is a new era of server capabilities, without having to pay a fortune for the privilege.
Welcome to Server 2012 my friends… It is going to be an exciting one!
Microsoft Windows Small Business Server, a product that launched a thousand (and more) careers in IT, that has a loyal and vocal following that has blossomed since the very early days of Windows NT, is in its final iteration as we know it. (See the official announcement on the Windows SBS blog: Windows Small Business Server Essentials becomes Windows Server 2012 Essentials)
I couldn’t even begin to tell you when I first started writing about SBS – it was on newsgroups way before I started blogging. I do remember when I first heard about it. You would think that it would have been in one of the many MOC courses I had taken on Windows Server, Active Directory, and so on but it wasn’t. I was actually in a job interview for a company that would eventually hire me called Poppy Industries. Fred Blauer – a consultant the company used – asked me how I would configure the infrastructure for a given organization, and I told him that it would require six servers – a domain controller (two if they were smart), a mail server, a database server, a SharePoint/web server, a firewall, and a file server. Fred said to me ‘Would you consider using Small Business Server?’ He proceeded to tell me what SBS was – a single Windows Server box that was a domain controller, Exchange Server, SQL Server, SharePoint Server, ISA Server, and more… all for just under $2,000.
Obviously back then I knew everything, and I told him that no such product existed. He opened up a web browser and showed it to me. I told him that what he (and the page on microsoft.com) was telling me was going to break every rule of enterprise best practices, but I would definitely look at the product and see if it was really all that.
I did… and I fell in love.
How cool was it that Microsoft had taken all of those products that I had been learning about and, rather than requiring an investment in six individual servers (virtualization was not yet a serious option), put them all into one relatively low-end box?
Over the next few years I spent most of my career working in SBS. I deployed and supported it for dozens of customers, supported the community, and for a short time I was even an SBS MVP. I wrote courseware and an exam for SBS 2003 and the exam for SBS 2008 for Microsoft Learning, lectured on it to dozens of groups around the world, and wrote numerous articles for my blog. On January 17, 2007 the Microsoft Canada DPE team’s blog (IT Pro Connection) published my article ‘Why I am not an SBSer’. To say that it ruffled a few feathers is the very nicest thing that can be said about it. I would go so far as to say that it was the beginning of the end of my amicable relationship with the entire SBS MVP group.
Over the last few years many of you have heard me predict the end of SBS. In fact months ago I submitted a session to SMB Nation (which I will present at that conference this October in Las Vegas) called ‘The SWMI Vision of SMB IT in a post-SBS Era’. I had no inside information when I coined the term ‘Post-SBS Era’ but it looks like I finally called one right.
Now to be fair, I had been predicting for years that the next version of SBS – that is, the successor to SBS 2011, which is a full-blown, viable product – would actually rely heavily on virtualization, separating the roles onto multiple servers – AD, Exchange, SharePoint, and SQL on four different virtual machines. That did not happen. I predicted that the future of SBS would look more like an enterprise datacentre in a single box, with several virtual machines all managed by System Center 2012. I never liked the idea of a ‘Windows Server Essentials’ that would facilitate the SMB using cloud-based Exchange services, but alas, that is what we are seeing as the future of the product. The idea was first floated to the SBS MVPs at least two years ago (and maybe three) at the MVP Summit… it was long enough ago that I was still attending SBS sessions at Summit, and the outcry against it was loud and strong.
So today when the announcement was made, it was indeed a sad day for SBSers, although not an unforeseen one, and certainly not an unexpected one. My SBS MVP friends will probably have to find new categories to fall into, as I had to when EBS was discontinued. I expect some of them will transition to a new award category called Windows Server Essentials MVP, and others will find new categories. I expect that some will resign the award in protest, and others, having lost the passion, will stop contributing to the community and will eventually lose the award.
Most, however, will adapt and persevere. Greg Starks, SMB Solutions Program Manager for Hewlett Packard, wrote this on his Facebook page, and I could not have put it better myself:
To all my SBSer, MVP and SMB IT Pro friends… I know today’s SBS End-Of-Life news is kind of a kick in the gut to some of you, but remember that it’s YOU, not the product, that helps your small business customers succeed. The market will adapt and so will you. You are too good at what you do to let the comings and going of a single product inhibit your success. No matter which OS, SMB IT Pros RULE!!!!
I am looking forward to seeing Greg at WPC next week… He and his team at HP have done so much to support the SBS community over the past several years, and I expect they will continue to do so in the future… no matter what the product is called.
In the meantime, I will raise a toast to all of my friends, past and present, who are mourning the loss of SBS this week. My thoughts are with you!
You have a small business. You have been running Windows Small Business Server 2003 for six years, and you know that it is time to retire it. The question is, what should replace it?
Before you make any definitive decisions, why not review what you need your server to do:
For the past several years you have paid a consultant to manage the server and your client PCs, and have primarily called him in for break-fix issues. Maybe you were industrious and decided to learn the basics of IT so you could do a lot of the maintenance yourself. You might even be a small-business IT consultant who has been managing and maintaining SBS environments for your clients.
You have heard so much about the cloud that you are in a bit of a fog… you know that people are talking about cloud-services, but haven’t quite figured out how they can work for you… to save you money, to earn you money.
Replacing the Server
For most small businesses I still recommend a centralized server; Active Directory is still the best mechanism you will find for centralized user management, and Group Policy allows you to lock down your environment.
With that being said, many of the functionalities offered in Microsoft Small Business Server are now available as part of two cloud-services offerings from Microsoft. Microsoft Office 365 offers all of the functionality listed above (File Server, Mail Server, Internet Portal) and much more. It is actually all of the following products in the cloud:
Office 365 allows you to have the functionality of all of these tools… without having to purchase or maintain them. It also means that you will always have the latest versions of all of these… without having to upgrade. ‘Your servers’ will be maintained by the Microsoft IT team, without your having to pay them hundreds of dollars per hour. If any of your services go down (and admittedly they do occasionally) you can rest assured that before you even discover the outage the people who know the products best will already be well on their way to fixing the issues.
Managing the Desktop
Between the operating system and the applications, there is a lot of work that goes into the proper maintenance of your PCs. That includes anti-malware, patch management, policies, and more. Additionally being able to generate and view reports is a huge benefit – as I always say If you cannot measure it, you cannot manage it!
So before we get into the application side of things, let’s discuss the benefits of the second cloud-services offering, Windows InTune. InTune installs as a simple agent on your Windows PC, and the list of benefits is amazing:
When you subscribe to Windows InTune (per-PC subscription) you get the right to upgrade your legacy Windows client (Professional/Business/Enterprise SKUs) to Windows 7 Enterprise. Right there you have the basis for the common operating system required to simplify management.
Windows 7 Enterprise Edition includes two features that Business Edition does not:
With the preponderance of mobile computing these days, as well as organizations doing business around the world, there is no question that Windows 7 Enterprise is an easier feature-by-feature sell than the lower-priced options, but that lower price seems to be a deciding factor so often. With the Use Rights in Windows InTune you don’t have to settle.
Once the Windows InTune agent is deployed on a PC it will start populating the individual computer’s information to the InTune system, and you will be able to get a better idea of what you have. On the Devices screen you will be able to see:
| Computer Name | Total Disk Space | CPU Speed |
| Chassis Type | Used Disk Space | Last User to Log On |
| Manufacturer & Model | Free Disk Space | Serial Number |
| Operating System | Physical Memory | Last Hardware Status |
Included in the Windows InTune installation is the Windows Intune Endpoint Protection engine, which will protect your PCs from malware. It uses the built-in patch management system to keep the definitions up to date, and offers real-time protection, as well as centralized reporting and e-mail alerts to the Help Desk / Support Team / IT Guy when a computer is infected.
InTune 2.0 added the ability to centrally deploy applications to client PCs. InTune 3.0 adds an extra dimension to this – the ability for end-users to install published applications on demand. The new Company Portal allows users to help themselves on-line, before eventually ‘escalating the call’ to you.
Users can also deploy their own client from the portal, assuming they have the proper credentials. This allows them to download a client using their corporate credentials, rather than you having to send them the file (along with the ACCOUNTCERT file) which would allow anyone (in theory) to install on any device that would automatically be managed by… you.
By far the most common application suite found on desktops in the workplace is Microsoft Office. The most common complaint I hear about Office is the cost (followed by the difficult to understand SKUs). Of course, with Office in the name it is no wonder that it is part of Office 365.
Of course there are several different SKUs to Office 365, and each one has different offerings. The small business SKU (P1) costs $6/month, and does not include the installable suite. However it does include Office Web Apps, which means you can create and edit Word documents, Excel spreadsheets, PowerPoint presentations, and of course use OneNote… all within your web browser. This is great if you work on multiple systems, or if you are ever remote and need to work on a document. The convenience loses its thrill when you realize you cannot work if you don’t have an Internet connection.
The E1, E2, and E3 SKUs do come with the client software, so if that is a requirement then those SKUs (which cost quite a bit more) are probably better for you.
Why you should consider maintaining a server on-site
Our mail server is gone… so are our SharePoint and File Servers. Why then would I still recommend a small server in a small business environment? There are several reasons.
As you can see, the combination of cloud-based services from Microsoft and an on-premise Windows Server is the best way to manage your entire SMB IT infrastructure; but even if you are not going to maintain an on-premise server, the cloud-based services can manage most of the needs of most SMBs.
By the way, there is one more advantage to these solutions… you will always have the latest and greatest. Right now the Windows InTune subscription comes with use rights for Windows 7 Enterprise. When Windows 8 is released, you will automatically have access to that platform. Office 365 comes with Office 2010… but when the next version is released you will have that version right away too!
Interested in hearing more? Drop me a line and we’ll talk… or you can check out www.windowsintune.com and www.office365.com to download 30-day trials of each!
Enter the Remote Server Administration Tools (RSAT).
Unless you are using System Center to administer your servers, chances are you are using either PowerShell or, more likely at this point, MMC (Microsoft Management Console) consoles. Since Microsoft Windows 2000 Server, MMC consoles have been able to connect to remote servers (or desktops) as long as Windows Remote Management (WinRM) is enabled. (Actually, if memory serves, WinRM and the Windows Firewall were only introduced in Windows Server 2003 R2, but MMC consoles were remoteable before that.)
You can enable WinRM in Windows Server 2008 R2 from the Server Manager main screen (as shown):
(Note: For those of you running Server Core installations… good for you! You can do all of this with a simple command line: winrm quickconfig)
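You can also enable remoting from an elevated PowerShell prompt; a minimal sketch (Enable-PSRemoting configures the WinRM service, its listener, and the firewall exception in one go):

```powershell
# Start the WinRM service, create an HTTP listener, and open the
# firewall exception; -Force suppresses the confirmation prompts.
Enable-PSRemoting -Force

# Confirm the WinRM service is now running
Get-Service WinRM
```

Remember to run this from an elevated (Run as Administrator) prompt, or it will fail.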
Now that we can remotely manage our servers, we can do so from any Windows Server 2008 R2 box by adding the appropriate feature from the Add Feature Wizard:
I should mention that you will not be able to manage systems on which you do not have credentials, and although the RSAT tools can work in a workgroup, they are much more fluid and trouble-free in a domain environment. Also remember that adding the role or feature under RSAT does not install the actual role or feature, only the consoles required to manage them.
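If you would rather skip the wizard, the same RSAT features can be added from PowerShell with the ServerManager module; a sketch (confirm the exact feature names with Get-WindowsFeature first):

```powershell
# The ServerManager module ships with Windows Server 2008 R2
Import-Module ServerManager

# List the available RSAT features and whether they are installed
Get-WindowsFeature RSAT* | Select-Object Name, Installed

# Install just the Hyper-V management console
Add-WindowsFeature RSAT-Hyper-V-Tools
```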
This is great for administrators who want to manage their servers remotely from another server… but what about managing them from your desktop? There’s a simple solution for that. Simply download the Remote Server Administration Tools (RSAT) for Windows 7 (http://www.microsoft.com/download/en/details.aspx?id=7887) from the Microsoft Download Center. Using another version of Windows? There is an RSAT download available for Windows Vista, but if you are still running Windows XP then I am afraid you are out of luck (…and you have 777 days until #EndOfDaysXP!).
Once you have downloaded and installed RSAT on your Windows 7 machine you will see no difference. However, if you go to Turn Windows features on or off, things start to change. To get there, open Windows Explorer and navigate to Computer. If you do not see the option to Uninstall or change a program, chances are you have not clicked on Computer.
You should see a list of your installed programs on the right, and on the left you should see the option ‘Turn Windows features on or off’ (shown). Click there.
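The same feature list is also available from an elevated command prompt via DISM; a sketch (on Windows 7 the RSAT feature names generally begin with RemoteServerAdministrationTools — the name below is illustrative, so copy the exact one from your own output):

```powershell
# List every optional feature on this machine and its state
Dism /Online /Get-Features

# Enable a single RSAT console (hypothetical feature name shown --
# paste the exact name reported by /Get-Features)
Dism /Online /Enable-Feature /FeatureName:RemoteServerAdministrationTools-Roles-HyperV
```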
Check the remote administration features you want and click OK. It will take a couple of minutes, but when it is done you are ready to start administering your servers from Windows 7… just click on the Start orb, expand Administrative Tools, and the new consoles should be there.
You can load any of them up (for this example we will use Hyper-V Manager) and you have… nothing. However you can right-click on Hyper-V Manager in the Navigation pane, and click Connect to Server…
You can add multiple remote servers to the same MMC console (seen below), including full installations of Windows Server, as well as Server Core installations and (in the case of Hyper-V hosts) Windows Hyper-V Server, which have to be managed remotely as they have no graphical user interface (GUI).
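Before adding a server to the console, it can be worth confirming that WinRM is actually answering on the remote box; a quick sketch (SERVER01 is a placeholder name):

```powershell
# Test-WSMan returns the remote WinRM stack details if the listener
# is reachable, and throws an error if it is not.
Test-WSMan -ComputerName SERVER01
```

If this fails, go back and run winrm quickconfig on the remote machine before blaming the console.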
So go ahead… manage your servers from your desktop without ever having to leave your office/cubicle/desk/cafeteria. Wherever you like to work from!
With Microsoft’s Hyper-V you can consolidate many servers down to fewer physical servers without compromising on services. Hyper-V allows for the consolidation of multiple server instances as separate virtual machines running on a single physical machine (the Virtualization Host).
So how does Hyper-V compare to VMware, the de facto standard for virtualization? Microsoft has made great improvements to Hyper-V, and with the latest release (2008 R2 SP1) has added many features that can be found in VMware. This list outlines many of them:
There are differences between the products, with VMware having some features that Hyper-V does not have, and vice versa. With Microsoft’s System Center Server Management Suite Enterprise (SMSE) products such as SCVMM (System Center Virtual Machine Manager), OpsMgr (System Center Operations Manager), and ConfigMgr (System Center Configuration Manager), you can monitor and administer not only the virtualization environment, but also the virtual machine operating systems, host operating systems, and the physical hardware far more richly and robustly than the VMware products can.
The Hyper-V role is available in all editions of Windows Server 2008 R2, as well as with the free Hyper-V Server 2008 R2. Guest OS licensing does not favour either platform, because the Virtual Licensing Model that Microsoft introduced with Server 2003 R2 applies to both. The licensing is “1 + N”, which means that based on the edition of Windows Server 2008 you purchase, you can run the host plus “N” virtual machines:
· Windows 2008 Standard – 1 + 1 virtual machine
· Windows 2008 Enterprise – 1 + 4 virtual machines
· Windows 2008 Datacenter – 1 + Unlimited virtual machines
While there is no difference in licensing, there is a huge difference with regard to the cost of the platform. VMware does offer a free hypervisor (ESXi), but in order to use any of the advanced features (vMotion, DRS, etc.) you have to purchase licenses. As well, VMware is sold on a per-CPU basis, with a ‘core tax’ for CPUs with more than six cores.
Microsoft also has the Hyper-V Server 2008 R2 which is a dedicated standalone product and contains only the Hyper-V role, Windows Server driver model and virtualization components. No additional license is required to use any of the advanced features, which can be implemented using tools such as Failover Cluster Manager.
One last major difference is the certification program for each. In order to become a VMware Certified Professional (VCP) you must take a one-week class (in which the instructor can decide to pass or fail you) and then take the exam. In order to achieve any of the Microsoft certifications you can take a class, or you can choose to learn the technology on your own, and then sit the exam.
For further information on Windows 2008 R2 with Hyper-V please visit – http://www.microsoft.com/windowsserver2008/en/us/hyperv-main.aspx
For further information on Hyper-V Server 2008 R2 please visit – http://www.microsoft.com/hyper-v-server/en/us/default.aspx
As a trainer I see this technology as the way every classroom I ever work in should be configured. It eliminates the need to have a PC at every station, giving way to a simple thin client. It allows the teacher to control the environment in a way that is both simpler and more robust than any other such tool I have ever seen. Rather than writing a long, drawn-out explanation, I recorded the video for you to see what I got to see. Check it out! –M