System Center

MDT handoff to SCCM

I will start by saying this is not in any of the best practices books but it works well and is used for certain scenarios.

Sometimes when I get to a customer they have MDT set up and working for OSD, but someone higher up has decided that they need ConfigMgr to manage clients going forward. Don't get me wrong, I'm all for using ConfigMgr to manage clients, but that said, not everyone finds ConfigMgr the easiest or most understandable platform to use. So the question then arises: "Could we still use MDT to deploy the machines and then ConfigMgr to manage them?" And of course the answer is YES!

So how do we accomplish this? There are two ways and I will describe both but only show one.

The first way is to use the excellent startup script created by Jason Sandys (found here http://blog.configmgrftw.com/configmgr-client-startup-script/). It is easy to set up and only requires a small startup GPO and a file share. The upside to using this is that if a client targeted by the GPO didn't get the agent during initial setup, or someone uninstalled it, the agent will be reinstalled. Jason has also managed to add some repair functions to it. The downside is that the client has to actually read the GPO, and for that to work the client has to be a member of the domain, so workgroup computers are out.

The second way is what we are going to focus on for the rest of this post: installing it during OSD in MDT as an application. The upside to doing it this way is that as soon as the deployment is done, the agent is also installed, regardless of whether the client joins a domain or not. Another upside compared to a GPO is that if the client restarts at any point during deployment and the GPO is enabled, the agent would get installed mid-OSD, possibly interfering while you are doing other installations or configuration steps.

So how do I do this? First off we need to create an application in MDT, then we link that application into our sequence.

Step 1 – Creating the application

Create a folder named "CMAgent" so we have something to work with. Inside that, create another folder called "Source". Next to the Source folder, place the script file and the XML file which you can download a bit further down in the post. Into the Source folder, copy the client installation files from your site server at \\<your site server>\sms_<sitecode>\Client.

You should then have a folder that looks like this

[Screenshot: the CMAgent folder structure]

Now we import that into MDT. Give the application a name, point to your source folder and set a command line. For the name I prefer "Install – CCM Agent" so I can easily see what the application does just by looking at the name. For the command line you should use the following:

PowerShell.exe -ExecutionPolicy Bypass -File Install-Application.ps1
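The script itself is downloadable further down, but the idea behind such a wrapper is simple; here is a minimal sketch of the concept (the element name read from settings.xml is hypothetical, and the real script will differ):

# Minimal sketch, not the downloadable script: read the install switches
# from settings.xml and hand them to ccmsetup.exe in the Source folder.
[xml]$settings = Get-Content -Path "$PSScriptRoot\settings.xml"
$switches = $settings.Settings.Installswitch   # assumed element name

Start-Process -FilePath "$PSScriptRoot\Source\ccmsetup.exe" -ArgumentList $switches -Wait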

If you open the application when it's done, it should look like this:

[Screenshot: the application properties in MDT]

Step 2 – Adding it to the Task Sequence

The next bit is to add it to the sequence in the correct spot, to avoid it being installed too early and then messing with your deployment. Open your sequence, go all the way down to the end and select the step called Apply Local GPO Package. Click Add at the top and create a group. Name the group so you know what it does, either Custom Steps or, as in this case, Custom Handoff. In that group we add an Install Application step. Change the step to install a single application and point it to your newly imported application.

The sequence should then look something like this

[Screenshot: the task sequence with the Custom Handoff group at the end]

Step 3 – Customizing the agent installation

The last thing you need to do is change some settings to point the agent to your specific environment. Open up your deployment share folder and browse to Applications\Install – CCM Agent. Use Notepad to edit the settings.xml file and change the Installswitch section of the file. Below is a sample of how it can look; make sure to change it to suit your server names and infrastructure.

[Screenshot: a sample settings.xml]
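The switches themselves are standard ccmsetup.exe properties, so a typical value (server names and site code are placeholders for your environment) could be:

/mp:CM01.corp.contoso.com SMSSITECODE=PS1 FSP=CM01.corp.contoso.com

Here /mp tells ccmsetup which management point to download content from, SMSSITECODE assigns the client to your site, and FSP points it to an optional fallback status point.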

You're all set! Next time you image a computer, it will have the CCM agent installed.

Link to download the script is here http://bit.ly/1TAczuB

Happy deploying!

/Peter


MDT Database – The PowerShell module fix

A long time ago Michael Niehaus wrote a brilliant PowerShell module for manipulating the MDT database. It works great for bulk importing computers, creating and assigning roles and so forth. You can read more and download the module from his blog here: http://blogs.technet.com/b/mniehaus/archive/2009/05/15/manipulating-the-microsoft-deployment-toolkit-database-using-powershell.aspx

The reason behind this blog post is that there is an issue with the module, or rather with a view in the database used by the module. The effect is that when searching for computers you cannot use the description field to find them.

So if we take a look at my database, I have two entries, both with a description.

[Screenshot: two computer entries in the database, each with a description]

But when I have imported the module, connected to the database and run Get-MDTComputer -Description "LT-PETER", I get an error.

[Screenshot: the error returned by Get-MDTComputer]
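For reference, a session with the module looks roughly like this (the cmdlet names come from Niehaus's MDTDB module; server, instance and database names are placeholders):

Import-Module .\MDTDB.psm1

# Connect to the MDT database, here on a local SQL Express instance
Connect-MDTDatabase -sqlServer 'MDT01' -instance 'SQLEXPRESS' -database 'MDTDB'

# This is the call that fails before the fix below is applied
Get-MDTComputer -description 'LT-PETER'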

So Mikael Nyström (http://www.deploymentbunny.com) and I did some digging and found that there is a mismatch between the query and the view being used.

The Fix

There are two ways of fixing this. You can either do it manually or use the script I have included here.

The manual way: open up SQL Server Management Studio and browse to your database. Open the view called dbo.ComputerSettings, choose Design and check the box marked Description in the ci (ComputerIdentity) table. Save and you're done.

[Screenshot: the view designer with the Description column checked]

The script way: download the script here and run it using PowerShell. The only things you need to enter are the name of your SQL server (including instance name) and the name of your MDT database.

[Screenshot: running the fix script]

The script can be run with parameters on one line or just run the script and it will ask for server and database name.
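Either way, you can verify the fix afterwards with a standard catalog query (server and database names are placeholders):

# Returns a row once dbo.ComputerSettings exposes the Description column
Invoke-Sqlcmd -ServerInstance 'MDT01\SQLEXPRESS' -Database 'MDTDB' -Query @"
SELECT name FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.ComputerSettings') AND name = 'Description'
"@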

Done!

Now when you run the command it can find the computer!

[Screenshot: Get-MDTComputer now returning the computer]

Download Script

/Peter

MDT Database – the why and the how

The Why

When I discuss the MDT database with customers, the most common response is "well, we don't need it, we can fill in the form for each computer, that is just faster".

For me using the database is a given. It gives me flexibility and the result will always be the same regardless of who images a computer.

Imagine the following scenario: the user Bob's computer has an error. To solve the problem, he calls the service desk and they create a ticket. Now he has to get the computer to the service desk, or wait for an onsite technician to come to him, to get his computer reimaged so that it gets the correct name, apps and so on.

Would it not be better to have all these things predefined, so Bob himself can reimage the computer and be back at work quicker? Most would agree. We then have two main options:

  1. Generate data

This is a good option, but it only works for generic data or for information that does not have to be machine-specific.

  2. The database

This gives the option to preset information and, in an easy way, create roles for different types of information.

So if we look at the options above, the usual setting that fits in category 1 is something like the computer name, which can be generated based on, for example, the computer's serial number. Settings that normally fit in category 2 are applications or user-specific settings, e.g. this user should have these applications.
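As an example of category 1, a single rule in CustomSettings.ini can generate the name from the serial number (the PC- prefix and the trim to the last seven characters are just an illustration; the #...# syntax makes MDT evaluate a VBScript expression when the rule is processed):

[Default]
OSDComputerName=PC-#Right("%SerialNumber%",7)#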

So if you find this interesting, let’s move on to how to set this up.

The How

Setting up the database for use with either LTI (Lite Touch) or ZTI (Zero Touch) is easy and requires no additional licenses or products (well, almost: if you run LTI you will need SQL Express).

First up, the SQL Server: you will need one. I would not recommend putting this on your SQL Server cluster, as you need to enable the Named Pipes protocol for the connection to work. If you run LTI, install SQL Express on your MDT server; if you are running ZTI, you should already have SQL on your primary site server, so use that one.

The database will have an initial size of 4 MB, and after using it for a while and entering a couple of computers it might even grow to 10 MB. So this will not be the database that takes all the memory or space on your server.

The database is created from inside the Deployment Workbench, and once it has been created it is also supported to extend and modify it.

Step 1 – Create the database

Open up the Deployment Workbench and, in your deployment share, go to Advanced Configuration. Right-click Database and select New Database.

Then follow the guide to create a new database and give it a name.

Step 2 – Adding a computer

In this guide we will cover how to create a computer with the GUI. However, Michael Niehaus has created an MDTDB PowerShell module so you can do batch importing and other modifications to the database with PowerShell; you can read his blog post about it here: http://blogs.technet.com/b/mniehaus/archive/2009/05/15/manipulating-the-microsoft-deployment-toolkit-database-using-powershell.aspx
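As a taste of the module, a bulk import from a CSV file could look roughly like this (the cmdlet and parameter names come from the MDTDB module; the CSV layout and the setting used are my own example):

Import-Module .\MDTDB.psm1
Connect-MDTDatabase -sqlServer 'MDT01' -instance 'SQLEXPRESS' -database 'MDTDB'

# computers.csv holds one row per machine: SerialNumber,Name
Import-Csv .\computers.csv | ForEach-Object {
    New-MDTComputer -serialNumber $_.SerialNumber -description $_.Name -settings @{ OSDComputerName = $_.Name }
}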

To create a computer, select the Computers node under Database, right-click the node and select New. It asks for a way of identifying the computer, and you have four options: asset tag, serial number, MAC address or UUID. You ONLY need to enter one! You can also fill in a description; I usually fill in the computer name. The only reason I do this is that in the GUI the description is shown in the list of computers, which gives me an easy way of identifying each computer.

[Screenshot: the New Computer dialog]

Step 3 – Adding settings to the computer

Next we need to add some settings to the computer. You can view this as filling in the wizard without being there. Under the Details tab you can fill in information for computer name, network adapter settings, domain join etc. This is pretty much all the settings that can be defined in the wizard, plus some extras as well.

Step 4 – Adding applications to the computer

You can also specify applications that should belong to that computer; these can be either ConfigMgr 2012 applications or LTI applications. You can also add ConfigMgr packages if that is what you use.

Step 5 – Adding roles

I will not cover how to create roles, since I have already done a post about that. You can find it here: https://syscenramblings.wordpress.com/2015/06/22/how-to-sccm-and-mdt-roles/

This post also covers how to create the database and link the settings into ConfigMgr.

Step 6 – Adding administrators

You can also add local administrators: domain groups or users that will become local administrators on the computer.

Step 7 – Configure rules

The last step to get this working is configuring the rules. This ensures that when you deploy the computer, it will query the database and get the relevant settings, applications, roles and administrators you have specified.

Under the Advanced Configuration section, right-click the Database node and select "Configure database rules". You will get a short wizard asking what you want to query for. Since this is a basic setup you can query for everything without any issues, so leave everything selected and go through the wizard until it's done.

The wizard will add a number of lines to CustomSettings.ini, and if you are using LTI you are now ready to use the database. If you are using ZTI you need to copy the new information into the CustomSettings.ini in the settings package you have.
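For reference, the computer query the wizard generates follows a well-known pattern; a minimal sketch of it (server, share and database names are placeholders, and the real wizard output contains more sections than this):

[Settings]
Priority=CSettings, Default

[CSettings]
SQLServer=MDT01
Instance=SQLEXPRESS
Database=MDTDB
Netlib=DBNMPNTW
SQLShare=DeploymentShare$
Table=ComputerSettings
Parameters=UUID, AssetTag, SerialNumber, MacAddress
ParameterCondition=OR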

 

That’s it! You have successfully configured the database for use with either ZTI or LTI.

In the coming post I will cover extending the database with custom options so stay tuned.

 

/Peter

Merge WIM into one – the space saver

I have gotten this question a couple of times: "Can I have two operating systems to choose from in one task sequence?" The correct answer to that is yes, but it takes up a lot of unnecessary space, and if you are using ConfigMgr and need to download two WIMs instead of one, that adds a lot of time.

What I would instead recommend is merging the two WIM files into one. This will save a lot of space and still give you the option to use different reference images in the same task sequence.

So how is this done?

First off you need to create two reference images. The most common scenario for this is that you have one with Office preinstalled and one without. So if we look at how that looks, you will get something like this:

[Screenshot: the two reference images in the file system]

In this case I am using Windows 10 reference images, but this works just as well with Windows Vista, 7, 8 and 8.1 (all WIM-based OSes).

So as you can see, they are around 4-5 GB in size. The next step is to merge them. To help with this I have a small script that you can use.

What the script does is take one WIM and mount it. It then applies the mounted WIM into the other WIM so you get two indexes, cleans up the mount directory and finally displays the different indexes in the merged WIM file.

You can download the script here: http://bit.ly/1TAcO8Q
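If you would rather do it by hand, the same two-index result can be achieved with the DISM PowerShell cmdlets. Note that this uses Export-WindowsImage instead of the mount-and-apply approach the script takes; paths and names are placeholders:

# Append the Office image as a second index in the plain image's WIM
Export-WindowsImage -SourceImagePath 'C:\Images\REF-W10-Office.wim' -SourceIndex 1 -DestinationImagePath 'C:\Images\REF-W10.wim' -DestinationName 'Windows 10 with Office'

# List both indexes in the merged WIM
Get-WindowsImage -ImagePath 'C:\Images\REF-W10.wim'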

When that is complete you get something looking like this.

[Screenshot: the merged WIM file size]

As you can see, the image is now a bit bigger, but it has not doubled in size. This is because the WIM format is single-instanced: when the images are merged, duplicate files are only stored once, which keeps the file size down.

This is the same method Microsoft has used in the past when creating Windows Server media containing Core and GUI versions on the same media.

The next step is to import this into whatever solution you are using (MDT/SCCM).

In this instance I have used MDT; it looks similar in SCCM but there are a couple of differences. If you are unsure, drop me an email or PM and I can help you out.

So, import an operating system, choose custom image and point to the WIM created earlier. When it's done it looks something like this:

[Screenshot: the two imported operating systems in MDT]

If we look at the properties for these two operating systems, you can see that they both use the same file in the background, but different indexes.

[Screenshot: the image properties showing the same WIM with different indexes]

Now you can add another Install Operating System step and set different conditions to run the different steps: for instance, different blocks in CustomSettings.ini, an existing setting in the MDT database, or a new setting you add to the MDT database. Or use web services: if the computer is in this OU or AD group it should have Office, and if not, it shouldn't. The possibilities for creating rules are, as always, limitless.
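One minimal way to wire this up, assuming a custom property I will call ImageType: declare it in CustomSettings.ini (or set it from the database or a web service) and then condition each Install Operating System step on it as a task sequence variable.

[Settings]
Priority=Default
Properties=ImageType

[Default]
ImageType=Office

In the task sequence, the step that applies the Office image then gets the condition Task Sequence Variable ImageType equals Office, and the plain step gets ImageType not equals Office.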

Happy deploying!

/Peter

System Center Configuration Manager 2012 R2 SP1 CU2

Here we are, it's time for another upgrade – another update with some bug fixes and goodies inside.

This time it's called Cumulative Update 2 for System Center Configuration Manager 2012 SP2 and 2012 R2 SP1. This CU also includes a number of hotfixes that have been released since CU1, among others the important one for drivers, where the driver content size gets a bit bloated.

Installing this is pretty straightforward. Start by making sure you have a backup of the system! This should always be done, but since this includes a DB update as well, it's extra important. Next up is a reboot for good measure. You can check whether there are any pending reboots if you want, but I prefer to just do a clean reboot to make sure anyway.

If you want to make use of snapshots/checkpoints, make sure the VM is turned off when you take the snapshot/checkpoint. This is to ensure data consistency, since there is a SQL database in the background.

When this is done you apply the patch and wait for it to complete. Options for automatic upgrade and client package creation will be included in the wizard.

When you are done, make sure to do a reboot, as the CU requires it!

After the reboot you can start upgrading your clients. And when they are done it should look something like this.

[Screenshot: the client agent versions after the upgrade]

Take note that the new version number is 5.00.8239.1301.
Feel free to create a collection that includes these clients, so you also get collections showing which machines remain on previous versions.
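The membership rule behind such a collection can be as simple as a WQL query on the client version; a sketch using the ConfigMgr cmdlets (run from the site's PowerShell drive; collection and rule names are my own):

New-CMDeviceCollection -Name 'Clients 5.00.8239.1301' -LimitingCollectionName 'All Systems'
Add-CMDeviceCollectionQueryMembershipRule -CollectionName 'Clients 5.00.8239.1301' -RuleName 'ClientVersion' -QueryExpression 'select * from SMS_R_System where ClientVersion = "5.00.8239.1301"'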

To create collections for most of the agent versions, you can use a PowerShell script I created. It can be downloaded here: http://bit.ly/1TAeMGd

The Cumulative Update can be found here https://support.microsoft.com/en-us/kb/3100144

Good luck and make sure to have a backup!

Techdays Sweden Day 2

Techdays has ended its second day, and with that it's over for this year. It has been a good year with lots of fun sessions, new insights, new connections and, last but not least, lots and lots of cool demos!

Keynote

Well, this will be a bit of an anticlimax, as I missed the keynote on the second day. But from what I have been told it was a good one. My concern with this secondhand account is of course that it was Scott Hanselman who did the keynote. While Scott is an expert speaker and fun to listen to, he is also a developer, and as an IT pro, ASP.NET, JavaScript and the like don't really give me anything. For me the keynote should be about new technologies and how the roadmaps look ahead, or if that is not possible to share for some reason, I would like the keynote to be an inspirational talk about something related to IT. As an example, a couple of years ago at a previous Techdays the second-day keynote was held by an inspirational speaker, and it had nothing to do with IT but at the same time it had everything to do with it. This is due to the fact that the speech was about "Getting things done", which most of us can relate to.

Enough rant!

Sessions

Day 2 sessions for me were about the datacenter, the client, Windows 10, Windows Server 2016 and my favorite, AzurePack vs AzureStack (more on this later).

Mikael Nyström (@Mikael_nystrom) and Markus Lassfolk (@Lassfolk) did a session about the modern datacenter and how to get there with no new investment at all. One thing is that this gives you the option to build your private cloud the fabric way with what you already have. The other is that they showed how to get a good grip on your environment: how much old stuff is running and how you can consolidate it. To get this overview there is only one tool to use, the Microsoft Assessment and Planning Toolkit, and of course they showed you how to use it!

At the same time, Marcus Murray (@MarcusSwede) and Hasain Alshakarti (@Alshakarti) showed how to hackproof your clients in a day. I have seen older versions and variations of this session before, and it is well worth your time! Check it out when/if they release the sessions online!

Next up was "The Force Awakens" with Johan Arwidmark (@Jarwidmark) and Mikael Nyström (@Mikael_nystrom), showing how Windows 10 and Windows Server 2016 are better together! This session was full. Well, "full" doesn't really cover it. The seats were filled, people were standing along all three walls, and every open spot on the floor had people sitting in it. If the room was sized for 100 people, my guess is that there were 200 people there. When this session shows up, if you didn't catch it, go see it! In their fun and relaxed way they showed the news in both the client and the server, and how to deploy them both.

Ending the day for me was the highlight of Techdays: Mikael Nyström (@Mikael_Nystrom) and Markus Lassfolk (@Lassfolk) showed how you should be doing self-service, and why, in the session AzurePack vs. AzureStack. The session covered the differences between AP and AS and what the roadmap for each looks like. The demos were just amazing: building a new website in seconds, using PowerShell against AzurePack the same way you would against Azure, and running backup and restore of a server from within the AzurePack portal against the correct DPM server. This is a must-see, and if you haven't set up self-service yet, now is a good time! If you don't know how, let me know and I will put you in contact with both Markus and Mikael so they can help you out!

Summary

Hopefully you had a nice experience at Techdays, learned a lot and got some ideas on what to do going forward; I know I have. Hope to see you next year, and if you have any questions give a shout below, on Twitter or Facebook!

The future is now decoded! 😉 

Techdays Sweden Day 1

Day 1 of Techdays comes to an end (yes, there was a pre-conference day as well, but I was unable to attend, so this is the official first day) and I can only sum up the day as overall a good day, but with some minor failures.

Keynote

To start off, the keynote at Techdays. This for me is supposed to be one of the highlights: showcasing new technology, with well-versed speakers and innovative content. The first part of the keynote failed to deliver this, as Microsoft Sweden director Jonas Persson showed a couple of slides but no real demos, no new technology, and had nothing new or exciting to offer. The only memorable part of Jonas's presentation was when he promised that next year the keynote speaker would be standing on a hoverboard made by him.

Thankfully this didn't go on too long, as Ben Armstrong (@VirtualPCGuy) soon took over and showcased a number of demos showing off new features of Windows 10. While Ben is an excellent speaker, this too fell a bit short: the information around these features has been out for several months and really didn't give us anything new.

First session of the day

Trying to save the morning after a keynote lacking in both content and the ability to excite, Johan Arwidmark (@JArwidmark) took the stage to talk about upgrading to Windows 10. Johan did, as in previous sessions, an excellent job. There was new content, showcasing among other things the new servicing nodes in System Center Configuration Manager Technical Preview 3. He also talked about using the correct edition of Windows, and would recommend the Enterprise edition to all businesses, with one exception: universities, schools and the like. For these, the Education edition would be the way to go. The Education edition is the same as Enterprise except for how it's licensed and priced.

The Afternoon

It started off with a session from Mikael Nyström (@Mikael_nystrom) and Markus Lassfolk (@Lassfolk) about how to build a modern datacenter in a yellow case. An excellent session about automation, the key points in the datacenter and how you should learn to build your demo and/or test environment rapidly using PowerShell. Mikael also showed off an unsupported feature in Microsoft Deployment Toolkit: using the comments field in the Task Sequence selection dialog to select a role to go with that Task Sequence. Mikael has promised a blog post on how to do this later; for that, check out his blog at www.deploymentbunny.com

Next I checked out Ben Armstrong's (@VirtualPCGuy) session on Windows containers and Docker. A really interesting session on what they are doing, how it's being developed and what to expect. For me this was a bit of an eye-opener. I have not worked with Docker before, and seeing the integration, and how they use the same tools and language as Docker uses today with Linux, is nice to see. A key point is that when you begin to run containers, they can run both on physical hardware and inside Hyper-V. But coming later this year, Hyper-V containers will also become available, and this has been a driving force behind the nested Hyper-V scenario. And this is a feature that is really cool! Stay tuned for more information on this as Microsoft makes it available.

Ending the day was, again, Mikael (@Mikael_nystrom) and Markus (@Lassfolk), showing off the new features they like in Windows Server 2016 Technical Preview 3. A nice session about software-defined storage, JIT and JEA (read on for more information on what they are), LockonDisconnect and other new features.

The big thing coming with the new release of Windows Server is Storage Spaces Direct. This basically means not using a SAS-connected storage enclosure (JBOD), but instead using local drives in a number of servers to achieve a highly available file storage solution. This is then presented using Storage Spaces as normal. The benefit is that both HDDs and SSDs using SATA connections are a lot cheaper than the same SAS-connected drives. In combination with the new storage replication feature, this also gives new ways of building the storage and making sure it is always available, both locally and in remote sites.
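To give a feel for it, here is a minimal sketch using the cmdlets as they ended up in the final 2016 release (the Technical Preview syntax differed, and the volume name and size are my own):

# Claim the eligible local disks across the cluster nodes for Storage Spaces Direct
Enable-ClusterStorageSpacesDirect

# Carve a resilient CSV volume out of the automatically created pool
New-Volume -StoragePoolFriendlyName 'S2D*' -FriendlyName 'VMStore' -FileSystem CSVFS_ReFS -Size 2TB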

Next was a quick demo of the new witness function: using Azure storage as the cluster witness quorum gives increased flexibility and is easily managed. Check back when the videos of the sessions are posted for a quick setup guide.
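Until then, the short version is a single cmdlet (the account name and key are placeholders for your Azure storage account):

# Point the cluster quorum witness at an Azure storage account
Set-ClusterQuorum -CloudWitness -AccountName 'mystorageaccount' -AccessKey '<storage account key>'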

JIT and JEA were next up, which translate to Just in Time and Just Enough Administration. They give you the ability to grant people access to a server while only allowing them to do exactly what they are supposed to do, and nothing else, and also only when they are supposed to do it, not all the time. This is of course done through PowerShell, so if this is a feature you would like to learn more about and play around with, and you don't know PowerShell, now is a good time to learn.
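To make that concrete, here is a minimal JEA sketch (the group, endpoint and file names are hypothetical; the cmdlets are the standard PowerShell 5 ones): a role that may only restart the print spooler, exposed as a constrained endpoint.

# Role capability: the only visible command is Restart-Service, and only for the Spooler service.
# The .psrc file must live in a RoleCapabilities folder inside a module on the server.
New-PSRoleCapabilityFile -Path 'C:\Program Files\WindowsPowerShell\Modules\SpoolerOps\RoleCapabilities\SpoolerOps.psrc' -VisibleCmdlets @{ Name = 'Restart-Service'; Parameters = @{ Name = 'Name'; ValidateSet = 'Spooler' } }

# Session configuration: lock the session down and map an AD group to the role
New-PSSessionConfigurationFile -Path '.\SpoolerOps.pssc' -SessionType RestrictedRemoteServer -RoleDefinitions @{ 'CONTOSO\PrintOperators' = @{ RoleCapabilities = 'SpoolerOps' } }

# Publish the endpoint; members of the group can now do exactly this, and nothing else
Register-PSSessionConfiguration -Name 'SpoolerOps' -Path '.\SpoolerOps.pssc'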

Wrap Up

This wraps up day 1, and my distinct feeling after it is that it's more of the same, and since the next server version still has a way to go, you should really focus on getting your deployment solution and yourself ready for Windows 10, both in managing it and deploying it. Both my team and I would be happy to help you get started, or get past any issues you have with upgrading. Get in touch with me here, via email, Twitter or Facebook, and we can set something up.

As a last note, I will leave you with my three favorite quotes of the day. (They are in Swedish; rough translations in parentheses, though they lose some authenticity when translated.)

          JIT och JEA är inte två fåglar ("JIT and JEA are not two birds")

          Din demomiljö behöver vara ”wife approved”, tyst, drar lite ström och tar liten plats! ("Your demo environment needs to be 'wife approved': quiet, draws little power and takes up little space!")

          Man får inte nobelpris i PowerShell ("You don't get a Nobel Prize in PowerShell")