Wednesday, 23 January 2013

Polycom Lync Phones: new simplified config


Polycom's UCS 4.1.0 software release introduces a completely new, streamlined provisioning process. The server-based provisioning process is still available for large deployments where automation is required, but for SME deployments there is a new Out of Box Experience (OOBE) that is greatly simplified compared with the previous process. For most environments there is no longer any requirement to use the XML configuration files.
 


The new version is capable of downloading a private CA root certificate used by the Lync Server, in the same fashion as existing Lync Phone Edition devices. It requires that DHCP Option 43 be properly configured for the target Lync environment and that the phones have access to the Lync Server Certificate Provisioning web service. To confirm the currently installed software on your Polycom IP phone, either check on the phone itself or use the phone's web browser interface. To check the software version from the device interface, press the physical Home button and then tap the following menu items: Settings > Status > Platform > Application > Main.
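If Option 43 isn't already configured, Lync Server ships a helper, DHCPUtil.exe, which generates the vendor-specific option values and works with the accompanying DHCPConfigScript.bat to apply them on a Windows DHCP server. A rough sketch only: the path below assumes the Lync Server 2013 core components, lyncpool.contoso.com is a placeholder for your own pool FQDN, and you should check DHCPUtil's own help for the exact switches in your version:

# Generate the Option 43/120 values for your pool (run where the Lync core components are installed)
& "C:\Program Files\Common Files\Microsoft Lync Server 2013\DHCPUtil.exe" -SipServer lyncpool.contoso.com
# Apply the output on the DHCP server via the accompanying DHCPConfigScript.bat, then test from a client subnet
& "C:\Program Files\Common Files\Microsoft Lync Server 2013\DHCPUtil.exe" -EmulateClient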

This process is now much more in line with the provisioning of Lync Phone Edition devices such as the Polycom CX600.



Saturday, 12 January 2013

Installing the .NET Framework 3.5 Feature on Windows Server 2012

When trying to install certain roles on Windows Server 2012 you may need to add the .NET Framework 3.5 feature. If you try to do this from within the GUI it will most likely error, as the 3.5 framework is now part of what Microsoft are calling Features on Demand.

To remedy this, either open an elevated command prompt and enter the command below, assuming the source media is in drive D:

dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess
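
If you'd rather stay in PowerShell, the equivalent is a one-liner, again assuming the media is in D::

# Install .NET 3.5 from the side-by-side store on the installation media
Install-WindowsFeature NET-Framework-Core -Source D:\sources\sxs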

Alternatively, follow the procedure below to specify an alternate source path.
1. Insert the Windows Server 2012 DVD or ISO into the D: drive.
2. Open the Add Roles and Features Wizard.
3. Select the Specify an alternate source path link on the Confirm installation selections screen.
4. Enter the path D:\Sources\SxS and then click OK.
5. Finally, click the Install button.

You can now proceed with adding your core role.

Thursday, 10 January 2013

System Center 2012 SQL Collation Error

Recently several sys admins have asked me for help with an upgrade to System Center Configuration Manager 2012.
Firstly, although it may seem as though System Center 2012 was designed to be hosted on Windows Server 2012 with SQL Server 2012 as the backend, this is not the case.
For that you'll need to wait for System Center 2012 Service Pack 1. Grab the beta from here: http://www.microsoft.com/en-us/server-cloud/system-center/sp1-default.aspx.
System Center 2012

To run SCCM 2012 in a production environment today, Windows Server 2008 R2 is an ideal host, with SQL Server 2008 R2 Service Pack 1 and Cumulative Update 4 as the backend.
I've had no issues with this setup in the past.

The second issue people run into is SQL Server collation. SCCM may fail the installation check with the message:

"Installation check of SCCM 2012 fails with: Configuration Manager requires that you configure your SQL Server instance and Configuration Manager site database (if already present) to use the SQL_Latin1_General_CP1_CI_AS collation, unless you are using a Chinese operating system and require GB18030 support"

This can happen not only on production SQL servers with several active databases but on clean installations as well, as I have verified.
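
Before rebuilding anything it's worth confirming what the instance collation actually is; a quick check, assuming the SQL command-line tools are installed and a default local instance:

# Returns the server-level collation of the local default instance
sqlcmd -Q "SELECT SERVERPROPERTY('Collation')"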

To change the SQL instance collation, open an elevated command prompt and run the following from the SQL Server installation media, assuming the default instance name and that the currently logged-on user has sysadmin rights:


setup.exe /ACTION=REBUILDDATABASE /SQLCOLLATION=SQL_Latin1_General_CP1_CI_AS /INSTANCENAME=MSSQLSERVER /SQLSYSADMINACCOUNTS=%username%

Then re-run the SCCM 2012 installation check; it should now pass and allow the installation to continue.

Monday, 7 January 2013

Windows 8: business ready?

Thanks to an aggressive marketing campaign, no one can claim not to have seen the new "Metro Style" UI shared by both Windows 8 and Windows Phone.
This is the biggest change in a desktop operating system user interface since Windows 95.
Windows 8 - Dell XPS Duo


On the Surface tablet and other multi-touch devices it clearly works, and works well, with large, bright live tiles which launch applications primarily geared toward social media.
I've run the Developer Preview on a Zoostorm SL8 since the day of its release, followed by Windows 8 Pro prior to Surface RT becoming available, and I can say that as a web browser and social application platform it's better than anything else out there. It is to Windows Phone what the iPad is to the iPhone. Microsoft have completely reinvented what it is to be Windows, and it offers a great deal more than the i-range, with the freshest, cleanest, most modern-looking interface to grace the industry to date. If we were only reviewing its functionality for personal use then that would be it, case closed: Windows 8 and Windows Phone will be a success.

But as Microsoft have consolidated on a single OS for both business and pleasure, it has to work equally well on both counts, and I'm not sure it does. When showcasing Windows 8 to technicians and admins on desktop machines, their first reaction is to click the desktop icon and drop to that familiar interface. This only delays the somewhat painful transition from Windows 7 to 8. Microsoft have included the desktop to run legacy applications only; anything written for Windows 8 will run within the new interface. As we move forward, legacy apps will be left behind along with the desktop.

Windows 8 has to work within industry for casual business users as well as serious number-crunching apps. Microsoft currently hold close to 90% of the desktop market, and Windows 7 has sold more than any other Microsoft operating system, with over a billion sales.
At first it seems Microsoft have alienated their dedicated business users, forcing them to change the way they work just because they wanted to try something new. But Microsoft have undertaken more research, and are more aware of the shifting trends in the marketplace, than anyone. Evaluated on a mouse-driven desktop or laptop system designed for XP or Windows 7, Windows 8 doesn't work. I've had feedback from countless IT managers and senior engineers stating that their organisation will not be adopting it. But this is a very short-sighted view, and we hear the same cries every time Microsoft change Windows. There are two reasons for this: firstly, the IT support teams have to learn a new skill set just as they are getting comfortable with supporting the previous Windows; and secondly, all of the end users supported by such teams lodge protests about the change as it prevents them from being as productive. Why change something if it works already?

To balance the argument we need to look at the motivation for Microsoft to risk losing some of its lucrative business desktop market. New IT systems and services should only be adopted if they either need replacing, due to reaching end of life or no longer being compatible, or because they can generate a positive ROI. With Windows 8 it is a bit of both.
To understand the situation we have to realise what Microsoft already know: that the desktop and mobile computing platforms we use today are not the ones we will be using tomorrow.
Our computing is becoming more ubiquitous, more portable and easier to interact with.
The mouse and the keyboard are not HCI tools we will be using forever. The next generation of desktop and mobile computing devices will be primarily touch- and voice-driven, with mouse and keyboard as secondary input systems, not required for many tasks.

As an example, let's look at using a Windows 8 desktop computer with a touch monitor, with Microsoft Lync installed. I wish to call a colleague on his office extension.
I launch Lync by touching the Metro icon. It logs me in with SSO. I scroll down my contact list using touch, tap the photo of the colleague I wish to call, and tap the phone icon.
I'm in a VoIP call to another worker using my Windows PC and I haven't yet had to use a mouse.

In addition to the new hardware platforms favouring portability, voice and touch, the applications themselves are changing. They leverage presence, social media, activity feeds and connectedness. The office of the future is expected to be a very different place, depending upon many services once thought of as counterproductive, such as messaging, with social networking tools at its heart.

Windows 8 will help consolidate the diverse deployment technologies currently required and bring better integration with cloud services, and while at first it will only be welcomed on tablet devices, eventually Windows 8-like systems will be the destination of all computing devices.



Sunday, 6 January 2013

Office 365 Certification: who is it for?

I have been contacted recently by several Microsoft Office Specialists that support information workers, wanting to know which resources are best for learning Office 365 with a view to certifying at some point in the future. They are currently certified as Microsoft Office Specialist for the 2007 Office System, are aware of recent changes within the industry and have decided to up-skill.

With the recent hubbub over cloud services it's obvious that people are looking at the Office 365 exams, 70-321 and 70-323, as the next step; however, for IT staff in this category it may not be the right choice.

Microsoft Office Specialist Certification
Microsoft have run the "MOS" certification programmes for many years through many iterations of the Office applications. Until the 2010 edition it confined itself to Access, Outlook, PowerPoint, Excel and Word. With the rise in popularity of SharePoint it's a solid decision by Microsoft to include this in the MOS: 2010 certification, giving a total of 8 exams.

However, the Office 365 certifications belong to the ecosystem of certifications in the professional track. There are two required to provide the full Office 365 Administrator professional series certification:

MCTS: Administering Office 365.
Exam number 70-323 which covers the following learning domains:

  • Administer Microsoft Office 365 (35%)
  • Administer SharePoint Online (31%)
  • Administer Exchange and Lync Online (34%)
Pro: Deploying Office 365. 
Exam number 70-321 which covers the following learning domains:
  • Plan and Implement Office 365 Accounts and Services (20%)
  • Plan for and Configure SharePoint Online (19%)
  • Plan for Exchange Online (20%)
  • Implement Exchange Online (21%)
  • Plan Online Services and Infrastructure (20%)
As can be seen, they require detailed knowledge of Active Directory Domain Services and Certificate Services, as well as DNS infrastructure and SSO. But there's also a requirement for a good working knowledge of Exchange, SharePoint 2013 and, to a lesser extent, Lync Server. Add to this the Office 365 applications themselves, and the fact that the exams require a good working knowledge of PowerShell cmdlets, and we have a series of quite tough exams aimed at existing administrators of Microsoft infrastructure and application services, not those supporting information workers.
It's also true to say that the exams centre on larger organisations with thousands of users over many sites, often with hybrid environments blending on-premises services with the cloud.
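To give a flavour of the PowerShell element, much of the tenant administration in these exams is driven through the Microsoft Online Services module; a minimal sketch, assuming the module is installed and you hold tenant admin credentials:

# Connect to the Office 365 tenant (prompts for admin credentials) and list users
Import-Module MSOnline
Connect-MsolService
Get-MsolUser | Select-Object UserPrincipalName, IsLicensed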

For IT Professionals looking to certify with Office 365 to support the SME market it's worth taking a look at  exam 74-324 Administering Office 365 for Small Businesses.

For existing MOS-certified staff looking at Office 365 from a support point of view, the Microsoft Office Specialist on Microsoft Office 2010 track is the right place to start, specifically exam 77-891, MOS: Microsoft Office 365.

I'll close with a few links to some of the best Office 365 resources out there at the moment.
First the Official Jump Start videos from Microsoft:

Office 365 Jump Start Videos:

And finally the Microsoft Office 365 PDF eBook by Katherine Murray, Connect and Collaborate Virtually Anywhere, Anytime:

Office 365 PDF eBook: 

Saturday, 5 January 2013

Hyper-V 3 ready to take on VMware

With cloud computing gaining momentum in the marketplace, will Microsoft be the virtualisation platform of choice? VMware have built their empire doing just one thing: virtualising data centres. They have been the clear choice for most, topping Citrix with a great range of products which have seen other vendors playing catch-up. One of those has been Microsoft, and until I saw the specifications for Hyper-V 3 I would have advised steering clear of Microsoft's virtualisation technology in all but the smallest of environments.

Hyper-V 3
Released with Windows Server 2008, Microsoft's first attempt at following VMware into data centre virtualisation with Hyper-V, while innovative, lacked many key features essential to success. Hyper-V 2, available with Windows Server 2008 R2, had caught up with VMware's offering but still missed some deal-making features.
Let's review a few claims from the VMware website:

Proven Efficiency:
VMware offers lower capital and operational costs than Microsoft due to VMware’s higher scalability and greater levels of administrative automation.
Third-party analysis (commissioned by VMware) shows that VMware can get 20% higher scalability and 91% lower operational costs.


Proven Business Value:
VMware uniquely solves customers’ business issues leading to greater business value, especially when moving to a private cloud, built on top of a proven foundation.
The result is greater business agility than most companies enjoy today, and reduced business risk by minimising application downtime and security and compliance risks.


While VMware has certainly been able to claim the virtualisation crown up to this point, with Hyper-V 3 released in Windows Server 2012 we see a service coming of age and ready to take on the best in the industry. Let's look at a few key areas:

Scalability:
Hyper-V now supports twice the number of logical processors (320) and twice the RAM (4 TB) per host compared with VMware, double the number of VMs (1,024) per host, and double the maximum nodes per cluster (64). Certainly on these statistics Hyper-V scales much higher.

Storage:
With Hyper-V 3.0 a new virtual disk format (VHDX) is introduced, capable of supporting virtual disks up to 16 TB against standard market support of around 2 TB, and it runs on Microsoft's SMB 3.0. It can leverage file shares as storage destinations, with four-node active-active clustered file servers providing simultaneous access to file shares. These enhancements fuel the virtualisation of Tier 1 applications and are critical for an enterprise-class virtualisation platform.
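Creating one of these large-format disks is a one-liner in the Server 2012 Hyper-V PowerShell module; a quick sketch, with the path and size as example values only:

# Create a dynamically expanding 16 TB VHDX
New-VHD -Path D:\VMs\Data01.vhdx -SizeBytes 16TB -Dynamic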

Networking:
Both Hyper-V and VMware offer similar features, though the distributed switch offered by VMware is an advantage for cloud infrastructure as it ensures standard configuration of virtual switches across all the servers in your cluster. However, Hyper-V 3 supports policy-based, software-controlled network virtualisation, crucial in the cloud era because everything is about policy-driven automation and orchestration, all key enablers of infrastructure-as-a-service deployments. In addition, Cisco supports Hyper-V on the Nexus 1000V.
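That switch layer is fully scriptable too; a minimal sketch of standing up an external virtual switch, assuming a physical adapter named "Ethernet":

# Bind an external virtual switch to a physical NIC, keeping the host OS connected through it
New-VMSwitch -Name "External vSwitch" -NetAdapterName "Ethernet" -AllowManagementOS $true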

Memory:
Though Microsoft has caught up with VMware memory management techniques
by introducing Dynamic memory, Ballooning and Memory over-commit similar to VMware, VMware offers TPS, Memory compression and resource sharing which all benefit larger environments. VMWare is still ahead in this area.
Clustering and Availability:
Hyper-V offers shared-nothing live migration, while VMware offers Fault Tolerance and Metro vMotion (migration across long distances with low latency). Shared-nothing migration can be achieved on VMware, but for this the VM needs to be powered off.
Secondly, the cluster configuration process is simpler for VMware. But Microsoft now have Hyper-V Replica, a new feature of Hyper-V 3 comparable in intent to VMware's availability features: it asynchronously replicates virtual machines from one Hyper-V host to another over an IP network, and the process is configured at the VM level. Add to that Failover Clustering, which is able to support 64 nodes and as many as 4,000 VMs, and we have continuous availability.
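Hyper-V Replica is equally scriptable; a sketch of the basic flow, with server and VM names as placeholders and assuming Kerberos authentication over port 80:

# On the replica host: allow it to receive replication traffic
Set-VMReplicationServer -ReplicationEnabled $true -AllowedAuthenticationType Kerberos
# On the primary host: enable replication for one VM and kick off the initial copy
Enable-VMReplication -VMName "FileServer01" -ReplicaServerName "hv-replica.contoso.com" -ReplicaServerPort 80 -AuthenticationType Kerberos
Start-VMInitialReplication -VMName "FileServer01"
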
Licensing:
Both are licensed per processor: for Hyper-V it's $4,809 and for VMware $3,495 (prices may vary). VMware imposes a 96 GB vRAM entitlement on its Enterprise Plus edition. Microsoft doesn't place any such restriction but limits virtualisation rights: Datacenter can create unlimited VMs, while Standard allows only two. An advantage of the Datacenter licence is that you can run an unlimited number of virtualised instances of Windows Server on licensed processors without purchasing additional licences.
Application support:
VMware supports more than 85 guest operating systems, while Microsoft supports around 25, primarily their own platforms. ESXi 5 is around 144 MB versus Hyper-V's 9.6 GB footprint, and has a lower attack surface. Windows Server, on the other hand, being general-purpose, has a high attack surface; Hyper-V is also an added role on Server 2012 rather than something specifically designed for virtualisation.

Microsoft have made huge leaps in a short timescale but still have a way to go to take significant market share from VMware.
Remember, VMware have been virtualising servers since 2001 while Microsoft's first serious attempt wasn't until 2008; Microsoft have come a long way in that time and have the resources to throw a lot of development dollars at it if need be.

If you are a Microsoft shop then Hyper-V may be the better choice, as it's designed for that ecosystem and sits well with Microsoft's cloud management tools such as the System Center suite.
If you are a smaller organisation looking for a cost-effective way to ease into the virtualisation space then Hyper-V is also the better choice.
For more diverse environments, VMware is still the market leader for now, but watch this space, because Microsoft are accelerating much faster.

The Silo Effect

Over the last few years a term has crept into business and managerial circles for a condition whose effects we've been feeling especially keenly within technology: the silo effect.

The name comes from farm silos, which house different types of grain in isolation, analogous to the political divisions within large organisations that represent the departments. Each department has its own clearly defined internal structure. Communication flows from the lower-level staff to the department head. In larger organisations there may be several levels of management within each silo. Communication between departments, or silos, only occurs at the top level, and then usually only during formal meetings with all department heads present.


Abandoned Grain Silo
This deficiency of communication causes departmental thinking to lack ideas and information from other departments, feeding only off creativity from within the silo. It propagates feelings of self-importance and keeps the power of synergy from working: the idea that the whole can be greater than the sum of its parts. Each silo comes to believe that it is the dominant structure, bidding against other silos for shares of capital with which to further its empire. Often without realising it, the organisation is in competition with itself, which leads to a self-destructive spiral driven by the insular thinking of self-preservation. Viewing the organisation as an organism, it is as if its own immune system has turned on it.

The silo effect on the IT systems and services of an organisation is often pronounced, and one with which I've had first-hand experience on a number of occasions.
Going by many different names, the IT support/services team's role is to provide the information technology services which the organisation can utilise to conduct its business. Depending upon the type of organisation, these services can be seen as little more than tools to help get the job done or, in the case of businesses like Amazon, be integral to the very business model they rely on.
In any case these services and systems are becoming exponentially more complex and interconnected as we take advantage of the acceleration provided by Moore's law.
Staff within the IT support/services team are the most qualified and experienced with all aspects of IT hardware and software, and should be the team that chooses which platforms and applications to adopt for the good of the business.

This, however, is often not the case. The silo effect causes elements within each team to dictate which IT tools they require. Little regard, if any, is placed upon how that requested platform or application will interoperate with existing technology in other departments, which ultimately must share and access similar data.

Usually it goes something like this: the finance director requests a new application at a senior staff meeting. They've recently hired a new finance exec, and he informs them that this software offers features their current one doesn't; besides, he's used it for many years and most of their competition use it. They have already been in touch with a sales partner for the application and only require the paperwork signing to complete the purchase. The IT support team will be provided with the application when it arrives and will be expected to install, manage and support this new product.
At no point were the IT support team called in to consult with the finance team on whether this new product was the best way to go, whether the partner outlet was the best place to procure it, or how it would interoperate with existing technology.

The result is often a mix of software systems, demanded by different departments, which are not designed to talk to each other, leaving the IT support team to hire additional specialist staff, train existing staff and spend significant amounts of time ensuring that data can flow from one app to the next, often by writing their own code to ensure it does. Of course it never works properly, is in a constant state of flux, and the IT services team get a bad name.
Occasionally it works in reverse, with IT services wanting to deploy new, innovative technology across the organisation which will reduce TCO, but the silos respond with claims that the work they do requires existing systems and it would be impossible to change.

In my experience, more often than not, an analysis of the benefits gained by adopting newly requested software against its TCO over a measured period is not undertaken.
There is no disputing that the new software requested by the xyz department will benefit the organisation, because it offers new features which will increase productivity for that silo. But there is a case to be made that in making this new software interoperable with the rest of the technology, in terms of how it is deployed, secured and outputs its data within n-tier platforms, we incur a greater cost than is gained.

Until organisational structures are changed and silos broken, IT service teams will never be able to provide the best end-to-end solutions to take their businesses into the next era, and many organisations which are not agile enough, and not working together at an inter-departmental level, simply will not survive.



Wednesday, 2 January 2013

Evolution - Lync 2013



Lync 2010 was the first credible attempt at a unified communications platform by Microsoft. Although OCS had been with us for many years, it never managed to elevate itself above a corporate version of MSN Messenger for most.

Six years ago Microsoft looked at the PBX and saw an isolated system with its own directories and infrastructure, unable to take advantage of the existing user groups and policies for the desktops and mobile computers already in use. As smartphones became popular corporate tools, that gave three separate ecosystems for an organisation to support just to use the most common forms of communication. Realising a chance to capitalise, Microsoft have built Lync as a complete unified communications platform providing the functions of an enterprise-wide IP-PBX with the best features of OCS, and including mobile phones alongside desktop and mobile PC platforms as endpoints.
Aside from providing common PBX features such as auto-attendants, hunt groups, IVRs, call parking, extension dialling and voice mailboxes, it integrates with Exchange to provide unified messaging: voicemail-to-email transcription and all the benefits of the Microsoft Exchange infrastructure. Being a Microsoft application, it makes use of Active Directory, negating the need to manage a separate directory. But its trump card, which no other system can claim to match, is the way its presence elements permeate the Office applications: open Outlook, read an e-mail from a few days earlier, decide you need clarification on a few key points, and you can see instantly whether the sender, or indeed any of the other recipients, are online with their Lync clients. Choose one that's available and you've got all forms of synchronous electronic communication just a click away.

There's no question that Lync can replace, and is replacing, PBX-driven systems around the world, and it goes beyond anything they can offer, despite the occasional cry to the contrary from telecoms engineers allied to other vendors that specialise only in VoIP systems. It's hardly surprising: a software giant is positioning to take their market away from them.

Interestingly, in my experience of boardroom meetings discussing Lync with business executives, there's strong initial resistance to the idea of replacing a tried and tested system, largely unchanged for decades, which has never let them down and has always existed as a separate entity, with a piece of software. And the idea of no longer needing that plastic box we call a telephone on our desks just seems too strange for many.
Migrating from a traditional IP-PBX system to Lync is a huge leap for both end users and IT support personnel; it pushes a complete new line of technology and services into their already complex ecosystem.

Although the real benefits of Lync are only seen when using its full potential, especially Enterprise Voice, as shown in the Forrester report, many organisations are opting to retain their existing IP-PBXs to handle voice and adopt Lync for its messaging and presence facilities, in the short term at least.
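
For those that do take the Enterprise Voice step, enabling it per user is a small job in the Lync Server Management Shell; a minimal sketch, with the SIP address and number as placeholder values:

# Enable a user for Enterprise Voice and assign a direct dial number
Set-CsUser -Identity "sip:j.smith@contoso.com" -EnterpriseVoiceEnabled $true -LineURI "tel:+442079460123"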

Microsoft have played a master stroke with Lync, and the upcoming new release, 2013, only builds on what is already a great UC platform. Some of the most awaited new features are:

  • H.264 replacing the proprietary RTV codec. This is a clever move to standardise HD video streams.
  • Persistent Chat replacing Group Chat. Create chat rooms, hold ongoing discussions with co-workers, create topic feeds. And no separate client needed.
  • Lync Web App, now a full-featured browser-based client allowing those without the Lync client to enjoy the full conferencing experience.
  • Skype integration. Lync federates with Skype users to provide IM, voice, video and presence.
  • Lync Online: create hybrid deployments with a mix of on-premises and Lync Online servers. Microsoft calls this "hybrid voice".
  • High availability, with each front-end server storing a complete copy of all the databases in the SQL back-end; if the back-end SQL database server is unavailable, the front end will still function. Also, Lync 2013 supports SQL mirroring for the back-end databases, reducing the hardware costs associated with clustering SQL.
  • Co-location of the AV Conferencing and Archiving/Monitoring roles on the front-end servers, with the Director role optional (was it not before?).
  • A VDI plugin which allows full audio and video support for virtual desktops.
  • New Lync mobile clients, releasing first for Windows Phone, then Fruit and finally Android, supporting the full voice experience.
So is Lync a replacement for the PBX? Maybe we should be asking if a PBX could be a replacement for Lync.

Tuesday, 1 January 2013

Microsoft Certification - MCSE is back



In 2007 Microsoft released a new certification framework which replaced the well-known and respected MCSA and the premium MCSE with a pathway of technology-based and role-based certifications. According to Microsoft, this was in response to concerns from hiring managers that the MCSE, Microsoft Certified Systems Engineer, was too vague. The new framework featured MCTS certifications, TS for Technology Specialist, and each product or technology had its own. Microsoft added job-role-linked MCITPs, IT Professional certifications. Often called the Pro series, they required one or more TS exams plus the professional series exam to gain the certification.

Industry response to the change was mixed: although the new certifications provided clarity over skill sets, there were too many ITPs, and many job roles required a combination of two. What's more, there was no direct equivalent to the much-loved MCSE, the closest match being MCITP: Enterprise Administrator, which required a total of five exams, four TS and one Pro series, to attain. This certainly made the certifications more accessible, as the old-money MCSE required a minimum of six exams. We also saw a shift in exam style, removing the "many answers may be correct but choose the one we think is best" format and introducing a much less vague and presumptuous "here's a list of answers, select the only one that works" system.
MCSE Built for the Cloud
The idea was to provide a larger global skill base of certified personnel. However, five years after its introduction I still frequently see the MCSE asked for in tech job advertisements.

It's not that Microsoft got it wrong (the MCSE was too vague), but understanding this new framework was asking too much of hiring managers.

With the surge in cloud-based technologies, Microsoft have re-launched their certification framework again. Demand for "cloud-ready" IT workers will grow by 26 percent annually through 2015, with as many as 7 million cloud-related jobs available worldwide, according to an IDC white paper sponsored by Microsoft. However, IT hiring managers report that the biggest reason they failed to fill an existing 1.7 million open cloud-related positions in 2012 is that job seekers lack the training and certification needed to work in a cloud-enabled world (Climate Change: Cloud's Impact on IT Organisations and Staffing, November 2012).

This re-launch coincides with new releases of many front-line applications and platforms in the latter part of 2012 and early 2013, Windows Server 2012, Windows 8, Exchange Server 2013, Lync Server 2013, System Center 2012 and SQL Server 2012 among them. All share common management interfaces, are extremely interoperable, are designed to be virtualised and, as the catchphrase says, are built for the cloud.
Microsoft are certainly positioning to capitalise more than anyone in the Cloud revolution and their new certification framework is designed to assist this.

The new framework has three primary levels, from bottom to top:

2012 Certification Pyramid
The Associate: MCSA (Microsoft Certified Solutions Associate). A starting point for job seekers and those wishing to formalise existing skills. This is the foundation level and represents a prerequisite for the MCSE. Currently five MCSAs are offered: Windows Server 2012, Windows Server 2008, Windows 8, Windows 7 and SQL Server 2012.

The Expert: MCSE (Microsoft Certified Solutions Expert) and its developer equivalent, MCSD (Microsoft Certified Solutions Developer), are Microsoft's flagship certifications for individuals who want to lead their organisation's transition to the cloud. Eight MCSEs are currently offered: Server Infrastructure, Desktop Infrastructure, Private Cloud, Data Platform, Business Intelligence, Messaging, Communication and SharePoint.

The Master: MCSM (Microsoft Certified Solutions Master) certification is for the select few who wish to validate the deepest level of product expertise, as well as the ability to design and build the most innovative solutions for complex on-premises, off-premises and hybrid enterprise environments using Microsoft technologies. In addition to the MCSE certification prerequisites, candidates must complete a Knowledge exam and a Lab practical exam. Certification lasts for three years and can be renewed by completing a re-certification exam.
Currently there are four MCSMs available: Data Platform, SharePoint, Communication and Messaging.

On closer inspection, many of the MCSE and MCSM tracks require exams which are not yet available, primarily because the technology upon which they are based has not yet been released to manufacturing.

Having completed three of the new MCSAs and two of the new MCSE certifications to date, Private Cloud and Server Infrastructure, I can report that there is indeed an emphasis placed upon virtualisation and cloud-based management; however, this is a thin veil over what is basically another release of the Windows Server and System Center exams, cleverly spun by the silk weavers in Microsoft's marketing department.

Something unexpected appeared when I downloaded my certificates from the Microsoft secured site: in the top-left corner of each were the words "Charter Member".
Apparently this status is awarded to the first 5,000 people completing the certification requirements, an incentive to attract early adopters, as the more certified professionals there are, the more software licences Microsoft will sell.

Microsoft will make this new certification framework a success; although it stands for something different, it's still the MCSE acronym we all came to understand as the gold standard for IT certs. And now, with the technology more complex and interdependent than ever, for many corporations hiring certified staff will no longer be optional.