Monday, 4 March 2013

Are Homogeneous Environments Better?


Not so many decades ago, if you wanted software for your IBM computing environment you commissioned a software house to write it for you: you owned it and altered it to suit changes in the marketplace. Nowadays we buy off-the-shelf products and make them fit our needs. There's a whole marketplace of software vendors, and when we need to achieve a particular business function we have many products to choose from.

So how should we make our choice? Do we go with the cheapest, the one with the best support contract, or the one that offers us the most functions?
As with any business procurement, we should choose the one that, after everything is taken into consideration, offers the biggest return on our investment over a measured period, whether through additional generated business or reduced operating costs.

Typically this means we tend to grow our IT environments around multiple vendors which, at the time of purchase, may have offered the best product for us. In slightly less able organisations the choice is steered by whichever salesman does the best job, as the internal team just don't have the skill-set to make the right decisions.

There is a case to say that, taking each business function in isolation, it is correct to choose from the whole of the market the application which best meets its needs, but this leads to a heterogeneous environment with multiple vendors: applications which must be tied together despite never being designed to do so, much greater complexity in supporting the environment, less uniformity, a greater chance of failure and, when something does go wrong, a "pass the blame" attitude from the vendors.

Take all of our electronic business functions into consideration: not just the usual authentication, file storage and e-mail, but also telephony. Until recently it hasn't been possible to build a truly homogeneous environment based around one set of centralised management tools; for many organisations this is a holy grail. Over the past three years Microsoft have gone strongly into innovation mode on many product lines. For the first time they seem to have a master plan of cohesiveness, with development teams communicating closely to ensure uniformity over the entire product range. For the first time these products are truly designed to work with each other, using common management interfaces right across the data centre for our n-tier applications, whether on-premise or private cloud. The turnaround started with Exchange 2010 and its leveraging of local storage as opposed to complete reliance on expensive shared storage, and extends from there to System Center 2012, the new version of SQL Server following in Exchange 2010's footsteps and technology, Windows Server, Windows 8, the new App-V, Office 365 and, not forgetting, Lync Server.
Many of the products in Microsoft's new line require us to invest in yet others. Take Lync Server: we may have decided to move from an old PBX system to Enterprise Voice, so we invest in a Lync Server infrastructure; we then need Microsoft SQL servers, and although we were going to migrate away from Exchange we now have a reason to retain it and upgrade to the latest version. While we are doing that we should probably virtualise using Hyper-V and, for management of the whole infrastructure, adopt the one-stop shop of the System Center suite.

There are many benefits to this approach:

  • A common management interface. All of these products share a common installation routine and web- or MMC-based tools, and PowerShell commands provide a standard across everything.
  • Guaranteed interoperability. With all platforms and applications designed to work with each other there's less time spent on workarounds.
  • Documentation on everything in one place. If we need to know something, we just go to Microsoft TechNet.
  • More timely training of support staff. Skills can migrate quicker across applications for the administrators and engineers as many products have shared concepts and terminology.
  • One point of contact for troubleshooting. It's all Microsoft software, so there's no one else to be directed to.
  • Less complex environment leading to less downtime.
  • A centralised management and deployment point. Using System Center we can deploy, manage, secure and update our entire infrastructure.
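That PowerShell uniformity is easy to illustrate. The cmdlet names below are real, but the identities and the exact output handling are only examples; treat this as a sketch of the shared verb-noun pattern rather than a tested script:

```powershell
# The same Verb-Noun convention and pipeline apply right across the range.
# "jsmith" and "APP01" are example identities.
Get-Mailbox -Identity "jsmith" | Format-List DisplayName,Database   # Exchange 2010
Get-CsUser -Identity "jsmith"                                       # Lync Server
Get-VM | Where-Object { $_.State -eq 'Running' }                    # Hyper-V (Server 2012)
Get-SCVirtualMachine -Name "APP01"                                  # System Center 2012 VMM
```

An administrator who knows the pattern in one product can guess their way around the others, which is exactly the skills-migration benefit described above.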
The above benefits feed into a lower total cost of ownership through requiring fewer support personnel: each staff member can cover a bigger skill base and support multiple products. Instead of "the Exchange guy" we now have "the UC guy", and he looks after all of our messaging and telephony. We also don't need to maintain system integrators or hire consultants by the hour to integrate disparate technologies, and less downtime means fewer business hours lost through loss of service.

Despite the obvious benefits of a homogeneous environment, many organisations don't subscribe to this approach and prefer to remain vendor neutral. Whatever the case, Microsoft will continue to roll out their master plan as we move forward through 2013; I suspect they have many more surprises for us over the next few years.

Wednesday, 23 January 2013

Polycom Lync Phones new simplified config


Polycom's UCS 4.1.0 software release introduces a completely new, streamlined provisioning process. The server-based provisioning process is still available for large deployments in which automation is required, but for SME deployments there is a new Out of Box Experience (OOBE) that is greatly simplified over the previous process. For most environments there is no longer any requirement to use the XML configuration files.
 


The new version is capable of downloading a private CA root certificate used by the Lync Server, in the same fashion as the existing Lync Phone Edition devices. It requires that DHCP Option 43 be properly configured for the target Lync environment and that the phones have access to the Lync Server Certificate Provisioning web service. To confirm the currently installed software on your Polycom IP phone, either check on the phone itself, the label underneath, or access the phone's web browser interface. To check the software version from the device interface, press the physical Home button and then tap the following menu items: Settings > Status > Platform > Application > Main.

This process is now much more in line with the provisioning of Lync Phone Edition devices such as the Polycom CX600.
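For the Option 43 piece, Lync Server ships a helper, DHCPUtil.exe, which can generate the vendor-specific option values and test them from a client's perspective. A sketch, assuming Lync Server 2013's default installation path and an example Front End FQDN:

```powershell
# Run on the Lync Front End (path and server name are examples)
cd "$env:ProgramFiles\Common Files\Microsoft Lync Server 2013"
.\DHCPUtil.exe -SipServer lyncfe.contoso.local -RunConfigScript   # configure Options 43/120
.\DHCPUtil.exe -EmulateClient                                     # verify from a client VLAN
```

The -EmulateClient switch is useful for confirming the phones' subnet actually receives the options before you rack the first handset.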



Saturday, 12 January 2013

Installing .Net Framework 3.5 Feature on Windows Server 2012

When trying to install certain roles on Windows Server 2012 you may need to add the .NET Framework 3.5 feature. If you're trying to do this from within the GUI then it will most likely error, as the 3.5 framework is now part of what Microsoft are calling Features on Demand.

To remedy this, either open an elevated prompt and run the command below, assuming the source media is in drive D:

dism /online /enable-feature /featurename:NetFX3 /all /Source:d:\sources\sxs /LimitAccess

Alternatively, follow the procedure below to specify an alternate source path.
1. Insert the Windows Server 2012 DVD or mount the ISO as drive D:.
2. Open the Add Roles and Features Wizard and select the .NET Framework 3.5 feature.
3. Select the Specify an alternate source path link on the Confirm installation selections screen.
4. Enter the path D:\Sources\SxS and then click OK.
5. Finally, click the Install button.

You can now proceed with adding your core role.
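If you prefer PowerShell to DISM, the Server Manager module offers an equivalent one-liner (again assuming the media is mounted as drive D:):

```powershell
# PowerShell equivalent of the DISM command on Server 2012
Install-WindowsFeature -Name NET-Framework-Core -Source D:\sources\sxs
```

Either route does the same thing; the -Source parameter is what stops Server 2012 trying to reach Windows Update for the payload.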

Thursday, 10 January 2013

System Center 2012 SQL Collation Error

Recently several sys admins have asked me for help with an upgrade to System Center Configuration Manager 2012.
Firstly, although it may seem as though System Center 2012 was designed to be hosted on Windows Server 2012 and with SQL Server 2012 as the backend, this is not the case.
For that you'll need to wait for System Center 2012 Service Pack 1. Grab the beta from here: http://www.microsoft.com/en-us/server-cloud/system-center/sp1-default.aspx.
System Center 2012

To run SCCM 2012 in a production environment today, Windows Server 2008 R2 is an ideal host, with SQL Server 2008 R2 (Service Pack 1 and Cumulative Update 4) as the backend.
I've had no issues with this setup in the past.

The second issue people run into is SQL Server collation; SCCM may fail the installation check with the message:

"Installation check of SCCM 2012 fails with: Configuration Manager requires that you configure your SQL Server instance and Configuration Manager site database (if already present) to use the SQL_Latin1_General_CP1_CI_AS collation, unless you are using a Chinese operating system and require GB18030 support"

This can happen not only on production SQL servers with several active databases but, as I have verified, on clean installations as well.

To change the SQL instance collation, open an elevated command prompt in the SQL Server installation media folder and execute the following command, assuming the default instance name and that the currently logged-on user should be granted sysadmin rights:


setup.exe /ACTION=REBUILDDATABASE /SQLCOLLATION=SQL_Latin1_General_CP1_CI_AS /INSTANCENAME=MSSQLSERVER /SQLSYSADMINACCOUNTS=%username%

Then re-run the SCCM 2012 installation check, it should pass the above issue and allow continuation of the installation.
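Before rebuilding, it's worth confirming what the instance collation actually is. A quick check, assuming sqlcmd is on the path and the default instance:

```powershell
# Returns the server-level collation, e.g. SQL_Latin1_General_CP1_CI_AS
sqlcmd -S . -Q "SELECT SERVERPROPERTY('Collation') AS ServerCollation"
```

If it already reads SQL_Latin1_General_CP1_CI_AS then the failed check has another cause and a rebuild won't help.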

Monday, 7 January 2013

Windows 8, Business ready?

Thanks to an aggressive marketing campaign, no one can claim not to have seen the new "Metro Style" UI shared by both Windows 8 and Windows Phone.
This is the biggest change in a desktop operating system user interface since Windows 95.
Windows 8 - Dell XPS Duo


On the Surface tablet and other multi-touch devices it clearly works, and works well, with large, bright live tiles which launch applications primarily geared toward social media.
I've run the Developer Preview on a Zoostorm SL8 since the day of its release, followed by Windows 8 Pro prior to the Surface RT becoming available, and I can say that as a web browser and a social application platform it's better than anything else out there. It's a big version of the Windows Phone, as the iPad is to the iPhone. Microsoft have completely reinvented what it is to be Windows, and it offers a great deal more than the i-range, with the freshest, cleanest, most modern-looking interface to grace the industry to date. If we were only reviewing its functionality for personal use then that would be it, case closed: Windows 8 and Windows Phone will be a success.

But as Microsoft have consolidated on a single OS for both business and pleasure, it has to work equally well on both counts, and I'm not sure it does. When showcasing Windows 8 to technicians and admins on desktop machines, their first reaction is to click the Desktop icon and drop to that familiar interface. This only delays the somewhat painful transition from Windows 7 to 8: Microsoft have included the desktop to run legacy applications only, and anything written for Windows 8 will run within the new interface. As we move forward, legacy apps will be left behind along with the desktop.

Windows 8 has to work within industry, for casual business users as well as serious number-crunching apps. Currently Microsoft have close to 90% of the desktop market, and Windows 7 has sold more than any other Microsoft operating system, with over a billion sales.
At first it seems Microsoft have alienated their dedicated business users, forcing them to change the way they work just because they wanted to try something new. But Microsoft have undertaken more research, and are more aware of the shifting trends in the marketplace, than anyone. Evaluated on a mouse-driven desktop or laptop system designed for XP or Windows 7, Windows 8 doesn't work. I've had feedback from countless IT managers and senior engineers stating that their organisation will not be adopting it. But this is a very short-sighted view, and we hear the same cries every time Microsoft change Windows. There are two reasons for this: firstly, the IT support teams have to learn a new skill set just as they are getting comfortable with supporting the previous Windows; and secondly, the end users supported by such teams lodge protests about the change as it prevents them from being as productive. Why change something if it already works?

To balance the argument we need to look at Microsoft's motivation for risking some of its lucrative business desktop market. New IT systems and services should only be adopted either because the existing ones need replacing, having reached end of life or lost compatibility, or because the new ones can generate a positive ROI. With Windows 8 it is a bit of both.
To understand the situation we have to realise what Microsoft already know, that desktop and mobile computing platforms we use today are not the ones we will be using tomorrow.
Our computing is becoming more ubiquitous, more portable, easier to interact with.
The mouse and the keyboard are not the HCI tools we will be using forever. The next generation of desktop and mobile computing devices will be primarily touch- and voice-driven, with mouse and keyboard as secondary input systems, not required for many tasks.

As an example, let's look at using a Windows 8 desktop computer with a touch monitor, installed with Microsoft Lync. I wish to call a colleague on his office extension.
I launch Lync by touching the metro icon. It logs me in with SSO, I scroll down my contact list using touch, tap the photo of the colleague I wish to call and tap the phone icon.
I'm in a VoIP call to another worker using my Windows PC and I haven't yet had to use a mouse.

In addition to the new hardware platforms favouring portability, voice and touch, the applications themselves are changing. They leverage presence, social media, activity feeds and connectedness. The office of the future is expected to be a very different place, depending upon many services once thought of as counterproductive, such as messaging, with social networking tools at its heart.

Windows 8 will help consolidate the diverse deployment technologies currently required and bring better integration with cloud services. While at first it will only be welcomed on tablet devices, eventually Windows 8-like systems will be the destination of all computing devices.



Sunday, 6 January 2013

Office 365 Certification, who is it for?

I have been contacted recently by several Microsoft Office Specialists who support information workers, wanting to know which resources are best for learning Office 365 with a view to certifying at some point in the future. They are currently certified as Microsoft Office Specialist for the 2007 Office System, are aware of recent changes within the industry and have decided to up-skill.

With the recent hubbub over cloud services it's obvious that people are looking at the Office 365 exams, 70-321 and 70-323, as the next step; however, for IT staff in this category it may not be the right choice.

Microsoft Office Specialist Certification
Microsoft have run the "MOS" certification programmes for many years through many iterations of the Office applications. Until the 2010 edition it confined itself to Access, Outlook, PowerPoint, Excel and Word. With the rise in popularity of SharePoint, it's a solid decision by Microsoft to include this with the MOS: 2010 certification, giving a total of 8 exams.

However, the Office 365 certifications belong to the ecosystem of certifications in the professional track. There are two required to provide the full Office 365 Administrator professional series certification:

MCTS: Administering Office 365.
Exam number 70-323 which covers the following learning domains:

  • Administer Microsoft Office 365 (35%)
  • Administer SharePoint Online (31%)
  • Administer Exchange and Lync Online (34%)
Pro: Deploying Office 365. 
Exam number 70-321 which covers the following learning domains:
  • Plan and Implement Office 365 Accounts and Services (20%)
  • Plan for and Configure SharePoint Online (19%)
  • Plan for Exchange Online (20%)
  • Implement Exchange Online (21%)
  • Plan Online Services and Infrastructure (20%)
As can be seen, they require detailed knowledge of Active Directory Domain Services and Certificate Services as well as DNS infrastructure and SSO. But there's also a requirement for a good working knowledge of Exchange, SharePoint 2013 and, to a lesser extent, Lync Server. Add to this the Office 365 applications themselves, and the fact that the exams require a good working knowledge of PowerShell cmdlets, and we have a series of quite tough exams aimed at existing administrators of Microsoft infrastructure and application services, not those supporting information workers.
It's also true to state that the exams centre on larger organisations with thousands of users over many sites, often with hybrid environments blending on-premise services with private cloud.
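The PowerShell element of the exams centres on the MSOnline module. The cmdlet names below are real, though the session itself is only a sketch of the kind of tenant administration the exams expect:

```powershell
# Connect to the Office 365 tenant and review licensing state
Import-Module MSOnline
Connect-MsolService            # prompts for tenant administrator credentials
Get-MsolUser -All | Select-Object UserPrincipalName, IsLicensed
```

If commands like these feel unfamiliar, that's a fair signal the professional-track exams are aimed at a different audience.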

For IT Professionals looking to certify with Office 365 to support the SME market it's worth taking a look at  exam 74-324 Administering Office 365 for Small Businesses.

For existing MOS-certified staff looking at Office 365 from a support point of view, the Microsoft Office Specialist on Microsoft Office 2010 track is the right place to start, specifically exam 77-891, MOS: Microsoft Office 365.

I'll close with a few links to some of the best Office 365 resources out there at the moment.
First the Official Jump Start videos from Microsoft:

Office 365 Jump Start Videos:

And finally the Microsoft Office PDF by Katherine Murray, Connect and Collaborate Virtually Anywhere, Anytime:

Office 365 PDF eBook: 

Saturday, 5 January 2013

Hyper-V 3 ready to take on VMware

With cloud computing gaining momentum in the marketplace, will Microsoft be the virtualisation platform of choice? VMware have built their empire doing just one thing: virtualising data centres. They have been the clear choice for most, topping Citrix with a great range of products which have seen other vendors playing catch-up. One of those has been Microsoft, and until I saw the specifications for Hyper-V 3 I would have advised steering clear of Microsoft's virtualisation technology in all but the smallest of environments.

Hyper-V 3
Released with Windows Server 2008, Hyper-V was Microsoft's first attempt at following VMware into data centre virtualisation; while innovative, it lacked many key features essential to success. Hyper-V 2, available with Windows Server 2008 R2, had caught up with VMware's offering but still missed some deal-making features.
Let's review a few claims from the VMware website:

Proven Efficiency:
VMware offers lower capital and operational costs than Microsoft due to VMware’s higher scalability and greater levels of administrative automation.
Third-party analysis (commissioned by VMware) shows that VMware can get 20% higher scalability and 91% lower operational costs.


Proven Business Value:
VMware uniquely solves customers’ business issues leading to greater business value, especially when moving to a private cloud, built on top of a proven foundation.
The result is greater business agility than most companies enjoy today, and reduced business risk by minimising application downtime and security and compliance risks.


While VMware has certainly been able to claim the virtualisation crown up to this point, with Hyper-V 3 released for Windows Server 2012 we see a service coming of age, ready to take on the best in the industry. Let's look at a few key areas:

Scalability:
Hyper-V now supports twice the number of logical processors (320) and RAM (4 TB) per host compared with VMware, double the number of VMs (1,024) per host and double the maximum nodes per cluster (64). Certainly on these statistics Hyper-V scales much higher.

Storage:
With Hyper-V 3.0 a new virtual disk format (VHDX) is introduced, capable of supporting virtual disks up to 64 TB against standard market support of around 2 TB, and it runs on Microsoft's SMB 3.0. It can leverage file shares as storage destinations with four-node active-active clustered file servers, providing simultaneous access to file shares. These enhancements fuel the virtualisation of Tier 1 applications and are critical for an enterprise-class virtualisation platform.
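Creating one of these large VHDX disks is a one-liner in the Server 2012 Hyper-V module; the UNC path and size below are examples only:

```powershell
# Dynamically expanding VHDX on an SMB file share (example path)
New-VHD -Path \\fs01\vms\data01.vhdx -SizeBytes 10TB -Dynamic
```

The same cmdlet works against local storage, which is part of what makes the file-share option so transparent to the VMs.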

Networking:
Both Hyper-V and VMware offer similar features, though the distributed switch offered by VMware is an advantage for cloud infrastructure as it ensures standard configuration of virtual switches across all the servers in your cluster. However, Hyper-V 3 supports policy-based, software-controlled network virtualisation, crucial in the cloud era because everything is about policy-driven automation and orchestration, key enablers of infrastructure-as-a-service deployments. In addition, Cisco supports Hyper-V on the Nexus 1000V.

Memory:
Microsoft has caught up with VMware's memory management techniques by introducing Dynamic Memory, ballooning and memory over-commit, but VMware also offers TPS, memory compression and resource sharing, which all benefit larger environments. VMware is still ahead in this area.
Clustering and Availability:
Hyper-V offers shared-nothing live migration, moving a running VM and its storage between stand-alone hosts over the network. VMware offers Fault Tolerance and Metro live migration (migration across long distances with low latency); shared-nothing migration can be achieved with VMware too, but for this the VM needs to be powered off.
Secondly, the cluster configuration process is simpler for VMware. But Microsoft now have Hyper-V Replica, a new feature of Hyper-V 3, which asynchronously replicates virtual machines from one Hyper-V host to another over an IP network and is configured at the VM level. Add to that failover clustering, able to support 64 nodes and as many as 4,000 VMs, and we have continuous availability.
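Both shared-nothing live migration and Hyper-V Replica are driven from PowerShell; the host, VM and path names below are examples:

```powershell
# Shared-nothing live migration: VM, state and storage move over the network
Move-VM -Name "APP01" -DestinationHost "HV02" -IncludeStorage `
    -DestinationStoragePath "D:\VMs\APP01"

# Enable asynchronous replication of the same VM to a DR host
Enable-VMReplication -VMName "APP01" -ReplicaServerName "HVDR01" `
    -ReplicaServerPort 80 -AuthenticationType Kerberos
```

Configuring replication at the VM level, as here, is what lets you protect only the workloads that warrant it rather than whole LUNs.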
Licensing:
Both licences are offered per processor: for Hyper-V it's $4,809 and for VMware $3,495 (prices may vary). VMware imposes a 96 GB vRAM entitlement on its Enterprise Plus edition. Microsoft doesn't place resource restrictions but does limit virtualisation rights: Datacenter can create unlimited VMs, while Standard allows only two. The advantage of the Datacenter licence is that you can run an unlimited number of virtualised instances of Windows Server on the licensed processors without purchasing additional licences.
Application:
VMware supports more than 85 guest operating systems while Microsoft supports around 25, primarily their own platforms. ESXi 5's footprint is around 144 MB versus Hyper-V's 9.6 GB, giving it a lower attack surface. Windows Server, being general purpose, has a higher attack surface, and Hyper-V is an added role on Server 2012 rather than an OS designed specifically for virtualisation.

Microsoft have made huge leaps in a short time scale but still have some way to go to take significant market share from VMware.
Remember, VMware have been virtualising servers since 2001; Microsoft's first serious attempt wasn't until 2008. They have come a long way in that time and have the resources to throw a lot of development dollars at it if need be.

If you are a Microsoft shop then Hyper-V may be the better choice, as it's designed for that ecosystem and sits well with their cloud management tools such as the System Center suite.
Also, if you are a smaller organisation looking for a cost-effective way to ease into the virtualisation space, Hyper-V is the better choice.
For more diverse environments, VMware is, for now, still the market leader, but watch this space because Microsoft are accelerating much faster.