When evaluating options for large technology procurements, the choice looks simple: buy or lease.
Unfortunately, that’s where the simplicity ends. A number of factors enter into the “buy vs. lease” decision, and a clear understanding of each is key to making the best choice for your business.
Buying is easy. People understand buying, because we do it every day in our personal lives. Even when the subject of outright purchase versus financing over time comes into play, these are still both concepts that we have experience with and comprehend.
Leasing, on the other hand, is a complicated topic. First, there are multiple nuances and options to choose between — such as capital versus operating lease, type of partner, lease length and terms, interest rates, buyout options, and product-only versus product-service bundles, to name a few. Second, and perhaps more telling, is the simple fact that for most of us, long-term leasing just isn’t a practice with which we are familiar. It certainly isn’t as familiar as buying, anyway.
However, in conjunction with a coordinated asset management program, IT leasing can actually lower the total cost of ownership. Leasing also offers advantages that ownership does not:
- It eliminates the risks and costs associated with disposing of technology assets.
- IT assets can be acquired as an operating expense rather than a capital outlay, which generally appears more favorably on the balance sheet and avoids the budgetary-approval process required for capital purchases.
- Off-balance-sheet benefit #1: when money is borrowed to purchase computers, liabilities increase, impairing asset-to-liability ratios and reducing liquidity; operating leases keep these obligations off the balance sheet.
- Off-balance-sheet benefit #2: lease payments are generally 100% tax-deductible because they are an operating expense.
- 100% financing is normally available, and lease terms can be structured beyond normally available loan terms.
To help IT professionals better understand leasing as a viable procurement option, we’ve developed Technology Leasing 101, a free, downloadable eBook.
Click here to download the free eBook (no registration).
In Technology Leasing 101, we cover all the basics of leasing so that IT professionals can make the best possible procurement recommendations for their companies.
- Chapter 1: Introduction
- Chapter 2: Leasing Vocabulary
- Chapter 3: Financing Decisions
- Chapter 4: Advantages of Leasing
- Chapter 5: Capital vs. Operating Leases
- Chapter 6: Understanding Payments
- Chapter 7: Choosing a Lease Partner
- Chapter 8: The OEM Lease
- Chapter 9: Interest-Free Leasing
- Chapter 10: Managing Lease Schedules
- Chapter 11: Managing the End-of-Lease Process
Leasing is a topic we’ve covered several times here on the blog. Due to the response to those posts, we felt it would be helpful to the community to provide technology leasing information in an organized, easy-to-reference, downloadable guide.
Do you want to learn more about technology leasing? Download the Technology Leasing 101 eBook now.
Jeffrey Goldstein is Senior Consultant at MCPc and is responsible for the delivery of hardcopy and value-added services within the Lifecycle Management Group. Connect with Jeff on LinkedIn.
MCPc has been fortunate to benefit from Bruce’s breadth of knowledge related to PC lifecycle management. Closed Loop Lifecycle Planning offers a comprehensive view on all aspects of lifecycle management, including the importance of project and program management. We appreciate Bruce’s willingness to be interviewed for our blog, bringing together two of our core competencies — project management and PC lifecycle management.
Please offer a brief overview of Closed Loop Lifecycle Planning©, and why project management is a critical element.
Closed Loop Lifecycle Planning is the bill of materials required to support client computing, as well as server computing and component peripherals. There’s also a user segmentation cost-of-change aspect, all of which has evolved from the initial research in developing the methodology.
One of the conclusions of my research is simply that lifecycle management (LCM) is a project, and if you try to manage the bill of materials without integration, you sub-optimize various components of it. Project management is really the glue that holds this type of project together and gives you the ability to secure all of the benefits from it.
What are the key project-management deliverables?
The key deliverable is a clearly defined plan complete with milestones. The plan must consider enablers and inhibitors: what obstacles may get in the way and what will enable the velocity of the project.
As you work on the plan, the maturity of the LCM practice levels will determine the reports you generate, how often you run them and how frequently you assess them.
One deliverable I look at in the planning phase is a clear understanding of costs. For example, let’s say that you’re planning a Windows 7 deployment and your goal is continued process improvement over what you’ve done over previous years. To know if you meet this goal, you must have a baseline of your cost.
Then in the definition phase, you need things like software rationalization reports and definitions of service levels (particularly if new service levels will be built).
As the architect, you need to address issues such as what client automation tools you’re using. If the last refresh of a significant nature you completed was 24 months ago (because of the recession, for example), you’re now looking at very mature client-management toolsets. Consider how you use them differently than before to deploy.
A key role of the project manager is to identify the unintended consequences of the strategies you’ve implemented. A good project manager will be able to look at business processes and determine their implications and costs. Those who are truly skilled will put things in both the technology context and the business context.
Can you expand on the idea of enablers and inhibitors as they relate to risk?
It is critically important to understand the level of risk in a given project. If a project has a lot of risk, you need to develop countermeasures; if risk is minimal, then that gives a program more flexibility.
Inhibitors can include: regulatory and legal issues, culture and politics of the company and technological considerations. You also have to consider outcomes, such as disposing of old assets in an environmentally safe way, getting DoD 3-pass wipe at end of life and having certificates of destruction to protect consumer information and intellectual property.
The project manager must manage the risk cycle. This means that each element in the bill of materials is reviewed for risk. For example, if you hire contract labor to do the deployment, you have to ask: Are you indemnified? Are they certified? Have you done all your due diligence to treat them as a full-time equivalent of your company?
Enablers include things like senior leadership commitment — it’s always good to have senior leadership buy-in into the methodology and outcome, then have periodic reviews so that they are a part of the ongoing project. Other enablers could be the technology and the maturity of client-management automation tools. For example, tools that do end-user profiling enable end-user segmentation, which can get you to root-cause analysis — ideal for project managers to look at. In a PC deployment project, the PCs themselves can be an enabler if they allow for new service levels and ongoing support.
What are some key mitigation strategies for top risks of refresh projects?
I’ll give you a very specific example: If a project manager looks at risk for data protection, he or she needs an encryption strategy and password protection. Part of what he or she should do is look at the existing, business-as-usual model, determine the associated risk and exposure, quantify the business case and then report these findings to senior leadership.
With this example, you get into details such as software license compliance, harvesting licenses, encryption, whether a desktop is more secure than a laptop and user segmentations.
What reports are typically generated or maintained during the project?
Every project should have a scorecard or dashboard. The dashboard should include: size of the install base, percentage of deployment within the install base, customer satisfaction (using clearly defined metrics), incident management control, time to deploy, improvement strategy and status against the cost baseline that was established at project implementation.
The most important data for the project, which was identified in the project plan, should be included on the dashboard. In a successful project, all metrics on the dashboard should be met or exceeded.
We are seeing more clients attempting to blend domestic and international lifecycle refresh projects. What are the critical success factors for these complex global projects?
There’s nothing wrong with domestic and international deployments, but there is an additional complexity in global projects, and all parties involved should understand this.
Most businesses can do U.S.-based lifecycle planning, but when you add in the global complexities of local laws, customs, relationships, taxing structures and labor requirements, it’s clear that there is more needed from the project manager to ensure a successful project.
When blending projects under one project management office (PMO), roles, responsibilities and workflows still need to be clearly fleshed out. There’s really no technical or business reason not to share that construct, but if you put together one blended figure you need to be good at sizing it, because it’s difficult to break specific elements out at the end if there are issues.
How does user segmentation blend across domestic and international deployments? Is it common to have different user segments and service levels?
The key point is that global projects must start with a service-level discussion. For example, let’s say a service level for executives is concierge, meaning that there are dedicated resources associated with them. Can you deliver that globally? Are there resources in the various countries you’re supporting that can deliver it? Is the infrastructure in place?
Or, for road warriors: What is the support plan on a global basis? Is there a mail-in service? What’s the SLA? Does it vary from country to country or business unit to business unit? How is that priced out if cost drivers are different as users move around the globe?
Global LCM projects also add a level of risk and cost. If you look at imaging, for example: What’s your core image? Can you deliver that core image across the globe, consistently? There are global SKUs as well, in terms of product configuration. If different business units and user segments will have different images, that adds a lot of complexity, and then there is a dependency upon what software-management tools will be used.
What tools must the project manager have in their kit bag (i.e., tools, templates and techniques) to ensure a successful project?
This can vary greatly from business to business. I do, however, try to encourage people to use tools that are commercially available, particularly for outsourced projects, because they are very predictable and mature.
There are several options when it comes to service- and software-management tools, from companies including HP, Symantec and Microsoft. The most important considerations for program managers when choosing a toolset are that it be comprehensive, understandable and easy to manage.
How often do you see customers complete a refresh without implementing additional tools, still managing assets in Excel?
I’m not seeing as much of that as I used to. Most businesses today recognize that management by spreadsheet is typically not sufficient due to regulatory issues and consumer protection laws.
To remain compliant, businesses often have to be able to look at their complete install base at any time, know what assets are on the network, validate that they are encrypted, validate end users, validate where that system might be on any given day for mobile users, and put an agent on the device to do remote asset management. All of this is awfully hard to do via spreadsheet. For larger enterprises in particular, asset management tools are clearly the way of the future.
That being said, some small businesses may be able to get away with management by spreadsheet, provided they put in a fair amount of manual effort and have a solid understanding of policies.
One thing that I encourage is to talk to your deployment team and planning partner to determine the most appropriate toolset for the scope of the project. Another thing to keep in mind is that a lot of solution providers host these tools for their clients, so companies need not always purchase them in-house.
Many thanks to Bruce for his time. If you enjoyed the information here, read Bruce’s Client Computing Blog or check out his book, Closed Loop Lifecycle Planning.
Ira Grossman, VP, Personal Systems Group, has more than 15 years of technology project management experience and is an expert in lifecycle management and mobile device management for the enterprise, including the iPad. Connect with Ira on LinkedIn.
In MCPc’s small but mighty Marketing department, we spend a great deal of time planning, meeting, executing and following up on customer events. With such a diverse group of people to reach and build relationships with, I admit that it can be challenging to plan events that are both fun and informative. Everyone has a busy schedule, so we are always thinking about what is worth your valuable time.
This past Friday’s event made me realize just how valuable such events can be: truly worth all of the hard work that goes into them.
Our Marketing Coordinators and masterminds behind the event, Lauren and Erin, met me out at Hill'n Dale Club in Medina, Ohio where we had about 30 customers and prospective clients in attendance. We all enjoyed chicken and ribs that fell off the bone, redskin potatoes, corn on the cob, a great salad (or should I say I had a little greens with my ladle of ranch dressing) and deliciously chewy sugar cookies and brownies.
Getting Down to Business
Steve Libby, our Citrix Field Sales Manager, was prepared with his iPad and presented a concise overview of the Citrix story. He shared with us that today almost any mobile device can be converted into a business tool, speaking a universal language that allows it to communicate with other devices and corporate networks. The Citrix solution is truly amazing technology that is changing the way we all do business.
All Work and No Play…
When Steve’s presentation was over, it was time to shoot.
As I was walking to the shooting area, all I could think of was Jeff Goldblum in the movie Independence Day as he was explaining how he’d give the mother ship a computer virus and mumbled “this will buy you some time to do your stuff, take them out.” I started to think, “Wow, my job is diverse.”
The atmosphere was relaxed. We were clothed in jeans, boots, and sweatshirts, I in my Akron Zips gear and my team in their Kent State Golden Flashes apparel. (What is a Golden Flash anyway?). The interaction was light and very enjoyable. Steve Libby and our sales team had the opportunity to discuss virtualization with our clients while cheering (or chuckling) as others attempted to shatter the clays into tiny pieces.
In marketing, we don’t often get the opportunity to meet and get to know our customers in person, so this was a true pleasure, particularly for Lauren and Erin.
As our turn approached, nerves in check, I stood back and watched as Erin and Lauren gave it their best “shots.” The results were not exactly what they had hoped for, as that 12-gauge shotgun was just a little too overpowering. I too only nicked one clay in my ten opportunities. Our turn was completed and the next round of true marksmen stepped forward.
A short time later, Charles, a very patient and kind customer, encouraged us to give it another “shot.” He took us under his wing, brought out the 20-gauge shotgun and gave us personal guidance on how to execute this skill properly: fingers lined up, focus on the right eye, butt of the gun up against your chin, tuck it into your shoulder and follow the clay as it moves across the landscape. And wouldn’t you know it: Erin hit two clays and I hit three in a row. Lauren nicked one but as our instructor said her form was excellent.
An Unexpected Lesson
You know what I never expected to get out of this event? I learned a lot about my team. I knew they would seamlessly execute this event. But what I found out is that they are true competitors. They want to excel in everything they do and they didn’t want to give up. They are passionate and give everything 110%. Herein lies the true value of the MCPc Marketing Department, and really everyone at MCPc: What we do, we do to the hilt and take complete ownership. Our people are one of MCPc’s three main differentiators, alongside our process and technology. The pride in our work is second to none. This event gave my team the opportunity to demonstrate these tremendous attributes even if I was the only one who noticed.
Even better, as we were leaving, Erin was pumped up and said, “I feel so…so…liberated. I did it.” I was sincerely happy for her, and proud of her and Lauren — two awesome women with so much potential for their professional careers. I’m sure those of you in management positions know exactly what I’m talking about.
So, at the end of the day I realized that not only are these events an excellent way to get to know our customers better and a true catalyst for enhanced business relationships, they also offer opportunities for people to shine, grow and gain experiences that otherwise may never have occurred.
Have you had an experience in your career that surprised you with an unexpected lesson? Or if you attended on Friday, feel free to share what you enjoyed about this event. It was truly our pleasure.
A big thank you goes out to our sponsor Steve Libby with Citrix, and all of our customers who attended this event. We appreciate your partnership with MCPc, and look forward to more traditional and not-so-traditional business dealings in the future.
Anne Browning is Marketing Manager for MCPc and is responsible for the development and execution of corporate marketing strategies that enable us to better communicate with, reach and serve IT professionals. Connect with Anne on LinkedIn.
Stay Connected with MCPc: Subscribe to the blog; follow us on Twitter, Facebook or LinkedIn.
Pictures © J.DELL Photography
Virtualization, at the core, is not a new concept. In fact, mainframe computers have been around since the late 1950s and can be considered an early phase of virtualization as we know it today.
Though virtualization has dramatically evolved over the last 50-plus years, the driving need for this technology remains the same — IT professionals require controlled, secure and centralized management of their business-computing environments.
As technology has become an integral part of modern business operations, companies have incorporated significantly more hardware into their office environments to support it. However, as processing power increases, we see that the current state of IT is one in which physical units are taking up a disproportionate amount of space, time and money to support.
The Current State of IT
- Low infrastructure utilization. Servers are greatly underutilized, as are PCs, when looked at over long periods of time. Essentially, businesses are not getting optimal use from the hardware infrastructure in which they have invested.
- IT physical infrastructures get continually more expensive to support (heating and air conditioning costs, power demands, cost per square foot, etc.).
- IT management costs keep increasing due to demand for more senior personnel and higher levels of expertise.
- Many businesses operate with insufficient failover and disaster protection strategies. Even for those that do, testing these plans is rarely done outside of enterprise organizations due to the significant cost and time required to execute.
Simply put, there is so much complexity in business technology today that, on average, more than 70% of IT budgets are used simply to maintain the status quo — leaving less than 30% for innovation and competitive advantage. This makes it almost impossible for businesses to get ahead, or even stay current, with advances in technology!
*Source: IDC and VMware TAM Program
Virtualization and Cloud Computing
I mentioned earlier that virtualization really started with the mainframe. From there, we moved to PC/client-server environments, to the web and now on to cloud computing.
Cloud computing solutions give businesses the ability to grow on demand and use more storage capacity as needed, at low cost. Conversely, within our current physical environments, it is very hard to grow quickly and make changes efficiently, as storage costs must be incurred and hardware put in place in advance of need. In short, cloud computing offers unrivaled agility. But how do you get there?
Virtualization enhances a business’ ability to get to a cloud computing strategy by putting a more cloud-friendly environment in place.
Types of Virtualization
Server virtualization is both the most common type of virtualization employed by businesses today and the easiest to think about. With server virtualization, you move away from physical servers by installing software that runs on one machine and in turn supports many virtual servers.
Businesses today typically use only 5-15% of their server capacity. This is largely due to the exponential increase we’ve seen in processing power and memory over the past several years. That said, this increase in processing power is the key to making server virtualization a reality. Even as recently as two-to-three years ago, we were limited in how many virtual machines could run on one server. Today, dozens of virtual servers can run on one physical machine.
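That consolidation arithmetic can be put into a quick back-of-the-envelope estimate. This is only a sketch: the overcommit ratio, memory headroom and host sizes below are illustrative assumptions, not vendor sizing guidance.

```python
# Back-of-the-envelope estimate: how many virtual servers might fit on
# one physical host. All figures are illustrative assumptions.

def vms_per_host(host_cores, host_ram_gb, vm_vcpus, vm_ram_gb,
                 cpu_overcommit=4.0, ram_headroom=0.85):
    """Estimate VM density from CPU and memory limits.

    cpu_overcommit reflects that underutilized workloads can share
    cores; ram_headroom reserves memory for the hypervisor itself.
    """
    by_cpu = int(host_cores * cpu_overcommit / vm_vcpus)
    by_ram = int(host_ram_gb * ram_headroom / vm_ram_gb)
    return min(by_cpu, by_ram)  # the tighter constraint wins

# A hypothetical 16-core host with 128 GB of RAM, running small
# 2-vCPU / 4-GB virtual servers:
print(vms_per_host(host_cores=16, host_ram_gb=128,
                   vm_vcpus=2, vm_ram_gb=4))  # -> 27 (memory-bound)
```

Note that memory, not CPU, is the limiting factor in this example, which matches the low CPU utilization figures quoted above.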
With application virtualization, we get into software. Rather than installing user applications in the traditional sense, applications are installed on the server and then executed on end-user machines as if they were installed locally. The running application is fooled into thinking it has the original operating system and all its resources to itself, when in fact it runs in an isolated layer on top of a shared system.
In short, application virtualization removes applications from operating system dependency. Each application is isolated and a layer of abstraction is put in place through encapsulation, thus the application is given the mobility to transfer from one physical machine to another while keeping user preferences and data stored.
Though virtualizing applications does not change licensing requirements, it does make it much easier to keep track of licenses as they are all installed on one server rather than individually on all end-user machines. Upgrades and true-ups are more efficient because you can always see how many people use different applications across the organization.
Until recently, IT professionals avoided virtualizing tier-1 applications such as Exchange, Oracle, SAP, SharePoint and SQL, but today virtualization is generally recognized as a suitable solution for them and can even improve performance and stability.
Desktop virtualization is the ability to host and centrally manage virtual machines in the data center while still giving end users the full PC experience, even with thin client devices.
At its core, desktop virtualization streamlines management and deployment of the desktop. Business continuity is improved because applications are no longer housed on the desktop; rather, they sit on servers in your central computer room. Centralized control also results in increased security.
Though desktop virtualization can be a challenge for employees that require mobility, there are new solutions that essentially allow end users to “check out” applications to their machine, make applications local, then sync back to the host when the machine returns to the main office.
The Case for Virtualization
- Server consolidation and containment. With virtualization, you can decrease server sprawl, increase server containment and better control your server environment and associated costs.
- Development and testing optimization. Virtual servers provide development and test environments at low cost, since the need for new hardware is removed.
- Business continuity. This is a huge upside to virtualization solutions, as products from top manufacturers support high availability, real-time replication and failover capabilities for disaster recovery. In addition, virtualization is cost effective and reliable.
- Desktop manageability and security. With virtualization, you reduce business risk and improve security by reducing the potential for end-user machines to contract viruses.
Though the initial cost difference for a virtualized solution could be a 30-40% increase in capital cost for installation in comparison to traditional models, it’s really all about the long-term strategy.
As we’ve seen, virtualization solutions can combat current inefficiencies in IT by helping your business “do more with less,” by squeezing more useful life out of hardware and paying for only the storage and software you need. These solutions allow IT professionals to spend more time and budget on strategic initiatives rather than maintenance. In addition to time- and cost-savings, virtualization offers two priceless attributes: security and control — through centralized management.
For more details and visuals on the concepts covered here, see the presentation below or on Slideshare.
What virtualization solutions have your business implemented and what benefits have you seen?
Does your business have a virtualization plan in the works for 2011?
Dale Philips is Managing Director - Converged Network Group and is responsible for directing MCPc's technology focus in its Network Solutions, Data Center and Visual Collaboration practices. As programmer, IT manager, Director, CIO and now Managing Director, Dale's business experiences make him uniquely qualified to provide business-savvy technology solutions to MCPc and our customers. Connect with Dale on LinkedIn.
Are you a business leader in Northeast Ohio interested in learning more about virtualization, cloud computing and other advanced technologies for your organization? Join us for the Modern Technology Lessons Summer 2011 Roundtable Series. This three-session series includes Path to the Cloud (7/19), Mobile Device Explosion (8/10) and Intuitive Collaboration (9/29). All three events will take place at our future headquarters, 1801 Superior Ave. in downtown Cleveland. You can attend all three or any combination of sessions. Click here to learn more and register.
We’re talking about Tape? Really?
Yes, believe it or not we are talking about tape — tape drives, autoloaders and libraries. While many people have adopted spinning disk in one form or another for their primary backup needs, tape remains the best choice for long-term archival and is still an economical choice for smaller requirements.
For example, consider the portability of an LTO-5 tape cartridge: It can hold up to 3 TB of compressed data (1.5 TB native, perhaps more with software de-duplication) on a single cartridge measuring 4.1 x 4.0 x 0.8 inches, which you can probably fit in your pocket! Furthermore, LTO tape meets long-term regulatory compliance requirements and has a shelf life of 30 years. Also, a tape cartridge resting on a shelf uses no electricity and requires no cooling. In short, tape provides practical, inexpensive long-term storage.
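Those capacity figures lend themselves to a quick archive-sizing estimate. This is only a sketch; the dataset size and the 2:1 compression ratio below are assumptions for illustration, and real-world compression varies by data type.

```python
# Rough archive sizing for LTO-5: about 1.5 TB native per cartridge,
# roughly 3 TB at an assumed 2:1 compression ratio.

import math

def cartridges_needed(dataset_tb, native_tb=1.5, compression=2.0):
    """Whole cartridges required to hold a dataset of dataset_tb."""
    effective_tb = native_tb * compression
    return math.ceil(dataset_tb / effective_tb)

# Archiving a hypothetical 40 TB dataset at 2:1 compression:
print(cartridges_needed(40))  # 40 / 3.0 -> 14 cartridges
```

Even a pessimistic 1:1 ratio (no compression) would only double the count, which illustrates why tape stays economical for long-term archives.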
LTO-5 (Linear Tape Open) Ultrium
Even though we are firmly in the age of disk backup, there are still quite a few tape formats available: DAT/DDS, SLR, VXA, DLT/SDLT and LTO. However, it seems clear that LTO has the lead in longevity and market dominance. The roadmap for LTO is one reason; while other formats are nearing the end of their lifecycle because of engineering limitations, LTO still has further advancements planned. As you can see in the image below, while LTO-5 is the current model, the roadmap indicates improvements out to LTO-8.
The LTO-5 specification is 280 MB/s at 2:1 compression, which is 40 MB/s (roughly 17%) faster than LTO-4 and 120 MB/s faster than LTO-3. In fact, LTO-5 is so fast that the preferred connectivity is either SAS or Fibre-Channel, and traditional SCSI-equipped models are therefore few. An update over LTO-4, the LTO-5 Fibre-Channel interface supports the newer 8-Gb Fibre-Channel network specification.
Backup Server Support Capabilities
We often see companies invest in a new tape library and cable it up to a backup server that has worked well for the past five years only to be disappointed by the apparent lack of improved performance. Guess what? That new tape library is not the problem; the problem is the old backup server. This situation is analogous to installing a small lawnmower engine into an otherwise high-performance sports car and expecting stellar results.
To ensure compatibility with tape library upgrades, it is of paramount importance to consider the complete configuration of the server, including processor, memory and the I/O interfaces. The server must be able to pull the data from the sources and then feed the data to the tape device as fast as the tape device can accept it. Otherwise the tape drive will either lay down blank space on the tape cartridge or shoe-shine back and forth as it waits for the data stream to catch up. The tape actually moves past the heads at several miles per hour!
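A rough way to check for the shoe-shine scenario is to compare the drive's streaming rate against the slowest link feeding it. The source-side rates below are illustrative assumptions; the 140 MB/s native figure follows from the 280 MB/s-at-2:1-compression specification mentioned earlier.

```python
# Feasibility check: can the backup server stream data fast enough to
# keep an LTO-5 drive busy? Source-side rates are assumptions; measure
# your own environment.

def feed_rate_mb_s(network_gbit, disk_read_mb_s):
    """Effective feed rate is capped by the slowest link in the chain."""
    network_mb_s = network_gbit * 1000 / 8  # Gbit/s -> MB/s (approx.)
    return min(network_mb_s, disk_read_mb_s)

DRIVE_NATIVE_MB_S = 140  # LTO-5 native streaming rate

# A single Gigabit NIC tops out around 125 MB/s -- below the drive's
# native rate, so the drive would throttle or shoe-shine:
rate = feed_rate_mb_s(network_gbit=1, disk_read_mb_s=400)
print(rate, rate >= DRIVE_NATIVE_MB_S)  # -> 125.0 False
```

In this sketch, even a fast disk subsystem cannot help when a single Gigabit link is the bottleneck, which is exactly the "old backup server" problem described above.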
Backup Server Recommendations for Tape Library Upgrades
So, how can you ensure a successful tape library upgrade? First, your backup server should be a dedicated, physical machine — not a virtual server. This is because it will have a physical interface card (SAS, Fibre-Channel or SCSI) that is connected directly to the tape drive, autoloader or library.
However, if you have a SAN (Storage Area Network) and your tape library is cabled directly to your SAN switch, your backup server might be able to run as a virtual machine.
Assuming that your backup server is a physical machine, start with any guidelines or recommendations provided by your backup software vendor. After that, use the following advice:
- Your backup server should be a modern, dedicated 64-bit machine unencumbered with other tasks or applications.
- A single, 64-bit quad-core processor should be sufficient. Alternatively, use a dual-processor capable server, then monitor the server during use to determine if it is processor bound and install the second processor if required.
- Install 16 GB of memory as a minimum. Memory is relatively inexpensive these days and this is not the area in which to be conservative.
- If your backup software supports it, use the 64-bit version of the operating system.
- Separate regular LAN communications traffic from your backup traffic. You can do this by installing extra Gigabit LAN adapter cards in all of the servers that are to be backed up, as well as in your dedicated backup server. Then, use a separate Gigabit Ethernet switch or a VLAN to create a dedicated backup network, segregating your traffic. This is illustrated in the following diagram.
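On a Linux backup server, the VLAN approach from the last bullet might look like the following iproute2 sketch. The interface name, VLAN ID and addressing here are hypothetical and would need to match your environment.

```shell
# Tag backup traffic on a second NIC (eth1) with a dedicated VLAN (100)
ip link add link eth1 name eth1.100 type vlan id 100

# Address the backup network separately from the production LAN
ip addr add 192.168.100.10/24 dev eth1.100
ip link set eth1.100 up
```

The servers being backed up would get a matching tagged interface, so backup streams never compete with regular LAN traffic.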
Have you recently updated your tape library? What process did you go through to ensure a successful upgrade?
Perry Szarka is the Data Center Strategic Business Unit leader.
Perry Szarka is a Solution Consultant at MCPc with expertise in data storage and network infrastructure. He works closely with clients to understand their business objectives and discover solutions to help them achieve their goals.
Image credit: Ultrium
Below, we share articles from last month that provide insight on the following IT topics: the changing landscape of telecommunications, benefits and best practices for Windows 7, why virtualization works (how to make it work for your business), the security of cloud computing and how to clean up your WiFi network.
Telecommunications & Telepresence
Things are changing in the world of business telecommunications, and September continued to shed light on how businesses can integrate new technologies in videoconferencing, telepresence, VoIP and mobile to improve collaboration.
In Getting Videoconferencing Right, http://www.networkworld.com/columnists/2010/092110-getting-videoconferencing-right.html?page=1 Brian Kopf http://www.linkedin.com/pub/brian-kopf/2/343/655 shares several tips for SMBs who are interested in videoconferencing, but are unsure how to start the transition. Kopf covers everything from choosing the right system for your business needs to common pitfalls to avoid in deployment.
Our favorite tip: “Don't cut corners when it comes to infrastructure. When organizations fail to ensure that the network can support the new system, the result can be poor quality and an overall sub-par user experience.”
When thinking about videoconferencing and telecom, we can’t forget about the potential market disruption coming from unexpected sources. Dave Michels http://www.linkedin.com/in/davemichels offered insight on The Other Voice Channels, http://www.ucstrategies.com/unified-communications-strategies-views/the-other-voice-channels.aspx including: managed service providers, Internet dealers and retailers, web services and more.
According to Michels: “These disparate channels don’t speak the same language, but can all fill the role of the primary voice partner. It creates multiple opportunities and challenges for voice vendors, to straddle the various channels and attempt to meet the needs of very different go-to-market approaches.”
Keep in mind that these are not things to be aware of in the future. Changes are happening now. Additional articles on the topic last month, which have potential to dramatically affect the industry and your network, include:
Christina Tynan-Wood http://www.linkedin.com/pub/christina-tynan-wood/1/29/992 reported that Windows 7 Lifts PC Satisfaction Rates, http://www.infoworld.com/d/adventures-in-it/windows-7-lifts-pc-satisfaction-rates-238 as found in the American Customer Satisfaction Index http://www.theacsi.org/index.php annual manufacturing and durable goods report.
Since Windows 7 is clearly gaining momentum and end-user praise, here are a few additional articles to help you effectively install, secure and work with the new software.
After VMWorld, Alan Radding http://www.linkedin.com/in/independentassessment reported that Virtualization is the Future of Corporate IT. http://bigfatfinanceblog.com/2010/09/08/virtualization-the-future-of-corporate-it/ Basing the article on a virtualization survey conducted by CommVault http://www.commvault.com/ of more than 10,000 of its customers, Radding concludes that, “for the CFO, the question is not whether the organization should adopt virtualization, but how extensively it should virtualize and how fast.”
So, if this is the case, and you’re interested in virtualization for your business, here are a few additional articles to help guide your way:
As you know, one key benefit to virtualization is that it can set your organization up to employ cloud computing. However, there is concern among many IT professionals about the security of cloud solutions.
You can rest assured. Scott Campbell, http://www.linkedin.com/pub/scott-campbell/1/70a/918 citing Gartner’s Neil MacDonald http://www.linkedin.com/pub/neil-macdonald/2/a86/234 reported that in fact, Cloud Security Is Better Than What You Have Today. http://www.crn.com/news/cloud/227500386/gartner-cloud-security-is-better-than-what-you-have-today.htm?pgno=1
Why? According to MacDonald, cloud solutions are the “first generation of IT to bake in security, rather than treat it as an afterthought.”
In addition, the simple fact that the standard business workforce is increasingly mobile makes a strong case for storing data on the cloud: “We're tearing down the walls of our enterprise. The mobilization of the workforce demands anytime access. With consumerization you can [reach] corporate assets from any type of device. Users are demanding this… If you extend that mindset, if the workload can move from this data center to that data center, heck we might as well just move it to Amazon… Increasingly, we do not control elements of our IT infrastructure. Cloud is just one element of that, and yet we're fighting it."
Another subject relevant to the mobile workforce, of course, is WiFi. Cisco’s Chris Kozup http://www.linkedin.com/pub/chris-kozup/0/162/551 spoke with Tom Kaneshige http://www.linkedin.com/pub/tom-kaneshige/0/4a8/212 about 6 Ways to Improve Your WiFi Network http://www.cio.com/article/617585/6_Ways_to_Improve_Your_WiFi_Network for your mobile-in-office users.
- Fill coverage holes.
- Ensure that access points support 802.11a/g.
- Disable and replace outdated technology.
- Run 5 GHz frequency.
- Ensure that mobile devices are secure.
- Test for, and clean up, RF interference.
For details on these tips, see the full article. http://www.cio.com/article/617585/6_Ways_to_Improve_Your_WiFi_Network
What Do You Think?
What articles, blog posts, videos or podcasts did you find interesting last month? Post a comment, and we’ll be sure to keep an eye on those sources for future wrap-up articles like this one.
This post is an MCPc blogging team collaboration.
Stay Connected with MCPc: Subscribe to the blog; follow us on Twitter, Facebook or LinkedIn.
Two weeks ago, we hosted an event with Polycom at our corporate headquarters in Cleveland called Transformation to Visual Collaboration. A lot of great information was shared regarding the business drivers and benefits of telepresence solutions. In this blog post, we’re sharing the Cliff’s Notes version — key details and takeaways from the event.
What is Telepresence?
First and foremost, it may help to answer the question: What exactly are we referring to when talking about telepresence?
Videoconferencing, which is defined by Wikipedia as “a set of interactive telecommunication technologies which allow two or more locations to interact via two-way video and audio transmissions simultaneously,” has been around in business IT for several years. Essentially, videoconferencing is simply the ability to include live video as an element of phone calls.
Telepresence, on the other hand, takes videoconferencing a bit further by integrating new technologies for a higher quality experience. With telepresence solutions, businesses can connect their employees, customers, vendors, prospects and other audiences via high-definition, high-speed connections. At its best, this can cause users to forget that a screen exists between themselves and the people with whom they are communicating. This highest level is sometimes referred to as immersive telepresence or visual collaboration.
Due to advances in technology and IT infrastructure, immersive telepresence is becoming a reality for many businesses. Though enterprise adoption is leading the pack, these solutions are becoming more affordable — and valuable — for smaller organizations as well.
A key enabler of telepresence is the convergence of voice, video and data networks within the IT environment. As the converged Transmission Control Protocol/Internet Protocol (TCP/IP) network has largely eliminated the need for separate audio-video, voice and data infrastructures, organizations have taken advantage of this framework to reduce staff overhead, network management and operating costs. It is this simplified infrastructure, with inherent priorities in place for different packet types, that lends itself well to supporting immersive telepresence solutions.
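The idea of inherent priorities for different packet types can be illustrated with a toy model. Real converged networks implement this with QoS markings such as DSCP; the traffic classes and numeric priorities below are assumptions for the sketch, not actual network behavior.

```python
# Lower value = served first. Voice is most latency-sensitive, so it
# jumps ahead of bulk data on a converged link (illustrative values).
PRIORITY = {"voice": 0, "video": 1, "data": 2}

def transmit_order(packets):
    """Return queued packets in service order. Python's sort is stable,
    so arrival order is preserved within each traffic class."""
    return sorted(packets, key=lambda kind: PRIORITY[kind])

order = transmit_order(["data", "voice", "data", "video", "voice"])
# voice packets are serviced first, then video, then bulk data
```

It is exactly this kind of preferential treatment that lets high-definition video share a wire with file transfers without stuttering.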
In addition to this core infrastructure, Dan Lejeune, Regional Channel Member, Cincinnati area, Polycom, shared some additional advances that not only make telepresence possible, but make it a strong solution for businesses looking to advance their collaboration abilities and improve operational efficiency.
- Growth of unified communications portfolios and infrastructure
- Improved networks and availability of bandwidth
- Accessibility of telepresence within enterprise, SMB and service provider networks
- Mainstream acceptance of visual communications
- Advancements in high-definition voice and video technology
So, not only is the infrastructure in place to support telepresence solutions, but cultural and global expectations are further pushing the desire for these services. In essence, people’s comfort with using video to communicate — largely thanks to consumer software such as Skype — has made video phone calls a common activity for many, and thus a growing expectation in business settings.
Thanks to this combination of technology support and end-user acceptance, we are at a point when telepresence solutions can truly be widely adopted by businesses to save time and costs in critical areas. As found in the July 2009 Wainhouse Research report, Benchmarking the Benefits of Videoconferencing Deployments, enterprises and SMBs alike see similar benefits from implementing videoconferencing solutions in six key areas:
- Travel: 30% cost savings
- Time-to-Market: 24% reduction
- End-User Downtime: 25-27% reduction
- Training: 22-25% cost savings
- Recruitment: 15-19% reduction in time spent
- Sales-Related Costs: 24-26% reduction
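To see what those percentages mean in dollars, here is a back-of-the-envelope projection. The annual baseline costs are hypothetical figures invented for the sketch; the rates are midpoints of the Wainhouse ranges quoted above.

```python
# area -> (hypothetical annual baseline cost in USD, savings rate)
baselines = {
    "travel":      (500_000, 0.30),   # 30% cost savings
    "training":    (200_000, 0.235),  # midpoint of 22-25%
    "recruitment": (150_000, 0.17),   # midpoint of 15-19%
    "sales costs": (400_000, 0.25),   # midpoint of 24-26%
}

def projected_savings(baselines):
    """Apply each savings rate to its baseline and round to whole dollars."""
    return {area: round(cost * rate) for area, (cost, rate) in baselines.items()}

savings = projected_savings(baselines)
# travel alone would save $150,000 a year under these assumptions
```

Even with conservative baselines, the travel line item alone often covers a substantial share of a telepresence deployment.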
Often, the first area identified — reducing travel costs — is the impetus behind organizations installing telepresence solutions. However, as shared by Brian Gilman, Global Director, Enterprise Solutions, Polycom: when asked how they want to further deploy, many IT directors will admit that though they’ve successfully reduced travel costs, they are only using their current system at 20 percent capacity and don’t see any additional uses.
Interesting note: Gilman’s entire segment of the day was delivered via Polycom’s HDX 4000, from his home office using a standard cable modem.
Optimizing Your Business with Telepresence
This is, as the Wainhouse research shows, a somewhat shortsighted approach. According to Gilman, there are several stages in video deployment. As organizations move through the stages, additional benefits are realized until the true goal is reached: a collaboration culture at your organization, no matter the geographical location of employees and audiences.
Image source: Polycom
Gilman shared several hypothetical situations to illustrate the benefits that organizations can realize through telepresence outside of travel cost savings. Consider how much time and money your organization could save in the following scenarios:
- Your organization allows employees to work from home and communicate via videoconference, thus reducing office space needs by 10 percent.
- A 3-day training program for 100 employees becomes a virtual event, rather than your organization paying for hotels, flights and daily cost-of-living for those individuals.
- Top candidate job interviews are completed via video and recorded for review by executives.
- Development time for a new product is reduced 10% by holding high-level feature/functionality review meetings over a telepresence system, rather than mailing samples back-and-forth or sending R&D personnel to manufacturing facilities.
- HR and sales trainings are recorded and saved on an employee intranet and viewed by employees within an allotted timeframe. Instructors are able to see who has completed trainings via a content management system.
So, at the end of the day, we learned that in addition to time- and cost-savings, telepresence solutions can provide organizations with a competitive edge in multiple areas, including:
- Ability to gain a competitive advantage over competitors through innovation and time to market
- Improvements in employee morale due to work/life balance options, which can reduce turnover and increase productivity
- Improved external communications, resulting in customer/partner satisfaction and long-distance trust building
- Social consciousness — environmental benefits are achieved through reduced travel, both long-distance and from work-at-home employees
What telepresence solutions are you considering for your organization? What benefits do you hope to realize?
Are you a business leader in Northeast Ohio interested in learning more about collaboration, cloud computing and other advanced technologies for your organization? Join us for the Modern Technology Lessons Summer 2011 Roundtable Series. This three-session series includes Path to the Cloud (7/19), Mobile Device Explosion (8/10) and Intuitive Collaboration (9/29). All three events will take place at our future headquarters, 1801 Superior Ave. in downtown Cleveland. You can attend all three or any combination of sessions. Click here to learn more and register.
Although application virtualization has been around for many years, desktop virtualization is still a relatively new technology. Due to its unproven nature, IT professionals have been skeptical of desktop virtualization’s ability to perform in enterprise business settings.
However, now that an enterprise-ready stamp of approval has been given to the Citrix and VMware desktop virtualization offerings by Burton Group, a critical eye should be cast toward these technologies and what they mean for your end-user computing environment.
Running Windows with Mac OS X in a virtualized environment is one of the many options coming in the future.
Single-Mode vs. Mixed-Mode Computing
In the past, IT organizations have primarily worked in single-mode, or distributed, computing environments, with exceptions when necessary. Those exceptions have primarily been large ERP or financial/HR systems presented to users via a Terminal Services solution. Distributed computing has been the standard operating procedure for companies ever since the PC was introduced into the business world. However, we have reached the breaking point of the distributed computing model.
In 2008 Forrester Research released a paper called Demystifying Client Virtualization, which declared that “…the traditional desktop model — inherently insecure, inflexible and hard to manage — is a thing of the past. Organizations will instead identify their users by criteria like task-based, knowledge, or power users and will deliver dynamic desktops accordingly… it’s the future of the corporate PC.”
Though it has taken a couple of years since the report was published, we are starting to see more organizations identify their end users in this dynamic fashion and use virtualization as an affordable and secure way to deploy appropriate technologies. However, to deliver images for these groups of individuals in the most efficient way possible, IT professionals must look at mixed-mode computing as a deployment option.
Mixed-mode computing, at its simplest, can be defined as having multiple ways to deliver the technology to end users that they need to do their jobs.
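That definition can be sketched as a simple lookup from user category to delivery method. The categories follow the Forrester task/knowledge/power split quoted above, but the pairings themselves are illustrative assumptions for the sketch, not a prescribed architecture.

```python
# Illustrative mixed-mode delivery plan: each user category gets the
# delivery method that fits it (the mapping is a hypothetical example).
DELIVERY = {
    "task":      "hosted shared desktop (terminal services)",
    "knowledge": "virtual desktop (VDI)",
    "power":     "physical laptop with locally installed applications",
}

def delivery_for(user_type):
    """Look up the delivery mode for a user category."""
    if user_type not in DELIVERY:
        raise ValueError(f"unknown user type: {user_type!r}")
    return DELIVERY[user_type]
```

The point of mixed-mode computing is that all three rows of that table coexist in one environment, rather than one delivery model being forced on every user.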
The Reality of Mixed-Mode Computing
As IT professionals start to look at desktop virtualization as a way to “deliver dynamic desktops accordingly” to their end users, they will be faced with the challenge of integrating and supporting a virtualized environment within the distributed computing model that already exists. This challenge is the reality of mixed-mode computing.
Unless an organization is planning to replace their entire distributed computing model with a centralized computing model, such as desktop virtualization, two environments will have to be supported. This will require a strategy that takes into account all of the factors necessary to support both a centralized and a distributed computing environment.
IT professionals will need to find ways to integrate new technologies with the old and implement solutions that will complement what their organizations have in place today. For example, desktop virtualization would be a good fit for a salesperson who needs to demonstrate a complicated practice management solution while maintaining a controlled environment separate from their daily work. However, the CEO of that same organization might be able to get by with applications delivered to an iPad. Those technologies must also live in harmony with the service technician who requires that his laptop be attached to a proprietary piece of hardware that a virtual desktop or application cannot talk to. All of these technologies can and must co-exist if an organization is going to deliver users what they need when they need it.
The static ways of single-mode computing are on the way out and a more dynamic way of delivering resources to your end users is approaching. Are you ready for it?
Jason Dell is a Converged Network Solution Consultant at MCPc, and is responsible for developing and programming custom solutions for clients. His expertise includes network security and security for mobile devices in the enterprise. Connect with Jason on LinkedIn.