The other day, I was shopping with my wife and 12-year-old son at a discount shoe store. My wife was excited that she had found a great buy on a pair of shoes. I realized that this was a perfect opportunity to impart some of my great wisdom, earned over 20 years of marriage, to my son.
I pulled him aside and began to explain, “That $19.95 pair of shoes will end up costing me $150. You see, when we get home, Mommy will realize that not one of the dresses she owns matches those shoes, and then none of her purses will match the new dress. So, she’ll have to buy a new dress and a new purse, right? Yep, $150, if I’m lucky!”
This same concept applies to business. Many times, we buy into what seems like a great bargain, only to find out later that we need to spend more money to get where we wanted to be. This is especially true when we talk about network redundancy.
A Hypothetical Yet Common Scenario
Imagine this: You have a tight budget, but you’ve really made the most of it. Your servers have dual power supplies, dual processors and dual NICs; the hot-swappable drives are mirrored; and you’ve even virtualized your environment. You have that confident “bring it on” attitude, as even your backups are rock solid, de-duplicated and automated — no tapes for you!
Then, one day, some public utility worker in a backhoe takes out your building’s entire network access. Bang! Data and Internet circuits are all down. Branch offices and traveling employees cannot connect; everyone is calling the helpdesk in a panic. You’re standing in the datacenter when the CFO comes through the door and says, “Hey, I thought you said we had redundancy! Explain to me again why we spent all that money?”
Sometimes, when it comes to redundancy, the connection to the outside world is what gets overlooked. Maybe you have an MPLS network, or a large Internet pipe, or a point-to-point circuit connecting a branch office. Either way, it is likely that these circuits all ride in on the same access to your demarc (demarcation point) or datacenter, as it is difficult and costly to run alternate routes or circuits into most facilities.
However, there are a couple of options — which have become more popular in recent years — to consider that won’t break the bank.
Wireless Internet and Wireless MPLS
With the growth and popularity of wireless data networks (3G and now 4G), companies have an attractive option for data-circuit redundancy. Granted, these wireless circuits may not deliver robust 10 Mbps or higher speeds. However, in a pinch, they can provide a nice solution for continuity.
Wireless solutions typically deliver 1.5 Mbps to 2.0 Mbps speeds (4G would be higher), and they are not reliant upon any local access or land-based local loops like traditional circuits. The other nice benefit is the monthly cost, typically less than $100/month. There may be some upfront cost for equipment or provisioning; however, those costs are nominal in comparison and well worth the peace of mind.
These devices commonly connect to the public Internet. However, recently released solutions include wireless MPLS connectivity, which presents some interesting options in regard to security.
Devices come in many variations, such as the Cisco 881G router and the Franklin U301 USB modem, providing flexible connection options from Ethernet to USB.
An Example of Where Wireless MPLS Could Be Considered
Let’s say you have ten locations. Of those, your HQ site and seven other sites have enough employees and bandwidth demand to justify a traditional MPLS circuit. However, the two remaining sites only have three employees each, and you cannot justify the $500-$600 monthly cost of an MPLS node. This is an ideal scenario for wireless MPLS.
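To put rough numbers on it, here is a back-of-envelope comparison for those two small sites, using the figures quoted above. These are illustrative assumptions, and actual carrier pricing will vary.

```python
# Back-of-envelope savings for the two small sites, using the figures
# quoted above ($500-$600/month for a traditional MPLS node vs. roughly
# $100/month for wireless). Illustrative only; actual pricing varies.

sites = 2
mpls_monthly = 550       # midpoint of the $500-$600 range
wireless_monthly = 100   # rough ceiling for a wireless MPLS node

annual_savings = sites * (mpls_monthly - wireless_monthly) * 12
print(f"Approximate annual savings: ${annual_savings:,}")  # -> $10,800
```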
Not only will the monthly cost be low, the fact that the remotes are only connecting to the MPLS — not the Internet — means you have enhanced security via a private IP connection.
Another possible benefit is the mobility of the wireless unit. If the remote sites tend to move frequently, you can relocate the wireless device to another site and be up and running quickly. However, remember that your base MPLS network needs to be on the same carrier as the wireless MPLS device.
(Image: Sprint’s solution for wireless WAN redundancy.)
To Sum it Up
Wireless may be a consideration for your company when you are developing and designing your wide area network (WAN); however, there are some potential drawbacks, including:
- Wireless is not everywhere yet, so be sure to qualify the location to see if it is available and at what speeds (bandwidth).
- Bandwidth is lower than on typical terrestrial circuits, but it may be sufficient based upon your needs.
- The bandwidth is also not usually symmetrical, so you may get 1.5 Mbps down and only 512 Kbps up, as an example.
- There is no Quality of Service (QoS) on the wireless link, even with wireless MPLS. The wireless leg between your demarc and the carrier’s Point of Presence (POP) is only “best effort,” just like the public Internet. That may be an issue if you plan on running VoIP or video over the connection.
What are your thoughts on using wireless for WAN connectivity?
Frank Marro served as Regional Vice President responsible for sales management in Cincinnati, Dayton and Columbus, Ohio. He also directed MCPc’s national carrier service program, which provides solutions for clients looking for voice, video and data circuits for WAN connectivity.
You’ve probably heard the old real estate axiom that says location is the first, second and third most important consideration when evaluating property. In many ways, this holds true for data centers, too. The location of your data center is the most important decision you’ll make regarding data security and application uptime.
Ten years ago, if you asked executives whether they thought email was a mission-critical business application, most would have shrugged their shoulders or stated that they could get by for a few hours, or even days, without it. Today, with email and other applications becoming more and more integrated into employee tasks, the answer would be quite different.
As application uptime increasingly becomes a necessity for daily business operations, a critical question emerges for IT managers: Is your office facility truly the best location for your servers, storage devices, and core network connectivity?
Data Center Colocation
Colocation centers — facilities that house network, server and storage infrastructure for multiple organizations — offer an affordable, secure and dependable alternative for companies whose office environments are not well suited to housing a data center onsite.
The following are nine key considerations to help you evaluate whether moving your critical data center assets into a colocation facility is right for your organization.
- Is your facility equipped with a generator?
- You may have protected power configured for your server, storage and network components, but what about the cooling system? If your cooling system goes offline and airflow ceases, how quickly will the temperature become too hot for your systems to continue operating? You may be surprised to discover that it is typically only a few minutes (a rough estimate follows this list)!
- Is your facility equipped with redundant power feeds (from two different electrical providers)?
- Is your facility in a flood zone?
- Is your computer room or data center hardened for earthquake events?
- How many roads lead to your facility? What would happen if those roads were closed?
- Does your facility have provisions for WAN connectivity from more than one provider on (truly) separate infrastructures?
- Do you have the capability to quickly diagnose and remedy copper and optical cabling faults? What about the servers, storage array and network gear? Do you have proactive monitoring and management in place today? Do you have an NOC (Network Operations Center)?
- If you have some redundant or fail-over systems in place today, how often are they checked to determine if they will work as expected when the time comes?
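On the cooling point above, a quick back-of-envelope calculation shows why the window is so short. The figures below (a 10 kW equipment load and roughly 60 cubic meters of room air) are assumptions for illustration; real rooms have walls and hardware that absorb some heat, but the order of magnitude holds.

```python
# Rough heat-rise estimate for a server room with failed cooling.
# Assumed figures: 10 kW of IT load, ~60 m^3 of air, no airflow.
# Illustrative physics only; thermal mass in walls and equipment
# will slow the rise somewhat.

it_load_watts = 10_000   # heat output of the equipment, in watts
air_volume_m3 = 60.0     # room air volume, cubic meters (assumed)
air_density = 1.2        # kg per cubic meter
cp_air = 1005            # specific heat of air, J/(kg*K)

rise_per_minute = it_load_watts / (air_volume_m3 * air_density * cp_air) * 60
print(f"~{rise_per_minute:.0f} degrees C of rise per minute")  # ~8 C/min
```

At roughly 8 degrees Celsius per minute, a comfortable 22-degree room can pass typical equipment operating limits in well under five minutes.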
If any of these questions have you reconsidering the strength, security and reliability of your current data center location, a colocation center may be a viable alternative for your organization.
Keep in mind that relocating your hardware, applications and data will likely affect your management of them moving forward. You will almost certainly want to remotely handle as much of the administration as possible, which can be achieved with network monitoring solutions. In addition, the colocation facility you choose may offer administrative services that can help ease the burden of internal operational tasks.
Does your business use a colocation center? Are you considering one? Please share your experiences and concerns in the comments below.
Perry Szarka is a Solution Consultant at MCPc with expertise in data storage and network infrastructure. He works closely with clients to understand their business objectives and discover solutions to help them achieve their goals.
You’re at the gas station. You want a Coke. You don’t have any cash.
You could, of course, pull out a credit card, but that seems like such a hassle for a small transaction. What if you could swipe your iPhone and the $0.75 came straight out of your checking account? And, what if you also got a coupon on the screen for $0.25 off a package of Slim Jims to have with that Coke? And, what if by making that purchase, you earned points because you’re in the BP loyalty program?
Expect situations like this to become a reality soon.
When Your Mobile Device Replaces Your Wallet
iPhones and iPads have changed the way we live. We have instant access to virtually all of the information in the world. We have thousands of low-cost apps for productivity, fun and education. But the evolution of the Apple devices certainly isn’t done.
In the iPhone 5 and the iPad 2 — both expected to be released later this year — a technology called Near Field Communication (NFC) is expected to make its debut. NFC will allow users to make purchases simply by swiping their iPhone or iPad near a terminal.
NFC operates at a frequency of 13.56 MHz and can provide data transfers at up to 848 kbit/s. It’s both a “read” and a “write” technology, so it can transfer information in both directions. Because it requires close physical proximity, it provides a more secure transaction than WiFi or Bluetooth.
This isn’t the first time we’ve seen NFC on mobile devices: Google has installed the technology on some of its Android phones, but there’s never been a large enough base of customers to encourage merchants to invest in the readers. iPhone and iPad functionality would change that overnight.
How NFC Works
NFC transactions via iPhone may not sound very different from credit card transactions, but they are. The biggest change? Now your transaction will be processed through iTunes. The implications of this are enormous.
iTunes can bill your credit card, or more attractively, remove funds from your checking account. Apple is a master of making the transaction process pleasant for the consumer, and we can anticipate that they will follow suit on NFC. In addition, merchants may be able to enroll you in their loyalty program and let iTunes manage the details.
NFC may also enhance the “Genius” experience, the function at the Apple App Store and iTunes that makes recommendations based on your buying history. Now, through the combination of NFC, WiFi and 4G, you’ll likely be able to activate push notifications on your iPhone or iPad that alert you whenever you’re near merchandise you’ve shown a proclivity for. Love Vera Wang scarves? You’ll know if the department store you’re in has them.
The Apple-NFC Effect
Americans spend $6.2 trillion every year on goods and services, much of it paid with credit cards. Currently, purchases on iTunes are made with credit/debit cards, and Apple pays a fee on every transaction. If, by using NFC, Apple can convince you to use a method that costs them less — like taking the money straight out of your checking account — it will save (and make) a lot of money. This is the model that’s proven profitable for PayPal.
The obvious worry is that just by coming into the proximity of an NFC reader, you could be charged for something you didn’t buy. However, due to standard electronic purchasing processes, this shouldn’t happen.
For example, say you wanted to use your iPhone to purchase train tickets. You would approach a ticket terminal, the NFC connection would be established and a list of destination options would show up on your iPhone screen. You would select your destination, your departure time and possibly your class of travel. The iPhone would produce a confirmation screen for you to complete your purchase. The ticket would then show up on your iPhone and you could proceed to board. At any point before final confirmation, you could cancel the transaction without being charged.
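The key safeguard in that flow is that nothing is charged until the final, explicit confirmation. The sketch below models the described flow as a simple sequence of states; it is purely illustrative (there is no public Apple NFC API to code against), and every name in it is made up.

```python
# Illustrative sketch of the ticket-purchase flow described above,
# modeled as explicit states. Not a real NFC API -- the point is that
# no charge can occur before the final confirmation step.

from enum import Enum, auto

class Step(Enum):
    CONNECTED = auto()      # NFC link established at the terminal
    OPTIONS_SHOWN = auto()  # destinations and times shown on the phone
    CHARGED = auto()        # funds captured, ticket issued
    CANCELLED = auto()      # user backed out -- no charge

def purchase(selection, confirmed):
    """Walk the flow; the account is debited only on final confirmation."""
    trail = [Step.CONNECTED, Step.OPTIONS_SHOWN]  # no money moves yet
    if selection is None or not confirmed:
        trail.append(Step.CANCELLED)              # back out at any point
    else:
        trail.append(Step.CHARGED)                # the only charging step
    return trail[-1]

print(purchase("Cleveland, 5:15pm", confirmed=True))   # Step.CHARGED
print(purchase("Cleveland, 5:15pm", confirmed=False))  # Step.CANCELLED
```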
The remote control changed TV. Think of NFC as remote control for your wallet: the functionality remains the same but the interface changes dramatically.
Do you plan to use NFC-enabled devices to make purchases when they roll out? Will you replace your wallet with your phone?
EDIT: Since this post was published, Apple announced that the iPhone 5 will not be NFC-enabled.
Bill Cannon is Vice President of Business Development at MCPc, and an IT industry veteran with expertise in networking and telecommunications technology. Connect with Bill on LinkedIn.
When determining how to best execute an Imaging & Printing Assessment, one of the first decisions you’ll have to make is whether to insource or outsource the assessment. There are merits to both strategies, and also to “splitting the difference” and collaborating with a partner organization that specializes in assessments. Before closely evaluating the options, let’s begin with the end result in mind.
The Final Deliverable
The final deliverable of an Imaging & Printing Assessment can range from a simple spreadsheet to a comprehensive binder. Ultimately, you want the final deliverable to be an accurate accounting of your true imaging and printing costs. It must be easy to understand, and it must be action-oriented, meaning that your organization can take action as a result of the assessment findings.
In order to build your final deliverable, you will go through two major assessment phases: Current-State and Future-State. Both phases require a combination of raw data, time, know-how and software-based assessment tools. The Current-State Analysis involves evaluating all of your existing costs, while the Future-State Analysis involves building an action-oriented plan to reduce cost while maintaining or gaining functionality.
During the Current-State Analysis, you want to evaluate:
- Copier contracts
- Monthly service invoices
- Toner purchase history
- Device usage patterns
- Device physical locations
- Various operations or workflow processes that affect end user productivity
During Future-State Analysis, you build a plan that includes:
- Device lifecycle refresh
- How to implement the appropriate corporate governance
- How to become more green
- Buy vs. lease considerations
- Strategies to reduce paper waste/pages printed
- A plan for ongoing device management
Reasons to Insource
The most significant reason to complete an assessment in-house is keeping your raw data confidential, especially if your organization has policies that forbid sharing the raw data necessary to perform your Current-State Analysis.
Another big reason to insource is timing, as outsourcing to a partner organization may result in a lengthier assessment than you require. By performing the assessment yourself, you can focus on those areas that are most critical, while some partner organizations or consultants may focus on larger, more complex assessments that take more time to complete.
To organize the data and complete the analysis required for the assessment, most calculations can be done in a spreadsheet, meaning that with some research and time you can create the tools yourself.
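As a flavor of the current-state math involved, here is a minimal sketch of a fleet cost roll-up. Every device name and figure below is a made-up example, not a benchmark.

```python
# Minimal current-state cost roll-up, the kind of math a spreadsheet
# (or this script) can do. All device names and figures are made-up
# examples for illustration.

devices = [
    # (name, monthly_pages, toner_cost_per_page, monthly_lease)
    ("Copier, 3rd floor", 12000, 0.012, 210.00),
    ("Laser, accounting",  3500, 0.031,   0.00),  # owned outright
    ("Inkjet, reception",   400, 0.089,   0.00),
]

fleet_total = 0.0
for name, pages, cost_per_page, lease in devices:
    monthly = pages * cost_per_page + lease
    fleet_total += monthly
    print(f"{name:20} ${monthly:8.2f}/month")

print(f"{'Fleet total':20} ${fleet_total:8.2f}/month "
      f"(${fleet_total * 12:,.2f}/year)")
```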
Reasons to Outsource
If you choose to outsource the assessment, you need to be comfortable sharing information, as missing or estimated data will cause the assessment to lose accuracy and validity. Therefore, choose a partner organization you can trust, and have their representatives sign an NDA if needed.
The most significant reasons to outsource are tools, talent and time. Partner organizations or consultants that specialize in imaging and printing assessments have software tools specifically made for the task. Assessment software tools go way beyond the capabilities of spreadsheets. For example, many of these tools have pre-populated industry data for benchmarking, and the ability to upload floor diagrams to help create great visual references.
In addition, a consultant’s staff will likely include analysts who have performed assessments similar to yours. These individuals know the “gotchas,” and are in tune with intricate details that you may not know to consider.
Lastly, by outsourcing you don’t have to devote as much internal staff time to the project.
The Collaborative Approach
The strategy I have seen work best is collaborating. You keep the most sensitive data close at hand, but share big-picture information and not-so-sensitive data with your assessment partner. This ensures that the right kind of information is being evaluated, and evaluated accurately.
Sometimes, when assessments are insourced, key components can fall through the cracks simply because nobody has taken ownership of the project and made it a priority over other initiatives. Including an outside partner whose top priority is to get the assessment done accurately and efficiently, and pairing them with internal teams responsible for moving on action items, can help keep everyone on track.
Having stakeholders on the inside is beneficial from the provider’s perspective in that there is more ongoing communication throughout the assessment. This ensures that the project continues moving in a timely manner, information assessed is current and correct, and that the final deliverables are in line with expectations.
Lastly, when there are experts providing the consultation and insiders who have some ownership, it’s more likely that recommendations will be implemented after the assessment is complete.
Has your organization completed an Imaging & Printing Assessment? What approach did you take? What challenges did you face? I encourage your feedback and questions.
Jeffrey Goldstein is Senior Consultant at MCPc and is responsible for the delivery of hardcopy and value-added services within the Lifecycle Management Group. Connect with Jeff on LinkedIn.
It’s probably safe to say that your organization’s IT environment — networks, hardware, software and everything in between — is a large investment. In addition, its optimal performance every day is critical to both short-term operations and long-term success.
The IT Management Nightmare
However, with so many intricacies in IT infrastructure today, keeping a handle on the strength and performance of all assets on an ongoing basis can be a daunting task. In fact, doing so manually, on top of all other daily tasks IT professionals are responsible for, is likely impossible.
Because of this — and due to both technology and human errors — problems occur and systems fail. No matter how infrequent, issues from minor complications to complete outages can cause business disruption, losses in production and sales, and panic among the IT department and company executives.
A report by IBM Global Services found that in 2008, unplanned application outages cost an average of $2.8 million in revenue per hour.
IT Managed Services Benefits
Though it is impossible to completely remove the chance of outages or other errors, you can greatly reduce the risk by outsourcing the daily management of your IT environment through an IT managed services program. With managed services, your organization can efficiently monitor systems to identify performance changes, and put processes in place to:
- Automate IT environment management and oversight
- Identify and troubleshoot minor issues before they become major ones
- Rapidly respond to problems, thus reducing outage time if they do occur
- Increase security with increased data- and server-access intelligence
- Ensure compliance with regular OS upgrades
- Identify and invest in your most valuable, mission-critical assets
In addition to the benefits above, studies have shown that there are significant financial benefits to managed services. For example, Cisco’s Business Case for Managed Services in Small and Medium-sized Businesses stated: “The result of purchasing managed IP services is an extraordinary cost savings — in some cases, 60 percent or more.” (Source: Cisco Analysis, 2004)
Finding the Right Fit
To determine if managed services are a good fit for your organization, ask yourself the following questions:
- Is your IT team struggling to keep up with daily maintenance tasks?
- Are senior IT executives spending time on tactics, rather than strategic initiatives?
- Do errors occur that you didn’t expect or anticipate?
- Are you missing a strong overview of your daily network performance?
If you answered yes to any of the above, then managed services would likely improve some operational aspects of your IT environment. The only element left to determine, then, is whether managed services makes financial sense.
To get an idea, consider a sample managed IT services financial analysis, filled out for a fictitious company (call it Company X):
In the Inputs, we insert data about Company X, such as its number of users, servers and hours spent on IT services, as well as financials including revenue, salaries and outsourced IT support costs.
In Outputs, we crunch the numbers to determine total annual IT support costs at Company X.
Then, in the Analysis, we can do some simple math and see that Company X can save $564,663 — or 90% — on its annual IT support costs by implementing a $62,740 Network Monitoring Managed Service program.
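The arithmetic behind those two published figures is straightforward. Working backward from the $564,663 savings and the $62,740 program cost, Company X’s implied current support cost is simply the sum of the two:

```python
# Reproducing the Company X arithmetic. The savings and program cost
# are the figures published above; the current annual support cost is
# implied by them.

program_cost = 62_740
savings = 564_663
current_cost = program_cost + savings    # implied total: $627,403

percent_saved = savings / current_cost * 100
print(f"Current annual IT support cost: ${current_cost:,}")
print(f"Savings with managed services:  ${savings:,} ({percent_saved:.0f}%)")
# -> 90%
```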
A thorough understanding of the finances, together with a consideration of the benefits above, are great first steps towards deciding whether your organization can benefit from a managed IT services program.
Does your organization have a managed services program? What benefits have you seen?
Jim Burnett is the Managing Director of Service at MCPc and is responsible for all service related functions for MCPc customers. Connect with Jim on LinkedIn.
As in previous months, we’ve compiled a collection of articles that provide insight on prominent IT topics. January’s roundup takes a look at the evolving IT industry, the impact of cloud computing, the future of personal/desktop computing and enterprise apps in the mobile market.
In “The Straight Talk on IT's New Directions,” Galen Gruman explains that as economic conditions improve, companies will again look to invest in IT. However, they will have much different priorities than they did pre-recession. For starters, they likely won’t be investing as heavily in back office and infrastructure solutions, including security. So, what does the new face of IT look like?
- Mobile Management – As the bring-your-own-device market flourishes and new technologies emerge, companies are seeking support in managing mobile, iPads/tablets, social networking, cloud apps and more.
- Data Analytics – As stated in the article, “analytics, especially around decision support, Big Data and fuzzy data, predictions, and inline adjustments (so-called operational BI), are key.”
- Embrace Non-IT, Tech-Savvy Employees – Work with project managers, product managers and solution architects to define reasonable tech limits and create a culture of teamwork.
Overall, IT needs to think differently, and realize that it should be as big a part of “the strategy, development, and execution teams as any other group.” Read the complete article for all the details on these emerging trends.
As cited in Eric Knorr’s article “Cloud Computing: IT as Commodity,” Siki Giunta, vice president of cloud computing and software services at CSC, expects that “more than half of all IT workloads will become cloud services by 2016.” This is because many IT services will become mass-produced commodities, instead of custom-built solutions.
How will this impact IT as we know it? As Giunta explains, it will lower IT costs since economies of scale will take effect. In addition, it will have a dramatic impact on IT jobs, responsibilities and expectations.
In “The Internet of Things and the Cloud CIO of the Future,” Bernard Golden predicts that cloud computing will shift the role of the CIO from technology to data. He believes that CIOs will increasingly be expected to make data available within their organizations, and help others understand what can be done with it. This includes building scalable architectures, looking to new storage paradigms (for example, Amazon’s S3) and learning from successful case studies.
He also cites the impending growth of connected devices — those that can interact with other devices without human intervention — and how this too will impact IT operations. In a related follow-up post, Golden explains that as connected devices become more prevalent, software will automate what were traditionally manual processes. The result is a shift in IT job responsibilities.
For example, enterprise architects will become more important, as added resources will be needed for developing, implementing, and enforcing standardized architectures.
In addition, cloud computing will require:
- A more hands-off approach by operations personnel
- Integration of legal and regulatory teams during the provisioning process
- Enhanced security standards that can support a more open data center environment
- IT financial analysts who can help guide service and resource decisions
Read Golden’s “Cloud CIO: How Cloud Computing Changes IT Staffs” for the complete details.
Eric Knorr predicts that in 2011 the personal computer will reinvent itself. He states that “whatever personal computing device you use, desktop or mobile, serves only as a temporary access point for data, preferences and applications. The permanent home for your computing life, to the degree that it exists, will be on a server in the cloud or in your own data center.”
This would enable your desktop to be available for any device at any time, increasing productivity and efficiency of end users – especially mobile workers. While building this virtual desktop won’t be easy, Jon Brodkin offers a look at how some companies are starting to move toward this model in his article “The Complicated New Face of Personal Computing.” Check it out to see how others are managing employee-owned devices, mobile workforces and the work/personal balance as it relates to IT equipment.
In “The PC Era Is Not Over Yet,” Bill Snyder argues that the complete shift to mobile and cloud likely won’t happen overnight. He cites privacy, vendor lock-in and spectrum shortage (running out of wireless bandwidth) as key issues that need to be sorted out first.
And, for a look at MCPc’s take on this topic, read “The Future of Desktop Computing: 2011 and Beyond.”
So, with mobile adoption exploding (in some cases replacing PC use) and networks getting faster, a more sophisticated mobile app market needs to evolve, according to Carl Weinschenk in “Enterprise Mobile App Development Must Mature Quickly.” He explains that “the goal is to design ‘personal enterprise apps’” that have the same ease-of-use as consumer apps but more robust functionality.
In other words, for mobile to truly become a viable alternative, app functionality and security need to improve. And developers need to work fast to keep up with increased end-user demand, as detailed in this look at the healthcare market.
What Do You Think?
What articles, blog posts, videos or podcasts did you find interesting last month? Post a comment, and we’ll be sure to keep an eye on those sources for future wrap-up articles like this one.
This post is an MCPc blogging team collaboration.
A NAS appliance configured with Microsoft Windows Storage Server (WSS) is a universal file repository that can be used by any organization for many different purposes. Why should you consider a WSS? Perhaps you need to implement shared storage in a remote office quickly and economically, or provide gateway access to SAN storage. Maybe there is a department within your organization that needs a simple and effective way to store and share project files.
Microsoft Windows Storage Server is a special version of the Microsoft Windows Server operating system that is only made available to storage-system OEMs. As of this writing, the current version is Windows Storage Server 2008 R2, which is built on the Windows Server 2008 R2 operating system.
WSS is purpose-built, in that it is intended to provide a robust file-serving environment for NAS (Network Attached Storage) appliances. A NAS configured with WSS is not intended to host or run applications like a standard Windows Server OS installation, with the exception of selected backup applications specifically made for it.
If you’re considering Windows Storage Server for your organization, following are seven key benefits you can expect.
1. Easy access for clients – To a client (end-user) on a network, a NAS appliance looks pretty much like a file server; the only difference is that a NAS configured with Microsoft WSS is a purpose-built appliance specifically tuned for file-serving duty. A user can map a drive letter to a NAS folder to save and retrieve files there. This is simple file-based storage access, somewhat like a large electronic filing cabinet for computer data. Users can access and share files amongst themselves based upon permissions set by the administrator. This is the most common NAS appliance usage.
2. iSCSI support – The current version of Microsoft WSS offers a new feature — iSCSI connectivity. This opens the door for block-based I/O services, which means that storage targets on the NAS device can appear to a server as locally attached storage. The benefit is that a NAS appliance configured with Microsoft WSS can essentially do double duty by serving storage to both clients and regular servers, making unified storage practical and economical. iSCSI support also means that a NAS can be used as a boot device for instances of Microsoft Hyper-V virtual machines, and potentially other servers that utilize boot images.
3. Easy to deploy – NAS systems configured with Microsoft WSS are very easy to implement in most network environments because they are essentially pre-configured. They typically include one or two Gigabit Ethernet connections and a wizard-driven GUI for initial setup. Ongoing administration is usually done through a simple GUI, as well. It may take as little as a few minutes to get a NAS appliance online and ready for use.
4. Utilized in many common storage solutions – There are several storage solutions available that utilize the Microsoft WSS operating environment. You should easily find a solution to match your requirements from a reputable storage hardware manufacturer.
5. File data de-duplication is built in – Duplicate files are identified and consolidated automatically, which conserves disk space. This is also referred to as Single Instance Storage (SIS); a minimal sketch of the idea follows this list. Backup applications that are SIS-aware may save time during backups.
6. Use as a gateway – A NAS appliance configured with Microsoft WSS can be used as a SAN gateway to access existing SAN storage resources. This opens up underutilized SAN disk space to users who may otherwise not have access.
7. Access for Apple and Linux users – End-users on computers equipped with Linux and Apple operating systems can also access the storage on the NAS appliance because SMB and NFS protocols are included.
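On the de-duplication point above, the sketch below shows the basic idea behind file-level single-instance storage: hash each file’s contents and keep one copy per unique hash. This is purely illustrative (WSS performs this internally and transparently), and the share path is a made-up example.

```python
# Illustrative sketch of file-level de-duplication: hash each file's
# contents and flag files whose contents already exist elsewhere.
# WSS does this internally; the path below is a made-up example.

import hashlib
import os

def find_duplicates(root):
    seen = {}          # content hash -> first path seen with that content
    duplicates = []    # (duplicate path, canonical path) pairs
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            if digest in seen:
                duplicates.append((path, seen[digest]))
            else:
                seen[digest] = path
    return duplicates

for dup, original in find_duplicates("/mnt/nas-share"):
    print(f"{dup} duplicates {original}")
```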
These are just a few examples of how a NAS appliance equipped with Microsoft WSS can help with your simple storage requirements. The standardized x86 hardware components, combined with the proven reliability of the Microsoft Windows Storage Server operating system, make these solutions easy to implement and administer.
Have you had experience implementing a Windows Storage Server? Was the process as easy as expected? What do you like about WSS?
Perry Szarka is a Solution Consultant at MCPc with expertise in data storage and network infrastructure. He works closely with clients to understand their business objectives and discover solutions to help them achieve their goals.