You have covered your bases and protected your client devices – complex passwords, two-factor authentication, application whitelists, client firewalls, and anti-malware. You have also taken measures to ensure that if a device is lost, the data will not be compromised – encrypted hard drives, remote wipe capability, and encrypted VPN tunnels for information transfer. Now you are feeling pretty good about your client security posture. (Of course, if you are not doing these things, I'll have to write later about why they are still important.)
Now, let's explore a very real scenario where all of this security preparation does not and will not matter. While this scenario focuses on U.S. policy, it applies to governmental action in any part of the world.
Let me set the scene. You travel outside the United States, visiting a client or an international branch of your organization. Everything goes well; however, upon re-entering the United States, you are pulled into a room by Department of Homeland Security (DHS) personnel, who kindly ask you to power on your device and enter your password to decrypt its contents.
Oops. Complex passwords, two-factor authentication, encrypted contents – none of it matters, because you have just handed over the keys to the kingdom. Most people in this situation may be a bit nervous, but not overly worried, because they do not believe they have anything to hide. However, as recently reported by David Kravets of Wired (http://www.wired.com/threatlevel/2013/02/electronics-border-seizures/), the DHS Office for Civil Rights and Civil Liberties published a two-page executive summary of its findings regarding the DHS suspicionless search-and-seizure policy for electronic devices.
The executive summary of this report included this statement: “We also conclude that imposing a requirement that officers have reasonable suspicion in order to conduct a border search of an electronic device would be operationally harmful without concomitant civil rights/civil liberties benefits.”
What does this mean?
Well, it means that regardless of what you have or have not done, the authorities can take your equipment. If they do so after you have given them access to the machine, they now have access to all of your data without a warrant, and you no longer have any control over it.
The article illustrates a real-world example of this in action. Mr. Abidor, a Ph.D. student, was held while DHS agents went through his computer – and what the article doesn't point out is that Mr. Abidor had to file a lawsuit to get the computer back. DHS held it for over two weeks, even though no charges were filed against Mr. Abidor and no suspicion of wrongdoing was alleged.
What is on your company laptops?
How much financial information, patented company information, personally identifiable information (PII), protected health information (PHI), credit card account information, and other sensitive data are you, your colleagues, or your employees carrying around? You may think that even in an extreme scenario like this the data would not be compromised in a way that would hurt you or your company; after all, the government has protocols and processes that should protect what it has seized.
Are you ready to bet your company on it?
To date, I know of no instance where a border-seized asset led to leaked corporate data or a privacy breach. That does not mean it has not happened, or that it will not happen. How do you protect against this scenario, or a similar one in another country where you have even less recourse? I outline a few options below:
How about the Public Cloud?
Be careful about the what, where, and how of your provider. International and national providers are inundated with requests for access to data hosted in their “cloud” by foreign governments, the U.S. federal government, and countless other local government entities. Of particular concern is that some companies, most visibly the major communications providers, have deals in place through which they make millions of dollars by sharing your information with the government. How much do they really have invested in your privacy?
Even with the spotlight put on these programs in recent years, it appears the future will be even more “cloudy” in this regard. As an example, in United States v. Skinner (http://www.ca6.uscourts.gov/opinions.pdf/12a0262p-06.pdf), the Sixth Circuit Court of Appeals recently ruled that police do not need a warrant to access an individual's GPS data. This data is, of course, held by your telephone provider or navigation service provider. What type of doors will this open to other types of data you are storing in large national public clouds?
It is still possible to gain the efficiencies of a public cloud while avoiding the conflicts of interest of the big players. Look for an established local or regional partner that offers services comparable to the big players in availability and security, but without the entangling agreements that are not in the best interest of your data.
What about a Private Cloud?
Clearly, there is more control here. You have the most control when you own not only the servers but also the location they are housed in. At least in this scenario, any entity will need a legal warrant to retrieve data from servers in your private infrastructure. Many companies are jumping on the private cloud bandwagon. Companies with big data center experience, like EMC, are applying that knowledge to deliver reference architectures such as VSPEX, which blueprint a flexible, tested solution built on a host of technologies that can deliver on the private cloud promise. Microsoft, VMware, and even traditional network players like Cisco are offering private cloud solutions.
How do you access your data in the cloud?
Really there are two choices here, although the technology to deliver them will vary.
Encrypted VPN tunnels are the traditional method for gaining access back into a corporate network. They will work in this scenario as well, but you still have plenty of issues around where data is saved, how working copies are managed, and what is kept on a personal device. You have a wide range of choices in how this is delivered – Microsoft DirectAccess, Citrix NetScaler Access Gateway, and a variety of Cisco solutions, just to name a few. The options in this space are nearly limitless, and chances are you already have one in place, even if it's not in use.
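A practical concern with the VPN approach is that working copies can quietly end up on the local disk while the tunnel is down. As a minimal sketch of one mitigation (the internal probe host and port below are hypothetical placeholders, not part of any particular VPN product), a sync or launcher script can confirm that an internal-only endpoint is reachable before it opens or copies sensitive files:

```python
import socket

# Hypothetical internal host and port that are reachable only when the
# corporate VPN tunnel is up; both values are placeholders for illustration.
INTERNAL_PROBE_HOST = "intranet.example.corp"
INTERNAL_PROBE_PORT = 443


def vpn_tunnel_is_up(host: str = INTERNAL_PROBE_HOST,
                     port: int = INTERNAL_PROBE_PORT,
                     timeout: float = 3.0) -> bool:
    """Return True if an internal-only endpoint answers, i.e. the tunnel is up."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    if vpn_tunnel_is_up():
        print("Tunnel is up: open working copies from the corporate share.")
    else:
        print("Tunnel is down: do not save sensitive working copies locally.")
```

The point of the sketch is simply that the decision about where data lands should be made by policy and tooling, not left to whatever the user happens to do when the tunnel drops.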
The other option is to keep everything in the cloud, including the working environment, so there is literally nothing on the device. How is that done? Virtual Desktops.
Utilizing private cloud to deliver virtual desktops provides the best combination of usability and security. The environment is controlled completely by the organization, and no data is kept on the client device. Popular choices in this space are VMware View and Citrix XenDesktop. In addition to fully virtualized desktops, another popular option is to utilize virtual applications.
Virtual applications can be delivered like a desktop and still benefit from saving data in the cloud. This can sometimes be easier for IT departments to deploy when they cannot standardize on specific desktops, and it can be combined with virtual desktops for the most dynamic scenarios. Again, the leading options in this space are VMware, with its Horizon Application Manager, and Citrix, with its flagship product XenApp.
In addition to the leaders, there are a multitude of startups in this space, as well as established companies trying to break into the market – and as a result, quality varies widely and prices run from free to astronomical, with everything in between.
Coming back to my initial story: during a border seizure or similar incident, you can unlock a device and allow it to be browsed and searched without having to worry about exposing your data. If a device is lost, it can be replaced and you can get back to work immediately, because your working environment is separate from the piece of hardware you are carrying around. The device is simply a usability tool that acts as a stepping stone to the non-resident application execution and data storage environments.
There are many risks to your personal data, your company’s data, and a client’s data that we probably have not even thought of or experienced yet. The good news is that there are many technologies available to assist with mitigating these threats.
Over the last five years, virtual technologies have matured economically so that businesses of all sizes can take advantage of them. Indeed, even individuals can find solutions tailored to their budgets.
Virtualization can be a great platform for savings, but an even better one for protecting your data when deployed correctly.
In eGroup’s latest Roundtable Series, we sit down with Dave Riberdy, Infrastructure Architect, SC Farm Bureau Insurance Companies.
SC Farm Bureau VDI Keys to Success
If you're considering desktop virtualization options for your enterprise and want to know how to do it successfully, we strongly recommend you listen to this eGroup Roundtable. Dave outlines why his team finally took the plunge with VMware View, how they secured funding, created a business case, addressed user change management issues, overcame technology challenges, and ultimately succeeded with the project.
In fact, it’s gone so well that the next phase will be to roll out to the field agents on their iPads. Dave explains.
So, go ahead and carve out 60 minutes for this Roundtable. You’ll be glad you did!
We recently had the opportunity to speak with Microsoft's Yung Chou on a few cloud computing and virtualization topics. For those of you not familiar with Yung, he's a Technology Evangelist on Microsoft's US Developer and Platform Evangelism team. Prior to Microsoft, he built experience in system programming, application development, consulting services, and IT management. His recent technical focus has been on virtualization and cloud computing, with strong interests in private cloud with service-based deployment, hybrid cloud, and emerging enterprise computing architectures.
We hope you find this insight valuable. If you have any questions or comments, leave them below for Yung to answer!
And don’t forget to check out his blog: Yung Chou on Hybrid Cloud
eGroup: What is your definition of cloud computing? And why do you believe everyone has their own, distinct definition of cloud computing?
YC: I define cloud computing with the 5-3-2 Principle, which specifies that, to qualify for the term “cloud,” an offering must:
- Exhibit the five essential characteristics: a self-service model, ubiquitous access, resource pooling, elasticity, and a consumption-based analytic model
- Be delivered in one of the three ways: Software as a Service, Platform as a Service, or Infrastructure as a Service
- Be deployed to either a public cloud or a private cloud, where I consider hybrid cloud an extension of either of the two
Indeed, cloud computing means different things to different people. The reason is that cloud computing encompasses the entire IT lifecycle, from hardware acquisition, infrastructure, networking, software development, and production operations and maintenance to decommissioning and disposal of resources. Regardless of what role and responsibilities an IT professional assumes, they fall under the umbrella of cloud computing. In fact, I always say that if you are an IT professional, you certainly know something about cloud computing. The issue, in my view, is not understanding cloud computing from a particular point of view; the issue is the inability to connect the dots and articulate the overall vision of cloud computing relevant to a project's priorities or an individual's role and responsibilities, due to a lack of clarity about what cloud computing is.
Cloud computing, in layman's terms, is a concept, an objective, a state, or a capability such that a business can be available on demand. In the context of IT, that concept, objective, state, or capability is realized with digital computing. The key here is the idea of “on demand,” which is an ambitious and lofty goal for IT. On-demand in cloud computing means anytime, anywhere, any-device access, which is the spirit of the consumerization of IT and embraces the device of a user's choice for productivity. On-demand sets the focus on an intended user: whenever, wherever, and with whatever device the user needs to access an authorized resource, whether an application, a run-time environment, or infrastructure, the resource will be there, not only accessible but also ready for consumption. On-demand also implies the ability to increase capacity in a timely manner as a target market moves upward and, perhaps more importantly, to decrease capacity when demand subsides. On-demand is the result of combining standardization of deployment, automation of operations and processes, and optimization of resource utilization, all of which require meticulous capacity planning with a comprehensive systems management solution.
This constant accessibility and readiness of a resource for an authorized user is the essence of cloud computing. The concept is largely embedded in the term “service.” In other words, cloud computing is about making a business available on demand, i.e. business as a service. Frequently in cloud computing discussions we hear people reference software as a service, platform as a service, infrastructure as a service, IT as a service, and even everything as a service. They are essentially saying: an application available on demand, a run-time environment available on demand, infrastructure provisioned on demand, IT functions available on demand, and a target delivery available on demand, respectively.
So my terse description of cloud computing is simply a capability to deliver business as a service or make business available on demand.
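For readers who like to see the structure laid out, here is a toy encoding of the 5-3-2 Principle as data plus a trivial check. The names and the pass/fail function are our own illustration, not part of Yung's formal definition:

```python
# Toy encoding of the 5-3-2 Principle. The names and the pass/fail check are
# illustrative only; they are not part of any formal specification.
FIVE_CHARACTERISTICS = {
    "self-service model",
    "ubiquitous access",
    "resource pooling",
    "elasticity",
    "consumption-based analytic model",
}
THREE_DELIVERY_MODELS = {"SaaS", "PaaS", "IaaS"}
TWO_DEPLOYMENT_MODELS = {"public cloud", "private cloud"}  # hybrid extends either


def qualifies_as_cloud(characteristics: set, delivery: str, deployment: str) -> bool:
    """True only if all five characteristics are exhibited and the delivery and
    deployment models fall within the three and two allowed options."""
    return (FIVE_CHARACTERISTICS <= characteristics
            and delivery in THREE_DELIVERY_MODELS
            and deployment in TWO_DEPLOYMENT_MODELS)


# Plain server consolidation exhibits none of the five characteristics,
# so it does not qualify as cloud under this principle.
print(qualifies_as_cloud(set(), "IaaS", "private cloud"))  # False
```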
eGroup: Can you explain the 5-3-2 Principle and what it means to IT and business executives?
YC: The 5-3-2 Principle outlines the requirements, the deliveries, and the deployments of cloud computing. For IT decision makers, a clear understanding of these requirements, with priorities set relative to the core business, is imperative. The question is really not if, but how much and to what extent, a business needs cloud computing in its various shapes and forms. All businesses should and will benefit from being available on demand, which is the state cloud computing sets out to achieve. The strategies and decisions are, in principle, based on where your business is now and where it needs to be, what capabilities are in place today and what else needs to be acquired, what the cash flow looks like and when it can break even, and so on. The cost model will be developed most effectively by IT professionals who know not just how to run IT, but also how to estimate the cost of operations and processes in both a traditional on-premises deployment and a target cloud computing environment.
The 5-3-2 Principle is a modified version of NIST SP 800-145, which officially defines cloud computing and is a good model to start from. In my view, however, SP 800-145 fails to address an important concept, service, which is critical for clarifying what cloud computing is about. SP 800-145 also applies inconsistent criteria in defining some of its cloud computing deployment models, as I have described in my post, An Inconvenient Truth of NIST Cloud Computing Definition, SP 800-145.
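To make the cash-flow and break-even reasoning above concrete, here is a back-of-the-envelope sketch comparing an up-front on-premises investment plus monthly run costs against a pay-as-you-go cloud subscription. All figures are invented for illustration only:

```python
from typing import Optional


def break_even_month(capex_onprem: float,
                     monthly_onprem: float,
                     monthly_cloud: float,
                     horizon_months: int = 60) -> Optional[int]:
    """First month in which cumulative cloud spend exceeds cumulative
    on-premises spend (up-front capital plus monthly run cost), or None
    if the cloud option stays cheaper over the whole planning horizon."""
    for month in range(1, horizon_months + 1):
        onprem_total = capex_onprem + monthly_onprem * month
        cloud_total = monthly_cloud * month
        if cloud_total > onprem_total:
            return month
    return None


# Hypothetical numbers: $120,000 up front plus $3,000/month to run on-premises,
# versus $8,000/month for an equivalent cloud service.
print(break_even_month(120_000, 3_000, 8_000))  # -> 25
```

A real model would of course fold in staffing, refresh cycles, and the value of elasticity, but even this simple comparison forces the conversation onto numbers rather than slogans.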
eGroup: Is there a difference between virtualization and cloud computing? If so, what is that difference?
YC: Virtualization is not cloud computing, and cloud computing goes far beyond virtualization. Fundamentally, virtualization is not required to abide by the 5-3-2 Principle, while cloud computing is. On many occasions I have asked IT professionals what virtualization means to them, and the frequent answers are various ways of describing running multiple OS instances on the same hardware, which is basically server consolidation. Few mention anything about a self-service model, universal accessibility, or embedded analytics, each of which is an essential characteristic of cloud computing. When planning a virtualization solution, the main focus for many has been on configuring and delivering virtual machines, not on the accessibility and readiness of a service, which is the application running in and delivered by a set of deployed virtual machines. Further, cloud computing ultimately aims to shorten go-to-market, which is why there is such a strong emphasis on the accessibility and readiness of a target resource, while virtualization is mostly about the technical integrity of the infrastructure that operates under the hood of cloud computing.
eGroup: Gartner recently said in its Hype Cycle report that cloud computing has become muddled in the market as vendors engage in “cloud washing,” and that such miscommunication could ultimately keep end users from realizing the efficiencies cloud computing can bring. Would you agree or disagree with Gartner, and why?
YC: I agree, and again, in my view, this is largely due to a lack of clarity in understanding what cloud computing is. If you do not know what it is, I can't imagine how you can effectively articulate why and how it is. Cloud computing is not a product or a particular technology. It is a set of capabilities with which a business can be made available on demand, whatever that business is, from deploying a server to developing an application or delivering flowers. Lately it has become even more confusing as people start using the term “cloud” interchangeably with remote access, equating virtualization with private cloud, and employing private cloud, hybrid cloud, and enterprise infrastructure as synonyms, even though each of these terms represents a specific set of requirements and objectives, targets different business scenarios, and addresses different business challenges. They are simply not the same.
eGroup: Switching gears to SharePoint, can you briefly discuss how business professionals can use SharePoint for their business intelligence needs/projects?
YC: SharePoint is to the information worker what Active Directory is to system administrators. In Windows infrastructure, Active Directory is the one version of truth and the ultimate authority for all objects managed in the associated domain. Ultimately, anything that happens in a domain starts and ends with Active Directory, since in a distributed system, which is what most enterprise IT is today, authentication, authorization, and accounting are the core services IT provides.
Similarly, SharePoint is the repository and the one version of truth for information based on the documents and data stored, derived, and delivered in a corporation. When architecting Windows infrastructure, IT professionals start with a Windows domain. When architecting a BI solution, an information architect should start with a SharePoint infrastructure whenever there is an opportunity. BI has much to do with data warehousing and mining. At the same time, BI is also about data gathering, portability, presentation, accessibility, taxonomy, workflow, extensibility, rights management, HR/CRM/ERP integration, and so on. What to get, how to get it, what it is validated by, who owns it, who can process it, who will consume it, what is significant and in what way, and on and on, all contribute in some way to the realization of BI. All of the above is what SharePoint is ultimately designed to deliver.
So rather than trying to fit SharePoint into a business process, developing a business process with SharePoint is the fundamental concept we must appreciate. This becomes very obvious when you try to build Windows infrastructure without first planning Active Directory; it just does not work that way. SharePoint needs to be treated with the same level of respect when it comes to information management. Consolidate existing business processes and data, convert them into SharePoint, and then run your business and develop BI capabilities with SharePoint to maximize business value and accelerate ROI.
At eGroup, we’re huge fans of InfoWorld’s Paul Venezia and his Deep End blog – always provocative, never dull. But his latest post, Why Virtualization breaks monitoring systems, had me miffed. You’ve got to use the right tool for the job, Paul!
You wouldn’t want to take your next road trip in a golf cart, and we don’t wash our dishes in a car wash – so why is IT still using the same monitoring systems of old to monitor their virtualized infrastructure?
Sure, these tools may get the job done eventually, but the means of getting there are tiring and messy – just like old monitoring systems that alert you every time something changes (ouch – every time something changes?).
So what’s my advice to IT?
Manage the change, and you'll manage your infrastructure. Improvements in technology constantly change the way we do business and ultimately support the business – only with change can we do it faster, cheaper, and better. Embracing the dynamic nature of workloads, and managing and creating an environment of expected change, is what today's virtualized data center is all about. We expect it to change – we want it to change – we need it to change. Things will change, but we'll still be in control – with the right tools, not the old ones!
When you're at the point where most of your workloads are virtualized, those old monitoring systems need to go. As a married man, I know when it's time to take out the trash – it's at the precise moment the trash can is completely overflowing. We don't throw away everything we no longer need, though – often we recycle, up-cycle, reuse, or donate our unwanted items – but when something is completely useless and I am morally obligated to keep it from being used by anyone any longer, it goes in the trash.
That’s similar to how I feel about old monitoring systems that were meant for old, static environments. When you’re at the point when most of your workloads are virtualized – those old monitoring systems and agent-based backups need to go. And if you need help through this phase, reach out to us, we’re happy to help!
Do not accept defeat – whatever you do, don't surrender to the idea that you'll never be able to monitor your virtualized infrastructure because the old monitoring tools have become instantly obsolete. IT professionals are anything but defeatists – if there is a way, we'll find it, code it, release it, and conquer the world just to keep from clicking a mouse that one extra time.
That attitude was certainly shared by VMware, which is why they released vCenter Operations Manager and the vCenter Operations Manager Suite. They knew that managing something so fluid couldn’t be accomplished with the same tools as static environments. The challenge of managing IT doesn’t decrease just because something is virtualized – it’s only when a workload is virtualized that we can see a management tool’s full potential, and value.
To understand how vCOPS acts as your greatest intelligence collector, read my recent post, “Hey IT Leaders – Would You Rather Be Proactive or Reactive?”
And so, with all due respect to Paul (and we have a TON of respect), virtualization doesn't break monitoring systems – virtualization breaks OLD monitoring systems. Stop washing your dishes in the car wash, and get the right tool for the job!
Wouldn’t you agree?
As we mentioned in an earlier post, we rarely beat our chest, but when it comes to our customers, we have no problem sharing their successes. Such is the case with Williams & Fudge, Inc., which received EMC's Journey to the Cloud Award at VMworld 2012.
Phillip Reynolds (pictured above), Director of Data Center Operations, accepted the award for demonstrating excellence in driving innovation, performance and creating business value through the use of virtualization.
Well done, Phillip and team!