As public cloud computing continues to gain strong momentum, the vendor landscape for IaaS is rounding out nicely, with Amazon Web Services (AWS) the clear leader of the pack. Focusing on the enterprise application space, from my observation a significant number of customer cloud projects over the past few years fell into two major buckets: 1) virtualizing server applications to build the private cloud (datacenter) and 2) running test and development projects to work out the kinks and build up the experience required to ensure applications will run as well as or better than in the enterprise. With data center virtualization projects well in hand, organizations are now looking to leverage the public cloud in a variety of ways: for some, desktop virtualization projects are the priority; for others, data backup or disaster recovery; while a third group looks to the public cloud as an extension of their private cloud to handle certain workloads.

Regardless of the use case, while confidence in the cloud is growing due to its undeniable benefits, top customer pain points continue to revolve around security. While IaaS cloud providers have invested heavily in building strong security controls protecting everything from core networking, processing and storage infrastructure right up to and including the operating system, they promote a concept of shared responsibility, where customers remain responsible for securing the actual workload instances and their associated data. To help AWS customers take control of securing their private data in the public cloud, AFORE announced a suite of encryption solutions designed to help secure virtual desktops (VDI), application servers and EBS storage. These solutions not only help companies address regulatory compliance and protect against cyber surveillance and other forms of external threats, but can also help with the ever-growing concern of insider threats.
Underpinning our solutions is a unique approach that makes deployment and management simple for IT managers, giving them the ability to secure both private and public cloud workloads from a single security management plane. To learn more about AFORE, check out our cloud security products or, better yet, drop by and see us if you're attending the AWS re:Invent conference in Las Vegas.
Healthcare providers have a lot to contend with these days. Beyond the constant demands to improve the quality of patient care and operating efficiencies, healthcare organizations are experiencing dramatic changes in information technology with the emergence of Electronic Healthcare Records systems, mobile computing and Bring-Your-Own-Device (BYOD), as well as cloud and virtualized computing platforms. While these advancements promise improved patient care, workplace efficiency and reduced cost, they also introduce new issues related to how sensitive information is transmitted and stored, and the associated security risks. To make matters more challenging, a new HIPAA regulatory rule, which deals with protecting patient data, came into effect this past month. Under this new legislation, healthcare providers not only face steeper fines and more onerous notification requirements should a data breach occur, but HIPAA now extends to their Business Associates (BAs) as well. BAs include organizations such as payment providers, legal firms, and even technology service providers such as Amazon and Google cloud providers. This HIPAA compliance article covers some important considerations when dealing with cloud providers, and the pros and cons of how data security and encryption are implemented. Without giving it away: with public cloud, there are options. Whether you Bring-Your-Own-Security (BYOS) or subscribe to security services for data encryption from your cloud provider, one of the fundamental decisions is WHO has access to your data - WHO controls the encryption keys.
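To make the BYOS key-ownership point concrete, here is a minimal Python sketch of client-side encryption: the enterprise generates and holds the key, and the provider only ever stores ciphertext. Everything here (the function names, and especially the counter-mode HMAC keystream) is an illustration invented for this post; a real deployment would use a vetted authenticated cipher such as AES-GCM from a maintained crypto library.

```python
import hashlib
import hmac
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy counter-mode keystream for illustration only -- not production crypto."""
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt on-premises; only ciphertext ever leaves the enterprise."""
    nonce = os.urandom(16)
    return nonce + bytes(a ^ b for a, b in
                         zip(plaintext, keystream(key, nonce, len(plaintext))))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# The customer generates and keeps the key; the cloud stores only ciphertext.
key = os.urandom(32)
record = b"patient record: example only"
stored_in_cloud = encrypt(key, record)
assert stored_in_cloud != record                 # the provider sees only ciphertext
assert decrypt(key, stored_in_cloud) == record   # only the key holder can read it
```

The design point is the last two lines: whoever holds `key` decides who can read the data, which is exactly the "WHO controls the encryption keys" question.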
Visit us at ASIS 2013 Seminar and Exhibits, September 24 - 26, 2013, booth #283 at McCormick Place, Chicago, IL. To set up a meeting, please contact James LaPalme, VP Sales & Marketing, by email at firstname.lastname@example.org. https://securityexpo.asisonline.org/Pages/default.aspx
Most security systems currently deployed within organizations focus on access control, malware/virus detection and network edge protection (i.e. firewalls, DLP, etc.). Unfortunately, none of these security systems is very useful when it comes to dealing with Advanced Persistent Threats. Now, I'm not saying that everyone should go off and get rid of these mechanisms, since they still play a role in securing your IT infrastructure; they just don't do much when it comes to APTs. APTs are usually tailor-made attacks, so malware removal and anti-virus software are ill suited to detect them. Some pundits say that DLP will prevent APTs from exfiltrating data, but that's a silly proposition. DLP is good at preventing accidental leakage of data, but is totally incapable of preventing data leakage through things like encrypted tunnels, or through more exotic mechanisms such as steganography. Most properly crafted APTs will lie low to avoid detection, waiting for the best time to exfiltrate data, or even extracting small bits of data over a long period of time. I could drill down on how APTs get past a perimeter, how they propagate, move laterally from one system to the next, etc., but these are all academic details. The fact is, APTs will get in and will be in a position to exfiltrate data if they can get to it. And therein lies the key focus when it comes to combating APTs. The data! Data encryption is the silver bullet of data security. The problem with current data encryption solutions is that if a user has access to encrypted data on a system, then any and all applications are able to access that data. As such, if Microsoft Excel is able to access that spreadsheet, then so can Internet Explorer, and so can that piece of malware lurking in your infrastructure. This is why AFORE Solutions engineered a solution called CypherX.
CypherX is a policy-driven security solution designed to allow some applications (let's call them trusted) access to encrypted data, while all other applications (let's call those untrusted) can still open the encrypted files, but only see encrypted data. More importantly, CypherX ensures that all data emitted by a trusted application (i.e. via the file system, IPC, network sockets, etc.) is either encrypted on the way out or is only emitted towards another trusted application (i.e. sockets can only be established between trusted applications). As such, sensitive data is forced to stay within the 'trusted environment'. This is a powerful paradigm. In essence, CypherX elevates applications to the level of securable objects, just like users or machines. Unknown applications (including malware, viruses and APTs) can come along and open all these sensitive data files, but because these applications are not seen as trusted, all they see is encrypted data. So let them exfiltrate your totally encrypted data! Hopefully, they'll chase their tails trying to make sense of the gobbledygook before realizing they got nothing… And the coolest part is, CypherX is totally transparent to both applications and end-users. In the next part of this series, I want to continue with a quick discussion of cloud-based attack vectors and the risks to data when it comes to things like backups (and how APTs might be able to reach those)…
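As a thought experiment, the trusted/untrusted gating idea can be sketched in a few lines of Python. To be clear, this is my own illustrative pseudologic, not the CypherX implementation: the application names, the policy set, and the toy XOR keystream are all invented for the example.

```python
import hashlib
import hmac
import os

TRUSTED_APPS = {"excel.exe", "winword.exe"}   # hypothetical policy set

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy counter-mode keystream for the example; not production crypto.
    out, i = b"", 0
    while len(out) < n:
        out += hmac.new(key, nonce + i.to_bytes(8, "big"), hashlib.sha256).digest()
        i += 1
    return out[:n]

def _xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))

def seal(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    return nonce + _xor(key, nonce, plaintext)

def open_file(app_name: str, blob: bytes, key: bytes) -> bytes:
    """Every application can open the file; only a trusted one sees plaintext."""
    nonce, ct = blob[:16], blob[16:]
    if app_name in TRUSTED_APPS:
        return _xor(key, nonce, ct)   # transparent decryption for trusted apps
    return ct                         # malware/APTs get only ciphertext

key = os.urandom(32)
blob = seal(key, b"quarterly forecast")
assert open_file("excel.exe", blob, key) == b"quarterly forecast"
assert open_file("malware.exe", blob, key) != b"quarterly forecast"
```

The point of the sketch is the branch in `open_file`: the decision to release plaintext hinges on the identity of the application, not just the user, which is what makes the application a securable object.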
So this is the first of several blogs where I want to write on the subject of advanced persistent threats, or APTs for short. Unlike viruses and malware that are spammed at hundreds of millions of users in the hope that a few of them will get suckered in, APTs are totally different. Wikipedia defines it very well: "Advanced persistent threat (APT) usually refers to a group, such as a foreign government, with both the capability and the intent to persistently and effectively target a specific entity." For the past few years, most targets of APTs have been government and large enterprise, but lately I've come across interesting reports that smaller organizations are being targeted, and that's an interesting shift in the security landscape. The most interesting case has been attacks on law firms that specialize in patent law, targeted by foreign interests (starts with a 'C' and ends with a 'hina') so that intellectual property could be exfiltrated while patents are being authored (i.e. prior to filing). I presume the purpose of this would be either to allow an organization to file a patent before the competition (i.e. beating them to the punch) or to simply publish the intellectual property for all to see, thus establishing 'prior art'. Either way, I suspect it would give someone a significant competitive advantage. Unlike regular hacking attacks, APTs are sophisticated multi-vector attacks designed to attack a specific target. The people behind APTs are exceptionally well funded and willing to sustain an attack over a long period of time. In some cases, APTs are designed to achieve a singular, hard-to-achieve goal (i.e. steal that top secret file, acquire this trade secret, etc.), but in many cases, APTs attempt to establish and retain a foothold within an IT infrastructure over an extended period of time, allowing the perpetrator of the APT to continuously spy on their target.
The biggest challenge with APTs lies in the fact that many of them are custom developed. Traditional security systems such as anti-virus and malware removal software, as well as perimeter security systems, are often useless against such attacks. As one of my co-workers coined the other day, when it comes to APTs, most security solutions out there are not unlike 'closing the gates after the cows have escaped'… And this is absolutely true. Organizations should accept that they will get breached (or have already been breached) by APTs. In part 2 of this blog, I want to continue by discussing what is actually effective against APTs. So stay tuned!!!
Public cloud service is growing rapidly. For example, in 2012 Amazon AWS doubled its revenue to over $2 billion US from $1 billion in 2011. In my experience using public cloud services for cloud-based storage, product testing and customer demonstrations, on-boarding is so easy that in a few minutes I am able to create an account, run my applications, and bring an external cloud service into my company for business use. With a rich choice of cloud services, different lines of business within an enterprise are able to subscribe to cloud services in minutes. For example, an R&D department can go to Rackspace for software development and testing, a managed service unit can go to Savvis to host a SaaS application, and a BI department can go to Amazon for big data services. This Bring Your Own Cloud (BYOC) phenomenon is happening across enterprises. Business units are adopting public cloud services to accelerate time to market and reduce internal capital and operating costs. On the other hand, enterprise IT departments are genuinely concerned about losing control. Quite often, security and a lack of manageability and visibility are listed as the top concerns hindering public cloud adoption. To address BYOC, we must provide enterprise IT with solutions to retain control of applications, workloads and data in public clouds, including:
- The capability to bring workloads and data in the cloud back under the control of enterprise IT, enabling an enterprise to use its domain controller, directory service and other existing enterprise security solutions to manage users, authentication and access control for workloads in the cloud.
- An integrated management solution, so that enterprise IT is able to manage and monitor workloads in the cloud from existing management tools. For example, VMware vSphere is the dominant virtualization management platform in the enterprise; the capability to monitor and manage workloads in Amazon AWS from an enterprise VMware vSphere environment will give enterprise IT the manageability it needs.
- The capability to secure data and remotely control data security from the enterprise environment, so that enterprise security administrators can enforce data encryption policy, control encryption keys and perform a remote "data wipe-out" to solve the data remanence issue.
- The capability to monitor cloud workloads in terms of performance, access, alarms and security events, and to retrieve logs from the cloud environment for auditing purposes.
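The "data wipe-out" item above is commonly implemented by crypto-shredding: if the keys never leave the enterprise, destroying a key renders the cloud-side ciphertext permanently unreadable, no matter what copies the provider retains. A minimal sketch of that idea (the class and method names are my own, invented for illustration):

```python
import os

class EnterpriseKeyStore:
    """Keys stay on-premises; only ciphertext lives in the cloud."""

    def __init__(self):
        self._keys = {}

    def new_key(self, workload_id: str) -> bytes:
        key = os.urandom(32)              # per-workload data encryption key
        self._keys[workload_id] = key
        return key

    def get_key(self, workload_id: str) -> bytes:
        return self._keys[workload_id]

    def remote_wipe(self, workload_id: str) -> None:
        # Crypto-shredding: without the key, the remote ciphertext is noise,
        # addressing data remanence without trusting the provider's disks.
        del self._keys[workload_id]

store = EnterpriseKeyStore()
store.new_key("vm-042")
store.remote_wipe("vm-042")
try:
    store.get_key("vm-042")
    recoverable = True
except KeyError:
    recoverable = False
assert recoverable is False   # the workload's data can no longer be decrypted
```

Note the trust model: the wipe is effective because the enterprise, not the cloud administrator, controls the key lifecycle.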
Recently, I was involved in a lively discussion about training as a tool to enhance data security in the enterprise world. This is obviously not a new discussion, but it's always an interesting debate and I certainly love to revisit it once in a while. When we talk about this subject, we need to make a clear separation between the training of IT staff and the training of end-users (i.e. employees who are NOT IT staff). Training IT staff is always a good idea, since it's their job to maintain and manage IT infrastructure, and security awareness is an important part of their duties. Budgeting for one or more members of an IT team to become security certified (with programs like CISSP) is always worthwhile, and IT team members often see this as advancing their careers, so it's good and positive for everyone. Enterprises should also consider sending a select few IT specialists to conferences like DefCon or BlackHat so they can get a feel for the new, sophisticated threats coming their way (it can be a very sobering experience for some). But if we are talking about trying to train the end-user... Well... *Sigh*... Such training and awareness programs have seen very limited success. Employees rarely care about security and are usually focused on their tasks. People are usually measured on productivity, not on how secure their behavior is. Now, ask yourself this...
- How many times have you let someone tailgate past a security door after opening it with your access card?
- How many times did you walk away from your desk and leave your workstation unlocked?
- How many times did you bring your laptop to a coffee shop and leave it unattended for just a few moments while you went to put some honey in your coffee?... (after all, you didn't want to lose that table you managed to snag in the busy coffee shop)
- How many times did you bring work home on a USB drive?
- How many times did you upload corporate documents to DropBox or email them to one of your other email accounts?
- How many times do you reuse the same password across multiple accounts?
- How many times have you selected a less-than-strong password for an on-line service or site, as determined by the service’s change password screen?
I learned of yet another major data leak, this one at HRSDC (Human Resources and Skills Development Canada). Based on the details published by the government, the data was lost when a removable storage device (i.e. a USB disk) with unencrypted data went missing. Data leakage events such as this never come as a galloping shock to anyone within the IT security industry… Contrary to what some would like to believe, data loss often comes from inside sources, not from evil hackers and their ilk (although they do exist). Some market research companies peg insider sources as being responsible for as much as 75% of all data leakage. Now don't get me wrong, I'm not going around pretending that ill-meaning data thieves are filling the ranks of your workplace; they are probably not. There is data leakage from malicious insiders, but a lot of data leakage actually happens accidentally. How many times has some hard-working employee decided to take some work home on a USB stick, only to end up misplacing it? I've misplaced those tiny little USB sticks before, and I'm willing to bet that has happened to a lot of readers as well. And of course, there is the ever terrible laptop loss or, more likely, theft. Other nice ones include improper disposal of surplus hardware, employees who install file sharing software and end up sharing 'a little too much', or the classic 'where did those backup tapes go again?'. And that's just the old ways of data leakage. New services like DropBox, email as a service, virtual desktops as a service, etc. are opening up corporate IT perimeters, making them increasingly blurry. But the long and short of it is, there are a lot of ways for data to grow legs on the inside. And then there is 'the malicious insider', the person who actually MEANS to steal sensitive data.
People like disgruntled employees, those planning to strike out on their own and wanting a 'leg up' with the company's client list, and those doing it just for the money, for a cause, or for country… This includes cloud provider admins capable of accessing a wide range of virtual assets from many customers. In those instances, protecting digital assets requires a lot more than the usual perimeter security like firewalls or perimeter DLP. Given that it is becoming increasingly easy for data to leak from within, either maliciously or accidentally, IT professionals and policy makers like CIOs and CTOs must begin to look at security solutions capable of securing data long before it ever reaches the perimeter of the IT infrastructure. The ability to encrypt data as soon as possible, to control the context under which encrypted data can be accessed (who, what, when, and where), and to be absolutely transparent to the end-user are three required ingredients of any successful strategy designed to prevent data leakage from insiders, whether accidental or malicious.
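Controlling the context under which encrypted data can be accessed (the who, what, when and where above) amounts to evaluating a policy before any decryption key is released. A hedged sketch of such a check, with an invented policy format and example values throughout:

```python
from datetime import datetime, timezone

# Hypothetical context policy; every value here is an invented example.
POLICY = {
    "users": {"alice"},               # who
    "apps": {"excel.exe"},            # what
    "hours": range(8, 18),            # when: 08:00-17:59 UTC
    "net_prefixes": ("10.0.",),       # where: corporate address space
}

def may_release_key(user, app, src_ip, when=None):
    """Release a decryption key only when the full access context matches."""
    when = when or datetime.now(timezone.utc)
    return (user in POLICY["users"]
            and app in POLICY["apps"]
            and when.hour in POLICY["hours"]
            and src_ip.startswith(POLICY["net_prefixes"]))

office_hours = datetime(2013, 9, 24, 14, 0, tzinfo=timezone.utc)
midnight = datetime(2013, 9, 24, 2, 0, tzinfo=timezone.utc)
assert may_release_key("alice", "excel.exe", "10.0.3.7", office_hours)
assert not may_release_key("alice", "excel.exe", "10.0.3.7", midnight)
assert not may_release_key("mallory", "excel.exe", "10.0.3.7", office_hours)
```

The same data can be decryptable at a desk on the corporate network during business hours, yet opaque on a stolen laptop at midnight, without the end-user ever seeing the machinery.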
I can still recall that my first data analytics project used SAS and DB2 to analyse social and economic statistical data measured in megabytes and stored on floppy disks. Analysing unstructured content was a big challenge then and involved a great deal of manual indexing, labeling and structuring work. Thanks to the rapid digitalization of data and content, advancements in big data technology such as Hadoop, and powerful computing resources, we are now in a big data and cloud computing age! In this age, the information explosion is continuing at an astonishing rate. According to IDC Digital Universe research, "in 2011 the world created a staggering 1.8 zettabytes of data". By 2020 the world will generate 50 times that amount of information. Big data is here and getting bigger. Data processing technologies have also advanced significantly. For example, Hadoop is able to sort a terabyte of data in 62 seconds using over one thousand nodes. It is quite common now for business and government organizations to leverage the elastic, on-demand nature of cloud computing and the economics of cloud storage to process large data sets in a timely and economical fashion. The intersection of cloud computing and big data is happening in a big way. According to a presentation at the recent NIST Cloud Computing and Big Data Workshop, two million paid Elastic MapReduce (EMR) clusters were launched in the Amazon cloud in 2012. Considering the power of combining big data and cloud computing to convert raw big data into meaningful business intelligence, it is more important than ever before to protect the privacy and confidentiality of sensitive data. For example, prescription data, smart meter data and credit card data can expose a significant amount of personal information. Given the wide availability of public computing resources and open source Hadoop technology, we need to ensure that this data can never be accessed or used by the wrong people.
According to the IDC research, "only half of information that should be protected is protected". The data protection gap is still huge. When big data intersects with cloud computing, data protection becomes even more urgent and important. Quite often, the cloud is a multi-tenant environment where computing, storage and network resources are shared among tenants and managed by cloud administrators. When a tenant finishes a big data processing job, what happens to the tenant's data left behind on cloud-based storage systems? Data encryption will play an important role in this big data and cloud computing age. Sensitive data must be encrypted. Only trusted Hadoop nodes should have the encryption keys to process the sensitive data. And the keys should always remain under the control of the data owners. Protecting big data is about protecting individuals, preserving business brands and securing competitive advantage.