Thursday 31 May 2018

Data Storage: From Then to Now & What’s Still Needed Ahead


- Becca Bauer

Data storage has been around for hundreds of years and went through a number of changes before arriving at today's cloud storage era. Starting in the 1720s, punch cards were introduced and eventually became the first tool for recording and storing data. Since then, data storage has gone through a radical evolution, including the introduction of magnetic tape and then the first hard drive, invented by IBM in 1956. This was followed some 20-30 years later by floppy disks, CD-ROMs and DVDs. Next, the USB flash drive arrived on the scene in 2000 and dominated storage for several years, until cloud storage entered the market, debuted by Amazon Web Services in the early 2000s, and took a major hold in 2006. Since then, the Cloud has become increasingly popular for data storage, which brings us to the present-day climate.

While there are many clear benefits to moving to the Cloud, there are also several flaws revealing themselves that indicate the storage market is in need of continued evolution. For example, recent studies indicate that moving storage from in-house to the Cloud won't achieve cost savings unless storage needs are fully assessed and anticipated savings are planned out. In fact, over-estimating storage capacity is one area that can make a dent in the savings. While committing to a higher capacity can secure better cloud storage rates from vendors, if you don't have enough data to meet that capacity, you are essentially paying for unnecessary, wasted space.
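The arithmetic behind that wasted spend is easy to sketch. The rates and capacities below are hypothetical placeholders, purely for illustration:

```python
# Illustrative only: hypothetical rates showing how over-estimating
# committed cloud storage capacity erodes expected savings.

def monthly_waste(committed_tb, used_tb, rate_per_tb):
    """Cost of committed-but-unused capacity per month."""
    unused_tb = max(committed_tb - used_tb, 0)
    return unused_tb * rate_per_tb

# A 100 TB commitment at a discounted $20/TB, but only 60 TB in use:
waste = monthly_waste(committed_tb=100, used_tb=60, rate_per_tb=20.0)
print(f"Wasted spend: ${waste:.2f}/month")  # Wasted spend: $800.00/month
```

Even at the discounted rate, the unused 40 TB is money spent on empty space every month.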

Another major challenge to the current cloud data storage model is data security. Most organizations turning to cloud storage solutions hold a mix of data that often includes sensitive and protected personal information they have a responsibility to keep protected, such as PHI (Protected Health Information), PII (Personally Identifiable Information) and data regulated under the GDPR (General Data Protection Regulation). Unfortunately, while current database systems can encrypt stored data, this encryption is carried out in a way that anyone (human or machine) with access to the system at any administration level generally also has access to the plain, unencrypted data. This design flaw leaves a "come get me" sign that has led many diverse organizations to become victims of data theft and lose millions of dollars.

So now that we have highlighted what is missing from the current cloud storage systems, what is the solution?

Enter BOHH Labs, introducing the next phase of data storage: Secure Data as a Service (SDaaS), which focuses on the actual security of the data and removes wasted spend with a storage consumption model. SDaaS acts as a layer between the user/application and the back-end data store and enables total security on all your stored data, without changing the data structure, while making certain data points visible only to those with the correct permissions. Whether the data lives in a database or a document, BOHH SDaaS enables full use of it without the security concerns. This solution uniquely offers the database- or field-level security that businesses desperately need. All data that needs to be secured is removed from the source, encrypted, and stored separately without changing the structure, enabling prioritization and control over sensitive data such as PHI, PII or GDPR-regulated data. This stops the inside hacking job, and it also lets companies choose which data to store with full knowledge of its confidentiality and sensitivity. It kills the flaws within the current storage market and enables stored data to be securely opened to the Cloud without putting it at risk of breach.
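The general pattern described above (sensitive fields removed from the source record, held separately, and revealed only to permitted users) can be sketched in a few lines. This is a simplified illustration, not BOHH's actual implementation: the `SecureVault` class, the in-memory store, and the permission model are all assumptions, and a real system would encrypt the vault's contents rather than hold them in plain memory.

```python
# Minimal sketch of field-level data separation: sensitive values are
# replaced with opaque tokens in the source record, so its structure is
# unchanged, while the real values live in a separate, permission-gated store.
import secrets

class SecureVault:
    def __init__(self):
        self._store = {}  # stands in for a separate, encrypted data store

    def protect(self, record, sensitive_fields):
        """Replace sensitive values with tokens; keep the record's shape intact."""
        for field in sensitive_fields:
            token = secrets.token_hex(16)
            self._store[token] = record[field]
            record[field] = token  # same field, same structure, no real data
        return record

    def reveal(self, record, field, user_permissions):
        """Return the real value only to users holding the right permission."""
        if field not in user_permissions:
            raise PermissionError(f"no access to {field}")
        return self._store[record[field]]

vault = SecureVault()
patient = vault.protect({"name": "Jane Doe", "ssn": "123-45-6789"}, ["ssn"])
print(vault.reveal(patient, "ssn", user_permissions={"ssn"}))  # 123-45-6789
```

An administrator browsing the source record would see only an opaque token in the `ssn` field, which is the point: access to the system no longer implies access to the plain data.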

All of this is done without impacting user accessibility, and it introduces secure storage as a consumption-based model, rather than the current pay-as-you-go approach.

Thursday 24 May 2018

What’s the Point of Analytics if You Can’t Access Them?


- Becca Bauer, Director of Marketing & PR

In the workplace, data has become the golden ticket for companies to drive sales and stay competitive. However, much of the focus has been on the development of analytics, using data insights gathered on your marketing, sales, customers, products, new leads and so on to grow market share. All of this sounds good – hey, it's basically free marketing advice generated from your own information – but while this focus serves marketing and finance purposes, it fails to address the fundamental need: how we actually access those insights.

Big amounts of data are nothing new. Data has always existed – granted, not in the quantities we have today with all of our applications and collaborative sharing tools, but document management systems have held large amounts of data since the 80s, and email storage has been "big" since the late 90s. The nonstop data produced by everything from web history, emails and documents to contracts and CRM systems continues to grow daily, even hourly, enabling organizations to access corporate knowledge that is more relevant and targeted than ever. And as we move forward, there is no foreseeable end, because we as a digital society constantly produce a massive amount of data that needs to be housed somewhere.

One major issue many companies are having is that they are looking to Cloud vendors and deployments to store historic or archived data in the vendor's infrastructure, as opposed to the company's own data center, and this often means recent data, or even data just 12 months old, gets archived. The problem is, once the data is stored, how can companies easily access it to extract business value or analytical insight and remain competitive? This is a big need! After all, how are companies supposed to take a targeted approach to their data and see if patterns emerge that can be applied to high-value business decisions to increase their bottom line if it is all archived and not easily accessible?

Access to data is the critical function here. Add to that the fact that securely accessing stored data is an increasing challenge for companies, as the data they want to examine often contains confidential information such as PII (Personally Identifiable Information) and PHI (Protected Health Information), and it is difficult for companies to guarantee the security of this sensitive data when users access it.

So, what is the solution to putting your analytics and insights to use?

This is where BOHH Labs can step in and help. Our Secure Storage as a Service (SSaaS) acts as a layer between the user/application and the back-end data store and enables total security on all your stored data. This is done without changing the data structure, while making certain data points visible only to those with the correct permissions. Whether the data lives in a database or a document, the BOHH SSaaS solution enables full use of data without the security or accessibility concerns.

Our solution uniquely offers database- or field-level security. All data that needs to be secured is removed from the source, encrypted, and stored separately without changing the structure, enabling prioritization and control over sensitive data such as PHI, PII or GDPR-regulated data. If companies know which data fields or rows contain sensitive data, they can better protect those fields to ensure business compliance and let the data be utilized by a wider audience to extract greater business value or insight, while still securing it and providing access only to those with the correct privileges to see it.

As the enterprise market continues to become more digital and the amount of data we produce rapidly grows, it will be even more important that secure access to this data is simple. The ability to manage your continually growing data and extract more accurate results to deliver valuable analytics will be critical for businesses to stay competitive.

Tuesday 22 May 2018

Why a Cloud Consumption Model Should Replace Pay as You Go for Data Storage



- Alan Jamieson, VP of Business Development


In a previous blog, we highlighted why planning your Cloud storage requirements is the only way to ensure your company achieves operational savings moving to the cloud.  Today, we are going to look at the various consumption options:

Pay as you Go

Subscription-based commercial models (fixed term, with a monthly fee per user or unit) have been around for several years, driven by Customer Relationship Management vendors such as Salesforce, Infrastructure as a Service (IaaS) vendors such as VMware, and Information Technology Service Management (ITSM) vendors such as ServiceNow, who all enable us to pay for the services we use, typically based on user-number pricing bands. This pay-as-you-go approach has also been widely adopted by the leading Cloud vendors that companies are turning to in order to streamline operations and offerings. However, this approach is more limited when you look at Cloud data storage. Typically, Cloud vendors attract companies to their Cloud infrastructure by charging lower rates for storing historic or archived data there, as opposed to in the company's own data center. This often means recent data, or even data just 12 months old, gets archived. The problem is, once the data is stored, how can companies easily access it to extract business value or analytical insight and remain competitive? This is a big need!

Research also shows that companies are paying for storage they don't currently need, as they either have too much in-house storage capacity or have estimated and invested more than their businesses need today for Cloud storage. Regulation is often the main driver for companies to retain transaction and customer data for a defined number of years, but without a data retention policy, storage investments in this area can be an unnecessary expense. Companies should only retain data for specific periods of time; exceeding these periods is an overhead to the business.
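A retention policy ultimately boils down to a simple rule that can be automated. The sketch below is illustrative: the seven-year period and the record layout are assumptions, not a reference to any specific regulation:

```python
# A sketch of enforcing a data-retention policy: records older than the
# mandated retention period are purged rather than paid for indefinitely.
from datetime import date, timedelta

RETENTION = timedelta(days=7 * 365)  # e.g., a hypothetical seven-year period

def purge_expired(records, today):
    """Keep only records still inside the retention window."""
    return [r for r in records if today - r["stored_on"] <= RETENTION]

records = [
    {"id": 1, "stored_on": date(2010, 1, 1)},  # past retention: dropped
    {"id": 2, "stored_on": date(2017, 6, 1)},  # inside window: kept
]
print([r["id"] for r in purge_expired(records, date(2018, 5, 22))])  # [2]
```

Running a job like this on a schedule keeps storage spend tied to what the business is actually required to hold.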
It's clear there is a struggle to find the right balance: leveraging the Cloud to streamline the collection and storage of a company's ever-growing data without wasting money. As detailed above, the current standard pay-as-you-go method is not set up to help companies move their storage to the Cloud cost-efficiently.

So, what is the solution?

BOHH Labs believes introducing a consumption-based model to the storage market can help companies maximize the benefits of moving storage to the Cloud without paying for resources they don't actually need. Subscription (consumption-based) fixed-term agreements are paid monthly or quarterly and can help businesses start achieving operational savings with small initial investments that grow through greater use of the service and increased user adoption over time. This approach allows companies to pay for the resources they need, without overestimating resources that roll into wasted costs, yet still allows them to expand as they grow. This model benefits all companies, from those with a small number of employees to global enterprises with hundreds of thousands, all using the same services.

Within the consumption model, we believe it needs to be split into two areas:

  1. Data storage:
    As discussed above, companies should choose the right data storage period and commercial model to support their individual businesses, and thus pay for storage based on volume or data retention period.
  2. Data access:
    This is an area that is included in user subscription agreements such as CRM, but if not, it is an area all companies need to explore to understand how they can gain business value from the data they store. Stored data serves companies no purpose if it cannot be accessed easily to leverage insight and analytics to apply to business decision-making.
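The two areas above translate into a two-part bill: one charge for the volume and term of storage, another for access. The rates and formula below are hypothetical placeholders to show the shape of the model, not actual BOHH pricing:

```python
# Hedged sketch of a two-part consumption bill: storage (volume + retention
# term) plus access (per query), rather than a flat per-seat subscription.

def monthly_bill(stored_tb, retention_months, queries,
                 storage_rate=5.0, retention_rate=0.5, query_rate=0.01):
    storage = stored_tb * storage_rate                          # pay for volume held
    retention = stored_tb * retention_months * retention_rate   # longer terms cost more
    access = queries * query_rate                               # pay per access, not per seat
    return storage + retention + access

# 10 TB held on a 12-month term, with 5,000 queries this month:
print(monthly_bill(10, 12, 5000))  # 50 + 60 + 50 = 160.0
```

The point of the split is that a company storing much but querying little, or vice versa, pays in proportion to what it actually consumes.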

What Does this Look Like? 

When companies run a marketing campaign, they typically include all their active target customers to help ensure maximum return from the planned campaign. However, when a global financial services firm or health provider needs to make longer-term investment decisions based on several years of historic data, securely accessing this stored data, which often contains confidential information such as PHI (Protected Health Information) and PII (Personally Identifiable Information), is not possible, as they cannot guarantee the security of the sensitive data accessed by business users.

BOHH Labs has identified this business challenge and has created a Secure Storage as a Service solution that ensures all stored sensitive data remains secure and confidential. If you know which data fields or rows contain sensitive data, BOHH Labs protects these fields to ensure business compliance. The BOHH service leverages a consumption model to provide a secure way to utilize your non-compliant data and recover the cost of its storage as its value is extracted. Protecting the compliant data, securing it and providing access only to those with the correct privileges to see it, allows non-compliant and non-sensitive corporate data held for longer periods (often years) to be utilized by a wider audience to extract greater business value or insight.


Thursday 17 May 2018

Post Industry Event POV: It’s Clear Customers Are Fed Up with New Security “Add-Ons”


BOHH Labs VP of Business Development shares his thoughts on BOHH's attendance at the GDS Security Summit.

Last week BOHH Labs attended GDS's Security Summit in Atlanta, GA. The summit was well attended by CISOs and VPs of Security from global enterprise accounts in the finance, healthcare and manufacturing sectors, among others.

It was evident that global enterprise accounts across all market sectors are looking for innovative ways of protecting their critical business and data assets. The desire to move from data center to cloud is prevalent and often cost-driven, but mitigating the risk of data breaches is the main obstacle organizations must overcome. Additionally, as part of the discussion on security challenges, there is an increasing need to avoid further data breaches and meet new compliance regulations: Friday, May 25, 2018 is fast approaching, when the new General Data Protection Regulation (GDPR) becomes effective and impacts all global companies with European Union (EU) customers.

One thing was made very clear: Companies and customers have been searching for new solutions and are keen to avoid buying another security solution that is layered on existing security investments. They are fed up with the security industry “add-ons” for promised enhanced security. The same old vendors are promising new and exciting solutions to protect companies and their data, yet time after time major breaches keep happening.

BOHH Labs was honored to take part in this industry discussion and present its new solution to help address the security challenges plaguing customers and organizations alike. We were delighted with the reaction to our new Secure Storage as a Service (SSaaS), which enables companies to protect compliant and confidential data fields while importantly enabling analytical insights to be gained without the risk of a data breach.

We demonstrated our service on a cloud platform to highlight that our solution is platform agnostic and to show how compliant plus non-compliant data residing in a Hadoop data platform could be searched for analytical insights, with the compliant data fields (i.e., PII, PHI) accessible only to authorized users.

Seldom has the BOHH Labs team, comprising Simon Bain (CEO), Ken Hawkins (CTO) and Alan Jamieson (VP Business Development), seen such a positive and enthusiastic reaction to new technology. Perhaps it was showing Secure Storage as a Service working on a cloud platform and accessing data in a Hadoop data platform that validated how powerful and, importantly, how relevant our service is. We left the summit upbeat, with our message and solution resonating with most of the conference attendees.

All in all, it was a great industry event. If you want proof and to learn more, contact us and we will be happy to run a POC on your storage system to show how we secure, scale and enable easy access to your valuable information stores.

Tuesday 8 May 2018

Security, so what is it exactly?


- BOHH CEO Simon Bain

Talking to customers, vendors and the great and the good of the industry, it is no surprise that we seem to have a data security issue at the moment. Perhaps not the obvious one of data being stolen, though, but one of describing what security actually is!

These two quotes from Sridhar Muppidi, who serves as VP and CTO of IBM Security, are taken out of context, but they sum up a large sector of the technology industry's view on security:

  1. "IBM Security is a division that focuses on keeping the bad guys out and the good guys in, it's as simple as that," Muppidi said. 
  2. "It's a discipline," Muppidi said about security. "It's a discipline that can be morphed into a program, a set of practises, solutions and products."

See them here: http://www.eweek.com/security/ibm-security-cto-details-how-cyber-security-fits-into-ibm-portfolio

While there may not be anything fundamentally wrong with these two statements, I believe they totally miss the point: they turn the security of customers' data, documents, and corporate secrets into a commodity for IBM to play with and, worse, trivialize the issues.

I do not believe that security is just a discipline. Yes, users do have to learn how to treat data and how to help themselves. But we in the industry must start to look at security in a different light. Security is privacy, and we should help maintain the privacy of data, not just try to keep the "bad guys out"; after all, many, if not most, hacks are insider-initiated, not external. In that case, you keep the "bad guy out" by not employing them!

We need to start talking privacy and looking at ways of how we can truly keep data private both from insider threats as well as external ones. 

Threat detection is all but useless for privacy at the point of attack; it is a great learning resource for working out how to secure data after the fact.

Threat prevention is the only approach that can work, but it is multifaceted. We do need to look at keeping "the bad guys out," but not just out of the network; also out of the data. And the "bad guys" are not just external people (guys and gals); they may also be inside. So we need to make sure that the data is secured in a way that makes it usable, but also completely private and away from prying eyes, whether those belong to a system admin or someone who has been given admin permissions to do some data cleansing.

Security is not about creating back-to-front detention centers where one group is kept out and another is kept in! It is about privacy of information.

Security must also not get in the way of people's working tasks. Otherwise, they will either circumvent it, or they will be unable to do their jobs and find themselves without work.

As such, I believe our job as technologists is to make this possible: not just talk about it, not create long, overly lawyered disclaimers, but actually build applications that create a full privacy zone where data can be utilized free from fear that sensitive data will be lost or stolen.

Only time will tell where the industry is headed and how we as a collective group approach security.

Tuesday 1 May 2018

Least-Privilege Security – Why and How


- Greg Gray, BOHH Labs Senior Software Engineer

Your users only need access to the resources required to do their jobs. It's a simple concept that has been a best practice for many years. Simple in concept, not so simple in execution.

To make this concept work, I believe there are three areas where least-privilege security must be implemented: users, applications, and processes. Below we will take a deeper dive into each one.

Users
Users are given access to machines (e.g., their workstations) from which they perform their jobs. In many corporate environments, users are not given the privilege to install software. The applications they are allowed to run might also be controlled. Even access to some areas of the local file system might be necessarily restricted to maintain the integrity of the running machine.

Software developers typically need access to their own workstations and to remote databases and servers. Since many of these needs are project-based, the privileges should be granted to get the project done, then revoked when they are no longer needed. This revocation is often missed, giving some users who have worked on many projects access to many more resources than they need at any one point in time, and the opportunity to misuse data they can access but should not.

System administrators need access to systems at a high level of privilege, but does each administrator need access to all systems at that level? In smaller environments, probably yes, but in large environments with many administrators, some narrowing of the privilege scope is warranted and is ultimately a security best practice.

A disgruntled employee with privileges to many systems is a danger to the company's assets should this person be fired. A protocol needs to be established for quickly removing such an employee's privileges when needed.

Applications and Processes
Applications need privileges to access file systems, local databases, and remote databases and systems. Sometimes privileges are granted to an application early in the development life-cycle that are no longer needed later on.

These privileges should be pared back as needed so the application doesn't go into production with unnecessarily liberal access to database or other company resources. If the application doesn't create or drop tables, it shouldn't have that privilege. There may also be different roles for users of an application: a role that only does reporting probably doesn't need data-modification privileges.
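The role-based paring described above can be sketched as a simple policy table plus a check. Role names and privilege strings here are illustrative assumptions; in practice the same mapping would live in the database's GRANT system or a central directory:

```python
# Minimal sketch of per-role privilege paring: a reporting role is read-only,
# and DDL privileges like DROP never reach the production role at all.

ROLE_PRIVILEGES = {
    "developer_dev":  {"SELECT", "INSERT", "UPDATE", "CREATE", "DROP"},
    "app_production": {"SELECT", "INSERT", "UPDATE"},  # no DDL in production
    "reporting":      {"SELECT"},                      # read-only
}

def authorize(role, privilege):
    """Allow an operation only if the role's policy grants it."""
    return privilege in ROLE_PRIVILEGES.get(role, set())

print(authorize("reporting", "SELECT"))      # True
print(authorize("reporting", "UPDATE"))      # False
print(authorize("app_production", "DROP"))   # False
```

The key property is the default: an unknown role or an unlisted privilege is denied, rather than allowed.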

User Privileges
Implement Centralized Login:
The first thing that needs to be done to get control of user accounts and privileges is to establish a centralized login system. In a Microsoft Windows environment, this usually begins with the Active Directory Domain Services. In a Linux/Unix environment, an LDAP server can provide for central management of logins and permissions. An Active Directory service can also act as an LDAP server for Linux/Unix environments.

Active Directory Domain Services actually encompass multiple services that can be used for privilege management and maintenance:

  • Domain Services – Provides authentication and centralized storage for users and data. Also provides search functions.
  • Lightweight Directory Services – Provides for directory-enabled applications using the LDAP  protocol.
  • Directory Federation Services – Provides single-sign-on (SSO) to authenticate users to multiple applications and devices.
  • Certificate Services – Provides for the creation, distribution and management of secure certificates.

Kerberos is an authentication protocol that uses the concept of tickets to allow machines communicating over an insecure network to provide proof of identity. Kerberos is commonly used for SSO in Linux/Unix environments, but Microsoft also has a Kerberos system available.
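The core idea behind Kerberos-style tickets can be sketched with the standard library. To be clear, this is not the Kerberos protocol (which involves a KDC, session keys, and mutual authentication); it only illustrates the concept of a time-limited, signed ticket that proves identity without resending credentials. The shared key and TTL are illustrative assumptions, and real key handling would be far more careful:

```python
# Conceptual sketch of ticket-based authentication: a trusted issuer signs a
# time-limited claim with a key the verifier shares, so identity can be
# proven over an insecure network without resending the user's credentials.
import base64
import hashlib
import hmac
import json
import time

SHARED_KEY = b"key-shared-with-the-ticket-service"  # illustrative only

def issue_ticket(user, ttl_seconds=300):
    """Issue a signed ticket that expires after ttl_seconds."""
    payload = json.dumps({"user": user, "expires": time.time() + ttl_seconds})
    sig = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    return base64.b64encode(f"{payload}|{sig}".encode()).decode()

def verify_ticket(ticket):
    """Return the user name if the ticket is genuine and unexpired, else None."""
    payload, sig = base64.b64decode(ticket).decode().rsplit("|", 1)
    expected = hmac.new(SHARED_KEY, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # forged or tampered ticket
    claims = json.loads(payload)
    if claims["expires"] < time.time():
        return None  # expired ticket
    return claims["user"]

ticket = issue_ticket("alice")
print(verify_ticket(ticket))  # alice
```

The expiry is what limits the damage of a stolen ticket, which is why real Kerberos tickets are short-lived and renewable.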

Centralized login systems aren’t just for users. Applications and Databases can use them, too. As an example, MySQL Enterprise can be setup to use Pluggable Authentication Modules (PAM) to authenticate using an LDAP, Kerberos, or other authentication source. Other systems can use Kerberos without PAM.

Establish Policies
Once a central login system is established, policies need to be established that outline what is permissible for each user, application, and process. Users will probably have a blanket policy that covers all users for access to their workstations, email servers, etc. System administrators will have their own policy that extends their permissions for their unique role in the security realm.

Applications and processes will have their own unique policies that will likely vary from application to application. Database permissions for software developers go through phases: early in a project, access is more permissive, while after the application is in production the permissions might disappear altogether. The application must also have a documented policy defining the permissions it needs to perform its functions. These varying permissions can be grouped into roles and assigned to each user as needed; the centralized login system can store and manage these roles.

Audit
Perhaps the most important action to take for least-privilege policies is auditing the systems to ensure that the policies are being enforced. Audits should be scheduled on a regular basis, in addition to a few surprise audits. Auditing not only ensures that the policies are being enforced but also reminds the user community what the policies are and that they matter to the company.
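At its core, an audit compares what is actually granted against what the written policy allows and flags the excess. The sketch below is illustrative (user names and privilege strings are assumptions), but the set-difference logic is the heart of any such check:

```python
# Sketch of a least-privilege audit: flag every privilege a user actually
# holds that the written policy does not grant them.

def audit(policy, granted):
    """Return, per user, the privileges that exceed policy."""
    findings = {}
    for user, privs in granted.items():
        excess = privs - policy.get(user, set())
        if excess:
            findings[user] = excess
    return findings

policy  = {"alice": {"SELECT"}, "bob": {"SELECT", "UPDATE"}}
granted = {"alice": {"SELECT", "DROP"}, "bob": {"SELECT", "UPDATE"}}
print(audit(policy, granted))  # {'alice': {'DROP'}}
```

Run regularly, a check like this catches the privilege creep described earlier, such as project access that was never revoked.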

Automation
Least-privilege security management is not an easy task. In small organizations it can probably be managed by a small number of people (at least two), but in larger organizations the task becomes onerous. Luckily, a number of third-party products provide automated services to bolster your privilege management efforts. These services run the full gamut, from automated probing of systems for vulnerabilities to performing the audits necessary to ensure policy compliance.

It is not our aim to promote any third-party products so, using your favorite search engine, search for “least privilege security” for a list of such products.

Least-Privilege Security has a vital role in protecting your company’s systems and resources, in other words: your company’s bottom line. Users need clear policies to outline what is expected of them. Application developers need clear policies to ensure a secure environment in which to run their applications and access their databases. These policies need to be enforced to ensure there are no holes in the systems or in the policies themselves. Finally, verifying the policies are being used as intended via audit is the final necessary step to securing and protecting your company’s resources.