Thursday 6 September 2018

Continuous Web Application Attacks Prove the Need for a New, Proactive Security Approach


- Greg Gray, BOHH Labs Senior Software Engineer

While being proactive about security has always been a choice in the realm of web application deployment, it is only in recent years that it has become a priority over just reacting to security breaches. Too often we don’t discover the problem until after we have been hacked. This blog post will investigate methods for attack prevention rather than cleanup.

More and more we are seeing job postings for IT Security Managers. This is an important step in the development of a proactive security approach. Committing time and resources to the problem makes it clear to everyone up and down the organizational chart, and to the board and stockholders, that security is a high priority. It also ensures that there is consistent leadership for securing the company’s resources.

So, what does the IT Security Manager do? First and foremost, working closely with other management entities and IT personnel, they establish the policies for maintaining a secure work environment. This includes policies for passwords, application access permissions, workstation application restrictions, etc. These policies set the baseline for everyone’s contribution to the company’s security. They must also spell out IT’s role in enforcing them and, with Human Resources’ contribution, the procedures for dealing with violators.

While these policies are being developed and implemented, the Security Manager should also focus on what is necessary for the IT staff to implement the policies. One way to divide up the responsibilities is into several groups, for example: workstation, network, and development.

The workstation group is responsible for ensuring workstation policies are enforced by using a set of tools specific to this task. Virus protection, scans for unauthorized software, and port scanning are some examples.

The network group is responsible for protecting important systems from attacks originating both outside and inside the business. Auditing is a primary line of defense and includes such tools as Nessus, Nmap, SAINT, and lsof (list open files). Monitoring tools are also important to the network group; some will report attacks as they occur, such as Tripwire, Logcheck, and ZoneAlarm.
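
As a rough illustration of how the network group might put such auditing tools to work, the sketch below (in Python, assuming the nmap binary is installed and on the PATH) runs a recurring service scan against a couple of placeholder hosts and saves each report so successive audits can be compared. The host names and port range are assumptions for illustration, not recommendations.

    import subprocess
    from datetime import datetime

    # Placeholder inventory; substitute the hosts and ports your policies cover.
    TARGETS = ["dmz-web-01.example.com", "dmz-mail-01.example.com"]
    PORTS = "1-1024"

    def audit_host(host):
        """Run a service/version scan against one host and return the raw nmap output."""
        result = subprocess.run(
            ["nmap", "-sV", "-p", PORTS, host],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    if __name__ == "__main__":
        stamp = datetime.now().strftime("%Y%m%d-%H%M")
        for target in TARGETS:
            report = audit_host(target)
            # Keep every report on file so changes between audits can be diffed later.
            with open("audit-{}-{}.txt".format(target, stamp), "w") as fh:
                fh.write(report)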

The development group has a key role that encompasses the entire spectrum of attack vectors. This begins with the deployment process where the integrity of the software build is critical. Applications should be contained in their own isolated environments so that one compromised application doesn’t affect another.

Possibly the most important protection to put into place involves customer data. This is the root of the data security problem. Data encryption with keys, tokenization, and newer approaches, such as BOHH Labs’ Secure Data as a Service, are necessary to ensure data integrity and prevent its dissemination to the world after a successful data breach.

The IT Security Manager will also need to remain vigilant for old and new attack vectors via up-to-date cyber security threat reports. Each of the security subgroups should watch for old and new attacks pertaining to their areas of support. 

After all new policies are in place, or maybe even before they are in place, the IT Security Manager should engage a company that does independent security audits. The results may uncover areas that no one thought would be an issue, or that everyone assumed were another group’s responsibility.

Vigilance is key to preventing data theft and protecting customer and company resources. The successful effort may go unnoticed. It’s a “no news is good news” situation. But at the end of the day, when the reports to the company directors shift from reporting the number of breaches to reporting the number of failed attacks, the ideal of a Proactive Security Approach may be fully realized.

Thursday 30 August 2018

Post Industry Event POV: Security Concerns Are Still Challenging Enterprise Progression



Last week, several of BOHH Labs’ leaders (CEO Simon Bain, COO and SVP of Business Development AJ Jennings, and SVP of Partner Engagement Marina Simonians) attended the inaugural Arrow Technology Summit in Denver, a premier event for IT Value Added Resellers (VARs), Managed Service Providers (MSPs) and Systems Integrators (SIs).

The Arrow Technology Summit is for IT partners and solution providers who are actively seeking the information, opportunities and relationships that will grow their bottom line. ATS brings together leading industry experts to give you the actionable information and perspectives you need to grow your IT business.

At BOHH Labs, we are big fans of industry events, as they are a great opportunity to understand market trends and challenges, share information, learn from peers, and speak directly to people on what problems they need addressed and what they are currently lacking. ATS was no different and was a great summit focused on bringing companies together to address the changing digital enterprise landscape.

BOHH Labs was honored to not only attend but be a sponsor at the event and showcase our breakthrough technology, Intelligent Secure Data as a Service (SDaaS), that enables businesses to strengthen their offerings to data-driven industries, especially in the areas of security, edge computing, the internet of things, hybrid cloud, data intelligence, analytics and next-gen data centers.

One of the most interesting takeaways was that all the companies we met with are plagued by the same challenge: they want to embrace digital innovation in their solutions and help customers unlock the value of their data analytics, yet they are unable to securely open up their data. We heard countless stories from senior executives wary of current data security offerings that promise protection yet rely on application-level security, data masking applications and other encryption algorithms that have been shown to leave massive security gaps. We heard everything from those terrified of losing keys, to those anxious about opening up databases to the cloud, and even those unsure of how to back up their databases.

It is evident enterprises are looking to increase their value with innovative digital services and maximize the value of their data assets. However, data security is challenging enterprise progression. As part of these conversations, we were delighted with the reaction our SDaaS solution received in addressing these setbacks.

SDaaS uniquely protects all data, while maintaining security for searching, analytics, and use, unlocking valuable data to become an asset for companies and preventing data breaches. Our service acts as a layer between the user/application and the back-end data, and has no reliance on a keystore, which resonated with many who are wary of keeping encryption keys that have proven ineffective. Instead, it uses an AI engine to manage the encryption process, eliminating the possibility of reaching the data by attacking a keystore. SDaaS encrypts down to the sub-field level, with each encryption process handled by a uniquely derived key which is never stored, eliminating internal and external threats while protecting data at rest, in transit, and in use.
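
BOHH has not published the internals of SDaaS, but the general idea of a uniquely derived, never-stored key can be sketched in a few lines of Python using the open-source cryptography package. Everything below (the field names, the choice of HKDF and AES-GCM, the master secret) is an illustrative assumption rather than our implementation; the point is simply that only the salt, nonce and ciphertext are persisted, while the key itself is derived on demand and discarded.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def derive_field_key(master_secret, record_id, field, salt):
        """Derive a one-off 256-bit key bound to a single record/field; it is never persisted."""
        return HKDF(
            algorithm=hashes.SHA256(),
            length=32,
            salt=salt,
            info="{}:{}".format(record_id, field).encode(),
        ).derive(master_secret)

    def encrypt_field(master_secret, record_id, field, value):
        """Encrypt one field value (bytes) under a freshly derived key."""
        salt = os.urandom(16)
        nonce = os.urandom(12)
        key = derive_field_key(master_secret, record_id, field, salt)
        ciphertext = AESGCM(key).encrypt(nonce, value, None)
        # Only salt, nonce and ciphertext are stored; the derived key is simply discarded.
        return salt, nonce, ciphertext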

As a result, SDaaS enables enterprises to securely deploy innovative applications, cloud services, and analytics that they desperately need, without opening their data to massive, widespread, and malicious security threats. 

Overall, it was a very successful event with over 300 companies attending and we gave 60 demos on how our solution is uniquely suited to enable partners and resellers in their efforts to leverage valuable new insights on customer and company data, while preventing internal and external data breaches. We left the summit upbeat with our message and solution resonating with most of the conference attendees.

Tuesday 14 August 2018

How the BOHH Labs/Approyo Partnership Can Accelerate the Transition to SAP HANA in the Cloud


The enterprise technology landscape is rapidly changing, and more companies are embracing digital transformation and integrating cloud services to enhance business agility and efficiency and to extract analytics. In fact, SAP, a technology leader in business applications, has pegged 2025 as the deadline for terminating support for its ECC6 on-premise solution.

For businesses looking to migrate to more advanced SAP HANA systems in the cloud, this may seem like a long way off, but the deadline shouldn’t be the only driving force behind your migration strategy. Cloud-based SAP HANA solutions offer businesses a vast array of benefits, as they look to harness the potential of digitization and business transformation.

However, digital transformation is an evolving market and there is not a one-size-fits-all solution. For businesses to be successful in their transformation to cloud-based SAP HANA solutions, there must be collaboration, and Approyo and BOHH Labs have teamed up to offer SAP customers seamless migration to SAP HANA cloud environments. Approyo CEO, Chris Carter, and BOHH Labs CEO, Simon Bain, illustrate below how strategic partnerships can create a best-in-breed solution that brings success for the end customer.

Chris shares:
As part of the push to move to the cloud, SAP has developed SAP S/4HANA, which is built on the advanced in-memory platform, SAP HANA, and is an entirely new generation of SAP Business Suite. It fundamentally redefines how enterprise software creates value across industries with instant insight. SAP S/4HANA also personalizes the user experience on any device and natively connects to Big Data, the Internet of Things, and business and social networks - all in real time.

Like all new things, especially in technology, the updates can be daunting. Approyo can help businesses move SAP applications, such as SAP S/4HANA, to the cloud with tailored implementation roadmaps to ensure every organization undergoes a seamless migration. We are one of the only SAP partners in the world that can migrate SAP applications to the cloud, provide ongoing support and provide long term managed services for hosted cloud environments.

Simon shares:
While Approyo specializes in making SAP migrations smooth from an implementation standpoint, we complement their capabilities by enabling better security and access to data stored in SAP HANA. We bring security to the data and the business suite, so customers can get simple access to their complex data. We provide total security on all stored data by uniquely providing database or specific field security, enabling prioritization and control of sensitive data. BOHH’s security service provides full text search capabilities, even on secured data, and supports Bot technology so data can be conveniently accessed, bringing enhanced access flow to data within the system.

Chris adds: 
Often companies are hesitant to migrate completely to cloud deployments due to security, performance disruption and accessibility. This includes migrating to the new SAP S/4HANA platform. The BOHH/Approyo partnership is reducing these setbacks.

Simon concludes:
By partnering together, Approyo and BOHH Labs can address customer demands and focus on what our enterprise, integrator and reseller customers need to secure access to their business-critical applications and data in the cloud with minimal disruption. The joint solution between Approyo and BOHH Labs is addressing the challenges enterprises face migrating to the cloud and enables SAP customers to leverage the analytics capabilities of SAP HANA, while gaining a secure method to easily access all enterprise systems to quickly find, search and unlock the value of all their data.

Tuesday 7 August 2018

Why Data Access is the Heart of a Competitive Business


- Alan Jamieson, VP of Business Development at BOHH Labs

Businesses are evolving and starting to quickly realize that the data they retain has value to the company, but only if they can leverage it. In fact, some global analysts predict that by 2020 data will be listed as a company asset, giving it significant monetary value.

Some businesses already have a data-driven strategy that leverages rich customer data assets to find new revenue streams and business opportunities, or that uses machine performance data to improve efficiency through fewer machine breakdowns and consistently high production levels, and it is already giving them a competitive advantage.

Let’s reflect on Fortune 1000 companies over the last decade. Globally, our traditional Fortune 1000 companies are changing, and a significant percentage have fallen off the global stock markets or ceased trading due to competition from newer entrants. New online banks have set up business and are winning business from global banking organizations, as they offer more cost-effective services and are better able to understand their customers. These new banks want customers to use them for multiple services, and customers, especially the younger generation, expect more convenient, real-time, 24-7 answers. Why does this matter? Customer experience is a key business metric. How customers interact with and access their data is important for all parties, and traditional banks are not always of the same thinking, or, importantly, don’t have the technical systems to know their customer. Additionally, newly formed companies don’t have legacy infrastructure to consider and are often more agile in their approach.

The greater the knowledge businesses have access to through real-time interactions and many years of customer engagement, the greater the analytical visibility they can gain into their customers or business operations, which helps them make informed decisions on how to extend or develop those relationships through new services and products.

It’s clear our global use of digital technologies, both in our consumer and work lives, is helping companies get more data on customers/users and their behaviors. The business world is starting to analyze those large volumes of data to drive greater customer and business insight. However, in our increasingly connected world, the variety of data – compliant, sensitive and confidential, plus the volume of data that is being produced, available, and perhaps more importantly, collected, is impacting and challenging many global companies. Data breaches are still occurring on a frequent basis and are a key security consideration in how data is accessed and protected today.

To handle today’s influx of structured and unstructured data, businesses use real-time data warehouses and often archive or store data over 12 months old to reduce storage or operational costs. While data volumes are growing annually, the overall mix of data types creates the biggest challenge. How do you ensure that authorized users can access sensitive information like pricing, compliant data like Personally Identifiable Information (PII) and Personal Health Information (PHI), and confidential corporate data, while other business users can run analytics on the overall dataset without seeing the protected data?

BOHH Labs has developed a solution that enables businesses to keep data access at the heart of driving competitive business, with our Secure Data as a Service acting as a layer between the user/application and the back-end data and enabling total security on all your stored data. It protects the compliant or sensitive data fields so they can be accessed only by specific users, while importantly enabling a wider community of business users to access the data for analytical purposes. Decisions made with greater sets of data are better-informed decisions. Secondly, as our service includes a secure conversational bot, we help companies with an interest in customer self-service initiatives. Customers authenticate through existing applications and securely access their specific information using voice rather than a keyboard, which improves the user experience and customer service, and drives competitive advantage.

Tuesday 31 July 2018

Detection is Not Enough Protection



- Becca Bauer, Director of Marketing & PR


Another day, another breach. For today’s purposes, let’s look at the recent data breach at Dixons Carphone, where the names, addresses, and email addresses of anywhere from 1.2 million to 10 million users were exposed. While the breach only came to light recently, after GDPR came into effect, it actually occurred back in July 2017. That’s right – for just short of a year, the company had NO idea it was subject to a data breach.

While details on the how, who, and why of this particular attack are still coming to light, it does underline the fact that breach detection is not protection. In fact, in a recent study sponsored by IBM Security with research independently conducted by the Ponemon Institute, the 2018 Cost of a Data Breach Study finds that the Mean-Time-to-Identify (MTTI) a breach is 197 days, and the Mean-Time-to-Contain (MTTC) is 69 days. This means that, on average, it takes more than half a year to identify a breach! Just imagine how much data an attacker could get in that amount of time while going unnoticed.

This figure is unacceptable, especially since the security industry has seen an influx of support for threat detection tools over the last several years. These range from network threat detection for understanding and monitoring traffic patterns, to endpoint threat detection for tracking information and behaviors on user machines, to popular threat intelligence tools like AI and ML, valued for their self-learning capabilities and ability to recognize patterns and anomalies.

Unfortunately, the industry has made people believe that detection alone can work. We are not saying that no detection solutions work or that they should be removed from your security strategy altogether, but it’s clear detection alone is not enough. What we need is a new way to protect our data.

At BOHH, we believe the core focus must be on protecting the data at the foundation level. Given that a business will easily spend millions on data protection solutions, it only makes sense to secure the data itself as it comes through and sits in your database. BOHH Labs has developed a Secure Data as a Service (SDaaS) solution that acts as a layer between the user/application and the back-end data store and enables protection of all stored data, no matter where it is located, by uniquely providing field-level security: the sensitive fields are removed from the source and the encrypted data is stored separately, without changing the underlying database structure or using a keystore to manage the encryption keys. By doing this we remove not only the hacker threat to the data, but also the more prominent insider threat, which is often very difficult to detect. By putting the security focus on the data itself, not just on where it is coming from, where it is stored or where it is being transacted to, we enable the better protection against both external and internal threats that organizations desperately need to keep sensitive information protected, rather than relying only on monitoring and detecting anomalies within the system.
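
To make the field-level idea concrete, here is a minimal, hypothetical sketch (Python) of the splitting step: sensitive fields are removed from the row that goes to the main database and kept encrypted in a separate store. The field names are invented, and for brevity the sketch uses a single static Fernet key; the keystore-less design described above would instead derive keys on demand rather than hold one anywhere.

    from cryptography.fernet import Fernet

    SENSITIVE_FIELDS = {"ssn", "email"}        # assumed example field names
    vault = Fernet(Fernet.generate_key())      # static key used here only to keep the sketch short

    def split_record(record):
        """Return (public_row, protected_row): sensitive fields leave the source row
        and are stored encrypted, separately, without altering the remaining structure."""
        public_row = {k: v for k, v in record.items() if k not in SENSITIVE_FIELDS}
        protected_row = {k: vault.encrypt(str(v).encode())
                         for k, v in record.items() if k in SENSITIVE_FIELDS}
        return public_row, protected_row

    public_row, protected_row = split_record(
        {"id": 42, "name": "A. Customer", "ssn": "000-00-0000", "email": "a@example.com"}
    )
    # public_row goes to the main database; protected_row goes to the separate encrypted store.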





Thursday 26 July 2018

Not All Encryptions Are Created Equal



In today’s volatile digital security world, encryption has become a standard security measure to keep your data protected. Many in the security industry would even go as far as to say that it is one of the most important methods for providing data security, especially for end-to-end protection of data transmitted across networks. The core idea of encryption is to convert information or data into a form unreadable by anyone except the intended recipient. Once a file or data piece is encrypted, it becomes difficult for external parties to access or understand the encrypted information.

While highly touted, encryption is hardly a new strategy, with the origins of hidden messages and cryptography dating back to antiquity. Since then, it has evolved, and many different types of encryption algorithms are in use today. However, not all of them are created equal or completely secure. Below are several of today’s popular encryption approaches, all of which have security loopholes.

Homomorphic Encryption

Homomorphic encryption requires a public key to enable search. This also means it requires a keystore to hold the private key that underpins the encryption. The person with access to the keystore has access to your data! This means you are putting your data at risk of internal misuse and in the hands of whoever owns the keystore. You don’t believe you would have an internal person who would abuse this power? Nor did the CIA until Edward Snowden fled the country.
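
For readers who have not worked with it, the small sketch below (using the third-party python-paillier package, phe, purely as an example of an additively homomorphic scheme) shows the trade-off: a service holding only the public key can compute on the encrypted values, but whoever holds the private key, typically whoever runs the keystore, can read everything.

    # pip install phe  (python-paillier, used here only for illustration)
    from phe import paillier

    public_key, private_key = paillier.generate_paillier_keypair()

    salary = public_key.encrypt(52000)   # data owner encrypts with the public key
    raised = salary + 3000               # a service can compute on the ciphertext...
    print(private_key.decrypt(raised))   # ...but the private-key holder reads the result: 55000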

Data Masking

Data masking is generally implemented as an intermediate layer between the data store and the user and is becoming more common as part of GDPR compliance efforts. The masking gateway accesses the data as an administrator and transforms (masks) the data on a user query. However, the stored data remains in clear text and is vulnerable. Simply put, this is really just application redaction.
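
A toy example makes the limitation obvious: the gateway below masks a card number on its way out to an unprivileged user, yet the row it reads from the database still holds the value in clear text. The column name and masking rule are assumptions for illustration.

    import re

    def mask_gateway(row, user_is_privileged):
        """Mask sensitive values in query results; the stored row itself stays clear text."""
        if user_is_privileged:
            return row
        masked = dict(row)
        if "card_number" in masked:   # assumed example column
            masked["card_number"] = re.sub(r"\d(?=\d{4})", "*", masked["card_number"])
        return masked

    print(mask_gateway({"name": "A. Customer", "card_number": "4111111111111111"}, False))
    # {'name': 'A. Customer', 'card_number': '************1111'} -- the database still holds the real value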

TDE – Transparent Data Encryption

This technology encrypts the data file on disk, stopping anyone from reading it while it is at rest on the disk drive. HOWEVER, as soon as it is loaded into the database, it is decrypted and available to be viewed by anyone with admin privileges. This puts the encrypted data at risk of internal misuse, as admins have approved access to the keys and could decide to capitalize on this access to sensitive information.

Column Level Data Encryption

Column level data encryption is generally implemented with a keystore, which means that those with access to the store also have access to the data. However, just as importantly, if this is implemented post production, it requires whole-sale changes to the database and the calling applications, leading many implementations to remain incomplete, as well as expensive.

As you can see, many of these encryption tools are lacking in complete external and internal security.

At BOHH Labs, we believe that the parties at the two ends of a data message – the sender and the requester – should be the only ones who have access to that data message. We believe encryption should be dynamic. In other words, your keystore should be dismantled and the encryption keys, IVs, and salts should be created by the application based on different criteria at that moment in time. This means that each piece of data, each network message, and each file is encrypted with a unique key, so your data is not left open on a keystore and accessible to unauthorized employees. Dynamic key creation encryption that has no reliance on web security or keystores is a cornerstone of BOHH’s data security service. Every data request is isolated from the requestor and is encrypted using transient keys that are destroyed after each transaction. This means the original data request never has direct access to the company network or backend database; it terminates intercepting parties’ connections and renders useless any partial data a third party may obtain, making it very difficult for anyone (including a database admin) to steal usable data. Further, we uniquely provide field-level security, removing these fields from the source and storing the encrypted data separately, without changing the underlying database structure or using a keystore to manage the encryption keys, which removes not only the hacker threat to the data, but also the more prominent insider threat.
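
As a rough, hypothetical sketch of the transient-key idea (not BOHH’s actual protocol), the snippet below generates a fresh AES-GCM key for a single transaction, encrypts the payload, and then discards the key. In a real exchange both endpoints would agree the transient key for that one transaction (for example via an ephemeral key exchange), which is not shown here.

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def handle_transaction(payload):
        """Encrypt one transaction under a transient key that exists only for this call."""
        key = AESGCM.generate_key(bit_length=256)   # created for this transaction only
        nonce = os.urandom(12)
        ciphertext = AESGCM(key).encrypt(nonce, payload, None)
        # In practice the peer would hold the same transient key for this single exchange;
        # nothing is ever written to a keystore, and the key is destroyed afterwards.
        del key
        return nonce, ciphertext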

As such, despite their popularity in the security industry, it’s clear that many of the current encryption methods have backdoors, especially for internal misuse. If you are interested in more about BOHH’s keystore-less encryption method that makes these security loopholes obsolete, reach out to learn more.


Tuesday 10 July 2018

How to Keep Up on the Latest in Cybersecurity News


Cybersecurity is always a hot topic, for a very good reason: the hits just keep on coming. Trying to keep up with the latest news and vulnerabilities is a daunting task, but you have to do it. Installing the latest security software and running the latest tests is not complete due diligence in the modern world of continuous cyber attacks. As the saying goes, “knowledge is power,” and it is especially so in the web-connected world.

So, how do you keep your knowledge at peak efficiency? Reading, of course. There are hundreds of technology sites and blogs that will help keep you informed about the latest issues, but that’s a lot of reading. What follows is a list of some of the best. You should recognize some of these if cyber security is not new to you. Hopefully, the list includes some that you weren’t aware of and will add some bulk to your reading list.

The Hacker News 

The Hacker News is one of the largest and most widely read information security sites. It features news and thorough coverage of information technology vulnerabilities and trends. The Hacker News is supported and endorsed by security experts, administrators, and members of various underground hacker groups and communities worldwide.

Krebs on Security

Brian Krebs is not your typical cyber expert (but who is typical?). His formal education includes a Bachelor of Arts degree in International Studies from George Mason University in 1994 (programming was a hobby). So, what prompted him to switch his focus to cyber security? In 2001 his home network was compromised by a Chinese hacking group. What followed was a self-taught crash course in computer and Internet security.

In his own words from his website, “Much of my knowledge about computers and Internet security comes from having cultivated regular and direct access to some of the smartest and most clueful geeks on the planet. The rest I think probably comes from a willingness to take risks, make mistakes, and learn from them.”

Open Web Application Security Project (OWASP)

Established in 2001, OWASP is a non-profit organization that has dedicated itself to the development of knowledge, tools, and best practices for secure application development. In their own words, they want to “be the thriving global community that drives visibility and evolution in the safety and security of the world’s software.”

One of their most important projects, in my experience, has been the “OWASP Top 10 Most Critical Web Application Security Risks”. Not only do they describe the risks in detail, but they also provide examples for mitigation in multiple languages.

Schneier on Security

Bruce Schneier’s blog has been in existence since 2004. He writes about security in articles, books, and academic papers. He is currently the CTO of IBM Resilient, a fellow at Harvard's Berkman Center, and a board member of the EFF.

The blog includes articles pertinent to current security issues and has an engaging comment area with lively discussions. He also produces a monthly, well-read newsletter.

Dark Reading 

Dark Reading is a long-time source for information about new cyber threats and current cybersecurity technology trends.

From their website: “Dark Reading.com encompasses 13 communities, each of which drills deeper into the enterprise security challenge: Analytics, Attacks & Breaches, Application Security, Careers and People, Cloud Security, Endpoint,  IoT, Mobile, Operations, Perimeter, Risk, Threat Intelligence, and Vulnerabilities and Threats. Each community is led by editors and subject matter experts who collaborate with security researchers, technology specialists, industry analysts and other Dark Reading members to provide timely, accurate and informative articles that lead to spirited discussions.”

Naked Security by SOPHOS 

Naked Security is SOPHOS’ news aggregator, providing the news, opinion, and advice on our favorite topic: computer security issues and the latest Internet threats.

Naked Security also produces a daily newsletter that provides a list of important cybersecurity news articles published within the last 24 hours. This is a must read.

Summary

I hope this list added a few more sources for your cybersecurity knowledge needs. Feel free to comment below on these and other sites that you have found invaluable to your work.

Thursday 21 June 2018

A Market Overview on the Changing Data Landscape



- Alan Jamieson


We live in changing times and data has become a major part of our lives. With the recent enactment of the General Data Protection Regulation (GDPR) in late May, we have all been inundated with privacy emails from our suppliers emphasizing that we own our data, which helps both parties exchange relevant information, offers, etc.

Having our data and the right to opt out is important, but how do we know it’s safe amid ongoing, frequent data breaches across the globe? Can we cope with the number of offers sent to us from suppliers who hold our data, and can we be assured that they are using our current and historic personal data appropriately? If data has been collected over weeks, months and even years, the reality is the more data you can analyze, the greater the insight that can be obtained.

A few years ago, data was expected to increase in volume by 100% annually, which challenged computing infrastructures and brought to light questions such as: where is the most cost-effective place to store the data? What analytical tools should we be using? Can the tool look at all data types (structured and unstructured)? Do we need to hire data scientists to make real-time decisions? Are we aware that running complex queries takes time? Today, most data volumes are increasing faster than previously predicted, especially in social media, where data volumes can grow by petabytes daily (not solely text but increasingly video and audio content), and through our adoption of IoT products.

Terminology is also changing; terms such as “big data” have taken on various meanings depending on context. Gaining business insights from the increasing volumes of data being held is important to help improve user experiences, drive business efficiency, fine-tune marketing offers, and predict which equipment needs maintenance to avoid unnecessary outages, etc.

While we can protect data through encryption technologies when it is at rest or in transit, searching for data in databases (on premises or in the cloud), repositories such as Microsoft SharePoint, and other document types is also a critical challenge. It’s great to collect data, but if you can’t easily access it, you are incurring unnecessary storage costs that will not be recovered.

Speaking to enterprise customers and global vendors, there is another change in how we interact with data. With the widening range of employee generations across most enterprise companies, how we access data is changing. Our younger global workforce, who have grown up with smartphones, are increasingly looking to request information or data via a voice request rather than a keyboard. Enterprise companies must cater for information or data access via keyboard and/or voice request, but only for authorized data requestors.

We at BOHH Labs address this changing data landscape with a service that provides voice and keyboard access to secure data enabling data analytics to be performed whilst importantly preventing data breaches. We are hoping to lead a shift in how the market both views and interacts with their data. After all, data is a business asset and we are looking to help companies unlock its value.

Tuesday 19 June 2018

Why Every Company Needs A Proactive Plan to Secure their Proprietary and Sensitive Data


- Ted West, BOHH Labs Chairman

Companies store a massive amount of data which they want to liberate for new business applications, analytics and optimization. These data include everything about customers, suppliers, production and logistics operations, as well as financial transactions and results. All these data offer new value to companies looking at adding new business applications and analytics tools to help make better business decisions and remain competitive.

Many of these applications and tools reside in the cloud, outside of existing “firewalls” of security. And while companies consider and procure more and more security solutions to harden the edges of the firewall, they remain reluctant to “let the data out” to the cloud due to the risk of it being hacked, leaked, lost, or stolen. As a result, companies may be missing out on ways to better optimize their business.

In today’s business environment, proprietary data can be an immensely valuable asset. It must be treated as such. It is no longer enough for companies to take a laissez-faire approach to securing proprietary data, reacting to a threat or breach and figuring out how to deal with it on the fly. Rather, companies must adopt a proactive approach and plan to secure their data before letting the data out.

Companies are responsible for holding massive amounts of data – much of which is sensitive customer and employee personal information such as PHI (Protected Health Information) or PII (Personal Identifiable Information), as well as sensitive proprietary and financial data. When a company experiences a breach and any of this sensitive data is leaked, companies are exposed to financial and brand damage, trust and loyalty degradation, and even lawsuits, financial penalties and fines.

That is why it is critical that companies have a plan for approaching their data security. But what does such a proactive data security plan look like? The first step is understanding your data and how it is already protected. Once these questions are addressed, data security protocols and policies will be better understood, and new security protocols and solutions can be reinforced, updated or added.

Questions you must address to start your data security plan when moving data outside the enterprise:

1. What company data from the inside is needed to move on the outside into new applications?
2. How is this data transported from on premise (inside the firewall) to the cloud?
3. Which of these data are truly sensitive, subject to privacy and confidentiality requirements?
4. How will user access to these data be provided once data is outside the firewall?
5. How will the truly sensitive data be transported and accessed with privacy and confidentiality?

When companies don’t have a plan in place to protect their proprietary data, it can’t be properly leveraged to optimize their business. The analytical value of data is enormous and can be applied everywhere from improving sales cycles to planning better marketing efforts, helping companies make more informed decisions on how to interact with their current and potential customers. However, if data is not properly protected or, even worse, if data is breached and stolen by bad actors, companies will lose the ability to apply this value to their business efforts.

Every company should have a well-thought out plan to protect their proprietary data at the root level to help minimize risk of data breach and loss, while taking advantage of the full use of their data.


Tuesday 12 June 2018

Security Add Ons Are Crap & Don't Protect Data


In the last couple of weeks, we have participated in several industry events, including the recent SAPPHIRENOW event hosted by SAP. I am a big supporter of these types of events as it is invaluable to be able to learn, share information, monitor market trends, and perhaps most importantly, speak with customers and have one-on-one conversations on what problems they need addressed and what they are currently lacking.

It is evident that organizations are looking to increase their value and maximize their technology investments by moving many business-critical applications to the Cloud. There are obvious benefits to this – cost savings, more efficient operations and an enhanced ability to leverage analytics for more focused business decisions, to name just a few. However, to move all of this forward, customers are looking for new and innovative ways of protecting their critical business and data assets in our very volatile and breach-prevalent market. Security is challenging enterprise progression, and after attending several events recently, it is clear there is a discrepancy between vendors’ perception and the reality of what customers actually want to fix this.

Companies and vendors continue putting out “new” solutions that are simply add-ons to existing security investments, but customers are fed up with the security industry’s “add-ons” that promise enhanced security. Time and time again, the same old vendors promise new and exciting solutions to protect companies and their data, yet major breaches keep happening.

I will be blunt – security add ons are crap. They are just a patch to stop bleeding so to speak, but they are not permanent solutions and they certainly are not innovative and new. Now, I am not saying that all current security solutions are crap, but we have had the current security practices for over a decade now, and while they have definitely worked in some cases, there are other well documented ones where they have blatantly not. This alone tells me that we need to re-evaluate how we are protecting our systems and data.

I am not saying that companies should stop purchasing specific security applications, CASBs, firewalls, VPNs, etc., but instead of bolstering the current systems with new patches and add-ons, it is time that we ask ourselves if we are doing it right and providing enough trust and security to enable organizations to allow their customers to use that data correctly. Where we need to focus our security efforts is on the data itself, both at rest and in transport. The core focus must be on protecting the data at the foundation level.

Given that a business will easily spend millions to protect access to data, it would only make sense to secure the data itself as it comes through and sits in your database. But wait, you say we do that, right? Well yes, this happens with encryption, but there is a flaw: current database systems can encrypt stored data, but it is carried out in a way that anyone (human or machine) with access to the system at any administration level generally also has access to the plain unencrypted data. This leaves a big “come get me” sign. That’s why at BOHH Labs we believe in offering database or specific field-level security. All data that needs to be secured is removed from the source, encrypted, and stored separately without changing the structure, enabling prioritization and control over every data point. We do this because it stops the inside hacking job. If the database does not contain the data, then a malicious actor who has gained root or admin privileges cannot run a simple query on the data, extract it, and have it available to sell to whomever they like, unlike with traditional Transparent Data Encryption (TDE) or homomorphic encryption technologies!

By putting our security focus on the data itself, not just on where it is coming from, where it is stored or where it is being transacted to, we enable the better protection against both external and internal threats that customers desperately need.

Thursday 31 May 2018

Data Storage: From Then to Now & What’s Still Needed Ahead


- Becca Bauer

Data storage has in fact been around for hundreds of years and went through a number of changes before arriving at the current cloud storage era. Starting in the 1720s, punch cards were introduced and eventually became the first tool for data storage and recording. Since then, data storage has gone through a radical evolution, including the introduction of magnetic tape and then the first hard drive, invented by IBM in 1956. This was followed some 20-30 years later by the introduction of floppy disks, CD-ROMs and DVDs. Next, the USB flash drive arrived on the scene in 2000 and dominated storage for several years until cloud storage entered the market, debuted by Amazon Web Services in the early 2000s and taking a major hold in 2006. Since then, the use of the Cloud has become increasingly popular for data storage and brings us to the present-day climate.

While there are many clear benefits to moving to the Cloud, there are also several flaws revealing themselves that indicate the storage market needs to keep evolving. For example, recent studies indicate that moving storage from in-house to the Cloud won’t achieve cost savings unless the storage needs are fully assessed and anticipated savings are planned out. In fact, over-estimating storage capacity is one area that can make a dent in the savings. While estimating a higher capacity can secure better cloud storage rates from vendors, if you don’t have enough data to meet this higher capacity, you are essentially paying for unnecessary and wasted space.

Another major challenge to the current cloud data storage model is data security. Most organizations turning to cloud storage solutions hold a mix of data that often includes sensitive and protected personal information they have a responsibility to keep protected, such as PHI (Protected Health Information), PII (Personally Identifiable Information) and data covered by the GDPR (General Data Protection Regulation). Unfortunately, while current database systems can encrypt stored data, this encryption is carried out in a way that anyone (human or machine) with access to the system at any administration level generally also has access to the plain unencrypted data. This design flaw leaves a “come get me” sign that has led to many diverse organizations becoming victims of data theft and losing millions of dollars.

So now that we have highlighted what is missing from the current cloud storage systems, what is the solution?

Enter BOHH Labs, introducing the next phase of data storage: Secure Data as a Service (SDaaS), which puts a focus on the actual security of the data and removes wasted spend with a storage consumption model. SDaaS acts as a layer between the user/application and the back-end data store and enables total security on all your stored data, without changing the data structure, while making certain data points visible only to those with the correct permissions. Whether this is in a database or a document, the BOHH SDaaS enables full use of data without the security concerns. This solution uniquely offers the database or specific field-level security that businesses desperately need. All data that needs to be secured is removed from the source, encrypted, and stored separately without changing the structure, enabling prioritization and control over sensitive data such as PHI, PII or GDPR-covered data. We do this because it stops the inside hacking job, and it also means companies can choose which data to store with full knowledge of data confidentiality/sensitivity. This eliminates the flaws within the current storage market and enables stored data to be securely opened to the Cloud, without putting it at risk of breach.

All of this is done without impacting user accessibility, and it introduces secure storage as a consumption-based model, rather than the current pay-as-you-go approach.

Thursday 24 May 2018

What’s the Point of Analytics if You Can’t Access Them?


- Becca Bauer, Director of Marketing & PR

In the workplace, data has become the golden ticket for companies to drive sales and stay competitive. However, much of the focus has been on the development of analytics and using data insights gathered on your marketing, sales, customers, products, new leads and so on to grow market share. All of this sounds good – hey, it’s basically free marketing advice generated from your own information – but while this focus serves marketing and finance purposes, it fails to address the fundamental need of how we actually access those insights.

Large amounts of data are nothing new. Data has always existed – granted, not in such quantities as we have today with all of our applications and collaborative sharing tools, but document management systems have held large amounts of data since the 80s and email storage has been “big” ever since the late 90s. The onset of nonstop data being produced from everything from web history, emails and documents to contracts and CRM systems continues to grow daily, even hourly, enabling organizations to access corporate knowledge that is more relevant and targeted than ever. And as we move forward, there is no foreseeable end, because we as a digital society constantly produce a massive amount of data that needs to be housed somewhere.

One major issue many companies are having is that they are looking to Cloud vendors and deployments to store historic or archived data in the vendor’s infrastructure, as opposed to in the company’s own data center, and this often means recent data, or data even just 12 months old, gets archived. The problem with this is: once the data is stored, how are companies able to easily access it to extract business value or analytical insight that helps them remain competitive? This is a big need! After all, how are companies supposed to take a targeted approach to their data and see if patterns emerge that can be applied to high-value business decisions to increase their bottom line if it is all archived and not easily accessible?

Access to data is the critical function here. Add to that the fact that securely accessing stored data is an increasing challenge for companies, as the data they want to examine often contains confidential information such as PII (Personally Identifiable Information) and PHI (Personal Health Information), and it’s difficult for companies to guarantee the security of this sensitive data when it is accessed by users.

So, what is the solution to putting your analytics and insights to use?

This is where BOHH Labs can step in and help. Our Secure Storage as a Service (SSaaS) acts as a layer between the user/application and the back-end data store and enables total security on all your stored data. This is done without changing the data structure, while making certain data points visible only to those with the correct permissions. Whether this is in a database or a document, the BOHH SSaaS solution enables full use of data without the security or accessibility concerns.

Our solution uniquely offers database or specific field-level security. All data that needs to be secured is removed from the source, encrypted, and stored separately without changing the structure, enabling prioritization and control over sensitive data such as PHI, PII or GDPR-covered data. If companies know which data fields or rows contain sensitive data, they can better protect these fields to ensure business compliance and enable the data to be utilized by a wider audience to extract greater business value or insight, while still securing it and providing access only to those who have the correct privileges to see it.
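
The permission side of this can be pictured with a small, hypothetical role-to-field policy: analytics users see the fields they need for insight, while the protected fields are simply absent from what they are served. The roles and field names here are invented for illustration; a real service would drive this from configuration.

    # Assumed example policy mapping roles to the fields they may see.
    FIELD_POLICY = {
        "analyst":    {"region", "order_total", "order_date"},
        "compliance": {"region", "order_total", "order_date", "customer_name", "ssn"},
    }

    def visible_fields(row, role):
        """Return only the fields the caller's role is permitted to see."""
        allowed = FIELD_POLICY.get(role, set())
        return {k: v for k, v in row.items() if k in allowed}

    row = {"region": "EMEA", "order_total": 1250, "order_date": "2018-05-01",
           "customer_name": "A. Customer", "ssn": "000-00-0000"}
    print(visible_fields(row, "analyst"))   # the protected fields never reach the analytics user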

As the enterprise market continues to become more digital and the amount of data we produce rapidly grows, it will be even more important that secure access to this data is simple. The ability to manage your continually growing data and extract more accurate results to deliver valuable analytics will be critical for businesses to stay competitive.

Tuesday 22 May 2018

Why a Cloud Consumption Model Should Replace Pay as You Go for Data Storage



- Alan Jamieson, VP of Business Development


In a previous blog, we highlighted why planning your Cloud storage requirements is the only way to ensure your company achieves operational savings moving to the cloud.  Today, we are going to look at the various consumption options:

Pay as you Go

Subscription (fixed term, monthly fee per user or unit) based commercial models have been around for several years, driven by Customer Relationship Management vendors such as Salesforce, Infrastructure as a Service (IaaS) vendors such as VMware, and Information Technology Service Management (ITSM) vendors such as ServiceNow, who all enable us to pay for the services we use, typically based on user-number pricing bands. This pay-as-you-go approach has also been widely adopted by the leading Cloud vendors that companies are turning to in order to streamline operations and offerings. However, this approach is more limited when you look at Cloud data storage. Typically, Cloud vendors attract companies to their Cloud infrastructure by charging them lower rates for storing historic or archived data there, as opposed to in the company’s own data center. This often means recent data, or data even just 12 months old, gets archived. The problem with this is: once the data is stored, how are companies able to easily access it to extract business value or analytical insight that helps them remain competitive? This is a big need!

Research also shows that companies are paying money for storage that they currently don’t need, as they either have too much in-house storage capacity or they have estimated and invested more than their businesses need today for Cloud storage. Regulation is often the main driver for companies to retain transaction data and customer data for a defined number of years, but unless there is a data retention policy, storage investments in this area can be an unnecessary expense. Companies should only retain data for specific periods of time; exceeding these periods is an overhead to the business.
It’s clear there is a struggle to find the right balance of leveraging the Cloud to streamline the collection and storage of a company’s ever-growing data without wasting money. As detailed above, the current standard pay-as-you-go method is not set up to help companies cost-efficiently move their storage to the Cloud.

So, what is the solution?

BOHH Labs believes introducing a consumption-based model to the storage market can help companies maximize the benefits of moving storage to the Cloud without paying for resources they don’t actually need. Subscription (consumption-based) fixed-term agreements are paid monthly or quarterly and can help businesses start achieving operational savings with small initial investments that grow through greater use of the service and increased user adoption over time. This approach allows companies to pay for the resources they need without overestimating resources that turn into wasted costs, yet still allows them to expand as they grow. This model will be beneficial for all companies, from those with a small number of employees to global enterprises with hundreds of thousands, all benefiting from the same services.

Within the consumption model, we believe it needs to be split into two areas:

  1. Data storage:
    As discussed above, companies should choose the right data storage period and commercial model to support their individual businesses, and thus pay for storage based on volume or data retention period.
  2. Data access:
    This is an area that is included in some user subscription agreements, such as CRM, but if not, it is an area that all companies need to explore to understand how they can gain business value from the data they store. Stored data serves companies no purpose if it cannot be accessed easily to leverage insight and analytics from it to apply to their business decision-making.

What Does this Look Like? 

When companies run a marketing campaign, they typically include all their active target customers to help ensure that they gain the maximum return from the planned campaign. However, when a global financial services or health provider needs to make longer-term investment decisions based on historic data spanning several years, securely accessing this stored data, which often contains confidential information such as PHI (Personal Health Information) and PII (Personally Identifiable Information), is frequently not possible, as they cannot guarantee the security of sensitive data accessed by business users.

BOHH Labs has identified this business challenge and has created a Secure Storage as a Service solution that ensures all stored sensitive data remains secure and confidential. If you know which data fields or rows contain sensitive data, BOHH Labs protects these fields to ensure business compliance. As such, the BOHH service leverages a consumption model to provide a secure way to enable your non-compliant data to be utilized and to have the cost of storage recovered as its value is extracted. Protecting the compliant data, securing it and providing access only to those who have the correct privileges to see it allows longer-period (often years of) non-compliant, non-corporate-sensitive data to be utilized by a wider audience to extract greater business value or insight.


Thursday 17 May 2018

Post Industry Event POV: It’s Clear Customers Are Fed Up with New Security “Add-Ons”


BOHH Labs’ VP of Business Development shares his thoughts on BOHH’s attendance at the GDS Security Summit.

Last week BOHH Labs attended GDS’s Security Summit in Atlanta, GA. The summit was well attended by CISOs and VPs of Security from global enterprise accounts in the finance, healthcare, and manufacturing sectors, among others.

It was evident that global enterprise accounts across all market sectors are looking for innovative ways of protecting their critical business and data assets. The desire to move from the data center to the cloud is prevalent and often cost-driven, but mitigating the risk of data breaches is the main obstacle organizations must overcome. Additionally, as part of the discussion on security challenges, there is an increasing need to avoid further data breaches and meet new compliance regulations, as Friday, May 25, 2018 is fast approaching, when the new General Data Protection Regulation (GDPR) becomes effective and impacts all global companies with European Union (EU) customers.

One thing was made very clear: Companies and customers have been searching for new solutions and are keen to avoid buying another security solution that is layered on existing security investments. They are fed up with the security industry “add-ons” for promised enhanced security. The same old vendors are promising new and exciting solutions to protect companies and their data, yet time after time major breaches keep happening.

BOHH Labs was honored to take part in this industry discussion and present its new solution to help address the security challenges plaguing customers and organizations alike. We were delighted with the reaction to our new Secure Storage as a Service (SSaaS), which enables companies to protect compliant and confidential data fields while importantly enabling analytical insights to be gained without the risk of a data breach.

We demonstrated our service on a cloud platform to highlight how our solution is platform agnostic and to show how compliant plus non-compliant data residing in a Hadoop data platform could be searched for analytical insights, with the compliant data fields (i.e., PII, PHI) accessible only to authorized users.

Seldom has the BOHH Labs team, comprising Simon Bain (CEO), Ken Hawkins (CTO) and Alan Jamieson (VP Business Development), seen such a positive and enthusiastic reaction to new technology. Perhaps it was showing the Secure Storage as a Service working on a cloud platform and accessing data in a Hadoop data platform that validated how powerful and, importantly, how relevant our service is. We left the summit upbeat, with our message and solution resonating with most of the conference attendees.

All in all, it was a great industry event. If you would like proof or want to learn more, contact us and we will be happy to place a POC on your storage system to show how we secure, scale and enable easy access to your valuable information stores.

Tuesday 8 May 2018

Security, so what is it exactly?


- BOHH CEO Simon Bain

Talking to customers, vendors and the great and the good of the industry, it is no surprise that we seem to have a data security issue at the moment. Maybe not the obvious one of data being stolen, though, but one of describing what security actually is!

These two quotes from Sridhar Muppidi, who serves as VP and CTO of IBM Security, are taken out of context, but they sum up a large sector of the technology industry’s view on security:

  1. "IBM Security is a division that focuses on keeping the bad guys out and the good guys in, it's as simple as that," Muppidi said. 
  2. "It's a discipline," Muppidi said about security. "It's a discipline that can be morphed into a program, a set of practises, solutions and products."

See them here: http://www.eweek.com/security/ibm-security-cto-details-how-cyber-security-fits-into-ibm-portfolio

While there may not be anything fundamentally wrong with these two statements, I do believe that they totally miss the point and try to turn the security of customers’ data, documents, and corporate secrets into a commodity for IBM to play with, and, worse, they trivialize the issues.

I do not believe that security is just a discipline. Yes, users do have to learn how to treat data and how to help themselves. But we in the industry must start to look at security in a different light. Security is privacy, and we should help maintain the privacy of data, not just by trying to keep the “bad guys out”; after all, many, if not most, hacks are insider initiated, not external. In that case you keep the “bad guy” out by not employing them!

We need to start talking privacy and looking at ways of how we can truly keep data private both from insider threats as well as external ones. 

Threat detection is next to useless for privacy at the point of attack; it is, however, a great learning resource for working out how to secure data after the fact.

Threat prevention is the only approach that can work, but it is multifaceted. We do need to look at keeping “The Bad Guys Out,” but not just out of the network: also out of the data. And the “bad guys” are not just external people (guys and gals); they may also be inside. So, we need to make sure that the data is secured in such a way that makes it usable, but also completely private and away from prying eyes, whether those eyes belong to a system admin or to someone who has been given admin permissions to do some data cleansing.

Security is not about creating back-to-front detention centers where one group is kept out and another is kept in! It is about privacy of information.

Security must also not get in the way of people’s working tasks. Otherwise, yes, they will circumvent it, or they cannot do their job and then find themselves without work.

As such, I believe our job as technologists is to make this possible: not just talk about it, not write long, overly lawyered disclaimers, but actually build applications that provide a full privacy zone where data can be used free from fear that sensitive information will be lost or stolen.

Only time will tell where the industry is headed and how we as a collective group approach security.

Tuesday 1 May 2018

Least-Privilege Security – Why and How


- Greg Gray, BOHH Labs Senior Software Engineer

Your users only need access to the resources required to do their jobs. It’s a pretty simple concept that has been a best practice for many years. Simple in concept, not so simple in execution.

To make this concept work, I believe there are three areas where least-privilege security must be implemented: users, applications, and processes. Below we will take a deeper dive into each one.

Users
Users are given access to machines (e.g., their workstations) from which they perform their jobs. In many corporate environments, users are not given the privilege to install software. The applications they are allowed to run might also be controlled. Even access to some areas of the local file system might need to be restricted to maintain the integrity of the running machine.

Software developers typically need access to their own workstations and to remote databases and servers. Since many of these needs are project-based, the privileges should be granted to get the project done, then revoked when they are no longer needed. This revoking of privileges is often missed, giving some users who have worked on many projects access to far more resources than they need at any one point in time, along with the opportunity to misuse data they can access but should not.

System administrators need access to systems at a high level of privilege, but does each administrator need access to all systems at that level? In smaller environments, probably yes, but in large environments with more administrators, some narrowing of the privilege scope is probably warranted and is ultimately a security best practice.

A disgruntled employee with privileges to many systems is a danger to the company’s assets, especially if that person is fired. A protocol needs to be established for quickly removing such an employee’s privileges when needed.

Applications and Processes
Applications need privileges to access file systems, local databases, and remote databases and systems. Sometimes privileges are given to an application early in the development life-cycle that are no longer needed later on.

These privileges should be pared as needed so the application doesn’t go into production with unnecessarily liberal access to certain database resources or other company resources. If the application doesn’t create or drop tables, then the application shouldn’t have that privilege. There may also be different roles for users of an application. A role that only does reporting probably doesn’t need data modification privileges.
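To make the role idea concrete, here is a minimal sketch in Python (the role names, operations, and table names are hypothetical examples, not any real product’s API) showing how an application might map roles to the database operations they are allowed to perform, so that a reporting role never carries data-modification privileges:

```python
# Minimal sketch of role-based least privilege inside an application.
# Role names, operations, and tables are hypothetical examples.

ROLE_PRIVILEGES = {
    "reporting": {"SELECT"},                       # read-only role
    "data_entry": {"SELECT", "INSERT", "UPDATE"},  # no schema changes
    "admin": {"SELECT", "INSERT", "UPDATE", "DELETE"},
}

def is_allowed(role: str, operation: str) -> bool:
    """Return True only if the role explicitly grants the operation."""
    return operation in ROLE_PRIVILEGES.get(role, set())

def run_query(role: str, operation: str, table: str) -> None:
    if not is_allowed(role, operation):
        raise PermissionError(f"Role '{role}' may not {operation} on {table}")
    print(f"Executing {operation} on {table} as role '{role}'")

run_query("reporting", "SELECT", "sales")        # allowed
try:
    run_query("reporting", "DELETE", "sales")    # blocked: reporting is read-only
except PermissionError as err:
    print(err)
```

The design choice to fall back to an empty privilege set for unknown roles means anything not explicitly granted is denied, which is the heart of least privilege.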

User Privileges
Implement Centralized Login:
The first thing that needs to be done to get control of user accounts and privileges is to establish a centralized login system. In a Microsoft Windows environment, this usually begins with the Active Directory Domain Services. In a Linux/Unix environment, an LDAP server can provide for central management of logins and permissions. An Active Directory service can also act as an LDAP server for Linux/Unix environments.

Active Directory Domain Services actually encompass multiple services that can be used for privilege management and maintenance:

  • Domain Services – Provides authentication and centralized storage for users and data. Also provides search functions.
  • Lightweight Directory Services – Provides for directory-enabled applications using the LDAP protocol.
  • Directory Federation Services – Provides single sign-on (SSO) to authenticate users to multiple applications and devices.
  • Certificate Services – Provides for the creation, distribution and management of secure certificates.

Kerberos is an authentication protocol that uses the concept of tickets to allow machines communicating over an insecure network to provide proof of identity. Kerberos is commonly used for SSO in Linux/Unix environments, but Microsoft also has a Kerberos system available.

Centralized login systems aren’t just for users. Applications and databases can use them, too. As an example, MySQL Enterprise can be set up to use Pluggable Authentication Modules (PAM) to authenticate against LDAP, Kerberos, or another authentication source. Other systems can use Kerberos without PAM.
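As an illustration, the sketch below uses Python and the ldap3 library (the server address and directory layout are made-up examples) to show how an application could verify a user’s credentials against a central LDAP directory instead of keeping its own password store:

```python
# Sketch: authenticating against a central LDAP directory with the ldap3 library.
# The server URL and DN layout below are hypothetical examples.
from ldap3 import Server, Connection, ALL

def ldap_authenticate(username: str, password: str) -> bool:
    """Attempt a simple bind as the user; success means the credentials are valid."""
    server = Server("ldaps://ldap.example.com", get_info=ALL)
    user_dn = f"uid={username},ou=people,dc=example,dc=com"
    conn = Connection(server, user=user_dn, password=password)
    try:
        return conn.bind()   # True if the directory accepts the credentials
    finally:
        conn.unbind()

if ldap_authenticate("alice", "correct-horse-battery-staple"):
    print("Authenticated via central directory")
else:
    print("Authentication failed")
```

A real deployment would add certificate validation, service accounts, and group lookups, but the principle is the same: one directory, one set of credentials, centrally revocable.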

Establish Policies
Once a central login system is established, policies need to be established that outline what is permissible for each user, application, and process. Users will probably have a blanket policy that covers all users for access to their workstations, email servers, etc. System administrators will have their own policy that extends their permissions for their unique role in the security realm.

Applications and processes will have their own unique policies that will likely vary from application to application. Database permissions for software developers will have phases: the beginning of a project will allow more permissive access, while those permissions might disappear altogether after the application is in production. The application must also have a documented policy describing the permissions it needs to perform its functions. These varying permissions can be grouped into roles and assigned to each user as needed. The centralized login system can store and manage these roles.

Audit
Perhaps the most important action to take for least-privilege policies is auditing the systems to ensure that the policies are being enforced. Audits should be scheduled on a regular basis, in addition to a few surprise audits. Auditing not only ensures that the policies are being enforced but also reminds the user community what the policies are and that they are important to the company.
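As a simple illustration of what an automated check might look like, here is a short Python sketch (the policy, users, and privilege names are invented for the example) that compares the privileges actually granted to each user against what their role is supposed to allow and flags the difference:

```python
# Sketch: auditing granted privileges against the documented policy.
# Roles, users, and privilege names are hypothetical examples.

POLICY = {
    "developer": {"workstation_login", "dev_db_read", "dev_db_write"},
    "reporting": {"workstation_login", "prod_db_read"},
}

GRANTED = {
    "alice": ("developer", {"workstation_login", "dev_db_read", "dev_db_write"}),
    "bob":   ("reporting", {"workstation_login", "prod_db_read", "prod_db_write"}),
}

def audit() -> None:
    for user, (role, privileges) in GRANTED.items():
        excess = privileges - POLICY.get(role, set())
        if excess:
            print(f"VIOLATION: {user} ({role}) has extra privileges: {sorted(excess)}")
        else:
            print(f"OK: {user} ({role}) matches policy")

audit()  # flags bob's unexpected prod_db_write grant
```

In practice the granted privileges would be pulled from the directory or database rather than hard-coded, but the comparison logic is the same.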

Automation
Least-privilege security management is not an easy task. In small organizations it can probably be managed by a small number of people (at least two), but in larger organizations the task becomes onerous. Luckily, there are a number of third-party products that can provide automated services to bolster your privilege management efforts. These services run the full gamut, from automated probing of systems looking for vulnerabilities to performing the audits necessary to ensure policy compliance.

It is not our aim to promote any third-party products so, using your favorite search engine, search for “least privilege security” for a list of such products.

Least-privilege security has a vital role in protecting your company’s systems and resources, in other words: your company’s bottom line. Users need clear policies to outline what is expected of them. Application developers need clear policies to ensure a secure environment in which to run their applications and access their databases. These policies need to be enforced to ensure there are no holes in the systems or in the policies themselves. Verifying via audit that the policies are being applied as intended is the final step in securing and protecting your company’s resources.


Thursday 26 April 2018

Stop Throwing Away Money on Data Storage (Even When Moving it to the Cloud)


- Alan Jamieson, BOHH Labs VP of Business Development

Does moving your data center storage to the Cloud (private or hybrid) help save operational costs on increasingly challenged IT budgets? For most people the answer is an automatic yes; however, recent studies indicate that moving storage from in-house to the Cloud won’t achieve cost savings unless the storage needs are fully assessed and anticipated savings are planned out. This is alarming, and you may ask why.

Firstly, capacity planning is an issue: how much do we need today and in 12 months’ time? Over-estimating storage capacity is one area that can make a dent in the savings. While committing to a higher capacity can secure better cloud storage rates from vendors, if you don’t have enough data to fill that capacity, you are essentially paying for unnecessary, wasted space.

Secondly, while over 80% of data centers have the storage capacity in-house, it is difficult to do routine checks, so when looking to switch and invest in Cloud storage, companies often don’t have the whole picture and can make a choice that is not financially beneficial. The fixed costs of running a data center (electricity, cooling, licenses and maintenance) and any spare storage or processing capacity are often overlooked when formulating your cloud migration and deployment strategy.

The volume and variety of data (structured and unstructured, regulated data, etc.) we are collecting is increasing annually. Data is now seen as a business asset, with new Chief Data Officer roles being created in enterprise accounts, but are we realizing the value of the data assets we have?

From research done by Jonathan Koomey in late 2017, only 25% of companies would save money if they transferred their server data directly onto the Cloud, whereas 75% would see an increase in annual costs. However, all of the sample group would save if they migrated after quantifying how much server space they need. This unnecessary Cloud storage spend costs companies around the world an estimated $62 billion annually.

If we step back from the cost of storage, the other important and growing challenge for global companies is extracting value from data that is stored in the Cloud and often is not accessible. With the massive amount of data being produced daily, operational cost pressures are pushing companies to archive data that is over 12 months old, or even more recent, too soon. The world has become analytically focused; however, insight is only gained when data spanning a significant number of years is analyzed, whether to achieve greater operational efficiencies, to better understand how to retain your customers, or to improve the quality of manufactured parts.

Another research report puts the cost of Cloud waste at about 35%. So, for every dollar spent on Cloud resources, you only get $0.65 of investment value. Now that we have addressed how companies are losing money, below are six ways your company or department can alleviate some of the wasted Cloud spend:

  1. Identify and retire abandoned applications – why store what is no longer needed?
  2. Choose the right storage model – assess what is needed today and plan for growth.
  3. Right-size instances – invest in only what is needed (see the rough cost sketch after this list).
  4. Perform licensing audits – do your software vendors allow your licenses to be used in the Cloud at no extra cost?
  5. Automate server usage for peak/off-peak hours – only pay the Cloud provider for services that are needed.
  6. Pay upfront – looking at license options could save more than purely monthly subscription fees.
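To put rough numbers on items 3 and 5, here is a toy Python calculation (all instance counts and prices are made up for illustration, not real cloud pricing) estimating how much of a monthly bill right-sizing and off-peak automation might recover:

```python
# Toy estimate of cloud waste recovered by right-sizing and off-peak automation.
# All figures below are illustrative, not real pricing.

instances = 40            # servers currently running 24/7
hourly_rate = 0.40        # $/hour for the current instance size
hours_per_month = 730

current_bill = instances * hourly_rate * hours_per_month

# Item 3: right-size half of the fleet to an instance size costing 50% less.
right_sized_bill = (
    (instances / 2) * hourly_rate * hours_per_month
    + (instances / 2) * (hourly_rate * 0.5) * hours_per_month
)

# Item 5: additionally shut the right-sized half down outside a 12-hour peak window.
automated_bill = (
    (instances / 2) * hourly_rate * hours_per_month
    + (instances / 2) * (hourly_rate * 0.5) * (hours_per_month / 2)
)

print(f"Current monthly bill:                ${current_bill:,.0f}")
print(f"After right-sizing (item 3):         ${right_sized_bill:,.0f}")
print(f"Adding off-peak automation (item 5): ${automated_bill:,.0f}")
```

Even with these invented numbers, the point is that the savings come from assessment first, migration second.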

While the global market is focused on enhancing how data is stored and embracing the benefits of making a transition to the Cloud, having a clear idea of what storage is needed and how often you need to access your data will ensure that you select the most cost-effective model for your business.

Tuesday 24 April 2018

The Dark World of Terms of Agreements: A Complicated Data Security Nightmare



- Ken Hawkins, BOHH Labs CTO

Now that the Facebook fire seems to be dimming in the public eye, let’s dig into the issues that really allowed our data and our friends’ data to be scraped and repurposed into a means to manipulate our online social behavior and, more importantly, for Facebook advertisers to sell products to us.

Before we look into Facebook’s world of privacy and allowances, let’s first address how our data can be gathered and used when we make use of websites and web-based services. The term and service I am referring to is generally called OAuth, or more pointedly, “Allowance Tokens.” OAuth was developed in 2006/2007 and is a simple way to publish and interact with protected data. However, we’ll stick with the term Allowance Token because this is the nomenclature Facebook uses as it puts its unique spin on OAuth.

The creation and use of an Allowance Token can be loosely described as a means to allow another person or company to act on your behalf on websites and services. This is of course to make our lives easier on the Internet. One such example is making it easier to sign up on a website by simply choosing the option to log in using another website’s login and information. When we as users choose this option, we allow the website we are interacting with, as well as the website that holds our login information, varying access to our data and a means to act on our behalf on both sites. The tokens that are passed around can be good for anywhere from minutes to days and can be used to actively monitor your activity on both sites you have linked together. There are variations of this interaction based on the Access Token type and the agreement you made when you chose the option; however, the most noninvasive Access Token usage would simply keep you from having to track yet another password. Another could post updates, etc., to let your friends know what is happening in your life. Discussing it beyond the very general really needs a use case with actors’ names, so let’s use Facebook as the website that we trust to keep our data safe.

Armed with that Cliff’s Notes level of knowledge, and confident that I trust CompanyA, I have found a great new website and want to use my Facebook identity to sign up and log into it. Because I’m boring, let’s call that website CompanyB. We have all seen the dialog for this, right? Simply click the “login with Facebook” button, agree to the terms of use, and the connection is made. Once we have made that connection (CompanyA with CompanyB), CompanyB can now perform varied actions based on what you agreed to. Those actions can range from simply looking at your profile and posts to posting updates on your page. You might still be thinking this is fine, because now if I do something on CompanyB’s site, all my friends on CompanyA will know, and I don’t have to return to CompanyA to announce my actions. I trust CompanyA, so there should be no problem. However, I have a question for you: did you read the agreement? Did you look into exactly how CompanyB was going to use your CompanyA-generated Access Token? Before we answer that, let’s get a bit more specific and call CompanyA by the name we were all thinking. Let’s talk about Facebook’s applications, and specifically the quizzes that come up.
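Before we get to the quizzes, here is a minimal sketch of the mechanics just described: a generic OAuth 2.0 authorization-code exchange in Python using the requests library. The endpoints, client credentials, and API paths are hypothetical stand-ins for a CompanyA-style service, not Facebook’s actual API:

```python
# Sketch of a generic OAuth 2.0 authorization-code exchange.
# URLs, client credentials, and scopes are hypothetical examples.
import requests

TOKEN_URL = "https://companya.example.com/oauth/token"
API_URL = "https://companya.example.com/api/me/posts"

def exchange_code_for_token(auth_code: str) -> str:
    """CompanyB trades the code the user approved for an access token."""
    resp = requests.post(TOKEN_URL, data={
        "grant_type": "authorization_code",
        "code": auth_code,
        "client_id": "companyb-client-id",
        "client_secret": "companyb-client-secret",
        "redirect_uri": "https://companyb.example.com/callback",
    })
    resp.raise_for_status()
    return resp.json()["access_token"]

def read_user_posts(access_token: str) -> list:
    """With the token, CompanyB can now read (or post) on the user's behalf."""
    resp = requests.get(API_URL, headers={"Authorization": f"Bearer {access_token}"})
    resp.raise_for_status()
    return resp.json()

token = exchange_code_for_token("code-the-user-just-approved")
print(read_user_posts(token))
```

The token, not your password, is what CompanyB holds, and everything it can do with that token is governed by the scopes you approved in that one click.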

So, we are perusing our Facebook timeline enjoying the morning coffee, and what comes up in our feed? A fun quiz my new best friend took to find out how great they are. Out of habit or boredom, we sometimes click through to take the quiz before we really think it through. What are we presented with to take that super cool quiz? Use your Facebook login to take the quiz. It’s OK, you might think: I trust Facebook, this is on Facebook, what is the harm? Besides, I’m too lazy not to use Facebook to log in and take the quiz. It is an almost certainty that the super cool quiz company will gather any and all data it can about you and your friends (if you allow it) and then in return inundate you, your friends and friends of friends with the coolest quizzes and chances-to-win posts once you do this. What are they doing? Gathering trends about you and your friends for the purpose of marketing to you, or, worse, manipulating your feed to gently nudge your thinking in a particular direction.

OAuth, Allowance Tokens and their use are a wonderful ideal, but the idea is woefully lacking in implementation and seriously blurs the lines regarding the ownership, security and responsibility of your data. As an end user, it is incredibly convenient to let the world know what I’m doing or to share information via this mechanism. Interacting with Facebook and the applications and websites using its tokens really does make our digital lives easier. But what is the price for this convenience? I would argue that, minimally, your digital life is no longer wholly in your control. Agreeing to Facebook’s terms of use is just that, an agreement with Facebook. Facebook’s agreement of terms with its development community is another. The agreement you make with another website, say our famed CompanyB, would be the tie between the three of you. Once that connection is made, Facebook and CompanyB decide how your data is captured and used between them.

This is that blurry finger-pointing area between the companies and you, the person who is allowing your data to be accessed. Facebook can be in compliance with CompanyB concerning the agreement they have, and Facebook can be in compliance with you concerning its storage of your data. Now, did you read the fine print when you allowed CompanyB to use your Facebook identity? This is the area where responsibility becomes as clear as mud. We are supposedly in control of how our data is captured and used, and yet we, the end users, just allowed CompanyB to gather any and all data from us, and possibly our friends, by choosing the easy login option or taking that funny quiz to see which movie star we looked like. One need only look at the latest Facebook and Cambridge Analytica revelation. Each party believes they were acting within the terms of their agreement, and we are left to fend for ourselves after that.

From my point of view, it is ultimately our responsibility to know what is going on with the transport of our data and, even though we might not want to, to read and know what we are signing up for. To that end, we owe it to ourselves to truly grasp what we are doing when we choose convenience in our lives. The use of OAuth and its implementations makes our lives easier but not truly safer. It is my opinion that OAuth makes our digital lives more encumbered and our data less safe today.

With Storage Solutions on the Brink of Disaster, Analytics Cannot Work


Yes, I said it. Storage solutions are on the brink of disaster. They are costing companies money, while moving to the Cloud can compromise data security and lead to potentially valuable data being lost or left unusable. The same old vendors are promising new and exciting solutions to protect, manipulate, and visualize your data. In the past they promised Big Data Lakes of usable information enabling analytics. Today, it is Artificial Intelligence (AI) with security and learning capabilities placed on your data. But they failed in the past. Is today any different?

What is missing is a solution that enables secure, real-time data access and secures all compliant data held within storage systems without any impact to the user experience or putting the data at risk. Enter BOHH Labs – we are introducing Secure Storage as a Service to address these needs.

So why are we any different? We believe in privacy. Privacy of the individual with GDPR and PII, privacy of health records with PHI, and privacy of the state and corporation with secret and sensitive data being stored correctly. However, we also believe that data is an asset and has a value companies should be able to strategically leverage without fear that sensitive data will be leaked, lost, or stolen.

This all sounds great, but first we must look at what has gotten our industry to a place where storage is about to implode.

Storage. It is an old technology with a big future. Why? Because we need it for lots of reasons: from having somewhere to place our existing application data, to archive storage, document and image storage, and videos.

We Need Storage.

Add to that the fact that we as a digital society produce a massive amount of data daily that needs to be housed somewhere, meaning that we have lots of data. Big amounts, you may say. Apart from current data having immediate requirements, the analytical value of our historical data is enormous and can be deployed everywhere, from sales cycles that help organizations plan better to health data that shows how well a specific treatment has worked over the past decade. If used correctly, this Big Data (yep, 10 years of data can be classed as Big) can have a useful purpose and become an organizational profit center in its own right.

As such, demand for storage solutions soared. And, as always when a market is in demand, vendors jumped in to sell promising solutions that would ostensibly provide businesses with a competitive advantage, with the first major wave focusing on Big Data enabling valuable analytics.

Big Data Analytics.

One of the many overly hyped-up terms, Big Data Analytics was big in 2015. But now? Not so much. Why? Well it could be for many reasons:


  1. It was always marketing hype. We have data in many huge storage silos. Why name it?
  2. The hype could never live up to the use cases. Hadoop: great for storage, not so great for data usage.
  3. Why do we need it separated out? Data is data. Big data, small data, lakes, ponds, puddles. It is data, and it has a use, a value, and is required.

It was the only thing people talked about years ago, with planes trailing banners and television adverts stating how it would change our lives. But now… Naturally, when a fad dies, a new one begins. Enter AI.

AI.

Well, when it comes to hype, AI is the kingpin. However, unlike Big Data, it is for good reasons. While AI is currently being over-hyped as a technology savior, it can actually help us in many ways, especially with storage needs, the security of data and the value one can extract from it. But will AI alone fix our storage solution needs? No.

This brings us to Security. Everything above is all well and good, but storing data for the sake of it is expensive. Placing AI onto a store and learning from the data is a worthy cause, but then turning it into a profit center through use becomes problematic, due to the increase in regulations and the information held within the data, whether it be PHI (Protected Health Information), PII (Personally Identifiable Information), data regulated under GDPR (General Data Protection Regulation), or data that is secret to an organization or government and must not be let into the public domain. Using stored data containing any of this information can be costly if that data is accidentally or maliciously leaked.

Secure Storage as a Service

This is where BOHH Labs can step in and help. Our Secure Storage as a Service (SSaaS) acts as a layer between the user/application and the back-end data store and enables total security on all your stored data, without changing the data structure, while making certain data points visible only to those with the correct permissions. Whether the data is in a database or a document, BOHH SSaaS enables full use of it without the security or accessibility concerns.

Our solution uniquely offers database-level or specific field-level security. All data that needs to be secured is removed from the source, encrypted, and stored separately without changing the structure, enabling prioritization and control over sensitive data such as PHI, PII or GDPR-regulated data. We do this because it stops the inside hacking job. If the database does not contain the data, then a malicious actor who has gained root or admin privileges cannot run a simple query on the data, extract it, and have it available to sell to whomever they like, unlike with traditional TDE data encryption or homomorphic encryption technologies! Also, the original data structure is maintained, which helps negate the need for costly application updates.
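To illustrate the general idea of field-level extraction (this is purely a conceptual toy in Python, not BOHH Labs’ implementation; the field names, key handling, and token scheme are simplified assumptions), sensitive fields can be pulled out of a record, encrypted, and stored separately while the record keeps its original shape:

```python
# Toy illustration of field-level extraction: sensitive fields are pulled out
# of the record, encrypted, and stored separately, while the record keeps its
# original shape via reference tokens. This is NOT BOHH Labs' implementation,
# only a conceptual sketch; field names and key handling are simplified.
import uuid
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice the key lives outside the database
cipher = Fernet(key)
secure_vault = {}                    # separate store for the encrypted values

def protect(record: dict, sensitive_fields: set) -> dict:
    """Replace sensitive fields with tokens; keep the encrypted values elsewhere."""
    protected = dict(record)
    for field in sensitive_fields:
        token = str(uuid.uuid4())
        secure_vault[token] = cipher.encrypt(str(record[field]).encode())
        protected[field] = token     # structure unchanged, value no longer present
    return protected

def reveal(protected: dict, field: str) -> str:
    """Authorized callers resolve the token back to the original value."""
    return cipher.decrypt(secure_vault[protected[field]]).decode()

patient = {"id": 42, "name": "Jane Doe", "ssn": "078-05-1120", "visit": "2018-04-12"}
stored = protect(patient, {"name", "ssn"})
print(stored)                        # queries on the main store see only tokens
print(reveal(stored, "ssn"))         # only callers with the key recover the value
```

The point of the sketch is only that a query against the main store returns tokens, not values, so a compromised admin account on that store yields nothing useful on its own.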


Traditional data masking solutions and TDE data file encryption solutions do not hide your data; it can remain visible in its original format to inside threats. While application-level security may have thwarted the hackers of the ’90s and early 2000s, it does nothing against today’s more sophisticated hackers. BOHH securely hides your data, making it inaccessible to bad actors, whether internal or external. In this way your data becomes usable by a wider audience without compromising your analytics and without the risk of PII, PHI, GDPR-regulated or other sensitive data being leaked or accessed. This same process can also be applied to document and email stores, allowing your analytics engines to work on your entire storage, not just a small subset!



We believe that BOHH is the only organization able to secure your data without leaving a neon sign above an unlocked backdoor. Traditional security measures, such as TDE, homomorphic encryption, data masking, and application-level security, have proven ineffectual and left your data at risk, and in many cases your organization at risk of legal ramifications. Just look at the theft of data over the past two years!

BOHH works on all current data stores, document stores, and email systems; it works in the background without impeding users and enables you to work with your data without risk of theft or malicious data manipulation. BOHH has been architected in a way that does not compromise speed and that is infinitely scalable. If you want proof (and why wouldn’t you? After all, you have heard all of this before, and yet…), contact us and we will be happy to place a POC on your storage system to show how we secure, scale and enable easy access to your valuable information stores.