Category Archives: Tech Thought Leadership

Steph Charbonneau: Five Ways to Improve Data Loss Prevention and Avoid Third-Party Data Breaches

Written by Steph Charbonneau, Senior Director of Product Strategy at Vera, by HelpSystems

The majority of data loss incidents have one thing in common: they revolve around third-party data breaches. SecureLink and Ponemon Institute recently released a report titled “A Crisis in Third-party Remote Access Security”, which revealed an alarming disconnect between an organization’s perceived third-party access threat and the security measures it employs. The report found that 44% of organizations had experienced a breach within the last 12 months, with 74% saying it was the result of giving too much privileged access to third parties. This is compounded by the State of Third-Party Risk Management 2020 report from RiskRecon, a Mastercard Company, which found that 31% of respondents have vendors they consider to be a material risk in the event of a data breach.

Third-party risk is certainly not a new risk vector. But in our hyper-collaborative economy, it’s rapidly rising in significance. Whether you’re in financial services, telecommunications or manufacturing, your greatest risk of data loss occurs when content moves outside of your direct control. Yet we can’t afford to stop collaborating. What’s needed is data-centric security: a way to keep control over this valuable information without paralyzing the ability to do business.

In other words, it’s time to rethink the way companies address vendor security. As more stringent data protection regulations go into effect (e.g., California’s CCPA and CPRA, New York’s SHIELD Act), every organization will need to keep pace. Companies need strong preventative controls that protect their data as it leaves their hands, especially when it’s stored with third parties. The bigger, stronger walls we’ve built are excellent at keeping attackers out, but they can’t protect data we’ve entrusted to others.

By applying security and identity-based access controls directly to the data, companies can mitigate the risk of human errors stemming from many common occurrences. Employees accidentally autocomplete an external email address, forward a file they shouldn’t, or move sensitive data off controlled systems. People will always be a weak link in the information security process. But by applying default data encryption and setting automated policies and controls, IT can take human decision-making out of the security equation.
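To make this concrete, here is a minimal sketch of such an automated policy in Python, using the cryptography library. It is not any particular product's API: the domain check and the ad hoc key are illustrative assumptions, and in practice keys would live in IT-managed key infrastructure.

```python
# Illustrative default-encryption policy: external recipients only ever
# receive ciphertext, so no human decision is needed for the safe outcome.
from cryptography.fernet import Fernet  # pip install cryptography

INTERNAL_DOMAIN = "example.com"

# Assumption: a real deployment gets this key from managed key infrastructure.
cipher = Fernet(Fernet.generate_key())

def share_file(data: bytes, recipient_email: str) -> bytes:
    """Apply the automated policy before any file leaves the organization."""
    if not recipient_email.lower().endswith("@" + INTERNAL_DOMAIN):
        return cipher.encrypt(data)   # default-encrypt every external share
    return data                       # internal shares stay under internal controls

# An accidentally autocompleted external address now yields unreadable bytes.
payload = share_file(b"Q3 board deck", "partner@vendor-example.com")
```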

To accomplish this task, we’ve compiled five recommended practices that can help organizations move to a more proactive security model for avoiding third-party data breaches.

Take a data-centric approach

By taking security to the data level, organizations can enable their employees to collaborate confidently with whomever they choose, while ensuring the highest levels of security, visibility and control.

Encrypt more data by default

Another mistake companies make is placing complete trust in their employees to do the right thing. The great majority of employees certainly want to, but many may not know what to protect or how. Let IT make it easy for them by setting policies that are applied automatically when data is created or shared externally. Applying file encryption is especially important for data shared through popular collaboration platforms like Dropbox, Box and Google Drive, since once downloaded, those files could go anywhere.

Plan for auditing and compliance now

With many new regulations in the US and abroad, almost all companies are now required to provide a paper trail or audit log of what happens to their data. Taking steps to plan for these audits will best prepare you for a third-party data breach, should it happen. When you can see who has tried accessing your data, and where, you can mitigate the risk of having to issue a notification and can take steps to minimize future issues.

Make identity a central component of security

Tying access control to identity gives you control over who has access to your data by making users authenticate to you directly using an email alias. This can prevent information being forwarded to unauthorized users, or a simple keystroke error in an email address exposing data. Giving data owners the ability to control who can access their data, and to limit what recipients can do with it once it’s accessed, provides an extra layer of security.

Don’t just monitor – take direct control of your data

In the event of a third-party data breach, or if your data accidentally finds itself in the wrong hands, you need to be able to kill access to it at a moment’s notice. No matter how high or how strong we build protective barriers, we’re always going to be at risk of a breach. A hacker’s biggest win is gaining access to your data. Proactively locking down any data they may get their hands on is a huge advantage.

By taking a data-centric security approach, you can protect your team against data loss, even for files that have left your physical control. Moreover, you can proactively prevent unauthorized access, and track precisely who should (and should not) have access to your data. This approach lets you secure files and communications throughout their entire lifecycle. You’ll be confident that even if your data is sent externally, you can still verify that it was used appropriately.

The Enduring Appeal of Email for Marketers

Written By Hayley Strang, Field Marketing Manager, UK & NE, Mapp Digital

You have probably heard it said that email will become obsolete – but don’t believe a word of it. Nobody is abandoning email anytime soon. Despite the rise of new marketing trends and communication channels, people still love to receive emails in their inboxes. Owing, in part, to this enduring appeal, email marketing remains one of the most cost-effective ways to reach consumers today.

Personal, transactional, and timely, email marketing campaigns consistently outperform other digital channels when it comes to generating higher-than-average open and click-through rates. According to the Data & Marketing Association (DMA), for every pound spent on email marketing, the return is an impressive £42. But how can marketers effectively harness the power of email marketing to deliver the most value? What are the issues that might be holding back an email marketing campaign from delivering the very best results? And how do the most recently hyped marketing trends compare as alternatives?

Mobile
For marketers, formatting email for mobile devices is a must. With 50% of email marketing campaigns opened on mobile devices (Litmus, June 2021) and users glued to their smartphones, a mobile-friendly email design is indispensable.

Emails that are not mobile-friendly also drive up your bounce rate due to poor user experience. For most audiences, marketing emails that work well on mobile should take priority over emails designed for desktop, or you risk turning off a huge number of readers with a desktop-only template. A mobile-friendly email design looks and works the same way every time, no matter what kind of device views the content. Such designs include a simple navigation menu, static content, and images that appear smaller on mobile to suit the compact screen. Marketers are also well advised to create a mobile-friendly landing page.

Millennials / Generation Z
It’s been said that millennials don’t engage with emails, but for many reasons this is untrue and unfair. Despite their love of texting and social media, email still remains close to millennial hearts. According to YouGov, 44% of millennials began using email between the ages of 10 and 14, with 60% saying this is how they prefer to hear from brands. Another important thing to note about these younger consumers is that they don’t have the burden of many years of marketing email subscriptions to sort through. Similarly, because many of them receive fewer work-related emails, younger generations tend to receive far fewer total emails than older generations.

Millennials may use email differently to other generations, but they still want to hear from the brands they love on a regular basis – the trick is to understand the best email marketing strategy for a younger audience. They are a savvy group who will be turned off by uninspired marketing tactics. Although millennials might not always respond to sales-driven emails, attention-grabbing marketing emails still work when they find the content helpful and informative. Above all, personalisation and emails that allow an authentic human connection to develop are the most effective.

Instagram
Instagram and email complement each other brilliantly from a marketing perspective. Besides including the Instagram icon in emails with a link to the company Instagram account, marketers might also use Instagram-style photography paired with coupons to increase click-through rates and conversion rates. Even if a company has plenty of Instagram followers, there’s every possibility those followers will miss posts. But if marketers have their email addresses, they can send important information like launches, product details, and promotions directly into their inboxes. Marketers can also bridge the two channels by asking Instagram followers to sign up to a specific email list.

Email continues to be one of the most viable and valuable marketing channels for building a long-lasting bond with audiences. While emails still generate high engagement with brand content, many marketers can still improve by using compelling Instagram images and content within emails. Instagram remains a largely untapped source of high-quality images, and marketers can use Instagram content in their email newsletters to great effect.

TikTok
TikTok is no longer just for kids – it’s time for marketers to take TikTok seriously and consider it as an additional channel to reach their audiences. TikTok is increasingly becoming a content platform and, as with Instagram, it is complementary to email. TikTok is primarily a video app built on user-generated content, and video is an entertaining way for users to connect and interact. Marketers are well advised to include TikTok videos or video links in their email campaigns. Videos, like any other interactive content, increase the likelihood of customers clicking through to a marketer’s website. They work exceptionally well for product launches, teasers, behind-the-scenes footage, or testimonials from customers.

Customers should be encouraged to interact with a brand on TikTok, and user-generated content should be featured in email newsletters. This acts as social proof, showing prospective customers that a brand is admired by thousands of people like them. It will also increase engagement rates among people who do interact on TikTok, as they will be looking out for their post to be featured in the next email.

GDPR
For email marketers, GDPR simply means making sure readers give clear, unambiguous permission to receive marketing emails. It was created so consumers know their data is protected and used only by brands they have trusted with personal information. In practice, this means consumers opt in to the emails they would like to receive from brands they are interested in.

Marketers should not look at the limitations of GDPR as restrictions, but as guidelines that can be used to develop more profitable uses of email marketing. Provided your forms are equipped with clearly defined requests for consent on the use of consumer data, you can remain GDPR compliant.

Significantly, marketers can also continue to profit from email marketing as earned media, and even build stronger customer relationships on the back of better email lists.

Email is indispensable 
Today, email is a crucial part of any kind of communication, be it private messaging or eCommerce campaigns. That’s why launching effective campaigns that catch the recipient’s attention is as necessary as having a well-organised, easy-to-use website. Email enables marketers to reach more users for less. It’s hard to imagine a business strategy that does not leverage email marketing to connect with customers. With its powerful ROI, far-reaching audience, and continued relevance, email marketing is as vital now as it has always been.

How organisations can react in real-time with event-driven APIs

Written by Dakshitha Ratnayake, Associate Director of Technology at WSO2 

Where is my cab? What time will my train arrive? Was there a price drop in the stock that I invested in? Today, users increasingly demand ever more interactive experiences, and they expect to be automatically informed when something has changed without having to hit the refresh button.

Receiving notifications about someone liking your picture or reacting to your story on Instagram, or your Facebook news stream showing the latest updates about what your friends have been up to, are just a few examples of how frontend applications react to events in the backend. This is all made possible through event-driven APIs, which are supported through various modes of asynchronous event-driven communication between the client application and the backend.

So, the question is: how do you build apps that require (near) real-time updates based on remote events? How should developers and architects who create reactive web applications, or expose APIs for such applications to consume, handle asynchronous or event-driven communication in a world of APIs where synchronous communication is predominant and most firewalls block non-HTTP traffic?

The answer lies in three things: an event-driven backend, an asynchronous web API technology chosen to match your exact async requirements, and a client that can work with remote events.

Synchronous APIs versus asynchronous web API technology

Many APIs that make up the web today are synchronous APIs, where the client initiates all communication between the client and the backend by sending a request to the backend and waiting for a response. However, these request/reply API calls happen one at a time, in a pre-arranged sequence, and each interaction blocks the progress of the process until its completion.

If the client application wishes to know about an update, it must continuously invoke the API for updates at a regular interval. This is known as polling and has been a common approach for client apps that need to become aware of new data. But this method is largely inefficient and less reactive than having the backend notify the client the moment an event occurs. The term “polling madness” was coined because most of these calls are wasted: the data has not changed.
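For illustration, this is the kind of naive polling loop described above, sketched in Python against a hypothetical endpoint; even with conditional requests, most round trips return nothing new.

```python
import time
import requests  # pip install requests

POLL_INTERVAL_SECONDS = 5   # trades update latency against server load
last_etag = None

while True:
    headers = {"If-None-Match": last_etag} if last_etag else {}
    resp = requests.get("https://api.example.com/orders/42", headers=headers)
    if resp.status_code == 200:      # the data actually changed
        last_etag = resp.headers.get("ETag")
        print("update:", resp.json())
    # any 304 Not Modified response was a wasted round trip: "polling madness"
    time.sleep(POLL_INTERVAL_SECONDS)
```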

When should you use event-driven APIs?

Event-enabling APIs is a relatively recent phenomenon compared to event-driven architecture (EDA); it uses EDA to support scalable, real-time or near-real-time, push-based communication in APIs published to third parties. Being event-driven is about taking advantage of events as they happen: consumers subscribe, register their interest, and react. Making APIs event-driven or asynchronous (along with an underlying server-side event-driven architecture) eliminates the need for inefficient polling requests and sends updates to the client or event subscriber as soon as they occur. This provides a much better experience for users.

So, in a nutshell, event-driven APIs should be used if:

  •    The application needs to push changes from the server to the client rather than waiting for a client request.
  •    The application must provide users with a highly active two-way communication channel.
  •    The application requires many continuous interactions between the client and the backend that create scaling issues for the backend if synchronous APIs are used.
  •    The application needs to be able to monitor system-wide events.

Choosing asynchronous API technology

Event-enabling APIs does come with multiple complexities, from choosing frameworks and networking protocols to building reliable delivery and ensuring the solution scales. The asynchronous protocols commonly used today all solve the problem of real-time communication, but they solve different aspects of it in different ways, which means some protocols serve certain purposes better than others:

  •    For multiplexed, bidirectional streaming and for applications that need a considerable number of messages from both ends of the connection, WebSockets is ideal.
  •    Server-Sent Events are especially useful in cases where a browser client is just interested in subscribing to a stream of data, without needing to communicate back with the server in the same connection (see the sketch after this list).
  •    Webhooks can be used for a simple implementation of pushing notifications to one or a small number of servers.
  •    The newer async variants of GraphQL (Subscriptions and Live Queries) come with the benefits associated with GraphQL, however, the implementation is relatively complex and entails a considerable learning curve.
  •    gRPC and Kafka are generally used for communication between internal microservices.
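As a concrete illustration of the Server-Sent Events option, here is a minimal sketch of an SSE endpoint. The framework (Flask) and the price feed are assumptions made for the example, not prescriptions; a browser subscribes with its built-in EventSource API and receives each update without ever re-requesting the resource.

```python
import json
import time

from flask import Flask, Response  # pip install flask

app = Flask(__name__)

def price_events():
    """Yield messages in the SSE wire format: a 'data:' line plus a blank line."""
    price = 100.0
    while True:
        time.sleep(1)               # stand-in for a real backend event source
        price += 0.5
        yield f"data: {json.dumps({'price': price})}\n\n"

@app.route("/stream")
def stream():
    # text/event-stream keeps the HTTP connection open so the server can push
    return Response(price_events(), mimetype="text/event-stream")

if __name__ == "__main__":
    app.run(port=5000, threaded=True)
```

On the client side, new EventSource('/stream') in the browser is all that is needed to start receiving these events.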


Understanding what the backend architecture should look like

While the client architecture must subscribe to state-change events that originate from the backend, the typical backend architecture can be extended to create remote event connections with the clients via:

  •    Event-driven APIs (ideally exposed through an API management platform that supports eventing semantics).
  •    A message broker (if one is not already present).
  •    Microservices that publish and process state-change events.

Although event-driven APIs do not explicitly require a broker, using an intermediary between event producers and consumers helps to implement the required patterns to deliver more manageable and scalable solutions. The broker receives events from IoT devices, change data capture (CDC) tools, other backend systems and services, and from client applications if two-way communication is enabled. It can then alert the services that subscribe to those events. A scalable microservices architecture (MSA) is the optimal architecture for complex event-driven backend services. These event-driven microservices can act as event subscribers or publishers to process events, handle errors, and persist event-driven states.
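As a small illustration of that broker hop, the sketch below shows a microservice publishing a state-change event, assuming a Kafka broker on localhost:9092 and the kafka-python client; the topic name and payload are invented for the example.

```python
import json

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Rather than waiting to be polled, the service announces the state change;
# subscribed services (and, via an async API, clients) react when it arrives.
producer.send("order-events", {"order_id": 42, "status": "SHIPPED"})
producer.flush()   # block until the event has been handed to the broker
```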

The remote event connection between the backend and web clients can be established through event-driven APIs powered by various asynchronous API technologies — such as Webhooks, WebSockets, Server-Sent Events, GraphQL subscriptions, etc. The backend uses these APIs to send events pertinent to the clients and receive events that originate from clients.

Managing event-driven APIs

An event-driven backbone will manage the overall real-time data flow securely and at scale, while asynchronous APIs can be managed for external and internal consumption with a traditional API management solution that comes with inherent or extended capabilities to support event-driven semantics. Most organisations have basic event processing infrastructure, but many do not have the capabilities to design, develop, test, and manage event-centric APIs. Essential API management capabilities — particularly governance, access control, monitoring, analytics, and monetisation — can be used to manage asynchronous and REST APIs alike. Some API management solutions already support Webhooks and/or WebSockets.

Expanding business reach and adoption

Synchronous APIs are ideal for many scenarios, particularly if you need an instant response. In other cases, when clients need to be informed of events or the processing required for the response happens at a different time, ordinary synchronous messaging becomes tricky and event-driven APIs can aptly address such needs.

EDA is flexible enough that you can start with simple notifications or combine two-way event communication with a synchronous approach to deliver an optimal architecture. So, using an event-driven design alongside a traditional request/response design for your APIs gives you the best of both worlds to build highly scalable and reactive web applications. Furthermore, combining traditional API management capabilities with an event-driven architecture provides tremendous value additions to expand business reach and adoption.

Why do 70% of digital transformations miss the mark?

By Phill Bolland, co-founder at YaYa

Covid-19 has accelerated the need for companies to embrace changing consumer and employee behaviour, which is why it’s essential for organisations to succeed with new digital technology to stay ahead.

But why do almost three-quarters of digital transformations fall short, and what is the knock-on effect of this failure on businesses?

The impact on people

If your tools are for internal use, getting your digital strategy wrong can have a dire effect on your people. YaYa research found that half of employees surveyed would leave their current positions because of bad digital products, and a further 58% said the tools they’re given at work are a poor second to the apps they use at home.

Failing to involve employees during the design process was a key reason for lack of adoption: only 10% of those surveyed said they or their fellow workers were consulted at this stage. Perhaps most surprising of all was that half of workers stated that the new tools they used at work made no difference or harmed their jobs.

It doesn’t matter whether employees are being paid to use technology as part of their job, or whether customers are paying a company to use a product, the same principles apply, and no business will thrive by disengaging one of its most vital assets: its users.

Building a house

So how can a business produce digital tools that will delight users? The answer is to take a people-centered approach. This kind of mindset is a bit like building a house. Housebuilders might ask how many bedrooms are required or what kind of material is needed. Alternatively, they might ask who will live in the house and where they see themselves in five years’ time. Taking such a people-centric approach will result in a home the owners love that facilitates and supports their goals.

The companies that succeed in developing digital tools aren’t obsessed with technology, they’re obsessed with human behaviour. It’s no great surprise that digital products miss the mark if we don’t seek to understand the behaviours, motivations and needs of our audiences.

Whether it’s empathy mapping, user testing or field studies, remember to walk a mile in your audience’s shoes and understand how a product can fit into their lives before attempting the solution.

Challenging your mindset

Although no one has a crystal ball that can predict the future, producing a product that’s slightly better than a previous version or a bit superior to competitors isn’t enough. This approach doesn’t move the business to a place where its offering is differentiated or compelling enough to increase commercial advantage; it just means it’s scrapping away in a congested space.

But where should businesses start with a people-centric approach?

There are various research approaches that can help to build an empathetic picture of an audience, with researchers immersing themselves to understand how audiences are shaped by their customs, habits or mutual differences. This approach, combined with the other data available on how an audience behaves, means companies can build a detailed view of the people they want to engage from the outset.

For example, if you’re looking to develop a new sports app, putting on your trainers and sweating it out in fitness classes to understand what drives people will align your business strategy with the needs of users. From research to strategy, through to design and delivery, this kind of human-centered insight challenges any pre-conceptions and instead builds a product that people will love.

Aligning your business strategy

Crucially, starting with the needs and desires of your users means businesses avoid the expensive embarrassment of failed products created in isolation. Strategy isn’t delivered in a boardroom, so it’s vital to think about how to connect with the people who actually make it happen. In fact, companies enjoy 32% higher revenue[1] when taking this approach than traditional businesses.

Further still, fixing a problem in development costs 10 times as much as fixing it in design, and 100 times as much once the tool has been released[2]. With a people-centered approach, your business strategy is moulded to your audience, lowering risk and ensuring scalability.

So, the solution is clear: building successful digital tools starts with people – not technology.

To read the full report from YaYa, People centered design at work: how to design digital tools your employees love, visit the YaYa website.

Galit Michel: How Does the FCA Deadline Extension Impact UK Merchants?

Written by Galit Michel, VP of Payments, Forter

The UK Financial Conduct Authority (FCA) announced in May that the deadline for SCA compliance had been pushed back to March 14th, 2022, due to a lack of industry readiness and the long-term impact of the Coronavirus crisis on UK merchants.

The Revised Payment Services Directive (PSD2) was initially scheduled to go into effect throughout Europe on September 1st, 2019, and was then pushed to December 31st, 2020. The global pandemic and the evident impact of the regulation on merchants led many countries to push the deadline back further. The latest FCA extension, however, makes the UK the last country to require full PSD2 compliance.

This extension is expected to be the last before full PSD2 enforcement in the UK, making this the last chance for UK merchants to look in-house, examine their operations, and get their SCA ducks in a row.

What the Delay Means for Merchants 

The FCA deadline extension is a clear win for UK merchants who have had a tough time with the outbreak of the Coronavirus and cannot afford to lose revenue due to SCA issues.

Instead of being alarmed by the sudden drop in conversions, UK merchants can take the time to learn from their European counterparts, examine the true impact PSD2 has on conversions, adapt their payment process, and monitor the readiness of their payment ecosystem before SCA is enforced.

This added time is a huge opportunity for merchants, especially given the dire impact SCA has had on conversion, revenue generation, and profitability for merchants in other EU countries.

UK merchants should recognise this extension as the opportunity it is, and familiarise themselves with the critical changes PSD2, and particularly 3DS, impose on their customers’ checkout experience. This includes ensuring they are able to request exemptions, understanding the difference between 3DS methods and knowing which one their providers are using, implementing solutions to create a frictionless and compliant checkout process, and protecting their business from risk.

The Problem with PSD2 is 3DS 

PSD2 requires Strong Customer Authentication (SCA) to be performed on all transactions, most frequently done through 3DS. This is a problem for merchants who want to increase revenue generation and create a frictionless checkout experience for their customers.

In theory, 3DS is great; when 3DS is applied, liability shifts to the issuers, and the merchant can ensure they are PSD2 compliant. However, 3DS also creates many challenges for merchants.

One of the most significant problems with 3DS is the friction it causes consumers. Adding extra touchpoints to the checkout process raises the chances of abandonment and human error. In addition to challenges on the consumer side, 3DS brings many challenges on the payment ecosystem side.

The 3DS process requires the entire payment ecosystem to be 3DS-ready, or transactions cannot be processed. Many failure points can occur during 3DS, including technical failure, authentication failure, and more. 3DS also increases the risk of a transaction being falsely declined, because banks are averse to assuming liability for transactions they are unsure of. Legitimate transactions that are declined result in lost revenue as well as damage to a brand’s reputation.

As PSD2 has gone into effect throughout much of Europe, many merchants have experienced these challenges first-hand and on their bottom-line. In France and Spain, merchants have experienced a 25% decline in conversion rates, which is better than merchants in Germany and Italy who have seen conversions decline by over 30% and 40% respectively.

The decline in conversions costs European merchants millions of Euros. While the UK payment ecosystem is more prepared for PSD2 than other countries, merchants may still see a conversion decline of 15-20% once SCA is enforced – unless they do something now.

What Other Merchants Wish They Knew 

One of the key things European merchants have learned post-PSD2 enforcement is that they need to do everything possible to provide their consumers with a frictionless checkout experience.

The best way to do that is by leveraging exemptions to their advantage.

Under PSD2, merchants can request SCA exemptions for eligible transactions, such as low-risk transactions, low-value transactions, recurring payments, and more. However, knowing whether a transaction is exemption-eligible, and going through the steps of requesting the exemption, requires an exemption engine. Merchants should also be careful to request exemptions only from acquirers that have agreed to process exemptions for them, or they risk the transaction being declined.
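Schematically, the decision such an engine makes might look like the sketch below. The data model is hypothetical and real engines weigh many more signals (cumulative velocity limits, fraud scores, issuer behaviour); the thresholds shown are the commonly cited PSD2 RTS values.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Transaction:
    amount_eur: float
    is_recurring: bool                  # subsequent payment in a fixed series
    acquirer_processes_exemptions: bool

# Acquirer's reference fraud rate in basis points; under the PSD2 RTS a rate
# of 13 bps or less permits transaction risk analysis (TRA) exemptions up to
# 100 EUR (250/500 EUR ceilings require still lower fraud rates).
ACQUIRER_FRAUD_RATE_BPS = 10.0

def choose_exemption(tx: Transaction) -> Optional[str]:
    """Return the exemption to request, or None to fall back to 3DS."""
    if not tx.acquirer_processes_exemptions:
        return None                     # only ask acquirers that process exemptions
    if tx.is_recurring:
        return "recurring"
    if tx.amount_eur <= 30:
        return "low_value"              # PSD2 low-value threshold
    if tx.amount_eur <= 100 and ACQUIRER_FRAUD_RATE_BPS <= 13:
        return "transaction_risk_analysis"
    return None                         # no exemption fits: authenticate via 3DS
```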

While exemptions can reduce the friction on consumers, when a transaction is processed without 3DS, the bank does not assume liability, leaving the merchant responsible in the event of fraud. To protect their business while maximising exemption requests, merchants need a powerful fraud prevention partner. This is especially crucial as fraud rates are increasing globally, and merchants that want to process transactions without SCA will be liable for any chargebacks.

However, relying on exemptions does not guarantee frictionless checkout; some transactions still do not meet the exemption requirements while other transactions may be declined by the issuer even if they are exemption eligible. When this happens, merchants need to have an alternative solution – namely, Dynamic 3DS.

Dynamic 3DS uses real-time information and behavioural analytics to provide consumers with a 3DS experience that is as frictionless as possible. The Forter Dynamic 3DS solution, coupled with the covered model, enables merchants to enjoy the same liability protection alongside higher conversion rates. In just five months, Forter has increased approval ratios for global merchants, bringing conversions close to their pre-PSD2 levels.

UK merchants that want to provide customers with a frictionless checkout experience need to take the time now to ensure their payment optimisation partner can request exemptions on their behalf, that their payment partners are able to process exemptions, and that they can provide alternative checkout experiences to their customers when the transactions are not eligible for an exemption.

SCA is Not a Drill 

It seems like the threat of SCA has loomed over the heads of UK merchants for so long that they no longer fear it. From my experience in the payment industry, I can firmly say that this delay is just that – a delay.

Despite taking longer than expected, SCA enforcement will still reach the UK borders, and when this happens, the merchants that did not take the time to plan, prepare and test their PSD2 solution will suffer the consequences.

It is important to note that when SCA enforcement goes into effect in the UK, it is very possible that 3DS2.2 will already be released. This will create even more opportunities for merchants to reduce the impact of PSD2 on their operations and ensure their revenue generation and profitability stays high. Merchants that still use 3DS1 or are not prepared to use 3DS2.2 when it is released will not be able to support exemptions at a large scale via 3DS on rails, nor will they be able to leverage delegated authentication to their advantage.

To ensure they are ready for PSD2, UK merchants need to examine their solution today, involving their PSPs, issuers, and the entire ecosystem in the process. Merchants need to pay close attention to their monitoring capabilities and understand exactly what is being counted, to ensure they get a full overview of the state of their operations before and after SCA enforcement.

While the SCA enforcement date may seem far away, adapting the entire payment process may require significant changes on the merchant’s side. By starting early, examining their PSD2 solution, partnering with the right payment optimisation solution, and learning from the mistakes of their neighbours, UK merchants will be able to continue generating revenue and profit while being fully PSD2 compliant.

Ransomware Resurgence: Is your Organization Prepared?

Written by Rick McElroy, Principal Cybersecurity Strategist, VMware

Ransomware made mainstream news when the cybercriminal group DarkSide launched an attack on U.S. fuel company Colonial Pipeline, which carries nearly half the fuel consumed along the U.S. East Coast. The disruption of critical infrastructure and the impact on our daily lives was a sobering reminder of the havoc a successful cyberattack can wreak.

While its scale and impact grabbed headlines, this attack is only symptomatic of a dramatic resurgence in ransomware campaigns over the past year. Alongside an increase in the number of attacks, VMware found ransomware groups are becoming even more organized and sophisticated, while the rise in ransomware-as-a-service is enabling a much broader cybercriminal base to execute attacks using existing tools.

Understandably, this adds to the pressure already felt by CISOs, who are defending a more distributed environment than ever before.


Ransomware is a leading cause of security breaches worldwide

VMware surveyed 3,542 CISOs across 14 countries for its recently published Global Security Insights report and found ransomware attacks were the dominant cause of breaches for organizations. The average number of ransomware attacks organizations experienced has doubled over the past year. Additionally, the VMware Threat Analysis Unit identified a 900% increase in ransomware over the first half of 2020.

Malicious actors have spent the pandemic capitalizing on the rapid adoption of an anywhere workforce and the use of personal devices and networks by remote workers.  Attackers now have an unprecedented opportunity to launch social engineering attacks, such as phishing, on unsuspecting employees.

No industry was off limits to attackers, either. The healthcare sector – already in the grip of pandemic response – was disproportionately targeted with ransomware in 2020. One in five breaches reported by the healthcare CISOs we surveyed was caused by ransomware. In the same way that DarkSide targeted critical national infrastructure, ransomware groups have looked to cash in on the healthcare sector, an industry more likely to pay due to the critical nature of its business.


Double extortion tactics pile pressure on victims

New tactics are making ransomware a much more nuanced threat, too. Instead of locking up systems immediately, attackers are aiming to infiltrate systems undetected and establish persistence on the target network, moving laterally and extracting data that can be monetized even if no ransom is ultimately paid. A system encryption and ransom demand will not be made until the perpetrator has covered their tracks and established a route back into the target network.

This gives cybercriminals greater hold over victims. As well as needing to decrypt their systems, organizations also face the possibility that critical assets such as customer data or trade secrets will be released for sale to the dark web and the breach will be made public. The reputational and regulatory risk tied to ransomware means the pressure to pay ransoms is often significant. However, unless the attacker’s presence in an organization’s network is fully removed, they are likely to return for another strike on a target that has shown willingness to pay.

The cybercriminal community has capitalized on the growing profitability of this approach, with nearly 40% of security professionals saying double-extortion ransomware was the most observed new ransomware attack technique in 2020.


Strengthening defenses against ransomware

As businesses adapt to supporting the anywhere workforce and malicious actors continue to target the expanded threat landscape, CISOs have a once-in-a-generation opportunity to strengthen defenses against ransomware and protect their organization by:

Delivering security as a distributed service: To protect the anywhere workforce, regardless of the devices and networks workers are using, deliver endpoint and network controls as a distributed service that follows the assets being protected throughout the environment.

Prioritizing visibility: Better visibility over endpoints and workloads delivers contextual insight and situational intelligence to help defenders prioritize and remediate risk with confidence.

Conducting regular threat hunting: The first step of a multistage ransomware campaign is gaining undetected access to networks. Regular threat hunting can detect silent incursions and the presence of adversaries in the environment by spotting anomalous behavior.

Keeping monitoring “quiet” to avoid counter-incident response: Assume the adversary has multiple means of gaining access to the environment. Watch and wait before taking action – don’t start blocking malware or terminating C2 systems until you are sure you understand all possible avenues of re-entry.

Engaging with an incident response partner: It is not a matter of if, but when organizations will be targeted, so it is essential to be prepared. Engage with an IR partner to devise a response plan and retain them to put it into action when needed. This should include post-incident remediation and analysis to root out any remaining adversary presence and avoid repeat attacks.

As organizations rethink their approach to security, defending against ransomware should be a top priority as the impact and scope of attacks increases. The anywhere workforce must be supported by a security strategy that surrounds and protects employees to let them work safely and productively without putting the infrastructure, reputation, and competitive position of the business at risk.

eCrime industrialisation – how ransomware groups are lowering the bar of entry and maximising profitability

Written by VMware Security Business Unit

Wherever there is disruption, cyber criminals see opportunity. Alongside the devastating health and economic impacts of the global coronavirus pandemic, we have also seen a huge escalation in ransomware attacks as people shifted to working from home. VMware threat researchers have recorded a 900% year on year increase in ransomware attacks in the first half of 2020.

Attacks are not only more frequent, they are also more sophisticated, as adversaries strive to maximise the revenue potential from each hit. As modular and more extensive malware has become ubiquitous, adversaries are diversifying and adopting more strategic and multi-stage tactics. They’ve identified factors such as high financial and regulatory penalties and reputational damage that offer more leverage to extort money from victims. As a result, it is now easier than ever for criminals with minimal skill to execute highly impactful attacks.

Destructive attacks and the sale of direct access into corporate networks are also rising trends and the lucrative payoff potential from all these is changing how adversaries approach their craft; a typical ransomware attack today is designed to do a lot more than simply encrypt data.


Shift from spray and pray to cultivate and curate – rise of the hands-on ransomware attack

In the past, a ransomware attack typically originated in a phishing email: the victim unwittingly opened an infected document or clicked a link, the environment was immediately encrypted, and a ransom was demanded. Adversaries launched high-volume attack campaigns on the assumption that some would make it through defences and pay-day would follow.

The current approach is much more hands-on-keyboard, with the attacker actively involved in orchestrating targeted attacks that will deliver multiple opportunities to monetise the results. In the attacks we’re seeing today, the eventual encryption and ransom demand comes a long way down the line; victims should assume that attackers have been inside their network for a significant period, have mapped out their infrastructure, and have already exfiltrated their most sensitive assets. The new evolution of ransomware attacks involves:


Research phase: The adversary gathers intelligence about your organisation through open source intelligence gathering (OSINT) – everything from social media and geographical footprint to publicly exposed IP addresses found on Shodan – paying special attention to the organisation’s employees. All of this helps to establish an attack plan, most commonly targeted at unsecured edge devices, with Microsoft’s Remote Desktop Protocol (RDP) by far the most frequently leveraged entry point.


Reconnaissance: Adversaries scan your organisation from the internet, looking at edge devices that could be a potential entry point, extrapolating what the rest of your environment might look like and what resources are worth targeting. They might identify home users with publicly exposed devices and target them with a phishing email, but more typically we see adversaries go after poorly configured edge devices, such as a Windows server with Remote Desktop Protocol exposed and no multifactor authentication in place as an ideal access vector.


Access and consolidation: On entry, the attacker conducts initial post-exploitation reconnaissance to obtain a credential and elevate their privileges, so they can pivot from the Demilitarised Zone into the internal systems and map out the internal infrastructure. At this point, most ransomware groups we’ve been following will try to back-door additional systems with redundant access to a secondary command and control server, with the additional goal of infecting the back-up server and even getting their payloads deployed within the backups themselves. They probably won’t use this access – it’s insurance in case their initial route is cut off – but from a victim’s perspective it is something to look out for in incident response.


Slow and steady data exfiltration: To avoid triggering the controls companies have in place to prevent large-scale data exfiltration, attackers look for a discreet way to get the data out of the organisation. This might be through a user within the environment, moving files slowly to a compromised user account and offloading them to another server – such as a compromised webserver – which serves as a collection point for the stolen data. Or they might move the data out slowly through protocols such as DNS.

By now the attacker has achieved the first part of their goal. They have stolen data that they can monetise directly, and they have persistence on the victim’s systems. The victim is still unaware and now the attacker starts to plan for the next stage of their attack.


Extortion – reputations and data held to ransom

This is where we are seeing the convergence of data theft and ransomware. Once attackers launch the encryption phase of the attack, they lock up the victim’s data and demand payment in a traditional ransomware style.

Businesses with good data back-ups and recovery capabilities might be tempted to call the attacker’s bluff – until the extortion starts. Attackers threaten to release parts of the stolen data on the web to publicise the exploit if payment is not forthcoming. So even if the business can recover its data, its reputation and company secrets are still on the line.

The Maze Cartel is an arch-exponent of this technique. When victims don’t pay, they publish stolen data on their website. It is bold and shows the capabilities and power these groups exercise. We’re also seeing these groups collaborating and sharing infrastructure and code, which is making attacks harder to attribute and increasing their overall capabilities.

If the victim bows to pressure and pays the ransom their data has still been breached and is for sale on the dark web, adding another revenue stream for the attacker. Of equal concern should be the fact that the adversary still has a redundant command and control access that they can sell or use to conduct further attacks.


How to combat evolving ransomware attacks

You have to treat ransomware like you would any other breach – this is someone who is in your environment, and they have access to a lot of sensitive data. You need to conduct full incident response and recovery following each of these attacks, looking especially for signs of residual access to your environment following ransomware data theft.

To protect networks, defenders need to deploy endpoint protection, making sure they are blocking ransomware and have layered visibility of what is happening within the network. Understand the details of what your processes are doing and segment your networks effectively so that the scenario described above is not easy for an attacker to achieve.

Watch for evidence of initial access reconnaissance activity, configure alerts for large-scale data exfiltration, look for redundant command and control access, and bear in mind that attackers are playing the long game. They aim to retain their foothold in the environment for as long as possible, so you might be looking for something that activates on a weekly or even monthly cycle and is easy to miss. If you have suffered an attack, hire an incident response firm to look for these hard-to-find indications that your network is still being curated for future attacks.
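As one small, defensive illustration (the log format, baselines, and threshold are assumptions to be tuned per environment), an alert of the kind described might flag hosts whose outbound volume far exceeds their own learned baseline:

```python
import csv
from collections import defaultdict

# Per-host baselines of normal outbound volume, learned over time (values invented).
BASELINE_BYTES = {"10.0.0.5": 2_000_000, "10.0.0.9": 500_000}
ALERT_MULTIPLIER = 5          # how far above baseline warrants an analyst's eyes

def scan_flow_log(path: str) -> None:
    totals = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):          # expects src_ip,bytes_out columns
            totals[row["src_ip"]] += int(row["bytes_out"])
    for host, sent in totals.items():
        baseline = BASELINE_BYTES.get(host, 1_000_000)
        if sent > baseline * ALERT_MULTIPLIER:
            print(f"ALERT: {host} sent {sent:,} bytes (baseline {baseline:,})")

scan_flow_log("outbound_flows.csv")
```

Because attackers deliberately stay under per-day thresholds, aggregating over weekly or monthly windows helps catch the slow cycles described above.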

It’s important to understand that this new approach is bespoke work. It’s targeted and long-term tradecraft and the pay-off is higher as a result; attackers will use every means at their disposal to get the most return on their efforts and grow their profits in the current highly disrupted environment.

Gary Blower: Calling in An Expert – As and When you Need It!

Written by Gary Blower, Solutions Architect, Clearvision

Among the many ‘hard-to-fill’ vacancies caused by skills shortages in 2020 were those for IT programmers and software developers. As businesses looked to digitally transform, almost overnight, and move employees to remote working, they found that demand for skilled professionals outstripped supply.

And as restrictions lift and our digital world evolves, companies are finding that they need to increase their investment in new technologies to innovate and move forward. Not only will this give them an edge over the competition, but technology, automation and innovation drive profit and improve customer retention. However, with technology innovation comes a growing digital skills gap.

Filling the IT skills gap

Back in 2016, Sanjay Brahmawar, the global head of strategic business development for IBM’s Internet of Things business pointed out: “One of the biggest issues is going to be the gap in skills. Getting the skills required to analyse and manage data is going to be difficult. By 2020 we will have one million unfilled jobs in the IT sector. Primarily because the skills we have today aren’t the right skills for the future.”

This is just one of many comments and articles highlighting the ever-growing tech talent shortage, which is now at its highest level since 2008. Similarly, a recently published article from Deloitte suggests that less than half of executives believe they have the skills to compete and lead in the digital economy. It goes on to confirm that companies are budgeting heavily to invest in digital development and new ways of working, with technologies such as AI at the forefront of that investment. Indeed, 82% of executives plan to address the AI skills gap in 2021 and beyond.

Delivering ad-hoc help 

It could be argued that ‘big deployment or migration’ company shortfalls are easier to address. Issues arise when there is a lack of skills to implement or action ongoing updates, upgrades and other digital changes. Often it is the small to mid-sized organisations that suffer most, because they usually don’t have in-house experts or specialists in particular areas.

This means that as vendors release new products and features to enhance the user experience, the more ‘generalist’ IT professionals struggle to keep up. Often, software and hardware adoption starts as a single install and then very quickly spreads organically across the organisation, and someone in IT is tasked with managing all these instances and expected to be an expert across multiple technical products. Likewise, it tends to be the small to mid-sized companies that are hit with IT staffing bottlenecks, especially if they have a big and demanding project consuming developer bandwidth and internal capacity.

Atlassian is a great example. Millions of users globally rely on Atlassian products every day for improving software development, project management, collaboration, and code quality. At the forefront of innovation, Atlassian is constantly releasing new products and features to enhance the user experience and it is hard for IT teams to keep abreast of all the latest updates while focusing on the day job. It is equally hard for these companies to really optimise their investment and ensure that the software is maximising RoI.

Putting you in control of costs

This is where our Atlassian Experts on Demand subscription-based service really helps, because it delivers an Atlassian expert as and when a company needs it. This is someone an organisation can tap into for expert guidance, advice and support, without the expense of committing to a long-term engagement. Through this service, organisations can book individual mentoring sessions, create a coaching plan for the team, or our Experts can just be available to assist with Atlassian-related initiatives. In response to customer demand, we have just launched the service and a number of our existing customers are already beta testing and benefiting from it.

Knowing that it will typically be small to mid-sized organisations that use the service most frequently, we based it on a subscription model, as the cost per hour or day is lower. This enables customers to plan more effectively, as each credit grants access to a solution expert for up to half a day. Organisations can also budget for this rather than face unexpected costs at the end of the month; our fixed pricing puts them in control, so they aren’t hit with any nasty surprises.

Additionally, the flexibility of the service means they can tailor their Experts on Demand subscription to include whatever they require. So whether they are designing digital transformation programmes, adopting Agile/SAFe best practices, implementing ITIL/ITSM/eSM best practices, or just looking for continuous improvement, we can help. Furthermore, we provide ad hoc services such as one-to-one coaching and mentoring, training sessions, Q&A with an expert, health checks, and problem solving.

Customer success is important to Clearvision

This is largely managed by our customer success team, because this is also about making sure that existing customers get the best out of their Atlassian solutions. All too often software ends up as shelfware and here at Clearvision we want to ensure that customers see the benefit of their investment and can quantify and measure the RoI.

As we slowly tread a path back to some kind of normality, technology and innovation will play a critical role in our recovery. Here at Clearvision we are keen to level the playing field and make sure that small to mid-sized organisations have access to the specialist skills, knowledge, support and expertise they need to compete and remain relevant – both now and in the future. We recognise that there are times when IT teams just need a bit of support to push a project over the line, or have a short-term staffing bottleneck, and we’re on hand to help.

For more information about Experts on Demand, please visit https://www.clearvision-cm.com/atlassian/experts-on-demand/.

Best Practice Steps for Safe Data Sharing

Written by Steph Charbonneau, Senior Director of Product Strategy at HelpSystems

Digital data is everywhere. You only have to look at how much data is transmitted over the internet on a weekly, daily, hourly, or even second-by-second basis to understand just how much data is being shared. In fact, at the start of 2020, the amount of data in the world was estimated to be 44 zettabytes. Given how much data is created every day, pundits predict that this will likely increase to 175 zettabytes by 2025.

As employees and businesses, we are constantly sharing information. Likewise, the number and variety of entities and individuals we share that information with has grown exponentially. No longer is this simply restricted to the perimeter of our own businesses, but it now extends to partners, suppliers, customers, prospects, and influencers around the globe.

Consequently, the challenge for most organisations now is: how do we share data easily, quickly, yet also securely?

More Regulation, More Data Breaches

The good news is that there is more regulation to govern data, requiring organisations to protect it from unauthorised access. However, the bad news is that there are also more data breaches occurring. And if your data is vulnerable to cybercriminals or even to human error, unfortunately you need to be prepared to pay. According to a study undertaken in 2020 by IBM, the global average total cost of a data breach is now estimated at $3.86 million.

Layer on top of this the reality that many employees will continue to work remotely yet still need to securely collaborate from anywhere, and you can quickly appreciate how the risk is escalating with this extended attack surface.

However, it is challenging to find a solution that is capable of handling file sharing or the secure sharing of confidential information on a regular basis. Often it can be hard to trace what happens to that information after it has been shared, or to identify whether the information should be shared in the first place.

Prevent Unauthorised Access to Sensitive and Confidential Information

Organisations must therefore implement appropriate measures to prevent unauthorised access to sensitive and confidential information. They also need to prevent the accidental loss or deletion of any confidential data. Here, UK public sector organisations make it easier for employees to understand what constitutes confidential information that needs to be protected: most have some form of Protective Marking System in place which highlights the sensitivity of the information and the action employees need to take.

However, private sector organisations don’t typically have such policies in place, and often this can leave employees unsure about what constitutes sensitive or confidential information. It is therefore important that organisations establish a culture of security in which employees are trained to appropriately classify, handle, transfer, and delete any such data. And of course, they need the right tools and technology to enable them to do this efficiently, proactively, and securely.

Take a Risk-based Cybersecurity Approach

In deciding the most appropriate way to do this and the level of security required, organisations should take a risk-based approach. For example, when sharing confidential information, the employee must ensure the recipient understands why the information is being shared and the circumstances under which it may or may not be shared. They also need to ensure that any further handling of the information is secure. This applies whether it is being shared with someone inside or outside the organisation.

When dealing with external parties, businesses need to understand what data they will need access to and why, and ultimately what level of risk this poses. Likewise, they need to understand what controls such parties have in place to safeguard data and protect against incoming and outgoing cyber threats. This needs to be monitored, logged, and regularly reviewed, and a baseline of normal activities between the organisation and the external party should be established.

Layer your Data Security Solutions

Here at HelpSystems we advocate a layered approach to data security that starts with understanding and classifying your data and identifying what information needs to be protected. Data classification tools are critical here, ensuring that sensitive data is appropriately treated, stored, and disposed of throughout its lifetime in accordance with its importance to the organisation. Appropriate classification protects the organisation from the risk of sensitive data being exposed.
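To make the classification step concrete, here is a minimal rule-based sketch; the labels and patterns are illustrative assumptions, not the behaviour of any particular classification product:

```python
import re

# Illustrative rules, ordered from most to least sensitive.
# Both the labels and the patterns are assumptions for this sketch.
RULES = [
    ("CONFIDENTIAL", [r"\b\d{3}-\d{2}-\d{4}\b",        # SSN-style number
                      r"\b(?:\d[ -]?){13,16}\b"]),     # card-number-like digits
    ("INTERNAL",     [r"(?i)\bsalary\b", r"(?i)\bcontract\b"]),
]

def classify(text: str) -> str:
    """Return the first (most sensitive) label whose patterns match."""
    for label, patterns in RULES:
        if any(re.search(p, text) for p in patterns):
            return label
    return "PUBLIC"

print(classify("Invoice for contract renewal"))    # INTERNAL
print(classify("Card: 4111 1111 1111 1111"))       # CONFIDENTIAL
```

A real tool lets users apply and adjust these labels at creation time, which is what makes the downstream layers enforceable.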

But inevitably employees will accidentally send sensitive data to the wrong person or transfer an otherwise “safe” document that contains hidden metadata that could compromise the organisation. Any number of scenarios can put an organisation at risk unless it has a solution in place to detect and sanitise data in real time, before a breach occurs. Organisations therefore need to detect and prevent data leaks, which means ensuring that documents uploaded to and downloaded from the web are thoroughly analysed. To do this effectively, they need an integrated Data Loss Prevention (DLP) solution that removes risk from email, web, and endpoints while still allowing information to flow.
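The detection half of that workflow can be as simple as scanning outbound content for known sensitive patterns and redacting them before release. The patterns below are assumptions for illustration; production DLP engines ship far richer detectors and also inspect hidden metadata:

```python
import re

# Hypothetical detectors for this sketch; real DLP engines ship curated ones.
SENSITIVE = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "CARD":  re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def sanitise(text: str):
    """Redact sensitive matches and report which detectors fired."""
    hits = []
    for name, pattern in SENSITIVE.items():
        if pattern.search(text):
            hits.append(name)
            text = pattern.sub(f"[REDACTED-{name}]", text)
    return text, hits

clean, findings = sanitise("Contact jane@example.com, card 4111 1111 1111 1111")
print(findings)   # ['EMAIL', 'CARD']
print(clean)
```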

After you’ve ensured your data is identified and classified, scrubbed of potentially sensitive content, and approved for sending by authorised users, it needs to be transferred securely. This can be achieved through email encryption or, where there are large volumes of data, through a managed file transfer (MFT) solution.
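As a minimal illustration of the principle, the sketch below encrypts a payload before it leaves your systems, so an intercepted copy is useless without the key. It uses symmetric encryption from the open-source Python cryptography package; a real MFT or email encryption product layers key exchange, auditing, and delivery on top:

```python
from cryptography.fernet import Fernet

# Generate a per-transfer key. In practice the key is exchanged or escrowed
# out of band and never travels alongside the ciphertext.
key = Fernet.generate_key()

payload = b"quarterly figures, customer list, ..."   # stands in for the file contents
ciphertext = Fernet(key).encrypt(payload)            # what actually leaves your systems

# The recipient, holding the key, reverses the process:
assert Fernet(key).decrypt(ciphertext) == payload
```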

And finally, to secure confidential data whenever and wherever it travels, Digital Rights Management software provides organisations with the ability to track, audit, and revoke access at any time by encrypting the data with a unique key that is secured via a cloud platform.
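Conceptually, this is envelope encryption with the key held by a service you control: recipients fetch the key when opening the file, so deleting the key revokes every copy at once, wherever it has travelled. A toy sketch of the idea, in which an in-memory dict stands in for the cloud key service:

```python
from cryptography.fernet import Fernet

key_store = {}  # stands in for the cloud key service in this sketch

def protect(doc_id: str, data: bytes) -> bytes:
    """Encrypt data under a unique per-document key held by the service."""
    key = Fernet.generate_key()
    key_store[doc_id] = key
    return Fernet(key).encrypt(data)

def open_document(doc_id: str, ciphertext: bytes) -> bytes:
    """Fails once the key is revoked, wherever the ciphertext has travelled."""
    return Fernet(key_store[doc_id]).decrypt(ciphertext)

def revoke(doc_id: str) -> None:
    key_store.pop(doc_id, None)

blob = protect("q3-forecast", b"sensitive numbers")
print(open_document("q3-forecast", blob))   # works while the key exists
revoke("q3-forecast")
# open_document("q3-forecast", blob) now raises KeyError: access revoked
```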

Layering data security solutions is a proactive approach to protecting your confidential and sensitive information. Data security is only as robust as the various elements that support it. Tiering proven solutions to keep your sensitive data secure from start to finish will help you avoid data compromise, and the financial and reputational costs that go with it.

If you are interested in finding out more about specific use cases around best practice for sharing sensitive data, please download our guide.

Hacker and online security concerns rise, following COVID-spurred digital transformations

With online services growing at an exponential rate, each requiring us to change our passwords every six months, Nicole Lin, Managing Director of Synology UK, explores how we should approach identity authentication and password management from individual and business perspectives in the long run.

Before 2020, remote working was a perk for employees, even a bandwagon some jumped on because it was how the future was supposed to look. One pandemic and global lockdown later, the entire world was scrambling to get to grips with Zoom meetings, cloud storage, and VPNs, while fast-tracking long-overdue IT skills updates.

In this global rush to remote working, handling security, particularly in the cloud, has been the sword on which many an IT admin has fallen. Smart security providers have sensed an opportunity to market sophisticated tools to protect your network infrastructure. And let us be clear: these tools serve a vital purpose. Should we not have advanced security gateways, for example, that inspect every packet entering the network and flag any potential threat? Of course we should! Should we not use advanced antivirus tools, powered by AI and leveraging global databases, to stay ahead of hackers and protected from the latest forms of ransomware? Who would object?

Now, with that disclaimer in mind, let’s address the elephant in the room: these sophisticated tools are like a castle built on shaky foundations if IT admins leave the humans in the organisation to their own devices when it comes to security. Verizon’s Data Breach Investigations Report puts things into perspective: 61% of breaches involve credentials. So where have things gone wrong?

Let us put ourselves in the shoes of a hacker. What will require the least amount of effort to breach an organisation’s security? Rather than spending hours identifying a system vulnerability to hit a target with ransomware, “guessing” a password is just as easy and allows entry without creating a fuss, potentially remaining undetected until it’s too late.

It is important to consider all aspects of passwords. We are told, reminded, and encouraged to make passwords complicated: “123456”, anything containing your date of birth, names and so on are too obvious and constitute a risk. Increasing the complexity by making a password longer and including special characters is the logical solution. However, unless one has an eidetic memory, the temptation is great to reuse the same password for Gmail, Windows, Salesforce, and Twitter, and once one account is cracked, your whole privacy is at risk.

Add to that the massive growth in computing capabilities, which means even entry-level laptops can now be used to carry out brute-force attacks.
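Some back-of-the-envelope arithmetic shows why length and character variety matter so much. Assuming a deliberately round ten billion guesses per second, a rate commodity GPU hardware can approach against fast, unsalted hashes:

```python
# Rough search-space arithmetic; the guess rate is an assumption for illustration.
GUESSES_PER_SECOND = 10_000_000_000

def worst_case_years(charset_size: int, length: int) -> float:
    """Time to exhaust the full search space at the assumed guess rate."""
    return charset_size ** length / GUESSES_PER_SECOND / (3600 * 24 * 365)

print(f"8 lowercase letters: {worst_case_years(26, 8):.6f} years")   # ~21 seconds
print(f"8 mixed + symbols:   {worst_case_years(95, 8):.3f} years")   # ~8 days
print(f"12 mixed + symbols:  {worst_case_years(95, 12):,.0f} years") # millions of years
```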

Making matters worse, password-reset functions can be easily hijacked: “What was your mother’s maiden name?” may have been relevant in 1990, but with most of us posting our lives on social media, the answer can all too easily be discovered by unscrupulous hackers.

This shows us one thing: passwords have served us well, but an arms race with hackers is not going to end well for corporations and honest netizens without a change of strategy. Preferably one that does not involve retreating from the web back to a bygone age of letters and carrier pigeons.


Password + 2FA for optimised protection

So, from an individual’s perspective, how can password complexity be enhanced when there is a growing number to remember as we use ever more online services? Password vaults are a first step: they centralise all passwords and also allow you to generate strong, secure passphrases. But the more cynical among us will simply see this centralisation as a single point of failure: gain access to the vault, and every single account is compromised.
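The generation side is easy to do well with a cryptographically secure source of randomness. A sketch using Python’s standard-library secrets module and a deliberately tiny stand-in word list:

```python
import secrets

# A tiny stand-in word list; real generators draw from thousands of words
# (e.g. the EFF diceware lists), which is where the strength comes from.
WORDS = ["correct", "horse", "battery", "staple", "orbit", "velvet", "plasma", "quartz"]

def passphrase(n_words: int = 5) -> str:
    """Join randomly chosen words into a memorable, high-entropy passphrase."""
    return "-".join(secrets.choice(WORDS) for _ in range(n_words))

print(passphrase())   # e.g. "velvet-orbit-staple-plasma-quartz"
```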

This needs to be combined with consistent use of multi-factor authentication. The concept is quite simple: unauthorised logins are prevented by adding an extra layer of checks to ensure you are the right person. If we consider your password to be “something you know”, then an extra layer is “something you have”. This is typically another device, such as your phone, to which a one-time passcode is sent and which you must enter within a short period of time to confirm you are not a hacker who has stolen the original password. Security can be pushed even further with “something you are”, in the form of a biometric identifier; in everyday life, smartphone fingerprint recognition is the most common example.
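Those one-time passcodes are typically time-based (TOTP, RFC 6238): the server and the phone share a secret, and each independently derives a short-lived code from that secret and the current time. A condensed sketch using only Python’s standard library, with a made-up shared secret:

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Derive the current RFC 6238 code from a shared base32 secret."""
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // period)   # 30-second window
    mac = hmac.new(key, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Hypothetical shared secret; both the server and the phone app hold it.
print(totp("JBSWY3DPEHPK3PXP"))
```

Because both sides compute the same code independently, nothing secret crosses the network at login time, only the short-lived six-digit result.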


Moving beyond passwords

With two-factor authentication increasingly common in the tech industry, from Gmail to Amazon accounts, one would think hackers would soon be running out of options. Well, humans are remarkably creative. To impersonate you and “something you have”, you may have heard of “SIM swapping”: a hacker fools your mobile provider into switching your SIM card information to a different phone, a process normally reserved for customers who have just lost their phones, and the hacker is then able to intercept your verification codes.

However, a more fundamental flaw with passwords, even with 2FA, is that no matter how vigilant we are as users, trust is a fundamental part of the equation. It is often assumed, for example, that the company hosting the website or service will follow strict security practices to keep stored passwords safe. Yet no company is immune; big names like Dropbox and Facebook have had customer credentials leaked due to poor practices.

Since passwords will always present a certain level of vulnerability, the logical conclusion is to move beyond them. That thinking dates back to a meeting between PayPal and Validity Sensors in 2009: discussing the use of biometrics to identify online users, it became clear that the first bricks of an industry standard would be needed. This would soon become the FIDO Alliance, for Fast IDentity Online.

The concept is simple enough. Contrary to passwords, where authentication is initiated by the user sending information to the website’s servers, the FIDO approach is device-centric, with no personal or biometric information ever leaving the user’s device. This is achieved using a public-key cryptography model. When registering with a website, a public key is provided rather than a password. Later, when the user wishes to log in, the website’s server issues a challenge to the user’s device, which can only be solved using the private key kept on the device. Security is further enhanced by ensuring that the public/private key pair is issued for the website in question. Importantly, this removes the threat of phishing scams, where a fake website, visually similar to a mainstream one, is used to collect a customer’s credentials without their knowledge.
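For the curious, here is a stripped-down sketch of that challenge-response flow, using an Ed25519 key pair from the Python cryptography package; real FIDO2/WebAuthn adds origin binding, attestation, and signature counters on top:

```python
import os
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

# Registration: the device generates a key pair and sends ONLY the public
# key to the website; the private key never leaves the device.
device_key = ed25519.Ed25519PrivateKey.generate()
server_stored_public_key = device_key.public_key()

# Login: the server issues a random challenge...
challenge = os.urandom(32)

# ...the device proves possession of the private key by signing it...
signature = device_key.sign(challenge)

# ...and the server verifies against the registered public key.
try:
    server_stored_public_key.verify(signature, challenge)
    print("login accepted")
except InvalidSignature:
    print("login rejected")
```

Because each key pair is bound to one site, a signature harvested by a look-alike phishing page is useless anywhere else.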


Last thoughts

The staggering growth seen since last year in the use of malware, up by 358%, and ransomware, up by 435%, shows how essential it is to spread best practices around online security, whether by standardising extra password complexity and two-factor authentication or through a more fundamental shift in attitudes with the adoption of public-key authentication methods. To accompany this shift, websites and platforms, as well as manufacturers of servers, must make these safer authentication methods available.